I'll give you the informal answer.
First, construct a function $f : \mathbb{N} \to Type$ defined by $\DeclareMathOperator{\Id}{Id}$
$$f(0) = \bot$$
$$f(S(n)) = \top$$
Then, prove $\prod n, m : \mathbb{N}, (\Id_\mathbb{N}(n, m) \to f(n) \to f(m))$ using path induction.
Then if we have $\Id(S(x), 0)$, then we have $f(S(x)) \to f(0)$. That is, we have $\top \to \bot$.
Now $\top$ is a true proposition, so we can conclude $\bot$. Or in other words, there is some element $* : \top$, so therefore we can use our function $\top \to \bot$ to get an element of $\bot$.
In order to give you a formal answer, you would need to include your rule for path induction.
Edit: let's proceed with a slightly more formal proof.
To define $f : \mathbb{N} \to Type$, we define
$$f :\equiv \lambda n . rec[z.Type](n, \bot, xy. \top)$$
Then, we define $g : \prod n, m : \mathbb{N}, (\Id_\mathbb{N}(n, m) \to f(n) \to f(m))$ by
$$g :\equiv \lambda n . \lambda m . \lambda p . J[n, m, z . f(n) \to f(m)](p; x . \lambda y . y)$$
Now, we define $h : \prod x : \mathbb{N}, \Id(S(x), 0) \to \bot$ by
$$h :\equiv \lambda x . \lambda p . g(S(x), 0, p)(*)$$
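The three definitions above can be sketched in a proof assistant. Here is a hypothetical rendering in Lean 4 (an assumption on my part; Rijke's own notation differs), with `Empty` playing the role of $\bot$ and `Unit` the role of $\top$:

```
-- f : ℕ → Type, defined by recursion: 0 ↦ ⊥, S(n) ↦ ⊤
def f : Nat → Type :=
  fun n => Nat.rec (motive := fun _ => Type) Empty (fun _ _ => Unit) n

-- g : transport along an identification, by path induction
-- (`▸` is Lean's transport, itself derived from Eq.rec, i.e. J)
def g (n m : Nat) (p : n = m) (y : f n) : f m :=
  p ▸ y

-- h : a successor is never 0; f (S x) reduces to Unit, so `()` inhabits it
def h (x : Nat) (p : Nat.succ x = 0) : Empty :=
  g (Nat.succ x) 0 p ()
```

Note that `f (Nat.succ x)` and `f 0` compute to `Unit` and `Empty` by the recursor's computation rules, so no further coercions are needed.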
To proceed with maximum formality and with no intermediate definitions at all, we have
$$\lambda x . \lambda p . (\lambda n . \lambda m . \lambda p . J[n, m, z . (\lambda n . rec[z.Type](n, \bot, xy. \top))(n) \to (\lambda n . rec[z.Type](n, \bot, xy. \top))(m)](p; x . \lambda y . y))(S(x), 0, p)(*) : \prod x : \mathbb{N}, \Id(S(x), 0) \to \bot$$
What you mentioned is usually called the substitution rule, and it's fine for logic in general. But if you want to get into categorical logic, you need something that fits better with categorical notions.
In this case, what you get is that equality is left adjoint to contraction. Let me explain a bit further.
Assume that for each list of variables you can construct the set of all formulas that depend on those variables. Call that $P(xs)$ where $xs$ is a list of variables.
Now $P$ is actually a functor $P : Var \rightarrow HA$, where $HA$ is the category of Heyting algebras and their morphisms. You may think of $Var$ as $FinSet$ or $\mathbb N$ or whatever implementation you like.
An important detail is that we don't mean free variables exactly. The formula $x=y$ can be thought of as depending only on $x$ and $y$, or as also depending on another variable, say $z$, and on as many more as we like, just as happens with functions.
Consider a map in $Var$ that sends the list $[x,y]$ to the list $[x]$. Call it $\pi$.
Now what happens when we send $\pi$ through $P$?
We get that $P\ \pi : P\ [x,y] \rightarrow P\ [x]$.
Given a formula that depends on $x$ and $y$, $P\ \pi$ lets us think of it as applied to $x$ twice, i.e. $\varphi(x,y) \mapsto \varphi(x,x)$. This operation is called a "contraction".
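For instance, contracting the (hypothetical, just for illustration) formula $\varphi(x,y) :\equiv (x = y)$ gives

$$P\ \pi\ (x = y) \;=\; (x = x)$$

a formula in $P\ [x]$ that is provable by reflexivity, even though $x = y$ itself is not provable in $P\ [x,y]$.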
What would a left adjoint to it look like?
$$(P\ [x,y])\big(L\ \varphi(x) , \psi(x,y)\big)
\cong
(P\ [x])\big(\varphi(x) , P\ \pi\ \psi(x,y)\big)$$
Remember we are in $HA$ here, so arrows are "proofs", which I'll write as $\vdash$, and functors map these arrows to rules in a proof tree, as is usual.
Let $L\ \varphi(x) :\equiv \varphi(x)\land x=y$.
If we pick $\varphi(x) :\equiv \top$ we get
$$\dfrac{\vdash \psi(x,x)}{x=y \vdash \psi(x,y)}$$
which is semantically exactly the same as what you have. (In fact the rule goes not only from top to bottom of the tree but also from bottom to top, though either direction can be deduced from the other anyway.)
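Before specializing $\varphi$, the adjunction with this choice of $L$ unwinds to the two-way rule

$$\dfrac{\varphi(x) \vdash \psi(x,x)}{\varphi(x) \land x=y \vdash \psi(x,y)}$$

of which the rule above is the instance $\varphi(x) :\equiv \top$, using that $\top \land (x=y)$ and $x=y$ are interprovable.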
Hope this helps, feel free to ask anything if I went too quickly at some point.
Best Answer
That looks right.
However, I have to say this exercise looks bizarre to me. Note that Rijke's "variable conversion rule" can also be derived from the term conversion rule together with substitution, so we might as well have taken the term conversion rule as the primitive one instead of the variable conversion rule. This is what is done in nearly all presentations of dependent type theory that I've seen; I don't know why Rijke did it a different way.