Ultimately, I think the reason is that $x$ is a differentiable function of $t$, which in particular means that as $\Delta t\rightarrow 0$, $\Delta x\rightarrow 0$ as well. That said, I really don't like this approach to the chain rule; it feels a bit cobbled together.
I feel that this is the better way to do the chain rule (and much clearer from the get-go, I think). Let $f$ and $g$ be differentiable; I'll assume they're both defined on all of $\mathbb{R}$ to avoid annoyances with domains and ranges. If $g$ is a constant function, then $f\circ g$ is constant, so $(f\circ g)'$ is zero, and the identity $(f\circ g)'(x) = f'(g(x))g'(x)$ holds trivially since $g' = 0$. So from here on, assume $g$ is non-constant.
Then we wish to evaluate $(f\circ g)'(x)$. From the definition of the derivative, we have
$$ (f\circ g)'(x) = \frac{d}{dx}f(g(x)) = \lim_{\Delta x\rightarrow 0} \frac{f(g(x+\Delta x))-f(g(x))}{\Delta x}.$$
Let's multiply by a clever form of $1$ to make things easier on ourselves; specifically, we'll multiply by
$$\frac{g(x+\Delta x)-g(x)}{g(x+\Delta x)-g(x)}.$$
This is where I required that $g$ be non-constant: if it were constant, the above expression would make no sense, since we would be dividing $0$ by $0$. Note that this resembles the difference inside of $f$; this is not by accident. Additionally, we have that $g(x+\Delta x)\approx g(x)+g'(x)\Delta x$ for small $\Delta x$ (this is what the derivative is for: linear approximation). If $g'(x) = 0$, then $g(x+\Delta x)\approx g(x)$, and in this case we have
$$\lim_{\Delta x\rightarrow 0}\frac{f(g(x+\Delta x))-f(g(x))}{\Delta x} = 0.$$
Again, this is clearly equal to $f'(g(x))g'(x)$ since $g' = 0$. If $g'(x)\neq 0$, we can make use of our clever form of $1$ to get
$$(f\circ g)'(x) = \lim_{\Delta x\rightarrow 0}\frac{f(g(x+\Delta x))-f(g(x))}{g(x+\Delta x)-g(x)}\frac{g(x+\Delta x)-g(x)}{\Delta x}.$$
Now both pieces look eerily like a derivative (which is what we want), except that the first piece has $g$ in it. However, as $\Delta x\rightarrow 0$ we know that $g(x+\Delta x)\rightarrow g(x)$, since $g$ is differentiable (and therefore continuous). The limit $\lim_{\Delta x\rightarrow 0}\frac{g(x+\Delta x)-g(x)}{\Delta x}$ exists since $g$ is differentiable, so we need only argue that $\lim_{\Delta x\rightarrow 0}\frac{f(g(x+\Delta x))-f(g(x))}{g(x+\Delta x)-g(x)}$ is well-defined. By limit theorems, we know that if both limits exist, we can distribute the limit to each piece and evaluate.
So we want to argue that
$$\lim_{\Delta x\rightarrow 0}\frac{f(g(x+\Delta x))-f(g(x))}{g(x+\Delta x)-g(x)}$$
is well-defined. Using our approximation for $g(x+\Delta x)$ from above, we have
$$\lim_{\Delta x\rightarrow 0}\frac{f(g(x)+g'(x)\Delta x)-f(g(x))}{g(x)+g'(x)\Delta x-g(x)}.$$
Cancelling appropriate terms we have
$$\lim_{\Delta x\rightarrow 0}\frac{f(g(x)+g'(x)\Delta x)-f(g(x))}{g'(x)\Delta x}.$$
Repeating the same logic as above, now with $f$ expanded at the point $g(x)$ with increment $g'(x)\Delta x$, we have that $f(g(x)+g'(x)\Delta x)\approx f(g(x))+f'(g(x))g'(x)\Delta x$. And so we get
$$\lim_{\Delta x\rightarrow 0}\frac{f(g(x))+f'(g(x))g'(x)\Delta x-f(g(x))}{g'(x)\Delta x} = f'(g(x)).$$
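The linear approximations leaned on above can be checked numerically. Here is a minimal Python sketch (the choice $f = \exp$ and the sample point $a = 0.3$ are illustrative assumptions, not from the argument) showing that the error in $f(a+h)\approx f(a)+f'(a)h$ shrinks like $h^2$, which is why these approximations are harmless inside the limits:

```python
import math

# Check that f(a + h) - (f(a) + f'(a) * h) shrinks like h^2,
# using f = exp (so f' = exp as well) at an arbitrary point a.
a = 0.3
for h in (1e-1, 1e-2, 1e-3):
    error = math.exp(a + h) - (math.exp(a) + math.exp(a) * h)
    # error / h^2 stays roughly constant (about e^a / 2),
    # confirming the error is second order in h
    print(h, error, error / h**2)
```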
Since the limit of the first piece makes sense and the limit of the second piece makes sense, we can distribute the limits to get that
$$(f\circ g)'(x) = \lim_{\Delta x\rightarrow 0}\frac{f(g(x+\Delta x))-f(g(x))}{g(x+\Delta x)-g(x)}\lim_{\Delta x\rightarrow 0}\frac{g(x+\Delta x)-g(x)}{\Delta x} = f'(g(x))g'(x)$$
by our above calculations. So in each case that emerged, we had $(f\circ g)'(x) = f'(g(x))g'(x)$, and so we conclude that the chain rule holds.
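As a concrete sanity check, the conclusion $(f\circ g)'(x) = f'(g(x))g'(x)$ can be verified numerically for a specific pair of functions. This is only a sketch under assumed choices ($f(u) = u^3$, $g(x) = \sin x$, and the point $x = 0.7$ are mine, not from the proof):

```python
import math

def f(u):  return u**3
def fp(u): return 3 * u**2          # f'
def g(x):  return math.sin(x)
def gp(x): return math.cos(x)       # g'

x = 0.7
dx = 1e-6
# central difference quotient approximating (f o g)'(x)
numeric = (f(g(x + dx)) - f(g(x - dx))) / (2 * dx)
# the chain-rule formula f'(g(x)) * g'(x)
chain = fp(g(x)) * gp(x)
print(numeric, chain)               # the two agree to high precision
```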
Everything in the "proof" will depend on your definition of the function $e^x$. I will choose the definition $$ e^h \overset{\text{def}}{=} \sum_{k=0}^\infty \frac{h^k}{k!}. $$ Using this, one sees that $$ \frac{e^h - 1}{h} = \frac{\sum_{k=0}^\infty \frac{h^k}{k!} - 1}{h} = \sum_{k=1}^\infty \frac{h^{k-1}}{k!} = 1 + h \sum_{k=0}^\infty \frac{h^k}{(k+2)!}. $$ If you have studied convergence tests, you know that the last series on the right converges for all $h \in \mathbb R$. Hence, taking the limit as $h \to 0$, the right-hand side goes to $1$: the series converges to some finite value, and the factored-out $h$ makes that whole product go to zero.
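To see this limit in action, one can evaluate partial sums of the series $\sum_{k\ge 1} h^{k-1}/k!$ for shrinking $h$. A minimal Python sketch (the function name and the truncation at 30 terms are my own choices):

```python
import math

def series_quotient(h, terms=30):
    # partial sum of sum_{k>=1} h^(k-1) / k!, which equals (e^h - 1) / h;
    # truncating at 30 terms is far more than enough for small h
    return sum(h**(k - 1) / math.factorial(k) for k in range(1, terms))

for h in (1e-1, 1e-3, 1e-6):
    print(h, series_quotient(h))    # approaches 1 as h -> 0
```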
Another way to do this would be to show that this series is an analytic function and is its own Taylor expansion around zero (this is not hard to do using convergence tests), so to differentiate it you can go term by term and readily see that its derivative is itself.
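The term-by-term differentiation can also be illustrated on partial sums: differentiating $S_N(x) = \sum_{k=0}^{N} x^k/k!$ term by term gives exactly $S_{N-1}(x)$, so the derivative of the full series is the series itself. A small sketch (function names, $x = 1.5$, and $N = 20$ are assumptions for illustration):

```python
import math

def partial_sum(x, N):
    # S_N(x) = sum_{k=0}^{N} x^k / k!
    return sum(x**k / math.factorial(k) for k in range(N + 1))

def partial_sum_derivative(x, N):
    # differentiate S_N term by term: d/dx [x^k / k!] = x^(k-1) / (k-1)!
    return sum(k * x**(k - 1) / math.factorial(k) for k in range(1, N + 1))

x, N = 1.5, 20
# the term-by-term derivative of S_N is exactly S_{N-1}
print(partial_sum_derivative(x, N), partial_sum(x, N - 1))
```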
A third approach, which will sound a little stupid and meaningless but is nonetheless funny, is choosing another definition for $e^x$: consider the differential equation $$ f'(x) = f(x), \quad f(0) = 1. $$ Using differential equation theory, it is really not hard at all to show that the solutions to the equation (without the initial condition) form a one-dimensional vector space, and that there exists a unique element of this vector space satisfying the initial condition $f(0) = 1$, because the solutions are of the form $Cg(x)$ for some fixed solution $g(x)$. Let $\exp(x)$ be defined as the solution to this differential equation satisfying the initial condition. Then clearly $\exp'(x) = \exp(x)$. In particular $\exp'(x)$ is itself differentiable, so by induction $\exp(x)$ is an infinitely differentiable function with Taylor expansion $$ \exp(x) = \sum_{k=0}^\infty \frac{x^k}{k!}. $$ I mentioned it to show the importance of which definition one decides to choose; it can change the whole structure of an argument.
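This ODE definition can even be approximated directly: a forward Euler scheme for $f' = f$, $f(0) = 1$ converges to $e^x$ as the step size shrinks. The sketch below is a deliberately crude illustration, not a serious solver (the function name and the step count $n$ are arbitrary choices of mine):

```python
import math

def euler_exp(x, n=100_000):
    # forward Euler for f' = f with f(0) = 1:
    # f(t + h) ~ f(t) + h * f'(t) = f(t) + h * f(t) = (1 + h) * f(t)
    h = x / n
    f = 1.0                  # initial condition f(0) = 1
    for _ in range(n):
        f += h * f
    return f

print(euler_exp(1.0), math.e)   # the Euler value approaches e = exp(1)
```

Note that for $x = 1$ this is just computing $(1 + 1/n)^n$, which is another classical route to $e$.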
Hope that helps,