Let $F:X\to\mathbb R^n$ be defined as follows:
$$
F(x)=\big(f_1(x),\ldots,f_n(x)\big).
$$
Clearly $F$ is a linear transformation and its range $Y$ is a linear subspace of $\mathbb R^n$.
We shall show that the range is the whole of $\mathbb R^n$. If it is not, then we can find a nonzero vector $v=(v_1,\ldots,v_n)\in\mathbb R^n$ such that $v\perp Y$, i.e.,
$$
v\cdot y=0, \quad \text{for all}\,\,\,y\in Y.
$$
But this means that for every $x\in X$:
$$
0=v\cdot F(x)=v\cdot \big(f_1(x),\ldots,f_n(x)\big)=v_1f_1(x)+\cdots+v_nf_n(x),
$$
which means that $f_1,\ldots,f_n$ are linearly dependent (the coefficients $v_1,\ldots,v_n$ are not all zero), a contradiction. Hence $F$ is onto. Let $x_1,\ldots,x_n\in X$ be such that $F(x_j)=e_j$, where $e_1,\ldots,e_n$ is the standard basis of $\mathbb R^n$. Then the matrix
$$
\big(f_i(x_j)\big)_{i,j=1}^n,
$$
is the identity matrix, and its determinant is equal to $1$.
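As a sanity check, here is a small numerical sketch of the argument in a hypothetical concrete instance: $X=\mathbb R^2$ with the linearly independent functionals $f_1(x)=x_1+x_2$ and $f_2(x)=x_1-x_2$ (these particular functionals are my own illustrative choice, not part of the problem).

```python
import numpy as np

# Row i holds the coefficients of the functional f_i, so F(x) = A @ x.
A = np.array([[1.0,  1.0],
              [1.0, -1.0]])

# Since f_1, f_2 are linearly independent, A is invertible, so we can
# solve F(x_j) = e_j for each standard basis vector e_j.
X = np.linalg.solve(A, np.eye(2))   # column j is the vector x_j

# M[i, j] = f_i(x_j); by construction this is the identity matrix.
M = A @ X
print(np.round(M, 10))
print(round(np.linalg.det(M), 10))  # determinant 1.0
```

The same computation goes through for any choice of linearly independent functionals: invertibility of the coefficient matrix is exactly the surjectivity of $F$ used in the proof.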
Note. The proof is valid even in the case in which $\mathbb R^n$ is replaced by $\mathbb C^n$.
Your proof is fine, I suppose. A minor reformulation, if one wants to avoid the slightly informal "repeat the process" and "$\ldots$", would be either to write this as a proof by induction ($c_n\ne 0$ leads to a contradiction, hence $c_n=0$; but then by the induction hypothesis also $c_k=0$ for $1\le k\le n-1$), or to assume that not all $c_i$ are $0$, let $m$ be maximal with $c_m\ne 0$, write $f_m$ in terms of the lower-degree $f_i$, and arrive at the same contradiction.
Best Answer
The statement is incorrect, as attested by the case $n=3$, $d=2$ and the linearly independent polynomials $$f_1=x_1^2,\quad f_2=x_1x_2,\quad f_3=x_2^2.$$ The Jacobian determinant is identically zero.
Indeed, its third column is zero, because the $f_i$'s do not depend on $x_3$.
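The counterexample is easy to verify symbolically, e.g. with sympy:

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')

# The counterexample polynomials: linearly independent, degree 2, in 3 variables.
f = sp.Matrix([x1**2, x1*x2, x2**2])

J = f.jacobian([x1, x2, x3])
print(J)        # third column is all zeros, since no f_i involves x3
print(J.det())  # 0, identically
```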