I became aware of this question by way of an answer on Meta and feel I must push back against the comment of DonAntonio and the answer of amWhy.
Uniqueness of the solution of the system
$$
\begin{aligned}
x_1&=3\\
x_2&=2\\
x_3&=3
\end{aligned}
$$
is obvious and needs no proof. What is there to prove? Is it conceivable that if you plug in numbers other than $3,$ $2,$ and $3$ for $x_1,$ $x_2,$ and $x_3$ you might obtain three true statements?
The determinant is a complicated object, and by bringing it into this situation you are making something simple appear much more difficult than it actually is.
Here's what I think you were probably getting at. Let's start with a simpler analogue: is the solution of the equation $x=2$ unique? Of course it is: $2$ is the only solution. Now $x=2$ may be the end result of simplifying a more complicated equation, such as $13x=26.$ The latter is a special case of the general equation $ax=b,$ which has a unique solution if and only if $a\ne0.$ If $a=0,$ then there is no solution unless $b=0,$ in which case there are infinitely many solutions.
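This three-way case analysis can be sketched as a small Python function (the function name and return convention are mine, purely for illustration):

```python
def solve_linear(a, b):
    """Solve a*x = b over the reals.

    Returns ('unique', x), ('none', None), or ('infinite', None),
    mirroring the three cases discussed above."""
    if a != 0:
        return ("unique", b / a)    # a != 0: exactly one solution, x = b/a
    if b == 0:
        return ("infinite", None)   # 0*x = 0 holds for every x
    return ("none", None)           # 0*x = b with b != 0 is never true

print(solve_linear(13, 26))  # ('unique', 2.0)
```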
Likewise, the matrix equation $Ax=b,$ where $A$ is a square matrix and $x$ and $b$ are column vectors, has a unique solution if and only if $\det A\ne0.$ If $\det A=0,$ then it has either no solution or infinitely many solutions.
So it is helpful to introduce the determinant to make statements about the nature of the solution set of the general equation $Ax=b.$ But for concrete $A$ and $b,$ it is usually more efficient to row reduce the system than to compute $\det A.$ (More precisely, computing $\det A$ is best done by actually performing row reduction, but there is no need to mention determinants if you are row reducing to solve a concrete problem.) The end result of the row-reduction process will tell you whether there is a unique solution or not.
The only thing that might need proof is that the three row operations (swapping rows, multiplying a row by a non-zero number, adding a multiple of one row to another row) preserve the solution set. That is generally proved in a linear algebra course, and you can probably assume it from then on. If not, let $S$ be a system and let $S'$ be the system that results from applying a row operation. You just need to prove that any solution to $S$ is a solution to $S'$ and that any solution to $S'$ is a solution to $S.$ This is straightforward, but it would be overkill to do it in every row-reduction problem you perform.
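For concreteness, here is a minimal Gaussian-elimination sketch that uses exactly the three row operations and then reads off the nature of the solution set from the reduced augmented matrix. The function name, tolerance, and return convention are my own, and this is a teaching sketch rather than production numerics:

```python
def row_reduce_solve(A, b, eps=1e-12):
    """Row-reduce the augmented matrix [A | b] using only the three row
    operations (swap, scale by a nonzero number, add a multiple of one
    row to another), then classify the solution set.

    Returns ('unique', x), ('none', None), or ('infinite', None)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # build [A | b]

    for col in range(n):
        # Swap: bring the largest candidate pivot up (partial pivoting).
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        if abs(M[piv][col]) < eps:
            continue                 # no pivot in this column
        M[col], M[piv] = M[piv], M[col]
        # Scale: normalize the pivot to 1.
        p = M[col][col]
        M[col] = [v / p for v in M[col]]
        # Add a multiple: clear the rest of the column.
        for r in range(n):
            if r != col:
                factor = M[r][col]
                M[r] = [v - factor * w for v, w in zip(M[r], M[col])]

    # A row "0 = c" with c != 0 means the system is inconsistent;
    # fewer than n pivot rows (with no such row) means free variables.
    rank = sum(1 for r in range(n) if any(abs(v) >= eps for v in M[r][:n]))
    for r in range(n):
        if all(abs(v) < eps for v in M[r][:n]) and abs(M[r][n]) >= eps:
            return ("none", None)
    if rank < n:
        return ("infinite", None)
    return ("unique", [M[r][n] for r in range(n)])
```

The end state of the reduction distinguishes the three cases directly, with no determinant in sight.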
This is my first longer answer on Math.SE, but you asked about something I work on.
If you are interested in algorithmic results and real equations, then one could say that for polynomials everything is decidable, while for more general functions almost everything is undecidable. In particular, the whole first-order theory of the real numbers with addition and multiplication is decidable, so you can include polynomial equations, quantifiers, disjunctions, and so on.
However, once you allow arbitrarily complicated functions built from compositions of polynomials and the sine function, the existence of a solution is already algorithmically undecidable.
Common software packages are based on iterative methods; I believe most of them use some variant of Newton's method. The undecidability result above shows that no such algorithm can be complete.
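As a toy illustration of such an iterative method, here is a basic Newton iteration (a sketch only; real packages are far more careful about step control and starting points):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=100):
    """Newton iteration x_{k+1} = x_k - f(x_k)/f'(x_k).

    Converges very fast near a simple root, but from a bad starting
    point it can diverge or cycle, which is one concrete way to see
    that such a method cannot be complete."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x -= fx / fprime(x)
    raise RuntimeError("no convergence within max_iter")

# Example: find sqrt(2) as the positive root of x^2 - 2.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
```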
For compact domains and reasonably computable input functions there are still undecidability results; see this question that I asked on MathOverflow.
Suppose you have $n$ equations in $n$ variables, $f(x)=0$, on a bounded box $B=[0,1]^n$, where $f$ can be reasonably represented on a computer (for example, it is a combination of common functions). If no solution exists, you can always disprove its existence by interval arithmetic; however, such an algorithm never terminates when a solution does exist. On the other hand, you can prove the existence of a solution if $0\notin f(\partial B)$ and the degree $\deg(f,B)\neq 0$ (the degree can be computed); this is equivalent to the non-extendability of $f|_{\partial B}\colon\partial B\to\mathbb{R}^n\setminus\{0\}$ to a continuous map $B\to\mathbb{R}^n\setminus\{0\}$. This is always the case if the solution is robust (i.e., it survives small perturbations of $f$); for this statement and a bit of the history of the problem, here is a paper on this topic. This is an analogue of the determinant criterion: if the determinant is nonzero, the solution is likewise stable under small perturbations of the equations.
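Here is a toy one-dimensional version of the disprove-by-interval-arithmetic idea. The helper names are mine, and a real implementation would use outward (directed) rounding rather than plain floating point; the example function $f(x) = x^2 - 2x + 2 = (x-1)^2 + 1$ has no real zero, so the certification succeeds:

```python
def iadd(x, y):
    # Interval addition: [a,b] + [c,d] = [a+c, b+d].
    return (x[0] + y[0], x[1] + y[1])

def imul(x, y):
    # Interval multiplication via endpoint products.
    p = (x[0] * y[0], x[0] * y[1], x[1] * y[0], x[1] * y[1])
    return (min(p), max(p))

def f_interval(x):
    # Naive interval enclosure of f(x) = x^2 - 2x + 2.
    return iadd(iadd(imul(x, x), imul((-2.0, -2.0), x)), (2.0, 2.0))

def no_root(f, lo, hi, depth=30):
    """Try to certify that f has no zero on [lo, hi]: if the interval
    enclosure excludes 0 we are done; otherwise bisect.  If f actually
    has a root, the subdivision never succeeds (here: a depth limit)."""
    enc_lo, enc_hi = f((lo, hi))
    if enc_lo > 0 or enc_hi < 0:
        return True
    if depth == 0:
        return False
    mid = (lo + hi) / 2
    return no_root(f, lo, mid, depth - 1) and no_root(f, mid, hi, depth - 1)

print(no_root(f_interval, 0.0, 3.0))  # True
```

The naive enclosure on the whole box $[0,3]$ still contains $0$ (the dependency problem), but after a few bisections every piece's enclosure is strictly positive.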
An interesting question is whether, for a piecewise-linear map $f\colon K\to\mathbb{R}^n$ on a finite simplicial complex $K$ and a number $\alpha>0$, every $\alpha$-perturbation $g$ of $f$ (that is, every $g$ with $\|g-f\|\leq\alpha$) is such that $g(x)=0$ has a solution in $K$. Surprisingly, this is decidable if $\dim K\leq 2n-3$ or $n$ is even, and undecidable for fixed odd $n$ and arbitrary $K$. You can find the corresponding paper here.
Best Answer
Substituting the second equation for $y$ into the first equation, we get $$(x^3+1)^3 = x+2 \iff (x^3+1)^3 - x - 2 = 0.$$ Now consider the polynomial $f(x) = (x^3+1)^3 - x - 2$, any root of which yields a solution to the original system. Thus we want to analyze $f$ in such a way that we can prove there is only one real root. For any $x$ with $|x| \leq \frac{1}{2}$ we have $$ |(x^3+1)^3 - x| \leq (|x|^3+1)^3 + |x| \leq (0.5^3+1)^3 + 0.5 < 2,$$ and so $f$ is negative on the interval $\left[-\frac{1}{2},\frac{1}{2}\right]$. Additionally, if $-1 < x < -\frac{1}{2}$ then $$(x^3+1)^3 - x < ((-0.5)^3+1)^3 + 1 < 2,$$ and if $-2 < x < -1$ then $$(x^3+1)^3 - x < ((-1)^3+1)^3 + 2 = 2.$$ Thus $f$ is negative on the interval $\left(-2,\frac{1}{2}\right]$.
Now take the derivative to get $$ f'(x) = 9x^2(x^3+1)^2 - 1,$$ and notice that $f'(x) > 0$ for every $x \in (-\infty,-2]\cup \left(\frac{1}{2},\infty\right)$. Therefore $f$ is monotonically increasing in this region; in particular, since $f(-2) = (-7)^3 + 2 - 2 = -343 < 0$, $f$ is also negative on $(-\infty,-2]$. Altogether, $f$ is negative for $x \leq \frac{1}{2}$ and monotonically increasing for all $x > \frac{1}{2}$, with $f(x)\to\infty$ as $x\to\infty$; thus $f$ has exactly one real root.
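As a numerical sanity check on this argument (not part of the proof), one can bisect on the single sign change: $f\!\left(\frac{1}{2}\right) < 0 < f(2)$, and $f$ is increasing past $\frac{1}{2}$, so the unique real root lies in $\left(\frac{1}{2}, 2\right)$.

```python
def f(x):
    # The polynomial analyzed above.
    return (x**3 + 1)**3 - x - 2

def bisect(g, lo, hi, iters=100):
    """Bisection for a root of g on [lo, hi], assuming g(lo) < 0 < g(hi)."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if g(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

root = bisect(f, 0.5, 2.0)  # the unique real root, somewhere in (0.5, 1)
```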