You can use Newton's method to compute the digits of $\sqrt{2}$:
Let:
$$
f(x) = x^2 -2
$$
Define the iteration:
$$
x_0 = 1\\
x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}
$$
This will converge to $\sqrt{2}$ quadratically.
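A minimal Python sketch of this iteration (the function name is illustrative):

```python
# Newton's method for f(x) = x^2 - 2, starting from x_0 = 1.
def newton_sqrt2(steps):
    x = 1.0
    for _ in range(steps):
        x -= (x * x - 2) / (2 * x)  # x_{n+1} = x_n - f(x_n)/f'(x_n)
    return x

print(newton_sqrt2(6))  # ≈ 1.41421356..., at machine precision
```

The number of correct digits roughly doubles with each step, which is the quadratic convergence mentioned above.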
If you want to compute other square roots:
Consider:
$$g(x) = x^2 - a$$
which, applying the same Newton step, gives the iteration:
$$
x_{n+1}=\frac{1}{2}\left(x_n+\frac{a}{x_n}\right)
$$
This is the same Newton step as above, applied to $g$: $x - \frac{x^2 - a}{2x} = \frac{1}{2}\left(x + \frac{a}{x}\right)$.
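The general iteration can be sketched the same way (the function name and default step count are illustrative):

```python
def newton_sqrt(a, steps=8):
    """Approximate sqrt(a), a > 0, via x_{n+1} = (x_n + a/x_n) / 2."""
    x = 1.0
    for _ in range(steps):
        x = 0.5 * (x + a / x)
    return x

print(newton_sqrt(9))  # ≈ 3.0
print(newton_sqrt(2))  # ≈ 1.41421356...
```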
There's also what's called the continued fraction expansion of an algebraic number. You can use a finite continued fraction expansion.
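Since $\sqrt{2} = [1; 2, 2, 2, \ldots]$, truncating the continued fraction gives rational approximations (convergents). A sketch, using exact rational arithmetic:

```python
from fractions import Fraction

def sqrt2_convergent(k):
    # sqrt(2) = [1; 2, 2, 2, ...]; truncate after k partial quotients of 2.
    x = Fraction(0)
    for _ in range(k):
        x = 1 / (2 + x)
    return 1 + x

print([sqrt2_convergent(k) for k in range(5)])
# convergents 1, 3/2, 7/5, 17/12, 41/29, ...
```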
As an example of the Newton iteration, with $a = 2$:
$$
x_0 = 1 \\
x_1 = \frac{1}{2}\left(x_0 + \frac{2}{x_0}\right) = \frac{1}{2}\left(1 + \frac{2}{1}\right) = \frac{3}{2}\\
x_2 = \frac{1}{2}\left(x_1 + \frac{2}{x_1}\right) = \frac{1}{2}\left(\frac{3}{2} + \frac{4}{3}\right) = \frac{17}{12}, \text{ etc. }
$$
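These iterates can be carried out exactly with Python's `fractions` module, which reproduces the values worked out above:

```python
from fractions import Fraction

# Newton iteration for sqrt(2) in exact rational arithmetic.
x = Fraction(1)
for n in range(3):
    x = (x + Fraction(2) / x) / 2
    print(n + 1, x)
# the iterates are 3/2, 17/12, 577/408
```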
Added
Since we are using Newton's method, and you may be wondering why it converges to the root of $f(x)$, note the following:
$\textbf{Theorem}$:
Suppose that the function $f$ has a zero at $\alpha$, i.e., $f(\alpha) = 0$.
If $f$ is continuously differentiable and its derivative is nonzero at $\alpha$, then there exists a neighborhood of $\alpha$ such that for all starting values $x_0$ in that neighborhood, the sequence $\{x_n\}$ converges to $\alpha$.
So if we choose our starting guess appropriately, Newton's method converges to a root of the equation whenever $f$ has these properties.
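A small sketch illustrating the theorem for $f(x) = x^2 - 2$ (a generic Newton routine, names illustrative): starting guesses near the positive root all land on $\sqrt{2}$, while a negative guess lands on $-\sqrt{2}$, the root in *its* neighborhood.

```python
def newton(f, fprime, x0, steps=20):
    x = x0
    for _ in range(steps):
        x = x - f(x) / fprime(x)
    return x

f = lambda x: x * x - 2
fp = lambda x: 2 * x

for x0 in (0.5, 1.0, 3.0):
    print(x0, "->", newton(f, fp, x0))   # all converge to +sqrt(2)
print(-1.0, "->", newton(f, fp, -1.0))   # converges to -sqrt(2)
```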
The underlying issue here is that (assuming you want to stay within the real numbers) when $c<0$, the function $c^x$ is undefined for most values of $x$. Specifically, it's undefined unless $x$ is a rational number whose denominator is odd. There is no continuous/differentiable function underlying the places where it is defined.
Therefore, there is no possible guess-and-check algorithm that gradually becomes more accurate. First, guess-and-check algorithms require an underlying continuous function. Second, the value you're seeking might simply not be defined.
So the need to determine whether the exponent is a fraction with odd denominator, which in other contexts might be considered inelegant, is here simply a necessary step in the problem you're trying to solve. (And really, people shouldn't be inputting $c^x$ when $c<0$ and $x$ is a decimal ... they're just asking for trouble, for all the reasons mentioned above.)
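The odd-denominator check described above can be sketched as follows (the function `real_power` is a hypothetical name, and the exponent must be supplied as an exact `Fraction`, since a float like `1/3` carries no denominator information):

```python
from fractions import Fraction

def real_power(c, x):
    """Real value of c**x, with x an exact Fraction.

    For c < 0 this exists only when x, in lowest terms, has an odd
    denominator. Illustrative sketch of the check, not a library API.
    """
    x = Fraction(x)  # Fraction is stored in lowest terms automatically
    if c >= 0:
        return float(c) ** float(x)
    if x.denominator % 2 == 0:
        raise ValueError("undefined over the reals: even denominator")
    sign = -1 if x.numerator % 2 else 1  # odd numerator keeps the sign
    return sign * abs(c) ** (x.numerator / x.denominator)

print(real_power(-8, Fraction(1, 3)))  # ≈ -2.0
print(real_power(-8, Fraction(2, 6)))  # also ≈ -2.0 (2/6 reduces to 1/3)
print(real_power(-8, Fraction(2, 3)))  # ≈ 4.0
```

Note that plain Python agrees with the "no real value" verdict in its own way: `(-8) ** (1/3)` returns a non-real complex number rather than `-2.0`.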
Best Answer
It's not that you can't do it (if you're willing to accept answers that are non-real complex numbers). It's that there are multiple possible answers and no obvious way to choose among those answers that preserves certain "nice" properties we want those answers to have.
That's not a problem with positive reals. When we take a rational power $s=r^{\frac pq}$, we're really saying that $s^q=r^p$. There are multiple complex numbers $s$ (in fact, $q$ of them) that satisfy this equation, but exactly one of them will always be a positive real number and that's the "obvious" choice. Importantly, this choice lets us extend the definition of $r^t$, where $t \in \Bbb R \setminus \Bbb Q$, in a "nice" (continuous) way when $r$ is a positive real number.
If you try this when $r$ is a negative real number, there is often no obvious choice because there are no negative real solutions for $s$ unless $q$ is odd. And it turns out that there's no choice that always lets you extend the definition of exponentiation beyond rational exponents in a "nice" way, although any choice you make will allow you to extend the definition of exponentiation to almost all complex numbers. These wrinkles are usually taught in advanced undergraduate or graduate classes, so for high school classes by far the more prudent course is to simply say "Don't do it."
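The "multiple possible answers" can be made concrete: the $q$ complex solutions of $s^q = w$ are evenly spaced on a circle. A sketch (the function name is illustrative):

```python
import cmath

def qth_roots(w, q):
    """All q complex solutions s of s**q == w."""
    mag = abs(w) ** (1 / q)
    arg = cmath.phase(w)
    return [mag * cmath.exp(1j * (arg + 2 * cmath.pi * k) / q)
            for k in range(q)]

# s^2 = 2: exactly one root is a positive real, the "obvious" choice.
print(qth_roots(2, 2))
# s^2 = -2: both roots are purely imaginary, so no real choice exists.
print(qth_roots(-2, 2))
```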