The expression you give is the derivative of the function $x \mapsto x^{123}$. There are a few ways to attack this; however, nearly all of them ultimately rest on an induction argument somewhere along the line. The other answers cite the binomial theorem, which is itself usually proved by induction. I'll give another argument, one that does not rely on the binomial theorem.
First, let us recall the principle of mathematical induction. The basic idea is that if we want to show that some proposition $P$ holds for all natural numbers, then we (1) prove that $P(1)$ holds, then (2) prove that if $P(k)$ happens to hold for some $k$, then it must be that $P(k+1)$ holds. This is sufficient to prove that $P(n)$ holds for all $n$.
The intuition is as follows: suppose that we have shown that $P(1)$ holds and that $P(k) \implies P(k+1)$. Then, with $k=1$, we have that $P(2) = P(1+1)$ holds. But now $P(2)$ holds, and so with $k=2$, we have that $P(3) = P(2+1)$ holds. But now $P(3)$ holds, and so...
This principle is often described as setting up a row of dominoes: knocking over the first one topples them all, which is a nice analogy.
With respect to the current problem, we wish to show that
$$ \frac{\mathrm{d}}{\mathrm{d}x} x^n = n x^{n-1} $$
for all $n \in \mathbb{N}$. To do this, I am going to first assume that a basic theory of limits has been developed for functions on $\mathbb{R}$. With the basic arithmetic of limits assumed, we first prove a little lemma:
Lemma (The Product Rule): Let $f$ and $g$ be differentiable functions. Then
$$ \frac{\mathrm{d}}{\mathrm{d}x} \left[ f(x) g(x) \right]
= \left[ \frac{\mathrm{d}}{\mathrm{d}x} f(x) \right] g(x) + f(x) \left[ \frac{\mathrm{d}}{\mathrm{d}x} g(x) \right]. $$
That is, $(fg)' = f'g + fg'$.
Proof: By the definition of the derivative,
\begin{align}
\frac{\mathrm{d}}{\mathrm{d}x} \left[ f(x) g(x) \right]
&= \lim_{h\to 0} \frac{f(x+h)g(x+h) - f(x)g(x)}{h} \\
&= \lim_{h\to 0} \frac{f(x+h)g(x+h) - f(x)g(x+h) + f(x)g(x+h) - f(x)g(x)}{h} \tag{1} \\
&= \lim_{h\to 0} \left[ \frac{f(x+h)-f(x)}{h}g(x+h) + f(x) \frac{g(x+h) - g(x)}{h} \right] \\
&= \left[ \lim_{h\to 0} \frac{f(x+h)-f(x)}{h} \right] \left[ \lim_{h\to 0} g(x+h) \right] + f(x) \left[ \lim_{h\to 0} \frac{g(x+h)-g(x)}{h} \right] \tag{2} \\
&= \left[ \frac{\mathrm{d}}{\mathrm{d}x} f(x) \right] g(x) + f(x) \left[ \frac{\mathrm{d}}{\mathrm{d}x} g(x) \right].
\end{align}
At (1), we are just adding zero in a clever way. At (2), we use the fact that addition and multiplication are continuous. In elementary calculus classes, these properties are usually taught as specific rules to be memorized, e.g. "the limit of a sum is the sum of the limits." Finally, the last line follows from the definition of the derivative and the fact that if $g$ is differentiable, then it is also continuous, and so $\lim_{h\to 0} g(x+h) = g(x)$. //
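The product rule can be spot-checked numerically. The sketch below (my own, not part of the proof) compares a symmetric difference quotient of $f(x)g(x)$ against $f'(x)g(x) + f(x)g'(x)$ for the sample choice $f = \sin$, $g = \exp$; the step size and tolerance are arbitrary but reasonable for double precision:

```python
import math

def num_deriv(f, x, h=1e-6):
    # symmetric difference quotient approximates f'(x) to O(h^2)
    return (f(x + h) - f(x - h)) / (2 * h)

f, g = math.sin, math.exp
x = 0.7

lhs = num_deriv(lambda t: f(t) * g(t), x)   # numeric (fg)'(x)
rhs = math.cos(x) * g(x) + f(x) * g(x)      # f'(x)g(x) + f(x)g'(x)
assert abs(lhs - rhs) < 1e-5
```

This is only evidence at one point, of course; the lemma above is what guarantees the identity everywhere.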
We can now prove the desired result:
Theorem (The Baby Power Rule): Let $n \in \mathbb{N} = \{1,2,\dotsc\}$. Then
$$
\frac{\mathrm{d}}{\mathrm{d}x} x^n = n x^{n-1}.
$$
Proof: The proof is by induction. As a basis for induction, we show that the result holds for $n=1$:
$$ \frac{\mathrm{d}}{\mathrm{d}x} x^1
= \lim_{h\to 0} \frac{(x+h)^1 - x^1}{h}
= \lim_{h\to 0} \frac{x+h-x}{h}
= \lim_{h\to 0} \frac{h}{h}
= 1,
$$
which establishes the base case.
For induction, suppose that
$$
\frac{\mathrm{d}}{\mathrm{d}x} x^k = k x^{k-1}
$$
for some $k\in\mathbb{N}$. This assumption is called the induction hypothesis. The goal now is to show that this assumption implies that
$$
\frac{\mathrm{d}}{\mathrm{d}x} x^{k+1} = (k+1) x^{k}.
$$
This follows from the product rule, shown above:
\begin{align}
\frac{\mathrm{d}}{\mathrm{d}x} x^{k+1}
&= \frac{\mathrm{d}}{\mathrm{d}x} \left[ x \cdot x^{k} \right] \\
&= \left[ \frac{\mathrm{d}}{\mathrm{d}x} x \right] x^{k} + x \left[ \frac{\mathrm{d}}{\mathrm{d}x} x^k \right] \tag{product rule} \\
&= 1 \cdot x^{k} + x \left[ \frac{\mathrm{d}}{\mathrm{d}x} x^k \right] \tag{base case} \\
&= x^k + x \cdot kx^{k-1} \tag{induction hypothesis} \\
&= x^k + k x^k \tag{algebra} \\
&= (k+1) x^k, \tag{more algebra}
\end{align}
which is the desired result. That is,
$$
\frac{\mathrm{d}}{\mathrm{d}x} x^k = k x^{k-1}
\implies
\frac{\mathrm{d}}{\mathrm{d}x} x^{k+1} = (k+1) x^{k},
$$
which completes the induction proof. //
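As a sanity check (not a proof), the theorem can be verified numerically with a symmetric difference quotient, including the $n = 123$ case from the original question. The helper name and tolerances below are my own choices:

```python
def num_deriv(f, x, h=1e-6):
    # symmetric difference quotient approximates f'(x) to O(h^2)
    return (f(x + h) - f(x - h)) / (2 * h)

x = 1.3
for n in (1, 2, 5, 123):
    approx = num_deriv(lambda t: t**n, x)
    exact = n * x**(n - 1)
    # compare relative error, since the values grow large for n = 123
    assert abs(approx - exact) / exact < 1e-4
```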
Since the point $n\pi+(\pi/4)$ under consideration, say $a$, is irrational, we have $f(a) =\cos a$. Note also that $\cos a = \sin a$ here, because $\tan a = \tan(\pi/4) = 1$. Thus, when $x$ is rational, $$|f(x) - f(a) |=|\sin x - \cos a|=|\sin x - \sin a|,$$ and when $x$ is irrational, $$|f(x) - f(a) |=|\cos x-\cos a|.$$
Now observe that neither of the differences $|\sin x-\sin a|$ and $|\cos x - \cos a|$ ever exceeds $|x-a|$. Why?
Well, $$|\sin x - \sin a|=|2\cos((x+a)/2)\sin((x-a)/2)|\leq 2|(x-a)/2|=|x-a|$$ and similarly one can handle the other difference.
Hence $$0\leq |f(x) - f(a) |\leq |x-a|.$$ If you know the definition of limit, this inequality allows you to take $\delta =\epsilon $ and show that $\lim_{x\to a} f(x) =f(a) $.
On the other hand, if you are not familiar with the definition of limit, you can use the squeeze theorem to conclude that $\lim_{x\to a} f(x) =f(a) $. Therefore the function is continuous at all points of the form $a=n\pi+(\pi/4)$.
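The two bounds above can be checked numerically. The sketch below (my own; the sample point $n = 5$ and the grid of test points are arbitrary) verifies both $|\sin x - \sin a| \leq |x - a|$ and $|\cos x - \cos a| \leq |x - a|$ on a grid around $a$, with a tiny slack for floating-point roundoff:

```python
import math

a = 5 * math.pi + math.pi / 4   # a point of the form n*pi + pi/4 (here n = 5)
assert abs(math.sin(a) - math.cos(a)) < 1e-12  # cos a = sin a, since tan a = 1

for k in range(-1000, 1001):
    x = a + k * 0.001
    # both differences are bounded by |x - a| (up to roundoff)
    assert abs(math.sin(x) - math.sin(a)) <= abs(x - a) + 1e-12
    assert abs(math.cos(x) - math.cos(a)) <= abs(x - a) + 1e-12
```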
After reading some comments by OP to the other answer, I think that I need to expand my comment (to OP's question) into a full answer.
The fundamental issue here is that OP treats $\lim_{x \to a}f(x)$ as being on the same level as $f(a)$. But this is not the right approach. In fact the limit may exist (and often does) even when $f(a)$ is not defined.
OP reasons that $\lim_{h \to 0}(2x + h)$ only approaches $8$ and is not exactly $8$ when $x = 4$. I think a crudely correct statement nearest in meaning to the last sentence is this:
The expression $2x + h$ approaches $8$ and is not exactly $8$ when $x = 4$ and $h$ approaches $0$. On the other hand, the expression $\lim_{h \to 0}(2x + h)$ is a number dependent on $x$, so that it is exactly $8$ when $x = 4$; it is something totally independent of $h$.
It is important to understand that $L = \lim_{x \to a}f(x)$ is something totally independent of $x$ and instead dependent on $a$ and $f$ (this part is not that difficult to believe), but at the same time it is also totally independent of $f(a)$ (this is very hard for beginners to accept; in fact $f(a)$ may or may not be defined). The dependence of $L$ on $a$ and $f$ is related to the values of $f$ at points near $a$, and the dependence is made precise by the usual technical definition of limit involving $\epsilon, \delta$.
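The $2x + h$ discussion above is the difference quotient of $x^2$ at $x = 4$. A quick numeric sketch (my own; the step sizes are arbitrary) shows the quotient settling toward the number $8$ as $h$ shrinks, even though no individual quotient equals $8$:

```python
def q(h, x=4.0):
    # difference quotient ((x+h)^2 - x^2) / h, which simplifies to 2x + h
    return ((x + h) ** 2 - x ** 2) / h

for h in (0.1, 0.01, 0.001):
    print(h, q(h))   # the quotients approach 8 as h shrinks

# the limit value 8 is a single number, independent of h
assert abs(q(1e-6) - 8.0) < 1e-4
```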
The issue with the slope of a tangent is that many beginners think there is a definition of the tangent to a curve at a point which uses ideas fundamentally different from the concept of derivative. This is another deep misconception, and it probably stems from the fact that before calculus the only tangent a student has met is the tangent to a circle, which has a definition not dependent on derivatives. A tangent to a circle at a point $P$ can be defined as the line passing through $P$ and perpendicular to the radius through $P$ (it can also be defined as a line which intersects the circle only at $P$). For curves other than the circle, no such definition is available.
Even in the case of the circle, the geometric definition agrees with the one based on the derivative. For a general curve, the slope of the tangent is defined by the derivative of an appropriate function, and hence by definition the derivative equals the slope of the tangent.