Note that $a_0$ doesn't really have a separate formula: if you plug $n = 0$ into the formula for $a_n$, you get the same integral.
For an intuition as to why these formulas are what they are, suppose you wanted to calculate the Fourier series of $\cos(nx)$. You know that all the coefficients except $a_n$ must be zero. So how do you make that happen? You use the fact that:
$$\int_{-\pi}^\pi \cos(nx)\cos(mx)dx=0 \ \ \bigg| \ \ m\neq n$$
$$\frac{1}{\pi}\int_{-\pi}^\pi \cos(nx)\cos(mx)dx=1 \ \ \bigg| \ \ m= n\neq 0$$
Doing the same thing for $\sin$, you'll see that the only way the coefficients can be consistent with $f(x)=\cos(nx)$ or $f(x)=\sin(nx)$ is to choose the formulas you brought up.
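These orthogonality relations are easy to sanity-check numerically. The following sketch (the grid size and helper name are my own choices) approximates the integrals with a trapezoid rule:

```python
import numpy as np

# Approximate (1/pi) * integral_{-pi}^{pi} cos(n x) cos(m x) dx
# with a trapezoid rule on a fine grid (setup is illustrative).
x = np.linspace(-np.pi, np.pi, 100001)
dx = x[1] - x[0]

def cos_inner(n, m):
    y = np.cos(n * x) * np.cos(m * x)
    return np.sum((y[:-1] + y[1:]) / 2) * dx / np.pi

print(round(abs(cos_inner(2, 3)), 6))  # m != n: 0.0
print(round(cos_inner(2, 2), 6))       # m = n != 0: 1.0
```

Trying `cos_inner(0, 0)` returns $2$ rather than $1$, which is exactly why the $m = n = 0$ case is excluded above (and why the constant term gets special treatment).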
$\newcommand{\Vector}[1]{\mathbf{#1}}\newcommand{\vece}{\Vector{e}}$The linked questions provide good answers, but may be at the technical end of "intuitive". Here's a fast-and-loose conceptual motivation:
If $\bigl(V, \langle\ ,\ \rangle\bigr)$ is an $N$-dimensional real inner product space, and if $\{\vece_{n}\}_{n=1}^{N}$ is an (ordered) orthonormal basis, then an arbitrary vector $v$ in $V$ may be written as a linear combination
$$
v = \sum_{n=1}^{N} \langle v, \vece_{n}\rangle \vece_{n}.
\tag{1}
$$
Indeed, $\{\vece_{n}\}$ is a basis of $V$, so there exist real coefficients $a_{k}$ such that
$$
v = \sum_{k=1}^{N} a_{k} \vece_{k}.
\tag{2}
$$
Taking the inner product of each side with $\vece_{n}$ gives
$\langle v, \vece_{n}\rangle = a_{n}$ because the basis $\{\vece_{n}\}$ is orthonormal.
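In coordinates, (1) and (2) are easy to check on a small example. Here is a sketch in $\mathbb{R}^{3}$ (the particular basis is mine, chosen only to be orthonormal):

```python
import numpy as np

# An orthonormal basis of R^3 (rows of e), built by normalizing
# two orthogonal vectors and appending a standard basis vector.
e = np.array([[1.0, 1.0, 0.0],
              [1.0, -1.0, 0.0],
              [0.0, 0.0, 1.0]])
e[0] /= np.linalg.norm(e[0])
e[1] /= np.linalg.norm(e[1])

v = np.array([3.0, -2.0, 5.0])
coeffs = e @ v              # a_n = <v, e_n>, as in the text
reconstructed = coeffs @ e  # v = sum_n a_n e_n, formula (1)
print(np.allclose(reconstructed, v))  # True
```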
Loosely, one might expect a similar conclusion to hold if $V$ is infinite-dimensional. Getting the definitions and hypotheses right, and proving a version of (1) in this new setting, is why any "honest" answer is bound to be technical. Phrases in quotes below are not mathematically correct, and therefore require careful inspection and/or justification.
Intuitively, let $L > 0$ be real, let $V$ be "the space of real-valued functions" on $[-L, L]$, and define an "inner product" by
$$
\langle f, g\rangle = \frac{1}{L} \int_{-L}^{L} f(t) g(t)\, dt.
$$
The functions
$$
C_{n}(t) = \begin{cases}
1/\sqrt{2}, & n = 0, \\
\cos(n\pi t/L), & n > 0;
\end{cases}\qquad
S_{n}(t) = \sin(n\pi t/L),\quad n > 0;
$$
turn out (by elementary calculus and trigonometry) to form an "orthonormal basis" of $V$.
Loosely, we expect that if $f$ is a function, we can express $f$ as an infinite sum of these basis functions, and the coefficients are the inner products of $f$ with the basis elements, i.e. (for $n > 0$),
\begin{align*}
a_{0} &= \langle f, 1\rangle = \frac{1}{L} \int_{-L}^{L} f(t)\, dt, \\
a_{n} &= \langle f, C_{n}\rangle = \frac{1}{L} \int_{-L}^{L} f(t) \cos(n\pi t/L)\, dt, \\
b_{n} &= \langle f, S_{n}\rangle = \frac{1}{L} \int_{-L}^{L} f(t) \sin(n\pi t/L)\, dt, \\
f(t) &= \frac{a_{0}}{2} + \sum_{n=1}^{\infty} \bigl(a_{n} C_{n}(t) + b_{n} S_{n}(t)\bigr) \\
&= \frac{a_{0}}{2} + \sum_{n=1}^{\infty} \bigl(a_{n} \cos(n\pi t/L) + b_{n} \sin(n\pi t/L)\bigr).
\end{align*}
(The "special" factor of $1/2$ on the constant term arises because $C_{0} = 1/\sqrt{2} \neq 1$.)
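To see these formulas in action, one can evaluate the coefficients numerically for a sample function, say $f(t) = t^{2}$ on $[-L, L]$, and check that the truncated series reproduces $f$. The grid, truncation point, and helper names below are my own choices:

```python
import numpy as np

# Numerically evaluate a_0, a_n, b_n for f(t) = t^2 on [-L, L]
# and sum the truncated series; all numerical choices are illustrative.
L = 2.0
t = np.linspace(-L, L, 100001)
dt = t[1] - t[0]
f = t**2

def trap(y):
    # trapezoid rule on the fixed grid t
    return np.sum((y[:-1] + y[1:]) / 2) * dt

a0 = trap(f) / L
series = np.full_like(t, a0 / 2)
for n in range(1, 101):
    an = trap(f * np.cos(n * np.pi * t / L)) / L
    bn = trap(f * np.sin(n * np.pi * t / L)) / L
    series += an * np.cos(n * np.pi * t / L) + bn * np.sin(n * np.pi * t / L)

# the truncated series matches f(0) = 0 to within the truncation error
print(abs(series[len(t) // 2] - f[len(t) // 2]) < 1e-3)  # True
```

For this even function all the $b_{n}$ come out (numerically) zero, as the symmetry argument predicts.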
Best Answer
This answer is mostly for students who have seen a linear algebra approach. I don't know whether Fourier himself thought up the series this way, but it is a common presentation today. I left out a lot of steps and mainly show the ideas I struggled with when I first tried to motivate the Fourier series.
I'll start off by writing down a trigonometric polynomial:
$T(x) = c_0 + c_1 \cos(x) + c_2 \cos(2x)+\cdots+c_n \cos(n x) + d_1 \sin(x) + \cdots + d_n \sin(n x)$
where the $c_k$ and $d_k$ are real coefficients. The goal is to write down an orthogonal basis; from there I can find the coefficients. So I need to declare the inner product: $\langle\textbf{f},\textbf{g}\rangle = \int_0^{2\pi} f(x)g(x)dx$
The functions $1, \cos(x), \ldots, \cos(nx), \sin(x), \ldots, \sin(nx)$ are already orthogonal, so to get an orthonormal basis I only need to normalize each one (the normalization step of the Gram–Schmidt process), making $\|\textbf{g}_0\| = \|\textbf{g}_1\| =\|\textbf{g}_2\| = \cdots =\|\textbf{g}_{2n}\| = 1$.
For the constant function $\textbf{f} = 1$:
$\|\textbf{f}\|^2 = \langle\textbf{f},\textbf{f}\rangle = \int_0^{2\pi} 1\, dx = 2\pi$
Thus, $\|\textbf{f}\| = \sqrt{2\pi}$
and
$e_0 = \frac{\textbf{f}}{\|\textbf{f}\|}$
$\textbf{g}_0 = e_0 = \frac{1}{\sqrt{2\pi}}$
$\textbf{g}_1 = e_1 = \frac{1}{\sqrt{\pi}}\cos(x)$ ...
$\textbf{g}_n = e_n = \frac{1}{\sqrt{\pi}}\cos(nx)$...
$\textbf{g}_{n+1} = e_{n+1} = \frac{1}{\sqrt{\pi}}\sin(x)$...
$\textbf{g}_{2n} = e_{2n} = \frac{1}{\sqrt{\pi}}\sin(nx)$
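One can check numerically that these normalized functions have unit norm under the inner product above. The grid-based sketch and names below are my own:

```python
import numpy as np

# Check ||g|| = 1 under <f, g> = integral_0^{2pi} f(x) g(x) dx,
# using a trapezoid rule on a fine grid (setup is illustrative).
x = np.linspace(0, 2 * np.pi, 100001)
dx = x[1] - x[0]

def inner(f, g):
    y = f * g
    return np.sum((y[:-1] + y[1:]) / 2) * dx

g0 = np.full_like(x, 1 / np.sqrt(2 * np.pi))   # g_0
g1 = np.cos(x) / np.sqrt(np.pi)                # g_1
gs = np.sin(3 * x) / np.sqrt(np.pi)            # a sine basis element

print(round(inner(g0, g0), 6), round(inner(g1, g1), 6), round(inner(gs, gs), 6))
```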
The orthonormal basis yields:
$a_0 = \frac{2}{\sqrt{2\pi}}\langle\textbf{f},\textbf{g}_0\rangle$ (The factor of $2$ is what produces the customary $a_0/2$ in the series, so that $a_0$ obeys the same general formula as the other $a_n$.)
$a_1 = \frac{1}{\sqrt{\pi}}\langle\textbf{f},\textbf{g}_1\rangle$...
$a_n = \frac{1}{\sqrt{\pi}}\langle\textbf{f},\textbf{g}_n\rangle$...
$b_n = \frac{1}{\sqrt{\pi}}\langle\textbf{f},\textbf{g}_{2n}\rangle$
Now that everything is separated into sines and cosines, I can read off the coefficients:
$a_n = \frac{1}{\pi}\int_0^{2\pi} f(x)\cos(nx)dx $
and
$b_n = \frac{1}{\pi}\int_0^{2\pi} f(x)\sin(nx)dx $
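Applying these formulas to a concrete function, say $f(x) = x$ on $[0, 2\pi]$, the coefficients can be checked against the classical result $a_n = 0$ and $b_n = -2/n$ for $n \geq 1$ (the numerical setup below is my own):

```python
import numpy as np

# Evaluate a_n and b_n for f(x) = x on [0, 2*pi] with a trapezoid rule.
x = np.linspace(0, 2 * np.pi, 200001)
dx = x[1] - x[0]
f = x

def trap(y):
    return np.sum((y[:-1] + y[1:]) / 2) * dx

n = 4
an = trap(f * np.cos(n * x)) / np.pi  # expected: 0
bn = trap(f * np.sin(n * x)) / np.pi  # expected: -2/n = -0.5
print(round(abs(an), 6), round(bn, 6))  # -> 0.0 -0.5
```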