As for the utility of inner product spaces: They're vector spaces where notions like the length of a vector and the angle between two vectors are available. In this way, they generalize $\mathbb R^n$ while preserving the extra geometric structure that $\mathbb R^n$ carries beyond being a vector space. Familiar friends like Cauchy-Schwarz, the parallelogram rule, and orthogonality all work in inner product spaces.
(Note that there is a more general class of spaces, normed spaces, where a notion of length is always available but an inner product cannot necessarily be defined.)
The dot product is the standard inner product on $\mathbb R^n$. In general, any Hermitian, positive definite matrix gives you an inner product on $\mathbb C^n$ (and any symmetric, positive definite matrix gives one on $\mathbb R^n$). And you can have inner products on infinite dimensional vector spaces, like
$$ \langle \, f, \, g \, \rangle = \int_a^b \ f(x)\overline{g(x)} \, dx$$
for $f, g$ square-integrable functions on $[a,b]$.
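As a concrete illustration of the matrix-induced case, here is a short numerical sketch in Python with NumPy; the matrix $A$ and the convention $\langle x, y\rangle = y^* A x$ are my own choices for illustration:

```python
import numpy as np

# A Hermitian, positive definite matrix A defines an inner product
# <x, y> = y^* A x on C^n.  This particular A is a hypothetical example:
# it is Hermitian with trace 5 and determinant 5, so both eigenvalues
# are positive.
A = np.array([[2.0, 1j],
              [-1j, 3.0]])

def inner(x, y, A):
    """Inner product induced by the Hermitian positive definite matrix A."""
    return np.conj(y) @ (A @ x)

x = np.array([1.0, 1j])
y = np.array([0.5, 2.0])

# Conjugate symmetry: <x, y> = conj(<y, x>)
assert np.isclose(inner(x, y, A), np.conj(inner(y, x, A)))
# Positivity: <x, x> is real and positive for x != 0
assert inner(x, x, A).real > 0 and abs(inner(x, x, A).imag) < 1e-12
```

With $A$ the identity matrix, this reduces to the standard inner product on $\mathbb C^n$.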
This becomes useful, for example, in applications like Fourier series where you want a basis of orthonormal functions for some function space (it's not just the trigonometric functions that work).
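To see the orthonormality behind Fourier series concretely, one can check numerically that a few trigonometric functions are orthonormal on $[-\pi,\pi]$ under the integral inner product above. A rough sketch (the grid size and tolerances are arbitrary choices):

```python
import numpy as np

# Check orthonormality of sin(nx)/sqrt(pi) and cos(nx)/sqrt(pi) on
# [-pi, pi] under <f, g> = integral of f(x)g(x) dx (real-valued case),
# approximating the integral by a Riemann sum on a fine grid.
x = np.linspace(-np.pi, np.pi, 200001)
dx = x[1] - x[0]

def ip(f, g):
    """Approximate L^2 inner product on [-pi, pi]."""
    return np.sum(f(x) * g(x)) * dx

s1 = lambda t: np.sin(t) / np.sqrt(np.pi)
s2 = lambda t: np.sin(2 * t) / np.sqrt(np.pi)
c1 = lambda t: np.cos(t) / np.sqrt(np.pi)

assert abs(ip(s1, s1) - 1) < 1e-6   # unit norm
assert abs(ip(s1, s2)) < 1e-6       # orthogonal
assert abs(ip(s1, c1)) < 1e-6       # orthogonal
```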
For finite dimensional spaces, the answer is "yes"; this is a consequence of the Gram-Schmidt orthonormalization process: every finite dimensional inner product space over $\mathbb{R}$ or over $\mathbb{C}$ has an orthonormal basis.
Now let $\mathbf{V}$ be an inner product space, and let $\mathbf{v}_1,\ldots,\mathbf{v}_n$ be an orthonormal basis. Then $T\colon\mathbf{V}\to \mathbf{F}^n$ given by $T\mathbf{v}_i = \mathbf{e}_i$ (i.e., $T$ maps each vector in $\mathbf{V}$ to its coordinate vector relative to the orthonormal basis $\mathbf{v}_1,\ldots,\mathbf{v}_n$) is an invertible linear transformation such that for all $\mathbf{x},\mathbf{y}\in\mathbf{V}$, $\langle \mathbf{x},\mathbf{y}\rangle = T\mathbf{x}\cdot T\mathbf{y}$, where the right hand side is the standard dot product on $\mathbf{F}^n$ ($\mathbf{F}=\mathbb{R}$ or $\mathbb{C}$).
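A minimal sketch of the Gram-Schmidt process on $\mathbb{R}^3$ with the standard dot product (the starting basis is a hypothetical example):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors in R^n
    with respect to the standard dot product (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        # Subtract the projections onto the vectors found so far...
        w = v - sum(np.dot(v, b) * b for b in basis)
        # ...then normalize.
        basis.append(w / np.linalg.norm(w))
    return basis

# An arbitrary (non-orthogonal) basis of R^3, chosen for illustration
v1, v2, v3 = map(np.array, ([1.0, 1, 0], [1.0, 0, 1], [0.0, 1, 1]))
e1, e2, e3 = gram_schmidt([v1, v2, v3])

# The output is orthonormal: pairwise dot products vanish, norms are 1
assert abs(np.dot(e1, e2)) < 1e-12 and abs(np.dot(e1, e3)) < 1e-12
assert abs(np.linalg.norm(e3) - 1) < 1e-12
```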
Inner products arise in a variety of areas of mathematics. In particular, vector spaces with an inner product defined on them are called $\textit{inner product spaces}$, and these are very important when studying functional analysis. For vectors in $\mathbb{R}^{n}$, for example, an inner product can be defined as follows: if $x=(x_{1},\ldots,x_{n}),y=(y_{1},\ldots,y_{n})\in \mathbb{R}^{n}$, we set $$\langle x,y\rangle := \sum_{i=1}^{n}x_{i}y_{i}$$ Note that this is not the only possible inner product on $\mathbb{R}^{n}$: a vector space may carry more than one inner product.

There are a lot of "good properties" that inner product spaces share, and one of the most important is orthogonality. This is used constantly, not just in mathematics but in physics and other applied sciences. Another very important (and interesting) property is that once you have an inner product on a vector space, you can readily define a $\textit{norm}$ from it. For instance, we can define a norm $||\cdot||$ on $\mathbb{R}^{n}$ by setting $$||x|| := \sqrt{\langle x,x\rangle}= \sqrt{\sum_{i=1}^{n}x_{i}^{2}}$$ This definition works in general, even if we are dealing with more abstract spaces. Vector spaces which have a norm defined on them are called $\textit{normed spaces}$. These spaces are studied in functional analysis courses as well. It is worth mentioning that Hilbert and Banach spaces (you may also have heard about these) are the complete inner product spaces and the complete normed spaces, respectively.
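A quick numerical check of the induced norm, together with the Cauchy-Schwarz inequality that comes with it, on $\mathbb{R}^2$; the vectors are arbitrary examples:

```python
import numpy as np

# The standard inner product <x, y> = sum of x_i * y_i induces the
# Euclidean norm ||x|| = sqrt(<x, x>) on R^n.
def ip(x, y):
    return float(np.dot(x, y))

def norm(x):
    return ip(x, x) ** 0.5

x = np.array([3.0, 4.0])
y = np.array([1.0, 2.0])

assert norm(x) == 5.0  # sqrt(9 + 16)
# Cauchy-Schwarz: |<x, y>| <= ||x|| * ||y||
assert abs(ip(x, y)) <= norm(x) * norm(y)
```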
As an application, I can say that inner products (and Hilbert spaces, in particular) are very important in quantum mechanics, for example. In short, this is because perfectly distinguishable states of a system correspond to orthogonal vectors, so the notion of orthogonality plays an important role in the theory.