What are the bases of a function space (Hilbert space)

Tags: fourier-analysis, functional-analysis, hilbert-spaces, linear-algebra, quantum-mechanics

1. Motivation

I have been learning about Hilbert spaces and function spaces, i.e., roughly speaking, infinite-dimensional vector spaces.

Let's now think about ordinary 3D Euclidean vectors. A vector $\vec{x}$ may be given by

$$ \newcommand\mycolv[1]{\begin{bmatrix}#1\end{bmatrix}} \vec{x} = \mycolv{1\\3\\2}$$

and this is equivalent to saying that $\vec{x} = 1 \hat{i} + 3 \hat{j} + 2 \hat{k}$. So even if we think of this vector as an ordered triple, $(1, 3, 2)$, the underlying mathematical structure is that a vector can be expressed as a linear combination of basis vectors.

2. My understanding of function spaces

As someone studying engineering who hasn't been exposed to rigorous mathematical proofs in linear algebra, I understood, through a rather intuitive approach, why function spaces are infinite-dimensional vector spaces. I considered the inner product defined, for example, for functions $\phi: \mathbb{R} \rightarrow \mathbb{C}$ and $\psi: \mathbb{R} \rightarrow \mathbb{C}$:

$$\langle \phi | \psi \rangle = \int \phi^*(x) \psi(x) \ dx$$

Intuitively speaking, this adds up all the differentials $\phi^*(x) \psi(x) \ dx$, which is analogous to the scalar product in Euclidean space. So I thought that the function values corresponding to the individual (though infinitely many) elements of the domain should be the components of the vectors $\phi$ and $\psi$, and thus that the function space should be infinite-dimensional. That is, something like:

$$ \newcommand\mycolv[1]{\begin{bmatrix}#1\end{bmatrix}} |\psi(x) \rangle = \mycolv{…\\\psi(a-2\epsilon) \\ \psi(a-\epsilon) \\\psi(a)\\\psi(a + \epsilon) \\ \psi(a+2 \epsilon) \\ …}$$

for some $a \in \mathbb{R}$.
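This picture can be made concrete numerically: sampling the functions on a fine grid turns them into long (but finite) column vectors, and the integral inner product becomes an ordinary dot product weighted by the grid spacing. A minimal sketch, with the grid, spacing, and example functions chosen purely for illustration:

```python
import numpy as np

# Sample two functions on a grid of spacing eps, treating the samples
# as components of (very long) column vectors, as in the question.
eps = 1e-4
x = np.arange(-10.0, 10.0, eps)

phi = np.exp(-x**2)        # phi(x) = e^{-x^2}
psi = x * np.exp(-x**2)    # psi(x) = x e^{-x^2}

# <phi|psi> = integral of phi*(x) psi(x) dx, approximated by the
# Riemann sum: a dot product of the sample vectors, weighted by eps.
inner = np.sum(np.conj(phi) * psi) * eps
print(inner)  # near 0, since the integrand is an odd function
```

Refining the grid (shrinking `eps`) makes the dot product converge to the integral, which is exactly the sense in which the samples behave like vector components.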

3. Question

So far, so good. But there should then be an infinite number of linearly independent basis vectors making up the actual infinite-dimensional column vector $|\psi(x)\rangle$. Just as $\hat{i}, \hat{j}, \hat{k}$ correspond respectively to the coefficients $1, 3, 2$, there should be basis vectors corresponding to each of $\psi(a), \psi(a+\epsilon), \psi(a-\epsilon)$, and so on. What are they?

Best Answer

Your intuition that a function space is an infinite-dimensional vector space, with each point in the domain corresponding to a coordinate, is correct.

The interesting function spaces come with a norm. With respect to that norm, a basis is a set of vectors such that every vector in the space is the limit of a unique infinite sum of scalar multiples of basis elements; think of Fourier series.
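The Fourier-series example can be sketched numerically. Below is an illustrative toy computation (the choice of $f(x) = x$ on $[-\pi, \pi]$ and the orthonormal sine basis $e_n(x) = \sin(nx)/\sqrt{\pi}$ is an assumption for demonstration): projecting $f$ onto each basis element gives coefficients, and the partial sums approach $f$ in the norm, which is the "limit of an infinite sum" in the answer.

```python
import numpy as np

# Expand f(x) = x on [-pi, pi] in the orthonormal basis
# e_n(x) = sin(n x)/sqrt(pi), n = 1, 2, ...
N = 2000
x = np.linspace(-np.pi, np.pi, N, endpoint=False)
dx = x[1] - x[0]
f = x.copy()

def coeff(n):
    # c_n = <e_n | f>, approximated by a Riemann sum
    e_n = np.sin(n * x) / np.sqrt(np.pi)
    return np.sum(e_n * f) * dx

# Partial sums sum_{n<=M} c_n e_n(x) converge to f in the L^2 norm.
errs = []
for M in (1, 5, 50):
    approx = sum(coeff(n) * np.sin(n * x) / np.sqrt(np.pi)
                 for n in range(1, M + 1))
    errs.append(np.sqrt(np.sum((f - approx) ** 2) * dx))
print(errs)  # the L^2 error shrinks as more basis vectors are used
```

No finite partial sum equals $f$ exactly; only the limit does, which is precisely the difference between this analytic notion of basis and the finite-linear-combination notion from elementary linear algebra.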

The uniqueness captures the linear independence.

These vector spaces also have infinite bases such that every element is a finite linear combination of basis vectors, but those bases are not useful in analysis. See https://en.wikipedia.org/wiki/Basis_(linear_algebra)#Analysis