Can the modulus function be defined as the square root of the (standard) inner product of something with itself

absolute value, linear algebra, matrices, vectors

Recently, I have been thinking about the similarities between the modulus of a real number and that of a complex number. In both cases the modulus of the number gives us its "distance" from the origin. Since the concept of a complex number seems quite similar to that of a vector in $ℝ^2$, I then thought about the modulus of a vector $v$, which to me seems indistinguishable from its magnitude $||v||$, equal to the square root of the sum of the squares of its components (from the distance formula). Using this definition, we can generalise $v$ to be a vector in $ℝ^n$.

I soon realised that this modulus is the same as the square root of the dot-product of $v$ with itself.

Upon further research, I found that the dot product of two vectors is the "standard" inner product on $ℝ^n$, which leads me to the question:
Can the modulus of "something" be defined as the square root of that thing's standard inner product with itself?

This definition seems to make sense for the moduli of real numbers, vectors, even complex numbers (I have some doubts about the last one, as I'm not sure what the standard inner product of a complex number with itself would be). But it would also mean that the modulus of a matrix is equal to the square root of the sum of the squares of all its elements (because the standard inner product between matrices is defined), which is a result that I'm not quite able to connect to the idea of modulus being a "distance."
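To make the pattern concrete, here is a small sketch (the helper names `dot` and `modulus` are illustrative, not standard API) showing that the same $\sqrt{\langle v, v\rangle}$ recipe recovers the absolute value of a real number, the length of a vector, and the entrywise "modulus" of a matrix:

```python
import math

def dot(u, v):
    """Standard inner product on R^n: sum of componentwise products."""
    return sum(ui * vi for ui, vi in zip(u, v))

def modulus(v):
    """sqrt(<v, v>) -- the Euclidean norm, i.e. 'distance from the origin'."""
    return math.sqrt(dot(v, v))

# A real number is a vector in R^1: modulus([x]) == |x|.
assert modulus([-3.0]) == 3.0

# A vector in R^2 (compare with the distance formula).
assert modulus([3.0, 4.0]) == 5.0

# A matrix, flattened entrywise, lives in R^{nm}; its "modulus"
# (the Frobenius norm) is the sqrt of the sum of squared entries.
A = [[1.0, 2.0], [2.0, 4.0]]
flat = [x for row in A for x in row]
print(modulus(flat))  # sqrt(1 + 4 + 4 + 16) = 5.0
```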

Sorry for the long query, I realise that I am probably overthinking this, but any help in clearing this up will be vastly appreciated.

Thanks.

Best Answer

Inner products are a far more general concept than you realize. And norms (I'm sorry, but I'm not sure if I've ever heard them called "moduli" before) are more general yet.

In $\Bbb R^n$ the sum of the products of the components easily defines an inner product. This definition also includes your matrix inner product because the set of real $n\times m$ matrices is effectively just $\Bbb R^{nm}$. But, in general, vector spaces do not come with a "standard" inner product. For example the set of all functions from $\Bbb R \to \Bbb R$ is a vector space over the real numbers: the sum of two functions gives another function, and multiplying a function by a fixed real number also defines another function. But the "sum of products" definition doesn't work when you have to sum up uncountably many non-zero numbers.
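The claim that real $n\times m$ matrices are "effectively just $\Bbb R^{nm}$" can be checked directly: the entrywise matrix inner product agrees with the ordinary dot product of the flattened matrices. A quick sketch:

```python
# The entrywise matrix inner product <A, B> = sum_ij a_ij * b_ij ...
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
entrywise = sum(a * b for ra, rb in zip(A, B) for a, b in zip(ra, rb))

# ... equals the dot product of the matrices read off as vectors in R^4.
flat_A = [x for row in A for x in row]
flat_B = [x for row in B for x in row]
flattened = sum(a * b for a, b in zip(flat_A, flat_B))

print(entrywise)  # 1*5 + 2*6 + 3*7 + 4*8 = 70
assert entrywise == flattened
```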

There are other means of defining inner products. Another infinite dimensional vector space is the set $C[0,1]$ of continuous real-valued functions defined on the interval $[0,1]$. Once again, it isn't possible to define an inner product in the same sense as for finite dimensional vector spaces, because you would have to sum up infinitely many numbers. But there is an equivalent to summation that can be used here. We can define the inner product of two functions $f, g \in C[0,1]$ by $$\langle f, g \rangle = \int_0^1 f(t)g(t)\,dt$$ which works fine. But the point here is, $C[0,1]$ didn't just come with this inner product. We had to add it. And we could have chosen a different one: if $h(t) \in C[0,1]$ is positive everywhere, $\int_0^1f(t)g(t)h(t)\,dt$ also defines an inner product between all functions $f,g$.
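The integral inner product above can be approximated numerically; this sketch uses a midpoint Riemann sum (the helper `inner` and the parameter `n` are my own, not anything standard) to verify a case we can also integrate by hand, $\langle f, f\rangle = \int_0^1 t^2\,dt = 1/3$:

```python
import math

def inner(f, g, n=100000):
    """Midpoint-rule approximation of <f, g> = integral_0^1 f(t) g(t) dt."""
    h = 1.0 / n
    return sum(f((k + 0.5) * h) * g((k + 0.5) * h) for k in range(n)) * h

f = lambda t: t      # <f, f> = integral of t^2 = 1/3
g = lambda t: 1.0    # <g, g> = integral of 1   = 1

assert abs(inner(f, f) - 1 / 3) < 1e-6
assert abs(inner(g, g) - 1.0) < 1e-9

# The induced norm ||f|| = sqrt(<f, f>) = 1/sqrt(3).
print(math.sqrt(inner(f, f)))
```

Swapping in a positive weight $h(t)$ inside the sum gives the alternative inner product mentioned above.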

Inner products are not automatically part of vector spaces. We have to add them, and we have choices in doing so. However, once we do choose an inner product, you are correct that we automatically get a norm from it by defining $\|f\| = \sqrt{\langle f, f\rangle}$. But we don't need an inner product to get a norm. We can also just define one directly. It needs to have three properties:

  • if $v \ne \mathbf 0$, then $\|v\| > 0$
  • for any scalar $c$, $\|cv\| = |c|\|v\|$.
  • (triangle inequality): $\|u + v\| \le \|u\| + \|v\|$

Any operation $\|\cdot\|$ defined on any vector space with these properties is a norm, and we regularly define norms that do not correspond to any inner product. For instance, on $C[0,1]$, $\|f\|_\infty = \max\{|f(t)| : t \in [0,1]\}$ also defines a norm, which is distinct from the norm defined from the inner product given above. In fact, there is no inner product that has $\|f\|_\infty$ as its norm. We know this because $\|f\|_\infty$ does not satisfy the parallelogram law, while all inner product norms do.

Finally, when defining inner products on vector spaces over the complex numbers instead of the real numbers, we require them to be conjugate-symmetric instead of just being symmetric: A complex inner product must satisfy

  • $\langle u,v + w\rangle = \langle u,v\rangle + \langle u,w\rangle$
  • for $c \in \Bbb C, \langle u, cv\rangle = c\langle u,v\rangle$
  • $\langle u,v\rangle = \overline{\langle v,u\rangle}$
  • $\langle v,v\rangle > 0$ for all $v \ne \mathbf 0$

From this it also follows that $\langle u + w,v\rangle = \langle u,v\rangle + \langle w,v\rangle$, and that $\langle cu,v\rangle = \overline c \langle u,v\rangle$. Note that conjugate symmetry already forces $\langle v,v\rangle$ to be real, but its positivity still has to be imposed as a separate condition.
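This also answers the question's doubt about complex numbers: the standard inner product on $\Bbb C^n$, written to be linear in the *second* argument to match the axioms above, is $\langle u,v\rangle = \sum_k \overline{u_k}\,v_k$. A sketch (the helper `cdot` is an illustrative name) checking the key properties:

```python
def cdot(u, v):
    """Standard inner product on C^n: sum of conj(u_k) * v_k."""
    return sum(uk.conjugate() * vk for uk, vk in zip(u, v))

u = [1 + 2j, 3 - 1j]
v = [2 - 1j, 1j]

# Conjugate symmetry: <u, v> = conj(<v, u>).
assert cdot(u, v) == cdot(v, u).conjugate()

# <v, v> is real and positive for v != 0, so sqrt(<v, v>) is a modulus.
vv = cdot(v, v)
assert vv.imag == 0 and vv.real > 0
print(vv.real ** 0.5)  # sqrt(|2-i|^2 + |i|^2) = sqrt(6)
```

For a single complex number $z$ (i.e. $n = 1$) this gives $\sqrt{\bar z z} = |z|$, the usual complex modulus.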