[Math] Does the phrase “orthogonal” mean the same thing when used in the terms “orthogonal function” and “orthogonal vector”

fourier-series, functions, orthogonality

I was reading about Fourier series when I came across the term "orthogonal" in relation to functions.

http://tutorial.math.lamar.edu/Classes/DE/PeriodicOrthogonal.aspx#BVPFourier_Orthog_Ex2

I've never heard of this. The idea that two vectors are orthogonal makes sense to me because I can imagine, for instance, $\vec{a}=(1,0)$ and $\vec{b}=(0,1)$, such that $\vec{a} \cdot \vec{b} = (1)(0) + (0)(1) = 0$.

But no simple picture comes to mind for functions. Wikipedia wasn't very helpful for me.

http://en.m.wikipedia.org/wiki/Orthogonal_functions

Can someone explain what this concept is and give a simple example?

Remark:

My intuition says maybe intersecting lines would be an example of two orthogonal functions.

$f(x) = x$ $g(x) = -x$

But that's just a shot in the dark, and I don't think it makes sense because the integral is just $\int -x^2\,dx = -\frac{x^{3}}{3} + C$, which isn't zero.

Best Answer

The concept that connects the two notions of orthogonality is an inner product. I'll explain what an inner product is, what it means for orthogonality, and how this more-abstract version of orthogonality relates to the one you're familiar with.

To give a technically correct definition of an inner product I would need to define a vector space, but for this problem that might be overkill. Roughly speaking, a vector space is some collection whose elements can be added, subtracted, and multiplied by real numbers. The set of real-valued functions on $\mathbb R$, for instance, is a vector space, since we know how to add, subtract, and scale real-valued functions.

Note: from now on, when I say vector I'll mean an element of a vector space. So a function is a vector in the vector space of real-valued functions.

An inner product takes in two vectors and returns a scalar; think of it as a way to multiply two vectors and get a scalar out. In addition, it must satisfy the following axioms.

  1. $\langle ax_1+bx_2,y\rangle = a\langle x_1,y\rangle + b\langle x_2,y\rangle$ (linear in first variable [by the next axiom, also linear in the second variable])
  2. $\langle x,y\rangle=\langle y,x\rangle$ (symmetric)
  3. $\langle x,x\rangle \geq 0$, with $\langle x,x\rangle = 0$ if and only if $x=0$ (positive-definite)
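As a concrete sketch (my addition, not part of the original answer), these axioms can be spot-checked for the dot product on a few sample vectors:

```python
import random

def dot(u, v):
    """Dot product of two vectors given as equal-length lists."""
    return sum(ui * vi for ui, vi in zip(u, v))

random.seed(0)
x1 = [random.randint(-5, 5) for _ in range(3)]
x2 = [random.randint(-5, 5) for _ in range(3)]
y  = [random.randint(-5, 5) for _ in range(3)]
a, b = 2, -3

# Axiom 1: linearity in the first variable
lhs = dot([a * u + b * v for u, v in zip(x1, x2)], y)
rhs = a * dot(x1, y) + b * dot(x2, y)
assert lhs == rhs

# Axiom 2: symmetry
assert dot(x1, y) == dot(y, x1)

# Axiom 3: positive-definiteness
assert dot(x1, x1) > 0 or all(c == 0 for c in x1)
assert dot([0, 0, 0], [0, 0, 0]) == 0
```

Of course, checking a few random vectors is not a proof, but it shows what each axiom is asserting.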

You are already familiar with one type of inner product: the dot product, which is an inner product on the vector space of row vectors. We say that two row vectors $\vec a$ and $\vec b$ are orthogonal if $\vec a\cdot\vec b=0$. This suggests a way to make sense of orthogonality in general vector spaces:

Two vectors $v$ and $w$ are orthogonal if $\langle v,w\rangle=0$.
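For instance, the vectors from the question satisfy this definition under the dot product (a one-line check, included only as illustration):

```python
# The vectors a = (1, 0) and b = (0, 1) from the question.
a = (1, 0)
b = (0, 1)

# Their inner product (here, the dot product):
result = sum(ai * bi for ai, bi in zip(a, b))
print(result)  # 0, so a and b are orthogonal
```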

Fourier series start to enter the picture when we look at the vector space of continuous, real-valued functions defined on $[-\pi,\pi]$, say. We can make an inner product on this space by defining $$ \langle f,g\rangle = \int_{-\pi}^{\pi}f(x)g(x)\,dx. $$ (Do you believe that this is an inner product?) Under this inner product, $\sin(x)$ and $\cos(x)$ are orthogonal functions. All I mean is that $$ \int_{-\pi}^\pi \cos x\sin x\,dx = 0, $$ which is true because $\cos x\sin x = \frac12 \sin 2x$, an odd function, and odd functions integrate to zero over symmetric intervals.

It is possible to show more generally that $$ \int_{-\pi}^\pi \cos(nx)\cos(mx)\,dx = \int_{-\pi}^\pi \sin(nx)\sin(mx)\,dx = 0 $$ if $n\neq m$, and $$ \int_{-\pi}^\pi \sin(nx)\cos(mx)\,dx = 0 $$ always. ($m$ and $n$ are integers.)

Here's another example: the functions $1$ and $x$ are orthogonal. So are the functions $x^n$ and $x^m$ whenever $n+m$ is odd (and $n$ and $m$ are nonnegative). Why?
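All of the integrals above can be verified symbolically. Here is a sketch using sympy (assuming it is installed; the particular choices $n=3$, $m=5$ are just illustrative):

```python
import sympy as sp

x = sp.symbols('x')
n, m = 3, 5  # any distinct integers work; these are illustrative choices

def inner(f, g):
    """The inner product <f, g> = integral of f*g over [-pi, pi]."""
    return sp.integrate(f * g, (x, -sp.pi, sp.pi))

print(inner(sp.sin(x), sp.cos(x)))      # 0: sin and cos are orthogonal
print(inner(sp.cos(n*x), sp.cos(m*x)))  # 0, since n != m
print(inner(sp.sin(n*x), sp.sin(m*x)))  # 0, since n != m
print(inner(sp.sin(n*x), sp.cos(m*x)))  # 0, always
print(inner(sp.S(1), x))                # 0: the functions 1 and x are orthogonal
print(inner(x**2, x**3))                # 0, since 2 + 3 is odd
```

Every line prints 0, matching the claims in the answer; the last two hint at the answer to the closing "Why?" (the product $x^n x^m$ is an odd function when $n+m$ is odd).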