Direct sum of even and odd functions makes up the whole vector space

Tags: vector-spaces, vectors

Let $V$ be the $\mathbb{R}$-vector space of functions $\mathbb{R}\to\mathbb{R}$. How would you show that, for $V_1$ the set of even functions and $V_2$ the set of odd functions, the following holds: $$V = V_1 \oplus V_2 $$

I know that $V_1$ and $V_2$ are subspaces, and I understand why the only function that is both even and odd is the zero vector.

I also came across a post with a similar question here, but I don't fully understand the provided answer.

Best Answer

This is a very general phenomenon which has nothing to do specifically with even or odd functions. Let $V$ be a (say, real or complex) vector space and let $T\colon V\to V$ be a linear operator with $T^2={\rm Id}_V$ but $T \neq \pm{\rm Id}_V$. Then the minimal polynomial of $T$ equals $p(t) = t^2-1$. Since this polynomial splits into distinct linear factors $(t-1)(t+1)$, the operator $T$ is necessarily diagonalizable, with eigenvalues $1$ and $-1$, and $V = V_+ \oplus V_-$, where $V_+$ is the eigenspace associated to $1$ and $V_-$ is the eigenspace associated to $-1$.

What's more, we can say explicitly what the decomposition of any $v\in V$ relative to this direct sum is: $$v = \frac{v+Tv}{2} + \frac{v-Tv}{2}.$$

If one cannot guess the formula above, derive it systematically: write $v= v_+ + v_-$ with $Tv_+ = v_+$ and $Tv_- = -v_-$. Then $$\begin{cases} v=v_++v_- \\ Tv = v_+ - v_-\end{cases} \implies v_+ =\frac{v+Tv}{2}\quad\mbox{and}\quad v_-=\frac{v-Tv}{2}.$$
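One can also verify directly, using $T^2 = {\rm Id}_V$, that the two summands really do lie in the claimed eigenspaces: $$T\left(\frac{v+Tv}{2}\right) = \frac{Tv+T^2v}{2} = \frac{v+Tv}{2}, \qquad T\left(\frac{v-Tv}{2}\right) = \frac{Tv-T^2v}{2} = -\,\frac{v-Tv}{2}.$$Uniqueness of the decomposition follows because $V_+\cap V_- = \{0\}$: if $Tv = v$ and $Tv = -v$ simultaneously, then $v = -v$, so $v = 0$.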

Examples:

  1. $V$ is the vector space of all functions $W \to \mathbb{R}$, where $W$ is any non-trivial vector space (in particular, we may take $W=\mathbb{R}$), and let $T\colon V\to V$ be defined as $T(f)(x) = f(-x)$. This fits the bill and one may write $$f(x) = \frac{f(x)+f(-x)}{2} + \frac{f(x)-f(-x)}{2},$$for all $f\in V$ and $x\in W$. So any function $W\to \mathbb{R}$ is uniquely expressed as the sum of an even function with an odd function. This works for complex-valued functions too.

  2. $V$ is the space of all real $n\times n$ matrices, and $T\colon V\to V$ is the transposition $T(A) = A^T$. Then $$A = \frac{A+A^T}{2} + \frac{A-A^T}{2}$$says that every real $n\times n$ matrix can be uniquely expressed as the sum of a symmetric matrix with a skew-symmetric matrix.

  3. $V$ is the space of all complex $n\times n$ matrices (regarded as a real vector space), and $T\colon V\to V$ is the conjugate-transposition $T(A) = A^\dagger = \overline{A^T}$ (warning: this is $\mathbb{R}$-linear but not $\mathbb{C}$-linear). Then $$A = \frac{A+A^\dagger}{2} + \frac{A-A^\dagger}{2}$$says that every complex $n\times n$ matrix can be uniquely expressed as the sum of a hermitian matrix with a skew-hermitian matrix.
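For a concrete sanity check, here is a small numerical sketch (not part of the original answer) that verifies all three decompositions with NumPy; the variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Example 2: real square matrix = symmetric part + skew-symmetric part,
# i.e. the eigenspace decomposition for T(A) = A^T.
A = rng.standard_normal((4, 4))
sym = (A + A.T) / 2
skew = (A - A.T) / 2
assert np.allclose(A, sym + skew)           # the parts sum back to A
assert np.allclose(sym, sym.T)              # eigenvalue +1: symmetric
assert np.allclose(skew, -skew.T)           # eigenvalue -1: skew-symmetric

# Example 3: complex matrix = hermitian part + skew-hermitian part,
# for the (R-linear) operator T(A) = conjugate-transpose of A.
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
herm = (B + B.conj().T) / 2
skew_h = (B - B.conj().T) / 2
assert np.allclose(B, herm + skew_h)
assert np.allclose(herm, herm.conj().T)
assert np.allclose(skew_h, -skew_h.conj().T)

# Example 1: even/odd parts of a function, sampled on a symmetric grid.
# For f(x) = exp(x), the even part is cosh(x) and the odd part is sinh(x).
f = lambda x: np.exp(x)
x = np.linspace(-2.0, 2.0, 9)
even = (f(x) + f(-x)) / 2
odd = (f(x) - f(-x)) / 2
assert np.allclose(even, np.cosh(x))
assert np.allclose(odd, np.sinh(x))
assert np.allclose(f(x), even + odd)
```

In each case the two projections $(v \pm Tv)/2$ are computed exactly as in the general formula, with $T$ instantiated as transposition, conjugate-transposition, and the reflection $f(x)\mapsto f(-x)$ respectively.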
