You can't 'subtract' the spaces; that is, $A\oplus B=A\oplus C$ does not imply $B=C$.
Concrete example:
Choose $V=\mathbb{R}^2$. Take $A = \operatorname{sp}\{e_1\}$, $B = \operatorname{sp}\{e_2\}$, $C = \operatorname{sp}\{e_1+e_2\}$. Then $V=A\oplus B=A\oplus C$, but $B \neq C$. Take $V$ itself as the two-dimensional subspace in question.
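This counterexample is easy to check numerically. The sketch below (using numpy, with a hypothetical helper `is_direct_sum_of_R2`) verifies that for one-dimensional subspaces of $\mathbb{R}^2$, the direct-sum condition is just linear independence of the spanning vectors:

```python
import numpy as np

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
a, b, c = e1, e2, e1 + e2   # A = sp{a}, B = sp{b}, C = sp{c}

def is_direct_sum_of_R2(u, v):
    # For A = sp{u}, B = sp{v} inside R^2:
    # A + B = R^2 and A ∩ B = 0  iff  {u, v} is linearly independent,
    # i.e. the 2x2 matrix [u v] has rank 2.
    return np.linalg.matrix_rank(np.column_stack([u, v])) == 2

print(is_direct_sum_of_R2(a, b))  # True: R^2 = A ⊕ B
print(is_direct_sum_of_R2(a, c))  # True: R^2 = A ⊕ C
# Yet B ≠ C: b and c are linearly independent, so sp{b} ≠ sp{c}
print(np.linalg.matrix_rank(np.column_stack([b, c])) == 2)  # True
```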
More generally:
Choose $a,b$ to be non-zero elements of $A,B$ respectively. Let $S = \operatorname{sp}\{a,b\}$.
Then $S$ is a 2-dimensional subspace of $V$ (note that $a$ and $b$ are linearly independent, since $A \cap B = 0$), and $A \cap S = \operatorname{sp}\{a\}$, $B \cap S = \operatorname{sp}\{b\}$. It remains to show that $S \cap C$ is 1-dimensional.
First, $S \cap C$ cannot be 2-dimensional, since if it were, we would have $a \in C$, which would contradict $V=A\oplus C$.
Finally, since $V=A\oplus C$, we can write $a+b = \lambda a + \mu c$ for some scalars $\lambda,\mu$ and some non-zero $c \in C$. Here $\mu \neq 0$: otherwise $b = (\lambda-1)a \in A \cap B = 0$, contradicting $b \neq 0$. Then $c = \frac{1}{\mu}((1-\lambda)a+b)$, which is a non-zero element of $S$ (its $b$-coefficient $\frac{1}{\mu}$ is non-zero), so $c \in S \cap C$. Hence $S \cap C$ is 1-dimensional.
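The last step can be traced numerically in the concrete example above ($a = e_1$, $b = e_2$, $C = \operatorname{sp}\{e_1+e_2\}$): decompose $a+b$ along $A \oplus C$, read off $\lambda, \mu$, and confirm the resulting $c$ lies in $S \cap C$. A minimal sketch:

```python
import numpy as np

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
c_dir = a + b                      # spanning vector of C

# Solve a + b = lam * a + mu * c_dir for (lam, mu)
lam, mu = np.linalg.solve(np.column_stack([a, c_dir]), a + b)
assert abs(mu) > 1e-12             # mu != 0, as the argument requires

# c = (1/mu)((1 - lam) a + b) lies in S = sp{a, b} by construction;
# it also lies in C: rank 1 below means c is a nonzero multiple of c_dir.
c = ((1 - lam) * a + b) / mu
print(np.linalg.matrix_rank(np.column_stack([c, c_dir])))  # 1
```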
The notation $U \oplus W$ is somewhat overloaded.
When $U$ and $W$ are subspaces of $V$, $V=U \oplus W$ means that $V=U+W$ and that $U\cap W=0$. We say that $V$ is the internal direct sum of $U$ and $W$.
When $W=U$, you cannot have $V=U \oplus W$ because $U\cap W=U$, unless $U=W=0$.
The other meaning of $U \oplus W$ is a new vector space built from $U$ and $W$. As a set it is the Cartesian product $U \times W$, and indeed $U \times W = U' \oplus W'$ as an internal direct sum, where $U'=U \times 0$ and $W' = 0 \times W$. We say that $U \oplus W$ is the external direct sum of $U$ and $W$.
So $V\oplus V\oplus \cdots \oplus V$ is to be read as an external direct sum, in which case it is better expressed as the Cartesian product $V\times V\times \cdots \times V$.
The dimension of $V\times V\times \cdots \times V$ (with $n$ factors) is $n \dim V$: if $B$ is a basis for $V$, then $(B \times \{0\} \times \cdots \times \{0\}) \cup (\{0\} \times B \times \{0\} \times \cdots \times \{0\}) \cup \cdots \cup (\{0\} \times \{0\} \times \cdots \times B)$ is a basis for $V\times V\times \cdots \times V$.
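This basis construction is easy to carry out explicitly. The sketch below (the helper name `product_basis` is my own) builds the basis of $V \times \cdots \times V$ from a basis of $V$ by placing each basis vector in one factor and $0$ in the others, then confirms there are $n \dim V$ independent vectors:

```python
import numpy as np

def product_basis(B, n):
    # B: basis of V as a list of 1-D arrays; returns a basis of
    # V x ... x V (n copies) as vectors of length n * dim V.
    d = len(B[0])
    basis = []
    for i in range(n):
        for v in B:
            w = np.zeros(n * d)
            w[i * d:(i + 1) * d] = v   # v in the i-th factor, 0 elsewhere
            basis.append(w)
    return basis

B = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]   # a basis of V = R^2
prod = product_basis(B, 3)
print(len(prod))                                    # 6 = 3 * dim V
print(np.linalg.matrix_rank(np.stack(prod)))        # 6: independent
```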
Suppose you are able to write $f$ as the sum of an even function $g$ and an odd function $h$; then, for every $t\in\mathbb{R}$, \begin{align} f(t)&=g(t)+h(t)\\ f(-t)&=g(-t)+h(-t)=g(t)-h(t) \end{align} Can you go on?
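(Solving the displayed pair of equations for $g$ and $h$ gives $g(t)=\frac{f(t)+f(-t)}{2}$ and $h(t)=\frac{f(t)-f(-t)}{2}$.) A quick numerical sanity check of this decomposition, with the arbitrary choice $f(t)=e^t$:

```python
import numpy as np

f = np.exp
g = lambda t: (f(t) + f(-t)) / 2    # even part (here: cosh)
h = lambda t: (f(t) - f(-t)) / 2    # odd part (here: sinh)

t = np.linspace(-3.0, 3.0, 101)
print(np.allclose(f(t), g(t) + h(t)))   # True: f = g + h
print(np.allclose(g(-t), g(t)))         # True: g is even
print(np.allclose(h(-t), -h(t)))        # True: h is odd
```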