Proof that all bases have the same number of elements

linear algebra

I know that the site already has proofs of this statement, but I want to check my own proof.

$\textbf{Lemma 1}$
Uniqueness of the zero vector.

$\textbf{Proof}$: Let $\overline{\theta_1}$ and $\overline{\theta_2}$ both be zero vectors. Then $\overline{\theta_1} = \overline{\theta_1} + \overline{\theta_2} = \overline{\theta_2} + \overline{\theta_1} = \overline{\theta_2}$

$\textbf{Lemma 2}$
Every vector has a unique inverse vector.

$\textbf{Proof}$: $-\overline{x_1} = -\overline{x_1} + \overline{\theta} = -\overline{x_1} + (\overline{x} - \overline{x_2}) = -\overline{x_1} + \overline{x} - \overline{x_2} = \overline{\theta} - \overline{x_2} = -\overline{x_2}$

$\textbf{Lemma 3}$
The product of any vector by the neutral element of the additive group of the field is the zero vector.

$\textbf{Proof}$: $0 \cdot \overline{x} = 0 \cdot \overline{x} + \overline{\theta} = 0 \cdot \overline{x} + (\overline{x} - \overline{x}) = \overline{x}(0 + 1) - \overline{x} = \overline{x} - \overline{x} = \overline{\theta}$

$\textbf{Lemma 4}$:
The product of the zero vector by any field element is the zero vector.

$\textbf{Proof}$: $\alpha\cdot\overline{\theta} = \alpha\cdot(0\cdot\overline{x}) = (\alpha \cdot 0)\cdot \overline{x} = 0\cdot \overline{x}$

$\textbf{Lemma 5}$:
If $\lambda \neq 0$ and $\overline{x} \neq \overline{\theta}$, then $\lambda\cdot\overline{x}\neq\overline{\theta}$.

$\textbf{Proof}$: Suppose it is not true, i.e. $\lambda\cdot\overline{x} = \overline{\theta}$ for some $\lambda\neq 0$ and $\overline{x}\neq\overline{\theta}$. Then $\overline{x} = 1\cdot\overline{x} = (\frac{1}{\lambda}\cdot\lambda)\overline{x} = \frac{1}{\lambda} \cdot (\lambda\cdot \overline{x}) = \frac{1}{\lambda}\cdot \overline{\theta} = \overline{\theta}$, a contradiction.

$\textbf{Lemma 6}$:
In any set of linearly independent vectors, there is no zero vector.

$\textbf{Proof}$: Suppose $\sum_{i=1}^{n}\lambda_i\overline{x_i} = \overline{\theta}$ and some $\overline{x_j}\in \{\overline{x_i}\}$ satisfies $\overline{x_j} = \overline{\theta}$. Then $\lambda_j$ can be nonzero while the sum remains $\overline{\theta}$, so $\{\overline{x_i}\}$ is linearly dependent.
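As a numeric sanity check of Lemma 6 (an illustration, not part of the proof — the vectors and coefficients below are arbitrary examples): a family containing the zero vector admits a nontrivial linear combination equal to the zero vector.

```python
import numpy as np

# A family of two vectors in R^2 that contains the zero vector.
x1 = np.array([1.0, 2.0])
x2 = np.array([0.0, 0.0])      # the zero vector belongs to the set

# Put a nonzero coefficient on the zero vector and 0 on the rest:
coeffs = np.array([0.0, 5.0])  # not all coefficients are zero
combo = coeffs[0] * x1 + coeffs[1] * x2

assert np.any(coeffs != 0)     # the combination is nontrivial...
assert np.allclose(combo, 0)   # ...yet it equals the zero vector
```

So the family is linearly dependent by definition, exactly as the lemma claims.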

$\textbf{Lemma 7}$
The representation of the zero vector in any basis is unique.

$\textbf{Proof}$: Assume that it is not true: $\sum_{i=1}^{n}\lambda_i\overline{x_i} = \overline{\theta}$, where some $\lambda_j \neq 0$ $\Rightarrow$ $\lambda_j\overline{x_j} = \overline{\theta}$, but by Lemma 5 and Lemma 6 this is impossible.

$\textbf{Theorem}$: All bases have the same number of elements.

$\textbf{Proof}$: Suppose we have two bases $\{\overline{e_n}\}$ and $\{\overline{e'_m}\}$ with $m > n$. We can represent the vectors of the second basis by vectors of the first basis: $\overline{e'_k} = \sum_{i=1}^{n}\alpha_{ik}\overline{e_i}$, where $k=1,\cdots,m$, and for any vector $\overline{x}$ in this space it holds that $\overline{x} = \sum_{k=1}^{m}\lambda_k\overline{e'_k} = \sum_{k=1}^{m}\lambda_k\sum_{i=1}^{n}\alpha_{ik}\overline{e_i} =\sum_{i=1}^{n}\left(\sum_{k=1}^{m}\lambda_k \alpha_{ik}\right)\overline{e_i}$. The zero vector $\overline{\theta}$ has the unique representation $\overline{\theta} = \sum_{i=1}^{n}0\cdot\overline{e_i}$ in the basis $\{\overline{e_i}\}$. Therefore the condition $\sum_{k=1}^{m}\lambda_k\overline{e'_k} = \overline{\theta}$ is equivalent to a system of linear equations for the $\lambda_k$: $\sum_{k=1}^{m}\lambda_k\alpha_{ik} = 0$, where $i=1,\cdots,n$. Since the number of unknowns $\lambda_k$, namely $m$, is greater than the number of equations, namely $n$, this system has a nonzero solution. This contradicts Lemma 7.
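The counting step can be checked numerically (an illustration under an arbitrary choice of matrix, not a proof): a homogeneous system with more unknowns than equations always has a nonzero solution, which we can extract from the null space via the SVD.

```python
import numpy as np

# n = 2 equations, m = 3 unknowns: the matrix of coefficients alpha_{ik}
# is 2 x 3 (entries are an arbitrary example).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# Right-singular vectors whose singular value is (numerically) zero
# span the null space, i.e. the solutions of A @ lam = 0.
_, s, Vt = np.linalg.svd(A)
s_padded = np.concatenate([s, np.zeros(A.shape[1] - len(s))])
lam = Vt[s_padded < 1e-10][0]     # one null-space vector

assert np.linalg.norm(lam) > 0    # the solution is nonzero...
assert np.allclose(A @ lam, 0)    # ...yet it solves the homogeneous system
```

This is exactly the nonzero tuple $(\lambda_1,\cdots,\lambda_m)$ that the proof uses to contradict Lemma 7.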

Best Answer

You should reformulate Lemma 2, since you only prove the uniqueness, not the existence. In the proof of Lemma 2 it's pretty circuitous that you use these minus signs. Just name the two inverse vectors $x_1$ and $x_2$.

In the proof of Lemma 5 you should state the assumption and the contradiction explicitly, and in the proof of Lemma 6 you're mixing up formulas and sentences.

The proof of Lemma 7 is not correct. You can only conclude $$\lambda_j\cdot \overline{x_j}=\overline{\theta}$$ from comparing the two representations if you use that the representation of each basis vector is unique, which is the very assertion you want to prove.

Your proof of the theorem is correct, provided Lemmas 6 and 7 are true: since there is one solution, the underdetermined system must have infinitely many.