[Math] Prob. 14, Sec. 2.10 in Erwin Kreyszig’s INTRODUCTORY FUNCTIONAL ANALYSIS WITH APPLICATIONS: application to a system of equations

Tags: analysis, functional-analysis, linear-algebra, normed-spaces, proof-verification

Let $M$ be a non-empty subset of a normed space $X$, and let $M^a$ denote the subspace of the dual space $X'$ consisting of all bounded linear functionals that vanish at every point of the set $M$.

Now here's Prob. 14, Sec. 2.10 in Introductory Functional Analysis With Applications by Erwin Kreyszig:

If $M$ is an $m$-dimensional subspace of an $n$-dimensional normed space $X$, show that $M^a$ is an ($n-m$)-dimensional subspace of $X'$. Formulate this as a theorem about solutions of a system of linear equations.

My effort:

Since $X$ is finite-dimensional, every linear functional on $X$ is bounded; so we can write $X' = X^*$.

Let $\{e_1, \ldots, e_m \}$ be a basis for $M$; extend it to a basis $\{ e_1, \ldots, e_m, \ldots, e_n \}$ for $X$. Then each element of $X$ has a unique representation as a linear combination of the $e_j$s.

Suppose that $x \in X$ has the unique representation $$x = \sum_{j=1}^n \xi_j e_j.$$

Then, for any $f \in X'$, we have $$f(x) = \sum_{j=1}^n \xi_j f(e_j).$$
Now, for each $j= 1, \ldots, n$, let $f_j \in X'$ be defined by
$$f_j(x) = \xi_j.$$
Then we can write
$$f(x) = \sum_{j=1}^n f(e_j) f_j(x).$$
So $$f = \sum_{j=1}^n \alpha_j f_j, \ \mbox{ where } \ \alpha_j \colon= f(e_j) \ \mbox{ for each } \ j= 1, \ldots, n.$$
It can also be shown that the set $\{ f_1, \ldots, f_n \}$ is linearly independent and therefore a basis for $X^* = X'$.
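(As a quick numerical sanity check, not part of the proof: in $\Bbb R^n$, if the columns of an invertible matrix $E$ are the basis vectors $e_1, \ldots, e_n$, then $x = E\xi$ gives $\xi = E^{-1}x$, so the coordinate functionals $f_j$ are represented by the rows of $E^{-1}$. A minimal sketch in Python/NumPy, with an arbitrarily chosen basis of $\Bbb R^3$:)

```python
import numpy as np

# Columns of E are an arbitrarily chosen basis e_1, e_2, e_3 of R^3.
E = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# If x = E @ xi, then xi = inv(E) @ x, so the j-th coordinate functional
# f_j is represented by the j-th row of inv(E).
F = np.linalg.inv(E)

# Duality relation f_j(e_k) = delta_{jk}:
print(np.allclose(F @ E, np.eye(3)))   # True
```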

Now suppose that $f \in M^a$. Then $$ f(e_j) = 0 \ \mbox{ for each } \ j= 1, \ldots, m.$$ So $$f(x) = \sum_{j=m+1}^n \xi_j f(e_j) = \sum_{j=m+1}^n \alpha_j f_j(x).$$
Thus each $f \in M^a$ can be written as
$$f = \sum_{j=m+1}^n \alpha_j f_j.$$
Moreover, each $f_j$ with $j > m$ lies in $M^a$, since $f_j(e_i) = \delta_{ji} = 0$ for $i = 1, \ldots, m$. The set $\{f_{m+1}, \ldots, f_n \}$, being a subset of a linearly independent set, is itself linearly independent, and by the above it spans $M^a$; hence it is a basis for $M^a$. So $M^a$ has dimension $n-m$.
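(Again only as an illustration, not part of the argument: identifying a functional on $\Bbb R^n$ with its coefficient row $a$, we have $f \in M^a$ exactly when $a \cdot e_j = 0$ for $j = 1, \ldots, m$, i.e. when $a$ lies in the null space of the $m \times n$ matrix with rows $e_1^T, \ldots, e_m^T$. A small sketch, assuming the randomly generated basis vectors are linearly independent, which holds with probability $1$:)

```python
import numpy as np
from scipy.linalg import null_space

n, m = 5, 2
rng = np.random.default_rng(0)

# Columns of E form a (random, almost surely valid) basis of R^n;
# the first m columns span the subspace M.
E = rng.standard_normal((n, n))

# A functional with coefficient row a lies in M^a iff a @ e_j = 0
# for j = 1, ..., m, i.e. a is in the null space of E[:, :m].T.
basis_of_Ma = null_space(E[:, :m].T)
print(basis_of_Ma.shape[1])   # 3 = n - m
```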

Is the above proof correct?

Now it seems to me that this result yields the following:

Let $m < n$. Then any system of $m$ linearly independent homogeneous linear equations in $n$ unknowns (with real or complex coefficients) has exactly $n-m$ linearly independent solutions; that is, its solution space has dimension $n-m$.

Is my conclusion correct?

But I'm not exactly sure how to relate the above formulation to this conclusion.

Best Answer

Your proof is correct. Well done!

As for the other result, here's one way to think about it:

Take $e_j$ to be the standard basis of $\Bbb R^n$, and take $f_j$ to be the corresponding dual basis. The system of equations
$$ a_{11} x_1 + \cdots + a_{1n} x_n = 0\\ a_{21} x_1 + \cdots + a_{2n} x_n = 0\\ \vdots \\ a_{m1} x_1 + \cdots + a_{mn} x_n = 0 $$
can be rewritten as
$$ (a_{11}f_1 + \cdots + a_{1n}f_n)\, x = 0\\ (a_{21}f_1 + \cdots + a_{2n}f_n)\, x = 0\\ \vdots\\ (a_{m1}f_1 + \cdots + a_{mn}f_n)\, x = 0 $$

That is, the solution set of the system of equations is the common zero set of the linearly independent functionals $\sum_{j=1}^n a_{ij}f_j$, $i = 1,\dots,m$. In other words, if $M$ denotes the solution set of the homogeneous system, then
$$ M^a = \operatorname{span}\left\{\sum_{j=1}^n a_{1j}f_j,\ \dots,\ \sum_{j=1}^n a_{mj}f_j\right\}. $$

By your result (which gives $\dim M^a = n - \dim M$), and since $\dim M^a = m$, we may conclude $\dim M = n-m$, as desired.
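(To see the count concretely, here is a minimal numerical sketch, not part of the argument: for a randomly generated $m \times n$ coefficient matrix, whose rows are linearly independent with probability $1$, a computed basis of the solution space has exactly $n - m$ elements.)

```python
import numpy as np
from scipy.linalg import null_space

m, n = 3, 7
rng = np.random.default_rng(1)

# Coefficient matrix of the homogeneous system A x = 0; its m rows
# are linearly independent with probability 1.
A = rng.standard_normal((m, n))

# Columns of N form a basis of the solution set M = {x : A x = 0}.
N = null_space(A)
print(N.shape[1])              # 4 = n - m
print(np.allclose(A @ N, 0))   # True: every column solves the system
```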
