Every Ordered Basis for Dual Space is Dual Basis for Some Basis

dual-spaces, linear algebra, proof-explanation

Let $V$ be a finite-dimensional vector space with dual space $V^∗.$
Then every ordered basis for $V^∗$ is the dual basis for some basis for $V.$

The proof of this result uses the two theorems below:

Theorem 2.24. Suppose that $V$ is a finite-dimensional vector space with the ordered basis $β = \{x_1, x_2,…,x_n\}.$ Let $f_i$ $(1 ≤ i ≤ n)$ be the $i$th coordinate function with respect to $β$ as just defined, and let $β^∗ = \{f_1, f_2,…, f_n\}.$ Then $β^∗$ is an ordered basis for $V^∗,$ and, for any $f ∈ V^∗,$ we have
$f = \sum_{i=1}^n f(x_i)f_i.$
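
(As a concrete illustration of Theorem 2.24, not from the quoted text: take $V = \mathbb{R}^2$ with the standard ordered basis $β = \{e_1, e_2\},$ so the coordinate functions are $f_1(a_1, a_2) = a_1$ and $f_2(a_1, a_2) = a_2.$ For the functional $f(a_1, a_2) = 3a_1 + 5a_2$ we get $f(e_1) = 3$ and $f(e_2) = 5,$ and indeed $f = 3f_1 + 5f_2 = f(e_1)f_1 + f(e_2)f_2.$)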

Theorem 2.26. (For a vector $x ∈ V,$ we define $\hat x: V^∗ → F$ by $\hat x(f) = f(x)$ for every $f ∈ V^∗.$) Let $V$ be a finite-dimensional vector space, and define $ψ: V → V^{∗∗}$ by $ψ(x) = \hat x.$ Then $ψ$ is an isomorphism.
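
(In the same illustration: for $x = (1, 2) ∈ \mathbb{R}^2,$ the element $\hat x ∈ V^{∗∗}$ just evaluates functionals at $x,$ e.g. $\hat x(3f_1 + 5f_2) = 3\cdot 1 + 5\cdot 2 = 13.$ Theorem 2.26 says every element of $V^{∗∗}$ is of this form for exactly one $x ∈ V.$)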

The proof given is as follows:

Proof. Let $\{f_1, f_2,…, f_n\}$ be an ordered basis for $V^∗.$ We may combine Theorems 2.24 and 2.26 to conclude that for this basis for $V^∗$ there exists a dual basis $\{\hat x_1, \hat x_2,…,\hat x_n\}$ in $V^{∗∗},$ that is, $δ_{ij} = \hat x_i(f_j ) = f_j (x_i)$ for all $i$ and $j.$ Thus $\{f_1, f_2,…, f_n\}$ is the dual basis of $\{x_1, x_2,…,x_n\}.$


However, I don't understand the meaning of the line, "We may combine Theorems 2.24 and 2.26 to conclude that for this basis for $V^∗$ there exists a dual basis $\{\hat x_1, \hat x_2,…,\hat x_n\}$ in $V^{∗∗},$ that is, $δ_{ij} = \hat x_i(f_j) = f_j(x_i)$ for all $i$ and $j.$" The symbols $\hat x_1, \hat x_2,…,\hat x_n$ seem to pop out of nowhere. How do the two theorems guarantee this conclusion? I am not quite getting it.

Best Answer

From the ordered basis $\beta'=\{f_1,...,f_n\}$ for $V^*,$ we may construct its dual basis in $V^{**}$ in the same way as in Theorem 2.24: we consider the $i$th coordinate function $g_i$ relative to $\beta',$ i.e. $g_i(f)=a_i$ where $[f]_{\beta'}=\begin{pmatrix} a_1\\a_2\\\vdots\\a_n\end{pmatrix}$ for all $f\in V^*.$ Thus, by Theorem 2.24, the set $\gamma=\{g_1,g_2,...,g_n\}$ so obtained is the dual basis of $\beta'.$ But, by Theorem 2.26, for each $i=1,2,...,n$ there exists $x_i\in V$ such that $\psi(x_i)=\hat x_i=g_i.$ Since $\psi$ is an isomorphism, $\{x_1,...,x_n\}$ is a basis for $V,$ and we can rewrite $\gamma$ as $\{\psi(x_1),\psi(x_2),...,\psi(x_n)\}=\{\hat x_1,\hat x_2,...,\hat x_n\}.$
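
(For a concrete instance of this step, take $V=\mathbb{R}^2$ with $f_1(a,b)=a+b$ and $f_2(a,b)=b,$ so $\beta'=\{f_1,f_2\}$ is an ordered basis for $V^*.$ Writing $f=c_1f_1+c_2f_2,$ the coordinate functions are $g_1(f)=c_1$ and $g_2(f)=c_2.$ The vectors supplied by Theorem 2.26 are $x_1=(1,0)$ and $x_2=(-1,1)$: for every such $f$ we have $f(x_1)=c_1f_1(1,0)+c_2f_2(1,0)=c_1=g_1(f)$ and $f(x_2)=c_1f_1(-1,1)+c_2f_2(-1,1)=c_2=g_2(f),$ so $\hat x_1=g_1$ and $\hat x_2=g_2.$)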

Now, $g_i(f_j)=\delta_{ij}=\hat x_i(f_j)=f_j(x_i),$ so $f_j(x_i)=\delta_{ij}.$ This means $f_j$ is the $j$th coordinate function with respect to the basis $\beta=\{x_1,x_2,...,x_n\}$: if $x\in V$ is an arbitrary vector, then there exist $b_1,b_2,...,b_n\in F$ such that $x=\sum_{i=1}^nb_ix_i$ and $[x]_{\beta}=\begin{pmatrix}b_1\\b_2\\\vdots\\b_n\end{pmatrix},$ so $f_j(x)=\sum_{i=1}^nb_if_j(x_i)=b_j.$
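
(Continuing the example above: an arbitrary $x=(a,b)$ satisfies $x=(a+b)x_1+bx_2,$ so $[x]_{\beta}=\begin{pmatrix}a+b\\b\end{pmatrix},$ and $f_1(x)=a+b,$ $f_2(x)=b$ are exactly these coordinates.)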

From Theorem 2.24, we may conclude that $\beta'$ is indeed the dual basis of $\beta.$ This completes the proof.
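
If you want to check the conclusion numerically, here is a short sketch (my own addition, not part of the book's proof; it assumes NumPy). Identify a functional $f_j$ on $\mathbb{R}^n$ with the row vector representing it, so a basis of $(\mathbb{R}^n)^*$ corresponds to an invertible matrix $A$ with $f_j(x)=(Ax)_j.$ The basis $\{x_1,...,x_n\}$ whose dual basis is $\{f_1,...,f_n\}$ is then given by the columns of $A^{-1},$ since $f_j(x_i)=(AA^{-1})_{ji}=\delta_{ji}.$

```python
import numpy as np

# A basis {f_1, ..., f_n} of (R^n)*: row j of A represents f_j, i.e. f_j(x) = A[j] @ x.
# The rows form a basis exactly when A is invertible.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # f_1(a, b) = a + b,  f_2(a, b) = b  (the example above)

# The basis {x_1, ..., x_n} of R^n whose dual basis is {f_1, ..., f_n} consists of the
# columns of A^{-1}, because then f_j(x_i) = (A @ A^{-1})[j, i] = delta_{ji}.
X = np.linalg.inv(A)          # column i of X is x_i

print(A @ X)                  # identity matrix, i.e. f_j(x_i) = delta_{ij}
print(X[:, 0], X[:, 1])       # x_1 = (1, 0),  x_2 = (-1, 1)
```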