Orthogonality – Question on S Perpendicular and Span Closure

functional-analysis, inner-products, linear-algebra, orthogonality

This question was asked in a previous year's linear algebra exam, and I was unable to solve it.

Let $V$ be an inner product space (in the question it's written "integer", but I think it means "inner") and let $S$ be a subset of $V$. Let $\bar S$ denote the closure of $S$ in $V$ with respect to the topology induced by the metric coming from the inner product. Which of the following statements are true?

A. $S = (S^{\perp})^{\perp}$

B. $\overline{S} = (S^{\perp})^{\perp}$

C. $\overline{\text{span}(S)} = (S^{\perp})^{\perp}$

D. $S^{\perp} = ((S^{\perp})^{\perp})^{\perp}$

I was completely blank on how to approach this problem, even though I have studied linear algebra carefully. Can you please tell me how I should approach it?

Edit: I tried it again. I marked A and D, but the answer is C and D. If A is false, I don't see why D must be true, so I think I am missing some concepts.

Best Answer

Recall that $S^\perp$ is defined as the set of all vectors in $V$ which are orthogonal to every vector in $S$.

In case $S$ is a singleton, say $S=\{s\}$, then we have $$ S^\perp = \{x\in V: \langle x, s\rangle =0\}, $$ so $S^\perp$ coincides with the null space of the continuous linear functional $$ x\in V \mapsto \langle x, s\rangle , $$ and for that reason $S^\perp$ is obviously a CLS (closed linear subspace).

For a general set $S$, one clearly has that $$ S^\perp = \bigcap_{s\in S} \{s\}^\perp, $$ so $S^\perp$ is the intersection of a family of CLS's, and hence itself a CLS.
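In finite dimensions this can be checked concretely: $S^\perp$ is the null space of the matrix whose rows are the vectors of $S$. A minimal numpy sketch (the helper `perp` and the sample set `S` are illustrative, not part of the original answer):

```python
import numpy as np

def perp(S, dim, tol=1e-10):
    """Orthonormal basis (as rows) for S^perp in R^dim: the null space
    of the matrix whose rows are the vectors of S, computed via SVD."""
    A = np.array(S, dtype=float).reshape(-1, dim)
    _, sing, Vt = np.linalg.svd(A)
    rank = int(np.sum(sing > tol))
    return Vt[rank:]                  # orthonormal rows spanning S^perp

S = [[1.0, 0.0, 0.0], [1.0, 1.0, 0.0]]
P = perp(S, 3)

# every basis vector of S^perp is orthogonal to every vector of S
assert np.allclose(np.array(S) @ P.T, 0)
```

Since the null space of a matrix is automatically a closed subspace, this mirrors the CLS observation above.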

Notice that the right-hand sides of (A), (B), (C) and (D) all refer to the "perp" of something, so they are all CLS's.

Since the left-hand sides of (A) and (B) need not be CLS's, these statements cannot always hold. For example, if $S$ consists of a single nonzero vector, then $S$ is closed but not a subspace, so both (A) and (B) fail.
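The single-vector counterexample can be made completely explicit in $\mathbb R^2$; a small sketch (the vectors `e1`, `e2` are illustrative):

```python
import numpy as np

# Counterexample to (A) and (B) in R^2: S = {e1} is a single vector,
# already closed, but not a subspace.  S^perp = span(e2), so
# (S^perp)^perp = span(e1) = the whole x-axis, strictly containing S.
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# e2 lies in S^perp
assert np.dot(e2, e1) == 0

# 2*e1 lies in (S^perp)^perp (it is orthogonal to e2) ...
assert np.dot(2 * e1, e2) == 0
# ... but 2*e1 is not an element of S = {e1}, so S != (S^perp)^perp
assert not np.array_equal(2 * e1, e1)
```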


Point (C) is true. To see why, first notice that $$ S\subseteq (S^\perp)^\perp \tag 1 $$ for a pretty elementary reason (which nevertheless sounds a bit like a tongue-twister): every vector in $S$ is orthogonal to anything that is orthogonal to every vector in $S$.

We then see that (1) states that $S$ is contained in a CLS, and since $\overline{\text{span}(S)}$ is the smallest CLS containing $S$, it follows that $$ \overline{\text{span}(S)}\subseteq (S^\perp)^\perp. $$

To prove the converse inclusion, pick any vector $x$ in $(S^\perp)^\perp$. A well known result about Hilbert spaces (which requires that $V$ be complete, so we need to assume completeness here) states that $x$ may be written as $$ x=u+v, $$ where $u$ is perpendicular to $\overline{\text{span}(S)}$ and $v$ belongs to $\overline{\text{span}(S)}$.
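In finite dimensions this orthogonal decomposition is exactly a least-squares projection; a minimal numpy sketch (the sample set `S` and vector `x` are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
S = rng.standard_normal((2, 4))       # two spanning vectors, rows in R^4
x = rng.standard_normal(4)

# v = orthogonal projection of x onto span(S) (a least-squares fit),
# u = the residual; then x = u + v with u perpendicular to span(S).
coeffs, *_ = np.linalg.lstsq(S.T, x, rcond=None)
v = S.T @ coeffs
u = x - v

assert np.allclose(u + v, x)          # the decomposition x = u + v
assert np.allclose(S @ u, 0)          # u is orthogonal to every row of S
```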

Notice that, in particular, $u$ is perpendicular to every vector of $S$, and hence $u\in S^\perp$.

On the other hand, since both $x$ and $v$ lie in $(S^\perp)^\perp$ (the vector $v$ because $\overline{\text{span}(S)}\subseteq (S^\perp)^\perp$, as shown above), we conclude that $u=x-v\in (S^\perp)^\perp$.

This implies that $u\in (S^\perp)^\perp \cap S^\perp$, so $u$ is perpendicular to itself, whence $u=0$ and then $$ x = u+v = v\in \overline{\text{span}(S)}. $$ This concludes the proof of (C).
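In $\mathbb R^n$, where $\text{span}(S)$ is automatically closed, (C) says that $\text{span}(S) = (S^\perp)^\perp$; this can be verified numerically by comparing the orthogonal projections onto both sides. A sketch (the helpers and the random sample `S` are illustrative):

```python
import numpy as np

def perp_basis(rows, dim, tol=1e-10):
    """Orthonormal basis (as rows) of the orthogonal complement in
    R^dim of the span of the given rows, computed via SVD."""
    A = np.array(rows, dtype=float).reshape(-1, dim)
    _, sing, Vt = np.linalg.svd(A)
    rank = int(np.sum(sing > tol))
    return Vt[rank:]

def proj_onto_span(rows, dim):
    """Matrix of the orthogonal projection onto span(rows)."""
    Q = perp_basis(rows, dim)
    return np.eye(dim) - Q.T @ Q

rng = np.random.default_rng(0)
S = rng.standard_normal((2, 4))               # two random vectors in R^4

P_span = proj_onto_span(S, 4)                 # projection onto span(S)
S_perp = perp_basis(S, 4)                     # basis of S^perp
P_dperp = proj_onto_span(perp_basis(S_perp, 4), 4)  # onto (S^perp)^perp

assert np.allclose(P_span, P_dperp)           # span(S) == (S^perp)^perp
```

Two subspaces coincide exactly when their orthogonal projection matrices coincide, which is what the final assertion checks.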


Regarding point (D), it is true even if $V$ is not complete. It is a consequence of the following much more general result:

Lemma. Let $V$ be any set (e.g. the inner-product space of interest here) and let $\lozenge$ be a symmetric relation on $V$ (e.g. $x\mathrel{\lozenge} y \Leftrightarrow x\perp y$). For each subset $S\subseteq V$ define $$ S^\lozenge = \{x\in V: x\mathrel{\lozenge} s \text{ for all } s\in S \}. $$ Then $$ S^\lozenge = ((S^\lozenge)^\lozenge)^\lozenge, $$ for any $S$.

Proof. The tongue-twister above immediately implies that $$ S\subseteq (S^\lozenge)^\lozenge. \tag 2 $$ Plugging in $S^\lozenge$ in place of $S$ in (2), we get $S^\lozenge \subseteq ((S^\lozenge)^\lozenge)^\lozenge$.

Next observe that $$ S_1\subseteq S_2 \Rightarrow S_2^\lozenge\subseteq S_1^\lozenge, $$ and if this is applied to (2), we get $ ((S^\lozenge)^\lozenge)^\lozenge\subseteq S^\lozenge. $ QED
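Because the lemma holds for any symmetric relation on any set, it can be tested by brute force on a finite example; a sketch in Python (the set `V` and relation `rel` are arbitrary illustrative choices):

```python
def diamond(S, V, rel):
    """S^<> : the elements of V related to every element of S."""
    return frozenset(x for x in V if all(rel(x, s) for s in S))

# Example: V = {0,...,9} with the symmetric relation
# x <> y  iff  x + y is even (any symmetric relation would do).
V = frozenset(range(10))
rel = lambda x, y: (x + y) % 2 == 0

for S in [frozenset(), frozenset({1}), frozenset({2, 3}), V]:
    once = diamond(S, V, rel)
    thrice = diamond(diamond(once, V, rel), V, rel)
    assert once == thrice         # S^<> == ((S^<>)^<>)^<>
```

Note that `diamond` applied twice need not return `S` itself (inclusion (2) can be strict, which is why (A) fails), yet three applications always agree with one.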

An interesting corollary, in a totally different area of mathematics, is the following:

Corollary. Given a ring $R$ and any subset $S\subseteq R$, define the commutant of $S$, denoted $S'$, to be the set formed by the elements of $R$ which commute with every element of $S$. Then $S'''=S'$.
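The relation "$x$ commutes with $y$" is symmetric, so the lemma applies verbatim. In a finite ring the corollary can even be checked exhaustively; a sketch using the ring of $2\times 2$ matrices over $\mathbb F_2$ (this choice of ring and the sample subset `S` are illustrative):

```python
import itertools
import numpy as np

# The finite ring R of 2x2 matrices over GF(2): 16 elements,
# small enough to compute commutants by brute force.
R = [np.array(bits).reshape(2, 2)
     for bits in itertools.product([0, 1], repeat=4)]

def commutes(a, b):
    return np.array_equal((a @ b) % 2, (b @ a) % 2)

def commutant(S):
    return [x for x in R if all(commutes(x, s) for s in S)]

def key(mats):                 # hashable canonical form for comparison
    return sorted(tuple(m.ravel()) for m in mats)

S = [np.array([[1, 1], [0, 1]])]       # a sample subset of R
Sp = commutant(S)                      # S'
Sppp = commutant(commutant(Sp))        # S'''
assert key(Sp) == key(Sppp)            # S' == S'''
```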