Norms of vectors and $1\times n$ matrices

linear-algebra, matrix-norms

I have a question about vector norms and induced matrix norms. In our class, we were given an exercise asking us to prove equivalences between norms of vectors and of their transposes/conjugate transposes. This is the one part I have mostly completed but am currently stuck on.

Let $x\in\mathbb{F}^n.$ Show that $||x||_2=||x^*||_2$.

We have the definition of the 2-norm for vectors in $\mathbb{F}^n$, which is $||x||_2=\left(\sum_{i=1}^n|x_i|^2\right)^\frac{1}{2}=\sqrt{x^*x}$, and also the induced norm of a matrix $A\in\mathbb{F}^{m\times n}$, $||A||_2=\max_{x\neq0}\frac{||Ax||_2}{||x||_2}$.
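(Just to make sure I'm reading these definitions correctly, here is a quick numerical sanity check, assuming NumPy is available; the vector and variable names are my own, and this isn't part of the proof.)

```python
import numpy as np

rng = np.random.default_rng(0)
# a random complex vector x in F^n, with F = C and n = 5
x = rng.standard_normal(5) + 1j * rng.standard_normal(5)

# ||x||_2 from the sum definition ...
sum_def = np.sqrt(np.sum(np.abs(x) ** 2))
# ... and from sqrt(x^* x)
inner_def = np.sqrt((x.conj() @ x).real)

print(np.isclose(sum_def, inner_def))          # True
print(np.isclose(sum_def, np.linalg.norm(x)))  # True: NumPy's default vector norm is the 2-norm
```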

I have searched a bit and couldn't find anything about taking the 2-norm of the conjugate transpose of a vector, which I assume is because it is considered trivial, since we are showing the two are equal.

My progress so far is that since $x^*\in\mathbb{F}^{1\times n}$, we just use the definition of the induced matrix norm. So we have $||x^*||_2=\max_{y\neq0}\frac{||x^*y||_2}{||y||_2}=\max_{y\neq0}\frac{|x^*y|}{||y||_2}$, where the R.H.S. follows because $x^*y$ is a scalar, so its 2-norm is just its absolute value.

I can see that my intended result is that this is maximized when $y=x$, because then $$||x^*||_2=\frac{x^*x}{||x||_2}=\frac{||x||_2^2}{||x||_2}=||x||_2$$
which is what we are trying to show, but I'm not sure exactly how to formally prove that letting $y=x$ maximizes $\frac{||x^*y||_2}{||y||_2}$.

If anyone is interested, the other parts of our exercise are just showing that $||x||_1=||x^*||_\infty$, and $||x||_\infty=||x^*||_1$, but I found those to be a lot easier to show using just definitions.
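(Again purely as a numerical sanity check of these two identities, under the same NumPy assumption as above: for a 2-D array, `np.linalg.norm` with `ord=1` returns the maximum absolute column sum and `ord=np.inf` the maximum absolute row sum, i.e. exactly the induced 1- and $\infty$-norms.)

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(6) + 1j * rng.standard_normal(6)
x_star = x.conj().reshape(1, -1)  # x^* as a 1 x n matrix

# ||x||_1 = ||x^*||_inf : induced inf-norm = max absolute row sum
print(np.isclose(np.linalg.norm(x, 1), np.linalg.norm(x_star, np.inf)))  # True

# ||x||_inf = ||x^*||_1 : induced 1-norm = max absolute column sum
print(np.isclose(np.linalg.norm(x, np.inf), np.linalg.norm(x_star, 1)))  # True
```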

Best Answer

We have $x^*y=\langle x, y\rangle$, with $\|x\|_2^2=x^*x=\langle x, x\rangle$, and therefore by Cauchy-Schwarz we obtain $$ |x^*y|\ =\ |\langle x, y\rangle|\ \le\ \|x\|_2\,\|y\|_2$$ So the operator norm of $x^*$ is at most $\|x\|_2$.
On the other hand, taking $y=x$ gives $\frac{|x^*x|}{\|x\|_2}=\frac{\|x\|_2^2}{\|x\|_2}=\|x\|_2$, so the bound is attained and the operator norm is precisely $\|x\|_2$.
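For what it's worth, here is a quick numerical illustration of this (assuming NumPy; for a 2-D array, `np.linalg.norm` with `ord=2` returns the induced 2-norm, i.e. the largest singular value):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(7) + 1j * rng.standard_normal(7)
x_star = x.conj().reshape(1, -1)  # x^* as a 1 x n matrix

vec_norm = np.linalg.norm(x)          # ||x||_2 as a vector norm
op_norm = np.linalg.norm(x_star, 2)   # ||x^*||_2 as an induced matrix norm
print(np.isclose(vec_norm, op_norm))  # True

# the maximizer y = x attains the ratio |x^* y| / ||y||_2
ratio = np.abs(x.conj() @ x) / np.linalg.norm(x)
print(np.isclose(ratio, vec_norm))    # True
```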
