Norms and convergence on direct product of Banach spaces

banach-spaces, convergence-divergence, direct-sum, normed-spaces

Let $(X_i, ||\cdot||_{X_i})$, $i=1, 2$, be Banach spaces. Consider the direct product space $X:=X_1\times X_2$ with the usual component-wise operations. We know we can endow this space with a norm (one possible choice is $||(x_1, x_2)||_1:=||x_1||_{X_1}+||x_2||_{X_2}$).  
Suppose we have chosen some norm and keep it fixed; denote it by $||\cdot||_X$. Are the following claims true?

Claim 1.  For any $(x_1, x_2)\in X$ one has $||(x_1, x_2)||_X\geq ||x_i||_{X_i}$,  $i=1, 2$.

I know this claim is true for "usual" norms on $X$ like $||\cdot||_1, ||\cdot||_2$ and $||\cdot||_{\infty}$ but I don't seem to be able to prove it for an arbitrary norm.
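As a quick numerical sanity check (the finite-dimensional setting and function names are my own choices, not from the question), here is a sketch with $X_1 = X_2 = \mathbb{R}^2$ under the Euclidean norm, verifying that the three "usual" product norms each dominate both component norms:

```python
import math

def norm_comp(v):
    """Euclidean norm on the component space X_i = R^2."""
    return math.sqrt(sum(t * t for t in v))

def norm_1(x1, x2):
    """||(x1, x2)||_1 = ||x1|| + ||x2||."""
    return norm_comp(x1) + norm_comp(x2)

def norm_2(x1, x2):
    """||(x1, x2)||_2 = sqrt(||x1||^2 + ||x2||^2)."""
    return math.sqrt(norm_comp(x1) ** 2 + norm_comp(x2) ** 2)

def norm_inf(x1, x2):
    """||(x1, x2)||_inf = max(||x1||, ||x2||)."""
    return max(norm_comp(x1), norm_comp(x2))

x1, x2 = (3.0, 4.0), (1.0, 0.0)   # ||x1|| = 5, ||x2|| = 1
for N in (norm_1, norm_2, norm_inf):
    # Claim 1 holds for these three norms: N(x1, x2) >= ||x_i||.
    assert N(x1, x2) >= norm_comp(x1) and N(x1, x2) >= norm_comp(x2)
```

Of course, a finite check proves nothing about an arbitrary norm, which is exactly the point of the question.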

Claim 2.  Let $\{(x_n,y_n)\}_{n=1}^{\infty}$ be some sequence in $X$. Then $(x, y)=\lim_{n\to\infty}(x_n, y_n)$ if and only if $x=\lim_{n\to\infty} x_n$ in $X_1$ and $y=\lim_{n\to\infty} y_n$ in $X_2$.

I've found one part of the "answer" to this last claim here: "Does convergence of vector sequence imply that of all components?". However, I am not entirely satisfied with it, as I fail to see why the projections have to be continuous in the topology of this arbitrary norm on $X$.
Also, I am aware that one direction in the second claim follows immediately from Claim 1 (should it be true).

I appreciate any help I can get!

Best Answer

For the first one, let $X_{1}=X_{2}=:Y$ and define $\|(x_{1},x_{2})\|_{X}:=\dfrac{1}{3}\|x_{1}\|_{Y}+\dfrac{1}{3}\|x_{2}\|_{Y}$. This is a norm on $X$, yet $\|(x,x)\|_{X}=\frac{2}{3}\|x\|_{Y}<\|x\|_{Y}$ for $x\neq 0$, so Claim 1 fails for this norm.
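This counterexample is easy to see numerically. The following sketch (with $Y = \mathbb{R}^2$ under the Euclidean norm, an illustrative choice of mine) shows the scaled norm dropping below the component norm:

```python
import math

def norm_Y(v):
    """Euclidean norm on Y = R^2."""
    return math.sqrt(sum(t * t for t in v))

def norm_X(x1, x2):
    """The counterexample norm ||(x1, x2)||_X = (1/3)||x1||_Y + (1/3)||x2||_Y."""
    return norm_Y(x1) / 3 + norm_Y(x2) / 3

x = (3.0, 4.0)                 # ||x||_Y = 5
assert norm_X(x, x) < norm_Y(x)  # (2/3)*5 < 5, so Claim 1 fails here
```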

For the second one: if $X$ is endowed with the usual product topology (which is what a norm such as $\|\cdot\|_{1}$ induces), then every subbasic open set in $X$ has the form $G\times H$ with $G$ open in $X_{1}$ and $H$ open in $X_{2}$. From this observation, convergence in $X$ is exactly component-wise convergence, so the claim is true.

If $X$ is endowed with some other norm, the answer is negative. Consider $X_{1}=X_{2}=L^{1}(\mathbb{R})\cap L^{2}(\mathbb{R})$, where we endow $X_{1}$ and $X_{2}$ both with the $L^{2}(\mathbb{R})$ norm, and set $\|(f,g)\|_{X}=\|f\|_{L^{1}(\mathbb{R})}+\|g\|_{L^{1}(\mathbb{R})}$. Convergence in $\|\cdot\|_{X}$ is $L^{1}$ convergence of the components, which does not imply $L^{2}$ convergence: take $f_{n}=n\,\chi_{[0,1/n^{2}]}$, so that $\|f_{n}\|_{L^{1}}=1/n\to 0$ while $\|f_{n}\|_{L^{2}}=1$ for every $n$. Hence $(f_{n},f_{n})\to(0,0)$ in $X$, but $f_{n}\not\to 0$ in $X_{1}$.
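A short sketch of the norms of a sequence of tall, thin spikes $f_{n}=n\,\chi_{[0,1/n^{2}]}$ (the specific sequence is my illustration, using the closed forms $\|f_{n}\|_{L^{1}} = n\cdot(1/n^{2})$ and $\|f_{n}\|_{L^{2}} = \sqrt{n^{2}\cdot(1/n^{2})}$ rather than numerical integration):

```python
import math

def l1_norm(n):
    # integral of |f_n| = n over an interval of length 1/n^2
    return n * (1.0 / n**2)

def l2_norm(n):
    # sqrt of the integral of f_n^2 = n^2 over an interval of length 1/n^2
    return math.sqrt(n**2 * (1.0 / n**2))

for n in (1, 10, 1000):
    print(n, l1_norm(n), l2_norm(n))   # L1 norm -> 0, L2 norm stays 1
```

So the pair $(f_n, f_n)$ is small in $\|\cdot\|_X$ (which only sees the $L^1$ norms) while staying at distance $1$ from zero in each component's $L^2$ topology.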