Given a subspace $U\subseteq X$ of an inner product space (finite-dimensional, say, or a closed subspace of a Hilbert space), the space decomposes as $X=U\oplus U^\perp$. For example, in three dimensions, given a line $\ell$, there is a perpendicular plane $\pi$, and every vector may be written uniquely as a vector from $\ell$ plus a vector from $\pi$. Two key properties of the inner product are that it is linear in both arguments (or, on a complex vector space, conjugate-linear in the left or right argument, depending on whether you follow the physicists' or the mathematicians' convention) and that it evaluates to zero whenever its arguments are orthogonal; indeed, that is the definition of orthogonality.
(By the way, it's a nice exercise to prove $X=U\oplus U^\perp$.)
In particular, given any $x,y\in X$, we may write $x=u+u'$ and $y=v+v'$ where $u,v\in U$ and $u',v'\in U^\perp$, and then $\langle x,y\rangle=\langle u+u',v+v'\rangle=\langle u,v\rangle+\langle u,v'\rangle+\langle u',v\rangle+\langle u',v'\rangle$, which simplifies to just $\langle u,v\rangle+\langle u',v'\rangle$ since the other two inner products evaluate to zero. In particular, taking $y=x$ (so $v=u$ and $v'=u'$), we get $\|u+u'\|^2=\|u\|^2+\|u'\|^2$ whenever $u\in U$ and $u'\in U^\perp$.
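Here is a quick numerical spot-check of these identities (a NumPy sketch; the particular line, plane, and vectors below are my own illustrative choices, not from the question):

```python
import numpy as np

# Line L spanned by d, and its perpendicular plane pi, inside R^3.
d = np.array([1.0, 2.0, 2.0])
d = d / np.linalg.norm(d)          # unit vector along the line

def split(x):
    """Decompose x as (component along L, component in the plane pi)."""
    u = np.dot(x, d) * d           # orthogonal projection onto the line
    return u, x - u                # x - u lies in pi = L^perp

x = np.array([3.0, -1.0, 4.0])
y = np.array([0.5, 2.0, -3.0])
u, up = split(x)
v, vp = split(y)

# <x, y> = <u, v> + <u', v'>: the cross terms vanish
assert np.isclose(np.dot(x, y), np.dot(u, v) + np.dot(up, vp))
# Pythagorean identity: ||x||^2 = ||u||^2 + ||u'||^2
assert np.isclose(np.dot(x, x), np.dot(u, u) + np.dot(up, up))
```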
Take for instance $X=\mathbb{R}^2$ and $U$ the first coordinate axis. Then the orthogonal projection slides each vector's tip vertically until it lands on the first coordinate axis. If our vector is $(3,4)$, the projection $P$ pushes it down to $(3,0)$, which is the $U$-component of $(3,4)$; the $U^\perp$-component of $(3,4)$ must then be $(0,4)$. Draw all this out: $(3,0)$ and $(0,4)$ are the legs of a right triangle whose hypotenuse is $(3,4)$, precisely because the projection is orthogonal. This is the picture behind $\|Px\|\le \|x\|$.
Indeed the identity $\|u+u'\|^2=\|u\|^2+\|u'\|^2$ is none other than the Pythagorean theorem!
To prove it formally, you've started out on the right path. It's easier to work with the square of the magnitude, since it comes from a linear gadget (the inner product), and an inequality between nonnegative reals is equivalent to the inequality between their squares. So work with $\|x\|^2$. First note that we can decompose $x=Px+Qx$ (the $U\oplus U^\perp$ decomposition of $x$, where $Q=I-P$ is the orthogonal projection onto $U^\perp$), and now since $Px\perp Qx$ we can say
$$\| x\|^2=\|Px+Qx\|^2=\|Px\|^2+\|Qx\|^2\ge \|Px\|^2 \implies \|x\|\ge \|Px\|. $$
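If you want to see the whole chain of equalities with concrete numbers, here is a small sketch using the $(3,4)$ example from before (NumPy assumed):

```python
import numpy as np

# Orthogonal projection of R^2 onto the first coordinate axis
P = np.array([[1.0, 0.0],
              [0.0, 0.0]])

x = np.array([3.0, 4.0])
Px = P @ x                                 # = (3, 0)

# ||x||^2 = ||Px||^2 + ||x - Px||^2, hence ||Px|| <= ||x||
assert np.isclose(np.dot(x, x), np.dot(Px, Px) + np.dot(x - Px, x - Px))
assert np.linalg.norm(Px) <= np.linalg.norm(x)   # 3 = ||Px|| <= ||x|| = 5
```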
Note that it is essential that $P$ be an orthogonal projection. Otherwise, pick an $x$ that is orthogonal to $\ker P$ but not in $U$: since $Px-x\in\ker P$, the vectors $x$ and $Px-x$ are the legs of a right triangle whose hypotenuse is $Px=x+(Px-x)$, and the inequality flips to $\|Px\|>\|x\|$ in that case.
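A concrete instance of this failure (the oblique projection matrix below is an invented example, not from the original question):

```python
import numpy as np

# Oblique projection onto the first axis ALONG the line spanned by (1, 1):
# P is idempotent (P @ P == P) but not symmetric, so not orthogonal.
P = np.array([[1.0, -1.0],
              [0.0,  0.0]])
assert np.allclose(P @ P, P)           # P is a projection
assert not np.allclose(P, P.T)         # ... but not an orthogonal one

# x orthogonal to ker P = span{(1, 1)}, yet not on the target axis:
x = np.array([1.0, -1.0])
Px = P @ x                             # = (2, 0)
print(np.linalg.norm(Px), np.linalg.norm(x))   # 2.0 > 1.414..., so ||Px|| > ||x||
```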
If you have, say, a $4\times 4$ matrix represented by $A=[c_1,c_2,c_3,c_4]$, where the $c_i$'s are columns, then you know that given a column vector $v$, $Av=\sum_i v_i c_i$, where the $v_i$ are the entries of $v$.
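In code, that column-combination view of the product looks like this (a NumPy sketch with made-up entries):

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 1.0, 3.0],
              [2.0, 1.0, 0.0, 1.0],
              [1.0, 2.0, 1.0, 0.0]])
v = np.array([1.0, -2.0, 0.5, 3.0])

# A @ v equals the weighted sum of A's columns, with weights v_i:
col_combo = sum(v[i] * A[:, i] for i in range(4))
assert np.allclose(A @ v, col_combo)
```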
So by finding bases for $M^T$ and $M$ you know a basis for the whole space, since $\mathbb{R}^4=M^T\oplus M$, and to find the projection of some vector $v$ onto these subspaces you need to represent $v$ in terms of those basis vectors. That is, you are trying to find the $a_i$ such that $[b_1,b_2,b_3,b_4]\,a=v$, where $a$ is the column vector with entries $a_i$ and the $b_i$ are the basis vectors in column form.
Since you know the $b_i$ are linearly independent, the matrix $B=[b_1,b_2,b_3,b_4]$ is invertible, and its inverse can easily be found using Gaussian elimination, so you can easily solve for $a$, and you are done.
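A minimal sketch of the whole recipe, assuming invented bases $b_1,b_2$ for a subspace $M\subseteq\mathbb{R}^4$ and $b_3,b_4$ for $M^\perp$ (`np.linalg.solve` does the Gaussian elimination for us):

```python
import numpy as np

# Invented bases: columns b1, b2 span M; columns b3, b4 span M^perp.
B = np.array([[1.0,  0.0,  1.0,  0.0],
              [1.0,  0.0, -1.0,  0.0],
              [0.0,  1.0,  0.0,  1.0],
              [0.0,  1.0,  0.0, -1.0]])   # columns are b1, b2, b3, b4

v = np.array([4.0, 2.0, -1.0, 5.0])

# Solve B a = v (LU factorization, i.e. Gaussian elimination, under the hood):
a = np.linalg.solve(B, v)

# Projection onto M: keep only the M-coordinates; likewise for M^perp.
proj_M     = B[:, :2] @ a[:2]
proj_Mperp = B[:, 2:] @ a[2:]
assert np.allclose(proj_M + proj_Mperp, v)
```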
Best Answer
If $P$ is orthogonal then $\langle Px, y \rangle = \langle Px, Py \rangle + \langle Px, y- Py \rangle = \langle Px, Py \rangle$, since $Px\in\operatorname{ran}P$ while $y-Py\in(\operatorname{ran}P)^\perp$. Switching $x$ and $y$, we obtain $\langle x, Py \rangle = \langle Px, Py \rangle$. Thus $\langle Px, y \rangle = \langle x, Py \rangle$, which by definition of the adjoint gives $\langle Px, y \rangle = \langle P^*x, y \rangle$, i.e. $\langle (P-P^*)x, y \rangle =0$ for all $x,y$. Taking $y = (P-P^*)x$, we find $\| (P-P^*)x\| = 0$ for all $x$, which implies $P-P^* = 0$ and therefore $P = P^*$.
If $P^* = P$ then $$\langle Px, y- Py \rangle = \langle x, P^*y- P^*Py \rangle = \langle x, Py- P^2y \rangle = \langle x, Py- Py \rangle =0,$$ since $P^2 = P$.
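Both directions are easy to spot-check numerically; in the sketch below the subspace is random and $P=QQ^\top$ is the usual formula for the orthogonal projection onto the column space of a matrix with orthonormal columns $Q$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 2))      # random 2-dimensional subspace of R^5
Q, _ = np.linalg.qr(A)               # orthonormal basis for ran(A)
P = Q @ Q.T                          # orthogonal projection onto ran(A)

assert np.allclose(P @ P, P)         # idempotent: P^2 = P
assert np.allclose(P, P.T)           # self-adjoint: P = P* (real case)

# ... and <Px, y - Py> = 0 for arbitrary x, y:
x, y = rng.standard_normal(5), rng.standard_normal(5)
assert np.isclose((P @ x) @ (y - P @ y), 0.0)
```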