A self-adjoint operator $S : X \to X$ (where $X$ is an inner product space) is an operator such that for all $x,y \in X$, we have $$\langle Sx,y \rangle = \langle x,Sy\rangle.$$ This is a generalization of a real, symmetric matrix.
One important property of such operators is that the eigenvalues of a self-adjoint operator are necessarily real. Indeed, if $k$ is any eigenvalue with corresponding (normalized) eigenvector $v$, we see $$k = k\langle v,v \rangle = \langle kv, v \rangle = \langle Sv, v \rangle = \langle v,Sv \rangle = \langle v, kv \rangle = \overline k \langle v, v \rangle = \overline k$$ showing that $k$ is real.
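The computation above can be checked numerically. Below is a minimal sketch using numpy (my choice for illustration): a random complex matrix is symmetrized into a Hermitian matrix standing in for $S$, and its eigenvalues, computed with a general-purpose routine that does not assume symmetry, come out real up to round-off.

```python
# Numerical sketch: self-adjoint (Hermitian) matrices have real eigenvalues.
# The random 5x5 matrix below is an illustrative choice, not from the text.
import numpy as np

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
S = (M + M.conj().T) / 2          # Hermitian: S equals its conjugate transpose

eigvals = np.linalg.eigvals(S)    # general routine; no symmetry assumed
print(np.max(np.abs(eigvals.imag)))   # imaginary parts vanish up to round-off
```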
Another important property (perhaps the most important) of self-adjoint operators is that their eigenvectors can be taken to form an orthonormal basis for the ambient space. (Here I am assuming the space is finite dimensional; a similar statement holds in infinite dimensions, but we need to generalize the idea of a basis a bit and we need completeness.) That is, we can take $k_1, \ldots, k_n$ to be the eigenvalues of $S$ (possibly with repetitions) with corresponding orthonormal eigenvectors $v_1,\ldots, v_n$ forming a basis for $X$. Then for any $v \in X$, there are scalars $\alpha_1, \ldots, \alpha_n$ so that $v = \alpha_1 v_1 + \cdots + \alpha_n v_n.$ Using linearity of the inner product, we see $$\langle v, v\rangle = \sum^n_{i=1} \sum^n_{j=1} \alpha_i \overline \alpha_j \langle v_i, v_j \rangle.$$ But by orthonormality, $\langle v_i, v_j \rangle = 0$ when $i \neq j$ and $\langle v_i, v_i \rangle = 1$, so the sum reduces to $$\langle v, v\rangle = \sum^n_{i=1} \alpha_i \overline \alpha_i = \sum^n_{i=1} \lvert \alpha_i \rvert^2.$$ Similarly, since $$Sv = S(\alpha_1 v_1 + \cdots + \alpha_n v_n) = \alpha_1 k_1 v_1 + \cdots + \alpha_n k_n v_n,$$ we have $$\langle Sv, v\rangle = \sum^n_{i=1} \sum^n_{j=1} k_i \alpha_i \overline \alpha_j \langle v_i, v_j \rangle = \sum^n_{i=1} k_i \lvert \alpha_i \rvert^2.$$ If $k_i \ge 0$ for all $i=1,\ldots, n$, then $$\langle Sv, v\rangle = \sum^n_{i=1} k_i \lvert \alpha_i \rvert^2 \ge 0.$$ Likewise, if $k_i \le 1$ for all $i = 1,\ldots, n$, then $$\langle Sv, v\rangle = \sum^n_{i=1} k_i \lvert \alpha_i \rvert^2 \le \sum^n_{i=1} \lvert \alpha_i \rvert^2 = \langle v , v \rangle.$$ Conversely, if $0 \le \langle Sv, v\rangle \le \langle v, v\rangle$ holds for all vectors $v$, then applying it to the eigenvectors gives $$0 \le \langle Sv_i, v_i \rangle \le \langle v_i, v_i \rangle \,\,\,\, \implies \,\,\,\, 0 \le \langle k_i v_i, v_i \rangle \le \langle v_i, v_i \rangle,$$ whence pulling the $k_i$ out of the inner product gives $0 \le k_i \le 1.$
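The forward direction of this equivalence is easy to sample-test numerically. The sketch below (numpy, random matrices, and the seed are my choices) builds a symmetric $S$ with eigenvalues in $[0,1]$ from a random orthonormal eigenbasis, then checks $0 \le \langle Sv, v\rangle \le \langle v, v\rangle$ on random vectors.

```python
# Sketch: a symmetric S with eigenvalues in [0, 1] satisfies
# 0 <= <Sv, v> <= <v, v> for every v.  All specifics here are demo choices.
import numpy as np

rng = np.random.default_rng(1)
n = 4
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # random orthonormal eigenvectors
k = rng.uniform(0.0, 1.0, size=n)                 # eigenvalues in [0, 1]
S = Q @ np.diag(k) @ Q.T                          # symmetric by construction

for _ in range(100):
    v = rng.standard_normal(n)
    q = v @ S @ v                                 # <Sv, v>
    assert -1e-12 <= q <= v @ v + 1e-12
print("0 <= <Sv,v> <= <v,v> held for all sampled v")
```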
If $A,B$ are positive, commuting operators, then $AB$ is positive. This is because the unique positive square root $\sqrt{A}$ is a limit of polynomials in $A$, so it must also commute with $B$ and, hence,
$$
\langle ABx,x\rangle = \langle \sqrt{A}Bx,\sqrt{A}x\rangle =
\langle B\sqrt{A}x,\sqrt{A}x\rangle \ge 0.
$$
This result is useful in what follows.
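A quick numerical sanity check of this fact (a sketch with numpy; the construction is mine): building both operators as polynomials in one symmetric matrix guarantees they are positive and commute, and the product then has nonnegative spectrum.

```python
# Sketch: if A, B are positive and commute, then AB is positive.
# Commuting positives are produced as polynomials in one symmetric seed H.
import numpy as np

rng = np.random.default_rng(2)
n = 5
M = rng.standard_normal((n, n))
H = M + M.T                          # symmetric seed
A = H @ H + np.eye(n)                # positive: eigenvalues h^2 + 1 >= 1
B = H @ H @ H @ H + 2 * np.eye(n)    # positive: eigenvalues h^4 + 2 >= 2

assert np.allclose(A @ B, B @ A)     # the commuting hypothesis
eigs = np.linalg.eigvalsh((A @ B + B @ A) / 2)  # AB is symmetric; symmetrize for round-off
print(eigs.min())                    # strictly positive
```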
Suppose $A$ is self-adjoint. Let $P=\frac{1}{2}(|A|+A)$ and $N=\frac{1}{2}(|A|-A)$, where $|A|$ is the unique positive square root of $A^2$. Then $PN=NP=0$ and $A=P-N$. This is the desired decomposition of $A$, and the trick is to show that $P,N$ are positive operators.
Let $E$ be the orthogonal projection onto $\mathcal{N}(|A|+A)$. Then $(|A|+A)E=0$ gives $E(|A|+A)=0$ by taking adjoints. And since $|A|$ commutes with $A$, the identity $(|A|+A)(|A|-A)=|A|^2-A^2=0$ shows that the range of $|A|-A$ lies in $\mathcal{N}(|A|+A)$, which gives $E(|A|-A)=|A|-A$. Hence,
$$
2EA=E(|A|+A)-E(|A|-A) = A-|A| \\
|A| = (I-2E)A \\
2E|A| = E(|A|+A)+E(|A|-A)=|A|-A \\
A = (I-2E)|A|.
$$
These two equations are consistent because $(I-2E)^2=I-4E+4E^2=I-4E+4E=I$ (using $E^2=E$), which establishes $I-2E$ as its own inverse. Taking adjoints of the above equations shows that $E$ commutes with $A$ and with $|A|$, which is useful in what follows. Now the operators $P$ and $N$ may be written as
$$
P=\frac{1}{2}(|A|+A)=\frac{1}{2}(|A|+(I-2E)|A|)=(I-E)|A|, \\
N=\frac{1}{2}(|A|-A)=\frac{1}{2}(|A|-(I-2E)|A|)=E|A|.
$$
Because $E$ commutes with $A$, it must also commute with $A^2$ and, hence, with $|A|=(A^2)^{1/2}$. By the result of the first paragraph, $P=(I-E)|A|$ and $N=E|A|$ are positive.
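The whole construction can be sketched numerically for a symmetric matrix. In the demo below (numpy and the random $A$ are my choices), $|A|$ is computed from the eigendecomposition of $A^2$, and $E$ is realized as the projection onto the eigenvectors of $A$ with negative eigenvalue, which span $\mathcal{N}(|A|+A)$ for a generic $A$ with no zero eigenvalue.

```python
# Sketch of the decomposition above: |A| = (A^2)^{1/2}, E projects onto
# N(|A| + A), and P = (I - E)|A|, N = E|A| give A = P - N with P, N positive.
import numpy as np

rng = np.random.default_rng(3)
n = 5
M = rng.standard_normal((n, n))
A = M + M.T                                   # self-adjoint, generally indefinite

w2, V = np.linalg.eigh(A @ A)                 # A^2 is positive, so w2 >= 0
absA = V @ np.diag(np.sqrt(np.clip(w2, 0, None))) @ V.T   # |A| = (A^2)^{1/2}

# N(|A| + A) is spanned by eigenvectors of A with negative eigenvalue
# (generic A: no zero eigenvalue), so E is the projection onto that span.
wA, U = np.linalg.eigh(A)
Uneg = U[:, wA < 0]
E = Uneg @ Uneg.T

P = (np.eye(n) - E) @ absA
N = E @ absA

assert np.allclose(absA, (np.eye(n) - 2 * E) @ A)   # |A| = (I - 2E)A
assert np.allclose(P, (absA + A) / 2)               # P = (|A| + A)/2
assert np.allclose(N, (absA - A) / 2)               # N = (|A| - A)/2
assert np.allclose(A, P - N)                        # A = P - N
assert np.allclose(P @ N, 0, atol=1e-8)             # PN = 0
assert np.linalg.eigvalsh(P).min() > -1e-8          # P positive
assert np.linalg.eigvalsh(N).min() > -1e-8          # N positive
print("A = P - N with P, N positive and PN = NP = 0")
```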
To be clear and correct, in the following it is assumed that "$\,W$ is positive$\,$" signifies that $\langle W\alpha|\alpha\rangle > 0$ for all $\alpha \neq 0$.
Ad $1)\:\:$ If $\,T,S\,$ are positive and they commute, then $\sqrt S\,$ (the unique positive square root, obtained as a limit of polynomials in $S$) also commutes with $T$, that is, $T\sqrt S = \sqrt S\,T$. Then $$\langle TS\alpha|\alpha\rangle \:=\: \langle T\sqrt S\,\alpha|\sqrt S\,\alpha\rangle> 0\,$$ for any $\alpha\neq0\,$, since $\sqrt S\,\alpha\neq0$ by injectivity of $\sqrt S$.
Ad $2b)\:\:$ Decomposing the identity along the subspace $V$ as $\,I=(I-E)+E\,$, orthogonality lets you take the positive square root summand-wise: $$\begin{align} T \: & =\: \sqrt{I+E} \;=\;\sqrt{(I-E)+2E}\\[1ex] & =\: (I-E) + \sqrt2\,E \;=\; I + \big(\sqrt 2 -1\big) E \end{align}$$
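This square-root formula is easy to verify numerically: for any orthogonal projection $E$, squaring $I + (\sqrt 2 - 1)E$ recovers $I + E$. The projection in the sketch below (onto the column space of a random matrix, via numpy) is my choice for the demo.

```python
# Sketch: for an orthogonal projection E, sqrt(I + E) = I + (sqrt(2) - 1) E,
# since E^2 = E makes the square collapse to I + E.
import numpy as np

rng = np.random.default_rng(4)
n, r = 6, 2
X = rng.standard_normal((n, r))
Q, _ = np.linalg.qr(X)
E = Q @ Q.T                                # orthogonal projection onto a 2-dim subspace

T = np.eye(n) + (np.sqrt(2) - 1) * E
assert np.allclose(T @ T, np.eye(n) + E)   # T^2 = I + E
assert np.linalg.eigvalsh(T).min() > 0     # T is positive
print("sqrt(I + E) = I + (sqrt(2) - 1) E verified")
```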