If $\mathbf{A}$ is positive semidefinite, the sequence $\{\mathbf{A}_k\} = \{\mathbf{A}+\frac{1}{k}\mathbf{I}\}$ consists of positive definite matrices$^1$. Moreover, $\mathbf{A}_k \to \mathbf{A}$ in operator norm, and writing $\mathbf{A}_k = \mathbf{L}_k\mathbf{L}_k^T$, the Cholesky factors $\mathbf{L}_k$ converge to a limit we name $\mathbf{L}$. Finally, $\mathbf{L}$ has the desired property, namely $\mathbf{A} = \mathbf{L}\mathbf{L}^T$.
Therefore, an efficient algorithm is to add a small diagonal shift $\varepsilon \mathbf{I}$ to $\mathbf{A}$ so that $\mathbf{A} + \varepsilon \mathbf{I}$ is numerically positive definite, and then perform the Cholesky decomposition. We finally remark that computing the eigenvalues of $\mathbf{A}$ instead would require more floating point operations.
$^1$ Proof sketch on Wikipedia.
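The shifted-Cholesky idea above can be sketched in Python with NumPy (the function name and the default value of $\varepsilon$ are illustrative choices, not prescribed by the answer):

```python
import numpy as np

def psd_cholesky(A, eps=1e-10):
    """Approximate Cholesky factor of a positive semidefinite matrix A.

    Adds a small diagonal jitter eps*I so that A + eps*I is numerically
    positive definite, then factors it with the standard routine.
    """
    n = A.shape[0]
    return np.linalg.cholesky(A + eps * np.eye(n))

# Rank-deficient PSD example: A = B^T B with a single row B, so rank(A) = 1
# and plain np.linalg.cholesky(A) would raise LinAlgError.
B = np.array([[1.0, 2.0, 3.0]])
A = B.T @ B
L = psd_cholesky(A)
print(np.allclose(L @ L.T, A, atol=1e-4))  # reconstruction error is O(eps)
```

The reconstruction error is governed by the size of the jitter, so $\varepsilon$ should be chosen just large enough to make the factorization succeed.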
What follows is a particular constructive solution $T$, an upper triangular matrix with non-negative diagonal.
There is a unique Hermitian positive semidefinite matrix $H$ such that $A=H^2$; let $\operatorname{rank}(H)=\operatorname{rank}(A)=r$. We consider the standard (Maple) $QR$ decomposition of $H$: $H=QR$, where $Q\in M_{n,r}$, $R\in M_{r,n}$, the diagonal of $R$ being positive; moreover $Q^*Q=I_r$. The matrix $T=\begin{pmatrix}R\\0_{n-r,n}\end{pmatrix}$ is upper triangular with non-negative diagonal. Clearly $T^*T=R^*R$, and $A=H^2=H^*H=R^*Q^*QR=R^*R$, and we are done.
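This construction can be sketched numerically as follows. Here the PSD square root $H$ is computed by eigendecomposition rather than symbolically, and the rank-deficient test matrix $A$ is an assumed example, not one from the answer:

```python
import numpy as np

B = np.array([[5., 1., 3., -2.],
              [4., -3., -4., 3.]])
A = B.T @ B                          # positive semidefinite, rank 2

# Unique PSD square root H of A, via the spectral decomposition.
w, V = np.linalg.eigh(A)
w = np.clip(w, 0.0, None)            # clamp tiny negative round-off eigenvalues
H = V @ np.diag(np.sqrt(w)) @ V.T

# QR decomposition of H; flip row signs so the diagonal of R is non-negative
# (the matching sign flip on columns of Q leaves R^T R unchanged).
Q, R = np.linalg.qr(H)
s = np.sign(np.diag(R)); s[s == 0] = 1
R = np.diag(s) @ R

T = R                                # n x n upper triangular, diag >= 0
print(np.allclose(T.T @ T, A))       # A = T^* T holds numerically
```

Because $H$ is square here, $R$ is already $n \times n$, so no explicit zero-padding is needed: the rows of $R$ beyond the rank $r$ are numerically negligible.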
EDIT. Answer to epsilone. This is the QR decomposition for singular real matrices (cf. Horn-Johnson, Matrix Analysis, p. 112, or Robbin, Matrix Algebra Using MINImal MATlab, p. 362). Assume for instance $r=3$; then the method gives the real decomposition $H=[Q_1,Q_2,Q_3][R_1^T,R_2^T,R_3^T]^T=Q_1R_1+Q_2R_2+Q_3R_3$, where the $Q_i$ are orthogonal vectors and $R$ is "triangular" with $R_i[i]\not= 0$ for every $i$. If $R_i[i]<0$, then replace $Q_i,R_i$ with $-Q_i,-R_i$, and we are done.
Maple 18 returns a result $Q',R'$ that is not exactly in the above form; to obtain $Q,R$, it suffices to keep the first $r$ columns of $-Q'$ and the first $r$ rows of $-R'$ (curiously, note the sign "$-$"). The method uses the Gram-Schmidt process and Householder transformations.
Example: Let $A=B^TB$ with the unknown matrix $B=\begin{pmatrix}5&1&3&-2\\4&-3&-4&3\end{pmatrix}$; we find $R\approx -\begin{pmatrix}-6.4&1.09&0.156&-0.312\\0&-2.97&-5&3.59\end{pmatrix}$.
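The example can be checked numerically without going through $H$ at all: since $B=QR$ implies $A=B^TB=R^TR$, the QR decomposition of $B$ itself recovers the quoted $R$ (up to the overall sign, which we normalize away here):

```python
import numpy as np

B = np.array([[5., 1., 3., -2.],
              [4., -3., -4., 3.]])

# Thin QR of B: Q is 2x2 orthogonal, R is 2x4 upper triangular.
Q, R = np.linalg.qr(B)

# Normalize so the diagonal of R is positive, flipping rows as needed;
# this leaves R^T R (hence A) unchanged.
s = np.sign(np.diag(R)); s[s == 0] = 1
R = np.diag(s) @ R

print(np.round(R, 3))
# R is approximately [[6.403, -1.093, -0.156, 0.312],
#                     [0,      2.967,  4.998, -3.592]],
# i.e. the matrix quoted above with its overall minus sign absorbed.
```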
I forgot to mention: when the matrices are complex, replace the possible factors $-1$ with factors $e^{\pm i\theta}$.
Best Answer
From Wikipedia:
$A=LL^*\implies x^*Ax=(L^*x)^*(L^*x)\ge 0$
Since $L$ is invertible, $L^*x\ne 0$ unless $x=0$, so $x^*Ax>0\ \forall\ x\ne 0$
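A quick numerical illustration of this implication (the matrix size, random seed, and diagonal shift are arbitrary choices, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(0)

# An invertible lower-triangular L: random strictly lower part plus a
# diagonal shifted well away from zero.
L = np.tril(rng.standard_normal((4, 4))) + 4 * np.eye(4)
A = L @ L.T

# x^T A x = ||L^T x||^2 > 0 for every nonzero x.
for _ in range(100):
    x = rng.standard_normal(4)
    assert x @ A @ x > 0
print("x^T A x > 0 for all sampled nonzero x")
```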