Edit: I think that the first answer below is more efficient and easier to write. Nevertheless, I have added a more concrete approach which might be closer to what you were after.
By some sort of transitivity: recall that every open or closed subset of a locally compact Hausdorff space is locally compact in the induced topology. We will use both cases, open and closed.
I consider every space here equipped with the topology induced by the Euclidean norm of $\mathbb{C}^{n\times n}$. Hence every space is Hausdorff. This matters here: the open-subset case of the hereditary property above does require the Hausdorff assumption.
Like every finite-dimensional vector space over $\mathbb{R}$ or $\mathbb{C}$, $\mathbb{C}^{n\times n}$ is locally compact when equipped with the topology induced by any norm.
Clearly, $\mathcal{H}_n^+\mathbb{C}$, the set of positive semidefinite matrices, is closed in the locally compact $\mathbb{C}^{n\times n}$: it is cut out by the closed conditions $A=A^*$ and $v^*Av\geq 0$ for each fixed $v\in\mathbb{C}^n$. So $\mathcal{H}_n^+\mathbb{C}$ is a locally compact space.
Now $\mathcal{P}=\{A\in \mathcal{H}_n^+\mathbb{C}\;;\det A>0\}$, since a positive semidefinite matrix is positive definite precisely when all its eigenvalues, hence its determinant, are positive. By continuity of the determinant, $\mathcal P$ is open in the locally compact $\mathcal{H}_n^+\mathbb{C}$. Hence $\mathcal P$ is a locally compact space. QED.
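As a numerical sanity check of the identification $\mathcal{P}=\{A\in \mathcal{H}_n^+\mathbb{C}\;;\det A>0\}$, here is a small sketch (assuming NumPy; the helper `random_psd` is mine, not part of the answer): a full-rank Gram matrix $BB^*$ has positive determinant, while a rank-deficient one is positive semidefinite with determinant zero.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_psd(n, rank):
    # A = B B^* is Hermitian positive semidefinite, of rank <= rank.
    B = rng.standard_normal((n, rank)) + 1j * rng.standard_normal((n, rank))
    return B @ B.conj().T

A_pd = random_psd(4, 4)     # full rank: positive definite
A_sing = random_psd(4, 2)   # rank-deficient: PSD but det = 0

det_pd = np.linalg.det(A_pd).real       # > 0: A_pd lies in P
det_sing = np.linalg.det(A_sing).real   # = 0: A_sing is on the boundary of P
min_eig_pd = np.linalg.eigvalsh(A_pd).min()
```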
Parametrized alternative: fix a positive definite Hermitian matrix $A_0$, and denote by $t_1^0,\ldots,t_n^0$ its (positive) eigenvalues. Let $\epsilon:=\min_j t_j^0/2>0$. Denote by $U_n$ the unitary group and by $\mathcal{H}_n^{++}$ the cone of positive definite Hermitian matrices. Now consider the map
$$
\phi:U_n\times \prod_{j=1}^n[t_j^0-\epsilon,t_j^0+\epsilon]\longrightarrow \mathcal{H}_n^{++}
$$
which sends $(U,t_1,\ldots,t_n)$ to $U\operatorname{diag}(t_1,\ldots,t_n)U^*$. Since $U_n$ is compact, the domain is compact. And since $\phi$ is continuous, the range of $\phi$ is a compact subset of $\mathcal{H}_n^{++}$ containing $A_0$. So it only remains to check that this range is a neighborhood of $A_0$ in $\mathcal{H}_n^{++}$. To that aim, note that it contains
$$
\phi(U_n\times \prod_{j=1}^n(t_j^0-\epsilon,t_j^0+\epsilon))\ni A_0
$$
i.e. the set of positive definite Hermitian matrices whose spectrum $\{t_1,\ldots,t_n\}$ satisfies, up to a permutation, $|t_j-t_j^0|<\epsilon$ for every $j=1,\ldots,n$. By continuity of polynomial roots over $\mathbb{C}$ applied to the characteristic polynomial, this set is open in $\mathcal{H}_n^{++}$. QED.
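The map $\phi$ is easy to experiment with numerically. A minimal sketch (assuming NumPy; sampling $U$ via a QR decomposition is my choice, not part of the argument): build $\phi(U,t)=U\operatorname{diag}(t)U^*$ for a random unitary $U$ and a point $t$ of the $\epsilon$-box, and check that the result is Hermitian positive definite with spectrum exactly $\{t_j\}$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
t0 = np.array([1.0, 2.0, 4.0])   # eigenvalues of a chosen A_0
eps = t0.min() / 2               # epsilon = (min_j t_j^0) / 2

# A random unitary U from the QR decomposition of a random complex matrix.
U, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

# A point of the epsilon-box around (t_1^0, ..., t_n^0); all entries stay positive.
t = t0 + rng.uniform(-eps, eps, size=n)

A = U @ np.diag(t) @ U.conj().T  # phi(U, t) = U diag(t_1, ..., t_n) U^*

herm_err = np.linalg.norm(A - A.conj().T)   # A is Hermitian
spectrum = np.linalg.eigvalsh(A)            # eigenvalues of A, ascending
```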
$\langle v,v\rangle >0$ is just a statement about real numbers. One of the two axioms for a Hermitian form is
$$\langle u,v\rangle=\overline{\langle v,u\rangle}.$$
For $u=v$ this implies
$$\langle v,v\rangle=\overline{\langle v,v\rangle}\in\mathbb R$$
since a complex number equal to its own conjugate must be real.
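A quick numerical illustration (assuming NumPy): for a complex vector $v$, the standard Hermitian inner product $\langle v,v\rangle=\sum_i \overline{v_i}v_i$ has imaginary part exactly zero, since each term $\overline{z}z=|z|^2$ is real.

```python
import numpy as np

v = np.array([1 + 2j, -3j, 0.5 - 1j])
ip = np.vdot(v, v)   # vdot conjugates its first argument: sum conj(v_i) * v_i

im_part = ip.imag    # exactly 0: each term conj(z) * z = |z|^2 is real
val = ip.real        # equals |v|^2 = 5 + 9 + 1.25 = 15.25
```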
Over the complex numbers (or any other algebraically closed field with $\operatorname{char} k\neq 2$), every invertible matrix has a square root. In fact, over $\mathbb C$, since every invertible matrix has a logarithm, we can take the one-parameter family of matrices $e^{t\log A}$, and taking $t=1/2$ yields a square root of $A$. To see the existence of matrix logarithms, it suffices to show that $I+N$ has a logarithm, where $N$ is nilpotent, and this follows from the Taylor series of $\log(1+x)$ (similar to Ted's proof of the existence of square roots).
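The finite-series construction can be sketched directly (assuming NumPy; the helpers `nilpotent_log` and `nilpotent_exp` are mine): for a unipotent $A=I+N$ the Taylor series for $\log A$ and for $e^{\frac12\log A}$ both terminate, because everything in sight is nilpotent, and the result squares back to $A$.

```python
import numpy as np

def nilpotent_log(A):
    # log(I + N) = N - N^2/2 + N^3/3 - ...; terminates since N is nilpotent.
    n = A.shape[0]
    N = A - np.eye(n)
    L = np.zeros_like(N)
    term = np.eye(n)
    for k in range(1, n + 1):
        term = term @ N
        L += ((-1) ** (k + 1) / k) * term
    return L

def nilpotent_exp(L):
    # exp(L) = I + L + L^2/2! + ...; terminates since L is nilpotent.
    n = L.shape[0]
    E = np.zeros_like(L)
    term = np.eye(n)
    for k in range(n + 1):
        E += term
        term = term @ L / (k + 1)
    return E

# Example: a single 3x3 Jordan block with eigenvalue 1 (unipotent).
A = np.eye(3) + np.diag([1.0, 1.0], k=1)
B = nilpotent_exp(0.5 * nilpotent_log(A))  # the square root e^{(1/2) log A}
err = np.linalg.norm(B @ B - A)            # B^2 = A up to rounding
```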
Thus, since the invertible part of $A$ always has a square root, we can determine whether $A$ has a square root by restricting to $\displaystyle\bigcup_n \ker A^n$, the largest subspace on which $A$ acts nilpotently. In what follows, we will assume that $A$ is nilpotent.
Up to conjugation, $A$ is determined by its Jordan normal form. However, equivalent to the JNF of a nilpotent matrix is the data $a_i'=\dim \ker A^i$ for all $i$. This is obviously an increasing sequence. Less obvious is that the sequence $(a_i)$ with $a_i=a'_i-a'_{i-1}$ is a decreasing sequence, and hence forms a partition of $\dim V$, where $A:V\to V$. This data is equivalent to the data in the JNF, as $a_i-a_{i+1}$ is the number of Jordan blocks of size $i$. More explicitly, a Jordan block of size $k$ corresponds to the partition $(1,1,\ldots,1,0,0,\ldots)$ with $k$ ones, and if a nilpotent matrix $A=\oplus A_i$ is written in block form where each block $A_i$ corresponds to a partition $\pi_i$, then $A$ corresponds to the partition $\pi=\sum \pi_i$, where the sum is taken termwise, e.g. $(2,1)+(1,1)+(7,4,2)=(10,6,2)$.
Moreover, $A^2$ corresponds to the partition $(a_1+a_2, a_3+a_4,\ldots, a_{2i-1}+a_{2i}, \ldots).$ Because every matrix will be conjugate to a JNF matrix and $\sqrt{SAS^{-1}}=S\sqrt{A}S^{-1}$, we see that a matrix will have a square root if and only if the corresponding partition has a "square root."
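This bookkeeping is easy to check by machine. A hedged sketch (assuming NumPy; the helpers `kernel_dim` and `partition` are mine): compute the partition of a nilpotent matrix from $a_i'=\dim\ker A^i$ and verify the pairing rule for $A^2$ on a matrix with Jordan blocks of sizes $3$ and $2$.

```python
import numpy as np

def kernel_dim(M, tol=1e-10):
    # dim ker M = number of (near-)zero singular values.
    return int(np.sum(np.linalg.svd(M, compute_uv=False) < tol))

def partition(A):
    # a_i = dim ker A^i - dim ker A^{i-1}, with trailing zeros dropped.
    n = A.shape[0]
    a_prime = [0] + [kernel_dim(np.linalg.matrix_power(A, i)) for i in range(1, n + 1)]
    a = [a_prime[i] - a_prime[i - 1] for i in range(1, n + 1)]
    return [x for x in a if x > 0]

# Nilpotent matrix with Jordan blocks of sizes 3 and 2: partition (2, 2, 1).
A = np.zeros((5, 5))
A[0, 1] = A[1, 2] = A[3, 4] = 1.0

p = partition(A)
p2 = partition(A @ A)

# The stated rule: A^2 corresponds to (a_1 + a_2, a_3 + a_4, ...).
padded = p + [0] * (len(p) % 2)
paired = [padded[i] + padded[i + 1] for i in range(0, len(padded), 2)]
```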
The only obstruction to a partition having a square root is two consecutive entries that are equal and odd: any square root $(b_i)$ would need $b_{2i-1}+b_{2i}=a_i$ with $(b_i)$ decreasing, forcing $b_{2i}\le\lfloor a_i/2\rfloor<\lceil a_{i+1}/2\rceil\le b_{2i+1}$, a contradiction. Otherwise, we can take one (of many) square roots by replacing each $a_i$ with the pair $\lceil a_i/2 \rceil, \lfloor a_i/2 \rfloor$.
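The criterion and the construction can be phrased purely in terms of partitions. A small sketch (plain Python; the function names are mine):

```python
from math import ceil, floor

def has_square_root(p):
    # Obstruction: two consecutive entries that are equal and odd.
    return not any(p[i] == p[i + 1] and p[i] % 2 == 1 for i in range(len(p) - 1))

def partition_sqrt(p):
    # Replace each a_i by the pair (ceil(a_i/2), floor(a_i/2)).
    assert has_square_root(p)
    q = []
    for a in p:
        q += [ceil(a / 2), floor(a / 2)]
    return [x for x in q if x > 0]

def squared(q):
    # The partition of B^2 given the partition q of B: pair up consecutive entries.
    q = q + [0] * (len(q) % 2)
    return [x for x in (q[i] + q[i + 1] for i in range(0, len(q), 2)) if x > 0]
```

For example, the partition $(4,1)$ of the $5\times 5$ matrix above has the square root $(2,2,1)$, while $(3,3)$ has none.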