The statement isn't true in general. Consider $v_1, v_2 \in \mathbb{C}^2$ given by$$v_1 = \begin{bmatrix} 1 \\ 1\end{bmatrix}, v_2 = \begin{bmatrix} i \\ 0\end{bmatrix}$$
Then $\dim_{\mathbb{R}}\big[\operatorname{span}_{\mathbb{R}}\{v_1, v_2\}\big] = 2 = \dim_{\mathbb{C}}\big[\operatorname{span}_{\mathbb{C}}\{v_1, v_2\}\big]$ but
$$V^*V = \begin{bmatrix} 2 & i \\ -i & 1\end{bmatrix}$$
isn't real.
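For a quick numerical sanity check, here is a minimal NumPy sketch (variable names are mine) that reproduces the computation above, identifying $\mathbb{C}^2$ with $\mathbb{R}^4$ by stacking real and imaginary parts:

```python
import numpy as np

# Columns of V are the counterexample vectors v1 = (1, 1), v2 = (i, 0).
V = np.array([[1, 1j],
              [1, 0]])

# dim_C of the complex span: rank of V over C.
dim_C = np.linalg.matrix_rank(V)

# dim_R of the real span: rank of the real 4x2 matrix obtained by
# stacking real and imaginary parts (C^2 viewed as R^4).
V_real = np.vstack([V.real, V.imag])
dim_R = np.linalg.matrix_rank(V_real)

gram = V.conj().T @ V                # V^*V
print(dim_R, dim_C)                  # 2 2
print(gram)                          # [[2, i], [-i, 1]]
print(np.allclose(gram.imag, 0))     # False: V^*V is not real
```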
The other direction, on the other hand, is true. Assume that $V^*V$ is real. WLOG (reordering the vectors if necessary) assume that $v_1, \ldots, v_r$ form a basis of $\operatorname{span}_{\mathbb{R}}\{v_1, \ldots, v_n\}$ for some $1 \le r \le n$. We claim that they are also linearly independent over $\mathbb{C}$.
Suppose $\sum_{i=1}^r \alpha_iv_i = 0$ for some scalars $\alpha_i \in \mathbb{C}$. Taking the inner product with $v_j$ we get
$$\sum_{i=1}^r\alpha_i\underbrace{\langle v_i, v_j\rangle}_{\in\mathbb{R}} = 0, \quad\forall j =1, \ldots, r$$
Taking the imaginary part of each equation (recall $\langle v_i, v_j\rangle \in \mathbb{R}$), $$\sum_{i=1}^r(\operatorname{Im}\alpha_i)\langle v_i, v_j\rangle = 0, \quad\forall j =1, \ldots, r$$
Multiplying this by $\operatorname{Im}\alpha_j$ we obtain
$$\left\langle\sum_{i=1}^r(\operatorname{Im}\alpha_i)v_i, (\operatorname{Im}\alpha_j)v_j \right\rangle = \sum_{i=1}^r(\operatorname{Im}\alpha_i)(\operatorname{Im}\alpha_j)\langle v_i, v_j\rangle = 0, \quad\forall j =1, \ldots, r$$
Summing this over $j=1, \ldots, r$ we get
$$\left\|\sum_{i=1}^r(\operatorname{Im}\alpha_i)v_i\right\|^2 = \left\langle \sum_{i=1}^r(\operatorname{Im}\alpha_i)v_i, \sum_{j=1}^r(\operatorname{Im}\alpha_j)v_j\right\rangle = 0$$
so $\sum_{i=1}^r(\operatorname{Im}\alpha_i)v_i = 0$, which implies $\operatorname{Im}\alpha_i = 0$ for $i=1, \ldots, r$ since $v_1, \ldots, v_r$ are linearly independent over $\mathbb{R}$. Hence each $\alpha_i \in \mathbb{R}$, so $\sum_{i=1}^r \alpha_i v_i = 0$ is a real linear combination and, again by linear independence over $\mathbb{R}$, $\alpha_i = 0$ for $i=1, \ldots, r$. This proves the claim.
Therefore
\begin{align}
\dim_{\mathbb{R}}\big[\operatorname{span}_{\mathbb{R}}\{v_1, \ldots, v_n\}\big] &= \dim_{\mathbb{R}}\big[\operatorname{span}_{\mathbb{R}}\{v_1, \ldots, v_r\}\big] \\
&= r\\
&= \dim_{\mathbb{C}}\big[\operatorname{span}_{\mathbb{C}}\{v_1, \ldots, v_r\}\big] \\
&= \dim_{\mathbb{C}}\big[\operatorname{span}_{\mathbb{C}}\{v_1, \ldots,v_n\}\big]
\end{align}
where the last equality holds because each $v_k$ with $k > r$ lies in $\operatorname{span}_{\mathbb{R}}\{v_1, \ldots, v_r\} \subseteq \operatorname{span}_{\mathbb{C}}\{v_1, \ldots, v_r\}$.
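To illustrate the direction just proved numerically, here is a hedged check. One convenient way to manufacture matrices with real $V^*V$ (my assumption for this test, not the only such family) is $V = e^{i\theta}W$ with $W$ real, since then $V^*V = W^TW$:

```python
import numpy as np

rng = np.random.default_rng(0)

def dims(V):
    """Return (dim_R of the real column span, dim_C of the complex column span)."""
    dim_C = np.linalg.matrix_rank(V)
    dim_R = np.linalg.matrix_rank(np.vstack([V.real, V.imag]))
    return dim_R, dim_C

# Test family (an assumption, see above): V = e^{i theta} W with W real,
# for which V^*V = W^T W is real.
for _ in range(100):
    W = rng.standard_normal((5, 3))
    theta = rng.uniform(0, 2 * np.pi)
    V = np.exp(1j * theta) * W
    assert np.allclose((V.conj().T @ V).imag, 0)   # V^*V is real
    dim_R, dim_C = dims(V)
    assert dim_R == dim_C                          # dims agree, as proved
print("all checks passed")
```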
Best Answer
Briefly, $$x_{\text{new}} = \begin{bmatrix} \Re(x)^T & \Im(x)^T \end{bmatrix}^T \in \mathbb{R}^{2n}$$ where $\Re(x) \in \mathbb{R}^{n} $ is the real part of $x$ and $\Im(x) \in \mathbb{R}^{n}$ is the imaginary part of $x$.
A general complex system $$Cz = h,$$ where $C \in \mathbb{C}^{n \times n}$ and $h \in \mathbb{C}^n$, can be written as $$(A+iB)(x+iy) = f + ig$$ where $A, B \in \mathbb{R}^{n \times n}$ and $f, g \in \mathbb{R}^n$. Separating real and imaginary parts, this linear system is satisfied if and only if \begin{align} Ax - By &= f \\ Bx + Ay &= g \end{align} or equivalently $$ \begin{bmatrix} A & -B \\ B & A \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} f \\ g \end{bmatrix}. $$ In this manner, a complex linear system can be solved using only real arithmetic. This is useful in the context of computing, say, eigenvectors of real matrices or matrix pencils, for which the eigenvalues can be complex; the use of real arithmetic saves arithmetic operations and storage when some of the selected eigenvalues are real.
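A minimal NumPy sketch of this construction (variable names are mine), comparing the real $2n \times 2n$ solve against a direct complex solve:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Random complex system C z = h.
C = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
h = rng.standard_normal(n) + 1j * rng.standard_normal(n)

A, B = C.real, C.imag
f, g = h.real, h.imag

# Equivalent real 2n x 2n system [[A, -B], [B, A]] [x; y] = [f; g].
M = np.block([[A, -B],
              [B, A]])
xy = np.linalg.solve(M, np.concatenate([f, g]))
z_real_arith = xy[:n] + 1j * xy[n:]

# Compare against solving directly in complex arithmetic.
z_complex = np.linalg.solve(C, h)
print(np.allclose(z_real_arith, z_complex))  # True
```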