This is a result of using Cramer's rule to calculate the inverse of $\mathbf{X}^{\prime}\Sigma^{-1}\mathbf{X}$.
Note that the matrix $(\mathbf{X}^{\prime}\Sigma^{-1}\mathbf{X})^{-1}$ is the covariance matrix of the parameters $\beta_i$. So
$$
\text{Var}(\beta_1) = (\mathbf{X}^{\prime}\Sigma^{-1}\mathbf{X})^{-1}_{1,1}
$$
The $(1,1)$ element of this matrix is the variance of the parameter $\beta_1$. To calculate this value we can use Cramer's rule, which gives the inverse of an invertible matrix $A$ as
$$
A^{-1} = \frac{1}{\text{det}(A)}\text{Adj}(A)
$$
In this case $A=\mathbf{X}^{\prime}\Sigma^{-1}\mathbf{X}$, and the element we are seeking in $\text{Adj}(A)$ is $|F|$.
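As a quick sanity check of the adjugate formula (not part of the original post), here is a NumPy sketch that builds the adjugate from cofactors and confirms that dividing by the determinant recovers the inverse:

```python
import numpy as np

def adjugate(A):
    """Adjugate (transpose of the cofactor matrix) of a square matrix."""
    n = A.shape[0]
    C = np.empty_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            # (i, j) cofactor: signed determinant of the minor
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

A = np.array([[4.0, 7.0], [2.0, 6.0]])
inv_via_adj = adjugate(A) / np.linalg.det(A)
print(np.allclose(inv_via_adj, np.linalg.inv(A)))  # True
```

For a single element of the inverse, such as $\text{Var}(\beta_1)$ above, one only needs the corresponding cofactor and the determinant, not the full inverse.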
Cramer's rule is a very inefficient way of computing an inverse compared to standard methods. It usually pops up in situations like this one, where one needs an explicit expression for a specific element of the inverse.
Question: In the setup above, are conditions (1) and (2) satisfied?
Answer: No, in general the conditions are not satisfied.
The following counterexample proves this.
\begin{align*}
X &= \begin{bmatrix}
1 & 0 \\
1 & 0 \\
0 & 1 \\
0 & 1 \\
\end{bmatrix},
Y = \begin{bmatrix}
1 \\2 \\3\\4
\end{bmatrix}, \Sigma = \begin{bmatrix}
1 & 0&0&0 \\
0&5&0&0 \\
0&0&5&0\\
0&0&0&5
\end{bmatrix}.
\end{align*}
Notice that $\Sigma$, $X'X$ and $X'\Sigma^{-1}X$ are all diagonal matrices with strictly positive diagonal entries. Thus, they are all positive definite and have the standard basis vectors as eigenvectors. That is, they satisfy the setup and condition 1). It is easy to check that the OLS and GLS estimates differ (see the code below), so condition 2) must not hold. Let's see why.
In this example, $k=2$ so the columns of $H$ are two eigenvectors of $\Sigma$. Let $A=[a_1, a_2]$. Then $X = HA$ implies that $Ha_1 = x_1 = [1,1,0,0]'$. The eigenvectors of $\Sigma$ are the standard basis vectors, say $e_i$, and, thus, it must be that $H = [e_1, e_2]$ up to reordering of the columns. But then $x_2 =[0,0,1,1]' \notin \mathrm{span}(H)$, i.e. we cannot pick $a_2$ to satisfy the requirement that $X=HA$. We conclude condition 2) is not satisfied.
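The span argument can also be checked numerically. A small NumPy sketch (using $H = [e_1, e_2]$ and $x_2$ as in the text): a nonzero least-squares residual means $x_2 \notin \mathrm{span}(H)$, so no $a_2$ with $x_2 = Ha_2$ exists.

```python
import numpy as np

# H holds the only eigenvectors of Sigma whose span contains x1 = [1,1,0,0]':
# the first two standard basis vectors (up to reordering).
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0],
              [0.0, 0.0]])
x2 = np.array([0.0, 0.0, 1.0, 1.0])

# Least-squares projection of x2 onto span(H); the residual is the
# squared distance from x2 to the span.
a2, residual, _, _ = np.linalg.lstsq(H, x2, rcond=None)
print(residual)  # [2.] -> x2 is not in span(H)
```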
The following R snippet shows that the GLS estimates (here WLS, since the covariance matrix is diagonal) differ from the OLS estimates.
X <- matrix(c(1,1,0,0,0,0,1,1), ncol = 2); Y <- 1:4; E <- diag(c(1, 5, 5, 5))
coef(lm(Y ~ X - 1))
>X1 X2
>1.5 3.5
coef(lm(Y ~ X - 1, weights = 1/diag(E)))
>X1 X2
>1.166667 3.500000
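The same comparison can be done in closed form, $\hat\beta_{OLS} = (X'X)^{-1}X'Y$ versus $\hat\beta_{GLS} = (X'\Sigma^{-1}X)^{-1}X'\Sigma^{-1}Y$; here is a NumPy sketch (not part of the original answer):

```python
import numpy as np

X = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)
Y = np.array([1.0, 2.0, 3.0, 4.0])
Sigma_inv = np.diag([1.0, 1/5, 1/5, 1/5])  # Sigma is diagonal, so easy to invert

# OLS: solve (X'X) beta = X'Y
beta_ols = np.linalg.solve(X.T @ X, X.T @ Y)
# GLS: solve (X' Sigma^{-1} X) beta = X' Sigma^{-1} Y
beta_gls = np.linalg.solve(X.T @ Sigma_inv @ X, X.T @ Sigma_inv @ Y)

print(beta_ols)  # [1.5 3.5]
print(beta_gls)  # [1.16666667 3.5]
```

The first coefficient differs (1.5 vs. 7/6), confirming that OLS and GLS do not coincide for this design.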
Best Answer
My guess is that the numbers are too big (the determinant is large) and you're running into a computational problem.
I was able to replicate your error by running:
The problem is numerical. You might be able to solve it by making some transformation of your $X$ matrix that makes the numbers smaller but allows you to work out what $\left(X'X\right)^{-1}$ is.
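Since the original data isn't shown, here is a hypothetical NumPy illustration of that kind of transformation: a regressor with very large values makes $X'X$ badly conditioned, and centering/scaling that column fixes it without changing the fitted model.

```python
import numpy as np

# Hypothetical design: an intercept plus a column with huge values
# (e.g. raw timestamps), which makes X'X nearly singular in floating point.
n = 100
rng = np.random.default_rng(0)
x_big = 1e9 + rng.normal(size=n)
X = np.column_stack([np.ones(n), x_big])
print(np.linalg.cond(X.T @ X))  # enormous condition number

# Same design after centering and scaling the large column:
x_std = (x_big - x_big.mean()) / x_big.std()
X_scaled = np.column_stack([np.ones(n), x_std])
print(np.linalg.cond(X_scaled.T @ X_scaled))  # close to 1
```

The rescaled fit gives coefficients on a different scale, but they map back to the original ones by undoing the centering and scaling.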