Here's what you can do. Let $A=(a_{ij})_{1\le i\le n, 1\le j\le m}$ be your matrix. I assume that all the entries are in some (not necessarily finite-dimensional) extension field of $\Bbb{Q}$.
Let $V_j$ be the $\Bbb{Q}$-span of all the entries $a_{ij}, 1\le i\le n$. That is, the span of the entries on column $j$. Note that the space $V_j$ is finite-dimensional (it is spanned by at most $n$ elements). Therefore we can find a basis $\mathcal{B}_j$ of $V_j$ over $\Bbb{Q}$. We can then write every entry $a_{ij}$, $1\le i\le n$, as a $\Bbb{Q}$-linear combination of the numbers in $\mathcal{B}_j$.
Here's how you can answer your question using Gaussian elimination (over $\Bbb{Q}$):
- Replace column $j$ of $A$ with $|\mathcal{B}_j|$ columns simply by replacing each entry $a_{ij}$ with its vector of coordinates with respect to $\mathcal{B}_j$. Call the resulting matrix $\tilde{A}$.
- Perform the usual Gaussian elimination on $\tilde{A}$. Observe that as all the entries of $\tilde{A}$ are in $\Bbb{Q}$, this process will never take you outside of $\Bbb{Q}$.
- You can then read the row rank over $\Bbb{Q}$ of $A$ as the rank of $\tilde{A}$. This is because the bases $\mathcal{B}_j$ faithfully represent linear (in)dependencies over $\Bbb{Q}$. Similarly, you can find a basis for the row space of $A$ over $\Bbb{Q}$ simply by rewriting the $j$th blocks of the non-zero rows of the (reduced) row echelon form of $\tilde{A}$ as elements of $V_j$.
- As usual, the rank of $\tilde{A}$ is also equal to the size of its largest non-vanishing minor.
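The elimination step can be sketched in Python using exact rational arithmetic, so that the computation genuinely never leaves $\Bbb{Q}$ (a minimal sketch; `rank_over_Q` is a name I made up for illustration, not a library function):

```python
from fractions import Fraction

def rank_over_Q(rows):
    """Rank of a matrix with rational entries, computed by exact
    Gaussian elimination so no floating-point error can creep in."""
    A = [[Fraction(x) for x in row] for row in rows]
    rank = 0
    ncols = len(A[0]) if A else 0
    for col in range(ncols):
        # find a pivot at or below the current rank row
        pivot = next((r for r in range(rank, len(A)) if A[r][col] != 0), None)
        if pivot is None:
            continue
        A[rank], A[pivot] = A[pivot], A[rank]
        # clear the entries below the pivot
        for r in range(rank + 1, len(A)):
            factor = A[r][col] / A[rank][col]
            A[r] = [a - factor * b for a, b in zip(A[r], A[rank])]
        rank += 1
    return rank
```

Since every entry of $\tilde{A}$ is rational, the whole run stays inside $\Bbb{Q}$, as promised.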
As a first example consider the one I proffered in the comments (to calibrate my understanding of the question):
$$
A=\left(\begin{array}{c}1\\ \sqrt{2}\\ \sqrt{3}\\ \sqrt{6} \end{array}\right).
$$
There is a single column. The space $V_1$ is the degree four extension field $F=\Bbb{Q}(\sqrt2,\sqrt3)$. From the first course on field extensions we know that
$\mathcal{B}_1=\{1,\sqrt2,\sqrt3,\sqrt6\}$ is a $\Bbb{Q}$-basis of $F$. Writing
the elements of $A$ in terms of this basis thus leads to the matrix
$$
\tilde{A}=\left(\begin{array}{rrrr}1&0&0&0\\ 0&1&0&0\\ 0&0&1&0\\ 0&0&0&1 \end{array}\right).
$$
We immediately see that $\tilde{A}$ is already in the row echelon form. Therefore $\tilde{A}$ has full rank, and we can conclude that the row rank of $A$ over $\Bbb{Q}$ is four.
As another example let's consider the matrix
$$
B=\left(\begin{array}{rr}
1&1\\
\sqrt2&i\\
3&1+i\\
2+\sqrt2&2i\end{array}\right).
$$
Here the entries on the first column are in the span of $\mathcal{B}_1=\{1,\sqrt2\}$
while the entries on the second column are in the span of $\mathcal{B}_2=\{1,i\}$.
Replacing the entries of $B$ with their respective coordinate vectors gives us the matrix
$$
\tilde{B}=\left(\begin{array}{rrrr}
1&0&1&0\\
0&1&0&1\\
3&0&1&1\\
2&1&0&2
\end{array}\right).
$$
Subtracting the appropriate multiples of the first row from the others gives
$$
\rightarrow
\left(\begin{array}{rrrr}
1&0&1&0\\
0&1&0&1\\
0&0&-2&1\\
0&1&-2&2
\end{array}\right).
$$
Further subtracting the prescribed multiples of the second row from those below it yields
$$
\rightarrow
\left(\begin{array}{rrrr}
1&0&1&0\\
0&1&0&1\\
0&0&-2&1\\
0&0&-2&1
\end{array}\right).
$$
From here we see that the last row is a replica of the third, and will vanish in the next step. We can conclude that $\tilde{B}$ has rank $3$, and therefore the row rank of $B$ over $\Bbb{Q}$ is also $3$. Indeed, if we denote by $u_i$ the $i$th row of $B$, we easily see that $u_3-u_1=(2,i)=u_4-u_2,$ so the rows of $B$ are linearly dependent over $\Bbb{Q}$.
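The dependency found above can be double-checked numerically (a quick floating-point sanity check, not part of the algorithm itself; $i$ is represented by Python's `1j`):

```python
import math

# rows of B, with the surds evaluated numerically
u1 = (1, 1)
u2 = (math.sqrt(2), 1j)
u3 = (3, 1 + 1j)
u4 = (2 + math.sqrt(2), 2j)

# the rational dependency found by the elimination: u3 - u1 == u4 - u2
lhs = tuple(a - b for a, b in zip(u3, u1))
rhs = tuple(a - b for a, b in zip(u4, u2))
assert all(abs(x - y) < 1e-12 for x, y in zip(lhs, rhs))  # both equal (2, i)
```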
Running this algorithm makes it necessary to correctly identify the dimensions of the spaces $V_j$ (after which finding bases for them is usually straightforward). Some basic facts about algebraic extensions suffice to handle my example cases, but I do not know whether such techniques will be available in the examples that interest you the most.
In the second example from the question the space $V_1$ consists of polynomials of degree at most one with coefficients from $\Bbb{Q}$, so $\mathcal{B}_1=\{1,x\}$.
We see that $V_2=\sqrt{2}V_1$, so we can use $\mathcal{B}_2=\sqrt2\mathcal{B}_1$. Finally we see that $V_3$ is the 1-dimensional space spanned by $e^x$. For this matrix, call it $C$, we thus get
$$
\tilde{C}=\left(\begin{array}{rrrrr}
1&1&0&1&1\\
1&0&1&0&1\\
0&0&0&0&1\end{array}\right).
$$
Looking at columns $2,3$ and $5$ we immediately see that $\tilde{C}$ has full rank. Therefore the row rank of $C$ over $\Bbb{Q}$ is also $3$.
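That minor can be checked directly (a small sketch; note that Python indexes columns from $0$, so columns $2,3,5$ of the text are indices $1,2,4$ here):

```python
# C~ as given above
C_tilde = [
    [1, 1, 0, 1, 1],
    [1, 0, 1, 0, 1],
    [0, 0, 0, 0, 1],
]
# the 3x3 submatrix on columns 2, 3 and 5
M = [[row[j] for j in (1, 2, 4)] for row in C_tilde]
# its determinant by the rule of Sarrus
det = (M[0][0]*M[1][1]*M[2][2] + M[0][1]*M[1][2]*M[2][0] + M[0][2]*M[1][0]*M[2][1]
     - M[0][2]*M[1][1]*M[2][0] - M[0][0]*M[1][2]*M[2][1] - M[0][1]*M[1][0]*M[2][2])
assert det != 0  # a non-vanishing 3x3 minor, so rank(C~) = 3
```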
Best Answer
Let's consider the matrix:
$$A = \begin{bmatrix}a & b & c\\d & e & f\\g & h & i\end{bmatrix}$$
The cofactors along the first row are:
$$C_{1,1} = \begin{vmatrix}e & f\\h & i\end{vmatrix}$$ $$C_{1,2} = -\begin{vmatrix}d & f\\g & i\end{vmatrix}$$ $$C_{1,3} = \begin{vmatrix}d & e\\g & h\end{vmatrix}$$
And we have that:
$$\det(A) = aC_{1,1} + bC_{1,2} + cC_{1,3}$$
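This expansion translates directly into code (a minimal sketch; `det2` and `det3` are illustrative helper names, not library functions):

```python
def det2(a, b, c, d):
    """Determinant of the 2x2 matrix [[a, b], [c, d]]."""
    return a * d - b * c

def det3(M):
    """3x3 determinant by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = M
    C11 = det2(e, f, h, i)
    C12 = -det2(d, f, g, i)
    C13 = det2(d, e, g, h)
    return a * C11 + b * C12 + c * C13
```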
But now consider the expression:
$$dC_{1,1} + eC_{1,2} + fC_{1,3}$$
That is the cofactor expansion along the first row of the matrix $$B = \begin{bmatrix}d & e & f\\d & e & f\\g & h & i\end{bmatrix}$$ since $B$ has the same cofactors along its first row as $A$ (those cofactors never use the first row's entries).
But since $B$ has two identical rows, we know that its determinant is zero, so:
$$\det(B) = dC_{1,1} + eC_{1,2} + fC_{1,3} = 0$$
I think the key idea is that, since the cofactors computed along a row (or column) do not use the values of that row (or column), then replacing that row (or column) does not change the cofactors computed along it.
That is why the linear combination $$\sum_{j=1}^{n} a_{k,j} C_{i,j}$$ is the same as the determinant of matrix $A$ with row $i$ replaced with row $k$.
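This "alien cofactor" identity can be checked on a concrete matrix (a small numeric sketch with an arbitrarily chosen $3\times 3$ matrix):

```python
# an arbitrary 3x3 matrix
a, b, c = 1, 2, 3
d, e, f = 4, 5, 6
g, h, i = 7, 8, 10

# cofactors along the first row (they never touch a, b, c)
C11 = e * i - f * h
C12 = -(d * i - f * g)
C13 = d * h - e * g

# pairing row 1 with its own cofactors gives det(A) ...
det_A = a * C11 + b * C12 + c * C13
# ... but pairing row 2 with them is the cofactor expansion of a
# matrix with two equal rows, so it must be zero
assert d * C11 + e * C12 + f * C13 == 0
```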
EDIT adding additional details.
The question was about why it is true that:
"If elements of a row (or column) are multiplied with cofactors of any other row (or column), then their sum is zero."
To show why that is true, I first chose a row for the cofactors (in the example above, row 1), then a different row to multiply them by (in the example above, row 2).
Then I showed that the product is the same as the determinant of a different matrix (matrix $B$) in which row 1 is replaced by row 2.
So $B$ was not some random matrix. It was the matrix determined by our choices of those two rows.
Once we have chosen those two rows (or columns), we can see that the sum of the entries of row (or column) $j$ multiplied by the cofactors of row (or column) $i$ is the same as the cofactor expansion of a different matrix, in which row (or column) $i$ is replaced by row (or column) $j$, so that it has two copies of that row (or column): one in position $i$ and another in position $j$.
Hence, that matrix has a determinant of zero, so the sum is zero.