What is the minimum value of $k$ such that every non-singular $n\times n$ real matrix can be made singular by replacing EXACTLY $k$ entries with ZERO?
[Math] Making non-singular matrices singular
linear-algebra, optimization
Related Solutions
If $A$ is a nonsingular matrix with rows $r_1,r_2,\ldots,r_n$, then $\{r_2,\ldots,r_n\}$ spans an $(n-1)$-dimensional subspace $P$ of $\mathbb R^n$. At least one of the standard basis vectors $e_1,e_2,\ldots,e_n$ is not in $P$, say $e_i$. Then $\{e_i,r_2,r_3,\ldots,r_n\}$ is a basis of $\mathbb R^n$, and it follows that there is a real number $c$ such that $r_1-ce_i$ is in $P$. The matrix $A'$ with rows $(r_1-ce_i),r_2,r_3,\ldots,r_n$ is singular, and it is obtained from $A$ by subtracting $c$ from the entry in the first row and $i^\text{th}$ column.
Here's a way to rephrase this somewhat more geometrically. The subspace $P$ is a hyperplane that divides $\mathbb R^n$ into two half-spaces, and $r_1$ lies in one of these halves. The line through $r_1$ in the direction of a vector $v$ has the form $\{r_1+tv:t\in\mathbb R\}$. This line is parallel to $P$ only if $v$ is in $P$; otherwise, the line crosses $P$. Since $P$ can't contain all of the coordinate directions (or else it would be all of $\mathbb R^n$), there must be a line of the form $\{r_1+te_i:t\in\mathbb R\}$ that crosses $P$, where $e_i$ is the standard basis vector with a $1$ in the $i^\text{th}$ position and $0$s elsewhere. This means there exists $t_0\in \mathbb R$ such that $r_1+t_0e_i\in P$. The vectors $r_1+t_0e_i,r_2,\ldots,r_n$ are then linearly dependent, so the matrix with those rows is singular.
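The construction above can be checked numerically. The sketch below uses a cofactor reformulation that is equivalent to the subspace argument (my framing, not the answer's): since $\det(A - cE_{1i}) = \det A - c\,C_{1i}$, where $C_{1i}$ is the $(1,i)$ cofactor, and $\det A = \sum_i a_{1i}C_{1i} \ne 0$ forces some first-row cofactor to be nonzero, subtracting $c = \det A / C_{1i}$ from that entry makes the matrix singular. The sample matrix is my own illustrative choice.

```python
from itertools import permutations
from math import prod

def det(M):
    # Leibniz formula with permutation signs; fine for the tiny matrices here
    n = len(M)
    def sign(p):
        s = 1
        for i in range(n):
            for j in range(i + 1, n):
                if p[i] > p[j]:
                    s = -s
        return s
    return sum(sign(p) * prod(M[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

# A concrete nonsingular matrix (illustrative choice, not from the answer):
A = [[2, 1, 1],
     [1, 3, 1],
     [1, 1, 4]]
d = det(A)  # 17, so A is nonsingular

# Find a first-row position i (0-indexed) whose cofactor C_{1,i+1} is nonzero;
# one must exist because det(A) expands along the first row to a nonzero sum.
for i in range(3):
    minor = [row[:i] + row[i + 1:] for row in A[1:]]
    cof = (-1) ** i * det(minor)
    if cof != 0:
        break

# Subtracting c = det(A)/C_{1,i+1} from A[0][i] makes the matrix singular.
c = d / cof
A[0][i] -= c
print(det(A))  # ~ 0 up to floating-point rounding
```

Note this changes exactly one entry by an arbitrary real amount, matching the answer's conclusion that a single-entry perturbation always suffices.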
Choose any matrix with rank $n-1$ that does not have any of the standard unit vectors in its column space.
Added in response to the comment by alex.jordan.
Let $A$ be an $n \times n$ matrix with $rank(A) = n-1$ such that there are vectors $a, \tilde a$ with $Aa = 0$ and $\tilde a^T A = 0$ whose entries are all nonzero: $a_i \ne 0$ and $\tilde a_i \ne 0$ for every $i$. Then any matrix $B$ that differs from $A$ in exactly one entry has full rank, i.e. $\det B \ne 0$.
To prove this, consider such a $B$. After permuting rows and columns and rescaling, we may assume that $B_{1,1} = A_{1,1} + 1$.
First note that the first column of $A$ is a linear combination of columns $2, \dots, n$, since $Aa = 0$ and $a_1 \ne 0$. The column space of $B$ certainly contains the column space of $A$ and thus $rank(B) \ge n-1$.
If $rank(B) < n$, then $A$ and $B$ must therefore have the same column space. Hence the standard unit vector $e_1$ is in the column space of $A$, that is, $Ac = e_1$ for some vector $c$. But then $0 = \tilde a^T A c = \tilde a^T e_1 = \tilde a_1$, contradicting the assumption on the left null vector of $A$.
Therefore $rank(B) > n-1$ and $\det B \ne 0$.
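A concrete matrix satisfying the hypotheses (my choice, not from the answer) is the Laplacian of the complete graph $K_3$: it has rank $n-1$, and the all-ones vector is both a left and a right null vector with every entry nonzero. The sketch below verifies exhaustively that every one-entry change makes it nonsingular; the value $3$ of each perturbed determinant is explained by the matrix-tree theorem, since every cofactor of a graph Laplacian equals the number of spanning trees.

```python
from itertools import permutations
from math import prod

def det(M):
    # Leibniz formula with permutation signs; fine for 3x3
    n = len(M)
    def sign(p):
        s = 1
        for i in range(n):
            for j in range(i + 1, n):
                if p[i] > p[j]:
                    s = -s
        return s
    return sum(sign(p) * prod(M[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

# Laplacian of K_3: rank n-1 = 2; the all-ones vector is a left and right
# null vector with all entries nonzero, as the lemma requires.
n = 3
A = [[n - 1 if i == j else -1 for j in range(n)] for i in range(n)]
assert det(A) == 0  # A is singular

# det is linear in each entry, so det(A + E_{ij}) = det(A) + C_{ij} = C_{ij};
# by the matrix-tree theorem every cofactor of this Laplacian is 3 (the
# number of spanning trees of K_3), hence every one-entry change is nonzero.
for i in range(n):
    for j in range(n):
        B = [row[:] for row in A]
        B[i][j] += 1
        assert det(B) == 3
print("every one-entry change of A yields a nonsingular matrix")
```

This also illustrates why the answer to the original question must be formulated carefully: such an $A$ cannot be pushed from singular to "more singular", and conversely one-entry changes of it never stay singular.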
Best Answer
You can always make an $n \times n$ matrix singular by setting $n$ entries (all in one row or one column) to zero. For a matrix $(a_{ij})$ whose entries are algebraically independent, this is impossible with fewer than $n$. You can prove (e.g. using Hall's marriage theorem) that if you set fewer than $n$ entries to zero, there is still a permutation $\pi$ of $[1,\ldots,n]$ such that all $a_{i,\pi(i)}$ are unaffected, so the corresponding term in the determinant is still present and can't be cancelled by the other terms.
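The combinatorial core of this argument can be checked directly: with fewer than $n$ forbidden positions some permutation always survives, while zeroing a full row blocks every permutation. A small brute-force sketch (the particular zeroed positions are an arbitrary example of mine):

```python
from itertools import permutations

n = 4
# Zero out fewer than n entries -- here 3 positions (arbitrary example):
zeroed = {(0, 1), (1, 1), (2, 3)}

# Hall's marriage theorem guarantees a permutation pi avoiding every zeroed
# position, so the term prod_i a_{i,pi(i)} of the determinant survives.
surviving = [p for p in permutations(range(n))
             if all((i, p[i]) not in zeroed for i in range(n))]
print(len(surviving) > 0)  # True

# By contrast, zeroing all n entries of one row blocks every permutation,
# which is why n entries in a single row always suffice.
full_row = {(0, j) for j in range(n)}
blocked = [p for p in permutations(range(n))
           if all((i, p[i]) not in full_row for i in range(n))]
print(len(blocked))  # 0
```

For algebraically independent entries, a surviving permutation gives a monomial in the Leibniz expansion that no other term can cancel, so the determinant stays nonzero.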
EDIT: This is, of course, over a field that contains $n^2$ algebraically independent elements. You can do it over any infinite field, with a bit more work, by constructing inductively a sequence $c_j$ of nonzero elements such that for each $n$, all terms of the following form are distinct and nonzero: $\sum_{i=1}^k \prod_{j \in S_i} c_j$, where $S_1, \ldots, S_k$ are distinct subsets of $\{1, \ldots, n\}$ and $1 \le k \le 2^n$. On the other hand, this fails over finite fields: over $GF(2)$ you can always make a nonsingular matrix singular by changing one element to $0$.
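The $GF(2)$ claim can be verified exhaustively for small $n$. The sketch below (my check, assuming only the statement above) runs over all $512$ binary $3\times 3$ matrices; for each invertible one, expanding $\det$ along any row gives $\sum_j a_{ij}C_{ij} = 1$ in $GF(2)$, so some entry has $a_{ij} = C_{ij} = 1$ and setting it to $0$ kills the determinant.

```python
from itertools import product, permutations
from math import prod

def det_mod2(M):
    # Over GF(2) signs don't matter: determinant = permanent mod 2
    n = len(M)
    return sum(prod(M[i][p[i]] for i in range(n))
               for p in permutations(range(n))) % 2

n = 3
invertible = 0
for bits in product((0, 1), repeat=n * n):
    A = [list(bits[i * n:(i + 1) * n]) for i in range(n)]
    if det_mod2(A) == 0:
        continue
    invertible += 1
    # Some entry equal to 1 can be changed to 0 to make A singular:
    assert any(det_mod2([[0 if (r, c) == (i, j) else A[r][c]
                          for c in range(n)] for r in range(n)]) == 0
               for i in range(n) for j in range(n) if A[i][j] == 1)
print(invertible)  # 168 = |GL(3, 2)|
```

The count $168 = (2^3-1)(2^3-2)(2^3-4)$ is the order of $GL(3,2)$, a convenient sanity check that the enumeration hit exactly the invertible matrices.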