Linear combination of the rows of a matrix with all resulting components nonzero

Tags: combinations, linear-algebra, matrices

I have a matrix with $n$ rows in which every column has at least one nonzero element. I want to produce a linear combination of the rows (preferably one with the smallest possible coefficients) such that the resulting vector has no zero components. Is there a way to obtain this analytically?

I have come up with an algorithm that yields an answer, but I am not sure it is the most efficient or optimal solution: it selects a row and adds it to the first row of the matrix, repeating the addition until no components cancel out. It then moves on to the next row and repeats these steps until the end of the matrix is reached.
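The scheme described above can be sketched as follows (a hypothetical NumPy implementation, assuming integer coefficients are bumped until no cancellation occurs; the function name `combine_rows` is mine). Note that a column that is zero in both the accumulated vector and the incoming row cannot be made nonzero by any coefficient, so only the other positions are checked:

```python
import numpy as np

def combine_rows(A):
    """Accumulate rows of A into one linear combination, increasing the
    integer coefficient of each new row until no component cancels out.
    Sketch of the iterative scheme from the question, not a verified
    optimal method."""
    A = np.asarray(A, dtype=float)
    v = A[0].copy()
    for r in A[1:]:
        # Positions that can possibly be nonzero after adding a multiple
        # of r; columns zero in both v and r stay zero for every c.
        active = (v != 0) | (r != 0)
        c = 1
        while np.any((v + c * r)[active] == 0):
            c += 1  # each active position rules out at most one c, so this terminates
        v = v + c * r
    return v
```

For example, on `[[1, -1], [1, 1]]` the coefficient for the second row is bumped from 1 to 2 because adding it once would cancel the second component.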

Is there a better way?

Best Answer

Suppose $A$ is your $m \times n$ matrix. From what I've gathered from our comments, what you want to be doing is projecting the all-ones vector $j \in \Bbb{R}^n$ onto the row space of $A$. In this way, you'll obtain a linear combination of the rows that is as close to $j$ as possible, in terms of the Euclidean $2$-norm.

I suggest considering $(A^\top)^+ = (A^+)^\top$, the Moore-Penrose pseudoinverse of $A^\top$. Essentially, we want a least squares solution of the matrix equation $$A^\top x = j$$ where $x$ is an $m \times 1$ column vector holding the coefficients of the linear combination of rows (and $j$ is treated as a column vector). Then, for any $m \times 1$ column vector $w$, choosing $$x = (A^\top)^+j + (I - (A^\top)^+A^\top)w$$ produces an optimal linear combination. In particular, we can choose $w = 0$ and simply use $$x = (A^\top)^+j.$$ If the rows of $A$ are linearly independent, then the pseudoinverse is straightforward to compute: $$(A^\top)^+ = (AA^\top)^{-1}A.$$ Thus, you may want to consider removing rows that are linearly dependent on the others.
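A minimal sketch of this computation in NumPy, using the built-in pseudoinverse (the function name `least_squares_coefficients` is mine; `x @ A` then gives the projection of $j$ onto the row space):

```python
import numpy as np

def least_squares_coefficients(A):
    """Return x = (A^T)^+ j, the minimum-norm coefficient vector that
    minimizes ||A^T x - j||_2, where j is the all-ones vector in R^n."""
    m, n = A.shape
    j = np.ones(n)
    return np.linalg.pinv(A.T) @ j

# Example: project j onto the row space of a 2x3 matrix.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
x = least_squares_coefficients(A)
combination = x @ A  # the resulting linear combination of the rows
```

When the rows of `A` are linearly independent, the same `x` can be obtained from the normal equations as `np.linalg.solve(A @ A.T, A @ j)`, matching the closed form $(AA^\top)^{-1}A\,j$ above.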