Submatrix of signed incidence matrix of a graph containing a cycle

algebraic-graph-theory, graph-theory, induction, linear-algebra, matrices

Let $G=(V,E)$ be a (simple) graph. Write $V = \{v_1,\cdots, v_n\}$ and $E = \{e_1,\cdots, e_m\}.$ Let $I(G)$ be the incidence matrix of $G$, i.e. the $|V|\times |E|$ matrix whose $(i,j)$ entry is $1$ if vertex $v_i$ is incident with edge $e_j$ and $0$ otherwise. Let $S(G)$ be any signed incidence matrix of $G$, obtained by arbitrarily switching one of the two $1$'s in each column to a $-1$ (say each edge starts at the vertex whose row holds the $-1$ in that edge's column).
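To make the definitions concrete, here is a small sketch (my own helper code, not part of the question) that builds $I(G)$ and one possible $S(G)$, orienting each edge from its lower-numbered endpoint ($-1$) to its higher-numbered one ($+1$):

```python
def incidence_matrix(n, edges):
    """n x m 0/1 incidence matrix: entry (i, j) is 1 iff v_i lies on e_j."""
    M = [[0] * len(edges) for _ in range(n)]
    for j, (u, v) in enumerate(edges):
        M[u][j] = 1
        M[v][j] = 1
    return M

def signed_incidence_matrix(n, edges):
    """Signed version: each column has one -1 (edge start) and one +1 (edge end).
    The choice of orientation here (low vertex -> high vertex) is arbitrary."""
    M = [[0] * len(edges) for _ in range(n)]
    for j, (u, v) in enumerate(edges):
        M[min(u, v)][j] = -1
        M[max(u, v)][j] = 1
    return M

# Triangle on vertices 0, 1, 2:
edges = [(0, 1), (1, 2), (0, 2)]
print(signed_incidence_matrix(3, edges))
# -> [[-1, 0, -1], [1, -1, 0], [0, 1, 1]]
```

Note that every column of $S(G)$ sums to $0$, regardless of which orientation was chosen.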

Prove that if $F = \{f_1,\cdots, f_k\}$ is a subset of $E(G) = E,$ then $F$ corresponds to a linearly dependent set of columns of $S(G)$, denoted $S_F$, iff the subgraph $G_F = (V_F, F)$ of $G$ contains a cycle, where $V_F = \{v \in V : v \in e \text{ for some } e\in F\}$.
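As a sanity check of the claim (again my own sketch, not part of the question), one can compute the rank of the chosen columns exactly over the rationals: a triangle gives $3$ dependent columns of rank $2$, while a path, being a tree, gives independent columns.

```python
from fractions import Fraction

def column_rank(cols):
    """Rank of the matrix whose columns are the given vectors,
    via exact Gaussian elimination over the rationals."""
    rows = [list(map(Fraction, r)) for r in zip(*cols)]  # matrix rows
    rank = 0
    for c in range(len(cols)):
        piv = next((r for r in range(rank, len(rows)) if rows[r][c] != 0), None)
        if piv is None:
            continue
        rows[rank], rows[piv] = rows[piv], rows[rank]
        for r in range(len(rows)):
            if r != rank and rows[r][c] != 0:
                f = rows[r][c] / rows[rank][c]
                rows[r] = [a - f * b for a, b in zip(rows[r], rows[rank])]
        rank += 1
    return rank

def signed_column(n, u, v):
    """Column of S(G) for edge {u, v}: -1 at the start, +1 at the end."""
    col = [0] * n
    col[min(u, v)], col[max(u, v)] = -1, 1
    return col

# Triangle (contains a cycle): 3 columns but rank 2, hence dependent.
tri = [signed_column(3, *e) for e in [(0, 1), (1, 2), (0, 2)]]
# Path (a tree, no cycle): 2 columns of rank 2, hence independent.
path = [signed_column(3, *e) for e in [(0, 1), (1, 2)]]
print(column_rank(tri), column_rank(path))  # -> 2 2
```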

I think I know how to prove the reverse implication, but I'm not sure how to prove the forward one. Below is what I've come up with so far.

Now suppose $F$ corresponds to a linearly dependent set of columns of $S(G).$ We construct a cycle in the graph $G_F$ by induction.

For the base case, note that if $F$ is linearly dependent, we necessarily have $|F| \geq 3$. Dependence clearly cannot hold if $|F| = 1$, as $\{f_1\}$ corresponds to a nonzero column and is therefore linearly independent. If $|F| = 2,$ the columns corresponding to $f_1$ and $f_2$ are linearly independent, because otherwise the two edges $f_1$ and $f_2$ would have the same endpoints, contradicting the fact that they are distinct edges (indeed, comparing the components of $c_1\, col(f_1) + c_2\, col(f_2) = 0$ with the zero vector shows that $c_1 = c_2 = 0$, where $col(f_i)$ is the column corresponding to $f_i$). So $|F|\geq 3,$ and we can find constants $y_1,\cdots, y_k,$ not all zero, so that $y_1\, col(f_1)+\cdots+ y_k\, col(f_k) = 0.$ Each row of $S(G_F)$ must have either two or zero nonzero entries: otherwise some row would have its only nonzero entry in the column of some $f_i$, forcing $y_i = 0$, and the two remaining columns are linearly independent by the above reasoning. So the base case holds.

Now assume for some $k\geq 3$ that whenever $F$ corresponds to a linearly dependent set of columns with $|F| = k$, the subgraph $G_F$ contains a cycle. Let $F' = \{f'_1,\cdots, f'_{k+1}\}$ be a set of cardinality $k+1$ such that $F'$ corresponds to a linearly dependent set of columns of $S(G).$ By definition we can then find a column, say $c_{f'_{k+1}}$ after relabeling, that is a nontrivial linear combination of some of the other columns, say $c_{f'_1},\cdots, c_{f'_l}.$

The problem I'm having is that removing a column may very well result in a linearly independent set of columns, which means I can't use the inductive hypothesis.

Best Answer

I think it is easier to show this directly, rather than via induction. If $S_F$ is linearly dependent, then we can choose vectors $c_1, \dots, c_k \in S_F$, corresponding to edges $f_1, \dots, f_k \in F$, such that there is a nontrivial linear combination $\sum_i \alpha_i c_i = 0$. We may assume that $\alpha_i \neq 0$ for all $i$; otherwise, we simply leave out the corresponding vectors.

Now in the subgraph of $G$ induced by $f_1, \dots, f_k$, every vertex must have degree at least two: if some vertex lay on exactly one of these edges, say $f_i$, then the corresponding coordinate of the sum would be $\pm\alpha_i \neq 0$. (You have made essentially the same observation in the base case of your induction proof idea.) But a graph with minimum degree two always contains a cycle; or, said another way, a graph without a cycle – a forest – always contains a vertex of degree one – a leaf. (Unless it has no edges at all, in which case the minimum degree is even lower.)
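The contrapositive ("a forest always has a leaf") also suggests a simple cycle-finding sketch, which I include only as an illustration of the argument: repeatedly delete vertices of degree at most one; a cycle exists iff some vertices survive, and the survivors all have degree at least two.

```python
from collections import defaultdict

def has_cycle(edges):
    """Leaf-peeling: strip degree <= 1 vertices until none remain.
    Whatever survives has minimum degree two, i.e. contains a cycle."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    leaves = [v for v in adj if len(adj[v]) <= 1]
    while leaves:
        v = leaves.pop()
        for u in adj.pop(v, ()):
            adj[u].discard(v)
            if len(adj[u]) == 1:
                leaves.append(u)
    return any(adj.values())  # any surviving vertex still has neighbors

print(has_cycle([(0, 1), (1, 2), (0, 2)]))  # -> True  (triangle)
print(has_cycle([(0, 1), (1, 2)]))          # -> False (path, a tree)
```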