Linear Algebra – Equivalence of Linear Congruences Over a Finite Ring

abstract-algebra, finite-rings, linear-algebra, modular-arithmetic, number-theory

Let $n,N$ be two natural numbers. Let $a=(a_1, \ldots, a_n), b=(b_1, \ldots, b_n)\in (\mathbb{Z}/N\mathbb{Z})^n$ with $\gcd(a_1, \ldots, a_n, N) = 1$ and $\gcd(b_1, \ldots, b_n, N) = 1$. Define $H_a = \{(x_1,\ldots,x_n)\in(\mathbb{Z}/N\mathbb{Z})^n : a_1x_1+\cdots+a_nx_n \equiv 0 \pmod N\}$ and define $H_b$ similarly.

What is a necessary and sufficient condition for $H_a = H_b$ to hold? I suspect the condition is $(a_1, \ldots, a_n) = k\cdot(b_1, \ldots, b_n)$ for some $k \in (\mathbb{Z}/N\mathbb{Z})^*$: it is clearly sufficient, and for $n=1$ I know it is also necessary. Can someone help me treat the $n \geq 2$ case?
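For small parameters the conjecture is easy to sanity-check by brute force. Here is a minimal Python sketch (my own, not part of the question; `solution_set` and `unit_multiple` are made-up names) that enumerates every admissible pair $a, b$ for fixed $n, N$ and compares the two conditions directly:

```python
from itertools import product
from functools import reduce
from math import gcd

def solution_set(a, N):
    """H_a: all x in (Z/NZ)^n with a_1 x_1 + ... + a_n x_n = 0 (mod N)."""
    return frozenset(
        x for x in product(range(N), repeat=len(a))
        if sum(ai * xi for ai, xi in zip(a, x)) % N == 0
    )

def unit_multiple(a, b, N):
    """True iff a = k*b componentwise mod N for some unit k of Z/NZ."""
    return any(
        all((k * bi - ai) % N == 0 for ai, bi in zip(a, b))
        for k in range(N) if gcd(k, N) == 1
    )

N, n = 4, 2
admissible = [v for v in product(range(N), repeat=n)
              if reduce(gcd, v + (N,)) == 1]      # gcd(a_1,...,a_n,N) = 1
for a in admissible:
    for b in admissible:
        assert (solution_set(a, N) == solution_set(b, N)) == unit_multiple(a, b, N)
print(f"conjecture confirmed for all admissible pairs with N={N}, n={n}")
```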

Best Answer

Let $R = \Bbb Z/N\Bbb Z$. I'll closely follow this 2002 paper by V. P. Elizarov.

Lemma 4.2. If $a, b \in R$ and $\gcd(a,b)=d$ in $\Bbb Z$, then there exists a matrix $Q \in GL_2(R)$ s.t. $(a,b)Q = (d,0)$.

Proof. If $b = 0$, take $Q$ to be the identity; if $a = 0$, take the antidiagonal matrix with $1$'s. Suppose $a, b \neq 0$. Over $\Bbb Z$, write $d = a u_1 + b v_1$ by Bézout's identity. If $a = df$ and $b = dg$, then $1 = f u_1 + g v_1$ in $\Bbb Z$.

Furthermore, denote $u = u_1 \bmod N$ and $v = v_1 \bmod N$ ($u,v \in R$); then $d = au+bv$ and $1 = fu+gv$ hold in $R$. Take $Q = (\begin{smallmatrix}u&g\\v&-f\end{smallmatrix})$ (it seems that the second column in his paper was upside down). Then $(a,b)Q = (au + bv, ag-bf) = (d,0)$, since $ag - bf = dfg - dgf = 0$; the determinant is $-uf-vg=-1$, which means $Q$ is invertible.
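As a concrete illustration, the construction of $Q$ is easy to carry out with the extended Euclidean algorithm. A minimal Python sketch, assuming inputs are given by nonnegative integer representatives (`ext_gcd` and `lemma_Q` are names I made up):

```python
from math import gcd

def ext_gcd(a, b):
    """Extended Euclid over Z: returns (d, u, v) with a*u + b*v = d = gcd(a, b)."""
    if b == 0:
        return (a, 1, 0)
    d, u, v = ext_gcd(b, a % b)
    return (d, v, u - (a // b) * v)

def lemma_Q(a, b, N):
    """Lemma 4.2: a 2x2 matrix Q over Z/NZ with (a, b) Q = (d, 0) and det Q = -1."""
    if a == 0 and b == 0:
        return [[1, 0], [0, 1]]        # degenerate pair, nothing to eliminate
    d, u1, v1 = ext_gcd(a, b)
    f, g = a // d, b // d              # a = d f, b = d g, hence f u1 + g v1 = 1
    return [[u1 % N, g % N], [v1 % N, (-f) % N]]

# Check: (6, 10) Q = (2, 0) over Z/15Z, and det Q = -1
a, b, N = 6, 10, 15
Q = lemma_Q(a, b, N)
print([(a * Q[0][j] + b * Q[1][j]) % N for j in range(2)])      # [2, 0]
print((Q[0][0] * Q[1][1] - Q[0][1] * Q[1][0]) % N, (-1) % N)    # 14 14
```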

Statement 4.3. If $\gcd(a_1, \ldots, a_n) = d$, then there exists $V \in GL_n(R)$ s.t. $AV = (d, 0, \ldots, 0)$, where $A = (a_1,\ldots,a_n)$.

Proof. Apply Lemma 4.2 to the pair $(a_{n-1}, a_n)$, turning it into $(\gcd(a_{n-1}, a_n), 0)$; then apply it to $(a_{n-2}, \gcd(a_{n-1}, a_n))$, and so on, working backwards until the first entry holds $d = \gcd(a_1, \ldots, a_n)$ and all the others are $0$. Each $2 \times 2$ matrix from the lemma is embedded into the $n \times n$ identity at the corresponding pair of coordinates, and $V$ is the product of these embedded matrices.
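Continuing the sketch above, Statement 4.3 becomes a loop that right-multiplies embedded copies of the lemma's $2\times 2$ matrices; this reuses `lemma_Q` from the previous block, and `statement_V` is again my own name:

```python
from math import gcd

def mat_mul(A, B, N):
    """Product of square matrices over Z/NZ."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) % N
             for j in range(n)] for i in range(n)]

def statement_V(a_vec, N):
    """Statement 4.3: invertible V over Z/NZ with (a_1,...,a_n) V = (d, 0, ..., 0)."""
    n = len(a_vec)
    row = list(a_vec)                 # integer representatives of a_1, ..., a_n
    V = [[int(i == j) for j in range(n)] for i in range(n)]
    for i in range(n - 2, -1, -1):    # backwards: (a_{n-1}, a_n), (a_{n-2}, d'), ...
        Q = lemma_Q(row[i], row[i + 1], N)
        E = [[int(r == c) for c in range(n)] for r in range(n)]
        E[i][i],     E[i][i + 1]     = Q[0][0], Q[0][1]   # embed Q into identity
        E[i + 1][i], E[i + 1][i + 1] = Q[1][0], Q[1][1]
        V = mat_mul(V, E, N)
        row[i], row[i + 1] = gcd(row[i], row[i + 1]), 0
    return V

# Check: a = (6, 10, 15) has gcd 1, so A V = (1, 0, 0) over Z/12Z
a, N = [6, 10, 15], 12
V = statement_V(a, N)
print([sum(a[i] * V[i][j] for i in range(3)) % N for j in range(3)])  # [1, 0, 0]
```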

Special case of Statement 4.4. The set of solutions of $A (x_1, \ldots, x_n)^T = 0$ over $R$, where $\gcd(a_1, \ldots, a_n, N) = 1$, is $V (0, *, \ldots, *)^T$ (the asterisks are arbitrary elements of $R$).

The proof is said to be "a direct check"; indeed, writing $x = Vy$ gives $Ax = (AV)y = d\,y_1$, and since $\gcd(d, N) = 1$ makes $d$ a unit in $R$, we have $Ax = 0$ iff $y_1 = 0$.
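The claim can also be checked numerically, reusing `solution_set` and `statement_V` from the sketches above: the image of $S = (0, *, \ldots, *)^T$ under $V$ should coincide with the brute-force solution set.

```python
from itertools import product

# The image of S = (0, *, ..., *)^T under V should be exactly H_a.
a, N = [1, 2, 3], 7                   # gcd(1, 2, 3, 7) = 1
n = len(a)
V = statement_V(a, N)
image = frozenset(
    tuple(sum(V[i][j] * y[j] for j in range(n)) % N for i in range(n))
    for y in product([0], *[range(N)] * (n - 1))
)
print(image == solution_set(a, N))    # True
```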


Denote $S = (0, *, \ldots, *)^T$. Then, for $A$ and $B$, we can find the corresponding matrices $V_A$ and $V_B$ and ask whether $V_A S = V_B S$, or equivalently, whether $V_B^{-1}V_A S = S$. Both sides have the same cardinality $N^{n-1}$, so it suffices to check that $V_B^{-1} V_A e_k \in S$ for $k = 2, \ldots, n$, where $\{e_k\}$ is a "standard basis" of $S$, i.e. the vectors with a single $1$ and zeros elsewhere.
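In code one can avoid inverting $V_B$: since $V_B S = H_b$ by Statement 4.4, the condition $V_B^{-1} V_A e_k \in S$ says exactly that the $k$-th column of $V_A$ lies in $H_b$, i.e. $b \cdot (V_A e_k) \equiv 0 \pmod N$. A sketch of the resulting test, building on the helpers above (the reformulation and the name `same_hyperplane` are mine):

```python
def same_hyperplane(a, b, N):
    """Decide H_a == H_b, assuming gcd(a_1,...,a_n,N) = gcd(b_1,...,b_n,N) = 1.
    Columns 2..n of V_A span H_a; H_a = H_b iff each lies in H_b (equal sizes)."""
    n = len(a)
    V = statement_V(a, N)
    return all(sum(b[i] * V[i][k] for i in range(n)) % N == 0
               for k in range(1, n))

print(same_hyperplane([1, 2, 3], [5, 10, 15], 7))   # True:  (5,10,15) = 5*(1,2,3)
print(same_hyperplane([1, 0], [0, 1], 5))           # False: different hyperplanes
```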
