Is a single element linearly independent in an abelian group

abelian-groups, abstract-algebra, group-theory, proof-explanation

This is my definition of linearly independent elements in an abelian group:

Let $A$ be an abelian group, let $X \subseteq A$ be a subset and $x_1,\dots,x_k \in X$ with $x_i \neq x_j$ for all $1 \leq i \neq j \leq k$. The elements $x_1,\dots,x_k$ are defined to be linearly independent if they satisfy the following condition:

\begin{equation} n_1x_1 + \dots + n_kx_k = 0 \, , \, \text{with} \hspace{2mm} n_1,\dots,n_k \in \mathbb{Z} \implies n_i = 0 \hspace{2mm} \text{for all} \hspace{2mm} 1 \leq i \leq k\end{equation}

I'm proving Theorem 1.6 of Hungerford's Algebra and got stuck at "Either $G \cap H = \{0\}$, in which case $G = \langle d_1x_1 \rangle$ and the theorem is true […]". Why is $d_1x_1$ linearly independent? I need this in order to say that $\{d_1x_1\}$ is a basis of $G$. I'm pretty sure this comes from the fact that $d_1x_1 \neq 0$, but linear independence does not seem to follow, because $nd_1x_1 = 0$ might hold even though $n \neq 0$; for example, if $d_1x_1$ is an element of finite order $n$. So, is the implication
\begin{equation}
n(d_1x_1) = 0 \quad \wedge \quad d_1x_1 \neq 0 \quad \implies \quad n = 0
\end{equation}

true or false in the context of abelian groups? What if we instead consider free abelian groups? If it's always false, then how is linear independence proven in Hungerford's proof?
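Both possibilities can be checked with concrete numbers (a minimal Python sketch; the particular elements and moduli are illustrative):

```python
# Torsion case: in the cyclic group Z/6Z the element x = 2 is nonzero,
# yet 3*x = 6 = 0 (mod 6), so n*x = 0 does NOT force n = 0 here.
modulus, x = 6, 2
assert x % modulus != 0                 # x is a nonzero element of Z/6Z
assert (3 * x) % modulus == 0           # n = 3 is a nonzero annihilator

# Torsion-free case: in the free abelian group Z, a nonzero x has
# infinite order, so no nonzero n satisfies n*x = 0.
x = 5
assert all(n * x != 0 for n in range(1, 1000))  # spot check, not a proof
```

So the implication fails in a general abelian group but holds whenever the group is torsion-free, and free abelian groups are torsion-free.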

Best Answer

Let me address your question as formulated in your comment, namely why a single nonzero element of a free abelian group automatically forms a linearly independent subset.

What I think you might be missing is that the summation operator $\sum_{i=1}^k s_i$ is defined just as well for $k=1$ as it is for $k \ge 2$. In fact, in a rigorous treatment, one would define the $k$-ary summation operator of any associative binary operation by induction on $k \ge 1$. The basis step of the induction is to define $\sum_{i=1}^1 s_i = s_1$. In the inductive step, assuming that $k \ge 2$ and that $\sum_{i=1}^{k-1} s_i$ is already defined, one defines $\sum_{i=1}^{k} s_i = \left(\sum_{i=1}^{k-1} s_i\right) + s_{k}$.
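The inductive definition above can be transcribed directly (a minimal Python sketch; the function name `k_ary_sum` is illustrative):

```python
def k_ary_sum(terms):
    """Sum of a nonempty list, defined by induction on k = len(terms)."""
    if len(terms) == 1:
        return terms[0]                       # basis step: sum_{i=1}^{1} s_i = s_1
    return k_ary_sum(terms[:-1]) + terms[-1]  # inductive step: (sum of first k-1) + s_k

print(k_ary_sum([7]))        # the k = 1 case is perfectly well defined
print(k_ary_sum([1, 2, 3]))  # 6
```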

All that goes to say that the definition of linear independence is about linear combinations, which are just special kinds of finite sums, and a sum of one term is allowed as part of this definition:

A subset of $k$ distinct elements $\{x_1,...,x_k\} \subseteq A$ (where $k \ge 1$) is said to be linearly independent if for any sequence of integers $(a_1,...,a_k)$, if $\sum_{i=1}^k a_i x_i=0$ then $a_i=0$ for each $1 \le i \le k$.
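In particular, taking $k = 1$ the condition reads: if $a_1 x_1 = 0$ then $a_1 = 0$. For a nonzero $x_1$ in a free abelian group $A \cong \mathbb{Z}^n$ this always holds, because free abelian groups are torsion-free. Writing $x_1 = (m_1, \dots, m_n) \neq 0$ in coordinates,
\begin{equation}
a_1 x_1 = (a_1 m_1, \dots, a_1 m_n) = 0 \implies a_1 m_j = 0 \hspace{2mm} \text{for all} \hspace{2mm} 1 \leq j \leq n,
\end{equation}
and since some $m_j \neq 0$ and $\mathbb{Z}$ has no zero divisors, $a_1 = 0$. This is why $\{d_1x_1\}$, being a single nonzero element of the free abelian group $G$, is a basis; in a general abelian group the argument breaks down exactly when $x_1$ has finite order.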
