I'd like to learn more about the automorphism group of direct products of cyclic groups where at least one factor is $\mathbb Z$. There seem to be many articles studying analogous questions for finite groups, but I am unable to find anything for products of infinite groups, not even for cyclic ones. Does anyone know any references that study this or something more general?
Automorphism groups of direct products of non-finite cyclic groups
automorphism-group group-theory reference-request
Related Solutions
Ideally an automorphism $\phi$ of $H\times H$ would look like $(\begin{smallmatrix}\alpha & \beta \\ \gamma & \delta \end{smallmatrix})$, with
$$\begin{pmatrix}\alpha & \beta \\ \gamma & \delta \end{pmatrix}\begin{pmatrix} h_1 \\ h_2 \end{pmatrix} = \begin{pmatrix} \alpha(h_1)\beta(h_2) \\ \gamma(h_1)\delta(h_2) \end{pmatrix} \tag{$\circ$}$$
for some functions $\alpha,\beta,\gamma,\delta:H\to H$ (we're interpreting elements of $H^2$ as column vectors). This carries on the spirit of ${\rm Aut}(\Bbb Z/p\Bbb Z\times\Bbb Z/p\Bbb Z)\cong{\rm GL}_2(\Bbb Z/p\Bbb Z)$. If we restrict the domain or codomain to the subgroups $H\times 1$ or $1\times H$ we see that $\alpha,\beta,\gamma,\delta$ all must be endomorphisms.
Indeed by restricting the domain and projecting the codomain we see that $\alpha,\beta,\gamma,\delta$ can be determined directly from $\phi$. For the matrix to be an automorphism we must have
$$\begin{pmatrix}\alpha & \beta \\ \gamma & \delta \end{pmatrix} \begin{pmatrix} h_1h_2 \\ h_3h_4 \end{pmatrix} = \begin{pmatrix}\alpha & \beta \\ \gamma & \delta \end{pmatrix}\begin{pmatrix} h_1 \\ h_3 \end{pmatrix} \cdot \begin{pmatrix}\alpha & \beta \\ \gamma & \delta \end{pmatrix}\begin{pmatrix} h_2 \\ h_4 \end{pmatrix} $$
which becomes
$$\begin{pmatrix} \alpha(h_1h_2)\beta(h_3h_4) \\ \gamma(h_1h_2)\delta(h_3h_4) \end{pmatrix} = \begin{pmatrix} \alpha(h_1)\beta(h_3) \\ \gamma(h_1)\delta(h_3) \end{pmatrix} \cdot \begin{pmatrix} \alpha(h_2)\beta(h_4) \\ \gamma(h_2)\delta(h_4) \end{pmatrix}$$
which becomes
$$\begin{pmatrix} \alpha(h_1)\alpha(h_2)\beta(h_3)\beta(h_4) \\ \gamma(h_1)\gamma(h_2)\delta(h_3)\delta(h_4) \end{pmatrix} = \begin{pmatrix} \alpha(h_1)\beta(h_3)\alpha(h_2)\beta(h_4) \\ \gamma(h_1)\delta(h_3)\gamma(h_2)\delta(h_4) \end{pmatrix}.$$
Cancelling yields $\alpha(h_2)\beta(h_3)=\beta(h_3)\alpha(h_2)$ and $\gamma(h_2)\delta(h_3)=\delta(h_3)\gamma(h_2)$ for all $h_2,h_3\in H$, which is equivalent to $[\alpha(H),\beta(H)]=[\gamma(H),\delta(H)]=1$. Conversely, these conditions guarantee that the matrix defines a homomorphism. And they do hold for the entries determined from $\phi$: $\alpha(H)$ and $\beta(H)$ are the images of $\phi(H\times1)$ and $\phi(1\times H)$ under the first projection, and since $[H\times 1,1\times H]=1\times 1$ and $\phi$ is an automorphism, the commutator condition carries over (the same logic applies to $\gamma,\delta$). We conclude that $\phi$ is indeed given by this matrix.
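This equivalence can be checked by brute force on a small nonabelian example. Here is a sketch of my own (using $H=S_3$, which is not a group from the text) that enumerates all endomorphisms of $S_3$ and confirms that a matrix row $(h_1,h_2)\mapsto f(h_1)g(h_2)$ is a homomorphism $H\times H\to H$ exactly when the images of $f$ and $g$ commute:

```python
from itertools import product

# Brute-force check, with H = S3 as a small test case: the map
#   (h1, h2) -> f(h1) g(h2)
# is a homomorphism of H x H into H exactly when [f(H), g(H)] = 1.

# Elements of S3 as permutation tuples; multiplication is composition.
H = [(0, 1, 2), (1, 2, 0), (2, 0, 1), (0, 2, 1), (2, 1, 0), (1, 0, 2)]

def mul(p, q):
    """(p*q)(i) = p(q(i))."""
    return tuple(p[q[i]] for i in range(3))

def is_endo(f):
    return all(f[mul(p, q)] == mul(f[p], f[q]) for p in H for q in H)

# Enumerate all endomorphisms of S3 by trying every map H -> H.
endos = []
for images in product(H, repeat=len(H)):
    f = dict(zip(H, images))
    if is_endo(f):
        endos.append(f)

def images_commute(f, g):
    return all(mul(f[p], g[q]) == mul(g[q], f[p]) for p in H for q in H)

def row_is_homo(f, g):
    """Is (h1, h2) -> f(h1) g(h2) a homomorphism from H x H to H?"""
    return all(
        mul(f[mul(h1, h2)], g[mul(h3, h4)])
        == mul(mul(f[h1], g[h3]), mul(f[h2], g[h4]))
        for h1 in H for h2 in H for h3 in H for h4 in H
    )

# Verify the equivalence for every pair of endomorphisms.
assert all(row_is_homo(f, g) == images_commute(f, g)
           for f in endos for g in endos)
print(len(endos), "endomorphisms of S3; equivalence verified")
```

The same brute-force pattern works for any small group in place of $S_3$.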
Now suppose $H$ is simple nonabelian.
Since $\alpha(H)\beta(H)=H$ and $[\alpha(H),\beta(H)]=1$, both $\alpha(H)$ and $\beta(H)$ are normal in $H$, and so each is either trivial or all of $H$. They cannot both be trivial since $(h_1,h_2)\mapsto\alpha(h_1)\beta(h_2)$ is surjective, nor can both be $H$ since $[\alpha(H),\beta(H)]=1$ while $H$ is nonabelian. Therefore one of $\alpha,\beta$ is an automorphism and the other is the trivial endomorphism (which I'll denote $0$); the same logic applies to $\gamma,\delta$. The matrix can't be of the form $(\begin{smallmatrix}\alpha & 0 \\ \gamma & 0\end{smallmatrix})$ or $(\begin{smallmatrix} 0 & \beta \\ 0 & \delta \end{smallmatrix})$, however, since such a map kills the subgroup $1\times H$ or $H\times 1$, respectively, and hence is not $1$-$1$.
In conclusion, every automorphism of $H^2$ looks like $(\begin{smallmatrix}\alpha & 0 \\ 0 & \beta \end{smallmatrix})$ or $(\begin{smallmatrix} 0 & \alpha \\ \beta & 0\end{smallmatrix})$ for automorphisms $\alpha,\beta$.
Before moving on I want to discuss free products, semidirect products, and wreath products. The best way to intuitively understand the free product $A*B$ of two groups $A$ and $B$ is as the set of all words formed from letters drawn from the underlying sets of $A$ and $B$, with the understanding that adjacent letters from $A$ (or from $B$) multiply together according to the original binary operation of $A$ (or of $B$). If a group $B$ acts on a group $A$ by automorphisms, then the semidirect product, denoted $A\rtimes B$, is the free product $A*B$ modulo the relations $b(a)=bab^{-1}$ (that is, conjugating $a$ by $b$ in the semidirect product amounts to applying $b$ as an automorphism to $a$). Since every element of the semidirect product can be written uniquely as $ab$ for some $a\in A$, $b\in B$, the semidirect product is sometimes constructed directly on the set $A\times B$ with the multiplication rule $(a_1,b_1)(a_2,b_2)=(a_1\,b_1(a_2),\,b_1b_2)$.
If a group $B$ acts by permutations on $\{1,\cdots,n\}$ then there is an induced action of $B$ by automorphisms on $A^n$: simply permute the coordinates of any vector. Forming the resulting semidirect product $A^n\rtimes B$ yields the wreath product, denoted $A\wr B$. In particular, consider $H\wr C_2$ (where the nontrivial element $\sigma$ of $C_2$ acts by swapping the two coordinates). Every element looks like either $(h_1,h_2)$ or $(h_1,h_2)\sigma$, with the nontrivial multiplication rule $(h_1,h_2)\sigma(h_3,h_4)\sigma=(h_1h_4,h_2h_3)$ (since $\sigma=\sigma^{-1}$, this amounts to conjugating $(h_3,h_4)$ by $\sigma$, which swaps the coordinates, and then multiplying $(h_1,h_2)(h_4,h_3)$).
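To see the multiplication rule in action, here is a small sketch of my own with $H=\mathbb Z/3$ (written additively) standing in for a general group; it implements $H\wr C_2$ directly from the rule above and checks that the result is a group of order $|H|^2\cdot 2=18$:

```python
# A sketch of H wr C_2 for H = Z/3 (my own small stand-in for a general
# group H): elements are ((h1, h2), s) with s in {0, 1}, where s = 1
# records a factor of sigma.
n = 3
elements = [((h1, h2), s)
            for h1 in range(n) for h2 in range(n) for s in (0, 1)]
e = ((0, 0), 0)

def mul(x, y):
    (a1, a2), s = x
    (b1, b2), t = y
    if s == 1:        # sigma conjugates the next factor: swap its coordinates
        b1, b2 = b2, b1
    return (((a1 + b1) % n, (a2 + b2) % n), (s + t) % 2)

# The displayed rule: (h1,h2) sigma (h3,h4) sigma = (h1 h4, h2 h3).
h1, h2, h3, h4 = 1, 2, 2, 1
lhs = mul(mul(((h1, h2), 1), ((h3, h4), 0)), ((0, 0), 1))
assert lhs == (((h1 + h4) % n, (h2 + h3) % n), 0)

# Group axioms hold: associativity, identity behavior, inverses.
assert all(mul(mul(x, y), z) == mul(x, mul(y, z))
           for x in elements for y in elements for z in elements)
assert all(any(mul(x, y) == e for y in elements) for x in elements)
print("H wr C2 has order", len(elements))
```

Nothing in the code uses commutativity of $\mathbb Z/3$, so the same construction works verbatim with any finite group table in place of addition mod $n$.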
I leave it to you to check that
$$\begin{pmatrix}\alpha & 0 \\ 0 & \beta\end{pmatrix}\mapsto (\alpha,\beta) \qquad \begin{pmatrix}0 & {\rm id}_H \\ {\rm id}_H & 0\end{pmatrix}\mapsto (e,e)\sigma $$
defines an isomorphism ${\rm Aut}(H^2)\cong {\rm Aut}(H)\wr C_2$.
For more information see Automorphisms of Direct Products of Finite Groups I and II. The first paper proves a theorem under the stronger hypothesis that $H$ and $K$ are "perpendicular," i.e. share no common direct factor:
Theorem. If $H$ and $K$ have no common direct factor then
$${\rm Aut}(H\times K)\quad \cong\quad \left\{\begin{pmatrix}\alpha & \beta \\ \gamma & \delta \end{pmatrix} :~ \begin{matrix}\alpha\in{\rm Aut}(H) & \beta\in\hom(K,Z(H)) \\ \gamma\in\hom(H,Z(K)) & \delta\in{\rm Aut}(K)\end{matrix}\right\}.$$
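As a quick check of the theorem on a small case of my own choosing (not taken from the cited papers), let $H = S_3$ and $K = C_2$, which share no common direct factor. Since $Z(S_3)=1$, the theorem predicts $|\operatorname{Aut}(S_3\times C_2)| = |{\rm Aut}(S_3)|\cdot|\hom(C_2,1)|\cdot|\hom(S_3,C_2)|\cdot|{\rm Aut}(C_2)| = 6\cdot1\cdot2\cdot1 = 12$, which a brute-force count confirms:

```python
from itertools import product

# Brute-force count of Aut(S3 x C2); the theorem predicts 12.
def pmul(p, q):
    """Compose permutations of (0, 1, 2): (p*q)(i) = p(q(i))."""
    return tuple(p[q[i]] for i in range(3))

perms = [(0,1,2), (1,2,0), (2,0,1), (0,2,1), (2,1,0), (1,0,2)]
G = [(p, b) for p in perms for b in (0, 1)]

def mul(x, y):
    return (pmul(x[0], y[0]), (x[1] + y[1]) % 2)

e = ((0, 1, 2), 0)
gens = [((1, 2, 0), 0), ((1, 0, 2), 0), ((0, 1, 2), 1)]

# Express every element of G as a word in the generators (BFS).
word = {e: ()}
frontier = [e]
while frontier:
    new = []
    for g in frontier:
        for i, s in enumerate(gens):
            gh = mul(g, s)
            if gh not in word:
                word[gh] = word[g] + (i,)
                new.append(gh)
    frontier = new
assert len(word) == 12  # the three generators generate all of G

def extend(images, g):
    """Evaluate g's word with generator i replaced by images[i]."""
    out = e
    for i in word[g]:
        out = mul(out, images[i])
    return out

# An automorphism is determined by the images of the generators, so try
# every triple of images and keep the bijective homomorphisms.
count = 0
for images in product(G, repeat=3):
    phi = {g: extend(images, g) for g in G}
    if len(set(phi.values())) == len(G) and all(
            phi[mul(g, h)] == mul(phi[g], phi[h]) for g in G for h in G):
        count += 1
print("Aut(S3 x C2) has order", count)
```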
Ledermann and B. H. Neumann ("On the Order of the Automorphism Group of a Finite Group. I", Proc. Royal Soc. A, 1956) have shown the following:
Theorem. Let $n > 0$. There exists a bound $f(n)$ such that if $G$ is a finite group with $|G| \geq f(n)$, then $|\operatorname{Aut}(G)| \geq n$.
An immediate consequence is that up to isomorphism, there are only finitely many finite groups $G$ with $|\operatorname{Aut}(G)| \leq n$. Hence for any finite group $X$, up to isomorphism there are only finitely many finite groups $G$ with $\operatorname{Aut}(G) \cong X$.
Among infinite groups this is no longer true, and indeed there are infinitely many groups $G$ with $\operatorname{Aut}(G) \cong \mathbb{Z} / 2 \mathbb{Z}$.
Then there is of course the question of determining all finite groups $G$ with given automorphism group $\operatorname{Aut}(G) \cong X$. For this, see for example
Iyer, Hariharan K. On solving the equation Aut(X)=G. Rocky Mountain J. Math. 9 (1979), no. 4, 653–670.
This paper gives a solution to the problem in some cases, and determines for example all $G$ with $\operatorname{Aut}(G) \cong S_n$. There is also a different proof of the fact that there are only finitely many groups with a given automorphism group (Theorem 3.1 there).
Best Answer
The analysis is the same, whether the groups are finite or infinite.
Suppose $G$ and $H$ are two abelian groups. A morphism $\varphi\colon G\times H\to G\times H$ corresponds to four morphisms, $$\begin{align*} \varphi_{G,G}\colon &\ G\to G\\ \varphi_{G,H}\colon &\ G\to H\\ \varphi_{H,G}\colon &\ H\to G\\ \varphi_{H,H}\colon &\ H\to H \end{align*}$$ by the rule that $$\varphi(g,h) = \Bigl( \varphi_{G,G}(g)\varphi_{H,G}(h), \varphi_{G,H}(g)\varphi_{H,H}(h)\Bigr);$$ namely, $\varphi_{G,G}$ is the restriction of $\pi_G\circ\varphi$ to $G$; $\varphi_{G,H}$ is the restriction of $\pi_H\circ\varphi$ to $G$; $\varphi_{H,G}$ is the restriction of $\pi_G\circ\varphi$ to $H$; and $\varphi_{H,H}$ is the restriction of $\pi_H\circ\varphi$ to $H$.
We can view this as a $2\times 2$ matrix acting on a column vector, $$\left(\begin{array}{cc} \varphi_{G,G} & \varphi_{H,G}\\ \varphi_{G,H} & \varphi_{H,H}\end{array} \right)\left(\begin{array}{c}g\\h\end{array}\right) = \varphi\left(\begin{array}{c}g\\h\end{array}\right).$$
It is now straightforward to verify that if $\varphi$ and $\theta$ are morphisms, then the matrix corresponding to $\varphi\circ\theta$ is precisely the "product" of the matrices corresponding to $\varphi$ and to $\theta$. Thus the endomorphisms of $G\times H$ correspond to $$\left(\begin{array}{cc} \mathrm{Hom}(G,G) & \mathrm{Hom}(H,G)\\ \mathrm{Hom}(G,H) & \mathrm{Hom}(H,H) \end{array}\right).$$ When one of the factors is torsion (say $H$) and the other is torsionfree (say $G$), one of the off-diagonal entries is necessarily trivial (in this case, $\mathrm{Hom}(H,G)$). It is then straightforward to check that such an endomorphism is an automorphism if and only if both diagonal entries are automorphisms.
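To make the matrix calculus concrete, here is a small sketch of my own (not from the answer) for $G=\mathbb Z$, $H=C_4$: an endomorphism is encoded as a triple, and composing endomorphisms matches multiplying the corresponding lower-triangular matrices.

```python
from itertools import product

# My own sketch of the matrix picture for End(Z x C4).  Since
# Hom(Z,Z) = Z, Hom(Z,C4) = C4, Hom(C4,Z) = 0, Hom(C4,C4) = Z/4,
# an endomorphism is a lower-triangular "matrix" (a, 0; t, d), encoded
# here as a triple (a, t, d), acting by phi(g, h) = (a*g, (t*g + d*h) mod 4).
def apply(m, x):
    a, t, d = m
    g, h = x
    return (a * g, (t * g + d * h) % 4)

def compose(m1, m2):
    # The "matrix product" of (a1,0; t1,d1) and (a2,0; t2,d2).
    a1, t1, d1 = m1
    a2, t2, d2 = m2
    return (a1 * a2, (t1 * a2 + d1 * t2) % 4, (d1 * d2) % 4)

# Composition of endomorphisms agrees with the matrix product, checked
# on a sample of elements of Z x C4 and a sample of matrices.
sample = [(g, h) for g in range(-5, 6) for h in range(4)]
mats = list(product([1, -1, 2, 3], range(4), range(4)))
for m1 in mats:
    for m2 in mats:
        assert all(apply(m1, apply(m2, x)) == apply(compose(m1, m2), x)
                   for x in sample)
print("composition agrees with the matrix product")
```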
For example, $\mathrm{Aut}(\mathbb{Z}\times C_2)$ will then correspond to matrices of the form $$\left(\begin{array}{cc} \psi & 0\\ \theta & 1\end{array}\right),$$ where $\psi$ is an automorphism of $\mathbb{Z}$ (either the identity, or the map that sends $a$ to $-a$), $1$ is the identity map of $C_2$ (its only automorphism), and $\theta\colon\mathbb{Z}\to C_2$ is a morphism; either the map that sends everything to $0$, or else the map that sends $1\in\mathbb{Z}$ to the generator of $C_2$. So you have a total of four automorphisms.
The automorphisms of $\mathbb{Z}\times C_4$, on the other hand, correspond to matrices of the form $$\left(\begin{array}{cc} \psi & 0\\ \theta & \phi \end{array}\right)$$ where $\psi$ is an automorphism of $\mathbb{Z}$ (either the identity or multiply-by-minus-one); $\phi$ is an automorphism of $C_4$ (either the identity, or the map sending each element to its inverse), and $\theta\colon\mathbb{Z}\to C_4$ is any of the four possible morphisms, sending $1$ to your favorite element of $C_4$. So here we have $2\times 2\times 4 = 16$ possible automorphisms.
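Both counts can be verified mechanically. Here is a sketch using my own parametrization (not from the answer): an automorphism of $\mathbb{Z}\times C_n$ is a triple $(\psi,\theta,\phi)$ with $\psi=\pm1$, $\theta\in\mathbb Z/n$, and $\phi$ a unit mod $n$, and these triples form a group of size $4$ for $n=2$ and $16$ for $n=4$:

```python
from math import gcd

# Parametric check of the counts above for Aut(Z x C_n): an automorphism
# acts by (a, b) -> (psi*a, (theta*a + phi*b) mod n), where psi = +-1,
# theta is in Z/n, and phi is a unit mod n.
def automorphisms(n):
    return [(psi, theta, phi)
            for psi in (1, -1)
            for theta in range(n)
            for phi in range(n) if gcd(phi, n) == 1]

def apply(f, x, n):
    psi, theta, phi = f
    a, b = x
    return (psi * a, (theta * a + phi * b) % n)

def compose(f, g, n):
    # f after g: apply(compose(f, g), x) == apply(f, apply(g, x)).
    psi1, th1, ph1 = f
    psi2, th2, ph2 = g
    return (psi1 * psi2, (th1 * psi2 + ph1 * th2) % n, (ph1 * ph2) % n)

for n in (2, 4):
    A = automorphisms(n)
    ident = (1, 0, 1)
    # Closed under composition, and every element has an inverse:
    assert all(compose(f, g, n) in A for f in A for g in A)
    assert all(any(compose(f, g, n) == ident for g in A) for f in A)
    print("n =", n, "->", len(A), "automorphisms")
```

In general this parametrization gives $|\operatorname{Aut}(\mathbb Z\times C_n)| = 2\cdot n\cdot\varphi(n)$, matching the $4$ and $16$ computed above.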
For three or more factors, provided there are only finitely many factors, the answer is similar: the endomorphisms of $A_1\times\cdots\times A_n$ correspond to $n^2$ morphisms, $\varphi_{i,j}\colon A_i\to A_j$, and you can "arrange" them in a matrix form so that composition of endomorphisms corresponds to multiplication of matrices, and the automorphisms are precisely the invertible matrices. Having some groups be torsion and some torsionfree means that some of the entries must be equal to $0$ (the ones corresponding to morphisms from the torsion groups to the torsionfree groups).