Ideally an automorphism $\phi$ of $H\times H$ would look like $(\begin{smallmatrix}\alpha & \beta \\ \gamma & \delta \end{smallmatrix})$, with
$$\begin{pmatrix}\alpha & \beta \\ \gamma & \delta \end{pmatrix}\begin{pmatrix} h_1 \\ h_2 \end{pmatrix} = \begin{pmatrix} \alpha(h_1)\beta(h_2) \\ \gamma(h_1)\delta(h_2) \end{pmatrix} \tag{$\circ$}$$
for some functions $\alpha,\beta,\gamma,\delta:H\to H$ (we're interpreting elements of $H^2$ as column vectors). This carries on the spirit of ${\rm Aut}(\Bbb Z/p\Bbb Z\times\Bbb Z/p\Bbb Z)\cong{\rm GL}_2(\Bbb Z/p\Bbb Z)$. If we restrict the domain to the subgroups $H\times 1$ or $1\times H$ and project the codomain onto each coordinate, we see that $\alpha,\beta,\gamma,\delta$ must all be endomorphisms.
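As a sanity check on this analogy, here is a short brute-force computation (my own illustration, written additively): for $p=3$ it confirms that every $2\times 2$ matrix over $\Bbb Z/3\Bbb Z$ with nonzero determinant acts bijectively on $(\Bbb Z/3\Bbb Z)^2$, and that there are $|{\rm GL}_2(\Bbb F_3)|=(3^2-1)(3^2-3)=48$ of them.

```python
# Sketch (my own illustration, not from the post): for p = 3, count the
# invertible 2x2 matrices over Z/pZ and check each one permutes (Z/pZ)^2,
# illustrating Aut(Z/pZ x Z/pZ) ~= GL_2(Z/pZ).
from itertools import product

p = 3
count = 0
for a, b, c, d in product(range(p), repeat=4):
    if (a * d - b * c) % p == 0:         # singular: not an automorphism
        continue
    # apply the matrix to every column vector (h1, h2)
    images = {((a * h1 + b * h2) % p, (c * h1 + d * h2) % p)
              for h1, h2 in product(range(p), repeat=2)}
    assert len(images) == p * p          # the matrix acts bijectively
    count += 1

print(count)  # |GL_2(F_3)| = 48
```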
Indeed by restricting the domain and projecting the codomain we see that $\alpha,\beta,\gamma,\delta$ can be determined directly from $\phi$. For the matrix to be an automorphism we must have
$$\begin{pmatrix}\alpha & \beta \\ \gamma & \delta \end{pmatrix} \begin{pmatrix} h_1h_2 \\ h_3h_4 \end{pmatrix} = \begin{pmatrix}\alpha & \beta \\ \gamma & \delta \end{pmatrix}\begin{pmatrix} h_1 \\ h_3 \end{pmatrix} \cdot \begin{pmatrix}\alpha & \beta \\ \gamma & \delta \end{pmatrix}\begin{pmatrix} h_2 \\ h_4 \end{pmatrix} $$
which becomes
$$\begin{pmatrix} \alpha(h_1h_2)\beta(h_3h_4) \\ \gamma(h_1h_2)\delta(h_3h_4) \end{pmatrix} = \begin{pmatrix} \alpha(h_1)\beta(h_3) \\ \gamma(h_1)\delta(h_3) \end{pmatrix} \cdot \begin{pmatrix} \alpha(h_2)\beta(h_4) \\ \gamma(h_2)\delta(h_4) \end{pmatrix}$$
which becomes
$$\begin{pmatrix} \alpha(h_1)\alpha(h_2)\beta(h_3)\beta(h_4) \\ \gamma(h_1)\gamma(h_2)\delta(h_3)\delta(h_4) \end{pmatrix} = \begin{pmatrix} \alpha(h_1)\beta(h_3)\alpha(h_2)\beta(h_4) \\ \gamma(h_1)\delta(h_3)\gamma(h_2)\delta(h_4) \end{pmatrix}.$$
Cancelling yields $\alpha(h_2)\beta(h_3)=\beta(h_3)\alpha(h_2)$ and $\gamma(h_2)\delta(h_3)=\delta(h_3)\gamma(h_2)$ for all $h_2,h_3\in H$, which is equivalent to $[\alpha(H),\beta(H)]=[\gamma(H),\delta(H)]=1$. Conversely, these conditions guarantee that the formula $(\circ)$ defines a homomorphism. They do hold for the entries determined from $\phi$: since $[H\times 1,1\times H]=1\times 1$ and $\phi$ is an automorphism, $[\phi(H\times1),\phi(1\times H)]=1\times 1$, and $\alpha(H),\beta(H)$ are the first-coordinate projections of the images $\phi(H\times1),\phi(1\times H)$ (similarly $\gamma(H),\delta(H)$ are the second-coordinate projections). Since the conditions hold on the putative matrix entries determined from $\phi$, we conclude $\phi$ is in fact given by this matrix.
Now suppose $H$ is simple nonabelian.
Since $\alpha(H)\beta(H)=H$ and $[\alpha(H),\beta(H)]=1$, both $\alpha(H)$ and $\beta(H)$ are normal in $H$ (each is normalized by itself and centralized by the other, and together they generate $H$), and so each of $\alpha(H)$, $\beta(H)$ is either trivial or all of $H$. It's not possible for both to be trivial since $(h_1,h_2)\mapsto\alpha(h_1)\beta(h_2)$ is surjective, nor is it possible for both to be $H$ since $[\alpha(H),\beta(H)]=1$ while $H$ is nonabelian. Therefore, one of $\alpha,\beta$ is an automorphism and the other is the trivial endomorphism (which I'll denote $0$); the same goes for $\gamma,\delta$. The matrix can't be $(\begin{smallmatrix}\alpha & 0 \\ \beta & 0\end{smallmatrix})$ or $(\begin{smallmatrix} 0 & \alpha \\ 0 & \beta \end{smallmatrix})$, however, since these maps ignore one coordinate and so aren't injective.
In conclusion, every automorphism of $H^2$ looks like $(\begin{smallmatrix}\alpha & 0 \\ 0 & \beta \end{smallmatrix})$ or $(\begin{smallmatrix} 0 & \alpha \\ \beta & 0\end{smallmatrix})$ for automorphisms $\alpha,\beta$.
Before moving on I want to discuss free products, semidirect products, and wreath products. The best way to intuitively understand the free product $A*B$ of two groups $A$ and $B$ is as the set of all words formed from letters in the underlying sets of $A$ and $B$, with the understanding that adjacent letters from $A$ (or from $B$) multiply together according to the original binary operation of $A$ (or of $B$). If a group $B$ acts on a group $A$ by automorphisms, then the semidirect product, denoted $A\rtimes B$, is the free product $A*B$ modulo the relations $b(a)=bab^{-1}$ (that is, conjugating $a$ by $b$ in the semidirect product amounts to applying $b$ as an automorphism to $a$). Since every element of the semidirect product can be written uniquely as $ab$ for some $a\in A$, $b\in B$, the semidirect product is often constructed directly on the set $A\times B$ with the multiplication $(a_1,b_1)(a_2,b_2)=(a_1\,b_1(a_2),\,b_1b_2)$.
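A minimal sketch of the semidirect product construction (the example, $A=\Bbb Z/3$ acted on by $B=\Bbb Z/2$ via negation, is my own choice): multiplying pairs by the rule $(a_1,b_1)(a_2,b_2)=(a_1\,b_1(a_2),\,b_1b_2)$ produces a nonabelian group of order $6$, namely $S_3$.

```python
# A minimal sketch (my own example): the semidirect product Z/3 x| Z/2,
# where the nontrivial element of Z/2 acts on Z/3 by negation.
# Elements are pairs (a, b); the rule below encodes "conjugating a by b
# applies b as an automorphism": (a1, b1)(a2, b2) = (a1 + b1(a2), b1 + b2).
from itertools import product

def act(b, a):
    """Action of b in Z/2 on a in Z/3: the nontrivial b negates a."""
    return (-a) % 3 if b else a

def mul(x, y):
    (a1, b1), (a2, b2) = x, y
    return ((a1 + act(b1, a2)) % 3, (b1 + b2) % 2)

G = list(product(range(3), range(2)))

# Associativity holds, and the group is nonabelian of order 6 (it is S_3).
assert all(mul(mul(x, y), z) == mul(x, mul(y, z))
           for x in G for y in G for z in G)
assert any(mul(x, y) != mul(y, x) for x in G for y in G)
print(len(G))  # 6
```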
If a group $B$ acts by permutations on $\{1,\cdots,n\}$ then there is an induced action of $B$ by automorphisms on $A^n$: simply permute the coordinates of any vector. Forming the resulting semidirect product $A^n\rtimes B$ yields the wreath product, denoted $A\wr B$. In particular, consider $H\wr C_2$, where the nontrivial element $\sigma$ of $C_2$ swaps the two coordinates. Every element looks like either $(h_1,h_2)$ or $(h_1,h_2)\sigma$, with the nontrivial multiplication rule $(h_1,h_2)\sigma(h_3,h_4)\sigma=(h_1h_4,h_2h_3)$: since $\sigma=\sigma^{-1}$, this amounts to conjugating $(h_3,h_4)$ by $\sigma$, which swaps the coordinates, and then multiplying $(h_1,h_2)(h_4,h_3)$.
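The multiplication rule can be verified mechanically (a sketch with $H=\Bbb Z/5$ written additively, my own choice of example):

```python
# Sketch of H wr C_2 for H = Z/5 (additive), my own illustration. Elements
# are ((h1, h2), s) with s in {0, 1}; s = 1 is the coordinate swap sigma.
from itertools import product

n = 5

def mul(x, y):
    (h1, h2), s = x
    (h3, h4), t = y
    if s:                      # conjugating by sigma swaps the coordinates
        h3, h4 = h4, h3
    return (((h1 + h3) % n, (h2 + h4) % n), (s + t) % 2)

W = [((a, b), s) for a, b, s in product(range(n), range(n), range(2))]
assert len(W) == 2 * n * n     # |H wr C_2| = 2 |H|^2

# The rule from the text: (h1,h2) sigma (h3,h4) sigma = (h1 h4, h2 h3),
# written additively here.
for h1, h2, h3, h4 in product(range(n), repeat=4):
    lhs = mul(mul(((h1, h2), 1), ((h3, h4), 0)), ((0, 0), 1))
    assert lhs == (((h1 + h4) % n, (h2 + h3) % n), 0)
print(len(W))  # 50
```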
I leave it to you to check that
$$\begin{pmatrix}\alpha & 0 \\ 0 & \beta\end{pmatrix}\mapsto (\alpha,\beta) \qquad \begin{pmatrix}0 & {\rm id}_H \\ {\rm id}_H & 0\end{pmatrix}\mapsto (e,e)\sigma $$
defines an isomorphism ${\rm Aut}(H^2)\cong {\rm Aut}(H)\wr C_2$.
For more information see Automorphisms of Direct Products of Finite Groups I and II. The first paper proves a theorem under a weaker "perpendicularity" hypothesis on the groups $H$ and $K$:
Theorem. If $H$ and $K$ have no common direct factor then
$${\rm Aut}(H\times K)\quad \cong\quad \left\{\begin{pmatrix}\alpha & \beta \\ \gamma & \delta \end{pmatrix} :~ \begin{matrix}\alpha\in{\rm Aut}(H) & \beta\in\hom(K,Z(H)) \\ \gamma\in\hom(H,Z(K)) & \delta\in{\rm Aut}(K)\end{matrix}\right\}.$$
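A quick sanity check of the theorem in the smallest nontrivial case (my own example): $H=\Bbb Z/2$ and $K=\Bbb Z/3$ share no common direct factor, and the right-hand side has $|{\rm Aut}(H)|\cdot|{\rm Aut}(K)|\cdot|\hom(K,Z(H))|\cdot|\hom(H,Z(K))| = 1\cdot 2\cdot 1\cdot 1 = 2$ elements, matching $|{\rm Aut}(\Bbb Z/6)|=\varphi(6)=2$. The brute force below counts automorphisms of $\Bbb Z/2\times\Bbb Z/3$ directly.

```python
# Sanity check (my own example): count the automorphisms of Z/2 x Z/3 ~= Z/6
# by brute force over all functions on the 6 elements; the theorem's
# right-hand side predicts exactly 2.
from itertools import product

G = list(product(range(2), range(3)))   # Z/2 x Z/3

def add(x, y):
    return ((x[0] + y[0]) % 2, (x[1] + y[1]) % 3)

count = 0
for images in product(G, repeat=len(G)):
    f = dict(zip(G, images))
    if len(set(f.values())) == len(G) and all(
            f[add(x, y)] == add(f[x], f[y]) for x in G for y in G):
        count += 1

print(count)  # 2
```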
Best Answer
Let $G = {\rm AGL}_1(k)$. Then ${\rm Aut}(G) = {\rm A \Gamma L}_1(k) \cong G\rtimes\langle \gamma \rangle$, where $\gamma$ is a generator of the group of field automorphisms of $k$. So if $|k|=p^e$ with $p$ prime, then $|\gamma|=e$ and we can take $\gamma:x \mapsto x^p$ for $x \in k$.
Here is a sketch proof. Any $\alpha \in {\rm Aut}(G)$ must preserve the normal subgroup $k$ of the semidirect product and, since the complements are all conjugate in $G$, we can assume (by multiplying $\alpha$ by an inner automorphism) that it preserves the principal complement $k^\times$. Since $k^\times$ acts transitively by conjugation on $k \setminus \{0\}$, by multiplying $\alpha$ by an inner automorphism again, we can assume that $\alpha(1)=1$.
For $0 \ne a \in k < G$, I will use $\bar{a}$ to denote the corresponding element of the complement $k^\times$. So the semidirect product action is $\bar{a}b\bar{a}^{-1} = ab$.
Now, for $0 \ne a \in k$, using $\alpha(1)=1$, we have $$\alpha(a) = \alpha(\bar{a}1\bar{a}^{-1}) =\alpha(\bar{a})1\alpha(\bar{a})^{-1},$$ so $\overline{\alpha(a)} = \alpha(\bar{a})$. In other words $\alpha$ is acting in the same way on $k \setminus \{0\}$ and on $k^\times$.
So, for $a,b \in k \setminus \{0\}$, $$\alpha(a)\alpha(b) = \overline{\alpha(a)} \alpha(b) \overline{\alpha(a)} ^{-1} = \alpha(\bar{a}) \alpha(b)\alpha(\bar{a})^{-1} = \alpha(\bar{a}b\bar{a}^{-1}) = \alpha(ab),$$ and since $\alpha$ is automatically additive on $k$ (it restricts to a group automorphism of the additive group $k$), $\alpha$ is acting as a field automorphism of $k$.