By definition, an automorphism is an isomorphism from $G$ to $G$, whereas a general isomorphism may have a different domain and codomain.
In general (in any category), an automorphism is defined as an isomorphism $f:G \to G$.
Your definition of automorphism is a bit too strong, I feel. Leaving something "unchanged" in the sense that the number in each cell of a matrix stays the same is a VERY strong notion of equality. However, you're not using that notion of equality for graphs.
As corrected by Morgan Rodgers, strict equality actually works fine: the adjacency matrix is indeed unchanged under an automorphism, provided the adjacency matrices are constructed from the same vertex set with vertex $v_i$ always corresponding to row and column $i$.
Everything below still holds, but any mention of row-column swapping may be trivially true.
If a graph undergoes an automorphism, it has been turned into another graph with equivalent structure.
$G_1 = (V_1, E_1)$
$G_2 = (V_2, E_2)$
$f(G_1) = G_2$ is an automorphism if and only if:
- There exists a bijection $g : V_1 \to V_2$ such that $(x, y) \in E_1$ if and only if $(g(x), g(y)) \in E_2$
This is not the same as "$f(G_1) = G_2$ if and only if $V_1 = V_2$ and $e \in E_1 \iff e \in E_2$", which would be true equality. This stricter definition is closer to the one you were holding for adjacency matrices. Essentially, every vertex and every edge is left "unchanged" element by element, so under this equality no swap at all qualifies as an automorphism. This is not what you intend at all, I'm fairly certain.
If you consider that you're already constructing equivalence classes between graphs (otherwise, the only automorphism on graphs is the identity map), then you can similarly construct an equivalence class on the adjacency matrices.
You define a new $=$ relation such that $A_1 = A_2$ if and only if $A_2$ can be attained by a sequence of $(i,j)$ row-column swaps (swap the $i,j$ rows and $i,j$ columns) from $A_1$. For any finite graph, this is verifiable exhaustively by trying every possible sequence of swaps to check equality. This equality relation is far more relaxed than cell-by-cell equality and it gives you exactly what you need to verify automorphisms between the graphs these adjacency matrices represent.
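For small matrices, this exhaustive check can be sketched directly in Python (the function name is mine, for illustration). Since sequences of $(i,j)$ row-column swaps generate every simultaneous permutation of the indices, it suffices to try each permutation once rather than every sequence of swaps:

```python
from itertools import permutations

def matrices_equal_up_to_swaps(A1, A2):
    """Check whether A2 can be reached from A1 by a sequence of
    simultaneous row-column swaps.  Such swaps generate all
    permutations of the indices, so we test each permutation
    directly.  Exhaustive: only viable for small n."""
    n = len(A1)
    if len(A2) != n:
        return False
    for perm in permutations(range(n)):
        if all(A1[perm[i]][perm[j]] == A2[i][j]
               for i in range(n) for j in range(n)):
            return True
    return False
```

For example, two different labelings of the 4-cycle are equal under this relation, while the 4-cycle and the 4-vertex path are not (their degree sequences already differ).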
This equality relation is consistent with any automorphism on your graph. If your automorphism swaps the $i,j$ vertices with each other, this bijectively maps to an automorphism in your adjacency matrices by swapping the $i,j$ row-columns. The reverse also holds following similar logic.
Any automorphism on your graphs can be factored into a minimal chain of single-swap automorphisms. Any automorphisms on your adjacency matrices can be constructed by composition of single-swap automorphisms (closure property of automorphisms). As such, there is a unique automorphism on the graphs for each automorphism on the adjacency matrices. The reverse is also true following similar logic.
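The factoring step above can be made concrete: given a permutation (written as a tuple sending $i$ to `perm[i]`), its cycle decomposition yields a chain of transpositions realising it, and the chain has length $n$ minus the number of cycles, which is minimal. A sketch (function names are mine):

```python
def factor_into_swaps(perm):
    """Factor a permutation (a tuple sending i to perm[i]) into a
    sequence of transpositions (i, j) via its cycle decomposition."""
    n = len(perm)
    seen = [False] * n
    swaps = []
    for start in range(n):
        # Collect the cycle through `start`, if not yet visited.
        cycle = []
        i = start
        while not seen[i]:
            seen[i] = True
            cycle.append(i)
            i = perm[i]
        # The cycle (c0 c1 ... ck) factors as the swaps
        # (c0, ck), ..., (c0, c2), (c0, c1), applied in that order.
        for j in reversed(cycle[1:]):
            swaps.append((start, j))
    return swaps

def apply_swaps(swaps, n):
    """Compose the swaps (left to right) back into a permutation,
    to verify the factorisation."""
    result = list(range(n))
    for i, j in swaps:
        result[i], result[j] = result[j], result[i]
    return tuple(result)
```

For instance, the permutation $(0\,1\,2\,3)(4\,5)$ factors into four swaps: three for the 4-cycle plus one for the 2-cycle.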
We now have a bijective map from the automorphism group of the graphs to the automorphism group of their adjacency matrices, and this map respects composition. Therefore, the groups are isomorphic.
Best Answer
The problem is equivalent to checking whether two graphs have the same adjacency matrix up to permutation (that is, whether $A_1 = P^{-1}A_2 P$ for some permutation matrix $P$). But that's not easy.
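In matrix form, a brute-force version of this check might look like the following sketch (names are mine; note $P^{-1} = P^T$ for a permutation matrix, and the search is hopeless beyond very small $n$, which is exactly the point):

```python
import numpy as np
from itertools import permutations

def perm_matrix(perm):
    """Permutation matrix P with P[i, perm[i]] = 1, so P^{-1} = P^T."""
    n = len(perm)
    P = np.zeros((n, n), dtype=int)
    P[np.arange(n), list(perm)] = 1
    return P

def isomorphic_brute_force(A1, A2):
    """Test whether A1 = P^{-1} A2 P for some permutation matrix P
    by trying all n! candidates."""
    n = len(A1)
    return any(np.array_equal(A1, perm_matrix(p).T @ A2 @ perm_matrix(p))
               for p in permutations(range(n)))
```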
To start with, maybe you should look at an example: $$ \begin{bmatrix} 0 & 1 & 0 & 0 & 1 & 0 & 0 & 1 \\ 1 & 0 & 1 & 0 & 0 & 1 & 0 & 0 \\ 0 & 1 & 0 & 1 & 0 & 0 & 1 & 0 \\ 0 & 0 & 1 & 0 & 1 & 1 & 0 & 0 \\ 1 & 0 & 0 & 1 & 0 & 0 & 0 & 1 \\ 0 & 1 & 0 & 1 & 0 & 0 & 1 & 0 \\ 0 & 0 & 1 & 0 & 0 & 1 & 0 & 1 \\ 1 & 0 & 0 & 0 & 1 & 0 & 1 & 0 \\ \end{bmatrix} \qquad \begin{bmatrix} 0 & 1 & 1 & 0 & 1 & 0 & 0 & 0 \\ 1 & 0 & 1 & 0 & 0 & 0 & 1 & 0 \\ 1 & 1 & 0 & 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 1 & 1 & 0 & 0 \\ 1 & 0 & 0 & 1 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 & 0 & 0 & 1 & 1 \\ 0 & 1 & 0 & 0 & 0 & 1 & 0 & 1 \\ 0 & 0 & 0 & 0 & 1 & 1 & 1 & 0 \\ \end{bmatrix} $$ Can you turn one matrix into the other? How do you tell? You probably shouldn't try all $8!$ permutation matrices, one at a time. But you also can't notice any useful features just by looking. (For small examples like this one, the representation as a graph diagram is probably more useful than the adjacency matrix.)
This is not to say that adjacency matrices aren't helpful to look at. For example, we can look at their eigenvalues; the first matrix has eigenvalues $$3,\frac{1}{2} \left(-\sqrt{17}-1\right),\frac{1}{2} \left(-\sqrt{5}-1\right),\frac{1}{2} \left(-\sqrt{5}-1\right),\frac{1}{2} \left(\sqrt{17}-1\right),\frac{1}{2} \left(\sqrt{5}-1\right),\frac{1}{2} \left(\sqrt{5}-1\right),0$$ and the second has eigenvalues $$3,-\sqrt{2}-1,-\sqrt{3},\sqrt{3},-1,-1,1,\sqrt{2}-1$$ so they are not isomorphic. But this is not a foolproof method; it works most of the time, but the graph isomorphism problem was already easy most of the time; it's just hard in highly symmetric cases. Non-isomorphic graphs do exist whose adjacency matrices have the same eigenvalues (cospectral graphs).
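As a quick sanity check, the spectra of the two example matrices can be compared numerically. A sketch using NumPy (`eigvalsh` applies since adjacency matrices of undirected graphs are symmetric; it returns real eigenvalues in ascending order, so the sorted spectra can be compared directly):

```python
import numpy as np

A1 = np.array([
    [0,1,0,0,1,0,0,1],
    [1,0,1,0,0,1,0,0],
    [0,1,0,1,0,0,1,0],
    [0,0,1,0,1,1,0,0],
    [1,0,0,1,0,0,0,1],
    [0,1,0,1,0,0,1,0],
    [0,0,1,0,0,1,0,1],
    [1,0,0,0,1,0,1,0],
])
A2 = np.array([
    [0,1,1,0,1,0,0,0],
    [1,0,1,0,0,0,1,0],
    [1,1,0,1,0,0,0,0],
    [0,0,1,0,1,1,0,0],
    [1,0,0,1,0,0,0,1],
    [0,0,0,1,0,0,1,1],
    [0,1,0,0,0,1,0,1],
    [0,0,0,0,1,1,1,0],
])

# Both matrices are symmetric, so eigvalsh gives the sorted spectra.
spec1 = np.linalg.eigvalsh(A1)
spec2 = np.linalg.eigvalsh(A2)

# Isomorphic graphs must share a spectrum; these don't, so the graphs
# cannot be isomorphic.  (The converse fails for cospectral graphs.)
print(np.allclose(spec1, spec2))  # False
```

Both spectra top out at $3$, as expected for connected 3-regular graphs, but they differ (e.g. only the first matrix has the eigenvalue $0$).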