To answer your second question first: an orthogonal matrix $O$ satisfies $O^TO=I$, so $\det(O^TO)=(\det O)^2=1$, and hence $\det O = \pm 1$. The determinant of a matrix tells you by what factor the (signed) volume of a parallelepiped is multiplied when you apply the matrix to its edges; therefore hitting a volume in $\mathbb{R}^n$ with an orthogonal matrix either leaves the signed volume unchanged ($\det O = 1$, so it is a rotation) or multiplies it by $-1$ ($\det O = -1$, so it is a reflection). In either case the unsigned volume is preserved.
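As a quick numerical sanity check, here is a pure-Python sketch (the angle, the particular reflection, and the helper name `det2` are arbitrary illustrative choices):

```python
import math

def det2(m):
    """Determinant of a 2x2 matrix stored as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

theta = math.pi / 3  # an arbitrary angle

# Rotation by theta: orthogonal, determinant +1.
rotation = [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta),  math.cos(theta)]]

# Reflection across the x-axis: orthogonal, determinant -1.
reflection = [[1, 0],
              [0, -1]]

print(round(det2(rotation), 10))  # 1.0  (cos^2 + sin^2 = 1)
print(det2(reflection))           # -1
```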
To answer your first question: the action of a matrix $A$ can be neatly expressed via its singular value decomposition, $A=U\Lambda V^T$, where $U$, $V$ are orthogonal matrices and $\Lambda$ is a matrix with non-negative values along the diagonal (n.b. this makes sense even if $A$ is not square!). The values on the diagonal of $\Lambda$ are called the singular values of $A$, and if $A$ is square and symmetric they will be the absolute values of the eigenvalues.
The way to think about this is that the action of $A$ is first to rotate/reflect to a new basis, then scale along the directions of your new (intermediate) basis, before a final rotation/reflection.
With this in mind, notice that $A^T=V\Lambda^T U^T$, so the action of $A^T$ is to perform the inverse of the final rotation, then scale the new shape along the canonical unit directions, and then apply the inverse of the original rotation.
Furthermore, when $A$ is symmetric, the spectral theorem lets you write $A = Q\Lambda Q^T$ with $Q$ orthogonal, where the diagonal entries of $\Lambda$ are now the eigenvalues and may be negative; this coincides with the SVD with $U = V = Q$ exactly when $A$ is also positive semidefinite (the SVD is not unique, so $A=A^T$ alone does not force $U=V$). Either way, the action of a symmetric matrix can be regarded as a rotation to a new basis, then scaling along this new basis, and finally rotating back to the first basis.
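The rotate, scale, rotate-back picture can be sketched in pure Python; the matrices `Q` and `D` below are arbitrary illustrative choices (note the negative diagonal entry, which is allowed for a symmetric, though not positive semidefinite, matrix):

```python
import math

def matmul(a, b):
    """Multiply two 2x2 matrices stored as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(m):
    return [[m[j][i] for j in range(2)] for i in range(2)]

theta = math.pi / 6
Q = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]  # rotation (orthogonal)
D = [[3, 0],
     [0, -2]]                              # diagonal scaling, one negative entry

# Rotate to the eigenbasis, scale, rotate back: A = Q D Q^T.
A = matmul(matmul(Q, D), transpose(Q))

# A is symmetric ...
assert abs(A[0][1] - A[1][0]) < 1e-12
# ... and the first column of Q is an eigenvector with eigenvalue 3: A q = 3 q.
q = [Q[0][0], Q[1][0]]
Aq = [A[0][0] * q[0] + A[0][1] * q[1], A[1][0] * q[0] + A[1][1] * q[1]]
assert all(abs(Aq[i] - 3 * q[i]) < 1e-12 for i in range(2))
```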
The inner product (dot product on $\mathbb{R}^n$ or $\mathbb{C}^n$) is defined as
$$ \langle x,y \rangle = x^* y = \sum_{k=0}^{n-1} \overline{x_k} y_k
$$
Some definitions put the conjugation on the second argument instead ($x^\intercal \overline{y}$), but the two conventions are essentially the same.
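The definition above can be checked directly with Python's built-in complex numbers; `inner` is just an illustrative helper name, and the vectors are arbitrary:

```python
def inner(x, y):
    """<x, y> = sum over k of conj(x_k) * y_k (conjugation on the first slot)."""
    return sum(xk.conjugate() * yk for xk, yk in zip(x, y))

x = [1 + 2j, 3 - 1j]
y = [2 - 1j, 1j]

print(inner(x, y))  # (-1-2j)
# Conjugate symmetry: <x, y> = conj(<y, x>).
assert inner(x, y) == inner(y, x).conjugate()
# <x, x> is real and non-negative (here |1+2j|^2 + |3-1j|^2 = 15).
assert inner(x, x).imag == 0 and inner(x, x).real >= 0
```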
$$ \langle Ax,y \rangle = (Ax)^* y = x^* A^* y = \langle x, A^* y \rangle
$$
Since you are using $\mathbb{R}^n$, the conjugation can be dropped without any problem.
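The adjoint identity $\langle Ax,y \rangle = \langle x, A^* y \rangle$ can also be verified numerically; the matrix and vectors below are arbitrary choices, and the helper names are illustrative:

```python
def inner(x, y):
    """<x, y> with conjugation on the first slot."""
    return sum(xk.conjugate() * yk for xk, yk in zip(x, y))

def matvec(m, v):
    return [sum(m[i][j] * v[j] for j in range(2)) for i in range(2)]

def conj_transpose(m):
    """A^*: transpose and conjugate every entry."""
    return [[m[j][i].conjugate() for j in range(2)] for i in range(2)]

A = [[1 + 1j, 2],
     [0, 3 - 2j]]
x = [1j, 2 - 1j]
y = [3, 1 + 4j]

lhs = inner(matvec(A, x), y)                   # <Ax, y>
rhs = inner(x, matvec(conj_transpose(A), y))   # <x, A* y>
assert abs(lhs - rhs) < 1e-12
```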
The transpose of the transpose is the original matrix, as transposition is an involution.
So you're right, the transpose of the conjugate transpose of a matrix $A$ is just the conjugate $\bar A$, although I think this notation is not heavily used in linear algebra.
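Both facts are easy to confirm with a small pure-Python check (the matrix and helper names are illustrative):

```python
def transpose(m):
    return [[m[j][i] for j in range(len(m))] for i in range(len(m[0]))]

def conj(m):
    return [[e.conjugate() for e in row] for row in m]

A = [[1 + 2j, 3],
     [0 - 1j, 4 + 4j]]

# Transposition is an involution: (A^T)^T = A.
assert transpose(transpose(A)) == A

# The transpose of the conjugate transpose is the plain conjugate:
# A^* = conj(A)^T, so (A^*)^T = conj(A).
A_star = transpose(conj(A))
assert transpose(A_star) == conj(A)
```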