My attempt.
$A^{k}=POP^{-1}$, where $O$ is an orthogonal matrix. I want to find an orthogonal matrix $O_{1}$ with $O_{1}^{k}=O$; I think this can be done because an invertible matrix always has a 'square root'. But even if this works, it only gives $A^{k}\sim O_{1}^{k}$, which still gives no information about $A$ itself. So how can one use the information given by a canonical form of $A^{k}$ to find information about $A$?
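For intuition on this first step, in the $2\times 2$ rotation case the orthogonal $k$-th root is explicit (a numerical sketch; `rot` is just a helper defined here, and the values of $k$ and $\theta$ are arbitrary). Note, though, that such an orthogonal root need not exist over $\mathbb{R}$ in general: if $\det O = -1$ and $k$ is even, then $\det(O_{1})^{k} = 1 \ne \det O$ for any real $O_{1}$.

```python
import numpy as np

def rot(t):
    """2x2 rotation matrix by angle t (orthogonal)."""
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

k, theta = 4, 1.2
O = rot(theta)           # an orthogonal matrix
O1 = rot(theta / k)      # an orthogonal k-th root of O

print(np.allclose(np.linalg.matrix_power(O1, k), O))  # True
```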
Further attempt.
I think maybe I should consider this problem over $\mathbb{C}$. Then $O$ is a (special case of a) complex normal matrix, so by the spectral theorem $O$ is diagonalizable over $\mathbb{C}$. This means $A^{k}$ is diagonalizable and all of its eigenvalues have modulus $1$; since $A$ is invertible, the same holds for $A$. So $A\sim M=\operatorname{diag}\{e^{i\theta_{1}},\cdots,e^{i\theta_{s}},\lambda_{s+1},\cdots,\lambda_{n}\},$ and it is clear that $M$ is similar to an orthogonal matrix over $\mathbb{C}$.
This proof feels strange, since we often only work over $\mathbb{R}$ when talking about orthogonal matrices, and the problem itself doesn't state which field we use.
Is there any better solution?
Best Answer
I assume $P$ is a real valued matrix. (If it requires $\mathbb C$ the below can be slightly altered to instead contemplate Hermitian forms.)
Consider the coordinate vector space $V=\mathbb R^n$ and the linear operator on this space given by $T:= P^{-1}AP$; note that $T^{k} = P^{-1}A^{k}P = O$. It suffices to show that $T$ is similar to a real orthogonal matrix. Since $T^k = O$ is nonsingular, so is $T$.
With $\langle, \rangle$ denoting the standard real inner product, we define the following custom symmetric bilinear form: for $v,v' \in V$,
$\langle v, v' \rangle_c := \frac{1}{k}\sum_{j=0}^{k-1}\langle T^j v, T^j v'\rangle$.
It is immediate that this form is positive definite (the $j=0$ term is the standard inner product and the remaining terms are positive semidefinite). Further notice
$\langle Tv, Tv' \rangle_c $
$= \frac{1}{k}\sum_{j=0}^{k-1}\langle T^{j+1}v, T^{j+1}v'\rangle $
$= \frac{1}{k}\Big(\sum_{j=0}^{k-2}\langle T^{j+1}v, T^{j+1}v'\rangle\Big) + \frac{1}{k}\langle T^{k}v, T^{k}v'\rangle$
$= \frac{1}{k}\Big(\sum_{j=1}^{k-1}\langle T^{j}v, T^{j}v'\rangle\Big) + \frac{1}{k}\langle v, v'\rangle \quad$ (using that $T^{k}=P^{-1}A^{k}P=O$ is orthogonal, so $\langle T^{k}v, T^{k}v'\rangle = \langle v, v'\rangle$)
$= \frac{1}{k}\sum_{j=0}^{k-1}\langle T^j v, T^j v'\rangle$
$=\langle v,v' \rangle_c $
This implies $T$ is an orthogonal operator with respect to the custom bilinear form.
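This invariance, together with positive definiteness, can be sanity-checked numerically (a sketch, not part of the proof; the particular $T$, built as a conjugated rotation so that $T^{k}=I$, and the random seed are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
k = 5

# Illustrative T with T^k orthogonal: conjugate a rotation R by an
# invertible G.  Since R^k = I, we get T^k = G R^k G^{-1} = I, which is
# orthogonal, while T itself is generally not.
theta = 2 * np.pi / k
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
G = rng.normal(size=(2, 2)) + 3 * np.eye(2)   # almost surely invertible
T = G @ R @ np.linalg.inv(G)

# Gram matrix of the custom form: <v, v'>_c = v^T S v'
S = sum(np.linalg.matrix_power(T, j).T @ np.linalg.matrix_power(T, j)
        for j in range(k)) / k

print(np.allclose(np.linalg.matrix_power(T, k), np.eye(2)))  # True: T^k is orthogonal
print(np.all(np.linalg.eigvalsh(S) > 0))                     # True: form is positive definite
print(np.allclose(T.T @ S @ T, S))                           # True: <Tv,Tv'>_c = <v,v'>_c
```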
Now express the action of $T$ with respect to a well-chosen basis
$T\mathbf B=\mathbf BQ$
where $\mathbf B$ is selected to be some orthonormal basis with respect to the custom bilinear form and $Q$ is some matrix. Since our vector space is $V=\mathbb R^n$, we note that $\mathbf B$ may also be interpreted as an invertible matrix.
Since $\langle Tv, Tv' \rangle_c = \langle v, v' \rangle_c$ and the columns of $\mathbf B$ are orthonormal for the custom form, $Q$ is orthogonal with respect to the standard inner product.
Finally
$T =T\big(\mathbf B\mathbf B^{-1}\big) = \big(T\mathbf B\big)\mathbf B^{-1}= \big(\mathbf BQ\big)\mathbf B^{-1}= \mathbf BQ\mathbf B^{-1}$
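This chain can also be checked numerically. Writing $S$ for the Gram matrix of the custom form, one concrete choice of $\langle,\rangle_c$-orthonormal basis is $\mathbf B = S^{-1/2}$ (so that $\mathbf B^{\top} S \mathbf B = I$); the setup below, mirroring the earlier conjugated-rotation example, is an illustrative assumption, not the only choice:

```python
import numpy as np

rng = np.random.default_rng(0)
k = 5
theta = 2 * np.pi / k
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
G = rng.normal(size=(2, 2)) + 3 * np.eye(2)
T = G @ R @ np.linalg.inv(G)                 # T^k = I, which is orthogonal

S = sum(np.linalg.matrix_power(T, j).T @ np.linalg.matrix_power(T, j)
        for j in range(k)) / k               # Gram matrix of <,>_c

# B = S^{-1/2} via the spectral decomposition of the symmetric matrix S;
# its columns are orthonormal for the custom form: B^T S B = I.
w, U = np.linalg.eigh(S)
B = U @ np.diag(w ** -0.5) @ U.T

Q = np.linalg.inv(B) @ T @ B                 # matrix of T in the basis B

print(np.allclose(Q.T @ Q, np.eye(2)))           # True: Q is orthogonal
print(np.allclose(B @ Q @ np.linalg.inv(B), T))  # True: T = B Q B^{-1}
```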
thus $T$, and hence $A$, is similar to a real orthogonal matrix.
Note
The above also gives a proof of why $M^k = I$ implies that $M$ is diagonalizable over $\mathbb C$, since $I$ is just a special case of a real orthogonal matrix: the argument shows that $M$ is similar to a real orthogonal matrix, which by the spectral theorem is similar (over $\mathbb C$) to a diagonal matrix. The standard proof of this result that you'll see on this site uses a minimal polynomial argument, though the minimal polynomial doesn't seem to apply as well to OP's question.
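As a quick illustration of this note (the particular $M$, a conjugated cyclic permutation, is a hypothetical example, and diagonalizability is checked via invertibility of the eigenvector matrix):

```python
import numpy as np

rng = np.random.default_rng(1)

C = np.array([[0., 0., 1.],
              [1., 0., 0.],
              [0., 1., 0.]])                 # cyclic permutation: C^3 = I
G = rng.normal(size=(3, 3)) + 3 * np.eye(3)  # almost surely invertible
M = G @ C @ np.linalg.inv(G)                 # M^3 = I, M generally not normal

vals, vecs = np.linalg.eig(M)
print(np.allclose(np.linalg.matrix_power(M, 3), np.eye(3)))  # True
print(np.allclose(np.abs(vals), 1.0))        # True: eigenvalues on the unit circle
print(abs(np.linalg.det(vecs)) > 1e-8)       # True: eigenvector matrix invertible,
                                             # so M is diagonalizable over C
```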