Usually, an eigendecomposition is the least efficient way to compute the inverse of a matrix, but the symmetric tridiagonal Toeplitz matrix admits the nice eigendecomposition $\mathbf A=\mathbf V\mathbf D\mathbf V^\top$, where $$\mathbf D=\operatorname{diag}\left(1+2a\cos\frac{\pi}{n+1},\dots,1+2a\cos\frac{k\pi}{n+1},\dots,1+2a\cos\frac{n\pi}{n+1}\right)$$ and $\mathbf V$ is the symmetric, orthogonal matrix whose entries are $$v_{j,k}=\sqrt{\frac2{n+1}}\sin\frac{\pi jk}{n+1}.$$ Thus, to obtain the inverse, use $\mathbf A^{-1}=\mathbf V\mathbf D^{-1}\mathbf V^\top$; inverting a diagonal matrix is as easy as taking the reciprocals of its diagonal entries.
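As a sanity check, here is a minimal numpy sketch of this formula; the size $n$ and off-diagonal value $a$ below are arbitrary illustrative choices (picked so that $\mathbf A$ is invertible):

```python
import numpy as np

n, a = 5, 0.3  # illustrative size and off-diagonal value, chosen so A is invertible
k = np.arange(1, n + 1)

# Eigenvalues and the symmetric orthogonal eigenvector matrix V
d = 1 + 2 * a * np.cos(k * np.pi / (n + 1))
V = np.sqrt(2 / (n + 1)) * np.sin(np.outer(k, k) * np.pi / (n + 1))

# Build A = tridiag(a, 1, a) and its inverse via A^{-1} = V D^{-1} V^T
A = np.eye(n) + a * (np.eye(n, k=1) + np.eye(n, k=-1))
A_inv = V @ np.diag(1 / d) @ V.T

assert np.allclose(V @ V.T, np.eye(n))        # V is orthogonal
assert np.allclose(A, V @ np.diag(d) @ V.T)   # the eigendecomposition holds
assert np.allclose(A_inv, np.linalg.inv(A))   # inverse from reciprocated diagonal
```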
A construction can be found in Lemma 5.2.2, pp. 36–37 of Olga Ruff's master's thesis, The Jordan canonical forms of complex orthogonal and skew-symmetric matrices: characterization and examples.
To summarise, let $z=\frac{1-i}{2}$. Since $\pmatrix{z&\overline{z}\\ \overline{z}&z}^2=\pmatrix{0&1\\ 1&0}$, if we set $X$ to the $(2n+1)\times(2n+1)$ matrix
$$
\pmatrix{
z&&&&&&&&&&\overline{z}\\
&iz&&&&&&&&i\overline{z}\\
&&z&&&&&&\overline{z}\\
&&&iz&&&&i\overline{z}\\
&&&&\ddots&&\unicode{x22F0}\\
&&&&&\sqrt{(-1)^n}\\
&&&&\unicode{x22F0}&&\ddots\\
&&&i\overline{z}&&&&iz\\
&&\overline{z}&&&&&&z\\
&i\overline{z}&&&&&&&&iz\\
\overline{z}&&&&&&&&&&z},
$$
then
$$
\begin{aligned}
X^2&=\operatorname{antidiag}(1,-1,1,-1,\ldots,1)=DR=RD,\text{ where}\\
D&=\operatorname{diag}(1,-1,1,-1,\ldots,1),\\
R&=\operatorname{antidiag}(1,1,\ldots,1).
\end{aligned}
$$
Let $J=J_{2n+1}(0)$. Since $X$ is symmetric and $X^4=I$, we have
$$
(XJX^{-1})^T=X(X^2J^TX^2)X^{-1}
=XDRJ^TRDX^{-1}=XDJDX^{-1}=-XJX^{-1},
$$
i.e. $K=XJX^{-1}$ is skew-symmetric and similar to $J$. (The middle steps use $RJ^TR=J$, since reversing the rows and columns of the transposed Jordan block recovers the block, and $DJD=-J$, since conjugating by $D$ flips the sign of the superdiagonal.)
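For concreteness, one can verify $X^2=DR$, $X^4=I$, and the skew-symmetry of $K=XJX^{-1}$ numerically for small $n$. A sketch, assuming the construction of $X$ follows the displayed pattern with $i^n$ taken as one choice of $\sqrt{(-1)^n}$:

```python
import numpy as np

def build_X(n):
    """The (2n+1)x(2n+1) matrix X displayed above; i^n is used as sqrt((-1)^n)."""
    m = 2 * n + 1
    z = (1 - 1j) / 2
    X = np.zeros((m, m), dtype=complex)
    for k in range(n):                      # 2x2 corner blocks, scaled by i^k
        s = 1j ** k
        X[k, k] = X[m - 1 - k, m - 1 - k] = s * z
        X[k, m - 1 - k] = X[m - 1 - k, k] = s * z.conjugate()
    X[n, n] = 1j ** n                       # central entry
    return X

for n in (1, 2, 3):
    m = 2 * n + 1
    X = build_X(n)
    D = np.diag([(-1) ** j for j in range(m)])
    R = np.fliplr(np.eye(m))
    J = np.eye(m, k=1)                      # nilpotent Jordan block J_m(0)
    K = X @ J @ np.linalg.inv(X)
    assert np.allclose(X, X.T)                                   # X is symmetric
    assert np.allclose(X @ X, D @ R)                             # X^2 = DR
    assert np.allclose(np.linalg.matrix_power(X, 4), np.eye(m))  # X^4 = I
    assert np.allclose(K, -K.T)                                  # K is skew-symmetric
```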
We can prove by a parity argument that nilpotent Jordan blocks of even size are not similar to any complex skew-symmetric matrix. First, we need the following result of Horn and Merino (2009) (which is also part of Lemma 5.1.2 in Olga Ruff's thesis).
Lemma. A complex square matrix $A$ is similar to a complex skew-symmetric matrix only if $SA$ is skew-symmetric for some non-singular complex symmetric matrix $S$.
Proof. If $A=P^{-1}KP$ where $K^T=-K$, then $A^T=-(P^TP)A(P^TP)^{-1}$. Hence $S=P^TP$ is symmetric, non-singular, and $SA$ is skew-symmetric. $\square$
Now suppose an $m\times m$ nilpotent Jordan block $J=J_m(0)$ is similar to a skew-symmetric matrix. By the above lemma, $SJ$ is skew-symmetric for some non-singular symmetric matrix $S$. Note that the first column of $SJ$ is zero. Therefore
$$
S_{1j}=(SJ)_{1,j+1}=-(SJ)_{j+1,1}=0 \textrm{ for all } j<m.\tag{1}
$$
Moreover, by the symmetry of $S$ and skew-symmetry of $SJ$,
$$
S_{ij}=S_{ji}=(SJ)_{j,i+1}=-(SJ)_{i+1,j}=-S_{i+1,j-1}.\tag{2}
$$
Equality $(1)$ says that every entry on the first row of $S$ except the rightmost one is zero. Equality $(2)$ says that as we travel down an anti-diagonal of $S$, the entries are constant up to alternating signs. It follows from $(1)$ and $(2)$ that all entries of $S$ above the main anti-diagonal are zero and that the main anti-diagonal of $S$ is $\left(s,-s,s,-s,\ldots,(-1)^{m-1}s\right)$ for some $s$. As $S$ is non-singular, $s$ must be nonzero. Yet, as $S$ is symmetric, the first and last entries on the anti-diagonal must be equal. Hence $s=(-1)^{m-1}s$ and $m$ is odd.
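To see the parity phenomenon concretely: for odd $m$, the matrix $S$ with anti-diagonal $(s,-s,\ldots,(-1)^{m-1}s)$, $s=1$, is symmetric and non-singular and makes $SJ$ skew-symmetric, while for even $m$ this candidate fails. A small numerical sketch (of course the argument above shows more: for even $m$, no symmetric non-singular $S$ works at all):

```python
import numpy as np

def antidiag_alt(m):
    """S with anti-diagonal (1, -1, 1, ..., (-1)^(m-1)) and zeros elsewhere."""
    S = np.zeros((m, m))
    for i in range(m):
        S[i, m - 1 - i] = (-1) ** i
    return S

for m in range(2, 9):
    J = np.eye(m, k=1)                    # nilpotent Jordan block J_m(0)
    S = antidiag_alt(m)
    odd = (m % 2 == 1)
    assert np.allclose(S, S.T) == odd               # S is symmetric exactly when m is odd
    assert np.allclose(S @ J, -(S @ J).T) == odd    # SJ is skew-symmetric iff m is odd
```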
Best Answer
In fact, for any matrix $A$, skew-symmetric or otherwise, there is some invertible matrix $B$ such that $AB$ is nilpotent if and only if $A=0$ or $A$ is not invertible.
First and foremost, it's an elementary fact about column-equivalence that, for any two matrices $X,Y\in\Bbb F^{n\times m}$, the following are equivalent: (i) $\operatorname{col}X=\operatorname{col}Y$; (ii) $Y=XZ$ for some invertible $Z\in\Bbb F^{m\times m}$.
As a side note, the various Gaussian-like algorithms can even provide such a $Z$ explicitly, given $X$ and $Y$.
Secondly, notice that a vector subspace $V\subseteq \Bbb F^d$ is the column space of some nilpotent matrix if and only if $V=0$ or $V\ne \Bbb F^d$. The "only if" is obvious, because nilpotent endomorphisms on a non-zero vector space cannot be surjective. For the "if" part, consider a basis $v_1,\cdots,v_d$ such that $v_1,\cdots, v_k$ is a basis of $V$. Then, consider the endomorphism $N$ such that $Nv_1=0$, $Nv_j=v_{j-1}$ for $1<j\le k+1$, and $Nv_j=0$ for $j>k+1$.
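A small numpy sketch of this construction; the subspace $V$ and the basis extension below are arbitrary illustrative choices:

```python
import numpy as np

def nilpotent_with_column_space(P, k):
    """Given an invertible P whose columns v_1, ..., v_d extend a basis
    v_1, ..., v_k of V, return the matrix N with N v_1 = 0,
    N v_j = v_{j-1} for 1 < j <= k+1, and N v_j = 0 for j > k+1.
    Then N is nilpotent and col(N) = V."""
    d = P.shape[0]
    M = np.zeros((d, d))                 # the shift, written in the basis (v_j)
    for j in range(1, min(k + 1, d)):
        M[j - 1, j] = 1
    return P @ M @ np.linalg.inv(P)

# Illustrative example: V = span{(1,1,0,0), (0,0,1,0)} inside R^4 (k = 2 < d = 4)
V = np.array([[1., 0.], [1., 0.], [0., 1.], [0., 0.]])
P = np.column_stack([V, [[1.], [0.], [0.], [0.]], [[0.], [0.], [0.], [1.]]])
N = nilpotent_with_column_space(P, k=2)

assert np.allclose(np.linalg.matrix_power(N, 4), 0)    # N is nilpotent
r = np.linalg.matrix_rank
assert r(np.hstack([N, V])) == r(V) == r(N) == 2       # col(N) = V
```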
Second side note about effective computability: given $A$, you can explicitly find a basis of $\Bbb F^d$ that extends a basis of $\operatorname{col}A$, and then explicitly write down this endomorphism.
The two facts together prove the result. Notice that, in characteristic $\ne2$, skew-symmetric matrices automatically satisfy $\det A=0$ when $d$ is odd, since $\det A=\det(-A^\top)=(-1)^d\det A=-\det A$.
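The last remark is easy to check numerically over $\Bbb R$ (a sketch; for a real skew-symmetric matrix of odd size the determinant is exactly zero, so the computed value should vanish up to floating-point error):

```python
import numpy as np

rng = np.random.default_rng(0)
for d in (3, 5, 7):
    A = rng.standard_normal((d, d))
    A = A - A.T                            # a random skew-symmetric matrix of odd size d
    assert np.allclose(A, -A.T)            # skew-symmetry
    assert abs(np.linalg.det(A)) < 1e-8    # det A = 0 when d is odd
```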