Both eigenvalues and singular values are invariant under transposition: singular values for any matrix, square or rectangular, and eigenvalues whenever they are defined (i.e., for square matrices).
The eigenvalues of a square matrix $A$ are defined as the values $\lambda$ satisfying
$$\det(\lambda I-A)=0$$
For $A^T$, the equation $\det(\lambda I-A^T)=0$ is equivalent to $\det(\lambda I-A)=0$, since $(\lambda I-A)^T=\lambda I-A^T$ and the determinant is invariant under transposition. However, transposition does change the eigenvectors.
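As a quick numerical sanity check of this claim (using a hypothetical $3\times 3$ matrix chosen only for illustration), the eigenvalues of $A$ and $A^T$ coincide:

```python
import numpy as np

# Hypothetical example: the eigenvalues of A and A^T coincide,
# though the eigenvectors generally differ.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 4.0]])

eig_A  = np.sort_complex(np.linalg.eigvals(A))
eig_AT = np.sort_complex(np.linalg.eigvals(A.T))

print(np.allclose(eig_A, eig_AT))  # True: same spectrum
```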
The same fact can be demonstrated using the Singular Value Decomposition. Any matrix $A$, square or rectangular, can be decomposed as
$$A=U\Sigma V^T$$
Its transpose then decomposes as $A^T=V \Sigma^T U^T$.
The transpose swaps the singular vectors ($U$ and $V$ trade roles), but the singular values are preserved, since $\Sigma^T$ has the same diagonal entries as $\Sigma$.
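This is easy to verify numerically on a rectangular matrix (the matrix below is an arbitrary illustration):

```python
import numpy as np

# A 2x3 rectangular example: transposing swaps the roles of U and V
# in the SVD but leaves the singular values unchanged.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 1.0]])

s_A  = np.linalg.svd(A,   compute_uv=False)
s_AT = np.linalg.svd(A.T, compute_uv=False)

print(np.allclose(s_A, s_AT))  # True: singular values are preserved
```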
The inequality $\sigma_1(A)\ge\sigma_1(B)\ge\sigma_2(A)\ge\sigma_2(B)\ge\cdots\ge\sigma_{m-1}(A)\ge\sigma_{m-1}(B)\ge\sigma_m(A)$ (where $B$ is obtained from $A$ by deleting a row and a column) holds if $A$ is a (square) positive semidefinite matrix. It doesn't hold in general, not even if $A$ is Hermitian. For example, when
$$
A=\pmatrix{0&0&1\\ 0&0&0\\ 1&0&0},
$$
the three singular values of $A$ are $1,1,0$ but the two singular values of $B$ are $0,0$. In this counterexample, we also have $\sum_{i=2}^m\sigma_i(A)=1>0=\sum_{i=1}^{m-1}\sigma_i(B)$.
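A numerical check of this counterexample, assuming (as in the surrounding discussion) that $B$ is obtained from $A$ by deleting its first row and first column:

```python
import numpy as np

# First counterexample: interlacing fails since sigma_2(A) > sigma_1(B).
# Assumption: B deletes the first row and first column of A.
A = np.array([[0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0]])
B = np.delete(np.delete(A, 0, axis=0), 0, axis=1)

sigma_A = np.linalg.svd(A, compute_uv=False)
sigma_B = np.linalg.svd(B, compute_uv=False)

print(np.allclose(sigma_A, [1, 1, 0]))   # True
print(np.allclose(sigma_B, [0, 0]))      # True
print(sigma_A[1] > sigma_B[0])           # True: sigma_2(A) > sigma_1(B)
```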
Another counterexample: let
$$
A=\pmatrix{0&3&0\\ 2&0&-2\\ 1&0&1}.
$$
The three singular values of $A$ are $3,2\sqrt{2},\sqrt{2}$ and the two singular values of $B$ are $\sqrt{5}$ and $0$. Here we have $\sigma_2(A)=2\sqrt{2}>\sqrt{5}=\sigma_1(B)$ and $\sum_{i=2}^m\sigma_i(A)=3\sqrt{2}>\sqrt{5}=\sum_{i=1}^{m-1}\sigma_i(B)$.
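The second counterexample can be verified the same way (again assuming $B$ deletes the first row and first column of $A$):

```python
import numpy as np

# Second counterexample: sigma_2(A) = 2*sqrt(2) > sqrt(5) = sigma_1(B).
# Assumption: B deletes the first row and first column of A.
A = np.array([[0.0, 3.0,  0.0],
              [2.0, 0.0, -2.0],
              [1.0, 0.0,  1.0]])
B = np.delete(np.delete(A, 0, axis=0), 0, axis=1)

sigma_A = np.linalg.svd(A, compute_uv=False)
sigma_B = np.linalg.svd(B, compute_uv=False)

print(np.allclose(sigma_A, [3, 2*np.sqrt(2), np.sqrt(2)]))  # True
print(np.allclose(sigma_B, [np.sqrt(5), 0]))                # True
print(sigma_A[1] > sigma_B[0])                              # True
```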
It is true, however, that $\sigma_i(A)\le\sigma_{i-2}(B)$ for $3\le i\le\min\{m,n\}$. Indeed, if we delete a row (resp. a column) of $A$ to obtain a matrix $C$, we get $\sigma_j(A)\le\sigma_{j-1}(C)$. Similarly, if we delete a column (resp. a row) of $C$ to obtain a matrix $B$, we get $\sigma_k(C)\le\sigma_{k-1}(B)$. Combining the two inequalities, we get $\sigma_i(A)\le\sigma_{i-2}(B)$.
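An empirical spot-check of $\sigma_i(A)\le\sigma_{i-2}(B)$ on random matrices (a sketch, not a proof; the row/column deleted is chosen arbitrarily here):

```python
import numpy as np

# Check sigma_i(A) <= sigma_{i-2}(B) for random 5x4 matrices A,
# where B deletes one row and one column of A.
rng = np.random.default_rng(0)
for _ in range(100):
    A = rng.standard_normal((5, 4))
    B = np.delete(np.delete(A, 0, axis=0), 0, axis=1)   # 4x3
    sA = np.linalg.svd(A, compute_uv=False)             # 4 values
    sB = np.linalg.svd(B, compute_uv=False)             # 3 values
    # 1-based indices i = 3, 4: sigma_3(A) <= sigma_1(B), sigma_4(A) <= sigma_2(B)
    assert sA[2] <= sB[0] + 1e-12
    assert sA[3] <= sB[1] + 1e-12
print("all checks passed")
```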
Interestingly, the inequality $\sigma_j(A)\le\sigma_{j-1}(C)$ can be obtained from the interlacing inequality $\lambda_1(A)\ge\lambda_1(B)\ge\cdots\ge\lambda_{m-1}(A)\ge\lambda_{m-1}(B)\ge\lambda_m(A)$ for eigenvalues of Hermitian matrices. For a proof, see Corollary 7.3.6 of Horn and Johnson's Matrix Analysis (2nd ed.).
Best Answer
There is almost no relationship. For example, if we take $$A = \begin{bmatrix}x & 1 \\ 0 & 0\end{bmatrix}, \quad B = \begin{bmatrix}0 & 0 \\ 1 & y \end{bmatrix}$$ then the singular values of $AB$ are the square roots of the eigenvalues of $$(AB)^{\mathsf T} AB = \begin{bmatrix}1 & y \\ y & y^2\end{bmatrix}$$ so they are $\sqrt{1+y^2}$ and $0$. Similarly, the singular values of $BA$ are $\sqrt{1+x^2}$ and $0$. Even in this simple example, the nonzero singular value in one case can vary pretty much independently of the other case. (They must both be at least $1$, but we can tweak that by changing the $1$ in the matrices to some small $\epsilon>0$.)
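Reproducing this example numerically (the sample values of $x$ and $y$ below are arbitrary):

```python
import numpy as np

# For these A and B, the nonzero singular value of AB depends only on y,
# and that of BA only on x.
x, y = 0.7, -2.5   # arbitrary sample values
A = np.array([[x,   1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, y  ]])

s_AB = np.linalg.svd(A @ B, compute_uv=False)
s_BA = np.linalg.svd(B @ A, compute_uv=False)

print(np.allclose(s_AB, [np.sqrt(1 + y**2), 0]))  # True
print(np.allclose(s_BA, [np.sqrt(1 + x**2), 0]))  # True
```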
By taking determinants, we can conclude that the product of the singular values of $AB$ is $|\det(AB)|$, while the product of the singular values of $BA$ is $|\det(BA)|$. So if $A$ and $B$ are both square matrices, the singular values in the two cases have an equal product $|\det(A)\det(B)|$, which is some amount of dependence.
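This determinant relation is easy to confirm on random square matrices:

```python
import numpy as np

# The product of the singular values of a square matrix M equals |det(M)|,
# so for square A, B the products for AB and BA agree: |det(A)| * |det(B)|.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

prod_AB = np.prod(np.linalg.svd(A @ B, compute_uv=False))
prod_BA = np.prod(np.linalg.svd(B @ A, compute_uv=False))

print(np.allclose(prod_AB, abs(np.linalg.det(A) * np.linalg.det(B))))  # True
print(np.allclose(prod_AB, prod_BA))                                   # True
```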
On the other hand, by taking tensor products of the construction above, we can start with $2n \times 2n$ square matrices $A$ and $B$ where