This is not always true.
We can rewrite $\lambda_j\times \lambda_h'=\lambda'_j\times \lambda_h$ as $\frac{\lambda_j}{\lambda_j'} = \frac{\lambda_h}{\lambda_h'}$. If we can show from these $K$ equations that
$$
\frac{\lambda_1}{\lambda_1'} = \frac{\lambda_2}{\lambda_2'} = \dots = \frac{\lambda_K}{\lambda_K'}
$$
then there is a constant $C$ such that $\lambda_i = C \lambda_i'$ for all $i$; from knowing that $$\lambda_1 + \dots + \lambda_K = \lambda_1'+ \dots + \lambda_K'= 1,$$ we deduce that $C=1$ and therefore $\lambda_i = \lambda_i'$ for all $i$.
However, we cannot necessarily conclude that all the ratios $\frac{\lambda_j}{\lambda_j'}$ are equal; this depends on which $K$ equations we chose. Let $G$ be the graph with vertex set $\{1,\dots,K\}$ and an edge $hj$ whenever we choose the pair $(h,j)$ to form an equation. The condition for a unique solution is that $G$ be connected: if so, for any $a,b \in \{1,\dots,K\}$ there is a path from $a$ to $b$ in $G$, and we get $\frac{\lambda_a}{\lambda_a'} = \dots = \frac{\lambda_b}{\lambda_b'}$ by transitivity along that path.
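The connectivity condition can be checked mechanically. Here is a minimal sketch (the helper `ratios_all_equal` is my own name, not from the question) that tests whether a chosen set of pairs connects $\{1,\dots,K\}$, using a union-find over the indices:

```python
def ratios_all_equal(K, pairs):
    """Union-find connectivity test on vertices 1..K with the given edges.

    Returns True exactly when the chosen pairs force all ratios
    lambda_i / lambda'_i to be equal, i.e. when the graph is connected.
    """
    parent = list(range(K + 1))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for h, j in pairs:
        parent[find(h)] = find(j)  # merge the two components

    # Connected iff every vertex 1..K has the same representative.
    return len({find(i) for i in range(1, K + 1)}) == 1


# Vertex 4 isolated -> not connected, no unique solution:
print(ratios_all_equal(5, [(1, 2), (1, 3), (1, 5), (2, 5), (3, 5)]))  # False
# A path 1-2-3-4-5 -> connected, unique solution:
print(ratios_all_equal(5, [(1, 2), (2, 3), (3, 4), (4, 5)]))          # True
```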
But here is an example (for $K=5$) without a unique solution. Choose the $5$ equations
\begin{align}
\lambda_1 \lambda_2' &= \lambda_1'\lambda_2 \\
\lambda_1 \lambda_3' &= \lambda_1'\lambda_3 \\
\lambda_1 \lambda_5' &= \lambda_1'\lambda_5 \\
\lambda_2 \lambda_5' &= \lambda_2'\lambda_5 \\
\lambda_3 \lambda_5' &= \lambda_3'\lambda_5
\end{align}
Together, these equations are equivalent to $\frac{\lambda_1}{\lambda_1'} = \frac{\lambda_2}{\lambda_2'} = \frac{\lambda_3}{\lambda_3'} = \frac{\lambda_5}{\lambda_5'}$, but they leave out $\lambda_4$ entirely. So all $5$-tuples $\lambda$ with
$$
(\lambda_1, \lambda_2, \lambda_3, \lambda_4, \lambda_5) = (A \lambda_1', A\lambda_2', A\lambda_3', B\lambda_4', A\lambda_5')
$$
satisfy these $5$ equations, and if $A(\lambda_1'+\lambda_2'+\lambda_3'+\lambda_5') + B \lambda_4'= 1$, then $\lambda_1 + \dots + \lambda_5 = 1$ also holds. For any $0 < A < \frac{1}{\lambda_1'+\lambda_2'+\lambda_3'+\lambda_5'}$, we can set $B = \frac{1 - A(\lambda_1'+\lambda_2'+\lambda_3'+\lambda_5')}{\lambda_4'}$ and get a valid solution this way.
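As a sanity check, this family can be verified numerically. The concrete values of $\lambda_i'$ and $A$ below are arbitrary choices for illustration; $B$ is then forced by the normalization:

```python
# Primed values lambda'_1 .. lambda'_5, chosen to sum to 1 (illustration only).
lp = [0.1, 0.2, 0.3, 0.15, 0.25]
A = 0.8
S = lp[0] + lp[1] + lp[2] + lp[4]   # lambda'_1 + lambda'_2 + lambda'_3 + lambda'_5
B = (1 - A * S) / lp[3]             # forced by the normalization sum = 1
lam = [A * lp[0], A * lp[1], A * lp[2], B * lp[3], A * lp[4]]

# The five chosen equations lambda_j * lambda'_h = lambda'_j * lambda_h all hold:
for j, h in [(1, 2), (1, 3), (1, 5), (2, 5), (3, 5)]:
    assert abs(lam[j - 1] * lp[h - 1] - lp[j - 1] * lam[h - 1]) < 1e-12

# ...and the lambdas still sum to 1, yet lam differs from lp: no unique solution.
print(abs(sum(lam) - 1) < 1e-12, lam == lp)  # True False
```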
This is essentially the only kind of counterexample, though: cases where some variables are left out entirely. If every variable appears in at least one equation, then:
- For every $i = 2,\dots,K-1$, either $\lambda_1 \lambda_i' = \lambda_1'\lambda_i$ or $\lambda_i\lambda_K' = \lambda_i'\lambda_K$ is an equation, forcing $\frac{\lambda_i}{\lambda_i'}$ to be equal to either $\frac{\lambda_1}{\lambda_1'}$ or $\frac{\lambda_K}{\lambda_K'}$.
- To get $K$ equations, we need to either include both such equations for some $i$, or else include the equation $\lambda_1 \lambda_K' = \lambda_1'\lambda_K$. In either case, we can conclude that $\frac{\lambda_1}{\lambda_1'} = \frac{\lambda_K}{\lambda_K'}$. Therefore all $\frac{\lambda_i}{\lambda_i'}$ are equal.
Best Answer
Question: "Please, could you help me with the part 2., especially writing the decomposition?"
Answer: There is an operator - the Reynolds operator - that might be helpful. Let $G$ be any finite group and $V$ any finite dimensional $k$-vector space with a representation $\rho: G \rightarrow GL_k(V)$. Define for any $v\in V$
$$R_V(v):=R(v):= \frac{1}{n}\sum_{g\in G} gv$$
where $n:=\# G$ is the number of elements of $G$. The operator $R$ lies in $Hom_G(V,V)$, and $Im(R) = V^G$ is the sub-$k$-vector space of elements fixed by $G$. You get a short exact sequence of $G$-modules
$$(S1)\qquad 0 \rightarrow ker(R) \rightarrow V \rightarrow V^G \rightarrow 0.$$
This is because $R^2=R$, i.e. $R$ is an idempotent endomorphism. The sequence $S1$ splits, hence $V \cong V^G \oplus ker(R)$. This can be done for any $G$-module $V$.
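For concreteness, here is a small numerical sketch (my own example, not from the answer, taking $G = S_3$ acting on $k^3 = \mathbb{R}^3$ by permutation matrices) verifying that $R$ is idempotent and that its image consists of $G$-fixed vectors:

```python
import itertools
import numpy as np

def perm_matrix(p):
    """3x3 permutation matrix sending e_j to e_{p(j)}."""
    M = np.zeros((3, 3))
    for j, pj in enumerate(p):
        M[pj, j] = 1.0
    return M

# All six permutation matrices of S_3 and their average R = (1/|G|) sum_g g.
mats = [perm_matrix(p) for p in itertools.permutations(range(3))]
R = sum(mats) / len(mats)

print(np.allclose(R @ R, R))                  # True: R is idempotent
print(np.allclose(R, np.full((3, 3), 1/3)))   # True: R averages the coordinates
# Every vector in the image is fixed by G:
v = R @ np.array([1.0, 2.0, 3.0])
print(all(np.allclose(M @ v, v) for M in mats))  # True
```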
Example: When you do this to your $G:=S_n$-module $V$ you get $V \cong V^G \oplus ker(R)$. You may then apply the same procedure to $V^G$ and $ker(R)$, and so on. If $V$ is irreducible, then by Schur's lemma $R$ is either zero or an automorphism of $G$-modules; since $Im(R)=V^G$, the latter happens exactly when $V$ is the trivial representation. Moreover, if $V \cong V_1 \oplus V_2$ is a direct sum of $G$-modules, it follows that for any $(u,v)\in V_1 \oplus V_2$
$$R_V(u,v):=\frac{1}{n}\sum_g g(u,v)= \frac{1}{n}\sum_g (gu,gv)=$$
$$(\frac{1}{n}\sum_g gu, \frac{1}{n}\sum_g gv)=(R_{V_1}(u), R_{V_2}(v)).$$
Hence there is an equality $R_V \cong R_{V_1}\oplus R_{V_2}$. By Schur's lemma, $R_V = \lambda Id$ if $V$ is irreducible. Hence if $V \cong V_1 \oplus \cdots \oplus V_d$ is a decomposition of $V$ into irreducibles, it follows that $R_V \cong \oplus R_{V_i}$ and each $R_{V_i} = \lambda_i Id$ is multiplication by a complex number $\lambda_i$.
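These scalars can be seen numerically in the assumed $S_3$ example: the permutation representation on $\mathbb{R}^3$ decomposes as trivial $\oplus$ standard, and $R$ acts as $\lambda = 1$ on the trivial summand and $\lambda = 0$ on the two-dimensional standard one, so its eigenvalues are $\{0, 0, 1\}$:

```python
import itertools
import numpy as np

# Build the six 3x3 permutation matrices for S_3 and their average R.
mats = []
for p in itertools.permutations(range(3)):
    M = np.zeros((3, 3))
    for j, pj in enumerate(p):
        M[pj, j] = 1.0
    mats.append(M)
R = sum(mats) / len(mats)

# R is symmetric here, so eigvalsh applies; eigenvalues are 1 on the
# trivial summand and 0 (twice) on the standard summand.
print(np.allclose(sorted(np.linalg.eigvalsh(R)), [0.0, 0.0, 1.0]))  # True
```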
Note: If $V_{\lambda_i} \subseteq V$ is an eigenspace for $R$ with eigenvalue $\lambda_i$, then $V_{\lambda_i}$ is a $G$-submodule and there is a direct sum decomposition $V \cong V_{\lambda_i}\oplus W$, where $W$ is a "complementary" $G$-module. If $V \cong W^d$ with $W$ irreducible, then $R$ acts on $V$ with a single eigenvalue, so $R$ does not detect the direct sum decomposition of $V$. Still, $R$ gives some information on the decomposition of $V$. If $k[G]$ is the group algebra of $G$, you get an element
$$R:=\frac{1}{n}\sum_{g\in G} g \in Z(k[G])$$
in the center $Z(k[G])$ of $k[G]$.
If you choose any vector $v\in V$ and consider the $G$-module $V(v):=k[G]v \subseteq V$ generated by $v$, it follows that there is a direct sum decomposition $V \cong V(v)\oplus W(v)$, where $W(v)$ is complementary. This gives an "algorithm" for calculating the direct sum decomposition. You find some information below:
Regular representation and matrix coefficients
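To illustrate the cyclic module $V(v)=k[G]v$ concretely (again in the assumed example of $S_3$ acting on $\mathbb{R}^3$ by permutation matrices, with `dim_Vv` a hypothetical helper name): $\dim V(v)$ is the rank of the span of the orbit $\{gv : g \in G\}$, and it depends on the chosen $v$:

```python
import itertools
import numpy as np

def perm_matrix(p):
    """3x3 permutation matrix sending e_j to e_{p(j)}."""
    M = np.zeros((3, 3))
    for j, pj in enumerate(p):
        M[pj, j] = 1.0
    return M

mats = [perm_matrix(p) for p in itertools.permutations(range(3))]

def dim_Vv(v):
    """Dimension of the G-submodule k[G]v: rank of the stacked orbit vectors."""
    orbit = np.stack([M @ v for M in mats])
    return np.linalg.matrix_rank(orbit)

print(dim_Vv(np.array([1.0, 1.0, 1.0])))   # 1: v generates the trivial summand
print(dim_Vv(np.array([1.0, -1.0, 0.0])))  # 2: v generates the standard summand
print(dim_Vv(np.array([1.0, 2.0, 3.0])))   # 3: a generic v generates all of V
```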