I'm assuming
- $\{X_j\}_{j\in \mathbb{N}}$ is an iid sequence, i.e. the $X_j$ are identically distributed and mutually independent (not merely pairwise independent, as you have it).
- $N$ is independent of $\{X_j\}_{j\in \mathbb{N}}$.
By Adam's Law, the mean of $Y$ is
$$\small \begin{aligned} E\left[\sum_{j=1}^N X_j \right] &=E\left[E\left[\sum_{j=1}^N X_j\Bigg|N \right]\right]\\
&=E\left[\sum_{j=1}^NE\left[ X_j\big|N \right]\right]&&\qquad (\text{linearity of }E[\cdot ])\\
&=E\left[\sum_{j=1}^NE\left[ X_j \right]\right]&&\qquad (N,X_j\text{ indep.})\\
&=E\left[NE\left[ X_1 \right]\right]&&\qquad (\{X_j\}_{j\in \mathbb{N}}\text{ identically distrib.})\\
&=E\left[N\right]E\left[ X_1 \right],&&\qquad (\text{linearity of }E[\cdot ])\end{aligned}$$
which is a special case of Wald's equation.
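As a sanity check, the mean formula can be verified by simulation. The sketch below is my own illustration (not part of the original argument); the choices $N \sim \text{Geometric}(1/2)$ on $\{1,2,\dots\}$ and $X_j \sim \text{Uniform}(0,2)$ are arbitrary, giving $E[N]E[X_1] = 2 \cdot 1 = 2$:

```python
import random

random.seed(0)

def wald_check(trials=200_000):
    # N ~ Geometric(p) on {1, 2, ...} and X_j ~ Uniform(0, 2), all independent,
    # so E[N] = 1/p = 2 and E[X_1] = 1; Wald predicts E[Y] = E[N] * E[X_1] = 2.
    p = 0.5
    total = 0.0
    for _ in range(trials):
        n = 1
        while random.random() >= p:  # sample N ~ Geometric(p)
            n += 1
        total += sum(random.uniform(0.0, 2.0) for _ in range(n))
    return total / trials  # Monte Carlo estimate of E[Y]

print(wald_check())  # expect roughly 2.0
```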
By Eve's law, the variance of $Y$ is
$$ \small\begin{aligned}\text{Var}\left[\sum_{j=1}^N X_j \right]&=E\left[\text{Var}\left[\sum_{j=1}^N X_j\Bigg|N \right]\right]+\text{Var}\left[E\left[\sum_{j=1}^N X_j\Bigg|N \right]\right]\\
&=E\left[\text{Var}\left[\sum_{j=1}^N X_j\Bigg|N \right]\right]+\text{Var}\left[NE[X_1]\right]&&(\text{see above})\\
&=E\left[\text{Var}\left[\sum_{j=1}^N X_j \right]\right]+\text{Var}\left[NE[X_1]\right]&&(N,\{X_j\}_{j\in \mathbb{N}} \text{ indep.})\\
&=E\left[\sum_{j=1}^N\text{Var}\left[ X_j \right]\right]+\text{Var}\left[NE[X_1]\right]&& (\{X_j\}_{j\in \mathbb{N}} \text{ indep.})\\
&=E\left[N\text{Var}\left[ X_1 \right]\right]+\text{Var}\left[NE[X_1]\right]&& (\{X_j\}_{j\in \mathbb{N}} \text{ identically distrib.})\\
&=E\left[N\right]\text{Var}\left[ X_1 \right]+\text{Var}\left[N\right](E[X_1])^2&& (\text{linearity of } E[\cdot ];\ \text{Var}[aN]=a^2\text{Var}[N])\\
&= E\left[N\right]E\left[ X_1^2 \right]+\left(\text{Var}\left[N\right]-E\left[N\right]\right)(E[X_1])^2.
\end{aligned}$$
For $N$ Poisson with rate parameter $\lambda$, we have $E[N]=\text{Var}[N]=\lambda$, so the above simplifies to
$$\begin{aligned}E[Y]&=\lambda E[X_1],\\
\text{Var}[Y]&=\lambda E\left[ X_1^2 \right],\end{aligned}$$
as obtained in the wiki link.
Note that the variance computation required stronger assumptions than the mean computation; this is typical when one wants a clean variance expression.
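Both formulas for the Poisson case can be checked numerically. The sketch below is my own illustration, assuming $X_j \sim \text{Exponential}(1)$ (so $E[X_1]=1$, $E[X_1^2]=2$) and $\lambda = 3$; the stdlib has no Poisson sampler, so Knuth's method is used:

```python
import math
import random

random.seed(1)

def sample_poisson(mean):
    # Knuth's method: count uniforms until their running product drops below e^{-mean}.
    limit = math.exp(-mean)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def simulate(lam=3.0, trials=100_000):
    # Y = X_1 + ... + X_N with N ~ Poisson(lam) and X_j ~ Exponential(1) iid,
    # independent of N. Theory: E[Y] = lam * E[X_1] = lam and
    # Var[Y] = lam * E[X_1^2] = 2 * lam.
    ys = [sum(random.expovariate(1.0) for _ in range(sample_poisson(lam)))
          for _ in range(trials)]
    mean = sum(ys) / trials
    var = sum((y - mean) ** 2 for y in ys) / trials
    return mean, var

print(simulate())  # expect roughly (3.0, 6.0)
```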
You just need to note that since $-\ln \phi_1(t)=-\ln\phi_2(t)$ you obtain
$$\lambda_N(1-\phi_Y(t))=\lambda_M(1-\phi_Z(t))$$
Inverse-Fourier transforming this equality gives a relation between the pdfs $f_Y(x)$ and $f_Z(x)$ of $Y$ and $Z$:
$$(\lambda_N-\lambda_M)\delta(x)=\lambda_Nf_Y(x)-\lambda_M f_Z(x)$$
Integrate this to obtain a relationship between the CDFs:
$$(\lambda_N-\lambda_M)\theta(x)=\lambda_NF_Y(x)-\lambda_MF_Z(x)$$
Since there is no point mass at the origin, $F_Y$ and $F_Z$ are continuous at $x=0$. Taking left and right limits at $x=0$, the right-hand side is continuous there while the left-hand side jumps by $\lambda_N-\lambda_M$ (the jump of $\theta$), which forces $\lambda_N=\lambda_M$. Substituting this back into the displayed equation then gives $F_Y(x)=F_Z(x)$ for all $x$, so the probability distributions are equal.
EDIT: In case the pdfs of $Y$ and $Z$ are not defined, it is still possible to prove uniqueness. First, note that every random variable has a CDF, by definition. From well-known theorems:
- Any single-variable function that obeys the CDF axioms is right-continuous everywhere on the real line (ref: Kun Il Park, Fundamentals of Probability and Stochastic Processes with Applications to Communications, p. 79).
- Any single-variable function that is right-continuous everywhere is discontinuous on at most a countable subset $B$ of the reals, see here.
In light of these results, the following inversion formula gives the unique CDF assigned to a characteristic function
$$2F_X(x)-1=\frac{1}{\pi}\int_0^\infty\frac{e^{ixt}\phi_X(-t)-e^{-ixt}\phi_X(t)}{it}dt$$
and the integral on the RHS converges on $x\in B^c$.
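The inversion formula can be evaluated numerically. The sketch below is my own illustration, not part of the original argument: it applies a simple midpoint rule to the integral, using the standard normal characteristic function $\phi_X(t)=e^{-t^2/2}$ as a test case, and recovers $\Phi(x)$:

```python
import cmath
import math

def cdf_from_cf(phi, x, t_max=40.0, steps=20_000):
    # Midpoint-rule evaluation of
    #   2 F(x) - 1 = (1/pi) * integral_0^inf (e^{ixt} phi(-t) - e^{-ixt} phi(t)) / (it) dt.
    # For a real random variable, phi(-t) is the conjugate of phi(t), so the
    # integrand is real, and it stays bounded near t = 0.
    h = t_max / steps
    total = 0.0
    for k in range(steps):
        t = (k + 0.5) * h
        term = (cmath.exp(1j * x * t) * phi(-t)
                - cmath.exp(-1j * x * t) * phi(t)) / (1j * t)
        total += term.real * h
    return 0.5 * (1.0 + total / math.pi)

def phi_normal(t):
    # Characteristic function of the standard normal distribution.
    return cmath.exp(-t * t / 2)

print(cdf_from_cf(phi_normal, 0.0))  # 0.5 by symmetry
print(cdf_from_cf(phi_normal, 1.0))  # approx 0.8413, i.e. Phi(1)
```

The truncation at `t_max` is harmless here because the normal characteristic function decays super-exponentially; heavier-tailed cases would need a larger cutoff.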
Assuming $\phi_1(t)=\phi_2(t)$ yields the family of solutions
\begin{equation}\lambda_N-\lambda_M+2n\pi i=\lambda_N \phi_Y(t)-\lambda_M\phi_Z(t),~~ n\in \mathbb{Z}\tag{1}\end{equation}
Forming the linear combination $(it)^{-1}\left(e^{itx}\,(1)\big|_{-t}-e^{-itx}\,(1)\big|_{t}\right)$, i.e. applying the inversion kernel to equation $(1)$ evaluated at $-t$ and at $t$, and integrating over $t\in(0,\infty)$ yields, after some rearrangement, the relation
$$\lambda_N F_Y(x)-\lambda_M F_Z(x)=(\lambda_N-\lambda_M)\theta(x)+n\pi i ~\text{sgn}(x)~~, x\in B^c$$
Since the LHS is purely real, this forces $n=0$. From this point on, the argument presented pre-edit gives $\lambda_N=\lambda_M$ and $F_Y(x)=F_Z(x)$ for $x\in B^c$, and right-continuity then fixes the jumps to coincide as well, so that $F_Y(x)=F_Z(x)$ for all $x\in \mathbb{R}$.
Best Answer
The proof is as follows. Let $T$ be the waiting time until the first event of a Poisson process with rate $\lambda$, so that $\{T>t\}$ is the event that no arrivals occur in $[0,t]$, i.e. $N(t)=0$ with $N(t)\sim \text{Pois}(\lambda t)$:
\begin{align*} P(T\le t) &= 1 - P(T>t)\\ &= 1 - P(N(t)=0)\\ &= 1 - \frac{(\lambda t)^0e^{-\lambda t}}{0!}\\ &= 1 - e^{-\lambda t}. \end{align*}
Since $P(T\le t)$ equals the CDF of the exponential distribution, $T \sim \text{Exp}(\lambda)$.
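This identity $P(T>t)=P(N(t)=0)=e^{-\lambda t}$ can also be checked by simulation. The sketch below is my own illustration (the values $\lambda=2$, $t=0.7$ are arbitrary), again using Knuth's method for the Poisson draws:

```python
import math
import random

random.seed(2)

def sample_poisson(mean):
    # Knuth's method: count uniforms until their running product drops below e^{-mean}.
    limit = math.exp(-mean)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def survival_estimate(lam=2.0, t=0.7, trials=200_000):
    # T is the first arrival time of a Poisson process with rate lam, so
    # {T > t} = {N(t) = 0} with N(t) ~ Poisson(lam * t).
    hits = sum(1 for _ in range(trials) if sample_poisson(lam * t) == 0)
    return hits / trials

print(survival_estimate())  # theory: e^{-lam * t} = e^{-1.4}, roughly 0.2466
```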