Example 18 in Ushakov's Selected Topics in Characteristic Functions gives two different characteristic functions $g_1, g_2:\mathbb R\to \mathbb R$ such that the corresponding distributions have moments of all orders and $|g_1|=|g_2|$ everywhere.
Since $g_1$ and $g_2$ are real-valued, continuous, and equal to $1$ at $0$, both are positive in a neighborhood of $0$; there the equality of absolute values forces $g_1=g_2$, hence the two distributions even have the same moments.
After inspection I believe there are several typos/mistakes in the example as printed, so I will deviate quite a bit from what is written in the book.
$\bullet$ Let $(a_n)_{n\geq 1}$ be a sequence of positive reals such that $\sum_{n=1}^\infty a_n <\infty$. Let $A=\sum_{n=1}^\infty a_n$.
Let $X_1,X_2$ be i.i.d. with distribution $\mathcal U([-1,1])$. Recall that the c.f. of this distribution is $t\mapsto \frac{\sin(t)}{t}$. Let $(Z_n)_{n\geq 1}$ be i.i.d. with the same distribution as $X_1+X_2$ and set $Z=\sum_{n=1}^\infty a_n Z_n$. Since $|Z_n|\leq 2$ and $\sum_{n\geq 1} a_n<\infty$, the series $\sum_{n\geq 1} a_n Z_n$ converges absolutely, hence $Z$ is well-defined (and $|Z|\leq 2A$).
By dominated convergence, $$\phi_Z(t) = E(\lim_n e^{it\sum_{k=1}^n a_k Z_k})=\lim_n E( e^{it\sum_{k=1}^n a_k Z_k})=\lim_n \prod_{k=1}^n \left(\frac{\sin(a_kt)}{a_kt}\right)^2 = \prod_{n=1}^\infty \left(\frac{\sin(a_nt)}{a_nt}\right)^2$$
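As a quick sanity check, the product formula for $\phi_Z$ can be compared against a Monte Carlo estimate of $E[e^{itZ}]$. The concrete choice $a_n = 2^{-n}$ (so $A=1$) and the truncation level are my own illustrative assumptions, not from the book:

```python
import numpy as np

# Illustrative (assumed) choice a_n = 2^{-n}, so A = sum a_n = 1;
# both the series for Z and the infinite product are truncated at N terms.
rng = np.random.default_rng(0)
N = 30
a = 0.5 ** np.arange(1, N + 1)

# Z_n = X_1 + X_2 with X_i ~ U([-1,1]); simulate Z = sum_n a_n Z_n.
M = 100_000
Z = (rng.uniform(-1, 1, (M, N)) + rng.uniform(-1, 1, (M, N))) @ a

t = 3.0
# Empirical characteristic function vs. the product formula;
# phi_Z is real because Z is symmetric, so E[cos(tZ)] suffices.
phi_mc = np.mean(np.cos(t * Z))
phi_prod = np.prod((np.sin(a * t) / (a * t)) ** 2)
print(phi_mc, phi_prod)   # should agree up to Monte Carlo error
```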
Note that $\phi_Z$ is nonnegative, even and integrable (since $\phi_Z(t)\leq \frac{1}{a_1^2t^2}$ for $t\neq 0$, and $\phi_Z\leq 1$). Integrability implies that the distribution of $Z$ is absolutely continuous; let $f_Z$ denote its density. Since $|Z|\leq 2A$, the support of $f_Z$ is a subset of $[-2A,2A]$. Since $$f_Z(x)=\frac{1}{2\pi} \int e^{-itx} \phi_Z(t) dt =
\frac{\int \phi_Z}{2\pi} \int e^{itx} \frac{\phi_Z(-t)}{\int \phi_Z} dt=\frac{\int \phi_Z}{2\pi} \int e^{itx} \frac{\phi_Z(t)}{\int \phi_Z} dt$$
(the last equality uses that $\phi_Z$ is even), we conclude that $\frac{2\pi}{\int \phi_Z} f_Z = \phi_Y$, where $Y$ is a random variable with density $\displaystyle t\mapsto \frac{\phi_Z(t)}{\int \phi_Z}$. This implies that the support of $\phi_Y$, like that of $f_Z$, is a subset of $[-2A,2A]$.
$\bullet$ Now comes the tricky part. Note that
$$\begin{aligned}
\phi_Y(t) + \frac 1{2i}(\phi_Y(t+4A)-\phi_Y(t-4A))
&= \int \frac{\phi_Z(x)}{\int \phi_Z}(\sin(4Ax)+1)e^{itx} dx \\
&= \int f_{T_1}(x) e^{itx} dx \\
&=\phi_{T_1}(t)
\end{aligned}$$
where $T_1$ is a r.v. with density $\frac{\phi_Z(x)}{\int \phi_Z}(\sin(4Ax)+1)$. This is indeed a density: it is nonnegative because $\sin(4Ax)+1\geq 0$, and it integrates to $1$ because $x\mapsto \phi_Z(x)\sin(4Ax)$ is odd ($\phi_Z$ being even), so that term integrates to $0$.
Similarly,
$$\begin{aligned}
\phi_Y(t) - \frac 1{2i}(\phi_Y(t+4A)-\phi_Y(t-4A))
&= \int \frac{\phi_Z(x)}{\int \phi_Z}(1-\sin(4Ax))e^{itx} dx \\
&= \int f_{T_2}(x) e^{itx} dx \\
&=\phi_{T_2}(t)
\end{aligned}$$
where $T_2$ is a r.v. with density $\frac{\phi_Z(x)}{\int \phi_Z}(1-\sin(4Ax))$.
By the remark on the support of $\phi_Y$, for $t\in(-2A,2A)$ both $t+4A$ and $t-4A$ lie outside $[-2A,2A]$, so $\phi_Y(t\pm 4A)=0$ and $$\phi_{T_1}(t) = \phi_{T_2}(t)=\phi_Y(t),$$ while for $|t|>2A$ we have $\phi_Y(t)=0$, hence $\phi_{T_1}(t) = -\phi_{T_2}(t)$ and in particular $$|\phi_{T_1}(t)| = |\phi_{T_2}(t)|,$$
so that $|\phi_{T_1}| = |\phi_{T_2}|$ everywhere, even though $\phi_{T_1}\neq\phi_{T_2}$ (the densities of $T_1$ and $T_2$ differ).
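These identities can be checked numerically. The sketch below uses my own illustrative choice $a_n = 2^{-n}$ (hence $A=1$), truncates the infinite product, and verifies that the two characteristic functions agree inside $[-2A,2A]$, are opposite outside, and that the densities nevertheless differ:

```python
import numpy as np

# Assumed concrete choice a_n = 2^{-n}, so A = 1; the infinite product
# defining phi_Z is truncated at N factors (a numerical sketch only).
N = 20
a = 0.5 ** np.arange(1, N + 1)
A = 1.0

x = np.linspace(-50, 50, 100_001)
dx = x[1] - x[0]
with np.errstate(invalid="ignore"):
    sinc = np.where(x == 0, 1.0, np.sin(a[:, None] * x) / (a[:, None] * x))
phi_Z = np.prod(sinc ** 2, axis=0)

w = phi_Z / (phi_Z.sum() * dx)        # density of Y
f1 = w * (1 + np.sin(4 * A * x))      # density of T_1
f2 = w * (1 - np.sin(4 * A * x))      # density of T_2

def cf(f, t):
    # characteristic function of the density f, by direct numerical integration
    return np.sum(f * np.exp(1j * t * x)) * dx

d_same = abs(cf(f1, 1.0) - cf(f2, 1.0))            # t inside [-2A, 2A]: equal
d_opp = abs(cf(f1, 3.0) + cf(f2, 3.0))             # t outside: opposite signs
d_abs = abs(abs(cf(f1, 3.0)) - abs(cf(f2, 3.0)))   # absolute values match
gap = np.max(np.abs(f1 - f2))                      # yet the laws differ
print(d_same, d_opp, d_abs, gap)
```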
$\bullet$ Besides, for any $n\geq 1$ we have $\phi_Z(x)\leq \prod_{k=1}^n \frac{1}{(a_kx)^2}$, so that $$f_{T_1}(x) = O\left(\frac{1}{x^{2n}} \right)\quad (|x|\to\infty),$$
hence $T_1$ has moments of every order, and similarly for $T_2$. Since $\phi_{T_1}=\phi_{T_2}$ on a neighborhood of $0$, all these moments coincide.
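The equality of moments can also be probed numerically: it amounts to $\int x^k \phi_Z(x)\sin(4Ax)\,dx = 0$ for every $k$, which for odd $k$ is not a symmetry cancellation (the integrand is then even). Again $a_n = 2^{-n}$ (so $A=1$) is my own illustrative choice, with the product truncated:

```python
import numpy as np

# Check that E[T_1^k] - E[T_2^k] = 2 * int x^k w(x) sin(4Ax) dx vanishes,
# under the assumed choice a_n = 2^{-n} (A = 1), product truncated at N.
N = 25
a = 0.5 ** np.arange(1, N + 1)
x = np.linspace(-100, 100, 200_001)
dx = x[1] - x[0]

with np.errstate(invalid="ignore"):
    sinc = np.where(x == 0, 1.0, np.sin(a[:, None] * x) / (a[:, None] * x))
phi_Z = np.prod(sinc ** 2, axis=0)
w = phi_Z / (phi_Z.sum() * dx)                       # density of Y

diff1 = 2 * np.sum(x * w * np.sin(4 * x)) * dx       # E[T_1]   - E[T_2]
diff3 = 2 * np.sum(x ** 3 * w * np.sin(4 * x)) * dx  # E[T_1^3] - E[T_2^3]
print(diff1, diff3)   # both ~ 0, although neither integrand is odd
```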
You can have two independent random variables with the same distribution. If they are independent, they obviously can't be identical (unless they are a.s. constant).
Convergence in distribution just means that the distributions converge. So if $X_n \xrightarrow d X$ and $Y$ has the same distribution as $X$, then $X_n \xrightarrow d Y$ as well: weak convergence depends only on the distributions, and you can change the random variables wildly as long as the distributions stay the same.
For another example, if $X \sim \mathcal{N}(0,1)$ then $-X \sim \mathcal{N}(0,1)$, so $X_n \xrightarrow d X$ implies $X_n \xrightarrow d -X$. This simply cannot happen for $L^1$ or almost-sure convergence unless $X = 0$ a.s.
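A minimal numerical illustration of this (my own sketch, with the hypothetical choice $X_n = X + 1/n$): the empirical CDF of a sample from $X_n$ is close to the standard normal CDF, which is the CDF of both $X$ and $-X$, so $X_n$ is just as close in distribution to $-X$ as to $X$:

```python
import numpy as np
from math import erf, sqrt

# X ~ N(0,1); -X has exactly the same cdf, so any sequence converging in
# distribution to X also converges in distribution to -X.
rng = np.random.default_rng(1)

def Phi(z):
    # N(0,1) cdf = cdf of X = cdf of -X
    return 0.5 * (1 + erf(z / sqrt(2)))

n = 1000
sample = rng.standard_normal(200_000) + 1.0 / n   # draws from X_n = X + 1/n
grid = np.linspace(-3, 3, 61)
emp = np.array([(sample <= z).mean() for z in grid])
ks = np.max(np.abs(emp - np.array([Phi(z) for z in grid])))
print(ks)   # small: X_n is close in law to X, hence equally close to -X
```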
We might say something like "the topology of convergence in distribution is non-Hausdorff", meaning sequences can have more than one limit. Strictly speaking that statement doesn't quite make sense, since the random variables need not share the same probability space, but that's what the quotation marks are for.
Best Answer
The characteristic function of $X$ is $\phi(t):=\exp (ix_0t-\gamma|t|)$ (the c.f. of a Cauchy distribution with location $x_0$ and scale $\gamma>0$). The desired result is $\phi^2(t)=\phi(2t)$, which is immediate: $\phi(t)^2=\exp(2ix_0t-2\gamma|t|)=\exp(ix_0(2t)-\gamma|2t|)=\phi(2t)$.
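The identity is also easy to confirm numerically; the parameter values below are illustrative choices of mine:

```python
import numpy as np

# phi(t) = exp(i*x0*t - gamma*|t|): squaring doubles both the location
# and the scale in the exponent, which is exactly evaluating phi at 2t.
x0, gamma = 1.3, 0.7            # illustrative parameter values

def phi(t):
    return np.exp(1j * x0 * t - gamma * np.abs(t))

t = np.linspace(-5.0, 5.0, 101)
err = np.max(np.abs(phi(t) ** 2 - phi(2 * t)))
print(err)   # zero up to floating-point rounding
```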