Careful! Your $1-x^2$ denominator vanishes at $x=\pm 1$, which, as you've noted, are precisely the points where $P(X=x)\ne 0$. Here's another approach, using the fact that $\lim_{T\to\infty}\frac{1}{2T}\int_{-T}^T e^{-ity}\,dt=0$ for all $y\ne 0$ (proof is an exercise; you should find the average is $\frac{\sin Ty}{Ty}$). Since$$\frac{1}{2T}\int_{-T}^T e^{-itx}\,\frac{e^{it}+e^{-it}}{2}\,dt=\frac{1}{4T}\int_{-T}^T\left(e^{-it(x-1)}+e^{-it(x+1)}\right)dt,$$if $x\ne \pm 1$ both averages vanish as $T\to\infty$, so the limit is $0$. But when $x=1$, what we get instead is$$\lim_{T\to\infty}\frac{1}{4T}\int_{-T}^T \left(1+e^{-2it}\right)dt=\frac{1}{2}.$$The result $P(X=-1)=\frac{1}{2}$ follows similarly.
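As a quick numerical illustration (my own sketch, not part of the answer above), the snippet below approximates the time average $\frac{1}{2T}\int_{-T}^T e^{-itx}\cos t\,dt$ at a non-atom $x=0.5$ and at the atom $x=1$; the test points and grid resolution are arbitrary choices.

```python
# Numerical check of the time-averaging argument (illustrative sketch;
# the test points x = 0.5, 1.0 and grid resolution are arbitrary).
import numpy as np

def time_average(x, T, dt=0.01):
    # (1/2T) * integral_{-T}^{T} cos(t*x) * cos(t) dt via a Riemann sum.
    # The imaginary part of e^{-itx} cos(t) is odd in t, so it integrates to 0.
    t = np.arange(-T, T, dt)
    return np.sum(np.cos(t * x) * np.cos(t)) * dt / (2 * T)

for T in (10, 100, 1000):
    print(T, time_average(0.5, T), time_average(1.0, T))
# As T grows, the x = 0.5 column decays to 0 and the x = 1.0 column
# approaches the atom's mass P(X = 1) = 1/2.
```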
Use the definition of the characteristic function and separate the expectations over $n$ and over the $x_i$ to obtain:
$$M_y(v)=\mathbb{E}[e^{ivy}]=\mathbb{E}\big[e^{iv\sum_{i=1}^{n}x_i}\big]=\mathbb{E}_n\Big[\mathbb{E}_{x_i}\Big[\prod_{i=1}^n e^{ivx_i}\Big]\Big]=\mathbb{E}_n\big[(\mathbb{E}_x[e^{ivx}])^n\big]$$
Now,
$$M_x(v)=\mathbb{E}[e^{ivx}]=\int_{0}^{\infty}e^{ivx}ae^{-ax}dx=\frac{a}{a-iv}$$
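As a sanity check (mine, not part of the original derivation), sympy reproduces this integral symbolically; restricting $v>0$ just keeps the convergence bookkeeping simple.

```python
# Symbolic check of the exponential characteristic function (a sketch;
# taking v > 0 sidesteps sympy's convergence conditions).
import sympy as sp

a, v = sp.symbols('a v', positive=True)
x = sp.symbols('x', nonnegative=True)
M_x = sp.integrate(sp.exp(sp.I * v * x) * a * sp.exp(-a * x), (x, 0, sp.oo))
print(sp.simplify(M_x - a / (a - sp.I * v)))  # expect 0
```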
and now all that's left is to compute the expectation over the state space of the $n$ variable:
$$\begin{align}\mathbb{E}_n[(M_x(v))^n]&=\sum_{N=0}^{\infty}(M_x(v))^N\sum_{k=0}^{\infty}\frac{1}{ek!}\delta(N-k)\\&=\sum_{k=0}^{\infty}\frac{1}{ek!}\sum_{N=0}^{\infty}(M_x(v))^N\delta(N-k)\\
&=\sum_{k=0}^{\infty}\frac{1}{e}\frac{(M_x(v))^k}{k!}\\&=e^{M_x(v)-1}\end{align}$$
and hence we find that
$$M_y(v)=e^{M_x(v)-1}=\exp\Big(\frac{a}{a-iv}-1\Big)=\exp\Big(\frac{iv}{a-iv}\Big)$$
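A Monte Carlo check of this result (my own sketch; the parameters $a=2$, $v=1.5$ and the sample size are arbitrary choices): draw $n\sim\text{Poisson}(1)$, sum $n$ exponential variates, and average $e^{ivy}$.

```python
# Monte Carlo check of M_y(v) = exp(iv/(a - iv)) (illustrative sketch;
# a = 2.0, v = 1.5 and the number of trials are arbitrary choices).
import numpy as np

rng = np.random.default_rng(0)
a, v, trials = 2.0, 1.5, 200_000

n = rng.poisson(1.0, size=trials)  # p_n(k) = 1/(e k!), i.e. Poisson with mean 1
# y = x_1 + ... + x_n with x_i ~ Exp(a); an empty sum is 0.
y = np.fromiter((rng.exponential(1.0 / a, size=k).sum() for k in n), dtype=float)

empirical = np.mean(np.exp(1j * v * y))
exact = np.exp(1j * v / (a - 1j * v))
print(empirical, exact)  # should agree to roughly 1/sqrt(trials)
```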
EDIT: Explanation of the first line
Note that the variables $x_1,\dots,x_n$ and $n$ are independent. This allows us to construct the joint probability distribution
$$f(X_1,..., X_N,N)=p_n(N)\prod_{i=1}^N p_{x_i}(X_i)$$
We can easily see that this is a distribution: each $X_i$ integrates to $1$ and $\sum_{N=0}^{\infty}p_n(N)=1$, so
$$\sum_{N=0}^{\infty}\int~\prod_{i=1}^N ~dX_i~f(X_1,\dots,X_N,N)=1$$
It is indeed a funny-looking distribution, because its state space varies with $N$, and hence that variable has to be chosen first! Nevertheless, with this expression we can compute expectation values such as the characteristic function:
$$M_y(v)=\sum_{N=0}^{\infty}\int\prod_{i=1}^N dX_i~ e^{iv\sum_{k=1}^N X_k}\,f(\{X_i\},N)=\mathbb{E}_N\big[\mathbb{E}_{\{x_i\}}\big[e^{iv\sum_i x_i}\big]\big]\\=\sum_{N=0}^{\infty}p_n(N)\Big(\int_{0}^{\infty} dX\,e^{ivX}p_x(X)\Big)^N=\mathbb{E}_N[(M_x(v))^N]$$
The first equality above holds essentially because the variables are independent: we can pull $p_n(N)$ out of the integral and view what remains as an expectation over the remaining variables.
Best Answer
This solution uses properties of the characteristic function. Suppose $\{X_1,\dots,X_n\}$ are independent random variables with respective characteristic functions $\phi_k$. Let $a_1,\dots,a_n$ be constants. Then,
$$\phi_{\sum_{k=1}^na_kX_k}(t) = E\left[e^{it\sum_{k=1}^na_kX_k}\right] = E\left[\prod_{k=1}^ne^{ita_kX_k}\right] = \prod_{k=1}^nE\left[e^{ita_kX_k}\right] = \prod_{k=1}^n\phi_k(a_kt).$$
The uniform distribution on $\{-1,1\}$ has characteristic function $\phi_1(t) = \frac{e^{-it}+e^{it}}{2} = \cos(t)$. The unit-rate exponential distribution has characteristic function $\frac{1}{1 - it}$. Thus,
$$\frac{\cos(t)}{1 + t^2} = \cos(t)\left(\frac{1}{1 - it}\right)\left(\frac{1}{1-i(-t)}\right),$$
which is the characteristic function of $X_1 + X_2 - X_3$, where $X_1,X_2,X_3$ are independent, $X_1$ is uniform on $\{-1,1\}$, and $X_2$ and $X_3$ are unit-rate exponential random variables.
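To make this concrete (my own verification sketch; the sample size and test points are arbitrary), a quick simulation of $X_1+X_2-X_3$ matches $\cos(t)/(1+t^2)$:

```python
# Monte Carlo check that X1 + X2 - X3 has characteristic function
# cos(t)/(1 + t^2) (a sketch; sample size and t values are arbitrary).
import numpy as np

rng = np.random.default_rng(1)
trials = 500_000
x1 = rng.choice([-1.0, 1.0], size=trials)   # uniform on {-1, 1}
x2 = rng.exponential(1.0, size=trials)      # unit-rate exponential
x3 = rng.exponential(1.0, size=trials)
s = x1 + x2 - x3

for t in (0.5, 1.0, 2.0):
    print(t, np.mean(np.exp(1j * t * s)), np.cos(t) / (1 + t**2))
```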
In general, when you are working with Fourier transforms, characteristic functions, Laplace transforms and other things in that vein, the first thing you want to do is try to use the fundamental properties of the transform to turn the function into something you recognize.