As @kimchi lover pointed out, your answer to (b) is not correct. $X(t) \sim N(0,2)$ would mean, in particular, that $X(t)$ has variance $2$ which contradicts your calculation in (a). Your answer to part (a) is correct.
Concerning part (c): we will use the following lemma.
A random vector $(Y_1,\ldots,Y_n)$ is Gaussian if, and only if, $$\sum_{j=1}^n \alpha_j Y_j$$ is Gaussian for any choice of $\alpha_1,\ldots,\alpha_n \in \mathbb{R}$.
Fix $n \in \mathbb{N}$ and $0 \leq t_1 < \ldots < t_n$. Since
$$\sum_{j=1}^n \alpha_j X(t_j) = A \sum_{j=1}^n \alpha_j \cos(t_j) + B \sum_{j=1}^n \alpha_j \sin(t_j)$$
and $A$ and $B$ are independent Gaussian random variables, it follows from the above lemma that the random vector $(X(t_1),\ldots,X(t_n))$ is Gaussian. The distribution of a Gaussian random vector is uniquely determined by its mean vector and covariance matrix, i.e.
$$m := \begin{pmatrix} \mathbb{E}(X(t_1)) \\ \vdots \\ \mathbb{E}(X(t_n)) \end{pmatrix} \quad \text{and} \quad C := (c_{ij})_{i,j=1,\ldots,n} := \big(\mathbb{E}(X(t_i)X(t_j))\big)_{i,j=1,\ldots,n}.$$
Your calculations for part (a) show that $m=0$ and that $c_{ii} = 1$ for all $i=1,\ldots,n$. Hence, the only missing ingredient is the off-diagonal entries $\mathbb{E}(X(t_i) X(t_j))$ for $i \neq j$. I leave it to you to calculate them; let me know if you run into trouble.
Correct result: $\mathbb{E}(X(t_i) X(t_j)) = \cos(t_i-t_j)$.
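If you want to convince yourself of this covariance before (or after) doing the calculation, here is a quick Monte Carlo sanity check, not part of the proof; the sample size and the time points $s, t$ are arbitrary choices:

```python
import numpy as np

# With A, B independent N(0,1), X(t) = A*cos(t) + B*sin(t)
# should satisfy E[X(s)X(t)] = cos(s - t).
rng = np.random.default_rng(0)
n = 1_000_000
A = rng.standard_normal(n)
B = rng.standard_normal(n)

def X(t):
    # one sample of the process at time t for each draw of (A, B)
    return A * np.cos(t) + B * np.sin(t)

s, t = 0.3, 1.1
empirical = np.mean(X(s) * X(t))
print(empirical, np.cos(s - t))  # the two values agree to ~3 decimals
```

With $10^6$ samples the empirical average lands within about $\pm 0.005$ of $\cos(s-t)$.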
As mentioned in the comments, your error is in the step that should read $$M_{X-\mu}(t/\sigma) = e^{-\mu t/\sigma}M_X(t/\sigma),$$ which follows from $M_{X-\mu}(s) = \mathbb{E}\big[e^{s(X-\mu)}\big] = e^{-s\mu}\,\mathbb{E}\big[e^{sX}\big] = e^{-s\mu}M_X(s)$ with $s = t/\sigma$.
Fixing that gives a final expression of $M_Z(t) = e^{t^2/2},$ as expected.
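As a quick numerical check (not part of the argument; $\mu$, $\sigma$, and the evaluation point $t$ are arbitrary choices), the MGF of the standardized variable $Z = (X-\mu)/\sigma$ can be estimated by simulation and compared with $e^{t^2/2}$:

```python
import numpy as np

# For X ~ N(mu, sigma^2) and Z = (X - mu)/sigma,
# M_Z(t) = E[exp(t*Z)] should equal exp(t^2/2).
rng = np.random.default_rng(1)
mu, sigma = 2.0, 3.0
X = rng.normal(mu, sigma, size=1_000_000)
Z = (X - mu) / sigma

t = 0.5
mgf_empirical = np.mean(np.exp(t * Z))
print(mgf_empirical, np.exp(t**2 / 2))  # both close to exp(0.125)
```

The empirical mean agrees with $e^{t^2/2}$ to roughly three decimal places at this sample size.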
I have little to add to Jethro's answer (which I do not see why it was downvoted), but I want to bring some order to the discussion in the comments.
From the hypothesis we can conclude the following.