Are there two dependent but uncorrelated random variables $X,Y\sim \mathcal {N}(0,1)$ such that their sum $X+Y$ is normal, i.e. $\mathcal {N}(0,2)$?

Tags: measure-theory, probability, probability-distributions, probability-theory

Context: The question comes from another question I saw regarding the characterization of Brownian motion: does it hold for all $t>s\ge 0$ that
$$\operatorname{Cov}(B_s, B_t-B_s) = 0\ \ \ \ \Longrightarrow\ \ B_s \perp \!\!\! \perp B_t-B_s \text{ (independent)},$$ provided that $B_0 = 0$ and $B_t-B_s \sim \mathcal N(0,t-s)$? The asker did not assume that $(B_t)_{t\ge 0}$ is a Gaussian process. So I thought the answer is "no", because in general $\operatorname{Cov}(B_s,B_t-B_s) = 0$ alone does not imply $B_s \perp \!\!\!\perp B_t-B_s$ unless $(B_s,B_t-B_s)$ is jointly normal. (Some counterexamples can be found here in Wikipedia.)

In the linked page, we can find various pairs of dependent normal random variables $X,Y \sim \mathcal{N}(0,1)$ with $\operatorname{Cov}(X,Y) = 0$, and we can generate many other examples using similar ideas and techniques. However, I have failed to find an example where $X+Y$ is also normally distributed, which must hold in the setting of the original question, i.e.
$$
\underbrace{B_1}_{ =X \sim \mathcal N(0,1)} + \underbrace{B_2-B_1}_{ =Y\sim \mathcal N(0,1)} = \underbrace{B_2}_{=X+Y\sim \mathcal N(0,2)}.
$$
So my question is: are there uncorrelated, dependent r.v.'s $X,Y \sim \mathcal {N}(0,1)$ such that their sum $X+Y$ is also normally distributed, i.e. $\mathcal N(0,2)$? If there are, can we construct an example explicitly?

My thought: Essentially, the given conditions provide information about the moments of all orders,
$$
E[X^n],\quad E[Y^n],\quad E[(X+Y)^n]\qquad \forall n\ge 1.
$$
However, we cannot pin down, for instance, the values of $E[X^2Y]$ or $E[XY^2]$ from this information alone. This suggests that the given conditions do not determine the distribution of $(X,Y)$ uniquely, so my guess is that the answer is affirmative. I have also tried an abstract approach: find a characteristic function (equivalently, a positive-definite function, i.e. a Fourier transform $\hat \mu(s,t) = \int_{\mathbb R^2} e^{i(sx+ty)}\,d\mu(x,y)$) with $d\mu(x,y) \neq (2\pi)^{-1}e^{-(x^2+y^2)/2}\,dx\,dy$ satisfying
$$
\hat\mu (t,0) = \hat\mu(0,t) = e^{-t^2/2},\quad \hat\mu(t,t) = e^{-t^2}\qquad\forall t\in\mathbb R,
$$
but to no avail.

Best Answer

Yes, such pairs exist. Consider the discrete signed measure $\sigma$ that assigns mass $+1$ to each of the points $\{(2,1),(-1,2), (-2,-1), (1,-2)\}$ and mass $-1$ to each of the points $\{(1,2), (-2,1), (-1,-2), (2,-1)\}$. (Draw a picture: it is as if the corners of a STOP sign are alternately labeled $+1$ and $-1$.) Convolve $\sigma$ with the uniform distribution $U_D$ on a small disk, so that the resulting measure $\sigma* U_D$ is absolutely continuous with a bounded density $g$. Since $g$ is bounded with compact support while $\varphi$, the joint density of two iid $N(0,1)$ random variables, is continuous and strictly positive, there is some $\epsilon\ne0$ such that $|\epsilon g(x,y)|<\varphi(x,y)$ for every $(x,y)\in\mathbb R^2$. Finally, let $f=\varphi+\epsilon g$ be the sought-after joint density of $(X,Y)$: it is nonnegative by the choice of $\epsilon$, and it integrates to $1$ because $\sigma$ has total mass zero. The marginal distributions of $X$, $Y$, and $X+Y$ are as if the $g$ term were not there: the $\pm1$ masses were chosen so that the projections of $\sigma$ under the maps $(x,y)\mapsto x$, $(x,y)\mapsto y$, and $(x,y)\mapsto x+y$ all vanish, as Fubini's theorem shows when working out the margins.
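The cancellation claim is easy to sanity-check numerically. The following sketch (point set and weights taken from the construction above; the helper names are my own) verifies that the signed masses cancel under the three projections, and, equivalently, that the Fourier transform of $\sigma$ vanishes on the lines $t=0$, $s=0$, and $s=t$ — exactly the constraint set from the question — while not vanishing identically:

```python
from collections import defaultdict
import cmath

# Signed point masses from the answer: +1 on one set of "STOP sign"
# corners, -1 on the other.
plus  = [(2, 1), (-1, 2), (-2, -1), (1, -2)]
minus = [(1, 2), (-2, 1), (-1, -2), (2, -1)]
masses = [(x, y, +1) for (x, y) in plus] + [(x, y, -1) for (x, y) in minus]

def projection(key):
    """Total signed mass of sigma, grouped by a 1-D projection of the points."""
    totals = defaultdict(int)
    for x, y, w in masses:
        totals[key(x, y)] += w
    return dict(totals)

# The marginals of X, of Y, and of X + Y each receive zero net mass from
# sigma, so adding eps*g to the Gaussian density leaves them unchanged.
assert all(v == 0 for v in projection(lambda x, y: x).values())
assert all(v == 0 for v in projection(lambda x, y: y).values())
assert all(v == 0 for v in projection(lambda x, y: x + y).values())

def sigma_hat(s, t):
    """Fourier transform of the discrete signed measure sigma."""
    return sum(w * cmath.exp(1j * (s * x + t * y)) for x, y, w in masses)

for u in (0.3, 1.0, 2.7):
    assert abs(sigma_hat(u, 0)) < 1e-12   # X stays N(0,1)
    assert abs(sigma_hat(0, u)) < 1e-12   # Y stays N(0,1)
    assert abs(sigma_hat(u, u)) < 1e-12   # X + Y stays N(0,2)

# sigma is not the zero measure: its transform is nonzero off those lines,
# so the perturbed density really differs from the bivariate Gaussian.
assert abs(sigma_hat(1.0, 2.0)) > 0.1
```

Convolving with $U_D$ multiplies $\hat\sigma$ by the (bounded) transform of $U_D$, so the smoothed measure inherits these vanishing lines.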

See this old post for a slightly more conceptual description.