Pick $Y$ uniformly from $[0,1]$. Then, given $Y=y$, pick $X$ uniformly from $[-y,y]$.
Then $E(X)=0$ and $E(X\mid Y=y)=0$ for every $y$, but $P\left(X<-\frac{1}{2}\mid Y=\frac{1}{4}\right)=0$ while $P\left(X<-\frac{1}{2}\right)=\int_{1/2}^{1}\frac{y-1/2}{2y}\,dy=\frac{1-\ln 2}{4}>0$, so $X$ and $Y$ are not independent.
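A quick Monte Carlo sanity check of this construction (a numpy sketch; the sample estimates are approximate, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
y = rng.uniform(0.0, 1.0, n)   # Y ~ Uniform[0, 1]
x = rng.uniform(-y, y)         # given Y = y, X ~ Uniform[-y, y]

print(x.mean())                # close to E[X] = 0
print((x < -0.5).mean())       # close to P(X < -1/2) = (1 - ln 2)/4 ≈ 0.0767
# Given Y = 1/4 we have |X| <= 1/4 < 1/2, so P(X < -1/2 | Y = 1/4) = 0 exactly.
```

The conditioning is handled by passing the array `y` as the bounds of `uniform`, which numpy broadcasts elementwise.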
In general, pick $Y$ first, and then let $Y$ determine a distribution for $X$ whose mean is the same for every value of $Y$.
For example, the case given in the comments above, of picking $X$ uniformly from $(-1,1)$ and setting $Y=X^2$, can be seen this way: first pick $Y$ from $(0,1)$ with $P(Y<y)=\sqrt{y}$, and then pick $X$ uniformly from the two-point set $\{-\sqrt{Y},\sqrt{Y}\}$.
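This two-step construction can be checked numerically: sampling $Y$ via the inverse CDF (if $U$ is uniform on $(0,1)$ then $Y=U^2$ satisfies $P(Y<y)=\sqrt{y}$) and then flipping a fair coin for the sign should reproduce $X$ uniform on $(-1,1)$. A sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
y = rng.uniform(0.0, 1.0, n) ** 2       # P(Y < y) = sqrt(y) via inverse CDF
sign = rng.choice([-1.0, 1.0], n)       # fair coin over the two-point set
x = sign * np.sqrt(y)                   # X uniform on {-sqrt(Y), +sqrt(Y)}

# If the construction is right, the empirical CDF of X matches Uniform(-1, 1):
for t in (-0.5, 0.0, 0.5):
    print(t, (x < t).mean())            # close to (t + 1)/2
```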
Warning: this answer is wrong; see the correction in the "Update" below.
In general: if ${\bf x}=(x_1,\cdots, x_n)'$ is uncorrelated (meaning that the elements $x_i$ are pairwise uncorrelated), and if $A$ is an $n \times n$ orthogonal matrix, then it's easy to prove that the variable
$${\bf y} = A {\bf x}$$
is also uncorrelated. That is, if $C_{\bf x}$ (covariance matrix of ${\bf x}$) is diagonal, so is $C_{\bf y}$.
Now, if additionally the $x_i$ are Gaussian variables, then the components of ${\bf x}$ are not only uncorrelated but independent. The result above implies that ${\bf y} = A {\bf x}$ is uncorrelated; and because ${\bf y}$ is also jointly Gaussian, its components are independent as well.
Hence your claim is correct.
Now if instead the $x_i$ are merely independent (hence uncorrelated) but not Gaussian, then all we can say about ${\bf y} = A {\bf x}$ is that the $y_i$ are uncorrelated. They are not, in general, independent, as your example illustrates.
Update: this answer, as is, is blatantly wrong. To make it right, one needs an essential additional assumption: that the variables have the same variance, that is, that $C_x$ is not only diagonal but also constant along the diagonal.
Without this assumption, it's false that multiplication by $A$ preserves uncorrelatedness. This can easily be seen in the case of two uncorrelated jointly Gaussian variables (hence independent) with different variances: the orthogonal transformation amounts to a rotation, which yields ellipses (level curves of the density function) that are no longer aligned with the axes.
Incidentally, this also applies to the example in the question: if $X$ and $Y$ are independent uniform variables with different ranges, then $X+Y$ and $X-Y$ are correlated, since $\operatorname{Cov}(X+Y,X-Y)=\operatorname{Var}(X)-\operatorname{Var}(Y)\neq 0$.
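The Update can be illustrated directly on covariance matrices, since $C_{\bf y}=A C_{\bf x} A'$: with equal variances ($C_{\bf x}=\sigma^2 I$) the conjugated matrix stays diagonal, while with unequal variances off-diagonal entries appear. A small numerical sketch:

```python
import numpy as np

# An orthogonal matrix: the (normalized) sum/difference map from the question
A = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)

C_equal = np.diag([1.0, 1.0])     # equal variances: C_x = I
C_unequal = np.diag([1.0, 4.0])   # different variances

print(A @ C_equal @ A.T)    # A (s^2 I) A' = s^2 I: still diagonal
print(A @ C_unequal @ A.T)  # nonzero off-diagonal entries: correlated
```

In the unequal case the off-diagonal entry equals $(\sigma_1^2-\sigma_2^2)/2$, matching the $X+Y$, $X-Y$ remark above.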
Best Answer
A counterexample is given by assigning probability $\frac14$ to each of the values $(0,0,0)$, $(0,1,1)$, $(1,1,0)$, $(1,0,1)$ for $(X,Y,Z)$. Both $(X,Z)$ and $(Y,Z)$ are uniformly distributed on $\{0,1\}^2$, so $X$ and $Z$ are independent and $Y$ and $Z$ are independent, but $(X,Y)$ is clearly not independent of $Z$: in fact $Z = X \oplus Y$, so the value of $(X,Y)$ determines the value of $Z$.
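The counterexample is small enough to verify exhaustively with exact arithmetic; a sketch that computes the marginals and checks both the pairwise factorizations and the XOR relation:

```python
from itertools import product
from fractions import Fraction

# The four equally likely values of (X, Y, Z)
pts = [(0, 0, 0), (0, 1, 1), (1, 1, 0), (1, 0, 1)]
p = {t: Fraction(1, 4) for t in pts}

def marginal(idxs):
    """Marginal distribution of the coordinates listed in idxs."""
    m = {}
    for t, pr in p.items():
        key = tuple(t[i] for i in idxs)
        m[key] = m.get(key, Fraction(0)) + pr
    return m

pXZ, pYZ = marginal([0, 2]), marginal([1, 2])
pX, pY, pZ = marginal([0]), marginal([1]), marginal([2])

# (X,Z) and (Y,Z) factor into products of marginals: pairwise independence
assert all(pXZ[(a, c)] == pX[(a,)] * pZ[(c,)] for a, c in product([0, 1], repeat=2))
assert all(pYZ[(b, c)] == pY[(b,)] * pZ[(c,)] for b, c in product([0, 1], repeat=2))

# But Z = X XOR Y on every atom, so (X, Y) determines Z
assert all(c == a ^ b for a, b, c in pts)
print("pairwise independent, but (X, Y) determines Z")
```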