No.
Consider three Boolean variables A, B, X, where X and A are i.i.d. Bernoulli with probability 0.5, and B = X $\oplus$ A (that is, B is the XOR of X and A).
It's easy to show that B is also Bernoulli distributed with probability 0.5 and that A and B are mutually independent, though obviously they are not conditionally independent given X.
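This is easy to verify by exhaustive enumeration. A minimal sketch (the variable and function names are my own):

```python
from itertools import product
from fractions import Fraction

# Enumerate the joint distribution of (A, X); each of the 4 outcomes has probability 1/4.
joint = {}
for a, x in product([0, 1], repeat=2):
    b = a ^ x  # B = X XOR A
    joint[(a, b, x)] = joint.get((a, b, x), Fraction(0)) + Fraction(1, 4)

def marginal(idx, value):
    """Marginal probability that coordinate idx of (A, B, X) equals value."""
    return sum(p for k, p in joint.items() if k[idx] == value)

# A and B are pairwise independent: P(A=a, B=b) = P(A=a) * P(B=b) for all a, b.
for a, b in product([0, 1], repeat=2):
    p_ab = sum(p for k, p in joint.items() if k[0] == a and k[1] == b)
    assert p_ab == marginal(0, a) * marginal(1, b) == Fraction(1, 4)

# But given X, B is a deterministic function of A, so A and B are not
# conditionally independent given X.
```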
"Nevertheless if the two variables are normally distributed, then uncorrelatedness does imply independence" is a very common fallacy.
That only applies if they are jointly normally distributed.
The counterexample I have seen most often: let $X \sim N(0,1)$ and let $Y$ be an independent Rademacher variable (equal to 1 or -1 with probability 0.5 each). Then $Z=XY$ is also normal (clear from considering its distribution function), and $\operatorname{Cov}(X,Z)=0$; the work here is to show $\mathbb{E}(XZ)=0$, e.g. by iterating expectation over $Y$ and noting that $XZ$ equals $X^2$ or $-X^2$ with probability 0.5 each. Yet the variables are clearly dependent: if I know $X>2$, then either $Z>2$ or $Z<-2$, so information about $X$ gives me information about $Z$.
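A quick simulation illustrates both halves of the argument (a sketch, with my own variable names and a fixed seed):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)
y = rng.choice([-1.0, 1.0], size=n)  # independent Rademacher variable
z = x * y

# Z is uncorrelated with X: the sample correlation is essentially zero.
assert abs(np.corrcoef(x, z)[0, 1]) < 0.01

# Yet Z is fully dependent on X: |Z| = |X| exactly (multiplying by +/-1.0
# is exact in floating point), so X > 2 forces Z > 2 or Z < -2.
assert np.all(np.abs(z) == np.abs(x))
```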
It's also worth bearing in mind that marginal distributions do not uniquely determine a joint distribution. Take any two real RVs $X$ and $Y$ with marginal CDFs $F_X(x)$ and $F_Y(y)$. Then for any $|\alpha|\le 1$ the function:
$$H_{X,Y}(x,y)=F_X(x)F_Y(y)\left(1+\alpha\big(1-F_X(x)\big)\big(1-F_Y(y)\big)\right)$$
will be a bivariate CDF. (To obtain the marginal $F_X(x)$ from $H_{X,Y}(x,y)$, take the limit as $y$ goes to infinity, where $F_Y(y)=1$; vice versa for $Y$.) Clearly, by selecting different values of $\alpha$ you can obtain different joint distributions with the same marginals!
Best Answer
Independence would mean that knowing the value of $Y$ gives no information on the value of $X$.
So here $X$ will be independent of $Y$ only if $X$ has a uniform marginal distribution on $[0,1]$, and the conditional distribution $X|Y$ is uniform on $[0,1]$ independent of the value of $Y$.
An example would be a uniform (joint) distribution over the unit square.
Here are some examples using Tetris blocks:
For the "S" block
we have $p[X|Y=\text{middle}]=p[X]=\text{uniform}$, but $X$ is certainly not independent of $Y$.
While for the "O" block
we have $p[X|Y]=p[X]=\text{uniform}$, so $X$ is independent of $Y$.
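Both block examples can be checked directly by putting a uniform distribution over each block's cells. Below is my own encoding of the shapes as (x, y) cell lists, with $Y$ as the row and $X$ as the column:

```python
from fractions import Fraction

def marginal_X(cells):
    """Marginal distribution of X under a uniform distribution on the cells."""
    xs = [x for (x, _) in cells]
    return {x: Fraction(xs.count(x), len(xs)) for x in set(xs)}

def conditional_X(cells, y):
    """Conditional distribution of X given Y = y."""
    row = [x for (x, yy) in cells if yy == y]
    return {x: Fraction(row.count(x), len(row)) for x in set(row)}

# A vertical "S" block: one cell in the top and bottom rows, two in the middle row.
s_block = [(0, 0), (0, 1), (1, 1), (1, 2)]
# In the middle row the conditional matches the uniform marginal ...
assert conditional_X(s_block, 1) == marginal_X(s_block)
# ... but in the bottom row it does not, so X and Y are dependent.
assert conditional_X(s_block, 0) == {0: Fraction(1, 1)}

# The square "O" block: every row gives the same conditional, so X and Y
# are independent.
o_block = [(0, 0), (1, 0), (0, 1), (1, 1)]
assert all(conditional_X(o_block, y) == marginal_X(o_block) for y in (0, 1))
```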