$\newcommand{\rchi}{\raise{0.5ex}\chi}$
You have been given (effectively) that: $X\sim \mathcal U\{1,2,3,4,5,6\}$ and $Y\mid X\sim \mathcal{Bin}(X, p)$.
That is, $X$ is discrete uniformly distributed (the roll of an unbiased die), and $Y$ conditioned on $X$ is binomially distributed (the number of successes in $X$ i.i.d. Bernoulli trials).
So you know the marginal pmf of $X$ is $\mathsf P_X(k) = \frac 1 6 \;\rchi_{k\in\{1;6\}}$ and the conditional pmf of $Y$ is $\mathsf P_{Y\mid X}(h\mid k) = \binom{k}{h} p^h(1-p)^{k-h} \;\rchi_{h\in\{0;k\}} $
From this you can determine the joint pmf of $X,Y$, and from that the marginal pmf of $Y$.
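Concretely, multiplying the marginal by the conditional gives, for $1\le k\le 6$ and $0\le h\le k$,
$$\mathsf P_{X,Y}(k,h) = \mathsf P_X(k)\,\mathsf P_{Y\mid X}(h\mid k) = \frac 1 6 \binom{k}{h} p^h (1-p)^{k-h},$$
and summing over $k$ yields the marginal $\mathsf P_Y(h) = \frac 1 6 \sum_{k=\max(h,1)}^{6} \binom{k}{h} p^h (1-p)^{k-h}$. These are the entries tabulated below.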
$$\begin{array}{|c|c:c:c:c:c:c|c|} \hline
y \backslash x & 1 & 2 & 3 & 4 & 5 & 6 & f_Y(y) \\ \hline
0 & (1-p)/6 & (1-p)^2/6 & (1-p)^3/6 & (1-p)^4/6 & (1-p)^5/6 & (1-p)^6/6 & \tfrac 1 6 \sum\limits_{k=1}^6 (1-p)^k \\ \hdashline
1 & p/6 & 2p(1-p)/6 & 3p(1-p)^2/6 & 4p(1-p)^3/6 & 5p(1-p)^4/6 & 6p(1-p)^5/6 & \tfrac 1 6 \sum\limits_{k=1}^6 k\,p(1-p)^{k-1} \\ \hdashline
2 & 0 & p^2/6 & 3p^2(1-p)/6 & 6p^2(1-p)^2/6 & 10p^2(1-p)^3/6 & 15p^2(1-p)^4/6 & \tfrac 1 6 \sum\limits_{k=2}^6 \binom{k}{2} p^2(1-p)^{k-2} \\ \hdashline
3 & 0 & 0 & p^3/6 & 4p^3(1-p)/6 & 10p^3(1-p)^2/6 & 20p^3(1-p)^3/6 & \tfrac 1 6 \sum\limits_{k=3}^6 \binom{k}{3} p^3(1-p)^{k-3} \\ \hdashline
4 & 0 & 0 & 0 & p^4/6 & 5p^4(1-p)/6 & 15p^4(1-p)^2/6 & \tfrac 1 6 \sum\limits_{k=4}^6 \binom{k}{4} p^4(1-p)^{k-4} \\ \hdashline
5 & 0 & 0 & 0 & 0 & p^5/6 & 6p^5(1-p)/6 & \tfrac 1 6 \sum\limits_{k=5}^6 \binom{k}{5} p^5(1-p)^{k-5} \\ \hdashline
6 & 0 & 0 & 0 & 0 & 0 & p^6/6 & p^6/6 \\ \hline
f_X(x) & 1/6 & 1/6 & 1/6 & 1/6 & 1/6 & 1/6 & 1 \\ \hline
\end{array}$$
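As a sanity check, the whole table can be generated numerically. A minimal sketch, using an arbitrary concrete value $p=0.3$ (any $p \in [0,1]$ works the same way):

```python
from math import comb

p = 0.3  # arbitrary concrete success probability, chosen just for the check

# Joint pmf: P(X=x, Y=y) = (1/6) * C(x,y) * p^y * (1-p)^(x-y) for 0 <= y <= x
joint = {(x, y): comb(x, y) * p**y * (1 - p)**(x - y) / 6
         for x in range(1, 7) for y in range(0, x + 1)}

# Column sums recover the uniform marginal of X ...
fX = {x: sum(pr for (xx, _), pr in joint.items() if xx == x) for x in range(1, 7)}
# ... and row sums give the marginal pmf of Y
fY = {y: sum(pr for (_, yy), pr in joint.items() if yy == y) for y in range(0, 7)}

assert all(abs(v - 1/6) < 1e-12 for v in fX.values())
assert abs(sum(fY.values()) - 1) < 1e-12
# f_Y(0) = (1/6) * sum_{k=1}^{6} (1-p)^k, as in the first row of the table
assert abs(fY[0] - sum((1 - p)**k for k in range(1, 7)) / 6) < 1e-12
```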
In this context, a decoder maps observed $Y$'s to guessed $X$'s, i.e. $d: \{0,1\} \rightarrow \{0, 1\}$. In other words, you are choosing $d(0)$ (the $X$ to guess when seeing $Y=0$) and $d(1)$ (the $X$ to guess when seeing $Y=1$), and you want to minimize the probability of guessing wrong, i.e. of the event that $d(Y)$ (the guess) differs from $X$ (the actual signal).
At this point, if you stare at the $2 \times 2$ probability table, it should be "really obvious" that you should guess $d(0) = 0$, because if you see $Y=0$ then it is certain that $X=0$. The question is what to guess when seeing $Y=1$, but it should also be "somewhat obvious" :) that in that case you should guess $d(1) = 1$. We will prove these rigorously now.
$$
\begin{align}
P(d(Y) \neq X) &= P(d(Y) \neq X | Y = 1) P(Y=1) + P(d(Y) \neq X | Y=0) P (Y=0)\\
&=P(d(1) \neq X | Y=1)P(Y=1) + P(d(0) \neq X | Y=0) P(Y=0)
\end{align}$$
Note that, as far as the minimization is concerned, $P(Y=1), P(Y=0)$ are constants. Since we can choose $d(0), d(1)$ independently, we can minimize each term separately. Formally we're doing this:
$$d(i) = \arg\min_{j \in \{0,1\}} P(j \neq X | Y = i) = \arg \max_{j \in \{0,1\}} P(j = X | Y = i)$$
I.e. for each case $Y=i$, simply find the most probable $X$ conditioned on $Y=i$. This rule is known as maximum a posteriori (MAP) decoding, and I would say it is a general feature of all kinds of decoding.
First, $P(d(0) \neq X | Y = 0)$: Conditioned on $Y=0$, we have $X=0$ with certainty. Nothing is more probable than certainty. So we choose $d(0) = 0$.
Next, $P(d(1) \neq X | Y = 1)$: You can easily calculate that $P(X=1 | Y = 1) = {2 \over 3}$, and $P(X=0 | Y = 1) = {1 \over 3}$. Since $X=1$ is more probable (conditioned on $Y=1$), we choose $d(1) = 1$.
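In detail, reading the joint probabilities off the table at the end of this answer and applying the definition of conditional probability:
$$P(X=1 \mid Y=1) = \frac{P(X=1, Y=1)}{P(Y=1)} = \frac{1/2}{3/4} = \frac 2 3, \qquad P(X=0 \mid Y=1) = \frac{1/4}{3/4} = \frac 1 3.$$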
With these optimal choices of $d(0) = 0, d(1) = 1$, the overall error probability is:
$$
\begin{align}
P(d(Y) \neq X) &= P(d(1) \neq X | Y=1)P(Y=1) + P(d(0) \neq X | Y=0) P(Y=0) \\
&= P(1 \neq X | Y=1)P(Y=1) + P(0 \neq X | Y=0) P(Y=0) \\
&= {1 \over 3} \cdot {3 \over 4} + 0 \cdot {1 \over 4} = {1 \over 4}
\end{align}$$
Another way to find the overall error probability is to augment your table with your choices (remember that $d(\cdot)$ is a deterministic function, depending on $Y$ only), and then notice that the red entries are the cases where you guess wrong:
$$\begin{array}{c|c|c|c}
& X = 0 & X = 1 & \style{font-family:inherit}{\text{Total}} \\\hline
Y = 1 \iff d(Y) = 1 & \color{red}{\frac{1}{4}} & \frac{1}{2} & \frac{3}{4} \\\hline
Y = 0 \iff d(Y) = 0 & \frac{1}{4} & \color{red}{0} & \frac{1}{4} \\\hline
\style{font-family:inherit}{\text{Total}} & \frac{1}{2} & \frac{1}{2} & 1
\end{array}$$
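The whole argument can also be checked mechanically: build the joint table, pick for each observed $y$ the $x$ with the largest joint mass (the MAP rule derived above), and sum the probability of the mismatched cells. A minimal sketch:

```python
# Joint pmf P(X=x, Y=y) from the 2x2 table above, keyed by (x, y)
joint = {(0, 0): 1/4, (1, 0): 0.0,
         (0, 1): 1/4, (1, 1): 1/2}

# MAP decoder: for each observed y, guess the x with the largest joint mass
# (maximizing the joint over x is the same as maximizing the posterior,
#  since P(Y=y) is a constant for fixed y)
d = {y: max((0, 1), key=lambda x: joint[(x, y)]) for y in (0, 1)}

# Error probability: total mass of the cells where the guess disagrees with x
p_err = sum(pr for (x, y), pr in joint.items() if d[y] != x)

assert d == {0: 0, 1: 1}          # the optimal decoder derived above
assert abs(p_err - 1/4) < 1e-12   # the overall error probability
```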
Best Answer
We have that the pmf of $X$ is $$p_X(x)=\begin{cases} \frac{3}{10} & x=-2 \\ \frac{3}{10} & x=-1 \\ \frac{1}{10} & x=0 \\ \frac{2}{10} & x=1 \\ \frac{1}{10} & x=2 \end{cases}$$
Transforming this via $y=x^2$ (each value of $X$ maps to its square, carrying its probability along) we get
$$p_Y(y)=\begin{cases} \frac{3}{10} & y=4 \\ \frac{3}{10} & y=1 \\ \frac{1}{10} & y=0 \\ \frac{2}{10} & y=1 \\ \frac{1}{10} & y=4 \end{cases}$$
Notice that the probabilities remain the same.
Finally, we have repeated $y$ values so we combine them to get
$$p_Y(y)=\begin{cases} \frac{4}{10} & y=4 \\ \frac{5}{10} & y=1 \\ \frac{1}{10} & y=0 \\ \end{cases}$$
We can now easily compute $$P(Y\geq3)=p_Y(4)=\frac{4}{10}$$ since $y=4$ is the only point of the support at or above $3$.
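The bookkeeping above (transform the support, carry the probabilities over, merge duplicate $y$ values) is easy to automate. A sketch, assuming as above that the transformation is $Y=X^2$ with the support of $X$ being $\{-2,-1,0,1,2\}$, using exact fractions to avoid rounding:

```python
from fractions import Fraction

# pmf of X, as exact fractions
pX = {-2: Fraction(3, 10), -1: Fraction(3, 10), 0: Fraction(1, 10),
       1: Fraction(2, 10),  2: Fraction(1, 10)}

# Push the pmf forward through y = x^2, merging duplicate y values as we go
pY = {}
for x, pr in pX.items():
    pY[x * x] = pY.get(x * x, Fraction(0)) + pr

assert pY == {4: Fraction(4, 10), 1: Fraction(5, 10), 0: Fraction(1, 10)}
assert sum(pr for y, pr in pY.items() if y >= 3) == Fraction(4, 10)
```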