$$H(X|Y)=-\sum_{x,y}p(x,y)\log p(x|y)$$
So to make this zero we want $p(x|y)=1$ whenever $p(x,y)\neq 0$, i.e. $X$ must be a deterministic function of $Y$. Since $I(X,Y)=H(X)-H(X|Y)$, to maximise $I(X,Y)$ we want $H(X|Y)=0$ and $H(X)$ as large as possible: take the two variables to be equal, with uniform marginals.
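As a numerical sanity check (a sketch; `mutual_information` is a small helper written for this example, not a library routine), taking $X=Y$ uniform on four values gives $I(X,Y)=\log_2 4=2$ bits, the maximum achievable with four outcomes:

```python
import numpy as np

def mutual_information(p):
    """I(X,Y) in bits for a joint pmf given as a 2D numpy array."""
    px, py = p.sum(axis=1), p.sum(axis=0)
    mask = p > 0  # skip zero-probability cells
    return float((p[mask] * np.log2(p[mask] / np.outer(px, py)[mask])).sum())

# X = Y, uniform on {0,1,2,3}: all mass sits on the diagonal
n = 4
p_equal = np.eye(n) / n
print(mutual_information(p_equal))  # 2.0 bits = log2(4)
```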
It is neither convex nor concave. You can work it out using Bernoulli random variables.
Not convex:
1) Let $X$ and $Y$ be i.i.d. Bernoulli with $Pr[X=1]=1/2$. Then $I(X,Y)=0$.
2) Let $A=B=1$ (constants). Then $I(A,B)=0$.
3) Let $(W,Z) = \left\{\begin{array}{ll}
(X,Y) & \mbox{with prob 1/2} \\
(A,B) & \mbox{with prob 1/2}
\end{array}\right.
$
The joint probability distribution for $(W,Z)$ is a convex combination of those for $(X,Y)$ and $(A,B)$. Yet $I(W,Z)>0$, since $W$ and $Z$ are not independent: $Pr[(W,Z)=(1,1)]=5/8$, while $Pr[W=1]\,Pr[Z=1]=(3/4)^2=9/16$.
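This can be checked numerically (a sketch; `mutual_information` is a helper written for this example): both component distributions have zero mutual information, yet their equal mixture does not.

```python
import numpy as np

def mutual_information(p):
    """I in bits for a joint pmf given as a 2D numpy array."""
    px, py = p.sum(axis=1), p.sum(axis=0)
    mask = p > 0
    return float((p[mask] * np.log2(p[mask] / np.outer(px, py)[mask])).sum())

p_xy = np.full((2, 2), 0.25)               # X, Y i.i.d. Bernoulli(1/2): independent
p_ab = np.array([[0.0, 0.0], [0.0, 1.0]])  # A = B = 1: constants
p_mix = 0.5 * p_xy + 0.5 * p_ab            # convex combination of the two joints

print(mutual_information(p_xy))   # 0.0
print(mutual_information(p_ab))   # 0.0
print(mutual_information(p_mix))  # strictly positive, so I is not convex in the joint pmf
```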
Not concave:
1) Let $X$ and $Y$ be i.i.d. Bernoulli with $Pr[X=1]=1/2$. Then $I(X,Y)=0$.
2) Let $A=B$ where $A$ is Bernoulli with $Pr[A=1]=1/2$. Then $I(A,B)=1$.
3) Let $(W,Z) = \left\{\begin{array}{ll}
(X,Y) & \mbox{with prob 1/2} \\
(A,B) & \mbox{with prob 1/2}
\end{array}\right.
$
Then $Pr[W=1]=Pr[Z=1]=1/2$ and:
\begin{align}
Pr[(W,Z)=(0,0)] &= 3/8\\
Pr[(W,Z)=(0,1)] &= 1/8\\
Pr[(W,Z)=(1,0)] &= 1/8\\
Pr[(W,Z)=(1,1)] &= 3/8
\end{align}
and one can compute $I(W,Z) = H(W)+H(Z)-H(W,Z) = 2 - H(W,Z) \approx 0.189 < 1/2 = (1/2)I(X,Y)+(1/2)I(A,B)$.
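This too is easy to verify numerically (a sketch; `mutual_information` is a small helper written for this check, not a library function):

```python
import numpy as np

def mutual_information(p):
    """I in bits for a joint pmf given as a 2D numpy array."""
    px, py = p.sum(axis=1), p.sum(axis=0)
    mask = p > 0
    return float((p[mask] * np.log2(p[mask] / np.outer(px, py)[mask])).sum())

# The mixture's joint pmf from the table above
p_wz = np.array([[3/8, 1/8],
                 [1/8, 3/8]])
print(mutual_information(p_wz))  # about 0.189, strictly below 1/2
```

Since the two components have mutual informations $0$ and $1$, the mixture falls below their average, so $I$ is not concave in the joint pmf either.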
Best Answer
We know that $H(Y|X) \le H(Y)$, with equality iff $X$ and $Y$ are independent. (This is a consequence of $I(X;Y) \ge 0$, which is a consequence of $D(p(X,Y) \,||\, p(X) p(Y)) \ge 0$, which in turn follows from Jensen's inequality; see, e.g., Cover and Thomas, Theorem 2.6.5.)
This implies $H(X,Y) \le H(X) + H(Y)$ with equality iff $X$ and $Y$ are independent (another basic result).
In our problem, $H(X)$ and $H(Y)$ are fixed, so the joint entropy is bounded by the above, and that bound is attained (only) if $X$ and $Y$ are independent, i.e. $P(X,Y) = P(X) P(Y)$. Hence, this is the joint distribution that maximizes the joint entropy.
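A small numerical illustration of this bound (a sketch; `entropy` is a helper written for this example): among joints with the same fixed marginals, the product distribution attains $H(X)+H(Y)$, while any dependent joint falls strictly short.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a pmf (any shape)."""
    q = np.asarray(p, dtype=float).ravel()
    q = q[q > 0]  # 0 * log 0 = 0 by convention
    return float(-(q * np.log2(q)).sum())

px = np.array([0.5, 0.5])
py = np.array([0.25, 0.75])

p_indep = np.outer(px, py)       # P(X,Y) = P(X) P(Y)
p_dep = np.array([[0.25, 0.25],  # same marginals, but X and Y dependent
                  [0.00, 0.50]])

print(entropy(p_indep))  # equals H(X) + H(Y)
print(entropy(p_dep))    # strictly smaller
```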