Let $X,Y$ be independent and uniformly distributed on $[a,b]$. Then the pdf of $Z = X-Y$ is
$$p(z) = \int_\mathbb{R} dx\int_\mathbb{R} dy \ p(x) p(y) \ \delta(z-(x-y)) =$$
$$ = \frac{1}{(b-a)^2} \int_\mathbb{R} dx \int_\mathbb{R} dy \ \mathbb{1}^x_{[a,b]} \mathbb{1}^y_{[a,b]} \delta(z-(x-y))=$$
$$ = \frac{1}{(b-a)^2} \int_\mathbb{R} dx \int_\mathbb{R} dy \ \mathbb{1}^x_{[a,b]} \mathbb{1}^y_{[a,b]} \delta(y-(x-z))=$$
$$ = \frac{1}{(b-a)^2} \int_\mathbb{R} dx \ \mathbb{1}^x_{[a,b]} \mathbb{1}^{x-z}_{[a,b]}.$$
The product of the two indicators is equivalent to the conditions:
$$a \leq x \leq b \quad \wedge \quad z+a\leq x \leq z+b.$$
Depending on the value of $z$, these conditions reduce to a single pair of inequalities. For instance, if $0 \leq z \leq b-a$, then $a+z \leq x$ is stronger than $a \leq x$, while $x \leq b+z$ is weaker than $x \leq b$. For those values of $z$, the product of the indicators simplifies to:
$$\mathbb{1}^x_{[a,b]} \mathbb{1}^{x-z}_{[a,b]} = \mathbb{1}^x_{[a+z,b]}, \quad 0 \leq z \leq b-a.$$
Similarly, for $-(b-a) \leq z \leq 0$ the combined constraint becomes $a \leq x \leq b+z$, so the product simplifies to $\mathbb{1}^x_{[a,b+z]}$; and if $z+a \geq b$ or $z+b \leq a$ (that is, $|z| \geq b-a$), the two indicators are disjoint and the product is identically zero.
To help visualize this, you can draw a picture like this:
$$...---[--*---------]--*---...$$
where the square brackets denote the interval $[a,b]$, and the stars are $a+z$ and $b+z$ respectively.
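This overlap argument can be sanity-checked by simulation. Combining the cases above, the density is the triangular function $p(z) = \big((b-a)-|z|\big)/(b-a)^2$ for $|z| \leq b-a$ and zero elsewhere. A minimal sketch in Python, with assumed example endpoints $a=2$, $b=5$:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 2.0, 5.0              # assumed example endpoints
n = 200_000

x = rng.uniform(a, b, n)
y = rng.uniform(a, b, n)
z = x - y                    # samples of Z = X - Y

def pdf_z(z, a, b):
    """Triangular density implied by the indicator overlap:
    p(z) = ((b-a) - |z|) / (b-a)^2 for |z| <= b-a, else 0."""
    w = b - a
    return np.where(np.abs(z) <= w, (w - np.abs(z)) / w**2, 0.0)

# compare the empirical histogram with the closed form
hist, edges = np.histogram(z, bins=60, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
max_err = np.max(np.abs(hist - pdf_z(centers, a, b)))
```

Up to Monte Carlo noise, the histogram matches the triangular density.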
In your solution the numerical factor is wrong: you have a $1/4$ where there should be a $4$.
Best Answer
The pdf you give is correct for values of $z$ greater than $a$, provided $b$ is positive.
You can prove that as follows: $$ f_y(z) = \frac{d}{dz} F_y(z) = \frac{d}{dz} \Pr(y\le z) = \frac{d}{dz}\Pr(a+bx\le z) = \frac{d}{dz} \Pr\left( x \le \frac{z-a}{b} \right) $$ where the very last step works only if $b>0$. Then: $$ \frac{d}{dz} \Pr\left( x \le \frac{z-a}{b} \right) = \frac{d}{dz} F_x\left( \frac{z-a}{b} \right) = f_x\left( \frac{z-a}{b} \right) \cdot \frac 1b. $$ In the last step, the chain rule is used.
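As a numerical sanity check of $f_y(z) = f_x\!\left(\frac{z-a}{b}\right)\cdot\frac{1}{b}$, here is a short simulation sketch in Python, with the assumed choices $x \sim \mathrm{Exp}(1)$, $a = 2$, $b = 3$:

```python
import numpy as np

rng = np.random.default_rng(1)
a, b = 2.0, 3.0              # assumed example values, with b > 0
n = 300_000

x = rng.exponential(1.0, n)  # f_x(w) = exp(-w) for w >= 0
y = a + b * x

def f_y(z):
    """Change-of-variables density: f_x((z - a)/b) / b."""
    t = (z - a) / b
    return np.where(t >= 0, np.exp(-t) / b, 0.0)

# histogram of y against the transformed density
hist, edges = np.histogram(y, bins=80, range=(a, a + 8 * b), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
err = np.max(np.abs(hist - f_y(centers)))
```

The empirical histogram of $y$ agrees with $f_x((z-a)/b)/b$ up to sampling noise.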
For the expectation, you can use either $$ \int_a^\infty g(z) f_y(z)\;dz $$ or $$ \int_0^\infty g(a+bw) f_x(w)\;dw. $$ The second form given here is an instance of the "law of the unconscious statistician".
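The equality of the two expectation integrals can be verified by quadrature. A sketch in Python under the assumed choices $g(t)=t^2$, $x \sim \mathrm{Exp}(1)$, $a=2$, $b=3$, for which the exact value is $\mathbb{E}[g(y)] = a^2 + 2ab\,\mathbb{E}[x] + b^2\,\mathbb{E}[x^2] = 4 + 12 + 18 = 34$:

```python
import numpy as np

a, b = 2.0, 3.0                                      # assumed values, b > 0
g = lambda t: t**2                                   # example choice of g
f_x = lambda w: np.where(w >= 0, np.exp(-w), 0.0)    # Exp(1) density
f_y = lambda z: f_x((z - a) / b) / b                 # density of y = a + b*x

def trapezoid(f, lo, hi, n=400_000):
    """Plain trapezoid rule on a uniform grid."""
    t = np.linspace(lo, hi, n + 1)
    v = f(t)
    return float(np.sum((v[:-1] + v[1:]) / 2) * (hi - lo) / n)

# E[g(y)] both ways; infinite upper limits truncated at w = 40
lhs = trapezoid(lambda z: g(z) * f_y(z), a, a + 40 * b)
rhs = trapezoid(lambda w: g(a + b * w) * f_x(w), 0.0, 40.0)
```

Both integrals return the same value, matching the exact expectation $34$ to quadrature accuracy.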
I prefer to use capital letters for random variables and often corresponding lower-case letters for the arguments to the cdf or pdf, thus: $$ X \sim \chi^2_\nu $$ $$ \Pr(a < X < b) = \int_a^b f_X(x)\;dx. $$