For the first one, $a = 3$, $b = 2$, $f(n) = n$. Since $f(n) = n = O(n^{\log_2 3 - \epsilon})$ for, e.g., $\epsilon = 0.1$, the first case of the Master theorem applies. The complexity is therefore $O(n^{\log_2 3}) \approx O(n^{1.585})$, so you got the right answer.
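As a sanity check (the base case isn't given above, so I'll assume $T(1) = 1$), the ratio $T(2^k)/3^k$ should settle toward a constant if $T(n) = \Theta(n^{\log_2 3})$, since $(2^k)^{\log_2 3} = 3^k$:

```python
from functools import lru_cache

# Assumed recurrence for the first problem: T(n) = 3*T(n // 2) + n,
# with a hypothetical base case T(1) = 1.
@lru_cache(maxsize=None)
def T(n):
    return 1 if n <= 1 else 3 * T(n // 2) + n

# If T(n) = Theta(n^(log2 3)), then T(2^k) / 3^k should approach a constant,
# because (2^k)^(log2 3) = 3^k.
ratios = [T(2 ** k) / 3 ** k for k in range(5, 21)]
print(ratios[-1])
```

The ratios level off quickly, which is exactly the polynomial-growth behaviour Case 1 predicts.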
Your work for the second is wrong at the point where you apply the Master theorem to $T(\lg m)$ while its expression still contains the $n$ term. As I answered on SO before your question was closed there, we can see from the first few terms of this recurrence, $T(1) = 1$, $T(2) = 1 + 1/2$, $T(3) = 1 + 1/2 + 1/3$, $\ldots$, that this is just the harmonic series, plus some arbitrary constant. Since the $k$th partial sum of the harmonic series grows on the order $O(\log k)$, this is what your answer must turn out to be. If it were me, I'd simply find a proof of this fact and use it to prove the order of your recurrence.
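If you want to convince yourself numerically, the partial sums $H_m = 1 + 1/2 + \cdots + 1/m$ stay within a bounded distance of $\ln m$ (the gap converges to the Euler-Mascheroni constant, about 0.5772):

```python
from math import log

# H(m) is the m-th harmonic number: the m-th term of the recurrence above.
def H(m):
    return sum(1.0 / i for i in range(1, m + 1))

# H_m - ln(m) converges to a constant (~0.5772), which is precisely the
# O(log m) growth claimed for the partial sums.
for m in (10, 100, 1000, 10000):
    print(m, H(m) - log(m))
```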
For the third one, it's a little tricky. Right away we know that $I(n) < J(n)$ where $J(n) = J(n/2) + n$. Since $J(n) = \Theta(n)$ by the Master theorem, we can say that $I(n) = O(n)$. We may also gain some insight by observing that $I(n) > K(n)$ where $K(n) = K(n/2) + n^{0.999\ldots9}$; since $K(n) = \Theta(n^{0.999\ldots9})$ by the Master theorem, we know $I(n) = \Omega(n^{0.999\ldots9})$ (remember, any polynomial, even $n^{0.000\ldots1}$, grows faster than a logarithm).
Given this, the answer is probably a function of the form $n/\log^k(n)$. We can guess $k = 1$ and see whether we have enough to solve it:
Assume $I(n) \le cn/\log(n)$. Then is it the case that $I(2n) = I(n) + 2n/\log(2n) \le cn/\log(n) + 2n/\log(2n) \le 2cn/\log(2n)$? The last inequality holds iff $2n/\log(2n) \le cn(2/\log(2n) - 1/\log(n))$, i.e., $2 \le c(2 - \log(2n)/\log(n))$, i.e., $2 \le c(2 - 1/\log(n) - 1)$ (taking logs base 2, so that $\log(2n) = \log(n) + 1$).
As $n$ increases, the RHS increases asymptotically to $c$; in particular, for $n = 4$, we have $2 \le c(2 - 1/2 - 1) = c/2$, and the choice $c = 4$ works (and, since the RHS only grows from there, it works for all $n \ge 4$).
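As a quick sanity check (logs base 2, matching the derivation), one can confirm numerically that $c = 4$ satisfies $2 \le c(2 - 1/\log(n) - 1)$ for all $n = 2^k \ge 4$:

```python
from math import log2

# Check that c = 4 satisfies 2 <= c*(2 - 1/log(n) - 1) for n = 2^k >= 4
# (logs base 2). The RHS is increasing in n, so n = 4 is the tight case.
c = 4
for k in range(2, 60):
    n = 2 ** k
    assert 2 <= c * (2 - 1 / log2(n) - 1)
print("c = 4 works for all n = 2^k >= 4")
```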
So we have an even better bound than before, namely $I(n) = O(n/\log(n))$. We can check whether, possibly, $I(n) = \Omega(n/\log(n))$ as well. If so, we're done. Otherwise, we might be able to keep going with larger $k$.
Assume $I(n) \ge cn/\log(n)$. Is $I(2n) = I(n) + 2n/\log(2n) \ge cn/\log(n) + 2n/\log(2n) \ge 2cn/\log(2n)$? The last inequality holds iff $2n/\log(2n) \ge cn(2/\log(2n) - 1/\log(n))$, i.e., $2 \ge c(2 - \log(2n)/\log(n))$, i.e., $2 \ge c(2 - 1/\log(n) - 1)$.
Again, as $n$ goes to infinity, the RHS increases to $c$, so this time the limit is the binding constraint: we need $2 \ge c$. The choice $c = 2$ works (for $n = 4$ the inequality reads $2 \ge c/2$, which also holds), and we get that $I(n) = \Omega(n/\log(n))$.
Since we have $I(n) = O(n/\log(n))$ and $I(n) = \Omega(n/\log(n))$, we know $I(n) = \Theta(n/\log(n))$, and we're done. Just to be clear, we have found that $2n/\log(n) \le I(n) \le 4n/\log(n)$ for $n \ge 4$.
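If it helps to see the sandwich numerically (the base case isn't specified above, so I'll assume $I(2) = 2$; logs are base 2 as in the derivation), it can be checked directly along powers of two:

```python
# Assumed form of the recurrence: I(n) = I(n/2) + n/log2(n), with a
# hypothetical base case I(2) = 2; logs base 2, as in the derivation above.
def I(k):
    # I evaluated at n = 2^k, for k >= 1
    v = 2.0
    for i in range(2, k + 1):
        v += 2 ** i / i
    return v

# Check the claimed sandwich 2n/log(n) <= I(n) <= 4n/log(n) for n = 2^k >= 4.
for k in range(2, 41):
    n = 2 ** k
    assert 2 * n / k <= I(k) <= 4 * n / k
print("bounds hold for n = 4 .. 2^40")
```

The exact constants depend on the base case, of course, but the $\Theta(n/\log n)$ growth does not.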
Your recurrence relation falls into Case 1: $f(n) = n \log n$ is $O(n^{\log_b a - \epsilon}) = O(n^{2-\epsilon})$.
To show why this is Case 1, as Louis says, logarithmic functions ($\log n$) are asymptotically bounded by polynomial functions ($n^a$, where $a > 0$). This can be shown by taking the limit:
$$
\lim_{n \to \infty} \frac{\log n}{n^a} = 0
$$
through L'Hôpital's rule. In particular, $\log n \in O(n^{1-\epsilon})$ for small $\epsilon$. (We can go even further and say that $\log n \in o(n^{1-\epsilon})$.)
Then by multiplying both sides by $n$ (an allowed operation in big-O notation), $n \log n \in O(n^{2-\epsilon})$.
Therefore by the Master Theorem, $T(n)$ is $\Theta(n^2)$.
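Since the answer leaves $a$ and $b$ unstated, here's a numeric sanity check on one hypothetical instance with $\log_b a = 2$, namely $T(n) = 4T(n/2) + n\log_2 n$ with an assumed base case $T(1) = 1$:

```python
from functools import lru_cache

# Hypothetical Case-1 instance with log_b(a) = 2:
# T(n) = 4*T(n/2) + n*log2(n), T(1) = 1, evaluated along n = 2^k.
@lru_cache(maxsize=None)
def T(k):
    # returns T(2^k); the driving term at n = 2^k is 2^k * k
    return 1 if k == 0 else 4 * T(k - 1) + 2 ** k * k

# If T(n) = Theta(n^2), then T(2^k) / (2^k)^2 = T(2^k) / 4^k should
# approach a constant as k grows.
for k in (5, 10, 20, 30):
    print(k, T(k) / 4 ** k)
```

The ratio flattens out to a constant, consistent with the $\Theta(n^2)$ conclusion.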
This was already answered multiple times on the site but here we go. Let $S(k)=2^{-k}T(2^k)$; then $S(k)=S(k-1)+\Theta(k)$, hence $S(k-1)+Ak\leqslant S(k)\leqslant S(k-1)+Bk$ for some positive constants $A$ and $B$. Iterating this yields $S(0)+A\sum\limits_{i=1}^ki\leqslant S(k)\leqslant S(0)+B\sum\limits_{i=1}^ki$, hence $S(k)=\Theta(k^2)$.
The final step, which cannot be made rigorous without some further hypothesis (such as that the sequence $(T(k))_k$ is nondecreasing), is to assert that $T(2^k)=\Theta(2^kk^2)$ implies that $T(n)=\Theta(n(\log n)^2)$.
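To see the substitution in action numerically: take the $\Theta(k)$ increment to be exactly $k$ and assume the base case $T(1) = 1$; then iterating $S(k) = S(k-1) + k$ gives $S(k) = S(0) + k(k+1)/2$ exactly, a $\Theta(k^2)$ function:

```python
# The substitution above applied to T(2^k) = 2*T(2^(k-1)) + k*2^k,
# i.e. the Theta(k) increment taken to be exactly k, with T(1) = 1
# as an assumed base case.
def S(k):
    t = 1  # T(2^0)
    for i in range(1, k + 1):
        t = 2 * t + i * 2 ** i
    # S(k) = 2^(-k) * T(2^k); exact integer division, since
    # T(2^k) = 2^k * (1 + sum of i) is divisible by 2^k.
    return t // 2 ** k

# Iterating S(k) = S(k-1) + k gives S(k) = S(0) + k(k+1)/2 exactly.
for k in (1, 5, 10, 20):
    assert S(k) == 1 + k * (k + 1) // 2
print("S(k) = 1 + k(k+1)/2 verified")
```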