Exactly as you say:
$$\begin{align}\mathsf E\left(\frac{\sum_{i=1}^k X_i}{\sum_{j=1}^n X_j}\right) & = \sum_{i=1}^k\mathsf E\left(\frac{X_i}{\sum_{j=1}^n X_j }\right) \\[1ex] & = k~\mathsf E\left(\frac{X_1}{\sum_{j=1}^n X_j }\right)\end{align}$$
Your professor would want you to add:
$$\begin{align}&=k~\mathsf E\left(\mathsf E\left(\frac{X_1}{\sum_{j=1}^n X_j }~\middle\vert~ {\sum_{j=1}^n X_j }\right)\right)\\[1ex] & = k~\mathsf E\left(\frac{\mathsf E(X_1\mid \sum_{j=1}^n X_j)}{\sum_{j=1}^n X_j}\right)\end{align}$$
so that you can argue: since clearly $\sum_{i=1}^{n}\mathsf E(X_i\mid \sum_{j=1}^n X_j)=\sum_{j=1}^n X_j$, by symmetry $\mathsf E(X_1\mid \sum_{j=1}^n X_j)=\tfrac 1n\sum_{j=1}^n X_j$, and hence
$$\begin{align} &=k~\mathsf E\left(\frac{\sum_{j=1}^n X_j}{n~\sum_{j=1}^n X_j}\right) \\[1ex] &=\frac kn \\ \blacksquare & \end{align}$$
What these steps add to the proof is to make it much more obvious that the symmetry argument can be applied.
Well, you can observe that $\mathsf E\left(\frac{\sum_{i=1}^n X_i}{\sum_{j=1}^n X_j }\right)=1$, so by symmetry it should be true that $\mathsf E\left(\frac{X_i}{\sum_{j=1}^n X_j }\right)=\frac 1n$ for each $i$. That just leaves a little worm of doubt, at least in your professor's eyes, as to whether you are justified in using symmetry at this point. (Though the extra steps do make it clear that you are.)
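The identity $\mathsf E\left(\sum_{i=1}^k X_i/\sum_{j=1}^n X_j\right)=\frac kn$ is easy to check numerically. A minimal Monte Carlo sketch, assuming a concrete choice of i.i.d. positive distribution (Exponential(1) here; the question's $X_i$ may be distributed differently, but the symmetry argument only needs them i.i.d. and positive):

```python
import random

def estimate_ratio_mean(k, n, trials=200_000, seed=0):
    """Monte Carlo estimate of E[(X_1+...+X_k)/(X_1+...+X_n)]
    for i.i.d. positive X_i (here Exponential(1) as an illustrative
    assumption; the result k/n does not depend on the distribution)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        xs = [rng.expovariate(1.0) for _ in range(n)]
        total += sum(xs[:k]) / sum(xs)
    return total / trials

# The estimate should land close to k/n, e.g. near 3/7 for k=3, n=7.
```

Because the ratio is bounded in $[0,1]$, even this modest number of trials pins the mean down to a few decimal places.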
One way to avoid explicit convolution (although a convolution is always involved in the end) is to define $Z = 3Y \sim \text{Uniform}(0, 3)$ and look at the distribution of $(X, Z)$ in the $x$-$z$ plane: a uniformly distributed rectangle.
Within this rectangle, the sets of equal values of $X+Z$ correspond to diagonal stripes. The length of the stripe where $X+Z = w$ is proportional to the value of the PDF $f_{X+Z}(w)$. All you need to do, then, is to find the proportionality constant that makes it a PDF; that is, it must integrate to $1$.
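The stripe-length picture predicts a trapezoidal density. A small sketch, assuming $X \sim \text{Uniform}(0,1)$ (so the rectangle is $[0,1]\times[0,3]$): the stripe length grows linearly for $w\in[0,1]$, is constant for $w\in[1,3]$, and shrinks linearly for $w\in[3,4]$, and normalization forces the flat part to height $\frac13$. A Monte Carlo histogram confirms this:

```python
import random

def pdf_x_plus_z(w):
    """Trapezoidal density of W = X + Z with X ~ Uniform(0,1) (assumed)
    and Z ~ Uniform(0,3), read off from the stripe-length picture and
    normalized so that the total area is 1."""
    if 0 <= w <= 1:
        return w / 3          # stripe grows linearly
    if 1 < w <= 3:
        return 1 / 3          # stripe spans the full width of the rectangle
    if 3 < w <= 4:
        return (4 - w) / 3    # stripe shrinks linearly
    return 0.0

def mc_density(w, h=0.05, trials=400_000, seed=1):
    """Estimate f_{X+Z}(w) as P(|X + Z - w| < h) / (2h)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials)
               if abs(rng.random() + 3 * rng.random() - w) < h)
    return hits / (trials * 2 * h)
```

Checking `mc_density(2.0)` against `pdf_x_plus_z(2)` (both near $\frac13$) verifies the proportionality constant without writing down the convolution integral.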
Best Answer
Let $U_i$ be iid Uniform($0,1$) and let $X_i=5+5U_i$.
Then $$T=\inf\left\{ t\mid U_{1}+\cdots+U_{t}\geq6-t\right\}$$ and, since $P\left(T\geq n\right)=P\left(U_{1}+\cdots+U_{n-1}<7-n\right)=1$ for $n\leq4$: $$\begin{align}\mathbb{E}T=\sum_{n=1}^{\infty}P\left(T\geq n\right)&=4+P\left(T\geq5\right)+P\left(T\geq6\right)\\&=4+P\left(U_{1}+U_{2}+U_{3}+U_{4}<2\right)+P\left(U_{1}+U_{2}+U_{3}+U_{4}+U_{5}<1\right)\tag1\end{align}$$
By induction we can prove that for $x\in\left[0,1\right]$ we have: $$P\left(\sum_{i=1}^{n}U_{i}\leq x\right)=\frac{x^{n}}{n!}$$ so the last term in $(1)$ equals $\frac1{120}$.
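The formula $P\left(\sum_{i=1}^{n}U_{i}\leq x\right)=\frac{x^{n}}{n!}$ for $x\in[0,1]$ is easy to sanity-check by simulation; a minimal sketch:

```python
import random

def cdf_sum_uniform_mc(n, x, trials=500_000, seed=2):
    """Monte Carlo estimate of P(U_1 + ... + U_n <= x) for x in [0, 1],
    where the U_i are i.i.d. Uniform(0,1)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials)
               if sum(rng.random() for _ in range(n)) <= x)
    return hits / trials

# cdf_sum_uniform_mc(5, 1.0) should be close to 1/120 ≈ 0.00833.
```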
Further, the distribution of $U_1+U_2+U_3+U_4$ is symmetric about $2$, so that $P\left(U_{1}+U_{2}+U_{3}+U_{4}<2\right)=\frac12$.
Together this proves that: $$\mathbb ET=4+\frac12+\frac1{120}=\frac{541}{120}$$
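The stopping time can also be simulated directly: substituting $X_i=5+5U_i$ into the defining inequality, $T$ is equivalently the first $t$ with $X_1+\cdots+X_t\geq30$. A quick check that the simulated mean matches $\frac{541}{120}\approx4.5083$:

```python
import random

def sample_T(rng):
    """Draw X_i = 5 + 5*U_i until the running sum reaches 30 and return
    the number of draws; this is the stopping time T, since
    X_1+...+X_t >= 30 is equivalent to U_1+...+U_t >= 6 - t."""
    s, t = 0.0, 0
    while s < 30:
        s += 5 + 5 * rng.random()
        t += 1
    return t

def mean_T(trials=300_000, seed=3):
    rng = random.Random(seed)
    return sum(sample_T(rng) for _ in range(trials)) / trials
```

Since $T$ only takes the values $4$, $5$, and $6$ (six draws of $X_i\geq5$ always reach $30$), the sample mean converges quickly.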