The linearity of expectation is the simplest way to approach this problem. It is a very powerful technique that lets us find the expectation of many random variables $X$ even when it is extremely difficult to find the distribution of $X$.
But if you really want to avoid using the linearity of expectation, it can be done in this case. The calculation will take a while: its length can be considered a proof of the fact that we should use the linearity of expectation! By the general formula for expectation in the discrete case, when the distribution of $X$ is known, the expectation of the number of red balls $X$ is
$$\sum_{k=0}^r kP(X=k).$$
The probability that $X=k$ is, as you know,
$$\frac{\binom{r}{k}\binom{w}{n-k}}{\binom{r+w}{n}},$$
and therefore
$$E(X)=\sum_{k=1}^r k \frac{\binom{r}{k}\binom{w}{n-k}}{\binom{r+w}{n}}. \tag {$\ast$} $$
We are summing from $k=1$ on because the $k=0$ term makes no contribution to the expectation, and could cause some headaches later.
We use the following result:
Lemma: The binomial coefficient $\binom{q}{p}$ is equal to $\frac{q}{p}\binom{q-1}{p-1}$.
The lemma is easy to prove, either combinatorially or by direct manipulation of factorials. For an algebraic proof, note that
$\tfrac{q}{p}\tbinom{q-1}{p-1}=\tfrac{q}{p}\tfrac{(q-1)!}{(p-1)!(q-p)!}=\tfrac{q!}{p!(q-p)!}=\tbinom{q}{p}. \hspace{1cm}\Box$
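As a quick sanity check, the lemma can also be verified numerically; here is a minimal Python sketch using exact rational arithmetic so no floating-point error can creep in:

```python
from math import comb
from fractions import Fraction

# Verify the lemma: C(q, p) == (q/p) * C(q-1, p-1) for all 1 <= p <= q.
for q in range(1, 30):
    for p in range(1, q + 1):
        assert Fraction(comb(q, p)) == Fraction(q, p) * comb(q - 1, p - 1)
print("lemma holds for all 1 <= p <= q < 30")
```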
Using the lemma, we can see that
$$\binom{r+w}{n}=\frac{r+w}{n}\binom{r+w-1}{n-1}.\tag {$\ast\ast$}$$
We also need some information about $k\binom{r}{k}$. By the lemma, or otherwise,
$$k\binom{r}{k}=r\binom{r-1}{k-1}.\tag {$\ast\ast\ast$}$$
Substituting the values obtained in $(\ast\ast)$ and $(\ast\ast\ast)$ for the terms in the formula $(\ast)$ for the expectation of $X$, we obtain
$$E(X)=\frac{rn}{r+w}\sum_{k=1}^r \frac{\binom{r-1}{k-1}\binom{w}{n-k}}{\binom{r+w-1}{n-1}}.$$
Make the change of variable $j=k-1$. Then the above formula for $E(X)$ becomes$$E(X)=\frac{rn}{r+w}\sum_{j=0}^{r-1} \frac{\binom{r-1}{j}\binom{w}{n-j-1}}{\binom{r+w-1}{n-1}}.$$
Note that $\frac{\binom{r-1}{j}\binom{w}{n-j-1}}{\binom{r+w-1}{n-1}}$ is the probability that when you draw $n-1$ balls from an urn that contains $r-1$ red and $w$ white, you will get exactly $j$ red balls. When we sum this from $j=0$ to $r-1$, we are adding up all the probabilities, so the complicated-looking sum is equal to $1$. We conclude that
$$E(X)=\frac{rn}{r+w}\sum_{j=0}^{r-1} \frac{\binom{r-1}{j}\binom{w}{n-j-1}}{\binom{r+w-1}{n-1}}=\frac{rn}{r+w}. \hspace{1cm} \Box$$
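The closed form $E(X)=\frac{rn}{r+w}$ is easy to confirm computationally against the defining sum $(\ast)$. A minimal Python check (the function name `expected_red` is ours, not part of the original problem):

```python
from math import comb
from fractions import Fraction

def expected_red(n, r, w):
    """E(X) computed directly from the hypergeometric pmf, as in (*)."""
    total = comb(r + w, n)
    return sum(Fraction(k * comb(r, k) * comb(w, n - k), total)
               for k in range(1, min(r, n) + 1))

# Compare the sum against the closed form rn/(r+w) for a few cases.
for (n, r, w) in [(5, 3, 7), (4, 6, 2), (10, 10, 10)]:
    assert expected_red(n, r, w) == Fraction(r * n, r + w)
print("closed form confirmed")
```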
Remark: Although the linearity approach is the smoothest, there are other properties of expectation that one can use for a proof. For example, let $E(n,r,w)$ be the expected number of red balls when we draw $n$ balls from $r$ red and $w$ white. On the first pick, we get a red with probability $\frac{r}{r+w}$ and a white with probability $\frac{w}{r+w}$. If we get a red on the first pick, then our expected number of reds is $1$ plus the expected number of reds from the remaining $n-1$ picks. If we get a white, then our expected number of reds is simply the expected number of reds from the remaining picks. So we obtain
$$E(n,r,w)=\frac{r}{r+w}(1+E(n-1,r-1,w))+\frac{w}{r+w}E(n-1,r,w-1).$$
Using this formula, and a simple induction on $n$, we can prove that
$$E(n,r,w)=\frac{nr}{r+w}.$$
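The recursion, together with the base case $E(0,r,w)=0$, can be checked directly. A sketch in Python: `lru_cache` memoizes the recursion, and exact fractions keep the comparison with the closed form exact:

```python
from fractions import Fraction
from functools import lru_cache

@lru_cache(maxsize=None)
def E(n, r, w):
    """Expected number of reds when drawing n balls from r red and w white."""
    if n == 0:
        return Fraction(0)
    val = Fraction(0)
    if r > 0:  # red on the first pick: 1 + expected reds from the rest
        val += Fraction(r, r + w) * (1 + E(n - 1, r - 1, w))
    if w > 0:  # white on the first pick: expected reds from the rest
        val += Fraction(w, r + w) * E(n - 1, r, w - 1)
    return val

for (n, r, w) in [(3, 4, 5), (6, 2, 8), (5, 5, 5)]:
    assert E(n, r, w) == Fraction(n * r, r + w)
print("recursion matches nr/(r+w)")
```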
You've got the right answer. Here is the mathematical explanation. Let $S$ be the sum of the $k$ balls chosen without replacement from balls numbered $1$ through $n$, so
$$S = X_1 + X_2 + \cdots + X_k,$$
where $X_i$ is the value of the $i$th ball chosen, for $i=1,\dots,k$. Then
$$E[S]=E\left[\sum\limits_{i=1}^{k}X_i\right] \overset{(a)}{=} \sum\limits_{i=1}^{k}E[X_i] = kE[X_1] = \frac{k(n+1)}{2}.$$
Here $(a)$ follows from the linearity of expectation; the last two equalities hold because each $X_i$ is uniformly distributed on $\{1,\dots,n\}$, so $E[X_i]=\frac{n+1}{2}$ for every $i$.
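For small $n$ and $k$, the identity can be verified exactly by enumerating all $\binom{n}{k}$ equally likely subsets. A brute-force Python sketch (the helper name `expected_sum` is ours):

```python
from fractions import Fraction
from itertools import combinations

def expected_sum(n, k):
    """Exact E[S]: average sum over all size-k subsets of {1, ..., n}."""
    subsets = list(combinations(range(1, n + 1), k))
    return Fraction(sum(sum(s) for s in subsets), len(subsets))

# The average subset sum should equal k(n+1)/2 in every case.
for n, k in [(10, 3), (7, 5), (6, 6)]:
    assert expected_sum(n, k) == Fraction(k * (n + 1), 2)
print("E[S] = k(n+1)/2 confirmed by enumeration")
```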
Let $X_1$ be the number on the first ball, $X_2$ be the number on the second ball, and $X_3$ be the number on the third ball. In other words, $X_k$ is the number on the $k$th ball drawn from the box, for $k \in \{1,2,3\}.$
Consider $E[X_k].$ This is the expected value of $X_k$ at the start of the experiment; it is a prediction about $X_k$ that is made before any balls are drawn. At that point in time, all you can say about the $k$th ball to be drawn is that it is equally likely to be any of the $99$ balls in the box. Since the balls are numbered $1$ through $99$, the mean of that uniform distribution is $E[X_k] = \frac{1+99}{2} = 50.$
Of course, as soon as you draw the first ball, the probabilities for the second ball change: if $X_1=17,$ then $P[X_2 = 17] = 0,$ the other balls become slightly more likely to be the second one drawn, and $E[X_2\mid X_1=17] \neq E[X_2].$ This tells us that $X_1$ and $X_2$ are not independent. But everything in this paragraph so far is a red herring for the problem at hand: we never need to know anything about $E[X_2\mid X_1]$; when we consider what the second ball will be, all we need is $E[X_2].$
The theorem of the linearity of expectation tells us that $$ E[X_1 + X_2 + X_3] = E[X_1] + E[X_2] + E[X_3] $$ regardless of whether $X_1,$ $X_2,$ and $X_3$ are independent. Hence we can apply that theorem to this problem: $$ E[X_1 + X_2 + X_3] = 50 + 50 + 50 = 150. $$
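A quick Monte Carlo simulation supports this: drawing three balls without replacement from balls numbered $1$ to $99$ gives an average sum close to $150$. A sketch in Python (the seed and trial count are arbitrary choices):

```python
import random

random.seed(0)
trials = 200_000
# Draw 3 distinct balls from {1, ..., 99} and average the sums.
avg = sum(sum(random.sample(range(1, 100), 3)) for _ in range(trials)) / trials
print(round(avg, 2))  # should be close to 150
```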