[Math] Probability Problem – Standard Deviation

Tags: probability, probability-distributions

Calls to a customer service center
last on average $2.7$ minutes with a
standard deviation of $1.4$ minutes. An
operator in the call center is
required to answer $74$ calls each day.
Assume the call times are independent.

What is the standard deviation of the
total amount of time in minutes the
operator will spend on the calls each
day? Give your answer to four decimal
places.

I initially attempted to do: $74\cdot 1.4=103.6$

However, this doesn't seem to work. I'm not exactly sure how to go about solving this problem.

Best Answer

Hint: it is the variance, not the standard deviation, that increases linearly with the number of calls (under the assumptions you are expected to work with, in particular that the call times vary independently from one call to another).
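Spelling the hint out (a sketch under the stated assumptions: $74$ independent call times, each with standard deviation $1.4$ minutes), the total time $T = X_1 + \cdots + X_{74}$ satisfies
$$\operatorname{Var}(T)=\sum_{i=1}^{74}\operatorname{Var}(X_i)=74\cdot(1.4)^2=145.04,\qquad \operatorname{SD}(T)=\sqrt{145.04}\approx 12.0433\ \text{minutes}.$$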
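As a quick sanity check, here is a minimal Monte Carlo sketch. The gamma model for the individual call times is only an assumption chosen to match the given mean $2.7$ and SD $1.4$; the SD of the daily total depends only on the per-call SD and independence, not on the shape of the distribution.

```python
import numpy as np

# Monte Carlo check of the hint: Var(total) = 74 * Var(single call),
# so SD(total) = 1.4 * sqrt(74) ~= 12.0433 minutes.
# The gamma distribution is only an assumed call-time model matching
# mean 2.7 and SD 1.4; the result does not depend on this choice.
rng = np.random.default_rng(0)

mean, sd, n_calls = 2.7, 1.4, 74
shape = (mean / sd) ** 2      # gamma shape k = (mean/SD)^2
scale = sd ** 2 / mean        # gamma scale theta = SD^2/mean

# Simulate 100,000 working days of 74 independent calls each.
daily_totals = rng.gamma(shape, scale, size=(100_000, n_calls)).sum(axis=1)

print("simulated SD of daily total:", round(daily_totals.std(), 4))
print("theoretical SD:", round(sd * np.sqrt(n_calls), 4))  # 12.0433
```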