[Math] Probability Between Two Normally Distributed Variables

normal-distribution, standard-deviation, statistics

The weight of medium-size tomatoes selected at random from a bin at the local supermarket is a normal random variable with mean $μ = 10$ ounces and standard deviation $σ = 1$ ounce. Suppose we pick two tomatoes at random from the bin, so the weights of the tomatoes are independent.

What is the probability that the difference in the weights of the two tomatoes exceeds 2 ounces?

I understand that $\mu_{X-Y} = \mu_X - \mu_Y$, and therefore the mean of the difference is $\mu = 0$ ounces.

I also understand that the standard deviation should be $\sqrt{\sigma_X^2 + \sigma_Y^2}$.

Where should I go from here?

(Note: I have a TI-84 Plus calculator and am attempting to use the normalcdf function to no avail.)
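For readers without the calculator, the TI-84's `normalcdf(lower, upper, μ, σ)` is just the area under a normal curve between two points, which can be sketched in a few lines of Python using the error function (the function name and signature here mirror the calculator's, but the implementation is an assumption, not TI's):

```python
import math

def normalcdf(lower, upper, mu=0.0, sigma=1.0):
    """Area under the Normal(mu, sigma) density between lower and upper,
    i.e. the same quantity the TI-84 normalcdf command returns."""
    def phi(x):
        # Standard normal CDF via the error function.
        return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))
    return phi(upper) - phi(lower)
```

For example, `normalcdf(-1, 1)` gives roughly 0.6827, the familiar one-standard-deviation probability.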

Best Answer

What you mean to say is that the mean of the distribution should be $\mu_1 - \mu_2$, and the standard deviation of the distribution should be $\sqrt{\sigma^2_1 + \sigma^2_2}$. I am not sure what your $x$ and $y$ denote.

Now, given that the mean is $0$, and the standard deviation is $\sqrt{2}$, how many standard deviations of this distribution is $2$ oz?
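Working the hint out numerically: $2/\sqrt{2} = \sqrt{2} \approx 1.414$ standard deviations, and "the difference exceeds 2 ounces" means $|X - Y| > 2$, so both tails count. A short Python sketch (using `math.erf` for the standard normal CDF, with a Monte Carlo simulation as an independent sanity check; the variable names are mine):

```python
import math
import random

# Difference D = X - Y of two independent N(10, 1) weights is N(0, sqrt(2)).
# We want P(|D| > 2).
sigma_d = math.sqrt(2.0)
z = 2.0 / sigma_d            # 2 oz is sqrt(2) ~ 1.414 sd's from the mean

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

p_exact = 2.0 * (1.0 - phi(z))   # two tails, by symmetry
print(p_exact)                   # ~ 0.1573

# Monte Carlo sanity check: draw pairs of tomato weights directly.
random.seed(0)
n = 200_000
hits = sum(abs(random.gauss(10, 1) - random.gauss(10, 1)) > 2
           for _ in range(n))
print(hits / n)                  # should land near 0.157
```

So the probability is about $0.157$, i.e. roughly a 16% chance the two tomatoes differ by more than 2 ounces. On the TI-84 the same number comes from `2 * normalcdf(2, 1E99, 0, √(2))`.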