[Math] How to determine if Standard Deviation is high/low

standard deviation, statistics

I have derived the following response time data for a performance test I am running:

Min – 8 sec
Max – 284 sec
Average – 28 sec
Standard Deviation – 27 sec

What does the standard deviation say about the response time data distribution? When you say low/high standard deviation, what does this actually mean? Is this in comparison to the Average/Min/Max?

I know what standard deviation is and how it's computed. I'm just not sure how to tell if it is high or low.

Best Answer

If you take your cues from the financial industry, you can use the coefficient of variation (CV), defined as the standard deviation divided by the mean. Dividing by the mean normalizes the standard deviation, so the spread of data sets with very different means can be compared on a common scale.

As a rule of thumb, a CV >= 1 indicates a relatively high variation, while a CV < 1 can be considered low.
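As a quick illustration (not part of the original answer), here is a minimal Python sketch that applies this rule of thumb to the summary figures from the question; the function name is just for illustration, and the raw response times are not available, so only the reported mean and standard deviation are used:

```python
def coefficient_of_variation(std_dev: float, mean: float) -> float:
    """CV = standard deviation / mean (assumes a positive, nonzero mean)."""
    return std_dev / mean

# Summary statistics from the question, in seconds
mean = 28.0
std_dev = 27.0

cv = coefficient_of_variation(std_dev, mean)
print(f"CV = {cv:.2f}")  # prints CV = 0.96
```

With a mean of 28 sec and a standard deviation of 27 sec, the CV comes out to roughly 0.96, just under the threshold of 1, so by this rule of thumb the variation would be considered borderline but still on the low side.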

Some references for this rule of thumb:

http://www.readyratios.com/reference/analysis/coefficient_of_variation.html

http://www.mhnocc.org/forum/index.php?t=msg&goto=234&