Control Limits – Using Standard Deviation for Individual Control Chart

quality-control, standard-deviation

To compute the control limits using the 3-sigma rule, the standard deviation is usually approximated by the formula:

$\sigma=\frac{\overline{MR}}{1.128}$,

where

$\overline{MR}$ is the average of the moving ranges of two successive observations, given by the formula:

$\overline{MR}=\frac{\sum_{i=2}^{N}|x_{i}-x_{i-1}|}{N-1}$

$N$ is the number of observations (batches).
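As a concrete illustration, here is a minimal Python sketch (the measurement values are made up) that computes $\overline{MR}$, the resulting estimate of $\sigma$, and the 3-sigma control limits for an individuals chart:

```python
import numpy as np

# Hypothetical batch measurements (made-up data for illustration)
x = np.array([10.2, 9.8, 10.5, 10.1, 9.9, 10.4, 10.0, 10.3])
N = len(x)

# Average moving range of two successive observations:
# sum |x_i - x_{i-1}| / (N - 1)
mr_bar = np.mean(np.abs(np.diff(x)))

# Estimate sigma via the d2 constant for samples of size 2 (1.128)
sigma_hat = mr_bar / 1.128

# 3-sigma control limits around the overall mean
center = np.mean(x)
ucl = center + 3 * sigma_hat
lcl = center - 3 * sigma_hat

print(f"MR-bar = {mr_bar:.3f}, sigma estimate = {sigma_hat:.3f}")
print(f"LCL = {lcl:.3f}, CL = {center:.3f}, UCL = {ucl:.3f}")
```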

Why do we estimate $\sigma$ this way, rather than with the usual sample standard deviation formula:

$\sigma=\sqrt{\frac{1}{N-1}\sum_{i=1}^{N}(x_{i}-\bar{x})^{2}}$?

Best Answer

The NIST chapter on "Process or Product Monitoring and Control" is a useful reference on this topic. As explained there, it is often preferable to use estimates of $\sigma$ based on the standard formula that you present. The moving-range approach is perhaps better limited to the evaluation of individual measurements rather than batches; from Section 6.3.2.2:

Control charts for individual measurements, e.g., the sample size = 1, use the moving range of two successive observations to measure the process variability.

With only 2 measurements, an estimate of $\sigma$ based on the range is as efficient as the usual standard deviation formula.
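For the record, the 1.128 constant is the $d_2$ factor for samples of size 2, and it follows directly from the normal distribution: if $x_{i}$ and $x_{i-1}$ are independent $N(\mu,\sigma^{2})$, then $x_{i}-x_{i-1}$ is $N(0,2\sigma^{2})$, so

$E|x_{i}-x_{i-1}|=\sqrt{2}\,\sigma\,E|Z|=\sqrt{2}\,\sigma\sqrt{\frac{2}{\pi}}=\frac{2\sigma}{\sqrt{\pi}}\approx 1.128\,\sigma$

Dividing $\overline{MR}$ by 1.128 therefore yields an (approximately) unbiased estimate of $\sigma$ for an in-control process.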

Even then, NIST implies that this approach is best used in early "Phase 1" evaluation of quality control when "we use historical data to compute the initial control limits." From the end of Section 6.3.2.2:

Another way to construct the individuals chart is by using the standard deviation... It is preferable to have the limits computed this way for the start of Phase 2.

If moving ranges are used, you have to be particularly careful to monitor trends, as you can have a small moving range while a process is systematically moving out of control.
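A quick simulation (the drift and noise parameters here are made up) illustrates how the two estimates diverge when a process drifts:

```python
import numpy as np

rng = np.random.default_rng(0)

# In-control noise plus a slow linear drift in the process mean
noise = rng.normal(0, 1, 100)
drift = np.linspace(0, 5, 100)   # mean drifts by 5 units over 100 points
x = 10 + drift + noise

# Moving-range estimate: reflects only short-term, point-to-point variation
sigma_mr = np.mean(np.abs(np.diff(x))) / 1.128

# Overall sample standard deviation: inflated by the drift
sigma_sd = np.std(x, ddof=1)

print(f"moving-range estimate: {sigma_mr:.2f}")  # close to 1.0
print(f"sample std deviation:  {sigma_sd:.2f}")  # noticeably larger
```

The moving-range estimate stays near the short-term noise level, so the resulting control limits remain tight; the drift only shows up if you actively look for runs and trends on the chart.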

There is a bias in the estimate of $\sigma$ that depends on sample size, so a correction factor explained in Section 6.3.2 might be applied to the usual standard deviation formula that you present.
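That correction is the $c_4$ factor (the standard result for normal samples, tabulated in that NIST section): for $N$ independent normal observations, $E[s]=c_{4}(N)\,\sigma$ with

$c_{4}(N)=\sqrt{\frac{2}{N-1}}\,\frac{\Gamma(N/2)}{\Gamma((N-1)/2)}$

so $s/c_{4}(N)$ is unbiased. A small sketch of how quickly the bias vanishes:

```python
from math import gamma, sqrt

def c4(n: int) -> float:
    """Bias-correction factor for the sample standard deviation of
    n i.i.d. normal observations (E[s] = c4 * sigma)."""
    return sqrt(2.0 / (n - 1)) * gamma(n / 2.0) / gamma((n - 1) / 2.0)

# The bias is only noticeable for small samples
for n in (2, 5, 10, 25, 100):
    print(n, round(c4(n), 4))
```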

As someone who remembers the days of manual calculators, I suspect that use of moving ranges has something to do with that old technology. Updating mean values as new data came in wasn't so hard, but updating variances required extra steps.
