[Math] Standard Deviation Around an Arbitrary Mean

statistics

I'm collecting data on the $x$- and $y$-axis offsets from the origin of the impact points of rounds I've shot at a target. From their variances I've calculated the standard deviations in the $x$ and $y$ directions, $\sigma_x$ and $\sigma_y$ respectively, and I've also calculated the standard deviation $\sigma_r$ of the root of the sum of their squares (the magnitude of the distance from the origin/bull's eye). These values are deviations around the mean of each variable, which characterizes my shot grouping. I'd also like the standard deviation away from the origin, i.e. a measure of how far I'm deviating from the center of the target, as opposed to how far I'm deviating from the calculated center of my grouping. Would this value just be $\sigma^2$ and $\sigma$ calculated around a mean of $0$ in all the variables, so that I have a measure of deviation from perfect, or is this the wrong way to go about the problem? All standard deviations used are population standard deviations, e.g. $$ \sigma^{2} = \frac{1}{n}\sum_{i=1}^{n} (x_{i} - \mu)^{2} \;\; ; \;\; \sigma = \sqrt{\sigma^{2}}, $$ where in the case in question $\mu$ would be taken to be $0$, leaving just the mean of the squares as the variance.
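
For concreteness, here is a minimal numerical sketch of what I mean, using NumPy and hypothetical impact coordinates (replace them with the measured offsets). It computes the population standard deviations about the grouping mean and the corresponding deviations about the origin ($\mu = 0$):

```python
import numpy as np

# Hypothetical impact coordinates (offsets from the bull's eye) -- replace with real data.
x = np.array([0.8, -0.3, 1.1, 0.4, -0.6, 0.9, 0.2, 1.3])
y = np.array([-0.5, 0.7, 0.1, -0.9, 0.3, -0.2, 0.6, -0.4])

r = np.sqrt(x**2 + y**2)  # distance of each impact from the origin

# Population standard deviations about the sample means (grouping spread)
sigma_x = np.sqrt(np.mean((x - x.mean())**2))
sigma_y = np.sqrt(np.mean((y - y.mean())**2))
sigma_r = np.sqrt(np.mean((r - r.mean())**2))

# Deviations about the origin: same formula with mu taken to be 0
sigma_x0 = np.sqrt(np.mean(x**2))
sigma_y0 = np.sqrt(np.mean(y**2))
sigma_r0 = np.sqrt(np.mean(r**2))

print(f"about mean  : sigma_x={sigma_x:.3f}, sigma_y={sigma_y:.3f}, sigma_r={sigma_r:.3f}")
print(f"about origin: sigma_x={sigma_x0:.3f}, sigma_y={sigma_y0:.3f}, sigma_r={sigma_r0:.3f}")
```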

Best Answer

The line of reasoning in the question is correct.

The calculation of moments about the origin differs from the calculation of moments about the mean only in setting $\mu = 0$.
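
In particular, expanding the square shows that the second moment about the origin decomposes into the central variance plus the squared mean offset, so the "deviation from perfect" combines the grouping spread with the bias of the group centre away from the bull's eye:

$$ \frac{1}{n}\sum_{i=1}^{n} x_{i}^{2} = \frac{1}{n}\sum_{i=1}^{n} (x_{i} - \mu_x)^{2} + \mu_x^{2} = \sigma_x^{2} + \mu_x^{2}, $$

and likewise for the $y$ and $r$ data.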