I have to start by saying I don't know anything about the derivative method shown in this excerpt. I tried some calculations but it doesn't even seem to give the same result as the standard definition, so I'm guessing he is calculating something different from what we call "moments" in modern physics. Anyway, by way of explanation:
The word "moment" is used for several different purposes in physics, so it can be a confusing term: you have to infer the intended meaning from context. But all the various meanings of moment stem from its definition in math.
In math, a moment is a way of characterizing some distribution. It could be a probability distribution, a mass distribution, a charge distribution, or anything similar; all you need is some function $f(x)$ which defines the density of the quantity (mass/charge/probability) in question. In other words, $\int_a^b f(x)\;\mathrm{d}x$ is the amount of "stuff" between $a$ and $b$.
The $n$th mathematical moment of a distribution with density function $f(x)$ around a point $c$ is computed by a very simple formula:
$$I^{(n)}(c) = \int (x - c)^n f(x)\ \mathrm{d}x$$
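As a quick numerical sketch of that integral (the helper name `moment`, the integration grid, and the cutoffs are my own choices, not from the answer):

```python
import numpy as np

# Approximate the n-th moment of a density f around the point c
# by a Riemann sum on a uniform grid over [a, b].
def moment(f, n, c, a=-10.0, b=10.0, num=200001):
    x = np.linspace(a, b, num)
    dx = x[1] - x[0]
    return float(np.sum((x - c) ** n * f(x)) * dx)

# Example: the standard normal density, whose moments are known exactly.
gauss = lambda x: np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)
print(moment(gauss, 0, 0.0))  # normalization, ~1.0
print(moment(gauss, 1, 0.0))  # mean, ~0.0
print(moment(gauss, 2, 0.0))  # variance, ~1.0
```

The cutoffs at $\pm 10$ work here because the Gaussian tails are negligible there; for a heavier-tailed density you would need a wider interval.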
This generalizes to higher-dimensional spaces, but then the moment becomes an $n$-index tensor:
$$I_{i_1\cdots i_n}^{(n)}(\mathbf{c}) = \idotsint \prod_{j=1}^{n}(r_{i_j} - c_{i_j})\, f(\mathbf{r})\ \mathrm{d}^d\mathbf{r}$$
In physical applications, the definitions used are a little different, but in general an $n$th moment involves the integral of some $n$th power of position multiplied by the distribution function $f(\mathbf{r})$. (The aforementioned differences show up in how you use the various components of $\mathbf{r}$ to compute that $n$th power.)
Many typical measures used to describe physical systems or mathematical distributions can be represented as moments. For example:
If $f(x)$ is a 1D probability distribution:
- The normalization constant (which is 1) is $I^{(0)}$
- The mean value is $\langle x\rangle = I^{(1)}(0)$
- The variance is $I^{(2)}(\langle x\rangle)$
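The list above can be checked on a concrete distribution. Here are the moments of a Binomial($n$, $p$) pmf, which recover its known mean $np$ and variance $np(1-p)$ (the values $n = 10$, $p = 0.3$ are just an illustration):

```python
import math

# Moments of a discrete distribution: sums replace integrals.
n, p = 10, 0.3
pmf = [math.comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]

norm = sum(pmf)                                              # I^(0) = 1
mean = sum(k * pk for k, pk in enumerate(pmf))               # I^(1)(0) = n*p
var = sum((k - mean) ** 2 * pk for k, pk in enumerate(pmf))  # I^(2)(mean)
print(norm, mean, var)  # ≈ 1.0, 3.0, 2.1
```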
If $f(\mathbf{r})$ is a mass distribution:
- The total mass is $I^{(0)}$
- The center of mass is $I^{(1)}(0)/I^{(0)}$, i.e. the first moment divided by the total mass (from which comes the term "weighted average")
- The moment of inertia around any point $\mathbf{c}$ is a second moment
If $f(\mathbf{r})$ is a charge distribution:
- The total charge, or monopole moment, is $I^{(0)}$
- The dipole moment is $I^{(1)}(0)$
- The quadrupole moment is a second moment
- and so on
For charge distributions, the quantities $I^{(n)}(0),\ n=0,1,2,\ldots$ (as modified with the required extra terms) are called the electric multipole moments $Q^{(n)}$. These quantities are of particular interest because you can expand the electric potential of an arbitrary charge distribution in terms involving successive moments:
$$\Phi(\mathbf{r}) = \sum_{n=0}^{\infty} \sum_{\{i_j\}}\frac{C_n Q_{i_1\cdots i_n}^{(n)}x_{i_1}\cdots x_{i_n}}{r^{2n+1}} \sim \sum_n \frac{C_n Q^{(n)}}{r^{n+1}}$$
In many situations, $r$ is relatively large so it's sufficient to use only the first nonzero term of this series in a calculation. In a sense, higher moments incorporate more detailed features of the charge distribution, which "blur out" and thus have little effect at large distances.
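For point charges the multipole integrals reduce to sums, which makes the first two moments easy to see. This toy example (my own, not Pearson's calculation) shows that two opposite charges $\pm q$ separated by $d = 1$ have zero monopole moment and dipole moment $qd$:

```python
# Discrete analogue of the multipole moments: (charge q_i, position x_i).
charges = [(+1.0, 0.5), (-1.0, -0.5)]

monopole = sum(q for q, x in charges)    # Q^(0): total charge
dipole = sum(q * x for q, x in charges)  # Q^(1): dipole moment
print(monopole, dipole)  # 0.0 1.0
```

Since the monopole term vanishes, the potential of this pair at large $r$ is dominated by the dipole term, which falls off as $1/r^2$, consistent with the expansion above.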
For the example you're looking at here, it sounds like Pearson is calculating the moments of area in the $x$ dimension around the origin - in other words, the density function $f(x)$ is the function that would trace along the tops of the rectangles.
$$f(x) = a\binom{n}{k}p^{n-k}q^k,\quad \tfrac{(2k + 1)c}{2} \le x < \tfrac{(2k + 3)c}{2}$$
(you could think of this as calculating the moments of mass of a cardboard cutout of the binomial distribution, assuming the cardboard is uniform density).
You can plug this into the integral definition of a moment, although the resulting expression is rather complicated, and as I said, it doesn't seem to give the same results as the derivative method Pearson is using. So I believe he's calculating something different.
I don't think it is possible to learn physics without math. Mathematics is the language of physics, and you can't learn a subject without learning its language. (This is just my opinion, though.)
To start with particle physics, I think you should first learn quantum mechanics and special relativity. For quantum mechanics, grab a copy of Griffiths or any other similar book; knowledge of single-variable calculus and differential equations is needed, though. For vector calculus, you can look into the first few chapters of the Feynman Lectures, Vol. 2. Along with these, you must also learn abstract linear algebra (the theory of vector spaces), very basic group theory (the definitions of a group and of group actions), and multivariable calculus.
Once you are done with QM and special relativity, you will be ready for quantum field theory. A nice book for QFT is Quantum Field Theory in a Nutshell by A. Zee.
Also, at this point you should learn about the theory of group representations and Lie groups. A good introductory book on this topic is Groups and Symmetries by Yvette Kosmann-Schwarzbach; the last chapter deals with particle physics.
Note that this route won't make you an expert on the topic, but you will gain a good understanding of it.
It's like dxiv said. The special case they used here is equivalent to the median length theorem, which is a theorem in math. Besides, I find the idea of using physics to "prove" something in math a bit weird. Physics is an empirical science in the sense that we make measurements and formulate mathematical models that agree with the measurements. Sure, one can use the existing mathematical models to make predictions about measurements that are yet to be taken, but that's as far as you can go within the framework of physics. In other words, formal physics can be used to predict or "prove" something concerning physics, but it cannot prove anything in math. I like to think of math as the explanation for physics, not the other way round: it's the math that explains and helps you understand why something in physics is true. The proof of a statement is just that; it explains exactly why the statement is true.