[Physics] Moment of force about a point definition

newtonian-mechanics, reference-frames, statics, torque

In defining the moment of a force about a point as "the tendency of one or more applied forces to rotate an object about an axis [going through a point, hence also about a point]", it seems logical to infer that it is somehow related to the angle between the force and the moment arm, but why make it directly proportional to the sine of that angle? Why not $\ln^n(1+\theta)$, for example, which also becomes zero as the angle tends to zero?

Furthermore, why include the magnitudes of the moment arm and the force in the definition? Perhaps to underline that two forces of different magnitudes have different tendencies to rotate the object? And I guess the same goes for the moment arm's length?

Perhaps there is no aim to describe how this tendency "changes" as a function of the angle; we just want some formula that tells us, for two given forces and two moment arms, which tendency to rotate is bigger?

Best Answer

Torque is defined as

$$ \boldsymbol {\tau} = \mathbf r \times \mathbf F$$

But the reason we define it that way is that we have found a conserved quantity called angular momentum, $\boldsymbol {\ell} = \mathbf r \times \mathbf p$, which is related to torque by:

$$\boldsymbol {\tau} = \frac {d\boldsymbol {\ell}}{dt}$$
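To sketch why this relation singles out the cross product (this step is implicit in the answer above; it assumes Newton's second law $\mathbf F = d\mathbf p / dt$ and that $\mathbf r$ and $\boldsymbol{\ell}$ are taken about a fixed origin):

$$\frac{d\boldsymbol{\ell}}{dt} = \frac{d}{dt}\left(\mathbf r \times \mathbf p\right) = \underbrace{\mathbf v \times m\mathbf v}_{=\,\mathbf 0} + \mathbf r \times \frac{d\mathbf p}{dt} = \mathbf r \times \mathbf F = \boldsymbol{\tau}$$

The magnitude of this cross product is $|\boldsymbol{\tau}| = |\mathbf r|\,|\mathbf F|\sin\theta$, which is where the sine in the question comes from: it is not an arbitrary choice of angular function, but the one that makes torque the rate of change of $\mathbf r \times \mathbf p$.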
