Electromagnetism – Why Faraday’s Law of Induction Lacks a Constant of Proportionality

electromagnetic-induction, electromagnetism, history, maxwell-equations

I am learning Maxwell's equations on my own, and I ran into some questions about Faraday's law $$\nabla \times \mathbf E = -\frac{\partial}{\partial t} \mathbf B.$$

As far as I know, Faraday discovered experimentally that a change in magnetic field induces an electric field. However, the units of $\mathbf E$ and $\mathbf B$ are both defined by the Lorentz force law $\mathbf F = q(\mathbf E + \mathbf v \times \mathbf B)$, so one can only deduce that $\nabla \times \mathbf E = -\alpha \frac{\partial}{\partial t} \mathbf B$ for some constant $\alpha$. How did people figure out that this constant is exactly one?
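To make the premise concrete, here is a quick dimensional check (a sketch, using the Lorentz-force definitions of $\mathbf E$ and $\mathbf B$ above) showing that $\alpha$ must be a pure number, so its value is genuinely an empirical question:

```latex
% From F = q(E + v x B), the fields satisfy [E] = [v][B].
% Left side:  [\nabla \times \mathbf E] = [E]/[L] = [v][B]/[L] = [B]/[T]
% Right side: [\partial \mathbf B / \partial t] = [B]/[T]
% Both sides carry the same dimensions, so [\alpha] = 1:
% alpha is dimensionless, and only experiment (or a convention) can fix it.
```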

One possible answer is based on special relativity, but that would be anachronistic.

I suspect there were experiments confirming that Faraday's law is compatible with the Lorentz force law (with velocity expressed in appropriate units relative to the speed of light). Is this the case? If so, when were these experiments carried out, and were they used as evidence supporting special relativity?

Any comment on the relevant history is also greatly appreciated.

Best Answer

It's a choice. The SI system of units is defined so that the constant is exactly one. In Gaussian units, the proportionality constant is $1/c$, a different choice.
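The two conventions can be set side by side; note that in Gaussian units the Lorentz force law also picks up a compensating factor of $1/c$, so the physics is unchanged:

```latex
% SI units: the constant is absorbed into the definitions of E and B
\nabla \times \mathbf E = -\frac{\partial \mathbf B}{\partial t},
\qquad \mathbf F = q\,(\mathbf E + \mathbf v \times \mathbf B)

% Gaussian units: E and B have the same dimensions, and factors of 1/c appear
\nabla \times \mathbf E = -\frac{1}{c}\frac{\partial \mathbf B}{\partial t},
\qquad \mathbf F = q\left(\mathbf E + \frac{\mathbf v}{c} \times \mathbf B\right)
```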