SI Units – Understanding Base Units in the New SI and the Role of the Ampere

conventions, metrology, si-units, units

One question that comes up pretty much always in introductory electromagnetism courses is why the base unit of electrical measurements is the ampere and not the coulomb. The usual answer is that it is more accurate, and metrologically more useful, to build electrical measurements around measurements of current, so that's what the SI is built around.

However, the situation will change around 2018, when the currently proposed redefinition of the SI units comes into effect, essentially making all SI units dependent on seven invariants of nature. I am puzzled by the role of base units in this new system: what are they, and is there any reason other than historical continuity to even keep the concept around?

To be a bit more explicit, let me concentrate on the role of the ampere, because this is the craziest one. In the new SI, the ampere is retained as a base unit, and electric current is retained as a base quantity:

The base quantities used in the SI are time, length, mass, electric current, thermodynamic temperature, amount of substance, and luminous intensity. The corresponding base units of the SI were chosen by the CGPM to be the second, metre, kilogram, ampere, kelvin, mole, and candela

(Proposed draft of the new SI brochure (pdf), §1.2)

but in essence it is defined as the amount of current that will make the elementary charge equal to $e=1.602\,176\,565\times10^{-19}\:\mathrm{C} =1.602\,176\,565\times10^{-19} \:\mathrm{A\:s}$ (with the exact number replaced with whatever CODATA says is our best measurement at the time of the redefinition).

Compare this with the current definition of the SI ampere:

The ampere is that constant current which, if maintained in two straight parallel conductors of infinite length, of negligible circular cross-section, and placed 1 metre apart in vacuum, would produce between these conductors a force equal to $2\times10^{-7}$ newton per metre of length.

Given a standard for force, this directly defines the ampere with no strings attached. The new definition, however, mostly just defines the coulomb as a fixed number of elementary charges, and then hinges on the definition of the second to define the ampere. Shouldn't this make the coulomb the base unit?
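To make that arithmetic concrete, here is a minimal sketch of the fixed-charge logic, using the CODATA value of $e$ quoted above (the redefinition would fix some such number exactly):

```python
# Number of elementary charges in one coulomb, using the value of e quoted
# above (2010 CODATA); the redefinition would fix a number like this exactly.
e = 1.602176565e-19  # elementary charge, in coulombs (i.e. ampere seconds)

charges_per_coulomb = 1 / e
print(f"1 C = {charges_per_coulomb:.6e} elementary charges")            # ~6.24e18
print(f"1 A = {charges_per_coulomb:.6e} elementary charges per second")
```

The second line is the whole point: once the coulomb is a fixed count of charges, the ampere is just that count per second, so the definition leans entirely on the definition of the second.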


Going a bit further, the actual implementations make the picture even more muddled. I documented them in this question and answer, but the short story with the ampere is that the implementation mainly hinges on two well-understood physical effects:

  • One is the quantum Hall effect, which essentially establishes a quantum of conductance when an electron gas is confined to two dimensions at low temperatures and placed in a strong magnetic field. The Hall resistance then comes in (sub)multiples of the von Klitzing constant, $R_K=h/e^2\approx 26\:\mathrm{k\Omega}$, so a quantum Hall effect experiment essentially gives a ready-baked resistance standard.

  • The other is the Josephson effect, which you get when you put two superconductors together, separated by a thin insulating barrier (called a Josephson junction). In its AC form, you subject the junction to an alternating voltage at frequency $\nu$, and then observe how much DC current will pass as a function of an additional DC voltage $V_\mathrm{DC}$; because of quantum mechanical effects, this $I\text{-}V$ characteristic shows a series of jumps at the voltages $V_n=n\nu/K_J$, where $K_J=2e/h\approx 484\:\mathrm{THz/V}$ is the Josephson constant. Thus, if you have a frequency standard to compare with, you get a ready-made voltage standard. (Both constants are evaluated in the short sketch after this list.)
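As a quick check on the numbers quoted in those two bullets, here is a minimal sketch; the constant values are the 2010 CODATA ones, matching the value of $e$ quoted earlier in the question:

```python
# The two quantum electrical standards, built from h and e alone.
h = 6.62606957e-34   # Planck constant, J s (2010 CODATA)
e = 1.602176565e-19  # elementary charge, C (2010 CODATA)

R_K = h / e**2   # von Klitzing constant: scale of the quantized Hall resistance
K_J = 2 * e / h  # Josephson constant: converts a frequency into a voltage

print(f"R_K = {R_K:.3f} ohm")          # ~25812.807 ohm, i.e. ~26 kOhm
print(f"K_J = {K_J / 1e12:.1f} THz/V") # ~483.6 THz/V, i.e. ~484 THz/V
```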

The ampere is then implemented by combining both of these standards in the obvious way: as the current through a $1\:\Omega$ resistor (calibrated as above) when subjected to a $1\:\mathrm V$ voltage (also calibrated as above). Given that the ampere was originally chosen as the base unit because that's what made most sense operationally, shouldn't the volt and the ohm take that place now?
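Tracing that combination through symbolically shows how little of it is really about current: the Josephson voltage is $V=n\nu/K_J$, the Hall resistance is $R=R_K/i$, so the realized current is $I=V/R=\tfrac12 n\,i\,e\,\nu$, an elementary charge counted at a frequency. A minimal numerical sketch, where the step index $n$, plateau index $i$ and drive frequency $\nu$ are illustrative choices rather than anything prescribed:

```python
# Realizing a current by dividing a Josephson voltage by a quantum Hall
# resistance; n, i and nu below are illustrative, not prescribed values.
h = 6.62606957e-34   # Planck constant, J s (2010 CODATA)
e = 1.602176565e-19  # elementary charge, C (2010 CODATA)

n, i = 1, 1  # Josephson step index and Hall plateau index
nu = 70e9    # drive frequency in Hz (a typical order of magnitude)

V = n * nu * h / (2 * e)  # Josephson voltage, V_n = n * nu / K_J
R = (h / e**2) / i        # Hall resistance, R = R_K / i
I = V / R

# h cancels out: I = (1/2) * n * i * e * nu, a charge times a frequency,
# which is why the realized ampere leans so heavily on the second.
print(f"I          = {I:.6e} A")
print(f"n*i*e*nu/2 = {0.5 * n * i * e * nu:.6e} A")  # algebraically identical
```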

(Making things even worse, at low currents the proposed implementations also use an effect called single-electron tunnelling, which essentially just counts how many electrons came through in one second (though apparently this is not yet ready for primary metrology). So, shouldn't the base quantity be electric charge?)
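For scale, a single-electron pump that moves one electron per drive cycle delivers $I = ef$, so even a fast pump only reaches fractions of a nanoampere; the 1 GHz drive frequency below is an assumed, typical order of magnitude:

```python
# Single-electron pumping: one electron per drive cycle gives I = e * f.
e = 1.602176565e-19  # elementary charge, C (2010 CODATA)
f = 1e9              # assumed pump drive frequency, Hz

print(f"I = {e * f:.3e} A")  # ~1.6e-10 A, i.e. about 0.16 nA
```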


OK, so that was a bit of a rant, but really: am I missing something? Is there any reason, beyond historical continuity, to even keep the concept of a base quantity/unit around? If we do need the concept, why not recast it as the seven fixed constants? (With the alarming feature, of course, that mass would no longer be a base dimension: it would be replaced by action.) Do we just want it as a convenient basis for the vector space of physical quantities? Why do we even need a canonical basis for it? And if we do need one, wouldn't the coulomb be equally useful there?

Best Answer

Disclaimer: due to limited connectivity where I am at the moment, I can give only a short answer, and I am not able to access useful references.

With the new SI, the distinction between base and derived quantities (and units) will lose much of its foundational value and will be kept mostly for historical continuity. Thus, as far as I know, there is no plan to change the set of base units (which would in any case have to remain seven in number, to avoid changing the relationships between the physical quantities), or to get rid of them altogether.

As far as the realization (not implementation) of the ampere with single-electron transistors (SETs) is concerned, notice that at present the accuracy of such realizations is much worse than that achievable through the other path (Josephson effect plus quantum Hall effect), and is insufficient for primary metrology. Indeed, at present, the so-called quantum metrological triangle has not yet been closed at the highest level of accuracy.