[Physics] How to physically measure voltage and current

experimental-physics, measurements

Short Version: How exactly do (modern) voltage/current meters measure the numbers they do, from first principles? References much appreciated.


We measure length with a ruler, time with a clock, and mass with a scale. The units of such quantities are all relative to some commonly accepted standard, e.g. a stick of a fixed length, a certain interval of time, a certain amount of water at fixed conditions, etc. These quantities are relatively easy for us humans to understand.

But what about voltage and current? I understand how you can define electric fields experimentally – get a certain amount of charge on an object, and measure the force (response) felt by that charged object. Similarly with magnetic fields – send some charges through a magnetic field and measure their response. To measure the voltage across two points, I suppose you could connect some small "resistor" in between two points and measure how hot it gets under certain fixed conditions (maintained within your measurement device).

I'm sure my intuition on all this "phenomenology" of voltage and current is based on 19th century equipment, but that's the only equipment I ever see intuitively broken down. Modern electrical equipment is more mysterious (owing to its inherent complexity I'm sure).

Every time I try to look for an explanation of how multimeters really measure voltage and current, I'm met with circular explanations like "You measure voltage by connecting a resistor in parallel and measuring the voltage drop across it, or current passing through it" and "You measure current by connecting a resistor in series and measuring the voltage across it". Please, I just want an intuitive and direct answer. I would also very much appreciate a reference to some article/discussion on this.

Best Answer

An analog ammeter, called a galvanometer, passes current near a bar magnet which is physically attached to an indicator needle. The current is surrounded by a magnetic field whose strength is proportional to the magnitude of the current, and so the angle of the needle changes. A clever calibrator paints marks at different locations under the end of the needle, corresponding to different currents in the galvanometer.
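One classic instrument of this kind is the tangent galvanometer, where a compass needle sits at the center of a current-carrying coil whose axis is perpendicular to Earth's field; the needle settles at the angle where the two fields balance. As a minimal numerical sketch (the coil geometry and the Earth-field value here are illustrative assumptions, not from the answer above):

```python
import math

MU0 = 4e-7 * math.pi   # vacuum permeability, T*m/A
B_EARTH = 2e-5         # assumed horizontal component of Earth's field, T

def tangent_galvanometer_angle(current_a, turns=50, radius_m=0.1):
    """Deflection angle (degrees) of a compass needle at the center of a
    coil: the coil's field (proportional to current) adds at right angles
    to Earth's field, so tan(theta) = B_coil / B_earth."""
    b_coil = MU0 * turns * current_a / (2 * radius_m)  # field at coil center
    return math.degrees(math.atan2(b_coil, B_EARTH))

# The needle angle grows monotonically with current, so marks painted
# under the needle can be labeled directly in units of current.
for i_ma in (1, 5, 10):
    print(f"{i_ma} mA -> {tangent_galvanometer_angle(i_ma / 1000):.1f} deg")
```

The nonlinearity of the arctangent is why such painted scales are not evenly spaced.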

I believe that all analog meters are, at their heart, galvanometers; I'd love to be corrected.

Digital meters are based on the transistor. There are a lot of different devices which can be called "transistors," and a lot of configurations of the same transistor that can do different things. A common configuration is for the transistor to act as a current-controlled switch (for bipolar transistors) or as a voltage-controlled switch (for field-effect transistors): the current into the "base" terminal of a bipolar transistor, or the voltage on the "gate" terminal of a field-effect transistor, determines whether the path between the other two terminals (emitter and collector, or source and drain) is insulating or conducting. You can use a collection of transistors to build a comparator, a digital device whose output is "high" or "low" depending on which of its inputs is at higher voltage, and you can use a series of comparators with related reference voltages to build an analog-to-digital converter.
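The comparators-with-related-references idea can be sketched in a few lines. This models a "flash" style converter (one plausible realization of the scheme above, not necessarily what any particular multimeter uses): a resistor ladder generates evenly spaced reference voltages, one comparator per tap fires if the input exceeds its reference, and the count of fired comparators is the digital code. The parameter values are illustrative assumptions:

```python
def flash_adc(v_in, v_ref=3.3, bits=3):
    """Sketch of a flash ADC: 2**bits - 1 comparators, each comparing the
    input against one tap of a resistor ladder. The number of comparators
    reading 'high' (the 'thermometer code') is the output code."""
    n = 2 ** bits
    taps = [v_ref * k / n for k in range(1, n)]   # ladder reference voltages
    thermometer = [v_in > t for t in taps]        # one comparator per tap
    return sum(thermometer)                       # encode the count as the code

# A 3-bit converter maps the 0..3.3 V range onto codes 0..7:
print(flash_adc(0.1))   # prints 0 (below the lowest tap)
print(flash_adc(1.7))   # prints 4 (mid-scale)
print(flash_adc(3.2))   # prints 7 (near full scale)
```

Real flash converters are fast but need a comparator per code, so practical multimeters tend toward slower, more precise architectures; the principle of comparing the unknown voltage against known references is the same.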

I'm pretty sure the most common multimeters on the market are analog-to-digital converters built from field-effect transistors, which are at their heart voltage-measurement devices.

For a more exotic approach, you might like to read about the Kibble balance (formerly known as the "watt balance").
