Electromagnetism – Why 1 Coulomb Was Redefined in the SI Unit System

charge, electromagnetism, si-units, special-relativity

Since current can be calculated as $I = q/t$, the original SI definition of the coulomb was the amount of charge that flows through a wire's cross-section in $1\ \mathrm s$ when the current is $1\ \mathrm A$, i.e. $$\tag{1} 1\ \mathrm C = 1\ \mathrm A \cdot 1\ \mathrm s.$$
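As a trivial illustration of relation (1), here is a minimal Python sketch (not part of the original question) that just multiplies a steady current by a time interval:

```python
def charge_from_current(current_amperes: float, time_seconds: float) -> float:
    """Charge in coulombs transported by a steady current I over a time t: q = I * t."""
    return current_amperes * time_seconds

# The old operational picture of one coulomb: 1 A flowing for 1 s.
print(charge_from_current(1.0, 1.0))  # 1.0 (coulombs)
```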

Now it is simply stated that $1\ \mathrm C \approx 6~241~509~074~460~762~607.776\, e$ (exactly $1/(1.602176634\times 10^{-19})$ elementary charges).
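That figure follows directly from the fixed 2019 SI value of the elementary charge; here is a quick numerical check (a minimal Python sketch using the standard `decimal` module so enough digits survive):

```python
from decimal import Decimal, getcontext

getcontext().prec = 30  # plenty of digits for the ratio below

# Exact value of the elementary charge fixed by the 2019 SI redefinition.
ELEMENTARY_CHARGE = Decimal("1.602176634e-19")  # coulombs

# Number of elementary charges that add up to exactly one coulomb.
charges_per_coulomb = Decimal(1) / ELEMENTARY_CHARGE
print(charges_per_coulomb)  # ~ 6241509074460762607.776 e per coulomb
```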

Now here comes my question: why was the definition of the SI unit of charge moved away from the one related to current strength?
Can it be because time intervals are not Lorentz invariant between inertial reference frames, i.e. the flow of time depends on which observer you choose as a reference frame (since it depends on the object's speed)? Then different observers could measure different amounts of charge flowing in the same wire according to (1), because the unit of 1 second appears there. Hence definition (1) would be faulty, because charge is an absolute quantity and cannot depend on a reference frame. Is that correct, or are there other reasons why the original mapping of $1\ \mathrm C$ was abandoned?

From Wikipedia:

The SI defines the coulomb by taking the value of the elementary charge e to be $1.602176634\times 10^{-19}\ \mathrm C$, but was previously defined in terms of the force between two wires

Best Answer

Why was the definition of the SI unit of charge moved away from the one related to current strength?

Because the new standard is more accurate and stable than the previous one. The old ampere was defined via the force between two idealized, infinitely long parallel wires, which can only be realized approximately in a laboratory; fixing the numerical value of the elementary charge ties the coulomb to an invariant of nature rather than to a difficult mechanical measurement.

Can it be because time intervals are not Lorentz invariant between inertial reference frames, … Hence definition (1) is faulty because charge is an absolute quantity and cannot depend on a reference frame.

No. There was nothing faulty about the previous standard; this one is just more accurate and stable. Electric charge is indeed Lorentz invariant, but definition (1) never made it frame-dependent: both the current and the one-second interval are measured in the same (laboratory) frame, so every observer agrees on how much charge that definition singles out.