I guess I am having some confusion about the history of calculating Planck's constant. I see that the mass of the electron may come into the equation, but isn't a measurement of mass limited mostly by how accurate our instruments are? I assume it's not something that can be derived exactly, like the ratio of a circle's circumference to its diameter (in flat space)? Since there is measurement involved, and instrumentation has its limits, can the accuracy of Planck's constant ever improve, or am I just way off course here? That may well be the case.
[Physics] Can the accuracy in Planck’s constant ever be increased?
error analysis, metrology, physical constants
Related Solutions
For point 1) you are correct: no instrument can be perfectly calibrated or has infinite precision. This is partly limited by how well the corresponding SI units are known; NPL has a nice little FAQ on this. Similarly, all measurements carry some noise (possibly very small) which limits precision.
Personally I wouldn't use weight as an example, for several reasons. Firstly, as you point out, it is easy to confuse the ideas of mass and weight, and if you want to be completely correct this is a confusion you don't need. Another concern is that mass is currently the only fundamental unit still defined by a physical artifact (the standard kilogram) rather than by physical constants, so the definition of a kilogram carries its own uncertainty.
In my opinion a better example would be measuring a metre. A metre is defined as "the length of the path travelled by light in vacuum during a time interval of 1/299,792,458 of a second." The question then becomes: how good is your clock? Suppose you use an atomic clock accurate to $1$ part in $10^{14}$; even this clearly has some small uncertainty. In reality you probably wouldn't use something this accurate, but rather something calibrated against something that was itself calibrated this way, and so considerably less accurate.
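To make the point concrete, here is a back-of-envelope sketch of how the clock's uncertainty propagates into a length measured by timing light. The 1-part-in-$10^{14}$ figure is the atomic-clock accuracy quoted above; the one-metre measurement is just an illustration.

```python
# Sketch: how clock uncertainty propagates into a length measured by
# timing light. Assumes the clock is the only error source and has a
# fractional uncertainty of 1 part in 10^14 (the figure quoted above).

C = 299_792_458            # speed of light in m/s (exact by definition)
CLOCK_FRACTIONAL_ERROR = 1e-14

def length_uncertainty(length_m: float) -> float:
    """Absolute uncertainty in a length measured by timing a light pulse."""
    travel_time = length_m / C                  # time light takes
    time_error = travel_time * CLOCK_FRACTIONAL_ERROR
    return time_error * C                       # convert back to a length

# Measuring one metre this way:
err = length_uncertainty(1.0)
print(f"{err:.1e} m")      # about 1e-14 m, far smaller than an atom
```

The result simply inherits the clock's fractional error, which is the point: the length measurement can never be better than the clock behind it.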
As a side note, this is not how a metre is actually calibrated in practice: people use things like a frequency-stabilised laser, which has a very well-known wavelength, together with an interferometer to count the fringes seen over a distance.
For 2) I don't think you need to say any more. There are lots of things you could say, but you are trying to answer a specific question, not write a book. The NPL Beginner's Guide to Uncertainty of Measurement provides a good introduction to some of the topics, but is by no means comprehensive.
For 3) I would say your analogy isn't far wrong. It's only really scientists who care about this level of accuracy, and possibly anyone involved in micro-manufacture (think Intel). Even most engineers don't care (they tend to double stuff just to be certain ;) ). I think the best way to show it is to do what you did in your actual answer and give this as a percentage error, to show how small it actually is.
So a second used to be defined as one 86,400th of one day. But now we have these cool things called atomic clocks, which measure time very, very precisely, so we have decided to define the second as a certain number of ticks of a certain type of atomic clock, and anybody who needs really high precision can have it. With these new clocks we can see that the length of a day is actually very complicated: our days are slowly lengthening due to tidal forces. The new definition used to line up with the old one extremely well, but that has become less so as the Earth slows its rotation; right now "leap seconds" are injected into our time system to keep clock midnight lined up with astronomical midnight, so some days are 86,401 seconds long.
Similarly a meter used to be defined as one 40,000,000th of the circumference of the Earth, but when light-interferometry became easier and we had learned that actually, light moves at the same speed $c$ in all directions for all inertial reference frames, we redefined meters in terms of the distance that light travels in one second. As before, we tried very hard to get the new and old definitions to match up for all practical purposes. But as a result, our value for the speed of light has infinite precision; if light were any faster than it happens to be, then the meter would by definition be longer to make light travel at exactly $299,792,458\text{ m/s}$. So choosing a natural constant to have a fixed value in our units, or choosing a unit to be a certain size, are the same general procedure.
Similarly in the early 1900s we found out that light comes in lumps of energy; Planck's constant relates the frequencies of these photons to their energies: so for example a photon of frequency $f$ has energy $E = h f.$
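As a quick numerical illustration of $E = hf$, here is the energy of a single photon of roughly green visible light. The value of $h$ used is the exact value later fixed by the SI redefinition; the frequency is just an example I chose.

```python
# Sketch: photon energy from frequency via E = h f.
# The h value is the exact figure fixed by the 2019 SI redefinition;
# the frequency (~green visible light) is just an illustrative choice.

H = 6.62607015e-34          # Planck's constant, J*s

def photon_energy(frequency_hz: float) -> float:
    """Energy in joules of a photon with the given frequency."""
    return H * frequency_hz

E = photon_energy(5.5e14)   # roughly 545 nm green light
print(f"{E:.2e} J")         # a few times 1e-19 J per photon
```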
We can do the same trick that we did with $c$, but now with $h$: now that we have been able to measure $h$ to very high precision in our conventional units, we can define $h$ to infinite precision by redefining how we measure energy.
Right now we measure energy in units of $\text{kg}~\text{m}^2/\text{s}^2.$ Remember that we defined $\text m$ in terms of the distance that light travels in one second, so really these units of $\text m/\text s$ correspond to some fixed number times the already-numerically-fixed speed of light $c$. So if we don't change how we measure $\text{m}$, then this redefinition of energies, which makes $h$ an infinite-precision constant, demands that we now measure $\text{kg}$ by the new standard. Essentially, a photon of energy $E$ corresponds to a mass $m = E/c^2$, and therefore any measurement of energy is a measurement of mass.
This last point isn't 100% off-base: if you consider an electron annihilating with its antimatter partner, the positron, the mass of both must be converted entirely into photons, usually two photons travelling in opposite directions in the center-of-mass frame, and you could measure the energy of either photon to determine the mass of the electron. So you use your super-amazing atomic clocks to measure the frequency of the photons; having fixed $h$, that tells you $E$, and now you have measured the mass of the electron as $m = E/c^2.$ That's the basic idea.
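Running the annihilation picture through numbers: below is a sketch, assuming the CODATA value for the electron mass, of the photon frequency you would expect to measure, and of recovering the mass back from that frequency via $m = hf/c^2$.

```python
# Sketch: electron-positron annihilation, worked both directions.
# Assumes the CODATA electron mass; each of the two back-to-back photons
# carries the full rest energy m c^2 of one particle.

H = 6.62607015e-34           # Planck's constant, J*s
C = 299_792_458              # speed of light, m/s
M_E = 9.1093837e-31          # electron mass, kg (CODATA, approximate)

def annihilation_photon_frequency(mass_kg: float) -> float:
    """Frequency of each photon when a particle of this mass annihilates
    with its antiparticle: f = m c^2 / h."""
    return mass_kg * C**2 / H

def mass_from_frequency(f_hz: float) -> float:
    """Invert the relation: measured photon frequency back to a mass."""
    return H * f_hz / C**2

f = annihilation_photon_frequency(M_E)
print(f"{f:.3e} Hz")                       # on the order of 1.2e20 Hz
print(f"{mass_from_frequency(f):.3e} kg")  # recovers the electron mass
```

The round trip is exact by construction; in a real experiment the uncertainty in $m$ would be set by how well you can measure $f$, which is exactly why good clocks matter here.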
Best Answer
Following on from Jon's comment above, this is from the NIST Website (Dated June 21, 2016), discussing a more accurate measure of the kilogram, and the involvement of Planck's constant.
The NIST-4 watt balance has measured Planck’s constant to within 34 parts per billion, demonstrating that the high-tech scale is accurate enough to assist with 2018’s planned redefinition of the kilogram.
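To give a feel for how small 34 parts per billion is, here is a sketch converting that quoted relative uncertainty into a percentage and into an absolute uncertainty in $h$ (using the modern value of $h$ for scale).

```python
# Sketch: expressing the NIST-4 figure of 34 parts per billion as a
# percentage and as an absolute uncertainty in h, to show its scale.

H = 6.62607015e-34       # Planck's constant, J*s (for scale)
PPB = 34                 # quoted relative uncertainty, parts per billion

fractional = PPB * 1e-9
print(f"{fractional * 100:.7f} %")     # 0.0000034 %
print(f"{fractional * H:.1e} J*s")     # absolute uncertainty in h
```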