In Coordinated Universal Time (UTC), leap seconds are added to account for the slowing down of Earth's rotation. But the slowing down is said to be of the order of milliseconds per century. Then why have more than 25 leap seconds been added to UTC in the last few decades alone?
[Physics] Why are leap seconds needed so often
Tags: astronomy, metrology, time
Related Solutions
Why didn't we modify the measurement of 1 second ever so slightly so as to avoid leap years altogether?
The rotation of the Earth and its revolution around the Sun are not synchronized at all. There are really 365.24219647 mean solar days in each revolution (the 1992 value; this ratio changes slightly every year, and the tropical year gets roughly half a second shorter each century). So even if we fixed the definition of time to the revolution of the Earth around the Sun, we would still need a leap year every four years (what we wouldn't need would be leap seconds).
Another reason is that precise time measurements would become incomparable. Since the period of revolution of the Earth (i.e. the tropical year) isn't constant, if we defined the second to exactly match the period of revolution, then whenever you wanted to specify a precise duration of time you would also have to specify which year that definition of the second was taken from, and you would need a table recording the length of the second for each year.
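The drift arithmetic above is easy to check numerically. A minimal sketch (the comparison with the Gregorian leap-year rule is my addition, for illustration):

```python
# How far a plain 365-day calendar drifts against the tropical year,
# and how closely the Gregorian leap-year rule tracks it.
TROPICAL_YEAR = 365.24219647  # mean solar days per tropical year (1992 value)

drift_per_year = TROPICAL_YEAR - 365           # ~0.2422 days/year without leap years
years_per_day_of_drift = 1 / drift_per_year    # ~4.13 years per full day of drift

# Gregorian rule: leap year every 4 years, except centuries, except every 400:
gregorian_year = 365 + 1/4 - 1/100 + 1/400     # 365.2425 days on average
residual_s_per_year = (gregorian_year - TROPICAL_YEAR) * 86400  # ~26 s/year

print(f"drift without leap years: {drift_per_year:.4f} days/year")
print(f"one day of drift every {years_per_day_of_drift:.2f} years")
print(f"Gregorian residual: {residual_s_per_year:.1f} s/year")
```

At roughly 26 seconds of residual per year, the Gregorian rule takes over 3000 years to accumulate a single day of error, which is why no finer calendar correction has been needed.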
Is there any technical reason ... ?
Yes, because with the proper equipment anyone, anytime, can take a caesium-133 atom, put it under the specified conditions, and measure the same second, and it won't have the yearly change that a second derived from the Earth's rotation/revolution would. As far as we know, the frequency of a caesium-133 atom in 1978 should be the same as the frequency of another caesium-133 atom in 2049.
It's my understanding that the invention of the metric system during the turbulence following the French Revolution also included a switch to decimal time, with ten hours per day, etc., but that it didn't take. There's a certain amount of cultural inertia that has to be overcome; as you're probably aware, those of us in the United States still have many miles to go before we can fully adopt the metric system.
As you say, you have to give anonymous inventors of the 24-hour day credit: while the metric approach of powers-of-ten relationships between units is dreadfully easy to handle when you're using base 10 arithmetic, it's quite difficult to divide ten things into three equal-size sets. Remember that base ten is essentially an arbitrary choice made because most people have ten fingers and spend their childhood grouping things into fives and tens to count them. Twenty-four has boatloads of divisors: you can separate into a dozen pairs, three groups of eight, or six quartets. Sixty would make a pretty nice base, since it's the first number divisible by two, three, four, and five; but sixty is too many things for most people to count in their heads.
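The divisibility claims above are easy to verify (a throwaway check, not part of the original answer):

```python
# Verify the divisor claims about 24 and 60.
def divisors(n: int) -> list[int]:
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(24))  # [1, 2, 3, 4, 6, 8, 12, 24]
print(divisors(60))  # [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]

# 60 really is the smallest number divisible by 2, 3, 4, and 5:
smallest = next(n for n in range(1, 61) if all(n % d == 0 for d in (2, 3, 4, 5)))
print(smallest)  # 60
```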
The second is actually historically based on the length of a year, not of a day: until the adoption of the caesium clock standard in 1967, the definition of the second was the appropriate fraction "of the tropical year 1900." It took roughly half a century for the standards committee to realize that we can't go back and re-run the year 1900 to see whether we're still producing correct seconds.
There are several things that the SI system does that don't quite make as much sense as you might like. Why on earth does the base unit for mass, the kilogram, have a prefix? Why is the base unit for electricity the ampere, when we've known for a century that charge occurs naturally in standard-sized lumps? I put the SI endorsement of the historical relationship between the second, the minute, the hour, the day, and the year in the same category. It's a convenient unit with strong historical and popular support. I don't see a need to decimalize the day.
Emilio Pisanty asks for references.
The Time Service Department at the U.S. Naval Observatory, which is responsible for inserting leap seconds every ~500 days to keep atomic time (as defined) from slipping relative to the Earth's rotation, seems to be the source for the Wikipedia account of the history of the second, but does not cite additional sources.
The Bureau International des Poids et Mesures, the organization responsible for defining and revising the International System of Units (SI), does not discuss the history of the second in its brief history of SI, and does not seem to have a page describing the history of the second in the same detail as the history of the meter.
The NIST/CODATA reference website contains historical background statements about the different fundamental units; the page for the second describes the shift from ephemeris time to the cesium standard as above.
An abandoned-looking, authorless site about units repeats the same story, but includes a handful of technical and non-technical references, including a 1958 article (Markowitz et al.) entitled "Frequency of Cesium in Terms of Ephemeris Time." This reference discusses plans at that time to move to the atomic standard.
The articles citing Markowitz et al. include a 2005 review in Metrologia entitled "Atomic time-keeping from 1955 to the present," a much more detailed discussion with about four dozen technical references. A more recent review, "Evolution of timescales from astronomy to physical metrology," seems from its abstract to offer a broader historical perspective.
For historical timekeeping systems and the decimal-time adventure of the French Revolution, I happened across Carrigan, "Decimal Time" (1978), which cites:
For the division of the day in 24 hours by Egyptians, and the 60x60 subdivisions of the hour by Babylonians: O. Neugebauer, The Exact Sciences In Antiquity, Brown University Press, 1957.
For a catalog by Hipparchus (ca. 140 BC) of stars whose risings are separated by one-hour intervals, accurate to about one minute: the "time" article in the 11th edition of the Encyclopædia Britannica. The corresponding article in Britannica online is quite lengthy, but hidden behind a paywall for me.
For a medieval division of time into lit and dark "tides" (in English, "noontide" and "eventide"), each with twelve "hours" but only having equal length near the equinox: K. Welch, The History of Clocks and Watches, 1972.
For a similar Oriental system not supplanted until Western commerce became important in the 1800s: J. Arthur, Time and its measurement, 1909.
Old papers have old references! Carrigan observes that while weights and measures are important enough for commerce that many local standards arose more or less at once, early precise timekeeping would be complicated by the vagaries of travel by ship or by land. The engineering skill to build a clock with a useful second hand "preceded to some extent the need for standards of communication at small time intervals[, which] may have led to the universality of the present time system."
Best Answer
It's not the rate of change of the rotation speed that's important; it's that the current rotation speed (in the rotating reference frame that stays facing the Sun) doesn't match a 24-hour day.
Thus leap seconds (on average) accumulate at a near-constant rate, because (as you point out) the average rate of change is low compared to the existing mismatch between the actual day length and what our clocks say.
Remember that a leap second is an absolute offset added/subtracted, not a multiplier on the speed of our clocks that fixes the problem for the future until the speed drifts some more.
We're correcting the "error" in our time function by adding step offsets, not by changing the slope. The length of an SI second remains fixed, and the length of a day by our clocks remains fixed at 24 hours = 86 400 SI seconds (on days with no leap second).
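The step-offset behaviour can be sketched with a toy simulation (the constant 2 ms/day excess is an assumption for illustration; the real figure varies year to year):

```python
# Toy model: leap seconds as step offsets on an otherwise fixed-rate clock.
EXCESS_MS_PER_DAY = 2.0   # assumed constant excess of the mean solar day over 86400 s
THRESHOLD_MS = 900.0      # leap seconds keep |UT1 - UTC| below 0.9 s

offset_ms = 0.0           # running UT1 - UTC
leap_seconds = 0
for day in range(3650):   # ten years
    offset_ms -= EXCESS_MS_PER_DAY        # UTC runs ahead of the slowing Earth
    if abs(offset_ms) > THRESHOLD_MS:
        offset_ms += 1000.0               # insert one leap second: a step, not a rate change
        leap_seconds += 1

print(leap_seconds)  # 7 leap seconds in ten years, i.e. about one every 500 days
```

Note that the slope of the error is untouched: each leap second only resets the accumulated offset, so the next one is needed about 500 days later.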
In practice the linear model doesn't work at all in the short term: there's lots of year-to-year variation, and 1.5–2 ms/day/century is only a long-term average. See @David Hammen's answer for a nice graph and more details. He commented:
The chaotic short-term variation dominates over any period short enough to ignore the average slowdown.
More details from the US Naval Observatory's Leap Second article
The SI second ($9\,192\,631\,770$ cycles of the caesium-133 hyperfine transition) was chosen to be $1/31\,556\,925.9747$ of the tropical year 1900.
Note the units of that slowdown measurement: it's ms per day per century, or $\Delta s / s / s$, like an acceleration, not a velocity. And definitely not 1.5 ms per century.
Purely coincidentally, a mean solar day is currently about 2 ms longer than an SI day, so the current error-accumulation rate is about 2 ms per day. It's been about one century since the defining epoch for the SI second, and at that rate it takes only about 500 days to accumulate enough error to need another leap second. (There are various effects that make solar days differ in length, but on average they're longer than 24 h and getting even longer.)
In another century from now (with constant deceleration of the Earth), we'll need to add leap seconds about twice as often as we do now, to keep the cumulative difference UT1 − UTC below 0.9 seconds.
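Under the constant-deceleration assumption, the doubling of the leap-second rate follows from the linear growth of the excess day length. A back-of-the-envelope sketch (all numbers are the rough figures from the text):

```python
# With a constant ~2 ms/day/century slowdown, the excess day length grows
# linearly, so the interval between leap seconds shrinks over time.
SLOWDOWN_MS_PER_DAY_PER_CENTURY = 2.0
DAYS_PER_CENTURY = 36525

def excess_ms_per_day(days_from_now: float) -> float:
    """Excess of the mean solar day over 86400 SI seconds, in ms."""
    return 2.0 + SLOWDOWN_MS_PER_DAY_PER_CENTURY * days_from_now / DAYS_PER_CENTURY

days_per_leap_now = 1000.0 / excess_ms_per_day(0)                    # 500 days
days_per_leap_future = 1000.0 / excess_ms_per_day(DAYS_PER_CENTURY)  # 250 days

print(f"now: one leap second every {days_per_leap_now:.0f} days")
print(f"in a century: one leap second every {days_per_leap_future:.0f} days")
```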