The reasons for choosing length, mass, time, temperature, and amount as base quantities look (at least to me) obvious. What I'm puzzling about is why current (as opposed to resistance, electromotive force, etc.) and luminous intensity (as opposed to illuminance, emittance, etc.) were chosen to be base quantities. Does it have something to do with them being the easiest to measure?
SI Units – Why Were the SI Base Quantities Chosen as Such?
electricity, history, metrology, si-units
Related Solutions
At least one common quantity I can think of has dimension with a non-integer exponent. The specific detectivity, $\text{D}^*$ is a common descriptor of photodiodes, and I'm sure one could make an analogous figure of merit for other types of sensor.
The unit of $\text{D}^*$ is the "Jones," which is equal to
$$\frac{\mathrm{cm} \cdot \sqrt{\mathrm{Hz}}}{\mathrm{W}}$$
The watt decomposes into SI base units as
$$\mathrm{kg} \cdot \mathrm{m}^2 \cdot \mathrm{s}^{-3},$$
which makes one Jones equal to
$$\frac{\mathrm{s}^{2.5}}{\mathrm{kg} \cdot \mathrm{m}} \times 10^{-2},$$
with dimension
$$\text{time}^{2.5} \, \text{mass}^{-1} \, \text{length}^{-1}.$$
There are other quantities related to $\text{D}^*$, such as Noise Equivalent Power (NEP), which come up a lot in radiometry. Basically, any measurement which is normalized by frequency bandwidth will end up with that $\sqrt{Hz}$ in the units.
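The decomposition above is easy to check mechanically by treating each unit as a map from base-unit symbols to (possibly fractional) exponents. This is a minimal sketch, not tied to any particular units library; the helper name `combine` is my own:

```python
from fractions import Fraction

def combine(*terms):
    """Multiply units, each given as a (unit_dict, power) pair.

    A unit is a dict mapping base-unit symbols to exponents;
    fractional exponents (like the sqrt(Hz) in the Jones) are fine.
    """
    out = {}
    for unit, power in terms:
        for sym, exp in unit.items():
            out[sym] = out.get(sym, Fraction(0)) + Fraction(exp) * power
    return {s: e for s, e in out.items() if e != 0}

metre = {"m": 1}
hertz = {"s": -1}
watt = {"kg": 1, "m": 2, "s": -3}  # W = kg * m^2 * s^-3

# Jones = cm * Hz^(1/2) / W; the factor 10^-2 from the "centi" prefix
# is kept out of the dimensional bookkeeping.
jones = combine((metre, 1), (hertz, Fraction(1, 2)), (watt, -1))

assert jones == {"m": -1, "s": Fraction(5, 2), "kg": -1}
print(jones)
```

The exponents come out as time^{5/2}, mass^{-1}, length^{-1}, matching the decomposition above.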
It's my understanding that the invention of the metric system during the turbulence following the French Revolution also included a switch to decimal time, with ten hours per day, etc., but that it didn't take. There's a certain amount of cultural inertia that has to be overcome; as you're probably aware, those of us in the United States still have many miles to go before we can fully adopt the metric system.
As you say, you have to give the anonymous inventors of the 24-hour day credit: while the metric approach of powers-of-ten relationships between units is dreadfully easy to handle when you're using base 10 arithmetic, it's quite difficult to divide ten things into three equal-size sets. Remember that base ten is essentially an arbitrary choice made because most people have ten fingers and spend their childhood grouping things into fives and tens to count them. Twenty-four has boatloads of divisors: you can separate it into a dozen pairs, three groups of eight, or six quartets. Sixty would make a pretty nice base, since it's the smallest number divisible by two, three, four, and five; but sixty is too many things for most people to count in their heads.
The second is actually historically based on the length of a year, not of a day: until the adoption of the cesium clock standard in 1967, the definition of the second was the fraction 1/31,556,925.9747 "of the tropical year 1900." It took roughly half a century for the standards committee to realize that we can't go back and re-run the year 1900 to see whether we're still producing correct seconds.
There are several things that the SI system does that don't quite make as much sense as you might like. Why on earth does the base unit for mass, the kilogram, have a prefix? Why is the base unit for electricity the ampere, when we've known for a century that charge occurs naturally in standard-sized lumps? I put the SI endorsement of the historical relationship between the second, the minute, the hour, the day, and the year in the same category. It's a convenient unit with strong historical and popular support. I don't see a need to decimalize the day.
Emilio Pisanty asks for references.
The Time Service Department at the U.S. Naval Observatory, which is responsible for inserting leap seconds (historically about every 500 days) to keep atomic time from slipping relative to the Earth's rotation, seems to be the source for the Wikipedia account of the history of the second, but does not cite additional sources.
The Bureau International des Poids et Mesures, which is the organization responsible for defining and revising the international system of units (SI), does not discuss the history of the second in its brief history of SI, and does not seem to have a page describing the history of the second in the same detail as the history of the meter.
The NIST/CODATA reference website contains historical background statements about the different fundamental units; the page for the second describes the shift from ephemeris time to the cesium standard as above.
An abandoned-looking, authorless site about units repeats the same story, but includes a handful of technical and non-technical references, including a 1958 article (Markowitz et al.) entitled "Frequency of Cesium in Terms of Ephemeris Time." This reference discusses plans at that time to move to the atomic standard.
The articles citing Markowitz et al. include a 2005 review in Metrologia entitled Atomic time-keeping from 1955 to the present, a much more detailed discussion with about four dozen technical references. A more recent review, Evolution of timescales from astronomy to physical metrology, seems from its abstract to offer a broader historical perspective.
For historical timekeeping systems and the decimal time adventure of the French Revolution I happened across Carrigan, "Decimal Time", 1978, which cites
For the division of the day in 24 hours by Egyptians, and the 60x60 subdivisions of the hour by Babylonians: O. Neugebauer, The Exact Sciences In Antiquity, Brown University Press, 1957.
For a catalog by Hipparchus (ca 140 BC) of stars whose rising is separated by one-hour intervals, accurate to about one minute: the "time" article in the 11th edition of the Encyclopaedia Britannica. The corresponding article in Britannica online is quite lengthy, but hidden behind a paywall for me.
For a medieval division of time into lit and dark "tides" (in English, "noontide" and "eventide"), each with twelve "hours" but only having equal length near the equinox: K. Welch, The History of Clocks and Watches, 1972.
For a similar Oriental system not supplanted until Western commerce became important in the 1800s: J. Arthur, Time and its measurement, 1909.
Old papers have old references! Carrigan observes that while weights and measures are important enough for commerce that many local standards arose more or less at once, early precise timekeeping would be complicated by the vagaries of travel by ship or by land. The engineering skill to build a clock with a useful second hand "preceded to some extent the need for standards of communication at small time intervals[, which] may have led to the universality of the present time system."
Best Answer
Historical reasons, I suppose; indeed, the current definition of the ampere is rather stupid (the force between two wires in vacuum) in light of the fact that it could be done with a number of elementary charges per second.
The candela is even worse, because it involves properties of the average human eye (the so-called "luminosity function"), so in principle it changes from moment to moment as people are born, die, and their eyes age (not to mention various eye/brain conditions and treatments).