I have seen dozens of different real-time clock chips on the market, as well as a number of processors with a built-in separately-powered real-time clock module.
Nearly all of them not only store time as year-month-day-hours-minutes-seconds, but also store the individual fields in BCD rather than binary format.
Is there some underlying reason for this?
Are there any microprocessor applications, beyond simply displaying a clock, where the BCD format is more useful than binary, or where year-month-day-hours-minutes-seconds format is more useful than a straight 47-bit count of oscillator state changes?
From what I can tell, RTCC makers add a lot of extra circuitry that makes their chips less useful; the only reason I can see for the RTCC modules built into processors to behave this way is that the processor vendors reuse a pre-existing BCD implementation rather than designing their own.