[LEAPSECS] operational time -- What's in a name?

Poul-Henning Kamp phk at phk.freebsd.dk
Fri Mar 28 12:04:49 EDT 2008


In message <20080328151905.GA25223 at ucolick.org>, Steve Allen writes:

>On Fri 2008-03-28T11:42:25 +0000, Poul-Henning Kamp hath writ:

>> Simply not true. A good place to start is the FreeBSD ports-tree

>> which contains about 18000 pieces of open source software.

>

>Is there a detailed inventory of how many of those break in 2038?


That's an interesting question.

Today, on a 32-bit system, they invariably all do, but running on a
64-bit system, surprisingly many seem to work correctly.
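
A minimal sketch of where that cliff is (illustration only; any
platform with a 64-bit time_t just keeps counting past this point):

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    /*
     * Print the last second a signed 32-bit time_t can represent;
     * a 64-bit time_t simply keeps counting past this point.
     */
    int
    main(void)
    {
        time_t last = (time_t)INT32_MAX;    /* 2147483647 */
        char buf[64];

        strftime(buf, sizeof(buf), "%Y-%m-%d %H:%M:%S UTC", gmtime(&last));
        printf("32-bit time_t ends at %s\n", buf);
        return (0);
    }

That prints 2038-01-19 03:14:07 UTC.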


>And how many break as binaries vs. how many would work if recompiled?


Well, all 32-bit binaries break (by definition), but I'm not sure
binary distribution of programs is much of a concern in this respect.

For the average software package, there are still a good 10 releases
before 2038.


>If the ITU-R cannot find a unilateral stance then it may be time to

>start thinking compromise.


I don't think it is as much about compromise as about capitulation to
reality: Even if we decided to fix time_t's little red wagon for
good, and got the economic resources to do so, we would be very
hard pressed to find the competent manpower to carry it out reliably.

And compared to humanity's other pending issues, it would be very
hard to justify the priority for this task.


>Is getting the leaps out of the system's time_t worth the other hassle?


If you compare it with the other crap we put up with from computers,
and the sheer mind-bogglingness of the workarounds people put up
with, I am sure that the disappearance of leap seconds would not
even register on the public's radar.


My personal preference would be that we create a new definition
of time representation for computers, preferably in a binary format
so the math gets faster and less buggy.

Unfortunately, a 64-bit arithmetic type is not enough to give us both
range and resolution.
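
A quick back-of-the-envelope to show why (the year length is
approximated as 365.25 days):

    #include <stdio.h>
    #include <stdint.h>

    /*
     * At 1 ns resolution, a signed 64-bit counter spans only about
     * +/- 292 years around its epoch; at 1 s resolution the range is
     * enormous but the resolution is useless for timing.
     */
    int
    main(void)
    {
        double ns_per_year = 1e9 * 86400 * 365.25;

        printf("+/- %.0f years\n", (double)INT64_MAX / ns_per_year);
        return (0);
    }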

Some have proposed a dual-mode format, technically a "one-bit
exponent floating point" format, that trades resolution for range
with a mode bit:

Calendar: 55i.8f + '0' (1.1 billion years, 5 ms resolution)
Time:     33i.30f + '1' (272 years, approx. 1 ns resolution)
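
A sketch of how that packing could look; the placement of the mode
bit in the MSB is an assumption, only the field widths above are
given:

    #include <stdint.h>

    #define MODE_TIME   (1ULL << 63)    /* assumed: mode bit in the MSB */

    /* mode '0': calendar, 55 integer bits, 8 fraction bits */
    static uint64_t
    pack_calendar(uint64_t sec, uint32_t frac8)
    {
        return ((sec & ((1ULL << 55) - 1)) << 8 | (frac8 & 0xff));
    }

    /* mode '1': time, 33 integer bits, 30 fraction bits */
    static uint64_t
    pack_time(uint64_t sec, uint32_t frac30)
    {
        return (MODE_TIME | (sec & ((1ULL << 33) - 1)) << 30 |
            (frac30 & ((1U << 30) - 1)));
    }

    /*
     * Every consumer has to test the mode bit before it can do any
     * arithmetic -- which is the efficiency and error-detection cost
     * argued against below.
     */
    static uint64_t
    whole_seconds(uint64_t t)
    {
        return ((t & MODE_TIME) ? (t & ~MODE_TIME) >> 30 : t >> 8);
    }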

But I would advocate against that, on both efficiency and
error-detection grounds.

Another option is to have two different types:

calendar_t 54i.10f (570My / 1ms)
utc_t 34i.30f (544y / 1ns)

Provided the consequences of accidentally mixing them are painful enough, that can work.
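
A sketch of what "painful enough" could mean in C: wrap each format
in its own struct, so that mixing them is a compile-time error rather
than a silent unit mistake (the struct wrapping and the names are
illustrative; the field splits are the ones above):

    #include <stdint.h>

    typedef struct { uint64_t bits; } calendar_t;   /* 54i.10f */
    typedef struct { uint64_t bits; } utc_t;        /* 34i.30f */

    /*
     * The only way from one type to the other is an explicit, visibly
     * lossy conversion: dropping 20 fraction bits goes from ~1 ns to
     * ~1 ms resolution.  "calendar_t c = some_utc;" does not compile.
     */
    static calendar_t
    utc_to_calendar(utc_t t)
    {
        calendar_t c = { t.bits >> 20 };

        return (c);
    }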

My personal preference would be to bite the bullet and live with
the 128-bit memory hit:

utc_t 64i.64f (big enough, small enough)
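
Carried as two 64-bit words, that could look like this sketch (the
names and the helper are illustrative):

    #include <stdint.h>

    struct utc_t128 {
        uint64_t sec;       /* whole seconds since the chosen epoch */
        uint64_t frac;      /* fraction of a second, in units of 2^-64 s */
    };

    /* Addition with carry from the fraction word into the seconds word. */
    static struct utc_t128
    utc_add(struct utc_t128 a, struct utc_t128 b)
    {
        struct utc_t128 r;

        r.frac = a.frac + b.frac;
        r.sec = a.sec + b.sec + (r.frac < a.frac);  /* carry on wrap */
        return (r);
    }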

Provided we get 10 years' notice of leap seconds, that timescale
can contain leap seconds. If we don't get at least 10 years' notice,
it should not suffer from them.

Poul-Henning

--
Poul-Henning Kamp | UNIX since Zilog Zeus 3.20
phk at FreeBSD.ORG | TCP/IP since RFC 956
FreeBSD committer | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.

