Anniversary Post: Unix Time

On this day in 1970, Unix Time began. Well, not exactly. It was just a convenient point to set as an epoch. And all Unix Time provides is the number of seconds since that time. It is coming up on 1.5 billion seconds. It also provides fractions of a second. In its first implementation, it subdivided seconds into 60 parts. But when I was first cutting my Unix teeth on a couple of Sun workstations in the late 1980s, Unix Time was divided into millionths of a second.
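
If you want to see it for yourself, this is roughly what asking for Unix Time looks like on a modern POSIX system. The time() call gives the whole seconds, and gettimeofday() tacks on the microseconds:

    /* A minimal sketch of reading Unix Time on a POSIX system:
       whole seconds from time(), microseconds from gettimeofday(). */
    #include <stdio.h>
    #include <time.h>
    #include <sys/time.h>

    int main(void)
    {
        time_t now = time(NULL);        /* seconds since 1970-01-01 00:00:00 UTC */
        struct timeval tv;
        gettimeofday(&tv, NULL);        /* seconds plus microseconds */

        printf("Seconds since the epoch: %lld\n", (long long)now);
        printf("With microseconds: %lld.%06ld\n",
               (long long)tv.tv_sec, (long)tv.tv_usec);
        return 0;
    }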

All computer systems have their own epochs. MS-DOS had its epoch exactly a decade after Unix, on 1 January 1980. COBOL has an epoch of 1 January 1601. Microsoft C++ 7.0 had an epoch of 31 December 1899. VMS had an epoch of 17 November 1858. And MATLAB’s epoch falls in 1 BCE. In other words: epochs are totally arbitrary. And that’s great. I’m surprised that some smart-aleck computer geek hasn’t set an epoch thousands of years in the future just so we could have negative time stamps.
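
To see just how arbitrary an epoch is, note that converting between two of them is nothing more than adding or subtracting a constant. Here is a little sketch using the MS-DOS epoch; the 315,532,800 is just the number of seconds from 1 January 1970 to 1 January 1980, and anything earlier than 1980 comes out negative:

    /* Sketch: an epoch is just an offset. Seconds since the MS-DOS epoch
       are Unix seconds minus a constant (3,652 days * 86,400 seconds). */
    #include <stdio.h>
    #include <time.h>

    #define DOS_EPOCH_OFFSET 315532800LL   /* Unix time of 1980-01-01T00:00:00Z */

    int main(void)
    {
        long long unix_now = (long long)time(NULL);
        long long dos_now  = unix_now - DOS_EPOCH_OFFSET;
        printf("Unix-epoch seconds: %lld\n", unix_now);
        printf("MS-DOS-epoch seconds: %lld\n", dos_now);
        return 0;
    }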

But given all this epoch madness, why did we have the great Y2K scare of 16 years ago? Now I know the main issues. There were computer shortcuts taken to reduce memory usage. And there was lots of binary-coded decimal, which I’ve never understood. I mean, really: do everything in binary and then convert to decimal when humans have to look at it. It makes no sense to represent numbers in this way on the hardware level. And I know that binary-coded decimal was especially used for time and date functions.
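
To be fair about what BCD actually is: each decimal digit gets its own four-bit nibble, so the year 57 is stored as the byte 0x57 rather than as the binary value 57 (which is 0x39). Here is a little sketch; the helper names are just mine:

    /* Packing and unpacking a two-digit year as binary-coded decimal.
       Each decimal digit occupies one 4-bit nibble. */
    #include <stdio.h>

    unsigned char to_bcd(unsigned int year)        /* 0-99 -> one BCD byte */
    {
        return (unsigned char)(((year / 10) << 4) | (year % 10));
    }

    unsigned int from_bcd(unsigned char bcd)       /* BCD byte -> integer */
    {
        return (unsigned int)((bcd >> 4) * 10 + (bcd & 0x0F));
    }

    int main(void)
    {
        unsigned char bcd = to_bcd(57);
        printf("Binary 57 = 0x%02X, BCD 57 = 0x%02X, decoded = %u\n",
               57, bcd, from_bcd(bcd));
        return 0;
    }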

But I do think that Y2K was vastly overblown. Yes, mission-critical computers should have been thoroughly checked out. Any chance of a computer firing a missile is unacceptable. But it was mostly just something for reporters to talk about. Still, Unix Time is cool. At least I can remember when it is. I will have forgotten all about VMS time by the time you read this.

15 thoughts on “Anniversary Post: Unix Time”

  1. I disagree. Y2K _wasn’t_ as Big a Deal as some of the stupider movies made it out to be, but it was a Big Deal nonetheless.
    Essentially, programs written with two-digit years had a fundamental assumption baked in, one that was obvious when you thought about it but was ignored for decades at a time: that January 1, 2000 was never going to happen. I was working at a medical software company in the mid-90s, and sadly our product had originally been designed (years before) with YYMMDD dates, stored as BCD. Failure to prepare for Y2K would not have killed anybody – it’s not _that_ kind of medical software – but it would have put all of our clients out of business, as billing would have come to an abrupt halt.

    What truly pisses me off is that so many of the idiot reporters who were screaming that the Sky Was Falling reversed themselves and treated Y2K as a hoax. It wasn’t; they’d exaggerated some of the effects, but it truly would have messed up a lot of stuff if it hadn’t been fixed. And a hell of a lot of people put in a hell of a lot of hours to make sure that didn’t happen. But that story doesn’t sell clicks, I suppose.

    The standard date/time libraries we take for granted now only became standard in the early 70s; anyone who wrote a new system today using BCD to store dates would be guilty of malpractice*, but at the time it was How Things Were Done. Chances are good that, if COBOL was your first language and you cut your teeth on Big Iron, BCD seemed like the natural way to deal with dates.

    *It still happens, of course. I deal with a couple of large EHR systems that have largely been written by autodidacts who re-invent the wheel at every opportunity; some of the things they do with date logic would make your eyes water.

    • I agree with you. There were real problems — as I documented. And it was a good thing that the computer community dealt with the problem. But it was never the kind of, as you put it, “the sky is falling” problem. And I’m just as annoyed at the people who claimed there was nothing to worry about after the fact. The reason nothing much went wrong was that people spent an enormous amount of time on the problem. But proper programming eliminates these problems. This is why I don’t consider myself a real programmer. I can hack together just about anything. I’m good at getting prototypes running. But when I want something done right, I hire a real programmer. Because they do things the right way — not just whatever way gets it working for today.

      • The problem, I think, is that there was a paradigm shift somewhere there in the late 60s/early 70s – on one side of that shift, best practices dictated that date logic should be done the way that people had done it for centuries (hence BCD, ’cause doing traditional date logic in hex is _hard_), and on this side it became natural and right and proper to think in terms of (fractional) seconds since an epoch. I don’t know why the shift happened – maybe an influx of programmers with roots in science rather than business? – but I can’t really blame the guys on the other side of that shift for thinking that BCD was the way to go; it’d be like blaming pre-Copernicans for failing to integrate heliocentrism into their calculations. And, given BCD and the price of storage at the time, Y2K was practically inevitable… in the 50s and 60s. And the inertia of backward compatibility pretty much guaranteed it would stay that way, once a system was built.
        What I do NOT understand is how professional developers in the 80s and early 90s could bake a Y2K problem into _new_ code – and yet I have personal experience of dozens of systems where it happened. That, to me, is the real analogy with global warming.

        • Well, I’m still just mad about having to learn all that when I took digital electronics. But it’s odd that one of the reasons there was a problem was that coders were trying to save memory. And then we had other coders who were egregiously wasting memory with BCD. I don’t know if you were serious about hex time and date functions being hard. They aren’t — except for people who don’t feel comfortable with hex.

          Fundamentally, the problem was that no one thought the stuff they were doing was going to last. I think the early programmers were like me: just trying to get stuff to work. It was only later that what I call the real programmers came and did things properly. I greatly respect those people. Just the same, I don’t really want to hang out with them. :-)

              • It must be a locality thing or age; most of the men I know out here around my age who are programmers have had at least one relationship and/or kids. Heck, the *blanked out to avoid scaring the viewers* this past NYE is a programmer who has a six-year-old daughter.

                I was thinking of my own father, though since he served in the navy, maybe he doesn’t really count? He had three marriages and seven daughters. Six if you only count the biological ones we know about.

                • Oh, I was kidding. Most of the computer science graduate students I knew were pretty good with the ladies. The undergraduates were a clueless lot, though.

                  • *laughs* And here I go taking you seriously, since you live near Silicon Valley, although those guys seem to be mostly dudebros these days.

                    • Well, there’s a distinction. Someday I may write something about the ecosystem of computer programmers.

  2. Well, it seemed to make sense at the time … Two-digit BCD dates were the thing back in the 1960s. A lot of input data was on paper cards, remember, and asking a punched-card typist to type in “1957” rather than “57” seemed like an unreasonable imposition — as well as wasting two precious spaces on that little card and increasing the chance of errors. Also, some machines were character-oriented, such as the IBM 1401, the 1440, the 7010, etc.

    Also also, a good chunk of “data processing” back then involved mechanically sorting and merging piles of cards on tab equipment, outside the computer. Inevitably some cards DID get punched, spindled, or mutilated, which meant that the computer operator had to retype those cards on the handy 026 at the back of the operating room. And working with character-oriented data was so much easier than binary would have been.

    Later on … in the 1950s and 60s there were a BUNCH of computer manufacturers (Burroughs, Univac, NCR, Control Data, Honeywell) besides IBM, with incompatible architectures and programming languages. Companies using computers switched from one type to another rather frequently. It wasn’t till the System/360 and JCL (1966) and working operating systems (Unix, 1969) came out that things started to get standardized. So a lot of programmers trained before 1980 or so took it for granted that any programs they wrote would be superseded in a decade or less. It wasn’t till the 1980s that most of us really grasped that the programs we were batting out so casually might actually still be in use decades in the future.

    • I don’t think punch card operators would be too put out by 1957 over 57. But the point is that storage was a major issue: 1957 requires 11 bits of storage, while 57 in BCD requires 8 bits.
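
      Here’s a quick check of that arithmetic; bits_needed is just a throwaway helper of mine:

          /* Checking the storage claim: a binary 1957 versus BCD years. */
          #include <stdio.h>

          static unsigned bits_needed(unsigned n)   /* bits to hold n in binary */
          {
              unsigned bits = 0;
              while (n) { bits++; n >>= 1; }
              return bits;
          }

          int main(void)
          {
              printf("1957 in binary: %u bits\n", bits_needed(1957));  /* 11 */
              printf("  57 in binary: %u bits\n", bits_needed(57));    /*  6 */
              printf("  57 in packed BCD: 8 bits (two 4-bit digits)\n");
              printf("1957 in packed BCD: 16 bits (four 4-bit digits)\n");
              return 0;
          }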

      But a lot of the problem code dates from the 1970s and even the 1980s — not the 1950s. When did people start to get serious about this kind of stuff? The Mythical Man-Month came out in 1975. And by then, good coding practice was very well established. I think by the mid-1960s, people knew what to do. Of course, a lot of early tech was done by people who didn’t know any better.

      • I think the BCD obsession is a little spurious. The Mythical Man-Month came out in 1975, but I think, like good books in any field, it actually had very little practical impact. Yes, people love to reference it, but schedule and money pressures always push out quality considerations.

        Also, good coding practices were not well established then (and aren’t now). That was the era of the Structured Programming craze. As Donald Knuth said in an ACM Computing Surveys article in response to Dijkstra’s gotos-considered-harmful letter, good programmers had always written well-structured programs in any programming language, including assembly language. An unstated implication that I would add is that average programmers write average and below-average code. What Knuth had to say is still true 40 years later. I sometimes browse StackOverflow and the number of inane questions and misdirected responses from software developers scares me; these people could be responsible for writing mission-critical software and they don’t really know how to program. (There are, of course, some seriously good questions and answers on StackOverflow.)

        “Real programmers” is a vague concept. When I graduated from college and began to work in 1982, I got to work with code written by our manager’s famed Tiger Team that predated my arrival. I was not impressed. The most well-structured program (excepting my own! :) I have ever worked with was written by a graduate student for his Ph.D. research in the late 1970s, in Fortran IV, with no structured programming constructs such as IF-THEN-ELSE, etc. It was well-structured both code-wise and in its use of data structures.

        Unix timeval structures have a seconds field (traditionally 4 bytes) and a microseconds field (traditionally 4 bytes). On 64-bit architectures, that microseconds field (range: -999,999 to +999,999) is probably stored in an 8-byte integer. VMS times were, from the beginning, 8-byte integers representing the number of 100-nanosecond units (IIRC) since the epoch and, because of the lack of wasted space, could therefore represent much longer time periods than the old Unix timevals and with greater precision.
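
        Roughly, the two layouts being compared look like this (just a sketch; field widths vary by platform, and VMS counts 100-nanosecond units from its 1858 epoch rather than microseconds from 1970):

            /* The classic Unix pair of fields versus a single 64-bit count. */
            #include <stdio.h>
            #include <stdint.h>
            #include <sys/time.h>

            int main(void)
            {
                struct timeval tv;                 /* tv_sec + tv_usec */
                gettimeofday(&tv, NULL);

                /* Collapse the pair into one 64-bit microsecond count,
                   which is the same idea as the single VMS quadword. */
                int64_t usec = (int64_t)tv.tv_sec * 1000000 + tv.tv_usec;

                printf("tv_sec=%lld tv_usec=%ld\n",
                       (long long)tv.tv_sec, (long)tv.tv_usec);
                printf("as one 64-bit count: %lld microseconds since 1970\n",
                       (long long)usec);
                return 0;
            }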

        In the early 1980s, I worked on a program that would work before and after 2000, but would have a problem if a single satellite pass began on New Year’s Eve 1999 before the ball dropped and ended on January 1, 2000 after the ball dropped. Our brilliant systems engineer (and he was brilliant) laughed and assured me that the customer wouldn’t schedule a pass on New Year’s Eve. So I think that back then the Y2K problem was not really on the radar of most programmers.

        Memory and data storage were a problem for a long time, which could be magnified if you had large databases. (Although I’ve long had a problem with people blindly using binary file formats when text files would be so much easier to work with and just as efficient.) Circa 1980, on our Univac mainframe computers at the University of Maryland, the graduate students who needed to use more than 64 KBytes (that’s correct) of memory had to (i) get special permission and (ii) run their programs at night when the load was less. (Regarding punch-card typists, you also ran into the problem of fitting the required number of fields on a card, in which case 2-digit years could help.)

        Also, don’t forget the cultural prevalence of 2-digit years and their natural carryover into computer applications. Some Victorian novels I’ve read used 2-digit years back in the 19th century. So there was also cultural inertia at work.

        Getting back to BCD, the complex-instruction-set VAX had a fast BCD-to-integer conversion instruction implemented in hardware (or microinstructions). I was able to use it as part of an ASCII-to-floating-point conversion routine in the early 1980s to cut the daily run time of a production FORTRAN program from 7 hours down to 20 minutes.
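
        In software, the conversion that instruction performed looks something like this sketch (ignoring the sign nibble that real packed-decimal operands carry):

            /* Packed decimal to integer: two BCD digits per byte,
               most significant digits first. */
            #include <stdio.h>
            #include <stddef.h>

            long packed_bcd_to_long(const unsigned char *bcd, size_t ndigits)
            {
                long value = 0;
                for (size_t i = 0; i < ndigits; i++) {
                    unsigned char byte  = bcd[i / 2];
                    unsigned char digit = (i % 2 == 0) ? (byte >> 4)
                                                       : (byte & 0x0F);
                    value = value * 10 + digit;
                }
                return value;
            }

            int main(void)
            {
                const unsigned char year[2] = { 0x19, 0x57 };   /* packed 1957 */
                printf("%ld\n", packed_bcd_to_long(year, 4));   /* prints 1957 */
                return 0;
            }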

        Marc above mentioned date logic that would make your eyes water; I’m not sure whether he means the logic or the coding would make your eyes water. Date logic is hard (much like floating-point representation and arithmetic, and Unicode text processing; see, for example, normalization forms for abstract characters that have multiple possible code-point sequences) because it is complex. (About the Unicode normalization forms: Twitter uses one of them to determine whether a message meets or exceeds their 140-character limit.) For a small taste of date logic complexity, see Erik Naggum’s 1999 article, “The Long, Painful History of Time.” (And Naggum notes that robust, binary time formats are not necessarily less verbose than textual formats.)

        Sorry for the long comment! I guess your post hit a lot of nerves for me! Which is good because it gets me thinking, but bad for you if you force yourself to wade through it!

        • I totally blew it. This interface for responding to comments is really convenient, but it has glitches that allow one wrong click to destroy hundreds of words. Let me summarize:

          1. Knuth: I still use TeX for all the screenplays I write. I love it. (Of course, I also use vi for all my non-blog editing.)
          2. Real programmers: people who design, pick the best algorithms, and so on. I’m great at writing prototypes because I can make things work. But I’m not a real programmer, even though I’m very good in my way.
          3. I remember a lot of people freaking out about Y2K. Although I’ll admit: it wasn’t mostly programmers.
          4. This one struck a nerve for a lot of people. I was just babbling — and venting about BCD. I just remember having to convert BCD to straight binary and I didn’t like that. I think even at the time, it was a skill that wasn’t of much use. Maybe it’s used in calculators? Do people even have calculators anymore?
