Y2K, The Unix Epoch, And Humans’ Millennia-long Struggle To Measure Time

Is the current year 5782, 1443, or 2021?

It depends on who you ask.

I just finished week 5 of my software development training at Tech Elevator and I have become fascinated by how humans and computers measure time.

UTC and the Unix Epoch

The current international standard for time is Coordinated Universal Time (UTC). UTC was first established on January 1st, 1960, but the current version of the system was adopted in 1970 and went into effect in 1972.

Thursday, January 1st, 1970 at 00:00 UTC is known as the “Unix Epoch”.

Basically, Unix Epoch Time (usually just called Unix time) is the number of seconds that have elapsed since that arbitrary date. So time can be measured with this one standard approach, right?
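
That count is easy enough to grab in Java, which I happen to be learning. Here’s a minimal sketch using the built-in java.time.Instant class (the class name below is just my placeholder):

```java
import java.time.Instant;

public class UnixTimeDemo {
    public static void main(String[] args) {
        // Seconds elapsed since 1970-01-01T00:00:00Z, the Unix Epoch
        long unixSeconds = Instant.now().getEpochSecond();
        System.out.println("Seconds since the Unix Epoch: " + unixSeconds);

        // Unix time zero is the epoch itself
        System.out.println(Instant.ofEpochSecond(0)); // 1970-01-01T00:00:00Z
    }
}
```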

But measuring time isn’t quite that easy. Let’s back up for some context.

Greenwich Mean Time

Before UTC, there was a standard you may have heard of, “Greenwich Mean Time” (GMT). GMT’s day started when the sun was at its highest point above the Royal Observatory in Greenwich, London. Once again, a completely arbitrary decision.

In 1884, the International Meridian Conference in Washington, D.C. adopted a modified version of GMT to define the “Universal day”. The push for a single standard came largely from the international boom of the railroad industry: trains needed to arrive and depart on schedule, but local solar time shifts with longitude, so every city keeping its own sun-based time was a scheduling mess.

The Royal Observatory in Greenwich, London

The Sun, Moon, and Big Rocks

And of course before GMT, there were countless systems of organizing time. Most systems measured days, months, and years based on solar and lunar cycles.

Warren Field in Scotland, at approximately 10,000 years old, is widely considered the earliest known calendar; Stonehenge is only about 4,000 years old.

Fun fact: the Babylonians are often credited with creating the “week”. They picked the number seven because they observed seven celestial bodies: the Sun, the Moon, Mercury, Venus, Mars, Jupiter, and Saturn.

A Brief Clarification

Technically, UTC replaced Greenwich Mean Time as the international standard back in 1960. However, GMT lives on as a time zone within the UTC system.

In fact, GMT is UTC±00:00.

  • New York is UTC-05:00

  • Los Angeles is UTC-08:00
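
Here’s a quick Java sketch of what those offsets mean in practice: one and the same instant, rendered three ways. (Fixed offsets like these ignore daylight saving, which the real zone IDs such as America/New_York handle for you.)

```java
import java.time.Instant;
import java.time.ZoneOffset;

public class OffsetDemo {
    public static void main(String[] args) {
        // The same instant, displayed with three different UTC offsets
        Instant now = Instant.now();
        System.out.println(now.atOffset(ZoneOffset.UTC));          // GMT, i.e. UTC+00:00
        System.out.println(now.atOffset(ZoneOffset.of("-05:00"))); // New York (standard time)
        System.out.println(now.atOffset(ZoneOffset.of("-08:00"))); // Los Angeles (standard time)
    }
}
```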

Back to 2021

Right now, there are a bunch of time-keeping nerds who want to reform UTC. Here’s why.

Every four years there’s a leap year, right? (Almost: century years like 1900 only count if they’re divisible by 400.) Leap years keep the calendar year in step with the Earth’s orbit, so the seasons don’t slowly drift across the calendar.
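
Java’s built-in java.time.Year class knows the full rule, so you don’t have to memorize it. A quick sketch (the class name is just my placeholder):

```java
import java.time.Year;

public class LeapYearDemo {
    public static void main(String[] args) {
        // Gregorian rule: divisible by 4, unless it's a century year
        // that isn't divisible by 400
        System.out.println(Year.isLeap(2020)); // true
        System.out.println(Year.isLeap(1900)); // false (century year, not divisible by 400)
        System.out.println(Year.isLeap(2000)); // true  (divisible by 400)
    }
}
```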

However, leap seconds have been added 27 times since 1972. They’re always added to the world’s clocks at 23 hours, 59 minutes and 59 seconds UTC, on either June 30 or December 31. It’s quite a contentious issue.

Leap seconds need to be approved by the International Earth Rotation and Reference Systems Service (IERS) at least six months in advance. The last leap second was added on December 31st, 2016.

Leap seconds are added because the Earth’s rotation is slightly irregular, while atomic clocks tick at a nearly perfect constant rate. In short, atomic time is steady but the planet is not, so UTC has to be nudged now and then to stay in line with the Earth’s natural day. Those accumulated adjustments are why there is currently a 37-second gap between UTC and International Atomic Time (TAI), which never adds leap seconds.
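
This is also why Unix time quietly pretends leap seconds don’t exist. A small sketch in Java: the two instants below were actually two seconds apart on the real clock (the leap second 23:59:60 was inserted between them), but java.time, like Unix time, reports only one.

```java
import java.time.Duration;
import java.time.Instant;

public class LeapSecondDemo {
    public static void main(String[] args) {
        // The last second of 2016 and the first second of 2017, in UTC.
        // A leap second (23:59:60) was inserted between them in the real world.
        Instant before = Instant.parse("2016-12-31T23:59:59Z");
        Instant after = Instant.parse("2017-01-01T00:00:00Z");

        // Unix time (and java.time) skip right over it: the gap reports as 1 second
        System.out.println(Duration.between(before, after).getSeconds()); // 1
    }
}
```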


The Case Against Leap Seconds

The International Telecommunication Union (ITU), the United Nations agency that governs some global issues related to time, has been debating whether to abolish the leap second for some time.

They argue that leap-second adjustments are disruptive to TV and radio broadcasts and to internet servers. It’s actually a big deal that can cause serious synchronization issues.

In 2015, the ITU voted to keep leap second adjustments. But they are slated to revisit the issue in 2023.

However, getting rid of leap seconds would let midnight drift, very slowly over thousands of years, toward the middle of the day. Stay tuned for the decision. (Will it be International Atomic Time? Or Very Long Baseline Interferometry? Or something else?)

Y2K

Here’s what happened in a nutshell.

For a long time after computers were invented, memory was extremely limited.

For example, the IBM 350 disk storage unit held fifty 24-inch disks with a total capacity of 3.75 megabytes.

The IBM 350 disk storage unit

0’s and 1’s

A “bit” is the smallest storage unit. It can be on or off. 0 or 1.

Back in the day, characters such as letters and numbers were represented by 8 bits, aka a “byte”.

Since each of the 8 bits in a byte can either be on or off, one byte allows for 256 unique characters. 2^8 = 256.

To conserve memory, early developers used 2 digits to represent the year instead of 4. For example, 70 instead of 1970.

When 1999 came around, many companies and governments feared that computers would interpret 00 as 1900 instead of 2000.
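
Here’s a toy Java sketch of the problem. Real legacy systems stored dates in all sorts of formats, so this is just an illustration of the rollover:

```java
public class TwoDigitYearDemo {
    public static void main(String[] args) {
        // Store only the last two digits of the year to "save memory"
        int stored = 1999 % 100;           // 99
        int nextYear = (stored + 1) % 100; // rolls over to 0

        // Naively re-attaching the century assumes every year is 19xx
        System.out.println(1900 + stored);   // 1999
        System.out.println(1900 + nextYear); // 1900 -- the Y2K bug in miniature
    }
}
```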

The Y2K bug ended up being much ado about nothing. Many companies updated their software in advance, and issues were not as widespread as feared.

The 2038 Problem

Let’s go back to Unix Epoch Time for a minute.

I secretly admire its mathematical perfection, despite its inability to represent the real world accurately. I was delighted to discover that there are celebrations of Unix billenniums. They’re basically New Year celebrations held every billion seconds after the Unix Epoch.

The first Unix billennium (Unix time 1,000,000,000) was celebrated at 01:46:40 UTC on Sunday, 9 September 2001.

The next billennium arrives at 03:33:20 UTC on Wednesday, 18 May 2033, when the Unix time value hits 2,000,000,000 seconds. I can’t wait!
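
You can double-check those dates with a couple of lines of java.time (a quick sketch):

```java
import java.time.Instant;

public class BillenniumDemo {
    public static void main(String[] args) {
        // Unix time values 1,000,000,000 and 2,000,000,000 as UTC instants
        System.out.println(Instant.ofEpochSecond(1_000_000_000L)); // 2001-09-09T01:46:40Z
        System.out.println(Instant.ofEpochSecond(2_000_000_000L)); // 2033-05-18T03:33:20Z
    }
}
```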

Unix time is commonly stored as a signed 32-bit integer. This means that after 2,147,483,647 seconds, the maximum positive 32-bit value will have been reached. One second later the sign bit flips from 0 to 1, the whole number becomes negative, and affected systems are effectively transported back in time to December 13th, 1901.

The 2038 Problem

Luckily, there are unsigned 32-bit integers, which will last until February 7th, 2106, and signed 64-bit integers, which will last until Sunday, December 4th of the year 292,277,026,596. That’s roughly 22 times the estimated current age of the universe. LONG LIVE UNIX EPOCH!
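
Here’s a small Java sketch of that wrap-around and of the ranges mentioned above, letting java.time translate the raw second counts back into dates:

```java
import java.time.Instant;

public class Year2038Demo {
    public static void main(String[] args) {
        // The largest Unix time a signed 32-bit integer can hold
        int maxSigned = Integer.MAX_VALUE;                       // 2,147,483,647
        System.out.println(Instant.ofEpochSecond(maxSigned));    // 2038-01-19T03:14:07Z

        // One more second and the sign bit flips: int arithmetic wraps negative
        int wrapped = maxSigned + 1;                             // Integer.MIN_VALUE
        System.out.println(Instant.ofEpochSecond(wrapped));      // 1901-12-13T20:45:52Z

        // An unsigned 32-bit counter would buy time until 2106
        long maxUnsigned = 4_294_967_295L;
        System.out.println(Instant.ofEpochSecond(maxUnsigned));  // 2106-02-07T06:28:15Z
    }
}
```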

How I Ended Up Down This Rabbit Hole

I’m currently learning the Java programming language. I’m building an events calendar that takes user input to create new events.

In Java, zoned dates are stored as ZonedDateTime objects with a format like “2007-12-03T10:15:30+01:00”. The digits after the “+” represent the offset from UTC, i.e. the time zone.
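
Here’s a simplified sketch of what that looks like (the class name and the sample event are placeholders, not my actual project code):

```java
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class CalendarEventDemo {
    public static void main(String[] args) {
        // A ZonedDateTime carries the local date and time, the UTC offset, and the zone
        ZonedDateTime event = ZonedDateTime.of(
                2021, 12, 3, 10, 15, 30, 0, ZoneId.of("America/Los_Angeles"));
        System.out.println(event); // 2021-12-03T10:15:30-08:00[America/Los_Angeles]

        // Parsing the ISO-8601 text form works too
        ZonedDateTime parsed = ZonedDateTime.parse("2007-12-03T10:15:30+01:00[Europe/Paris]");
        System.out.println(parsed.getOffset()); // +01:00
    }
}
```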

I’m still not sure how computers compensate for leap seconds. I guess humans have to reboot and reconfigure the time? Why can’t we live in a Unix Epoch utopia?

Kindly,

Garrett

Garrett John Law

I’m a Los Angeles-based software engineer and musician.

https://garrettjohnlaw.com