Saturday, December 07, 2019

Warning: The Year 2038 bug is ahead! (English)


To be more precise: 19 January 2038 at 03:14:07 UTC.

It’s 31.12.1999, and 9-year-old me is sitting in my grandma’s living room, waiting to watch the millennium countdown and fireworks on TV with my little sisters, but secretly waiting to see what is going to happen with the Y2K bug (the year 2000 bug). Are planes really going to fall from the sky? Are elevators going to stop working and nuclear plants going to explode?


Data takes up space. If you can represent data in two different ways, say the year ‘1990’ versus just ‘90’, you will surely choose the more efficient one. Why waste four digits when two will do? Up until the 90’s, many systems were indeed saving space by using only 2 digits to represent a year, setting the machines up for failure once we hit the year 2000: the stored year would roll over to ‘00’, making it indistinguishable from the year 1900. The world went nuts, and prevention and remediation teams were set up, working extra hours, at a total cost of 300 billion dollars for preparations and another 13 billion for damage remediation.
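To see why ‘00’ is ambiguous, here is a minimal sketch of a 1990s-style two-digit date parser (the `parse_year` helper is hypothetical, not taken from any real system):

```python
# A hypothetical 1990s-style parser that stores only two year digits
def parse_year(two_digits: str) -> int:
    # The century is assumed, not stored - that assumption is the entire bug
    return 1900 + int(two_digits)

print(parse_year("90"))  # 1990 - works fine in the 20th century
print(parse_year("00"))  # 1900 - but the year 2000 also reads as 1900
```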

Unix was originally developed in the late 60’s and early 70’s, and when its engineers had to decide how to represent time, they chose to count how many seconds have passed since midnight on 1.1.1970 (UTC) - the Unix epoch. To this day, this is one of the main methods of representing universal time: it gives a single representation of time no matter where in the world you are, which is super useful for computer systems that are dynamic and distributed - like most are today.
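The same Unix timestamps are exposed in most languages; here is a quick Python sketch:

```python
import time
import datetime

# Current Unix timestamp: whole seconds elapsed since the epoch
now = int(time.time())

# Timestamp 0 is the epoch itself: midnight, January 1, 1970, UTC
epoch = datetime.datetime.fromtimestamp(0, datetime.timezone.utc)
print(epoch)  # 1970-01-01 00:00:00+00:00
```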

The timestamp is stored as a signed (positive or negative) 32-bit binary integer, which can represent 2^32 distinct values - 4,294,967,296 of them. Since it’s a signed integer, the actual values can be any number between -2,147,483,648 (time before 1.1.1970) and 2,147,483,647 (time since). This seems like quite a lot - and surely the Unix engineers thought so too when they chose to set the “zero time” to be 1.1.1970. But those two billion one hundred forty-seven million four hundred eighty-three thousand six hundred forty-seven seconds ARE going to run out on 19 January 2038 at 3:14 am UTC. Sounds familiar?
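You can check that date yourself by converting the largest signed 32-bit value back into a calendar date:

```python
import datetime

INT32_MAX = 2**31 - 1  # 2,147,483,647 - the largest signed 32-bit value

# The last second a 32-bit signed Unix clock can represent
last_second = datetime.datetime.fromtimestamp(INT32_MAX, datetime.timezone.utc)
print(last_second)  # 2038-01-19 03:14:07+00:00
```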

You know how you download a file and need to choose between a 32-bit version and a 64-bit version? This reflects the architecture of your computer’s CPU - the processor. The processor’s most basic memory units are the registers, and their size determines how much data can be stored in one chunk. A 32-bit architecture implies 32-bit registers, which limits how much data each can hold - which means that without migrating to a 64-bit representation, developers cannot keep using Unix timestamps past 2038. Many systems today are 64-bit, but all computers that by January 2038 still use 32-bit Unix time will *overflow* - meaning the counter will run out of usable value bits and flip the sign bit instead, causing errors and bugs.
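The overflow itself can be simulated by keeping only the low 32 bits of the counter, the way a 32-bit register would, and reinterpreting the bit pattern as a signed value (a Python sketch; a real 32-bit system would do this implicitly in hardware):

```python
import struct
import datetime

INT32_MAX = 2**31 - 1

# One second past the maximum: keep only the low 32 bits (what a
# 32-bit register holds), then reinterpret that pattern as signed.
raw = (INT32_MAX + 1) & 0xFFFFFFFF            # 0x80000000
wrapped, = struct.unpack('<i', struct.pack('<I', raw))
print(wrapped)  # -2147483648 - the sign bit has flipped

# A 32-bit system would now believe the date is back in 1901
print(datetime.datetime.fromtimestamp(wrapped, datetime.timezone.utc))
# 1901-12-13 20:45:52+00:00
```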

Who is at risk? Mostly embedded systems that are meant to last decades. Since most modern computers ship with a 64-bit architecture, the systems with the potential for issues are old ones: transportation systems, equipment like sensors and control systems, computers embedded in machinery, and others - much of their software cannot simply be upgraded. In reality, many 32-bit systems will be completely replaced in the time remaining before the Y2038 bug hits - or we will all be eating popcorn, waiting for planes to come crashing from the skies in 19 years.
