Michael Chaney

COBOL, Dates, May 20, 1875, and Disinformation

Let me start at the end for those who don't want to read the whole thing. COBOL does not have a "default date" or any kind of default value for a missing date. COBOL doesn't even have a "date" data type. ISO8601 is an interchange format for dates. "May 20, 1875" has no prominence in that standard or any other date standard.

That's the short version. How did we get here?

Elon Musk mentioned in a press briefing that there were many names in the Social Security database of people who weren't marked as dead but were 150 years old. Clearly, this is bad data. At the time, he provided no other information, so speculation abounded on the internet.

In the middle of all of it, this appeared on Threads:


https://www.threads.net/@ashmore_glenn/post/DGDfmj6TsZS

He starts off with the claim that "in COBOL, if a date is missing the program defaults to 1875. e.g. 2025-1875=150". This claim isn't just false; it's false in multiple ways:

  1. There is no "date" data type in COBOL. Dates are stored however the programmer wants, usually as numeric character strings
  2. There's no "default" date, even if there were such a data type
  3. Even if there were a default, 1875 would be a bizarre choice

His initial post has over 20,000 "likes" and numerous shares as of this writing. The follow-up - included in the picture above - has him backpedaling hard and pointing out that they were "taught to never leave a null date" and "I assume some group at IBM or elsewhere decided that May 20, 1875, the date of the Convention on the Meter, would be the standard filler." That got 10 likes. Welcome to the internet.

Later in the same thread he gets a lot of pushback from other programmers, and then we find this with 1000+ likes:


I'll put the whole thing here:

Correct. The original ISO standard for a default reference date was May 20, 1875 - the date the international standards and metrics treaty was signed. Geeky to be sure but that is coders for you. This standard was changed in 2019.

As you've probably guessed, that's not correct, either.

But this was added to the lore and we had some disinformation with running shoes on.

ISO8601

In the ISO8601:2004 standard, "May 20, 1875" is mentioned as part of a "reference point" in a section discussing the Gregorian calendar.

For those of you unaware, the Gregorian calendar is almost certainly the calendar that you're using. The "Julian calendar" is still used in some contexts in the West, mostly among Orthodox churches. You most likely see this as "Orthodox Christmas" and "Orthodox Easter" on your (Gregorian) calendar.

Without getting into all the specifics, Julius Caesar introduced what's now known as the "Julian calendar" in 46 BC. It assumes a year of 365.25 days, so every fourth year a leap day is added. That's pretty close.

Unfortunately, "pretty close" doesn't work over the course of centuries, and people determined that the real number of days in a year is slightly less than 365.25. So, in 1582 Pope Gregory XIII established a new calendar system that's pretty similar, but to get closer to the true length of the year, leap years are skipped in century years unless the year is a multiple of 400.
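
Since the rule is easy to get wrong, here's the whole thing as a quick Ruby sketch (the helper name is mine; Ruby's built-in Date.leap? performs the same check):

def gregorian_leap?(year)
  # Divisible by 4, except century years, except multiples of 400
  (year % 4 == 0 && year % 100 != 0) || year % 400 == 0
end

gregorian_leap?(1900)  # => false (century year, leap day skipped)
gregorian_leap?(2000)  # => true  (multiple of 400)
gregorian_leap?(2024)  # => true  (ordinary fourth year)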

In order to "reset" the calendar, October 4, 1582 (Julian) was followed directly by October 15, 1582 (Gregorian), the day that would otherwise have been October 5, 1582 (Julian).

The Julian calendar will continue to drift apart from the Gregorian calendar. The starting offset in 1582 was 10 days. The Gregorian calendar skipped leap years in 1700, 1800, and 1900, but the Julian didn't skip those. As of 2025, there are 13 days between the two.

So, Christmas is December 25 in both calendar systems. But those aren't the same day. "December 25, 2024" in the Julian calendar is "January 7, 2025" Gregorian. That's why "Orthodox Christmas" is celebrated in what most of us call "January". Inside the Julian calendar, it's still December.
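
Ruby's standard Date class understands both calendars, so we can check that claim directly:

require 'date'

# Julian December 25, 2024, re-expressed in the Gregorian calendar
julian_christmas = Date.new(2024, 12, 25, Date::JULIAN)
julian_christmas.new_start(Date::GREGORIAN).to_s  # => "2025-01-07"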

Section 3.2.1 of the 2004 edition of ISO8601 discusses the Gregorian calendar, since that's considered the "standard" calendar and the one ISO8601 assumes. In that discussion, the standard mentions:

The Gregorian calendar has a reference point that assigns 20 May 1875 to the calendar day that the "Convention du Mètre" was signed in Paris.

This is a statement about the Gregorian calendar, not ISO8601. It is saying, as an example, that the day the "Meter Convention" was signed in Paris falls on "May 20, 1875" in the Gregorian calendar.

Equally valid statements:

"The Julian calendar has a reference point that assigns 8 May 1875 to the calendar day that the 'Convention du Mètre' was signed in Paris."

Even more:

"The Gregorian calendar has a reference point that assigned 29 Aug 1958 to the calendar day that Michael Joseph Jackson was born."

That's not a statement about ISO8601 - it's a statement about a calendar system. It was removed from the 2019 update to ISO8601.

https://www.loc.gov/standards/datetime/iso-tc154-wg5_n0038_iso_wd_8601-1_2016-02-16.pdf

Due to what was shown on Threads, this morphed into a full-scale disinformation campaign that usually disparaged Elon Musk or his "DOGE" team as hapless idiots who don't know about COBOL and thus don't know that "May 20, 1875 is the epoch in COBOL".

Ah, the epoch. What's an "epoch"?

Let's back up a bit. The ISO8601 standard for exact dates is pretty simple:

yyyy-mm-dd (dashes optional)

That's the entire standard for an exact date. Four-digit year, two-digit month, and two-digit day of month, each with leading zeroes as applicable. So, "May 20, 1875" is "1875-05-20". "May 19, 1875" is "1875-05-19". "October 15, 1582" is "1582-10-15". "October 10, 1582" - ha! fooled you there, that date technically doesn't exist. The Gregorian calendar didn't exist yet, so we use the "proleptic" version, which just numbers the days as if the calendar had existed. So, "October 10, 1582" Gregorian would be "September 30, 1582" Julian, and we would still write "1582-10-10". Anyway....

I use ISO8601 all the time. It's the format used by HTML form fields for dates and times. It's a date/time interchange format, and it's useful because a date/time value includes all the information needed to accurately specify the exact date and time of an event, even if someone in a different time zone consumes the data being produced or if daylight saving time differs. It's also great for dates because values sort chronologically as plain strings.
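
A quick Ruby illustration of both points, the format and the sorting:

require 'date'

Date.new(1875, 5, 20).iso8601    # => "1875-05-20"
Date.iso8601("1582-10-15").to_s  # => "1582-10-15"

# Most-significant field first means a plain string sort is chronological:
%w[1992-11-03 1875-05-20 2025-02-19].sort
# => ["1875-05-20", "1992-11-03", "2025-02-19"]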

Epochs

An "epoch" in computer lore is basically when a system of timekeeping starts. Year "1" is the first year in our Gregorian calendar system. Times before that are just "BC". There is no year "0", which is a bit odd, but exact dates that long ago typically don't matter.

In the Unix operating system, the "epoch" is January 1, 1970 at 12:00:00 AM, later standardized as UTC. A standard Unix timestamp is 32 bits, but it's a signed number, so only 31 bits are available for time past the epoch. That means we can count up to 2,147,483,647 seconds past January 1, 1970 in this system, which puts us at 2038 when the number "overflows" and wraps to a negative value.

Put another way, with this system we can specify a date and time within about 68 years either way from January 1, 1970. Here, I show what this means using the Ruby programming language:

3.3.6 :001 > Time.at(0).utc
 => 1970-01-01 00:00:00 UTC
3.3.6 :002 > Time.at(2**31-1).utc
 => 2038-01-19 03:14:07 UTC
3.3.6 :003 > Time.at(-2**31).utc
 => 1901-12-13 20:45:52 UTC

As I type this, we are at 1,740,000,100 seconds since the epoch.

The system that I just described has a couple of relevant features:

  1. It only works within a fairly narrow range of years
  2. A timestamp of "0" is easy to spot since it always renders as the same date and time

When I'm working with systems and I see the date/time "January 1, 1970 12:00:00am UTC" or "December 31, 1969 6:00:00pm CST", I know right away that I have a timestamp of 0. This is a pretty common form of bug: we forget to initialize a value, it ends up being 0, and the date that shows up is January 1, 1970 or December 31, 1969.
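
If you want to catch that bug in code, a trivial guard is enough. This is a hypothetical helper of my own, not anything standard:

# Flag timestamps sitting exactly at the epoch - almost always an
# uninitialized value rather than a real event from 1970
def suspicious_timestamp?(t)
  t.to_i.zero?
end

suspicious_timestamp?(Time.at(0))  # => true
suspicious_timestamp?(Time.now)    # => false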

There are other epochs. A bunch of others. One example is a timestamp for a file in MS-DOS, which starts in 1980.

Now, in ISO8601 the specification for an exact date is just the four-digit year, two-digit month, and two-digit day of month. There's no "epoch" other than, basically, year "1" or something like that. In this system, May 20, 1875 has no significance. That's why the line was removed from the standard in the 2019 edition.

Disinformation Multiplies

Musk countered the disinformation by showing the counts by age bracket:

https://x.com/elonmusk/status/1891350795452654076


By this point, the disinformation was "if Musk and company were competent they would have noticed the huge spike at 150 years old and known that was important." As the data shows, there is no spike at 150 but rather the downward slope past 70 that one would expect with this data. And it ends not long after 150, not surprising since those are the first Social Security recipients. There are a couple of outliers on there as well.

It was too late to stop the disinformation campaign. It was repeated by Rachel Maddow on MSNBC, and showed up in other places. For instance, it's repeated in this "Politifact" "fact-check":

https://www.politifact.com/article/2025/feb/17/are-150-year-old-americans-receiving-social-securi/

This guy repeated it on X and got quite a few views:

https://x.com/toshiHQ/status/1889928670887739902

It showed up in this Wired article:

https://t.co/79FN6dSvBO

Note that the guy who wrote this - David Gilbert - supposedly covers "disinformation". Yeah, he covered it. By repeating it.

Here it shows up in a Yahoo! News article:

https://www.yahoo.com/news/trump-press-secretary-hit-embarrassing-191941730.html

And this article from someone on Daily Kos:

https://www.dailykos.com/stories/2025/2/14/2303889/-Nope-There-are-no-150-year-olds-on-Social-Security-It-s-COBOL

This disinformation is showing up almost exclusively on left-wing media and "fact checkers", and usually includes digs at Elon Musk and the "young" DOGE employees.

COBOL

So, how does COBOL store dates? That's up to the programmer, and I'm sure there are many, many different ways that dates have been stored.

I pulled out my "Structured COBOL, Pseudocode Edition" by Shelly, Cashman, and Forsythe, 1986 edition. ISBN 0-87835-196-5. This is pre-Y2K, and it's mind-blowing to me that they weren't planning for Y2K yet.

On page 6.32, "Defining the Date Work Area":

On most computers, the current date is stored in computer memory as a 6 character field (Year, Month, Day). In the sample program, the date, which will be printed in the report heading, is obtained from computer memory and is placed in a work area within the Working Storage Section of the Data Division. The work area and the method used in the Procedure Division of the sample program to place the date in the work area are shown in Figure 6-48.

In Figure 6-48, the DATE-WORK field is defined on line 006080. The DATE-WORK field is then subdivided into three 2-digit numeric fields - the YEAR-WORK field, the MONTH-WORK field, and the DAY-WORK field. The date is six characters in length and is stored in YYMMDD format (i.e. - January 25, 1987 would be stored as 870125).

The date is obtained from computer memory using the Accept statement. When the Accept statement is executed, the reserved word DATE identifies that the current date is to be copied from the area in main computer memory where it is stored by the operating system to the field DATE-WORK which has been defined in the program.

The Accept statement can also be used to retrieve the day of the year and the time of day. The day of the year is returned in a Julian date format. The first two numeric characters are the year and the next three numeric characters are the day of the year. Thus, the value returned for January 25, 1987 would be 87025. The time is returned as a two-digit numeric hour, a two-digit numeric minute, and a two-digit numeric second, and a two-digit numeric hundredths of a second. Thus, the time 9:15 a.m. would be returned as 09150000.

Figure 6-48, Data Division:

006070 01  WORK-AREAS.
006080     05  DATE-WORK.
006090         10  YEAR-WORK       PIC 99.
006100         10  MONTH-WORK      PIC 99.
006110         10  DAY-WORK        PIC 99.

Procedure Division:

014070     ACCEPT DATE-WORK FROM DATE
014080     MOVE MONTH-WORK TO MONTH-HEADING
014090     MOVE DAY-WORK TO DAY-HEADING
014100     MOVE YEAR-WORK TO YEAR-HEADING

For those of you wondering how we had the Y2K bug, there it is in all its glory. This was written 14 years before 2000, and in the world of COBOL nobody apparently thought to point out that this was going to cause problems very, very soon.

The part of the book shown above is the entire treatise on handling dates (times aren't even mentioned except in that section) out of hundreds of pages. They show two possible date formats: "YYMMDD" or "YYDDD". COBOL generally stored data like this as characters, so the standard date format would be six separate characters.
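
Both formats are easy to reproduce in Ruby with strftime, which also makes the Y2K problem painfully visible:

require 'date'

d = Date.new(1987, 1, 25)
d.strftime("%y%m%d")  # => "870125" (YYMMDD)
d.strftime("%y%j")    # => "87025"  (YYDDD, the book's "Julian date")

# The Y2K bug in one line: a century later, the same six characters
Date.new(2087, 1, 25).strftime("%y%m%d")  # => "870125"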

Contrast that with a Unix timestamp which is a binary format that uses the space of four characters total and is able to specify more than 130 years of dates and times. The YYMMDD format can specify only 100 years of dates, and uses six character spaces. In modern terms, that's 48 bits to store around 36,525 days, which handily fits into 16 bits.
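
If you want to check that arithmetic:

days_in_century = (100 * 365.25).to_i  # => 36525 distinct YYMMDD day values
Math.log2(days_in_century).ceil        # => 16 bits needed, versus the 48 used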

But the nice thing about YYMMDD is that there is no endianness to worry about, no epoch, no real calculation to determine the date. Just add a couple more Ys and you have ISO8601. That's why it's a great interchange format.
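
Here's a sketch of that widening. The 00-49/50-99 century pivot is the common convention mentioned in the comments below, not part of any standard; every shop picked its own:

def yymmdd_to_iso8601(s)
  yy = s[0, 2].to_i
  yyyy = yy < 50 ? 2000 + yy : 1900 + yy  # century "window" - a convention, not a standard
  format("%04d-%s-%s", yyyy, s[2, 2], s[4, 2])
end

yymmdd_to_iso8601("870125")  # => "1987-01-25"
yymmdd_to_iso8601("250520")  # => "2025-05-20"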

Summary

But, "May 20, 1875" has nothing to do with ISO8601, COBOL, or basically anything else other than the day that the Meter Convention was signed in Paris.

It was interesting to see a disinformation campaign take off like this, and I think at this point those involved have moved on to something else. But this will live on because of the internet and all the newly-minted COBOL date experts expounding on how Musk and company don't know something that's just so obvious. Sigh.

Never change, internet. Never change.

Top comments (66)

Alan Crosswell

The cool cobol kids store their dates in packed decimal.

...
   07 LMOD-DATE                          PIC 9(08) COMP-3.

That's 8 digits stored in 5 bytes (IBM mainframe architecture).
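
For anyone who hasn't met COMP-3: each decimal digit takes one nibble, and the final nibble holds the sign (0xC positive, 0xD negative, 0xF unsigned). A rough Ruby sketch of the packing, with a helper name of my own:

def pack_comp3(digits)
  nibbles = digits.chars.map(&:to_i) << 0xC  # trailing sign nibble (positive)
  nibbles.unshift(0) if nibbles.length.odd?  # left-pad to a whole number of bytes
  nibbles.each_slice(2).map { |hi, lo| (hi << 4) | lo }.pack("C*")
end

pack_comp3("20250101").unpack1("H*")  # => "020250101c" - 8 digits in 5 bytes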

Michael Chaney

"cool cobol kids"

First time in history those three words appeared in that sequence.

Frederick Morrison

Hey! I was "cool" with my plastic pocket protector, my HP-16C programmer's RPN calculator, and hands that could practically fly over a 3278 terminal keyboard back in the mid-to-late 1970s.

OK, so maybe that wasn't a good "chick magnet" line to use at the local bar, but still, we were the closest thing that a "nerd" could get to being "cool".

Andy Dent

I was a "cool cobol kid" when I had to wear two layers of sweaters to survive coding in the mainframe computer room!

Frederick Morrison

Back in 1975, when I started my first full-time COBOL programming job, we weren't even that "cool" because all dates on the student master file (SMF) located on 4 reels of 1600 BPI NRZI 2400 foot round reel tapes were stored in S9(3) COMP-3 because we were not smart enough to include the 2-digit century in each student's date of birth (DOB) field.

Francis McCarthy

Nowadays, when I see a date in packed decimal, it is often in PIC 9(7) COMP-3 format (because it represents a YYYYDDD format).
I also had a customer who felt that the extra nibble used in a COMP-3 variable that records the sign was a waste of memory, so they created their own packed format in order to store a YYYYMMDD format in 4 bytes rather than 5 (so 01/01/2025 would be stored as X'20250101' rather than X'020250101C'). It meant incessant conversions, but given the amount of dates they had to deal with, it saved megabytes of disk space data every day, which, back then, was apparently more costly than the extra CPU cycles.

Olivier JM Maniraho

Very well explained. How I wish we could see such articles more often here on dev.to.

Shawn Anderson

Even if it were all true, shouldn't it be investigated and/or corrected?
What is wrong with these people?

Michael Chaney

Yeah, that's the amazing aspect of all of this. Musk uncovers something and the loudest people are yelling at him instead of asking why something isn't being fixed.

Andy Dent

instead of asking why something isn't being fixed

  1. One database with missing death dates doesn't mean people get checks
  2. Inquiries decided the $millions required to chase up records across America would be better spent elsewhere (see 1).
Iden Busia

The official SS fraud rate, according to the Office of the Inspector General -- the agency tasked with finding government waste, fraud, and abuse -- is less than 1%.

source: oig.ssa.gov/assets/uploads/072401.pdf, apnews.com/article/social-security...

relevant section: "A July 2024 report from Social Security’s inspector general states that from fiscal years 2015 through 2022, the agency paid out almost $8.6 trillion in benefits, including $71.8 billion — or less than 1% — in improper payments. Most of the erroneous payments were overpayments to living people."

Crispin Fresch

Yes, it has been very frustrating to see how this took off like a brush fire in California. I have done my part to try to correct people when I have seen references to this, but people have been very stubborn about it. I usually mention the fact that I started COBOL programming in 1990, and I have never in my life heard any reference to 1875 being special in any way - until last week! It is the latest example of how people love to repeat things on the interwebs just to jump on what they think is a popular bandwagon, even though they have no clue what they are talking about. Then as more people repeat the same BS, you have people referring to the number of times it has been repeated as some kind of proof that it must be correct. Very lazy thinking!

Michael Chaney

My favorite part of it is watching people who know nothing about computers beyond how to send an email talk condescendingly about Musk and the DOGE team being so stupid that they don't know something this trivial that everybody knows.

Eddie Storey

Great article. Wish there were a way for non-devs to understand. Seems like people just pick up what they want to hear and don't let the facts interfere.

I began developing in COBOL/CICS/IDMS and DB2 in the 80s and moved to client server in the 90s and finally web dev in early 2k after all the Y2K work dried up.

As tech support and local expert for friends and family, I've tried to explain this to them. It takes about 8 seconds for their eyes to glaze over. My standard response is "It's all fake news".

Daniel J. Summers

two words - THANK YOU!

I saw the original claim, and thought "Uh, COBOL doesn't have a date type, and there's certainly no default." I came into a COBOL project ramping up their Y2K conversion, and between that and three different references (there was one between GMT and local), the procedure division copybook was over 4,000 lines.

An interesting aside - everyone hears COBOL and thinks IBM, but I know the US Internal Revenue Service (IRS) uses Unisys mainframes. Still COBOL, but a different flavor. The Social Security Administration could very well be using IBM.

It's not an area where I have first-hand knowledge, so I'm avoiding making a definitive statement about it. (See how that works, Internet folks? It ain't that hard... heh - of course, this comment won't go viral either.)

Iden Busia

Regardless, this article is heavily slanted towards Musk and Doge. The numbers are VERY bad for their case. According to the Office of the Inspector General -- the government agency tasked with finding waste, fraud, and abuse in the government -- Social Security has a fraud rate of less than 1%. Musk and his moronic children couldn't possibly hope to find fraud by mere database query. That's not how complex government systems work.

source: oig.ssa.gov/assets/uploads/072401.pdf, apnews.com/article/social-security...

relevant section: "A July 2024 report from Social Security’s inspector general states that from fiscal years 2015 through 2022, the agency paid out almost $8.6 trillion in benefits, including $71.8 billion — or less than 1% — in improper payments. Most of the erroneous payments were overpayments to living people."

That's about $10.25 billion per year, and we don't know how much of that was recovered or had adjustments made to future payments.

Michael Chaney

I'm slanted only toward the truth, and this article very narrowly talks about one piece of obvious disinformation that was shared widely and even showed up in fake "fact checks". I don't know if there's fraud in social security - well, obviously there is but I don't know the extent. I cannot judge that without getting into the data personally.

Iden Busia

Actually, you can know without getting into it personally! As I've said numerous times, it's been evaluated by experts whose job it is to do this evaluation, and they have concluded that the fraud rate is less than 1%!

If you think you need to personally evaluate something to understand it rather than relying on experts, I have infinite bad news for you regarding food, medicine, transportation, etc.

Michael Chaney

I am an expert in this domain. I don't trust people who tell me to just trust them. That's why in my article above I "showed my work".

Iden Busia

I'm sorry, you're an expert in auditing government programs including social security?

Do you believe Elon and DOGE's claims?

Ben Sinclair

I think we need to be careful and do our due diligence.

A person or organisation can have a lot of things wrong with them, but every time someone mocks them for being wrong and they're not, it damages the situation for everyone. It lends credibility to the bad actor, because people can then dismiss the critics as lazy, misinformed, or malicious.

Sticking to stats and guesswork, let's say that 1/3 of the improper payments were caused by this "problem" which may or may not exist. That's ~ $23B over 7 years or about $3B/year. From a financial point of view it's probably worth investing a few million in tracking that down and seeing how much you can recoup. That's the "efficiency" part.

The real trouble of it being chump change to the government is that Musk is using it as a demonstration of something, which means... he doesn't have anything better to show. It's literally the worst thing he could uncover, if indeed he's made any legitimate efforts to uncover anything.

Iden Busia

The Department of the Inspector General has already tracked it down. Hence, why we know about it at all. We don't need anyone to spend any more money; it's already done.

I agree with the rest of your statement.

Ron Bird

Having been a Cobol programmer since the 1970s, this has little to do with Cobol, but rather expensive disk space of that era. I implemented the Information Associates student records system at UCLA way back when. Their software stored dates in two bytes. The earliest date they could store was January 1, 1900 and that date meant blank date. They could not store a date past 2027, if I remember correctly. They had a subroutine written in Cobol that would convert two byte dates to six or eight byte dates and vice versa using division. They also had an assembler routine called BITBYTE that compressed eight yes/no bytes down to one byte for storage. Storage was expensive and they went to extreme measures of compression which drove us programmers crazy.

The social security date starting with a date in 1875 sounds plausible to me, but being stored that way because of Cobol is bunk.

joevansteen

Mainly right, but the space problem started with 80 column Hollerith cards for input data. The six digit dates then moved from the cards to tape master files. Disc space was actually so expensive that almost nobody could afford to maintain master files on disk. Definitely not the IRS or Social Security.

Redie Dotie

So, is COBOL going to be discontinued anytime soon?
I know that a lot of critical systems run on COBOL, but isn't it time for something more up-to-date? I'm not saying that new is better, I just want to know if COBOL will be replaced in the near future.

Michael Chaney

COBOL is being replaced daily. I was working on a project 28 years ago (circa Y2K) to replace everything on an old mainframe running COBOL code with something more modern.

I don't think there is tons of new COBOL being written, but there's definitely megabytes of it still running out there on IBM hardware in particular.

There are plenty of drawbacks to COBOL, and you can read about them everywhere. It's one of the first high-level languages and computer language design has improved dramatically in the last 65 years. To be fair, IBM has kept up and made many changes to their COBOL implementation as well, but the language structure puts some pretty tight limits on what can be accomplished.

There are some areas where COBOL shines:

  1. The code tends to be very readable if written properly
  2. It can be very fast since it's easy to optimize - no pointers and such to worry about
  3. The data formats are character-based and pretty much automatically interchangeable

But each of those bullet points comes with a list of downsides as well. The reality is that a lot of code was written in COBOL that still works and there's little incentive to change it. People also tend to discount just how fast IBM's big iron systems are. Coming from the PC world, there's really not much commonality to even base a comparison on.

Redie Dotie

Thanks a lot for this answer. It means a lot to me. I was thinking about going into mainframe development. But everybody keeps telling me that it's a waste of time.

Thread Thread
 
mdchaney profile image
Michael Chaney

It's not a waste of time and IBM's mainframes aren't going anywhere. Many entire industries run on them and will for the foreseeable future. Not my cup of tea, personally, but it'll always be a decent way to make a decent living.

Crispin Fresch

The main issue from a business, dollars-and-cents, point of view with rewriting this mainframe COBOL stuff in another language is: what is the benefit of doing that? It will cost a lot of money to pay contractors to rewrite all that code in some other language, and I don't really see that investment being recouped in any way after it is done, just because it has all been implemented in a more modern language. The main thing that would probably drive it, I think, is if companies feel they have no choice BUT to rewrite it because there just aren't enough people skilled in COBOL anymore, and the mainframe environment in general, to get the work done.

Frederick Morrison

Back in my first paid programming position in July 1975, I worked on several COBOL 68 programs that were eventually converted to COBOL 74 that stored the date in an S9(6) COMP-3 field that was REDEFINED to break out the Month, Day, Year (in that order because ISO 8601 was not even a gleam in the ANSI Standards Committee's eye). It was a storage saving device since mainframe disc (DASD as we called it back then) and 1600 BPI NRZI round-reel 2400 tapes needed all the help we could give them to store as much information as possible. This bit us in the you-know-what around 1999 when two-digit years suddenly had to be converted to 4-digit years using either 00-49 => add 2000 and 50-99 => add 1900 or 00-29 => add 2000 and 30-99 => add 1900. I spent most of 1999 furiously converting programs to be Y2K compliant (a loosely defined term, I might add).

Bottom line: Musk is right and the guy who brought up the 1875 nonsense didn't spend as much time punching (on 80-column punch cards, from COBOL coding forms) like I did; otherwise, he would have mentioned the S9(6) COMP-3 "trick" to save storage for a mmddyy date.

Nathan Hannon

Loved the explanation and how you got into the nitty gritty of it. So glad Community Notes exists to counter the disinformation.