
Let me start at the end for those who don't want to read the whole thing. COBOL does not have a "default date" or any kind of default value for a ...
The cool cobol kids store their dates in packed decimal.
That's 8 digits stored in 5 bytes (IBM mainframe architecture).
"cool cobol kids"
First time in history those three words appeared in that sequence.
Hey! I was "cool" with my plastic pocket protector, my HP 16C Programmers RPN calculator and hands that could practically fly over a 3278 terminal keyboard back in the mid to late 1970's.
OK, so maybe that wasn't a good "chick magnet" line to use at the local bar, but still, we were the closest thing that a "nerd" could get to being "cool".
I was a "cool cobol kid" when I had to wear two layers of sweaters to survive coding in the mainframe computer room!
Back in 1975, when I started my first full-time COBOL programming job, we weren't even that "cool" because all dates on the student master file (SMF) located on 4 reels of 1600 BPI NRZI 2400 foot round reel tapes were stored in S9(3) COMP-3 because we were not smart enough to include the 2-digit century in each student's date of birth (DOB) field.
Nowadays, when I see a date in packed decimal, it is often in PIC 9(7) COMP-3 format (because it represents a YYYYDDD format).
I also had a customer who felt that the extra nibble used in a COMP-3 variable that records the sign was a waste of memory, so they created their own packed format in order to store a YYYYMMDD format in 4 bytes rather than 5 (so 01/01/2025 would be stored as X'20250101' rather than X'020250101C'). It meant incessant conversions, but given the number of dates they had to deal with, it saved megabytes of disk space every day, which, back then, was apparently more costly than the extra CPU cycles.
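For anyone who has never had the pleasure, here's roughly what those layouts look like in a data division. This is just an illustrative sketch with made-up names, not anyone's production copybook:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. PACKED-DATES.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * Ordinal date YYYYDDD: 7 digits packed into 4 bytes.
       01  WS-DATE-YYYYDDD    PIC 9(7)  COMP-3.
      * YYYYMMDD in COMP-3: 8 digits plus the sign nibble = 5 bytes.
       01  WS-DATE-YYYYMMDD   PIC S9(8) COMP-3.
      * The home-grown 4-byte format: the same 8 digits with no
      * sign nibble, i.e. raw BCD in a 4-byte alphanumeric field.
       01  WS-DATE-RAW-BCD    PIC X(4).
       PROCEDURE DIVISION.
           MOVE 2025001      TO WS-DATE-YYYYDDD
           MOVE 20250101     TO WS-DATE-YYYYMMDD
           MOVE X'20250101'  TO WS-DATE-RAW-BCD
           DISPLAY 'Stored all three flavors'
           GOBACK.
```

The home-grown format in the last field is why that shop needed conversion routines everywhere: the language can't do arithmetic on raw BCD sitting in a PIC X field, so every use meant unpacking it first.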
Very well explained. How I wish we could see such articles more often here on dev.to.
Even if it were all true, shouldn't it be investigated and/or corrected?
What is wrong with these people?
Yeah, that's the amazing aspect of all of this. Musk uncovers something and the loudest people are yelling at him instead of asking why something isn't being fixed.
As someone who has worked with government databases - it's really ignorant to think that all this data is coming from one table or even one database. Even if these DOGE "whiz kids" did look at some API where there was a birthdate column and an "if dead" column, that would be more reason to think they're looking at an aggregate of several databases (it's not one SQL table). This whole thing is blown out of proportion. Musk engages in whataboutism, as shown with his tweet with a different "if dead" table. None of his kids are familiar with government systems and how they're set up. Obviously the determination of who gets a check is based on a lot of factors (39 million 0-9 year olds are not getting checks). Checks also stop when the age reaches 115 (among other factors). Just seems the author has a beef about a possible placeholder being 18750520 for large organizations.
Edit, since I've been blocked from replying... the author's LinkedIn doesn't make him seem like an authority on what a date placeholder might have been in government systems (his job listings don't include major corporations or government administrations). He says his article is just on one topic, while every single section is a different topic that's not a fact check of what the SSA might have set for a default date placeholder (it came up after DOGE's first claim that there were thousands of people with birthdates in 1875).
This article has to do with one single topic - "18750520" is not a "placeholder", "default value", "epoch", or anything like that.
The official SS fraud rate, according to the Office of the Inspector General -- the agency tasked with finding government waste, fraud, and abuse -- is less than 1%.
source: oig.ssa.gov/assets/uploads/072401.pdf, apnews.com/article/social-security...
relevant section: "A July 2024 report from Social Security’s inspector general states that from fiscal years 2015 through 2022, the agency paid out almost $8.6 trillion in benefits, including $71.8 billion — or less than 1% — in improper payments. Most of the erroneous payments were overpayments to living people."
Yes, it has been very frustrating to see how this took off like a brush fire in California. I have done my part to try to correct people when I have seen references to this, but people have been very stubborn about it. I usually mention the fact that I started COBOL programming in 1990, and I have never in my life heard any reference to 1875 being special in any way - until last week! It is the latest example of how people love to repeat things on the interwebs just to jump on what they think is a popular bandwagon, even though they have no clue what they are talking about. Then as more people repeat the same BS, you have people referring to the number of times it has been repeated as some kind of proof that it must be correct. Very lazy thinking!
My favorite part of it is watching people who know nothing about computers beyond how to send an email talk condescendingly about Musk and the DOGE team being so stupid that they don't know something this trivial that everybody knows.
You've been duped by conflating a completely different column - the "if dead" boolean - with whatever column was used when DOGE claimed there were thousands of people born in 1875. It doesn't matter what date type was stored, whether it's a newer COBOL version that has a date type, or an older one where it would be a string or integer. 18750520 could have been a standardization within government systems. If you've worked with large database systems in government, you'd know these two columns are not just in one table. From what I can tell, DOGE was looking at data from the last internal audit a few years ago. You've highlighted how a 19-year-old would not know the significance of a discrepancy of thousands of fields being 1875 in one column (because I can assure you that all the data in their system are aggregates of several databases).
Regardless, this article is heavily slanted towards Musk and Doge. The numbers are VERY bad for their case. According to the Office of the Inspector General -- the government agency tasked with finding waste, fraud, and abuse in the government -- Social Security has a fraud rate of less than 1%. Musk and his moronic children couldn't possibly hope to find fraud by mere database query. That's not how complex government systems work.
source: oig.ssa.gov/assets/uploads/072401.pdf, apnews.com/article/social-security...
relevant section: "A July 2024 report from Social Security’s inspector general states that from fiscal years 2015 through 2022, the agency paid out almost $8.6 trillion in benefits, including $71.8 billion — or less than 1% — in improper payments. Most of the erroneous payments were overpayments to living people."
That's about $10.25 billion per year, and we don't know how much of that was recovered or had adjustments made to future payments.
I'm slanted only toward the truth, and this article very narrowly talks about one piece of obvious disinformation that was shared widely and even showed up in fake "fact checks". I don't know if there's fraud in social security - well, obviously there is but I don't know the extent. I cannot judge that without getting into the data personally.
Actually, you can know without getting into it personally! As I've said numerous times, it's been evaluated by experts whose job it is to do this evaluation, and they have concluded that the fraud rate is less than 1%!
If you think you need to personally evaluate something to understand it rather than relying on experts, I have infinite bad news for you regarding food, medicine, transportation, etc.
I am an expert in this domain. I don't trust people who tell me to just trust them. That's why in my article above I "showed my work".
I'm sorry, you're an expert in auditing government programs including social security?
Do you believe Elon and DOGE's claims?
I'm an expert in auditing data sets, yes. I don't have access to relevant data to evaluate Musk's claims. Again, I'm discussing a very specific piece of disinformation.
Yet you correlated a boolean column for if dead, with a column of date of birth (which clearly isn't a boolean type). As someone who has worked with government databases that aren't close to being as big as SSA, it seems you're only adding disinformation.
I think we need to be careful and do our due diligence.
A person or organisation can have a lot of things wrong with them, but every time someone mocks them for being wrong and they're not, it damages the situation for everyone. It lends credibility to the bad actor because people can then dismiss the critics as lazy, misinformed, or malicious.
Sticking to stats and guesswork, let's say that 1/3 of the improper payments were caused by this "problem" which may or may not exist. That's ~ $23B over 7 years or about $3B/year. From a financial point of view it's probably worth investing a few million in tracking that down and seeing how much you can recoup. That's the "efficiency" part.
The real trouble of it being chump change to the government is that Musk is using it as a demonstration of something, which means... he doesn't have anything better to show. It's literally the worst thing he could uncover, if indeed he's made any legitimate efforts to uncover anything.
The Department of the Inspector General has already tracked it down. Hence, why we know about it at all. We don't need anyone to spend any more money; it's already done.
I agree with the rest of your statement.
Having been a Cobol programmer since the 1970s, I can say this has little to do with Cobol, but rather with the expensive disk space of that era. I implemented the Information Associates student records system at UCLA way back when. Their software stored dates in two bytes. The earliest date they could store was January 1, 1900, and that date meant a blank date. They could not store a date past 2027, if I remember correctly. They had a subroutine written in Cobol that would convert two-byte dates to six- or eight-byte dates and vice versa using division. They also had an assembler routine called BITBYTE that compressed eight yes/no bytes down to one byte for storage. Storage was expensive and they went to extreme measures of compression which drove us programmers crazy.
The social security date starting with a date in 1875 sounds plausible to me, but being stored that way because of Cobol is bunk.
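I don't remember the exact bit layout, but a 7-bit year offset from 1900, a 4-bit month, and a 5-bit day fits two bytes exactly and would explain the 2027 ceiling (1900 + 127 = 2027), with packing done by multiplication and unpacking by division. Treat this as a guess at the scheme rather than the actual Information Associates code:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. TWO-BYTE-DATE.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-YEAR     PIC 9(4) VALUE 1985.
       01  WS-MONTH    PIC 9(2) VALUE 07.
       01  WS-DAY      PIC 9(2) VALUE 04.
      * (year - 1900) * 512 + month * 32 + day never exceeds 65439,
      * so it fits a 16-bit binary halfword (shown as display here).
       01  WS-PACKED   PIC 9(5).
       01  WS-REM      PIC 9(5).
       PROCEDURE DIVISION.
           COMPUTE WS-PACKED = (WS-YEAR - 1900) * 512
                             + WS-MONTH * 32 + WS-DAY
      *    Unpack with division, the way the old subroutine would:
           DIVIDE WS-PACKED BY 512 GIVING WS-YEAR REMAINDER WS-REM
           ADD 1900 TO WS-YEAR
           DIVIDE WS-REM BY 32 GIVING WS-MONTH REMAINDER WS-DAY
           DISPLAY WS-YEAR '-' WS-MONTH '-' WS-DAY
           GOBACK.
```

Whatever the real layout was, the point stands: this kind of trick was about squeezing bytes, not about anything built into Cobol.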
Mainly right, but the space problem started with 80 column Hollerith cards for input data. The six digit dates then moved from the cards to tape master files. Disc space was actually so expensive that almost nobody could afford to maintain master files on disk. Definitely not the IRS or Social Security.
So, is COBOL going to be discontinued anytime soon?
I know that a lot of critical systems run on COBOL, but isn't it time for something more up-to-date? I'm not saying that new is better, I just want to know if COBOL will be replaced in the near future.
COBOL is being replaced daily. I was working on a project 28 years ago (circa Y2K) to replace everything on an old mainframe running COBOL code with something more modern.
I don't think there is tons of new COBOL being written, but there's definitely megabytes of it still running out there on IBM hardware in particular.
There are plenty of drawbacks to COBOL, and you can read about them everywhere. It's one of the first high-level languages and computer language design has improved dramatically in the last 65 years. To be fair, IBM has kept up and made many changes to their COBOL implementation as well, but the language structure puts some pretty tight limits on what can be accomplished.
There are some areas where COBOL shines:
But each of those bullet points comes with a list of downsides as well. The reality is that a lot of code was written in COBOL that still works and there's little incentive to change it. People also tend to discount just how fast IBM's big iron systems are. Coming from the PC world, there's really not much commonality to even base a comparison on.
Thanks a lot for this answer. It means a lot to me. I was thinking about going into mainframe development. But everybody keeps telling me that it's a waste of time.
It's not a waste of time and IBM's mainframes aren't going anywhere. Many entire industries run on them and will for the foreseeable future. Not my cup of tea, personally, but it'll always be a decent way to make a decent living.
The main issue from a business, dollars-and-cents, point of view with rewriting this mainframe COBOL stuff in another language is: what is the benefit of doing that? It will cost a lot of money to pay contractors to rewrite all that code in some other language, and I don't really see that investment being recouped in any way after it is done, just because it has all been implemented in a more modern language. The main thing that would probably drive it, I think, is if companies feel they have no choice BUT to rewrite it because there just aren't enough people skilled in COBOL anymore, and the mainframe environment in general, to get the work done.
Back in my first paid programming position in July 1975, I worked on several COBOL 68 programs that were eventually converted to COBOL 74 that stored the date in an S9(6) COMP-3 field that was REDEFINED to break out the Month, Day, Year (in that order because ISO 8601 was not even a gleam in the ANSI Standards Committee's eye). It was a storage saving device since mainframe disc (DASD as we called it back then) and 1600 BPI NRZI round-reel 2400 tapes needed all the help we could give them to store as much information as possible. This bit us in the you-know-what around 1999 when two-digit years suddenly had to be converted to 4-digit years using either 00-49 => add 2000 and 50-99 => add 1900 or 00-29 => add 2000 and 30-99 => add 1900. I spent most of 1999 furiously converting programs to be Y2K compliant (a loosely defined term, I might add).
Bottom line: Musk is right and the guy who brought up the 1875 nonsense didn't spend as much time punching (on 80-column punch cards, from COBOL coding forms) like I did; otherwise, he would have mentioned the S9(6) COMP-3 "trick" to save storage for a mmddyy date.
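For readers who never had the pleasure, the general pattern looked roughly like this. It's typed from memory as an illustration of the technique, not the actual code, and shops differed on the details, such as the pivot year:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. MMDDYY-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * The packed field as it sat on tape or DASD: 6 digits, 4 bytes.
       01  WS-DOB-PACKED      PIC S9(6) COMP-3 VALUE 121455.
      * Unpacked to display digits so REDEFINES can split the pieces.
       01  WS-DOB-DISPLAY     PIC 9(6).
       01  WS-DOB-SPLIT REDEFINES WS-DOB-DISPLAY.
           05  WS-DOB-MM      PIC 99.
           05  WS-DOB-DD      PIC 99.
           05  WS-DOB-YY      PIC 99.
       01  WS-DOB-CCYYMMDD.
           05  WS-DOB-CCYY    PIC 9(4).
           05  WS-DOB-MM-OUT  PIC 99.
           05  WS-DOB-DD-OUT  PIC 99.
       PROCEDURE DIVISION.
           MOVE WS-DOB-PACKED TO WS-DOB-DISPLAY
      *    Y2K windowing: pick a pivot year and infer the century.
           IF WS-DOB-YY < 50
               COMPUTE WS-DOB-CCYY = 2000 + WS-DOB-YY
           ELSE
               COMPUTE WS-DOB-CCYY = 1900 + WS-DOB-YY
           END-IF
           MOVE WS-DOB-MM TO WS-DOB-MM-OUT
           MOVE WS-DOB-DD TO WS-DOB-DD-OUT
           DISPLAY 'DOB = ' WS-DOB-CCYYMMDD
           GOBACK.
```

The pivot (50 here) is the "windowing" mentioned elsewhere in this thread; every shop picked its own.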
Loved the explanation and how you got into the nitty gritty of it. So glad Community Notes exists to counter the disinformation.
Great article. Wish there were a way for non-devs to understand. Seems like people just pick up what they want to hear and don't let the facts interfere.
I began developing in COBOL/CICS/IDMS and DB2 in the 80's and moved to client server in the 90s and finally web dev in the early 2000s after all the Y2K work dried up.
As tech support and local expert for friends and family, I've tried to explain this to them. It takes about 8 seconds for their eyes to glaze over. My standard response is "It's all fake news".
two words - THANK YOU!
I saw the original claim, and thought "Uh, COBOL doesn't have a date type, and there's certainly no default." I came into a COBOL project ramping up their Y2K conversion, and between that and three different references (there was one between GMT and local), the procedure division copybook was over 4,000 lines.
An interesting aside - everyone hears COBOL and thinks IBM, but I know the US Internal Revenue Service (IRS) uses Unisys mainframes. Still COBOL, but a different flavor. The Social Security Administration could very well be using IBM.
It's not an area where I have first-hand knowledge, so I'm avoiding making a definitive statement about it. (See how that works, Internet folks? It ain't that hard... heh - of course, this comment won't go viral either.)
★★★★★★★★★★ out of ☆☆☆☆☆
Well done!!! Glad to see others pull out their hair!! (Not really but ya know!)
Signed,
90s COBOL student and programmer
I'm an old COBOL programmer who used to work for a bank. We typically stored dates as mmddccyyjjj (yes, we had to add cc for Y2K). JJJ was the day of the year (1-365/366). This allowed us to use either the calendar date, mmddccyy, or what we called the Julian date, ccyyjjj. If you used the Julian date (don't know why it was called that), you could calculate the number of days between two dates by subtracting one from the other.
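Roughly, that layout gives you two views of the same eleven bytes via REDEFINES, the calendar date in the first eight digits and the ccyyjjj date in the last seven (straight subtraction of two ccyyjjj values only gives a clean day count when both fall in the same year, so longer spans needed a conversion step). A sketch with invented names:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. BANK-DATE.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-DATE-MMDDCCYYJJJ.
           05  WS-CALENDAR-DATE.
               10  WS-MM          PIC 99.
               10  WS-DD          PIC 99.
               10  WS-CCYY        PIC 9(4).
           05  WS-JJJ             PIC 9(3).
      * Same 11 bytes, second view: skip mmdd, keep ccyyjjj.
       01  WS-DATE-ALT-VIEW REDEFINES WS-DATE-MMDDCCYYJJJ.
           05  FILLER             PIC X(4).
           05  WS-JULIAN-DATE     PIC 9(7).
       PROCEDURE DIVISION.
           MOVE '02172025048' TO WS-DATE-MMDDCCYYJJJ
           DISPLAY 'Calendar: ' WS-CALENDAR-DATE
           DISPLAY 'Julian:   ' WS-JULIAN-DATE
           GOBACK.
```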
Interestingly, the main pushback on this article is from two people who have never commented on any other article on the site. There are around 15 comments between the two of them. In an attempt to "debunk" what I wrote, one of them links to an article, which links to an article, which is written based off the comment that I started with at the top of this. It would be interesting to see what's behind this.
That's all very well, but it does not explain the table, especially the small spike around apparent birth dates in 1800.
Could it be possible that (horrors) setting the DEATH field to FALSE is not the only variable involved? Why would there be apparent birth dates as far back as the founding of the US? Even considering the possibility of fraud (unlikely, as the SocSec error rate is pretty low), why would that show up? There is a serious problem with data interpretation here.
Sentinels not Epochs
Old-timer here (40+ years dev). I don't think this is a COBOL issue and nor is it about epochs! When I first heard the tale, it was that the value was a sentinel value, which is an arbitrary, obviously wrong value used to indicate a special status. This is particularly common in non-SQL databases that lack NULLs, or where people have multiple sentinels for different meanings.
It only took me a couple of minutes to find a comment saying similar:
I've spent a lot of years working with grungy data (you think this is bad, try working for mining engineers and geologists). Sentinel values are a common gotcha.
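In COBOL specifically, the idiomatic way to surface a sentinel is an 88-level condition name on the field, so every test site says what the magic number means. A tiny sketch with invented names and values, nothing to do with the SSA's actual code:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. SENTINEL-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-DATE-OF-BIRTH       PIC 9(8) VALUE 99999999.
      * Condition names document the sentinel meaning in one place.
           88  DOB-NOT-ON-RECORD  VALUE 00000000 99999999.
       PROCEDURE DIVISION.
           IF DOB-NOT-ON-RECORD
               DISPLAY 'DOB is a sentinel, not a real date'
           ELSE
               DISPLAY 'DOB looks like a real date'
           END-IF
           GOBACK.
```

The gotcha comes when the condition name (or the documentation) doesn't travel with the data, and a downstream consumer takes the magic value at face value.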
Aging data
I do appreciate the work you did and thanks particularly for the Elon tweet with the bracketed breakdown.
That long tail looks awfully like data entered from historical records lacking death dates - there have been a few discussions of the cost of finding death dates and the decision to avoid spending $millions on it, as this is not data used to make payments.
You would expect, in a system that's pulling data from many sources, to see historical jumps in data cleanup like this. Imagine a few large states finally get around to digital records of deaths, so their data is easily aggregated - you get a sudden flushing of people who would previously have been left on the list. However, this will only apply from a certain age onwards as those sources in turn don't have the time/budget/interest to digitise really old records.
Well, don't you see, though, that pointing to the mere fact that someone said something as proof that it must be true is how this whole 1875 thing got out of control in the first place? I have mainly worked on financial systems, primarily mutual fund transfer agent systems in Boston, and a large payment card processing company, and I have never in my life seen any reference to the year 1875, and certainly not a date as specific as 1875-05-20.
Part of disinformation campaigns like this involves seeding random comments like the one quoted there. You won't find them before February of 2025 anywhere, then suddenly here they are.
So you haven't been a federal COBOL dev. Other systems using COBOL might have had other date placeholders, but 18750520 could have been set by developers in government. news.stthomas.edu/in-the-news-manj...
Edit, since I've been blocked from replying... I see the author graduated college in the 90s: he doesn't seem like an authority on what a date placeholder might have been in government systems. When it comes to the "little game": go look at his LinkedIn. I don't see any job that involves a large government organization with aggregate databases.
That came from the Washington Post, which got it from the guy that I mentioned in the article. All part of the disinformation campaign.
I studied folklore extensively in university - I know this little game really well.
Actually, I hate to point out a problem in such a well researched and written article, but the Unix timestamp, the number of seconds since Jan 1, 1970, only gives you 68 years until 2038, when the time wraps. I'm wondering how much chaos is going to be caused when that date arrives…
That's what I said. Did I mess something up?
I believe you referred to 130 years
Yes. If you can go 68 years in either direction, you can handle about 136 years of dates/times. I show you in the article the very first and very last times.
1901-12-13 20:45:53 UTC through 2038-01-19 03:14:07 UTC
Do the math on those and you can see that it's 68 years, 18 days, 3 hours, 14 minutes, and 7 seconds in either direction.
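For anyone who wants to double-check that, the arithmetic is just the signed 32-bit second count spread over average-length (365.2425-day) years:

```
2^31 - 1               = 2,147,483,647 seconds
1 year (365.2425 days) =    31,556,952 seconds
2,147,483,647 / 31,556,952 ≈ 68.05 years on each side of 1970-01-01
```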
Haha, that’s great! I never have tried negative numbers with the date function 😆
Agreed that non-COBOL programmers have been stubbing their toes on this. Yet we're not party to the code used to develop the data, nor do we know how it may have changed over the years. Is this a monolithic data set, or actually a compendium? Not knowing that, we should be reluctant to make any categorical statements about interpreting the data. A longitudinal data dictionary might resolve the ambiguity. I've ported a COBOL application, and understanding what the data meant involved more than just knowing COBOL's data rules - it demanded an understanding of the programming that yielded the data.
As someone who actually spent years writing and updating COBOL 68 and COBOL 74 programs in the mid to late 1970's, knowing the S9(6) COMP-3 with REDEFINES "trick" is crucial to understanding how MM/DD/YY dates (e.g., date of birth) were stored (we didn't use "persisted" back then - sorry), especially on DASD (we called it that, or maybe "disc", but certainly not "disk") or magnetic tape (1600 BPI NRZI 2400-foot round reel tapes, subsequently converted to 6250 BPI GCR 2400-foot tapes).
Y2K bit us in the proverbial backsides as we figured out how to write a future-proof (is there any such thing?) algorithm for converting mmddyy into yyyymmdd depending on the range of the old yy number (00-49 => add 2000 and 50-99 => add 1900, OR 00-29 => add 2000 and 30-99 => add 1900).
It was a messy conversion, given we used 24-line x 80-column 3278 green-on-black monochrome block-mode terminals (or 3279 four-color ones if your organization had the money for color) to perform the task. At least using a "terminal" beat punching 80-column cards, so I guess that's a win.
Correct. I'm not giving an opinion about the data, per se. I'm just pointing out that "May 20, 1875 is the epoch/default date in COBOL" is a) false and b) part of a disinformation campaign. That's it.
news.stthomas.edu/in-the-news-manj... A placeholder date is different from the "if dead" boolean. Whataboutism over whether SSA's COBOL version has a date type, string, or integer... it's not an argument; it appears you're just not accepting that the government placeholder is 18750520.
I hope the real point of this article is not to dismiss all fact-checking attempts by media that is not clearly right-wing.
I'm not sure how you'd arrive at that conclusion. I would point out, though, that this disinformation was shared by Politifact - a supposed "fact-checking" organization - and a guy who claims to be a "disinformation expert". Keep that in mind when you read "fact checks". Also keep in mind that this obvious disinformation won't actually get "fact checked" by any of them.
Yes, you are very aware how I would arrive at that conclusion.
For everyone else, it's the quotation marks around "fact checks".
When "fact checks" include a literal disinformation campaign, you might want to put quotation marks around "fact checks". They're not checking facts, they're spreading propaganda. And, yes, both sides are guilty of this. But only one side has their "fact checks" show up on facebook and other platforms as gospel truth.
This may be the biggest reason this 1875 thing got so out of control: because it totally supported the point of view that people very much wanted to believe, and they just didn't care whether it was actually true or not, because they liked the story.
Not 1875 but I definitely remember a lot of the Y2K "fixes" we implemented had logic around YY > 75 being CC 19 and < 75 being CC 20. Probably still lurking in the JPMorgan mainframes
Wondering if that's another potential misunderstanding along the way
That's basic "windowing", very common back in the day. The disinformation specifically revolves around May 20, 1875, and is unrelated to windowing.
In your long tirade in defense of the indefensible moron that is Elon Musk, it never occurred to you that you were contributing to the misinformation campaign of Trump? I often use default dates in place of null values myself to make things work. This is exactly what happened here. 1875 was a default date chosen by those defining the database. It has little to do with COBOL.
It has nothing to do with defending Musk, Trump, or anyone else; it is just explaining facts. As for this comment:
That is exactly the claim being debunked here, though - that it was "how COBOL works". As I have said in a few other comments, I have never in my life seen or heard of the year 1875 being used as a default date for anything.
I once used the years of WW1 and WW2 as default dates. Nothing to do with COBOL. I just had to use easily identifiable years in place of null values.
Just because you haven't seen or heard of something doesn't make it impossible. It's a common trait with Republicans and others on the far right to think something hasn't happened to anyone just because it hasn't happened to them. You're not explaining facts. You're presenting a case in support of Musk and Trump that is adjacent to facts, but not representing facts.
I'm not 1) "far right" or 2) "Republican". I'm a normal person. If "May 20, 1875" were a "default date" of some kind the data would make it obvious.
Yet in this article you cherry-pick as DOGE does. You accuse others of disinformation, but go off on tangents, such as showing a table with a boolean of whether a person is dead: not date of birth. You claim you know better than a person who claims to be an old COBOL dev who used the YYYY format (that COBOL for gov just happened to pick 1875 because of standards and measure). I've found sources saying that COBOL for enterprise often utilized ISO 8601 as the date format, and that May 20, 1875 was chosen as a placeholder because of its significance to science.
Edit, since I've been blocked from replying... I don't see any listing of the author working for the federal government. There's no evidence against government developers setting 18750520 as a placeholder: ISO 8601, no matter what the data type was! iso.org/iso-8601-date-and-time-for... The author claims that his example of a dev saying 1875 was a date wherever he worked, or my link to a professor supplying that info to the Washington Post, was just invented last month to counter Musk's whataboutisms. news.stthomas.edu/in-the-news-manj... This one topic is from before the invention of the internet, so it's not going to be a ranked search item on Google. It is a response to Musk's original claim that there were thousands of birthdates from 1875. There very well could have been conditions where government developers set 18750520 as a default placeholder.
Of course you "found sources" - this is a disinformation campaign. I do know better than the supposed "old COBOL dev", which should be obvious. He also backpedaled, in case you missed it. ISO8601 is unrelated to May 20, 1875. Period.
Social security started paying monthly benefits in 1940, for those age 65 1/2. In 1940, if you were 65 years old, what year were you born in? I think the idea here is that this was a default in the local system, not in COBOL itself.
Beyond that, is there any proof that Musk's histogram means anything at all, regardless of this question?
Whether this is true or not, it is not what the original claim was. The original claim was that it is the way COBOL works. I haven't seen a bunch of programmers who worked on the SS system jump in and say "hey I worked on that system, and we just used that date as a default date". I don't think I've seen a single person claim that. People have claimed, as you did, that it might be an explanation, but that's it.
Interesting. It sure would be nice to get more information that we could trust as to what is being found, what is known about the system, and what actions are being taken.
One aspect not mentioned, widows/widowers can legally receive social security payments that would have gone to their spouse. If the age difference upon marriage were high (it happens) an account with an "impossible" age may still be receiving payments. For example, rich dude of 60 marries 20 year old. If she lives to her 90's she'd still be collecting money from his social security, even though he'd be in his 130's if still alive. I wonder if that accounts for some of the impossible ages in the system.
We need more transparency. Take the time to do some audits, stop the current rush and lack of accountability. As it stands, the story line that Musk is "rooting out fraud" appears to be a cover story to allow the filling of government jobs with political hacks. It's unfortunate that deceit is being used to combat deceit, but there you have it. Propaganda and disinformation fills a void.
How good can an article be? Damn!
The Musk post you refer to (and the table you show) states these are the numbers of people with FALSE boolean values for whether they're dead. There's a separate claim in which there are thousands of people with a birth date in the year 1875. In that one, the claim is that the default value was coded for government COBOL as May 20, 1875 (with the date starting with YYYY to account for the date of the standards-of-measure convention, as devs who worked on COBOL for the government have confirmed on forums). news.stthomas.edu/in-the-news-manj... At least current versions of COBOL have a date type (if it hadn't before, I'm sure it was included once the language was revised to be object oriented). ibm.com/docs/en/i/7.4?topic=items-... Either way, if the birth date is recorded as a string or a date type, it's not the boolean of whether a person is dead. Since government databases also keep track of dates going back centuries: gee, maybe that's why they did settle on YYYYMMDD. ISO's actual definition for 8601: iso.org/iso-8601-date-and-time-for...
So it seems there are two columns here - date of birth and whether the person is dead. Don't mix up the two. They more than likely don't have a direct correlation, because of the difficulty of finding a birth or death certificate before digital record keeping, and because the database is very complex. Don't be a Musk apologist and say you're just pushing the truth. It's clear DOGE was taking the whole data set out of context to claim that there are fraudulent checks being sent (when eligibility is based on multiple fields).
As you say, "it is up to the programmer". So the programmer has set a "default" date and therefore it lands on 1875, it is not that hard to grasp really.
Tired of the misinformation too.
In ANSI COBOL-68 (a.k.a. X3.23-1968 COBOL) and later COBOL-74 (a.k.a. X3.23-1974 COBOL), there was no such thing as a default date (but I sure wish there had been), so trying to interpret an S9(6) COMP-3 field containing a date of birth such as 121455 (mmddyy) was a matter of just splitting out (via REDEFINES) the yy part and adding (or assuming) 1900 as the century.
For the programs I worked on at Arizona State University, having a default date supplied by the operating system (VM/CMS or MVS) or COBOL run-time wasn't really even necessary.
There's obviously no "default date", as you can see in the data provided.
Oh how I miss my COBOL book taken from me in a storage auction. I will make sure this article goes everywhere the disinformation has been.
Here's my old one, anybody else recognize this
Kinda curious how the 1875 date came up.
Most likely just a date someone pulled out of their ass and got "telephone gamed" into this story
My guess is that the guy that I linked to in the story took the "150" number, went back 150 years, and then thought of what might have happened in 1875. I have no idea, though.
Or maybe you haven't worked on government systems and want to believe what you want to believe. Maybe there is a reason why 18750520 was used as a standardized placeholder for organizations that had mainframes large enough to hold that (whatever data type it was), and needed that space for storing full birthday information for people born in America before the SSA was established. Apparently that date placeholder was standard for engineering and science because of the start of standards and measure. You have gone down a complete rabbit hole of off-topic topics - even citing Musk's clear deflection... a completely separate column, the "if dead" boolean.