
I've been building software for 40 years. But I want *you* to tell me about dev in 1986...

John Munsch on April 01, 2026

It's 2026 and I started professional software development in 1986, when I took a summer internship at Tandy Computers in Fort Worth, TX after my j...
Jess Lee • Edited

COBOL pops to mind. And... the hole punch thing would have been way before this, right? 😬

Pascal CESCATO

Perforated media existed well before computing (barrel organs, early programmable looms), but dedicated mechanical punching devices likely appeared in the early 19th century with the industrial production of Jacquard cards. The same paradigm was later repurposed for computing, most notably by Herman Hollerith in his tabulating machines.

John Munsch

What Pascal said is 100% correct; punch cards had existed for quite a long time, but I just missed them (and COBOL/Fortran too). When I went to college in 1983, the computing center had a wall of backless cubbies where people would turn in a stack of cards; someone would retrieve and run them, and the stack would come back with a printout that showed the results of the run. I never saw anyone actually do that, and I saw them roll out the very last card punch they had on campus (at least according to one of the folks working there).

When I went to Tandy, we had a couple of guys working there who really only knew COBOL, and when we tried to move them to C, they were resistant and eventually ended up being let go. I never heard how they did after that, but I assume they made bank when Y2K rolled around a decade or so later.

Ben Halpern

GUI is a new thing, but not super mainstream, email is poking its head out, and devs are excited by the personal computer revolution, but circumspect about it.

That’s me riffing on my ideas of what 1986 might have been like.

John Munsch

GUIs really started to show up after just a few years. Tandy took a run at it with the first version of DeskMate, which was character based (VGA and higher-resolution graphics cards weren't quite a thing yet in 1986); they used character-based UIs to provide simple editors and other software they could bundle with the machine. We actually did not have a network in the group I was in upon first joining, but one was coming reasonably soon (token ring, Novell NetWare). Until then we were moving everything around on floppy disks.

As a matter of fact, in the group I first worked in, we were building Varsity Scripsit (a new PC version of an older editor Tandy used to sell on their Unix machines). Of the eight machines used in our group, I believe only three had hard drives; most were floppy only. You have not lived until you've done software development using Microsoft C on a floppy-only machine. You edit your code on a disk that also holds part of the compiler. Run the build and it does one compilation step, copies the build product onto the next disk (which might require multiple swaps), runs the next build step, and so on, probably crossing multiple floppy disks in the process. If it failed at one of the steps, like the linker, you went back to the beginning to try to correct it. No millisecond builds in that environment :)

I will say this though: we saw what we had and its incredible limitations, but we also saw how fast it was changing. We wanted literally every hardware change to happen overnight, because we needed ten years of progress for the bottleneck to move from the hardware to the software.

Alen P.

I can only imagine typing C code on a green-screen terminal in '86 with just a blinking cursor. Version control was piles of floppy disks everywhere, each named project_final_v3_USE_THIS_ONE 😂 It's a miracle anything worked. Respect!

John Munsch • Edited

LOL. I was fortunate that the green-screen terminal thing was largely restricted to when I was at school. For most classes we time-shared VAX machines (like the VAX-11/780). Put a whole class of people doing development on one of those at the same time and they could bring it to its knees if everybody ran the compiler at once. So a rather crude token system came to be (often an empty soft drink can sitting on top of a terminal, showing that you had the right to compile right now). Only those with a token could compile; everybody else had to stick to editing their code. The tokens would then move around as people got their compiles done and shifted back to editing again.

When I went to work, Tandy was shifting from Unix machines and home computers they had built internally (like the TRS-80) to IBM PC and PCjr clones. Most every machine I worked on had a display card in it capable of displaying text and various graphics characters in color! Woo. This spot in the video will give you a better idea of what the heck I'm talking about: https://youtu.be/X_mFNBRXObg?si=amtuwVul7ERBOk_T&t=923

Nicholas Stimpson • Edited

1986 was about the time of my first dev job. I was creating and maintaining code for three different platforms: 8088 MS-DOS in C, Z80 assembler for CP/M-based Kaypros, and an 8035-based embedded tool, also in assembler.

The PC had a CGA screen: 4 colours, low resolution, and a flicker that hurt your eyes if you looked at it for very long. We had full-screen text editors - Brief was a godsend - but nothing at all resembling version control.

Build tools, beyond a few self-created batch files, were also non-existent, but then things were simpler. Apart from the standard C library and OS-supplied APIs, code had no external dependencies that needed managing.

The Kaypro had a 10MByte hard disk.

Learning was done from books, manuals and experimentation. Personally, I didn't have any more experienced colleagues to lean on, although that was undoubtedly an unusual situation.

Transferring files between computers was possible - over RS-232 serial ports at very low bit rates - but when your entire storage capacity was only 10 MBytes, files tended to be very small anyway. Sneaker-net - copying files to a floppy, carrying it to the other machine, and copying it again - was generally faster and more reliable.

Development methodology was agile, although we wouldn't have called it that, and didn't follow any formal process. Someone would ask for a feature, I'd listen, code it and say "there you go". No sprints - it took as long as it took. If it wasn't what was wanted, we'd iterate.

Oliver Kroener

@alohci interesting, a hard drive, that was advanced for those times... I started with an ITT 3030, with two 5¼" floppy drives and a green monitor. Boot time of CP/M was around a second...

I started with 8080 assembly, and I accidentally wrote a cross assembler for Z80 commands, since I only had the asm.com assembler. Didn't know anything about crafting a compiler and so forth ...

Tim Green

I so want to chime in here, because I too started out at this time... so I have firsthand knowledge, but that would be too much of a spoiler...

The one thing that I will mention, rather than software or equipment, is process... compute was expensive, so planning was meticulous: flowcharts, memory registers, plans reviewed before coding began... it's actually a methodology that we're returning to now, considering that in the age of the LLM, context is king.

I will be watching these comments with great interest!

Scott Reno

C... lots of C and tweed suits

John Munsch

100% correct on C. C was king at this time, but a lot of people don't remember that C compilers like Microsoft C had the ability to create C functions with Intel assembly inside the function, for when you had to drop down to a lower level for higher performance. It was a way to embed some assembly into your overall higher-level-language application. Today it would be like having a 95% JavaScript app where part of the code was written in raw WASM.
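
For flavor, here's a minimal sketch of what that looked like. This is a hypothetical example, not code from any actual product of the era; the `_asm` block syntax shown is the style Microsoft's 16-bit DOS compilers used, and exact keywords and support varied by compiler and version.

```c
/* Hypothetical sketch: a C function that drops into Intel assembly
   for one operation (reading the current BIOS video mode). Syntax
   follows the _asm block style of Microsoft's 16-bit compilers;
   details varied by compiler and version. */
unsigned int read_video_mode(void)
{
    unsigned int mode;

    _asm {
        mov ah, 0Fh    /* BIOS video services: get current mode */
        int 10h        /* AL comes back holding the mode number */
        xor ah, ah     /* clear AH so AX holds just the mode */
        mov mode, ax   /* store the result in the C variable */
    }
    return mode;       /* the rest of the app stays in plain C */
}
```

The calling code never knows the difference; it calls `read_video_mode()` like any other C function, which is exactly what made the technique so handy.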

Maybe you wore a suit if you worked at some places, but fortunately I never worked at any of them. The worst I ever dealt with was beige slacks and shirts with collars, and even that was rare.

Ben Munat

In 1986 I was writing Pascal on a DEC PDP-11 at Hampshire College in Amherst, MA. It was definitely well past punch-card days by then, but the PDP was in a glassed-in room and was only accessible via terminals arranged around the main room. You had to type in your entire Pascal program and then run it. If it crashed, you printed it out on continuous fanfold dot-matrix paper (so one long scroll for the entire length of the program). Then you got to read through that whole thing to figure out why it went wrong. (I think there must have been an error type and line number involved, but I can't remember for sure.) We also had a computer lab with a few Apple Macintosh computers... the original (it came out just a couple of years earlier). I took a winter-term course on C and still have my copy of The C Programming Language on my bookshelf. There was no internet or email; I didn't even hear of the concept of email until the mid-nineties. Computers were strictly something used in a computer lab; no one that I remember actually owned one.

Tim Green

Well, you might say that it was well past punched-card days, but not for everyone. Early adoption was not a thing in government, local councils, and other large-scale institutions. In 1987 I was working for a local council and everything was punched card, except for a couple of paper-tape machines. We were using multi-platter Winchester hard drives weighing around 5 lbs each, and stacks and stacks of magnetic tape, but punched card was still the primary way to get your code into the machine.

Ben Munat

Ouch. I had no idea then or now that anyone was still using punch cards. Interesting!

Joe Henderson

In 1986, we were still targeting Minuteman IIs with punched Mylar tape.

Mike Ritchie

My dad had a giant 8086, complete with 5¼” floppy disks and a monochrome screen. When my friends all started getting the new Commodore 64, I asked my dad if we could get one too. His non-verbal reply was to build a colour graphics card and two joysticks, and then tell me that if I wanted any games I'd need to write them.

So I learned how to spec out a program using a paper flowchart, how to make sprites in BASIC, how to map joystick outputs to inputs in the program, and how to do it all in 700 bytes.

Things have come a long way since then!

John Munsch

BASIC was my first language (though on the TI-99/4), and in those days every version of the language was different (including some versions that had no line numbers at all). If you had a game you wanted from a magazine like COMPUTE! or a book like BASIC Computer Games or More BASIC Computer Games, you had to figure out how to translate it to your version of the language.

John Rodger

'86... I think it was still just the black square floppy disks? Small programs could have been written and shared on those, maybe school assignments as well; otherwise the source code might have been printed out on paper and submitted? (Totally guessing.)

I think IBM was the main player, so it would have been PC-DOS. Apple had created a home computer by '82 or '83, I think, so the school and the job might have had those, with BASIC. Dev work, I guess, would have been in the terminal with COBOL or C, or a few others. Look forward to reading the other responses!

MergeShield

1986 meant every dependency decision was intentional, because you felt the cost of each one. Now agents install whatever fits the prompt and nobody reads the dep tree. Some of the discipline from constrained environments was load-bearing.

John Munsch

This is actually a good observation, and something that came up repeatedly in the first decade or so I was working. People would decry the fact that the computers were faster, with more memory and more storage, yet the apps did not seem to get faster. And it's true: if you're not worrying about every byte and cycle, it's easier to write stuff in a lazy way and not spend so much time on everything. So writing the code sped up, but it certainly wasn't as efficient as it could be, and we still see that today.

But on the flip side, it can be a lot more fun to put something together quickly and easily than to fret over every line.

Carlos Arias

I started when I was 8 years old, in 1988, on the Commodore 64 coding in QBasic; then I moved on to Visual Basic 3 on Windows 95 and graduated to VB6, building Windows applications and hacking software for CompuServe and AOL, lol. By 1996 I had started coding Java, JSP, and HTML. The rest is history. 38 years later, I'm having a blast coding AI.

Barry Roberts

My professional experience was limited to some dBase (II, maybe?) fixes to a shareholder dividend program and BASIC fixes for a loan amortization program, both on original IBM 5150 PCs. But hey, I was just a sophomore in college, and I was actually hired into the stock/mail room at the bank, not as a programmer. But in the summer of 1985 I had unlimited time on an S-100 system donated to my church, and I wrote a screen-oriented editor in C for CP/M. Dunno what happened to those 8" floppies. That made me miss UCSD Pascal and C on the Vaxen at school running BSD Unix.

sfrobins

I used a Commodore PET with 4K of RAM and a magnetic tape drive (1980). It booted into a BASIC interpreter. I wasn't a professional, just a lucky kid. Before that, I took programming in high school, which consisted of sitting at a teletype (input: keyboard; output: a roll of paper) that talked to a school district mainframe (1979). Then I joined the army from '82 to '86 and got a little more sophisticated working at the NSA.

Bob Lied

In 1986, I was five years out of college, working for a large corporation on a big team. The environment was Unix, and the programming language was primarily C. Production work was done on big iron (Amdahl, IBM 370) running Unix variants, but there were a lot of DEC VAX minicomputers for other things, primarily documentation and test systems.

There was a LOT of paper; design and code reviews were done by groups reading through paper copies. But almost never in color -- that was way too expensive. "Viewgraphs" were a literal thing -- presentations were done from projected transparencies. Documentation was done with text-based specifications (nroff/troff). Without graphics terminals, the edit cycle usually included printing and marking up with pens. Printing was not an office task; printouts were distributed at the computer center, and you had to walk over there to get them.

Networking was available but slow -- the quickest way to move a lot of data to another building was to put it on a tape and drive it over. Data speeds and memory were measured in K, maybe M if you were lucky; G was a fantasy. Memory was still expensive, and optimizing for size was a useful skill. My desktop was an 80x24 character terminal; my editor was vi.

Domain-specific languages were popular; we had quite a few specialized little languages for subsystem niches, often built on lex/yacc (a sketch of the general shape follows below). They were invariably called X-Design-Language (DDL, PDL, RDL, MDL, SDL spring to mind). Object-oriented design and programming was a hot idea, but the seminal books (Booch, Rumbaugh, etc.) had yet to appear. The coolest of the kids were looking at C++, Smalltalk, or Common Lisp, but not in production.

Waterfall processes were king, with a lot of chafing against the frequent review steps. We had diff-based change control and version management. Between the manual processes required, the frequent design/review meetings, and the common practice of co-locating teams physically close together, software development had a lot of personal interaction and was quite a social experience.
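
To make that concrete, here is a hypothetical, minimal sketch of the shape of one of those little yacc-built languages mentioned above. The grammar, token, and semantics are invented for illustration; no real DDL/PDL looked exactly like this.

```c
%{
/* Toy "design language": a yacc grammar with C actions.
   Build (assuming a classic yacc, or bison in yacc mode):
     yacc mdl.y && cc y.tab.c -o mdl */
#include <ctype.h>
#include <stdio.h>
int yylex(void);
int yyerror(const char *msg) { fprintf(stderr, "error: %s\n", msg); return 0; }
%}

%token NUMBER

%%
spec  : /* empty */
      | spec stmt
      ;
stmt  : 'm' NUMBER ';'   { printf("module of size %d\n", $2); }
      ;
%%

/* Toy lexer so the sketch builds standalone:
   runs of digits form a NUMBER; anything else is returned as-is. */
int yylex(void)
{
    int c;
    while ((c = getchar()) == ' ' || c == '\t' || c == '\n')
        ;
    if (c == EOF)
        return 0;
    if (isdigit(c)) {
        int n = 0;
        do { n = n * 10 + (c - '0'); } while (isdigit(c = getchar()));
        ungetc(c, stdin);
        yylval = n;
        return NUMBER;
    }
    return c;
}

int main(void) { return yyparse(); }
```

Feed it `m 64;` on stdin and it prints the parsed declaration. The real ones grew far past this, of course, but the lex/yacc skeleton underneath was much the same.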

Henry Camacho • Edited

The '80s for me was a lot of Apple II and Beagle Bros tooling. I wrote code on the C64 using TSDS (The Total Software Development Kit), which had the first assembler I ever used on the C64. And a lot of C and C++ on the PC with Borland C++. Oh, and in the '90s I was using an editor called Brief, with Poly Version Control. Man, I loved Brief.

Marco Sbragi

1986 was the year I truly began my career in IT. I was an electrical systems teacher at a vocational school. Some colleagues taught programming, and they had moved from Honeywell systems to the first PCs. In addition to teaching, they collaborated with a company, and given my passion for programming (at the time I had developed small management systems for the Commodore 64 in BASIC), they asked me if I wanted to lend them a hand. I started working with them. We developed management software in DataFlex, a semi-compiled 4GL language (en.wikipedia.org/wiki/DataFlex). The OS was Concurrent CP/M, a multitasking, multi-user alternative to MS-DOS that was still compatible with it. What's more, software developed with DataFlex could be ported to Unix/Xenix/Novell without recompiling.

Harsh

This is such a refreshing post to read. 🙏

A developer with 40 years of experience asking us to tell him about the past: that takes genuine humility. Most people with that much experience would be telling stories, not asking questions.

I wasn't alive in 1986, so I can only imagine. But here's what I think it would feel like for a developer from 2026 to be dropped into 1986:

- No Stack Overflow. No GitHub. No Google. A bug meant a book, a printed manual, or asking the person next to you.
- One typo could mean re-typing everything (no syntax highlighting, no autocomplete).
- Shipping code meant floppy disks, not a git push.
- No one asking "what's your tech stack?" You used what you had.

What genuinely amazes me is that people built incredible software with those constraints. I complain when my build takes 2 minutes; you probably waited hours for a compile.

The flip-side question for you: what do you think we're overcomplicating today that was simpler for you in 1986?

Thanks for this; it's a reminder that experience isn't about knowing more. It's about having asked questions for 40 years. 🙌

John Munsch

Yup, none of the quick and easy references we are used to from our WWW days. But! As I mentioned in another reply, Usenet was there, and people were turning to it to both ask and answer tech questions (though, like Reddit, its closest current analog, it was mostly other things). And the fast answers were colleagues and books.

Most of the tech stack was paid for, a huge change from today. While most devs today have paid for few or none of their tools (definitely not the OS, compilers, or most libraries, and maybe not even an editor), back then you paid for most everything. When Windows 95 came out, it was a huge event at the computer store. They had stacks of boxes, and lots of ancillary software makers made sure they were doing big releases at the same time: a box of fonts or screen savers (flying toasters!) priced to entice you to buy add-ons for your new OS.

What are we overcomplicating today? I'm not sure. I can say that a lot of overhyping is happening in AI, and that comes from someone who uses it all day, every day, at my current employer; not just as a tool to help me build stuff, but also as a fundamental piece of the software we're building. It's neat, and I think it will have impacts for decades to come, but it's being oversold by people who think it can do the same thing as a great software engineer, lawyer, accountant, writer, artist, [insert favorite job that AI is supposed to replace here], only because they do not do that job today and thus cannot recognize the difference between barely adequate and exemplary output. In my experience, you should regularly ask AI questions, or have it solve problems, where you already know the answer. If you do, you see the cracks. If you only ask things you don't know or cannot do, it looks great!

OK, maybe overcomplicating is how quickly a lot of people turn immediately to huge back-end-as-a-service providers. Like, you couldn't possibly take a shot at a piece of software using the bare-minimum bits and bobs, a cheap VPS and a database. You need to start day one on AWS, Google Cloud, or Microsoft Azure, even if you have nothing complete and no visitors.

Concentrate on building something minimal and market/promote it. Your biggest fight at first is getting people to come. If you don't focus on that, as the author Cory Doctorow says, “It's very hard to monetize fame, but impossible to monetize obscurity.”

Harsh

Thank you for the detailed response this is gold. 🙏

The paid-tools point really puts things in perspective. We complain about a $10/month subscription today; you probably paid more for a box of fonts than I've paid for my entire dev toolchain.

The AI point resonates hard. Ask it questions you already know the answer to: that's the litmus test. Most people only ask things they don't know, so AI looks magical. Ask it something you're an expert in, and the cracks show up fast.

And the Cory Doctorow quote: saving that one. “Impossible to monetize obscurity” is going on my wall.

Really appreciate you taking the time to write this. Conversations like this are why I love this community. 🙌

Alex Stone

What a perspective shift. I just started building digital products a week ago and already feel the pace change. Can only imagine what 40 years of watching the industry evolve looks like from the inside. The one constant seems to be: the people who ship, win. Not the people with the fanciest tools. 1986 to 2026 — same game, different stack.

Tim Green

The key thing for me, and I'm sure for the OP, is how often we've seen the same trends and cycles repeat - albeit rebranded, renamed, and promoted in a different way, but still the same.

A classic example of that is cloud compute, or Infrastructure as a Service (IaaS), which actually existed in the 1960s as time-sharing of mainframes: institutions took their super-expensive mainframe computers and rented out time slots to users, who paid based on usage. It was done to keep the mainframe running 24/7 to earn its keep.

Daniel Balcarek

Curious: was open source or shared code even a thing back in 1986?

Peter Shinners • Edited

A variation of open source in the '80s was printed magazines (like COMPUTE!'s Gazette). These had pages of printed program source (in BASIC or assembly). When you found an interesting one, you would type it in, line by line.

archive.org/details/computes.gazet...

Nicholas Stimpson

Gosh, that takes me back. Personal Computer World (in the UK) used to print BASIC programs in their magazine. A colleague who wasn't a programmer would type the programs in, then when they didn't work because of a copying mistake, would bring the program in to the office to ask us to debug it!

Daniel Balcarek

That’s really interesting, thanks for the explanation and the link!

John Munsch

Peter mentioned one version of it: apps from magazines and books (often simple games). But when I started at Tandy, one of the interesting things was that we had access to Usenet. It was one of the manifestations of the Internet before there was an Internet.

Usenet was (and still is) a store-and-forward messaging system with different topics you would subscribe to. You would pull new messages whenever you connected to your local source; it would send them down to you, take your replies into its own setup, and then forward them on to other Usenet systems, and so on. Imagine the Internet, but the packets are individual messages and they propagate in and out very, very slowly.

Not surprisingly, source code for games like Hack and Larn, utilities, editors, etc. was posted constantly. It was broken up across dozens, sometimes hundreds, of posts, with tools to help you extract the pieces and put them back together (but you might still have to build that code on your own machine). Other tools handled character-mapping issues so you could embed binary data into posts (like pictures).
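
The character-mapping trick was typically uuencoding: every 3 raw bytes get repacked into 4 printable ASCII characters, each carrying 6 bits, so binaries survived transports that only handled text. Here's a minimal sketch in C of just that core mapping; this is an illustration, not any particular tool's code, and real uuencode also adds a length character per line plus begin/end markers.

```c
#include <stdio.h>

/* uuencode-style mapping: 3 input bytes -> 4 printable chars.
   Each output char carries 6 bits, offset into the printable
   ASCII range starting at ' ' (0x20). */
static void encode3(const unsigned char in[3], char out[4])
{
    out[0] = 0x20 + ((in[0] >> 2) & 0x3F);
    out[1] = 0x20 + (((in[0] << 4) | (in[1] >> 4)) & 0x3F);
    out[2] = 0x20 + (((in[1] << 2) | (in[2] >> 6)) & 0x3F);
    out[3] = 0x20 + (in[2] & 0x3F);
}

int main(void)
{
    const unsigned char raw[3] = { 'C', 'a', 't' };
    char enc[5] = { 0 };

    encode3(raw, enc);
    printf("%s\n", enc);   /* 4 printable chars for 3 raw bytes */
    return 0;
}
```

On the receiving end, uudecode reversed the offset and reassembled the 6-bit groups; splitting and rejoining the multi-part source postings was the job of tools like shar and friends.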

So even from the start of my career, this subset of the Internet was an everyday thing for me. The World Wide Web just took that to the next level, because there were consistent tools to render stuff and to interact with servers in real time (which meant real-time access to the Internet suddenly became a big deal for Tandy).

Dave Newton

ADM-3As and Wyses, early PCs, Borland compilers, blue-background text editors, CASE tools, assembly, 🤬 extended vs expanded memory, TSRs, IBM Model 80 keyboards

John Munsch

As I mentioned above, Tandy was transitioning to being a PC clone house, so I had little to do with terminals, but early PCs were our bread and butter. In particular, Tandy liked to clone IBM's PCjr and expand on it. They reused the analog-to-digital/digital-to-analog converters they were including for joysticks to give themselves digital audio, so within just a few years they were hard at work on recording and playing back sound and playing music, well before Sound Blaster cards became ubiquitous.

Here's software I worked on, though I was in no way a principal; another developer did all of the heavy lifting: youtube.com/watch?v=hS-ColSsPZY

Cfir Aguston

Resource limits in 1986 (bytes, CPU, I/O) shaped engineering habits: smaller code, tighter testing, and manual operations that often prevented the deployment mistakes we accept today.

John Munsch

We invented new kinds of deployment mistakes you've never even thought of :)

Imagine shipping a piece of software on floppies, in a box, and then finding a bug in your game or whatever after it's already in people's hands. You can fix the new copies before they ship, but the others are just out there. Hopefully whatever it is is obscure and hard to trigger, because if it's not, you're stuck: you have no way to contact those users, and there's no website they're going to hit. You (and they) are hosed.

Alexander Tyutin

I wrote my first code in 200X but feel the same 😆