
edA‑qa mort‑ora‑y

Posted on • Originally published at mortoray.com

Programming won't be automated, or it already has been

I'm afraid of a robot taking my job. I don't care about the job part much; I'm afraid of the actual robot. To replace my job it'd have to exhibit human-level intelligence. That's scary. Before that happens though, maybe some of its infantile, and less scary, brethren can make my job a bit easier.

What is automation

It's easy to say automation will replace jobs. But it's not some magic wand that can just replace any job. It's highly industry specific. In manufacturing, automation means converting some raw materials, or baseline supplies, into a finished product with a bunch of moving machines sticking stuff together. In logistics it means fulfilling, loading, and delivering that product to a customer with self-driving vehicles.

Those are both cases of getting from state A to state B. It's easy to understand what automation is there: it's replacing a very tangible process with a machine. Sure, we might keep a few humans around to babysit the machines, or just for decoration, but it's the machines doing the bulk of the work.

This understanding is a bit of a problem for software. We do have a target product, point B. It's the application that executes on a device (desktop, mobile, appliance, whatever). But what is point A?

We have source code already. A compiler takes this and produces something the machine can understand. It may not be an industrial looking robot, but it's certainly a machine that transforms from A to B. As our languages become more abstract and more expressive the role of this machine becomes ever more important. Throw in some build scripts, automated tests, push-button deployment and we have a huge amount of work not being done by a human.

I'm not afraid of a compiler. Indeed, I'd love that it becomes more intelligent. I dream of a future where I get sensible error messages.

But no...

One popular interpretation of automation is that we won't need source code. The machines will be able to "program themselves". It seems a bit fuzzy to me, so let's try to understand what this might mean.

Consider a person sitting in front of their computer, wishing it to do something for them. We'll allow this person to speak to the computer, since voice recognition is a reality, even if not yet perfect. What do they say to the computer?

What if we require them to speak in structured commands, like "load this file", "change this text to this", or "save the file to directory A"? This is just a high-level programming language. It may be a really nice product, but it's not something fundamentally new. It's just another compiler. Albeit a somewhat cool one.
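To make that concrete, here's a minimal sketch of what such a "somewhat cool compiler" amounts to: structured commands parsed into a verb and arguments are just statements handed to an interpreter. The command names and handlers below are invented for illustration, not any real product.

```python
# A toy interpreter for structured voice commands like "load this file",
# "change this text to this", "save the file". Parsing speech is assumed to
# have already produced (verb, arguments) tuples.

def load_file(path):
    with open(path) as f:
        return f.read()

def save_file(path, text):
    with open(path, "w") as f:
        f.write(text)

# Each "voice command" verb dispatches to a handler, exactly like a statement
# in any interpreter loop.
HANDLERS = {
    "load": lambda args, state: state.update(text=load_file(args[0])),
    "replace": lambda args, state: state.update(text=state["text"].replace(args[0], args[1])),
    "save": lambda args, state: save_file(args[0], state["text"]),
}

def run(commands):
    state = {"text": ""}
    for verb, *args in commands:
        HANDLERS[verb](args, state)  # dispatch on the verb

# Example: run([("load", "notes.txt"), ("replace", "2017", "2018"), ("save", "notes-new.txt")])
```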

To be new we'd need to allow the user to make abstract requests, like "update the date in my business documents", or "cross reference my bank statements with the expenses and report on discrepancies". What's more, if the goal is to produce reusable applications, we need this to be even more abstract: "Create a plugin that downloads bank statements, compares them to the expense statements, and produces a standard report".

This is the crux of programming, taking vague, abstract, and often confusing requirements and creating something usable. We need to navigate human understanding as well as a sea of technical options. Writing code is about taking those ideas and making them concrete, and useful.

This is not something AI is anywhere close to being able to do. Even we human programmers struggle to do it at times. It's a level of reasoning way beyond the scope of what we consider automation.

Wait, you say your job doesn't involve this type of reasoning? You just take structured feature requests and blindly transform them into code. Then yes, you will be replaced. But as a programmer you should have automated that long ago and already moved on to other things.

But yes...

There's also a less tangible role for automation. In law it could mean a natural-language-understanding, case-law-searching agent. In medicine, a research-collation and symptom-matching engine. These tools replace some human effort, but they also assist in making better decisions. This is a place where automation can shine. After all, it's not just abstract reasoning that we do as programmers; there's still a bunch of actual coding to do.

I'd love to see static code analysis developed more. These are tools that look at the source code of a program and find errors in it. They've already shown they can help find memory leaks and security holes. This can only get better as time goes on. I've got better things to do than tracking down obscure race conditions and memory loops.
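To show the shape of the approach, here's a toy sketch of such a tool: a checker that walks a program's syntax tree and flags `open()` calls that aren't wrapped in a `with` block, a potential resource leak. The rule and the sample code are made up; real analyzers are vastly more sophisticated.

```python
# Toy static analysis: parse source code (never run it) and report suspicious patterns.
import ast

SOURCE = """
def read_config(path):
    f = open(path)          # potential leak: never closed on the error path
    return f.read()

def read_notes(path):
    with open(path) as f:   # fine: closed automatically
        return f.read()
"""

class OpenChecker(ast.NodeVisitor):
    def __init__(self):
        self.inside_with = 0
        self.findings = []

    def visit_With(self, node):
        self.inside_with += 1
        self.generic_visit(node)
        self.inside_with -= 1

    def visit_Call(self, node):
        # Flag bare open() calls that are not managed by a `with` block.
        if isinstance(node.func, ast.Name) and node.func.id == "open" and not self.inside_with:
            self.findings.append(f"line {node.lineno}: open() outside a 'with' block")
        self.generic_visit(node)

checker = OpenChecker()
checker.visit(ast.parse(SOURCE))
print("\n".join(checker.findings))  # -> line 3: open() outside a 'with' block
```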

The optimizers will also get better. Instead of working only at a low level, where they already do some amazing work, they can look higher in the code. It'd be really interesting to have my tools suggest a rewrite of a function because they've identified it as algorithm XYZ and know a better solution.
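Something like the following, as an invented example of the kind of rewrite I mean: a tool recognizes the hand-rolled quadratic duplicate check as membership testing and proposes the linear, set-based version.

```python
# Before: the pattern an algorithm-aware tool might recognize.
def has_duplicates_naive(items):
    # O(n^2): compares every pair
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

# After: the rewrite such a tool could suggest.
def has_duplicates_suggested(items):
    # O(n): remember what we've seen in a set
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False

assert has_duplicates_naive([1, 2, 3, 2]) == has_duplicates_suggested([1, 2, 3, 2]) == True
```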

I'm also waiting for the next generation of refactoring tools. Identifying logically similar functions and factoring out the common code is something I think an AI could do. There is plenty of refactoring I don't do now since it's often not worth the effort. The same automation could roll over into better source control systems that actually understand the code, not just line-by-line text.
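As a small, made-up example of the kind of refactoring I mean: two near-identical functions, and the shared version a tool could propose after spotting the common structure.

```python
# Two logically similar functions that differ only in their label.
def summarize_expenses(rows):
    total = sum(r["amount"] for r in rows)
    return f"expenses: {len(rows)} entries, total {total:.2f}"

def summarize_income(rows):
    total = sum(r["amount"] for r in rows)
    return f"income: {len(rows)} entries, total {total:.2f}"

# The common code factored out, as a refactoring tool might suggest.
def summarize(label, rows):
    total = sum(r["amount"] for r in rows)
    return f"{label}: {len(rows)} entries, total {total:.2f}"

rows = [{"amount": 12.5}, {"amount": 7.0}]
assert summarize("expenses", rows) == summarize_expenses(rows)
```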

There is so much potential, and I'll welcome all of it.

Lots of potential, no loss

A lot of programming is already automated and there has always been a push to get more. Certainly more automation will minimize certain roles, but mainly it'll improve productivity. Given the volumes of issues most projects have, a significant increase in productivity won't be putting any teams out of work. It'll just result in higher quality software.

To completely remove programmers from the equation would require essentially a human level artificial intelligence. And if I start seeing near sentient robots walking around, my first thought is certainly not going to be, "oh no, it's going to take my job!"

Top comments (14)

Tariq Ali

For various other perspectives on the "can programming be automated" debate, you may be interested in "Will programming be automated? (A Slack Chat and Commentary)" and the Reddit comment thread.

As for my own personal (and ever-evolving) view, I don't really think most of what we do as humans requires human-level intelligence. Perhaps a few things can't be easily automated (such as turning ideas into concrete software), but you will likely need far fewer programmers than before.

What happens to the excess programmers then? Either they find new IT work (which can happen...due to the Complexity Paradox) or they lose their jobs (and we deal with the ramifications).

lepinekong

An old colleague told me this 30 years ago: during the glory days of industry, factory workers thought their jobs required too much intelligence to be automated; when it happened, they were surprised. The same goes for developers today. Even without talking about automation, there are a lot of inefficiencies today that could already be eliminated. As for automation, a lot of the work is just transposing requirements elicited by business analysts (I'm sorry to say many devs, at least in the corporate world, are not capable of understanding complex business without a business analyst). But long term it's not about automation, it's about AI capable of the same and even superior intelligence to humans, see aimonitor.stream/resource/scientis...

nepeckman

What if human creativity can be approximated? There are already machine learning projects that can produce new melodies that sound as if they were written by certain composers, or in a certain style. Given enough data, machine learning algorithms can learn to categorize and produce just about anything. If we theoretically had a dataset of the source code for 1 million web apps, labeled according to what each app accomplished, I don't think it's unreasonable to say that a good machine learning algorithm could spit out programs that achieved different labeled goals. With good NLP that could take requirements in English and translate them into labels on the dataset, I think it could be possible to produce applications automatically. Of course that theoretical dataset is currently impossible to produce, and this is all hypothetical, but my point is that it's not unreasonable to say that one day we could have an AI that approximates human creativity. Maybe with 20 years of improvements in machine learning, and 20 years of building code bases as datasets, a machine learning algorithm could potentially threaten programmers' jobs.

Tsvetan Dimitrov

That's different. Creating music with algorithms is really easy. It's not creativity, it's just math. What he's talking about is real creativity, intelligence, emotion. That's what robots can't have.

I remember the book "The Positronic Man": he had all the information about people, all the knowledge, and the ability to learn really fast, but what he actually wanted was to become a man. He asked for human rights, and they were granted. He was so obsessed with being a man that he developed artificial human organs that absorb energy from food. People started buying his organs and lived forever. But then he decided to implant these organs into himself so he could feel what it is to be human. He was in real pain when his liver stopped working, but he didn't give up. Then he found a doctor to implant a human brain in place of his positronic brain. After several years of pain, he died with a smile on his face, happy that he had been able to feel what it is to be a man.

en.wikipedia.org/wiki/The_Positron...

Kevin Fries

What percent of programming these days is consumed by working on the UI? With more powerful AI (e.g. chatbots, automated agents) the UI becomes nothing more than the results of what you wanted to have happen. With a lot of optimism, programming will be reduced to working on the algorithm. Possible?

Rafał Radziszewski

This is what programming is all about. The rest is just clutter.

What people misunderstand about programming is thinking that it's hard because the tools are hard. Nope, the tools are manageable and sometimes just great. Programming is hard because people do not realise the scope of implicit assumptions and decisions that are part of natural language, but are unacceptable to computers.

If a non-programmer wants a feature, he thinks about the happy path. A programmer needs to think about all the paths, and usually performs some kind of linearisation to avoid an exponential explosion. This is the heart of programming skill. We might change our tools, abandon source code, and use some user-friendly interface, but those decisions will still have to be made, and so programmers will be needed for quite a long time (that is, until "true AI" emerges).
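A small sketch of that difference (the CSV-import feature and its error-handling choices here are invented purely for illustration): the first function is the happy path a non-programmer pictures; the second is the same feature once every other path has been decided.

```python
import csv

def import_expenses_happy_path(path):
    # the feature as imagined: read the file, take the amounts
    with open(path) as f:
        return [float(row["amount"]) for row in csv.DictReader(f)]

def import_expenses(path):
    # the same feature with the other paths decided explicitly
    amounts = []
    try:
        with open(path, newline="") as f:
            for line_no, row in enumerate(csv.DictReader(f), start=2):
                raw = (row.get("amount") or "").strip()
                if not raw:
                    continue  # decision: skip blank amounts
                try:
                    amounts.append(float(raw))
                except ValueError:
                    raise ValueError(f"line {line_no}: {raw!r} is not a number")
    except FileNotFoundError:
        return []  # decision: a missing file means no expenses
    return amounts
```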

Johardmeier

50 years ago, IBM assessed the long-term market potential of personal computers to be below 1 million units. 25 years ago, reliable optical character recognition on a PC started to seem possible. Today we know that it will soon be possible to sit in a driverless car, say "take me to the opera", and it will.
We always tend to predict the future in terms of the past. Most of the problems we as programmers solve we would not have without computers. We can't say now what the future of programming looks like, but it will be here sooner than we think, and as others said it will probably involve far fewer programmers than we have now.
And of course creative people will always find something to do, perhaps something to improve, but we will do it alongside AIs that do things we cannot imagine today.

edA‑qa mort‑ora‑y

I am not saying that AI is not coming. I believe in a future with true, sentience-level artificial intelligence. My position in the article is specifically about how the near future applies to programming.

If you consider what I say about the job in this article, and in my other article, you'll see the inability to automate programming relates to how I define what a programmer does.

The level of AI needed to automate the job away is the same as general human-level intelligence. At that point the question of whether it replaces programmers is irrelevant, since it replaces all human jobs.

In the near term I truly hope pseudo-AI (what we have today) will help automate many parts of programming. Perhaps there are some kinds of coding jobs that might be minimized by this. Perhaps it enables good programmers to get more done. This is good.

Johardmeier

It always depends on what the near future is. I tried to make the point that it doesn't take human-level intelligence to replace a big part of the programmers working now.
If we think about it in terms of the past, we have to wait for true sentience-level AI that does the exact job we do now. But with paradigms shifting like they have in the last few years, we could find ourselves having skills that are no longer needed. Not in a year, but in ten?
The first iPhone came ten years ago and today people don't know how to use maps anymore. Or timetables. And there's an app for everything, and if there isn't, you use Siri (not yet, but soon ;). PCs are in decline and, most importantly, outside of work no one uses laptops anymore.
My point is: if we look at the future with today's knowledge, we can probably predict the next five years. If that's your horizon, I think you are very right with your assessment.

Tony Worm

I believe 90+% of what programmers produce can be auto-generated. Granted, this is not AI, but it is enabled by a transpiler from design space to implementation.

Prototype is here: github.com/hofstadter-io/geb

edA‑qa mort‑ora‑y

Don't mistake coding for programming.

Programming involves talking to somebody, listening to a bold idea, and then producing a product that can do that. That'd be an unbelievable feat for an AI to accomplish. Creating a better design language, or tool, doesn't change anything; it just creates another programming language.

Refer to my other article on what a programmer does.

Jason C. McDonald

Thanks for bringing some common sense to this ongoing debate. I've been trying to explain essentially the same concepts in IRC lately - programming is interfacing between two incompatible pieces of hardware: the human brain and the computer CPU. Until the computer fully emulates the human brain, we'll still need programmers.

And anyway, who fixes the bots when they glitch?

Kevin G. R. Greer

I agree, but a lot of the code written today is just adapter code: adapting from the database to the business objects, from the business objects to/from the network to/from the GUI to/from XML or JSON, etc. While working on Google's FOAM Framework, we found that we could typically eliminate between 80-98% of the code required to develop systems, with the higher end of that range being more typical. So while we still need programmers, we need fewer of them.
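A tiny, hypothetical example of the adapter code described (the record shape and field names are invented): the same expense record translated by hand between JSON, a business object, and a database row. Generating mappings like this is exactly the kind of code such frameworks aim to eliminate.

```python
import json
from dataclasses import dataclass

@dataclass
class Expense:
    description: str
    amount: float

def expense_from_json(text):
    # adapt the wire format (JSON) into the business object
    data = json.loads(text)
    return Expense(description=data["description"], amount=float(data["amount"]))

def expense_to_row(expense):
    # adapt the business object into the tuple an SQL INSERT expects
    return (expense.description, expense.amount)

e = expense_from_json('{"description": "hosting", "amount": 12.5}')
assert expense_to_row(e) == ("hosting", 12.5)
```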

This Video explains how it works.