re: Programming won't be automated, or it already has been


For other perspectives on the "can programming be automated?" debate, you may be interested in "Will programming be automated? (A Slack Chat and Commentary)" and the Reddit comment thread.

As for my own personal (and ever-evolving) view, I don't really think most of what we do as humans requires human-level intelligence. Perhaps a few things can't be easily automated (such as turning ideas into concrete software), but we will likely need far fewer programmers than before.

What happens to the excess programmers then? Either they find new IT work (which can happen, thanks to the Complexity Paradox) or they lose their jobs (and we deal with the ramifications).


An old colleague told me this 30 years ago: during the glory days of industry, factory workers thought their jobs required too much intelligence to be automated; when it happened anyway, they were surprised. The same goes for developers today. Even setting automation aside, there are a lot of inefficiencies that could already be eliminated. As for automation, much of the work is just transcribing requirements elicited by business analysts (I'm sorry to say that many devs, at least in the corporate world, are not capable of understanding complex business domains without a business analyst). But long term it's not about automation; it's about AI reaching the same or even superior intelligence to humans. See aimonitor.stream/resource/scientis...


I can imagine that programmers tend to have more creativity than robots. A robot would take a set of tasks and just do them. The human aspect is to redefine a task to form a better product.

Because of this, I actually think human-level intelligence should be a must. But in the end it is subjective, and settling it would require an actual comparison between a robot with human-level intelligence and one without.


What if human creativity can be approximated? There are already machine learning projects that can produce new melodies that sound as if they were written by certain composers, or in a certain style. Given enough data, machine learning algorithms can learn how to categorize and produce just about anything.

If we theoretically had a dataset of the source code for 1 million web apps, labeled according to what each web app accomplished, I don't think it's unreasonable to say that a good machine learning algorithm could spit out programs that achieved different labeled goals. With a good NLP model that could take requirements in English and translate them into labels on the dataset, I think it could be possible to produce applications automatically. Of course that theoretical dataset is currently impossible to produce, and this is all speculative, but my point is that it's not unreasonable to say that one day we could have an AI that approximates human creativity. With 20 years of improvements in machine learning, and 20 years of code bases accumulating as a dataset, a machine learning algorithm could potentially threaten programmers' jobs.
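To make the "requirements in English → labels on the dataset" step concrete, here's a toy sketch. Everything in it is invented for illustration (the labels, the corpus, the `classify` helper); a real pipeline would use a trained NLP model over an enormous labeled corpus, not word overlap:

```python
# Hypothetical mini-corpus: app-category labels mapped to words that
# describe what apps in that category do. Entirely made up for this sketch.
labeled_corpus = {
    "todo app": "create read update delete tasks list due dates",
    "chat app": "send receive messages users rooms realtime",
    "blog engine": "write publish posts comments tags archive",
}

def classify(requirement: str) -> str:
    """Map an English requirement to the label whose description
    shares the most words with it (crude bag-of-words overlap)."""
    words = set(requirement.lower().split())
    return max(
        labeled_corpus,
        key=lambda label: len(words & set(labeled_corpus[label].split())),
    )

print(classify("I need users to send and receive realtime messages"))  # → chat app
```

The point isn't that this works well — it obviously doesn't — but that "English requirement in, label out" is an ordinary classification problem, which is exactly the kind of thing ML already does given enough data.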

That's different. Creating music with algorithms is really easy. It's not creativity; it's just math. What he's talking about is real creativity, intelligence, emotion. That's what robots can't have.
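To illustrate how little "creativity" algorithmic melody generation can involve, here's a minimal sketch: a seeded random walk over a scale. The scale, step sizes, and seed are arbitrary choices of mine, not any particular project's method:

```python
import random

# The C-major scale; any pitch set would do.
SCALE = ["C", "D", "E", "F", "G", "A", "B"]

def melody(length=8, seed=42):
    """Generate a 'melody' as a constrained random walk over the scale.
    Deterministic for a fixed seed: pure arithmetic, no creativity."""
    rng = random.Random(seed)
    idx = 0  # start on the tonic
    notes = []
    for _ in range(length):
        notes.append(SCALE[idx % len(SCALE)])
        idx += rng.choice([-1, 1, 2])  # small steps keep it singable
    return notes

print(melody())
```

A few lines of arithmetic produce something melody-shaped, which is arguably the commenter's point: generating plausible surface output is cheap; intent behind it is the hard part.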

I'm reminded of the book "The Positronic Man". The robot had all the information about people, all the knowledge, and the ability to learn incredibly fast, but what he actually wanted was to become a man. He asked for human rights, and they were granted. He was so obsessed with being a man that he developed artificial human organs that absorb energy from food. People started buying his organs and living forever. But then he decided to implant these organs into himself so he could feel what it is to be human. He was in real pain when his liver stopped working, but he didn't give up. He then found a doctor to implant a human brain in place of his positronic brain. After several years of pain, he died with a smile on his face, happy that he had been able to feel what it is to be a man.

