Developers fear their own obsolescence at the hands of artificial intelligence, according to a new survey of over 550 software developers to be released this week by Evans Data Corp. When asked to identify the most worrisome thing in their careers, a plurality of 29.1% cited "I and my development efforts are replaced by artificial intelligence." This outranked platform concerns: 23% worried that their targeted platforms would become obsolete, and 14% that the new platform they're targeting wouldn't catch on.
The thought of obsolescence due to AI was also more threatening than becoming old without a pension, being stifled at work by bad management, or seeing their skills and tools become irrelevant. While the developers who worried about AI were found across industries and platforms, one strong correlation stood out: they were more likely to self-identify as loners rather than team players.
Dev Bootcamp is a 19-week accelerated coding program designed to teach people how to become web developers. Graduates of this program are then invited to an "alumni Slack", where they can talk with other alumni about their future careers in programming. (Disclosure: I graduated from this program and was invited to this chat.)
On February 1st, 2017, a discussion took place in the alumni Slack about the future of automation in software development, sparked by an off-hand comment in a separate conversation. I lightly edited the transcript below after getting permission from those who participated. To protect the identities of the people involved, their names have been changed.
daphney.cronin - "Coding is a commodity these days and so while you do need to be competent, the thing that's going to win the offer is your ability to click with the interviewers and your genuine desire to be part of the team and contribute to the product."
This statement makes me fear for the future of my current profession. It's true, of course, but the commodification of a skillset will ultimately lead to the automation of that same skillset.
adaline_stracke - I think the automation of development is pretty far off though. There are only so many things that a WYSIWYG editor can do, and the code isn't very maintainable. The main challenge that I see in programming isn't writing code, it's writing code that is human readable because computers are dumb and AI hasn't progressed far enough to debug the way a human can.
The WYSIWYG is a “for instance” BTW.
kallie - I dunno, I think making a computer drive a car is a lot harder than making it write code.
adaline_stracke - I disagree. Making a computer drive a car is a series of concrete decisions, whereas writing code is a series of decisions that have opinion behind them.
skittles_mcbangbang - I see things like 'serverless infrastructure' as a harbinger. Take a look at AWS Lambda: you put code on a webpage, and it gives you an endpoint to hit.
How long before someone writes some sort of adapter that takes business logic and written text and turns it into behaviour. I mean, we already have testing frameworks that sort of do that (cucumber).
adaline_stracke - But that's okay. Because it takes someone who knows how to create the business logic. I can learn a new syntax fairly easily, but that's not what programming is.
Actually, isn't that what Ruby is? We write logic in Ruby and it compiles to C.
kallie - Yes higher level language abstraction is already semi-automated code. It all becomes machine code in the end. There's no reason to believe this trend won't continue.
skittles_mcbangbang - Non-programmers are also capable of logic, and one might argue have a better grasp of what the intended product is supposed to look like. And if you provide them a quick enough feedback loop between them writing an english sentence and seeing app behavior change...
kallie - Yeah that would arguably be better than your normal development pipeline. Hyper-rapid development and testing.
adaline_stracke - I guess what I'm trying to say is that the leap from writing code as we do now to full natural language processing and translation into behavior is very big.
kallie - It's big but I don't think it's as insurmountable as many people would assume it to be.
adaline_stracke - Agreed.
skittles_mcbangbang - Agreed. We might not need the natural language processing bit either. It wouldn't be hard to train people to write their english in a certain structure. it would just be part of the job training.
Feature: Some terse yet descriptive text of what is desired
  Textual description of the business value of this feature
  Business rules that govern the scope of the feature
  Any additional information that will make the feature easier to understand

  Scenario: Some determinable business situation
    Given some precondition
    And some other precondition
    When some action by the actor
    And some other action
    And yet another action
    Then some testable outcome is achieved
    And something else we can check happens too
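Commentary - The template pasted above is Cucumber's Gherkin format. The mechanism that makes it more than documentation is the step definition: each English sentence is matched against a registered pattern and runs real code, which is the "adapter between written text and behavior" idea in miniature. Here is a toy Ruby sketch of that matching machinery; the step sentences and the cart example are invented purely for illustration, not part of any real framework.

```ruby
# Toy step-definition registry, in the spirit of Cucumber:
# plain-English sentences are matched against regexes and
# dispatched to blocks of executable code.

STEPS = []

# Register a step: a pattern plus the code it triggers.
def step(pattern, &block)
  STEPS << [pattern, block]
end

# Find the first matching step for a sentence and run it,
# passing along any captured values.
def run(sentence)
  STEPS.each do |pattern, block|
    if (m = pattern.match(sentence))
      return block.call(*m.captures)
    end
  end
  raise "No step definition for: #{sentence}"
end

# Hypothetical step definitions for a shopping-cart feature.
step(/^Given a cart with (\d+) items$/) { |n| @cart = n.to_i }
step(/^When I add (\d+) more$/)         { |n| @cart += n.to_i }
step(/^Then the cart has (\d+) items$/) do |n|
  raise "expected #{n}, got #{@cart}" unless @cart == n.to_i
end

# English in, behavior out.
run "Given a cart with 2 items"
run "When I add 3 more"
run "Then the cart has 5 items"
puts "all steps passed"
```

Note that the English still has to follow a rigid structure (the regexes), which is exactly the point raised above: this replaces one formalization with another rather than eliminating formalization.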
adaline_stracke - Where I work, we tried doing something similar for testing, but it wasn't successful.
I guess I fail to see the difference between teaching someone to express logic in one structure (like a programming language) vs another (such as what you pasted there). Fundamentally, it comes down to specific training which is what was talked about above.
skittles_mcbangbang - It's extremely helpful in microservice envs, IME. It was an easy way of abstracting the several different calls by simply stating the prerequisites needed for the actual tests to run.
daphney.cronin - The difference is that another human (with no training of how to express logic in that formal fashion) can read the specs and understand exactly what is occurring.
adaline_stracke - @daphney.cronin, I get the idea of readability, but we were talking about the automation of WRITING code.
daphney.cronin - Yeah, this type of approach really wouldn't help in automating code (since you're replacing one type of formalization with another). A corporation that wants to automate code would be better off trying to improve code generators (like Rails) or CMSes (like RailsAdmin, ActiveAdmin, or Wordpress)...
skittles_mcbangbang - That doesn't remove the need for a dev though, since in using those frameworks, you're still creating and maintaining servers. If you can type gherkin into something like Lambda or OpenWhisk, then you start to get closer to replacing developers.
adaline_stracke - I think that's the real rub here. Until AI is as good at debugging and maintaining as humans are, there will still be needs for devs.
I think it's the same idea as self-replicating robots. Until robots are good enough to fix and rebuild themselves we need people to maintain them. Until software is good enough to fix and write itself, we will still need people to fix and write software.
skittles_mcbangbang - No doubt, but it's still possible to replace a LOT of devs with the scenario above. You still need people to build and maintain those types of infrastructures, but waay less.
Commentary - Computer scientists have conducted research into the automation of bug fixes -- the most promising program at the moment is GenProg, a program that uses genetic programming to mutate existing codebases to fix bugs. According to an interview with its developers, GenProg is very cheap ($8/hour) and speedy, but the computer-generated code is not as maintainable as human-generated code. GenProg is also heavily dependent on the specifications to determine whether it has fixed the bugs, so software engineers using GenProg would switch from writing code to writing tests. The "formalization" of business logic would still remain in human hands.
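To make the shape of that approach concrete, here is a toy Ruby sketch of test-driven repair: candidate "patches" for a buggy function are tried until one satisfies the test suite. Real GenProg mutates the program itself via genetic operators; the hand-written candidate lambdas below are stand-ins for that, purely for illustration. Notice that the tests are the only specification the repair loop sees, which is why the human's job shifts to writing them.

```ruby
# Toy test-driven repair loop, in the spirit of GenProg.

BUGGY = ->(a, b) { a - b }   # intended behavior was addition

# Candidate "mutations" of the buggy function. In GenProg these
# would be produced by genetic operators over the program's AST;
# here they are hand-written stand-ins.
CANDIDATES = [
  ->(a, b) { a * b },
  ->(a, b) { b - a },
  ->(a, b) { a + b },
]

# The test suite is the entire specification: [input_a, input_b, expected].
TESTS = [[1, 2, 3], [0, 5, 5], [4, 4, 8]]

def passes?(fn)
  TESTS.all? { |a, b, expected| fn.call(a, b) == expected }
end

# Keep the first variant that satisfies every test.
repaired = ([BUGGY] + CANDIDATES).find { |fn| passes?(fn) }
puts repaired.call(10, 20)   # the surviving variant behaves like addition
```

If the test suite is incomplete, a "repair" that merely games the tests can survive, which is exactly why the generated patches tend to be less maintainable than human ones.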
The discussion assumed that the need for software will stay constant. This may not be true -- instead, software is increasingly required to become more complex and to handle more functionality. This leads to the Complexity Paradox, based on Tog's Law of Commuting:
"The time of a commute is fixed. Only the distance is variable." Translation? People will strive to experience an equal or increasing level of complexity in their lives no matter what is done to reduce it. Make the roads faster, and people will invariably move further away.
... Given that people will continue to want the same level of complexity in their lives, given that we will continue to reduce the proportion of complexity of any given function that we expose to the user, we may expect that the difficulty and complexity of our own tasks, be they at the application or OS level, will only increase over time. That has certainly been the case so far--we've gone from simple memo writers and sketchpads to document processors and PhotoShop. And we may assume that's only the beginning.
So when we build higher-level languages, code generators, and CMSes, we merely encourage users of those tools to do more "work" with them. This increases the complexity of the resulting software, which in turn requires more maintenance. Software is constantly asked to change and upgrade, and the field must keep up with the demand.
This would seem to suggest that programmers have high job security (and Tog makes that very assertion at the end of his essay). However, the goal of all automation is cost savings and greater efficiency. There are cost savings involved when you fire programmers and replace them with cheaper "spec writers". And as software becomes more complex and convoluted, we will probably rely less on very-fallible humans and more on less-fallible machines.
Therefore, it is safe to say that many developers will remain afraid of automation and AI for quite some time, and the debate will rage on...