Unless you've been living under a rock for the past couple of weeks, you've heard of GitHub Copilot, GitHub's AI code suggester powered by OpenAI. Even without being officially released, Copilot has sparked intense debate, concern, and excitement from all corners of the dev world.
It's worth noting, however, that this isn't even the first AI code suggester (see Tabnine or Kite), but it is probably the most high-profile one. Nevertheless, many younger developers have expressed anxiety over whether their jobs are at risk of being automated away by Copilot and whether the type of coding they're practicing will soon be obsolete.
So should you be concerned? Short answer: no. Long answer: still no, but be a little cautious about how you use Copilot.
The most important aspect to realize about Copilot is that it's called Copilot, not Autopilot. On the Copilot homepage, GitHub admits that "the code it suggests may not always work, or even make sense".
Copilot is a developer tool, not a developer. Just as using ReactJS is useless if you don't know HTML and CSS, using Copilot is useless if you don't actually know how to code. You're going to need to review and verify every line that Copilot spits out regardless, so the most it can do is save you time, not replace you.
More importantly, the mark of a good developer isn't the ability to write a Python function or a CSS declaration from memory. Let's be honest here: no matter how long you've been programming, you're probably still checking Stack Overflow to remind yourself how to do these kinds of things. The only way Copilot is going to change your job, therefore, is by speeding up the process of writing these repetitive functions.
The true mark of a good developer is connecting pieces to build coherent software. As long as you know how to build, your job isn't going anywhere. The only thing that Copilot really makes obsolete is a problem set for a Computer Science course.
So our jobs are safe… why is there more to this article? Well, as I said earlier, there are some things about Copilot that we should be a bit worried about.
Copilot is trained on all public GitHub repositories. While this is a great source of data in terms of its size, there are also tremendous opportunities for this data to be flawed. When it comes to machine learning, flawed data can produce incredibly flawed results (see Microsoft's Tay for a reminder).
A significant number of public repositories inevitably include code that is inefficient, vulnerable, or built on bad practices. If you're going to use Copilot, you need to be conscious of this. You should hold any code generated by AI assistants under a close microscope to make sure that you're not weakening the software that you build.
Another concern that many have expressed about Copilot is that it is training off of public repositories regardless of the license of said repositories.
"GitHub Support just straight up confirmed in an email that yes, they used all public GitHub code, for Codex/Copilot regardless of license. pic.twitter.com/pFTqbvnTEK"

"oh my gods. they literally have no shame about this." — ✨ Nora Tindall, automated relay 🪐 (@NoraDotCodes), July 7, 2021
While GitHub says "We found that about 0.1% of the time, the suggestion may contain some snippets that are verbatim from the training set", this doesn't necessarily protect you from plagiarizing code.
Even if GitHub is providing a snippet based on a number of data sets, that snippet could still be derivative of a codebase that is copyrighted, opening you up to liability.
So moral of the story: Unless GitHub starts checking the licenses of its training data, be cognizant of the code you take from it.
There's a tendency for many parts of the developer community to be critical of new tools and frameworks that make our jobs easier. Whether they're arguing that C is too abstracted from Assembly or that you should only be using Vanilla JS, there is a common sentiment to be wary of crutches.
While you certainly shouldn't avoid everything that makes development easier, there is a level of merit to this concern. As stated earlier, using Copilot without a strong knowledge of programming can be incredibly dangerous. You need to be able to understand, verify, and optimize anything that Copilot gives you, or else AI code suggestions are going to do you more harm than good.
In addition, if you're an aspiring developer, the rise of AI coding assistants does not mean that you can skip out on your algorithms class. Even if you're not going to be writing the code yourself, you need to understand how that code fits into the larger software that you're building.
Regardless of whether GitHub's Copilot lives up to the hype, there is no question that the way we code is evolving. While previous innovations in the development industry have focused more on languages and frameworks, we're entering an age where our very work processes are changing.
Your code editor is no longer just some program to edit the contents of a file. It's a tool to remove the mechanical and repetitive parts of software development.
When dev tools like Copilot and Codesphere succeed at accomplishing this, development can become a more creative endeavor. For Copilot, that's done by saving you the time of writing cookie-cutter functions. For Codesphere, that's done by saving you the time of managing and configuring your infrastructure.
So is your job as a developer safe? Yes. But your job as a developer isn't going to stay what it is today. Development is becoming less like working on an assembly line and more like working in a music studio every day. And I, for one, am excited about that.
Disagree with our take? Comment down below what you think about Copilot!
Happy Coding from your good friends at Codesphere, the next-generation cloud provider.