Ethical and compliance issues aside, the biggest problem I see with AI-generated content or code is waste. Once again we use a new tool to generate more things more quickly, rather than to generate fewer, better things. Even more interesting is that we could use this as an opportunity to analyse our ways and recognise cruft. In other words: if things are so formulaic that a machine can generate them, do we really need them?
A common joke is that people use ChatGPT to turn three bullet points into a well-formed email or cover letter, and the recipient uses it to turn that email back into three bullet points. It's funny because it's true. When I applied for jobs, I took the job descriptions and my CV data and asked ChatGPT to generate well-worded pieces for me. Quite a few companies I applied at use AI to screen incoming emails. I am also sure recruiters use AI to distill CVs and cover letters, given the number of emails they have to deal with each day.
Application processes, CVs and cover letters have become formulaic to a degree where there are services to write your CV for you. It feels like hiring a lawyer or tax advisor, as the language necessary to get where you want to go is so far removed from day-to-day communication that it needs an expert. Do people read all that well-worded information though? I doubt it. Personally, I scan more than I read.
AI recommending content to us doesn't mean we have to take it all
Almost every application of AI sold as a way to make us more efficient means creating a lot of content automatically. I started typing this in Visual Studio Code with GitHub Copilot enabled. Copilot tries to be helpful by autocompleting my sentences and offering new paragraphs that complete the thought. Or so it thinks. What it did was annoy me with lots of unnecessary repetition of points made in the first paragraph. So I switched to another editor, Hemingway, which keeps your writing terse and to the point.
I could have let Copilot go nuts and kept all its suggestions. It is tempting, as it feels like you are creating a lot and becoming a more efficient writer. Plenty of people do exactly that, and the amount of generated content is overwhelming the current web. As The Verge put it, AI is killing the old web, and the new web struggles to be born. People generate a lot of articles, and moderators can't keep up, so they also use AI to automatically detect AI-generated content. It is the search engine optimisation arms race all over again, but this time it is automated and it costs a lot of energy. Both human energy and electricity are being wasted.
This is not only detrimental to the quality of the web as we're drowning in mediocre, traffic-optimised content. It is also bad for the planet. AI functionality doesn't come cheap. It is expensive in computation and means a lot of traffic going back and forth.
Trying to be green in an avalanche of generated content
We are currently looking at how software can be greener and use fewer resources, and groups like the Green Software Foundation do some amazing work in spreading awareness. And yet the cost of AI consumption is not often questioned, as it is the cool thing of the moment.
Sure, with text this isn't that much of an issue. But generated images, videos and upscaled low-quality media mean a lot of computation power and energy used for, well, what exactly? To prove that we can generate an image from a prompt saying "a monkey wearing a watermelon as a hat in the style of Matisse"? Our few seconds of fame as a funny creator on social media without having to put any craft into it?
It's pretty likely that this is another fad that will go away in the long run, much like we stopped doing Simpsons avatars or Elfing ourselves. Younger audiences also consider GIFs "cringe", which makes me happy, as those were traffic and distractions nobody needed.
If AI generates code it can also optimise it
It is interesting, though, that the CEO of GitHub announced that Copilot and others will soon generate 80% of the code out there. The optimiser in me immediately saw this as an opportunity to trim the fat from our code bases. If 80% is generated boilerplate code, why should it always be created from scratch instead of re-used? One thing that has annoyed me throughout my career is that developers have no patience with platforms. Instead of taking part in open source and standardisation to make the platform do what we need, people kept writing their own solutions. Solutions often touted as "stop-gaps" that, in the end, never go away. JavaScript libraries that made cross-browser development easier are now dependencies that can't be removed any more. Even worse, they often aren't even maintained, meaning they not only become unnecessary traffic and code bloat, but also a security and performance issue.
I have to admit that over the last few years I have lost some interest in writing code. It's not that there are no cool challenges or excellent new platform features. It is the way we approach development these days that bores me. We don't solve problems and we don't look for native solutions. Instead we include packages and components without even knowing how they work or what they do. We build products from building blocks that other people wrote. It's not "let's start a project by looking at the problem to solve". It's "run this install to get the boilerplate code you might need". When we release these products, we find out they don't perform. So we hire a performance expert to analyse our products, and they find unused code and cruft. We then write more code to remove this unused code and to create optimised code bundles for different use cases.
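To make that last step concrete, here is a minimal sketch of the kind of extra code that optimisation pass usually adds, assuming a bundler that supports dynamic imports; the module and element names are made up for illustration:

```javascript
// Hypothetical example: instead of shipping a heavy charting library in the
// initial bundle, we add loader code so it is only fetched when needed.
const showChartButton = document.querySelector('#show-chart');

showChartButton.addEventListener('click', async () => {
  // Dynamic import: a bundler splits './charts.js' into its own file,
  // which the browser only downloads on this first click.
  const { renderChart } = await import('./charts.js');
  renderChart(document.querySelector('#chart-container'));
});
```

More code, written only to avoid loading code that perhaps should never have been bundled in the first place.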
AI content generation feels the same. We generate a ton of content that isn't ours, that may be of bad quality or a terrible idea, and throw it out there. And then we use AI to cut it down to something that is understandable again. Seems wasteful, doesn't it?
If 80% of the code of the future will be generated, this is a great opportunity to optimise that code. We won't have to argue with engineers who want code to be written in a certain way. We can let machines generate code for machines. And then we have to ask ourselves why this code even exists. AI is great at detecting patterns. So if hundreds of developers keep using the same code to solve an issue, couldn't that code become part of the platform?
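To illustrate what I mean, here is a rough sketch comparing the kind of helper that developers copied into countless projects with the platform feature that eventually replaced it; the helper and the URL below are made-up but typical examples:

```javascript
// The kind of helper that used to be pasted into almost every project:
function getJSON(url, callback) {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', url);
  xhr.onload = () => callback(JSON.parse(xhr.responseText));
  xhr.send();
}

// The pattern was so common that it became part of the platform as fetch():
fetch('/api/items')
  .then((response) => response.json())
  .then((items) => console.log(items));
```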
Let's stop littering the web - it's already full of rubbish
The web is in danger of being flooded with generated content that nobody needs. Our codebases are likely to get bigger because we get offered lots of code by hitting the tab key instead of writing it by hand. Now is the time to be aware of this. Sure, convenience is a lovely thing, but we also need to think about the cost of what we create. For the peace of mind of other people, so we don't overload them with content they will never read. And also for what it means to the planet and our electricity consumption. AI is here to stay, and it can be used to optimise our workflows and our software products. But it can also help us litter the web even more than we already have.
Photo by Darwin Vegher
Top comments (6)
Hi there! Interesting article. I see it's a bit of a personal rant (nothing wrong with that). I understand the feeling that there is a lot of rubbish on the internet. There is!
I've seen a few different perspectives on modern AI: some people worship AI, others reject it, some use it, others abuse it. How people use AI really depends on their objectives. It's certainly not your average Joe that creates online rubbish. One example might be the so-called "clout hoarders": people who don't engage for the sake of engagement, but simply to collect likes/upvotes/points. It just happens that posting a lot enables that, and AI in turn helps create a lot of content. The quality of this content doesn't matter to these people. However, they are only a loud minority, and most people are actually better than that.
What I want to say with this is: don't let these one or two bad apples spoil the bunch for you.
I'm a little confused about your views on packages / platforms / code-reuse. On the one hand you say: "why are we constantly rewriting the same code?" But on the other hand you say: "people shouldn't consume packages, but rather extend 'the platform'". I certainly do not identify with this type that "mindlessly consumes packages without knowing what they do or how they work". I personally install packages because they do specifically what I need them to do so I don't have to do it myself. And to be fair: I don't think I need to understand how the package works before I use it, as long as it does what I need. I can't be bothered to learn how OAuth works or what the byte structure of a PDF file is. That's a reason to use packages: outsourcing expertise. These things are not "the problem" to me and as such I don't need to solve them. I can focus on the actual problem: my feature.
I also sense a mild distaste for frameworks. I feel like frameworks, especially JavaScript frameworks for browser apps, are extremely useful. Why should I be bothered with browser compatibility and DOM manipulation if a framework does all of this for me and probably does it more efficiently than I ever could? Once again: these things are not "the problem" to me, but my feature is. We choose frameworks that are widely supported by a community, so lack of maintenance is simply not a thing. Home-brew frameworks are way more problematic because there is no community to support you if something doesn't work and you rely on yourself to maintain them. Nobody to ensure your framework is secure, nobody to ensure it is efficient, and it is likely badly documented. It's a liability.
Going back to AI: yes, AI code suggestions can be bad, but I don't know anybody who accepts all suggestions "because it feels good to write many lines of code quickly". I can only imagine people do this who are not passionate about their work and/or who are incentivised by lines-of-code (which is a terrible metric to measure work anyway). The people I work with are passionate about their work and critically review suggestions to see if they are accurate and do exactly what they need.
So much for my counter-rant, have a nice day!
Hey there, thanks for this.
I surely hope so, but on the other hand, I am also worried about every platform offering "AI templates" for any communication. Microsoft is certainly adding a "Copilot" to every part of Office right now.
Thanks for your explanation. You are right, these can be confusing. One of them was about saying autocompletion is OK, and the other one was about people using packages willy-nilly. Your usage examples of packages that do one thing you can't do yourself make sense, but I've seen far too many people go with a boilerplate install and get everything in case they need it later. And that clogs up the web. I worked on the browser developer tools of Firefox and Microsoft Edge, and in both places we did audits of hundreds of products. At times we found a 10MB bundle in a final product, as nobody ever questioned presets.
As someone who worked on browsers, I can tell you this is a huge assumption. There is nothing efficient about circumventing the DOM or CSS engine and creating your own, other than the developer convenience of not having to understand either. I wish frameworks would work on optimising the outcome, but in most cases they create huge dependencies and rendering issues. I am also super worried about people not being bothered with browser compatibility. This is what gave us Flash back then. The fact is that if you work on the web, you need to understand that your user is in control. Many frameworks promise that they will solve that issue for you. They do, but at a massive cost, and the framework becomes a liability in the long run. Even last year I still had to work with a lot of customers who couldn't remove jQuery from their codebase, as they had no idea what it did or how to replace it.
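To make that concrete, here is a minimal sketch of common jQuery calls next to their native equivalents; the class names are made up for illustration:

```javascript
// jQuery: hide every element with the class "promo" and flag the body
$('.promo').hide();
$('body').addClass('promos-hidden');

// The native equivalent, supported in every current browser:
document.querySelectorAll('.promo').forEach((element) => {
  element.style.display = 'none';
});
document.body.classList.add('promos-hidden');
```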
I understand this product focus, but it also worries me. The idea that being widely supported by a community automatically means a framework is maintained and secure worries me. Far too many frameworks fell out of fashion and lost a lot of users. Most frameworks are owned by one company, and whilst they allow for outside contributions on paper, hardly anything comes in. Maintaining a framework is to a large extent community work, and this isn't billable. So many companies are happy to release a framework, bang the drum and then drop it when costs need to be cut. I've seen this happen so many times.
For sure. My worry now is what happens when paid-for bootcamps teach beginners to just use AI and take on the results. You know, focus on delivering the feature rather than learning to code.
Ugh, 100%. I used LinkedIn's AI "draft an intro" message tool and I don't know what it had learned from, but its suggestion was awful. I then overwrote the message with what I wanted to say instead.
On the plus side, I won't bother generating text with that tool again!
All these modern AI tools are definitely enabling undesired behaviour. I like your perspective, and your passion for a better web is inspiring. What does the better web look like, in your opinion? Can we still benefit from each other's work without relying on frameworks? Or do you think there is a better way to involve beginners so that they can deliver higher quality earlier in their careers? Can we attack this 'mindlessness' that you speak of in some way?
I feel like 'work culture' and maybe passion play a big role in this. I do hear a lot that people go into web development "because it makes a more than decent amount of money easily". If a bootcamp teaches beginners to just use AI and take on the results, that sounds to me like a severe lack of passion for the topic and too much focus on becoming an emotionless moneymaker. Maybe the world should invest more in passion somehow?
Such a bootcamp wouldn't have a great reputation; who would want to hire devs from them once word got out?
But on the other hand, I don't think you need, or should look for, passion in work. To me, passion suggests suffering; maybe curiosity or a work ethic is enough.
Makes sense. I don't know if 'passion' was the right word to use here, but my point is that people should strive to find joy and satisfaction in the trade itself, rather than the money it makes them, and these bootcamps should encourage this.