
Embracing Tech: Tools, Not Threats 🩸

Dominic Magnifico on October 18, 2023

✨Top 10 ChatGPT Prompts That Will Make You a 10x Developer ✨ The world is inundated with articles about AI, and regurgitated articles like this. T...
Martin Falk Johansson

Well, I certainly won't use them to flood dev.to with bad listicles or bad tutorials. I wish the people who post these obviously AI-written pieces would at least read them through and edit them so that they make sense.

It does feel like the web is so flooded with bad AI content right now that it may actually hinder the further development of LLMs, since they cannot usefully train on content they themselves have written. As a community, we're basically degrading LLMs by creating a circular feedback loop.

Chatbots do seem quite good at pulling information out of a collective consciousness; it is as if the entirety of the internet could provide the information you ask for in a neat format. But the underlying information is only as good, or as bad, as what one might find with a Google search, and is therefore just as full of bias and misinformation.

I suppose they are also quite useful for summarising information quickly. Whether we as humanity need a chatbot in every service and app remains questionable.

Dominic Magnifico

The risk of overuse is absolutely worth being aware of, and your point about the feedback loop is well taken. The decline of unique content will make it harder for LLMs to train and improve.

For the folks developing LLMs, finding and vetting content sources will surely become more important, potentially putting a higher value on that unique content.

The moral of this small post is essentially just to remind myself, and possibly others, that while real threats certainly exist, what matters is how we as developers use these tools to do our jobs better.