
Discussion on: Going Against Conventional Wisdom: What's Your Unpopular Tech Opinion?

Jannis • Edited

I am pretty much against ChatGPT and AI for programming. Not because I dislike the tech behind it, which is very interesting, but because of how people use it nowadays.

When people learn to program, they need to pick up all sorts of skills: syntax, logic, security, workflows and much more. If they use ChatGPT, they only get what they actually search for (if they even know what to search for!) and don't learn as much as they would have on the way to working out the solution themselves.

Not for advertising purposes, but I wrote a little blog post about it and why I think it's bad for programmers, or for learning anything in general: aquahub.studio/blog/why-chatgpt-an...

Roberto de León

I mostly use it to search for very specific errors provided by whatever resource I’m using.

Guilherme Thomas

Symptoms of people concerned with what they will deliver at the end of the day, and not with understanding how they arrived at the result.

Hiro

It depends on how you see the problem, I think. It has both pros and cons. With ChatGPT we can speed up analysis and get to the main business problem sooner.
You know, it is true that AI has boosted and enhanced our world.

Jannis

Yes, I'm not saying it's completely bad. If it only saves you time, then I think it's worth it.

But if it disrupts your learning progress by spoon-feeding you, then I think it's bad.

Red Ochsenbein (he/him) • Edited

Actually, I'm still waiting for the moment ChatGPT, Copilot or the like actually saves me time.

Josh Hadik

Gotta disagree here, I find that I learn more. Sometimes I ask GPT to generate an implementation one way, and it suggests a new pattern or library I had never heard of that I can add to the toolbox. Of course, this only works if you also prioritize learning from what GPT generates; if you just want to copy-paste it, then yeah, you're not gonna learn anything.

umeshgihub2020

Same here

Jannis

At some point you can really use it to gain additional knowledge (even if it's not as much as you would have learned the traditional way), but only if you have the basic knowledge first.

Josh Hadik • Edited

I don't know, maybe you're right, but I still feel like I learn more than with the traditional way (assuming the 'traditional' way is docs and tutorials).

Like the other day I was messing around with a Stimulus gem in Rails, and I was able to pop into the source and add some print statements to figure stuff out, but whenever something didn't make sense at a glance I could just drop it into ChatGPT and have it explained to me, and we're talking private methods and functions that there aren't any docs for. Now I feel like I have a way deeper understanding of how the gem works: not just the interface you'd read about in the docs, but the implementation too.

You can even ask questions like "why do you think they chose this pattern?" and GPT will spit out the pros and cons of the pattern and some alternatives. It's just a way more natural way of learning IMO (though of course you need to remember LLMs aren't always 100% accurate, but the same applies to the people making the tutorials).
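
That print-and-ask workflow isn't Rails-specific, by the way. Here's a rough sketch of the same idea in Python (just as an illustration; the requests library and its Session.request method are stand-ins for whatever package you happen to be digging into):

```python
import inspect

import requests  # stand-in for any installed third-party library

# Find where the installed library lives, so you can open the file and poke
# around (or temporarily add print statements while experimenting).
print(inspect.getsourcefile(requests.Session.request))

# Dump the implementation of a method you want to understand; anything that
# still looks opaque can be pasted into ChatGPT and asked about directly.
print(inspect.getsource(requests.Session.request))
```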

Edit: after reading more of your comments, I agree with you completely. The problem isn't the tech but how it's used, and beginners generally don't even know enough to ask the right questions, so it's definitely not a substitute for learning the hard way when you're starting out.

Imam Ali Mustofa • Edited

They're just playing in their own house and their imagination, not in the real world!

Jannis

I wouldn't completely agree with that. It's just that the GPT models are trained on data from 2021, so they don't play in their own house but in the past.

And today's world is changing faster than ever, so that's a huge disadvantage.

Imam Ali Mustofa

Yes... what I mean by their house is their comfort zone, without knowing the real world.

Jannis

Oh my bad. Yes I agree.

Imam Ali Mustofa

People are born mature nowadays, and grow up like babies.

Christian Prado Ciokler

I think it gives some people the false notion that they can become developers or experts in any field the easy way, and that is a lie. I am not against AI tools; some people may use them in the wrong way, but in the hands of an expert they can be a 10x tool, and that is precisely the point. Being able to ship software faster and better will never be a bad thing. We just need to learn and evolve with it. This tech is here to stay, and denying it is a mistake.

Jean-Luc KABORE-TURQUIN

Hi Christian,

Thank you for sharing your thoughts on AI tools and their impact on the development field. I understand the concerns you raised about some people having false notions that they can become developers or experts in any field easily through the use of AI tools.

While I agree with your points, I would like to add that having false notions doesn't necessarily make someone a liar. Truth can be relative to an individual's perspective and experiences, and it's important to approach discussions with empathy and an open mind.

As moderators of this forum, we strive to create a welcoming and inclusive space where all members can express their opinions and engage in constructive discussions. We appreciate your contribution to this conversation, and we encourage everyone to share their thoughts and ideas respectfully.

Thank you for being a part of this community.

Christian Prado Ciokler

Great comment, and I apologize for the confusion. I believe you may have misunderstood my original statement. I was pointing out that people might mistakenly believe that by using AI tools that provide them with all the answers, they become experts in a specific field. I am suggesting that this belief is misleading because true expertise requires not only knowing the answers but also having the experience and knowledge to ask the right questions. While AI can provide answers, it is individuals with expertise and understanding in a particular area who can make informed decisions and judgments. Encouraging people to rely solely on AI to become experts can mislead them and perpetuate misconceptions.

Jesse Phillips

People have been saying this for years. Compilers have been ruining the young generation's ability to learn to program. Rather than putting time into learning what a register is or how CLD will save your life, people just describe what they want and the compiler outputs the program for them. Many of these youngsters don't even look at the generated code.

Thomas Werner

I was also skeptical at first, but I gave it an honest and serious try about a month ago, and now there's no turning back for me anymore.

ChatGPT has absolutely replaced Google for me. I haven't used Google in a month, and I haven't used Stack Overflow to find solutions to programming-related problems either.

I get answers to programming problems instantly, straightforward answers tailored to my exact question that could take me hours of research the traditional way. And ChatGPT goes way beyond just letting the AI write code for you.

Yes, ChatGPT can often give misleading, even incorrect answers, especially when it generates code. And it will probably confuse the heck out of a generation of junior developers. But for me as a senior dev, it's an amazing tool. It's often able to correct itself when you point out the issues. And even when the code is incorrect, it often contains pointers I would have never found that easily.

I find ChatGPT is at its best and most accurate when I use it on a higher level. It's a great tool for evaluating solution strategies: you can have brainstorming sessions with it and discuss possible solution options that touch on architectural or design questions, for example. It's like "rubber-ducking" a problem, except you get smart answers and it's not just in your head.

Jannis

Agreed, like I said: if it only saves you time, I think it's a great tool.

You said it will correct itself, and yes, I know, but as a beginner you won't notice it made a mistake because you haven't learned all the logic stuff yet. ;)

leob

Amazing post, especially the point about rubber ducking was an eye opener!