As a professional software developer, I’ve noticed it’s becoming increasingly difficult to avoid using AI. Copilot is now built into Microsoft products such as Word, Excel, and PowerPoint, to name a few. Most IDEs, like Visual Studio and Xcode, now have AI enabled by default to assist with coding.
There’s a difference between deterministic and non-deterministic AI. If you’ve ever googled anything or used spell check, you’ve already used a mix of deterministic (pre-programmed) and non-deterministic (probabilistic) AI. For the rest of this post, what I mean by “AI” is non-deterministic generative AI, using tools like Perplexity, ChatGPT, and more specifically Copilot.
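To make that distinction concrete, here’s a minimal sketch in Python contrasting the two styles using the spell-check analogy. The tiny dictionary and word choices are my own illustration, not anything from a real spell checker: the deterministic part is an exact lookup that always gives the same answer, while the probabilistic part ranks candidates by a similarity score, much like a generative model ranks likely outputs.

```python
import difflib

# Deterministic: a fixed dictionary lookup. Same input, same answer, every time.
DICTIONARY = {"receive", "believe", "separate", "definitely"}

def deterministic_check(word: str) -> bool:
    """Is the word spelled correctly? A pure yes/no rule."""
    return word.lower() in DICTIONARY

def probabilistic_suggest(word: str) -> list[str]:
    """Rank plausible corrections by similarity score, best guess first.

    This is the 'probabilistic' flavor: it returns likely candidates,
    not a single guaranteed-correct answer.
    """
    return difflib.get_close_matches(word.lower(), DICTIONARY, n=3, cutoff=0.8)

print(deterministic_check("recieve"))    # False: not in the dictionary
print(probabilistic_suggest("recieve"))  # ['receive']: the closest match
```

The lookup can only say “right” or “wrong”; the suggester makes a ranked guess, which is exactly the trade-off generative AI scales up.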
While I’m often amazed by how AI appears to read my mind when it suggests the next bit of code, it’s obvious that it also has some limitations. My experience using AI in the development process reminds me a lot of guiding a toddler through an obstacle course — I’m constantly adjusting and cajoling it to get the results I need. I often ignore its suggestions — and sometimes even defy them outright. Prompt engineering now seems as important an art to learn as any other aspect of development.
At its core, AI is just a parrot imitating the style and syntax of the code surrounding it. That could be a good thing or a bad thing. For example, imagine walking into the subway in New York City. There’s trash everywhere and 1 out of every 10 people who come through the system evade the fare by jumping the gates.
If you were AI, you might conclude that these details define correct behavior in the context of the subway because that’s what the majority do.
Just because something is common, popular, or even ubiquitous does NOT make it right!
This is what “vibe coding” with AI gives you.
The large language models that power AI are only predicting which word comes next. Sure, today’s more advanced models involve lots of training and self-analysis. But, at the end of the day, AI is merely imitating what everybody else has already done, or what it already sees in your codebase.
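A toy bigram model makes the point. This is a drastic simplification of how real LLMs work (they use neural networks, not frequency tables), and the training string below is my own contrived example, but the principle carries over: the model can only echo the patterns in whatever it was trained on.

```python
from collections import Counter, defaultdict

def train(corpus: str) -> dict:
    """Count, for each word, which words follow it in the training text."""
    followers = defaultdict(Counter)
    words = corpus.split()
    for current, nxt in zip(words, words[1:]):
        followers[current][nxt] += 1
    return followers

def predict_next(model: dict, word: str):
    """'Predict' the next word: just return the most frequent follower."""
    if word not in model:
        return None
    return model[word].most_common(1)[0][0]

# Garbage in, garbage out: train it on a sloppy habit (silently
# swallowing errors) and it will happily suggest more of the same.
model = train("catch err pass catch err pass catch err log")
print(predict_next(model, "err"))  # 'pass', because the majority pattern wins
```

Note that the model never asks whether `pass` is *good* practice; it only knows it’s the *common* one, which is exactly the subway-fare problem.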
Or to use an early computing term: GIGO (garbage in, garbage out)!
There’s a lot of brilliant software out there. There’s also a lot of trash. Even some of the training bootcamps I’ve seen don’t actually follow best practices and end up with some less-than-tidy code. Perhaps that’s because a training project or a portfolio capstone has a pretty high tolerance for entropy: who cares if the code is tidy or not? But a real, professional production product deserves some careful thought. I’ve always said that just because nobody except one or two other developers will ever see your code doesn’t mean the code can be sloppy! Production code especially should be as clean on the inside as it is intuitive on the outside.
I hope that someone figures out a way to cut through all the politics of curating the “best practices” or “correct information”, and to ignore all the “trash” from lazy (I mean brilliant) developers. At the end of the day, “vibe coding” with AI will only give you code that perhaps, possibly, maybe works??? (And yes, my AI companion yelled curses at me to “fix” that last sentence!)
Don’t get me wrong, I use AI all the time. But I believe AI works best as an analysis tool, the same way spell-check is a lifesaver for a document. It can do wonders for checking your work, debugging issues, and helping you simplify and optimize things. Relying on it to write ALL your code would be risky.
AI has come a long way in the past few years, but I’m not holding my breath for Mr. Data to show up in the office any time soon! Perhaps we have reached the point of having a real Eddie from HHGTTG, or maybe it’s more like Holly from Red Dwarf. As long as we avoid HAL 9000 and Skynet, we’re good!
What do you think? How do you use AI in your workflow?
Top comments (4)
This also applies to AI in general: Just because large corporations shove their plagiarism into every program they can, doesn't make it right. It's still plagiarism. They are still stealing by their own rules.
What I think is that AI is IP theft by the very same megacorporations that will gladly ruin someone's entire life for "stealing" one of their trademarks. Using it is immoral and furthers a techno-feudalist agenda that Frank Herbert warned us about decades ago:
So to answer the second question with another quote from that same book:
Yeah, I don't mind AI filling in that boilerplate code while I'm switching an old website over from using system alerts to a modern framework that I can style. Go ahead and rewrite all my error alerts for me and do it in my same syntax and technique! Cool. But you're right about the plagiarism; it seems like a double standard. Love the Dune quotes! Frank seems right on the money there!
I wonder how long it's going to take for people to realize that vibe coding isn't actually useful when it comes to solving a problem. Replit keeps pushing vibe coding as this revolutionary tool that allows people who have never been involved in tech to write "groundbreaking" apps, but they're really just simple web apps with a fancy UI.
Take this for example. It looks quite fancy at first, but after a few seconds, it's kinda obvious that it's just a copy-paste of pre-existing technologies, and even if it wasn't, this is really just a simple CRUD API. One other thing that is very off-putting to me is the complete lack of source code. I can't find any code for these applications anywhere, and I wonder if the code is so poorly written/insecure that Replit doesn't want anyone seeing it.
As for AI in my own workflow, I rarely use it. Sometimes I copy and paste a syntax error or something minor like that into ChatGPT, but that's usually just out of desperation, and it normally doesn't help at all.
Exactly! I can't imagine depending on production-ready apps when you have no idea how they were made or why they might not be working. I use ChatGPT mostly as a glorified spell-checker. I like Perplexity as a search engine; at least it cites its references so you can see where it got its information from.