The Reality of Today's Job Market
I have gone through a lot over the last month. I am currently job hunting, and honestly, it has never felt harder. When applying, you notice that many postings barely care about a specific stack anymore, and sometimes you end up talking to non-technical recruiters who say things like "Objective Oriented Programming" or who do not seem to know the difference between Java and JavaScript. The bare minimum has shifted, and what I am seeing is that to stay relevant in the AI era, I need to fill the gaps in my knowledge and play the game as a generalist rather than focusing on a single stack.
Pivoting in Full Stack Web Development
A lot of the full stack web development jobs I see nowadays are centered around agentic workflows. AI has created roles that did not exist before, but the underlying reality is the same: if you do not know REST APIs, web scraping, how to work with dependencies, or the language itself, how are you supposed to "vibe" your way through? I have a way of framing this: I think of AI as gambling, and the best gamblers in this game are experienced developers who have seen plenty of bugs, dealt with production issues, read the books, and understand system architecture. As a reference, there is a video worth watching on how AI is creating new jobs in full stack development.
Calling Model APIs Does Not Automatically Make You a Developer
Do not get me wrong, I use these tools myself and I am actively getting familiar with new concepts. While learning AI programming, I have come to realize that much of what I apply is the same set of engineering best practices I already know. Of course, AI programming goes much deeper once you get into the algorithms, neural networks, and the research side that people spend PhDs on. That is not what I am talking about; I am talking about the higher-level abstractions that show up in job postings. I recently read Thoughts on slowing the fuck down, written by the creator of the PI coding agent. For now, PI is my favorite coding agent because of its simplicity and extensibility. On GitHub right now, almost any project with "AI" attached to it gets hyped, and I keep noticing the same recurring issues: rendering bugs, poor error handling, and security vulnerabilities. That tells me a lot of fragile software is being shipped on a regular basis.
There Are Vibe-Safe Sections and Vibe-Unsafe Sections
I use AI regularly, but when it comes to the core business logic or the architectural decisions that form the backbone of my software, I slow down. I do research, I write code the old-fashioned way, and I think carefully about API design and data pipelines. I have never believed that shipping large amounts of code quickly is the same thing as being productive. I try to stay skeptical and think about the edge cases that could break things. Even when AI produces code, I review it carefully and do not celebrate until I actually understand what was written. For students currently in college who are watching all this unfold, I highly recommend this article, written by a teacher; its core idea is why it still matters to write code yourself, even with AI around. Without developing a sense for code smells and understanding the flow of a program, we end up guessing instead of knowing, and that can lead to bad outcomes down the line.
Critical Software Requires Critical Thinking
When I build projects that I think other people might actually use, I try to be brutally honest with myself and only ship code I genuinely understand in terms of security, efficiency, and user experience. We all see people shipping enormous amounts of code, but only a small portion of them actually produce niche products that people rely on daily. The ones who do tend to have real production experience and a deep awareness of where software could break. They also review code quickly because they have built that instinct over years. If you watch this video with NeetCode and the creator of opencode, you can see how they think through code. I used to tell myself that AI-generated code I did not understand was just technical debt. Honestly, I do not think that framing is fair anymore. When you cannot explain the code you shipped, that is a skill issue, not technical debt, and it is better to name it directly so it can be addressed. "Eiffel was not built in a day."
Easier Code Generation, Less Sense of Responsibility
I have recently reviewed a number of PRs generated with AI, and I have seen people commit .env files without realizing what they are or why that is a problem. I feel genuinely concerned for them, because if the pattern continues, either they will get compromised or their users will. Some non-technical folks have also started vibe coding and reaching out on LinkedIn, offering "marketing projects" that they claim can improve my own work. When I open these repositories on GitHub, I often find the same issues: security vulnerabilities, exposed API keys, and so on. That is genuinely worrying. I would rather skip the marketing help than expose my project to that kind of risk.
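To make the .env point concrete, here is a minimal sketch of the kind of sanity check that would catch this before a push. This is a hypothetical helper, not a real git feature: the SENSITIVE set and function names are invented for illustration, and it only reads plain filename entries from .gitignore, not glob patterns. In practice, the fix is simply adding .env to .gitignore and untracking it with git rm --cached .env.

```python
# Hypothetical sketch: warn when secret files exist but are not git-ignored.
import tempfile
from pathlib import Path

SENSITIVE = {".env", ".env.local", "credentials.json"}

def ignored_names(repo: Path) -> set[str]:
    """Read plain filename entries from .gitignore (glob patterns omitted)."""
    gitignore = repo / ".gitignore"
    if not gitignore.exists():
        return set()
    return {line.strip() for line in gitignore.read_text().splitlines()
            if line.strip() and not line.startswith("#")}

def exposed_secrets(repo: Path) -> list[str]:
    """Sensitive files present on disk but missing from .gitignore."""
    ignored = ignored_names(repo)
    return sorted(n for n in SENSITIVE
                  if (repo / n).exists() and n not in ignored)

if __name__ == "__main__":
    repo = Path(tempfile.mkdtemp())
    (repo / ".env").write_text("API_KEY=example-not-a-real-key\n")
    print(exposed_secrets(repo))   # ['.env']  -> exposed, not ignored yet
    (repo / ".gitignore").write_text(".env\n")
    print(exposed_secrets(repo))   # []        -> now ignored
```

Note that ignoring the file only prevents future commits; if a secret was already committed, the key must be rotated and the history scrubbed.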
Lastly
Today I just wanted to share what I have been seeing in the job market and in AI programming. I build projects with vibe coding too, but my personal projects follow standards I set for myself. For example, I have a blog post analyzer CLI tool that I use to rank blog posts based on my own opinionated criteria. That is not something many people would share, and that is fine. I am using it because it makes me productive, not because it needs to be universal. AI is genuinely exciting, but it works best when paired with responsibility and good judgment.
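The actual analyzer is not shown here, but the idea of ranking posts by opinionated criteria can be sketched in a few lines of Python. Everything in this snippet is invented for illustration: the Post fields, the ~1200-word sweet spot, and the weight on code examples are stand-ins, not the author's real criteria.

```python
# Toy sketch (not the author's tool): rank blog posts by opinionated criteria.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    words: int        # word count of the post
    code_blocks: int  # number of code examples

def score(post: Post) -> float:
    # Opinionated: prefer posts near ~1200 words, reward code examples.
    length_score = max(1.0 - abs(post.words - 1200) / 1200, 0.0)
    return length_score + 0.5 * post.code_blocks

def rank(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    posts = [
        Post("Intro to REST APIs", 1100, 3),
        Post("Hot take, no code", 300, 0),
    ]
    for p in rank(posts):
        print(f"{score(p):.2f}  {p.title}")
    # 2.42  Intro to REST APIs
    # 0.25  Hot take, no code
```

The point is less the scoring formula than the habit: encode your own judgment into a tool you actually use, even if nobody else would share the criteria.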
Top comments (24)
the job market right now is a total grind — vibe coding feels like the only way to stay afloat without burning out. i'm leaning into cursor for my own builds because grinding out syntax is the last thing i want to do after a long day. fundamentals still keep the ship upright but speed is the only thing that gets you noticed. austin taught me: just start the thing.
Yes, how much you ship has become more important than ever. We gotta keep going and learning quickly.
“Vibe coding” works until it hits real constraints—security, edge cases, and maintainability. The market is clearly rewarding generalists who can leverage AI, but fundamentals still decide whether the output is usable or fragile.
Treating AI as a force multiplier rather than a replacement feels like the only sustainable approach right now.
Between the .env files and Objective oriented... it's amazing how low the perceived barrier to entry is to be "a coder" nowadays. The pendulum has to swing back, at least a little, after folks realize the negative impact of these current prolific decisions.
bar is at all time low
this is gang
🤣🤣
You mean gyan 😁
no
Good points ... let me just say, "rumors of the demise of the Developer are greatly exaggerated" :-)
Nice
That is not how real engineering works.
That is how LinkedIn cosplay engineering works.
As a builder and a founder:
If the architecture is unsafe, the product is unsafe.
If the product is unsafe, the vendor is unsafe.
If the vendor is unsafe, the relationship is over.
This hit. Especially the “vibe-safe vs vibe-unsafe” distinction — that’s exactly the line most people blur right now.
For me, I don’t really treat AI as something that writes code for me — it’s more like a system I interrogate while I’m building. Every time it outputs something, I start breaking it down:
I basically turn the code into a set of questions first. Then I take those questions and run them back as a separate prompt — almost like creating an external reviewer that’s not attached to the original output.
And the key part: I don’t regenerate the same thing. I generate something adjacent. Closest possible structure, different implementation. That’s where the real signal shows up — patterns, inconsistencies, hidden shortcuts.
That’s how I learn while building instead of just shipping.
Because yeah… calling an API and wiring things together isn’t the same as understanding a system. If you don’t review the assumptions, constraints, and failure paths, you’re basically just prompt-engineering your way into blind spots.
AI is powerful, no doubt. But if you’re not slowing down at the core logic and architecture layer, you’re not speeding up — you’re just deferring problems to production 😅
Your market read holds up. The twist HR postings don't see yet: they lag 6-12 months behind what's actually shipping, so half the "fundamentals" they still fear (leaked .env files, for example) are handled autonomously now by tools like Claude Code without the dev even noticing. Dev competence stays an asset, but standing on it alone is a losing bet as business standards shift fast. The real play is to master, not just use, the tools reshaping the work like Claude Code or Cursor. Mastery compounds, familiarity doesn't.
That’s the harsh reality. Businesses won’t care; they focus more on speed. But I think developers who truly understand what’s happening and are actively adopting these tools will come out on top.
Businesses won’t care, until they get bitten in the *ss by production bugs, vulnerabilities and unmaintainable software - I think companies will find out in the end, and will understand that sometimes going slower means going faster in the long run ...
We don’t need to wait long; it is going to happen in the near future.
Frankly speaking, the emergence of AI has made the work of developers less important, which is a fact we must accept.
I'd partially agree: boilerplate code is gone, and simple static sites are trivial to generate with AI. But maintaining complex software is still in high demand; the shift is that we need to adopt new tools and exercise our own judgment.
Is that so? And why 'must' we accept that, or more to the point, what exactly do we need to accept?
I'll readily admit that simple projects (basic websites etc) may not even need a developer anymore - end users/business people will be able (or are already able) to "vibe code" those - I agree that THAT is a reality that we'll need to accept ...
For anything more complex, I'd argue that devs are still needed, if only because someone has to guide the tools (features, specs, architecture), and has to check the output generated by AI - correctness, performance, security ...
Saying that the work of developers is less important is akin to saying that the work of a carpenter is less important because he's using an electric saw instead of a hand saw - at least, that's how I see AI - as a tool which still needs a knowledgeable person to use it effectively ...
I totally agree. It is cool to use AI as a tool if you know what you’re doing. When the codebase gets larger and a developer doesn’t know how the system works, the slop keeps piling up. Therefore, slowing down makes sense.