DEV Community

charlie-morrison

I Tested 4 'Hot' Tech Skills for 6 Months Each — Only One Was Worth Learning in 2026

In early 2024 I made a decision: instead of chasing every emerging-tech opinion piece, I would actually pick four skills the internet was hyping and spend exactly six months on each. Real practice. Real projects. Real interviews where I claimed the skill. Then I would compare the resulting interview yield, project value, and time-to-first-job-impact against each other and against the original loud claims.

Two years later I have the data. One skill earned its hype. Three did not. This post is the comparison — and the lesson is not which skill, but what makes a skill worth learning in 2026.

The four skills

I picked them from the top of the "must-learn" lists I was reading at the start of 2024:

  1. Rust — system-language hype, "everyone is rewriting in Rust"
  2. Kubernetes — "every senior engineer needs to know K8s"
  3. AI/LLM application engineering — "wrapping LLMs is the new web framework"
  4. Data engineering / dbt + warehouse stack — "data is the new oil and someone has to plumb it"

Each got six months. The "study" was real: a project that did something useful, a public artifact (GitHub repo or blog post), and at least three interviews where I claimed the skill.

The one that worked: AI/LLM application engineering

This is the lede. The third skill on the list — building actual applications around LLMs — was the only one that paid back the time.

What made it different:

  • Interview demand was high. Of the four skills' interview-week numbers, the LLM-app skill had 4x the inbound recruiter messages of the next-best (Kubernetes). The skill is in active demand and the supply of people who can demonstrate it well is thin.
  • The project mattered. The portfolio piece I built (a small RAG pipeline with proper eval, monitoring, and a real benchmark) became the centerpiece of three subsequent interviews. It worked because it was end-to-end and operational, not just "I called an API."
  • The payoff window was short. Within month 4 I was getting interviews directly because of the skill. Within month 6 I had a competitive offer in part because of it. The other three skills had longer payoff windows.
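
For a sense of what "end-to-end and operational" meant in practice, here is a minimal sketch of the pipeline's shape. Everything in it is a placeholder: the bag-of-words retriever stands in for a real embedding model, `generate` stands in for the LLM call, and the eval cases are toy data. The point is the structure: retrieve, generate, and a built-in eval with a measurable metric.

```python
# Minimal sketch of an end-to-end RAG pipeline with a built-in eval step.
# The retriever and generator are toy stand-ins, not the real project code.
from collections import Counter
from math import sqrt

DOCS = [
    "Kubernetes schedules containers across a cluster of nodes.",
    "Rust guarantees memory safety without a garbage collector.",
    "dbt compiles SQL models into a dependency graph and runs them in order.",
]

def embed(text: str) -> Counter:
    """Toy 'embedding': lowercase bag-of-words counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 1) -> list[str]:
    """Rank documents by similarity to the query, return the top k."""
    q = embed(query)
    ranked = sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def generate(query: str, context: list[str]) -> str:
    """Stub for the LLM call: answers straight from the retrieved context."""
    return context[0]

def evaluate(cases: list[tuple[str, str]]) -> float:
    """Hit rate: fraction of queries whose answer contains the expected keyword."""
    hits = sum(1 for query, keyword in cases
               if keyword in generate(query, retrieve(query)).lower())
    return hits / len(cases)

cases = [
    ("how does kubernetes place workloads", "cluster"),
    ("what does dbt do with sql models", "dependency"),
]
print(f"hit rate: {evaluate(cases):.2f}")
```

The eval step is what made the artifact interview-ready: the demo ends in a number, not a vibe.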

The reason this skill earned its hype, when others didn't: it lives in the gap where engineering meets a new modality. There is no body of senior people who learned this in school. The supply curve has not caught up with demand.

The three that didn't

Rust. Six months of Rust got me technically competent. It did not get me interviews. The roles I would have wanted (Linux kernel work, embedded, low-level infrastructure) require either deep specialty experience already or a credential I do not have. Most "Rust" job postings I found were Rust in name only — service code that could have been any language. The hype was real on Twitter. The job market did not match.

Kubernetes. This one is messier. K8s is genuinely useful and I do not regret learning it. But the interview market for Kubernetes-only specialists has commoditized. The role you want — "platform engineer who happens to know K8s" — requires the underlying skill (Linux, networking, observability), and Kubernetes is a layer on top of that, not the differentiator. The six months I spent on K8s would have been better spent on the underlying systems skills, which are evergreen.

Data engineering / dbt. The smallest payoff of the four. The skill is real and the work is needed. But the path to a senior data engineering role from a backend background is longer than the literature suggests — it is not "you learn dbt and pivot." It is "you learn dbt, then prove you can own a warehouse for two years, then pivot." Six months of dbt got me proficient and unhired.

What the failures had in common

The three skills that didn't pay off shared three features:

  1. Saturated or commoditizing supply. Senior people in those skills already exist. Junior-to-mid practitioners are easy to find. The hype was reflecting demand, but the supply was already adjusting.
  2. The skill was a layer, not a frontier. Rust on top of "general systems engineering" — easier to hire someone who already has the systems skill. K8s on top of platform engineering. dbt on top of warehousing. The deeper skill is what mattered for hiring; the layer was how you used it.
  3. The portfolio artifact was hard to communicate. Six months of dbt produced a working warehouse, but explaining why it was hard, what was novel about it, and why it should impress an interviewer took 20 minutes of interview time, and most interviewers' patience didn't last.

What the winning skill had

LLM application engineering inverted all three:

  1. Supply was thin. The pool of people who can build, evaluate, deploy, and monitor an LLM application is small in 2026 because the discipline is two years old.
  2. The skill is a frontier, not a layer. There is no widely-known "underneath" skill that LLM app engineering is layered on top of. There are adjacent disciplines (ML engineering, prompt engineering) but the integration of all of them into a deployed product is its own thing.
  3. The portfolio artifact communicated quickly. "Here is a thing. It does this. It improved on the baseline by X. Here is the eval methodology." Five minutes of interview time. Result was visible.

The decision rule I use now

I no longer pick skills off "hot list" articles. I run three checks before committing six months:

  1. Is supply thin? Search LinkedIn for the skill as a senior-level title. If there are >5,000 results, supply is not thin.
  2. Is the skill a frontier or a layer? If the most common follow-up question to "I know this skill" is "what's underneath," it's a layer. Spend the six months on the underneath instead.
  3. Can a portfolio artifact communicate the skill in five minutes? If the answer requires exposition, the artifact is too complex to use in interviews.
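
The three checks are mechanical enough to encode. Here is a sketch; the `Skill` fields, thresholds, and example numbers are my own illustrations, not measured data.

```python
# Sketch of the three-check decision rule as a scoring function.
# Field names, thresholds, and the example inputs are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Skill:
    name: str
    senior_linkedin_results: int   # check 1: supply (LinkedIn senior-title search)
    is_frontier: bool              # check 2: frontier vs layer
    artifact_pitch_minutes: int    # check 3: how long the portfolio demo takes

def checks_passed(s: Skill) -> int:
    """Count how many of the three checks the skill passes."""
    passed = 0
    if s.senior_linkedin_results <= 5_000:  # supply is thin
        passed += 1
    if s.is_frontier:                       # frontier, not a layer
        passed += 1
    if s.artifact_pitch_minutes <= 5:       # artifact communicates quickly
        passed += 1
    return passed

llm_apps = Skill("LLM app engineering", 2_000, True, 5)
k8s = Skill("Kubernetes", 40_000, False, 5)
print(checks_passed(llm_apps), checks_passed(k8s))  # → 3 1
```

A skill that passes all three is worth six months; anything at one or zero means the time belongs on the skill underneath it.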

LLM application engineering passed all three. K8s passed one. Rust passed half of one. dbt passed none.

What this means for 2026 picks

The skills I now think pass the three checks for 2026 are:

  • LLM evaluation and monitoring (the post-build half of LLM apps). The build half has gotten crowded; the deploy/eval/monitor half hasn't.
  • Production-grade vector search and retrieval engineering. Supply is still thin; demand is real.
  • AI agent orchestration at scale. The simple agent loop is everywhere; the production-grade agent infrastructure isn't.
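
To make the eval-and-monitor half concrete, here is a minimal sketch of post-deploy monitoring: score each response with a cheap heuristic check, keep a rolling pass rate, and flag a regression when it dips. The heuristics, window size, and threshold are all illustrative assumptions; in practice the check would be a real eval suite.

```python
# Minimal sketch of post-deploy LLM monitoring with a rolling pass rate.
# The heuristic checks, window, and alert threshold are placeholder assumptions.
from collections import deque

def passes_checks(response: str) -> bool:
    """Toy check: non-empty, under a length cap, not an obvious refusal."""
    refusals = ("i cannot", "as an ai")
    return (bool(response)
            and len(response) < 2000
            and not any(p in response.lower() for p in refusals))

class RollingMonitor:
    def __init__(self, window: int = 100, alert_below: float = 0.9):
        self.results = deque(maxlen=window)  # keeps only the last `window` results
        self.alert_below = alert_below

    def record(self, response: str) -> bool:
        """Record one response; return True if the pass rate has regressed."""
        self.results.append(passes_checks(response))
        rate = sum(self.results) / len(self.results)
        # Only alert once the window is full, to avoid noisy early readings.
        return (len(self.results) == self.results.maxlen
                and rate < self.alert_below)
```

The design choice that matters is the rolling window: a single bad response is noise, but a pass rate that stays below threshold across a full window is a regression worth paging on.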

I am not certain about any of those. I will probably be partially wrong. But the three checks are doing the work of filtering hype from real, and they apply to whatever the next list of "must-learn" skills says.


Free tools I built for the job search: resume-checker, job-keywords, resume-bullets. All free, all in the browser, no signup.
