DEV Community

dev.to staff for The DEV Team

Posted on

Tech Taboo: What's A Tech Trend You Just Can't Get Behind?

Most of us love to stay on top of the latest tech trends, but not every new development gets us excited. VR? AR in everyday life? Cryptocurrency? Self driving cars? 🚗🤖 What's the one tech trend you just can't get behind?

Follow the DEV Team for more discussions and online camaraderie!

Image by pch.vector on Freepik

Top comments (10)

Bernd Wechner • Edited

Ha ha, I chuckled at this. Perhaps "Most of us love to stay on top of the latest tech trends", but I don't. I'm distinctly a mid-to-late adopter, interested in mature tech and/or tech I assess as likely to survive ... and not disappear (as many things do ...). I love learning, but I also love a return on that investment: long-term utility.

I should add though, that early adopters are my best friends. Without them no great technology survives to maturity for a mid to late term adopter. I am humbly aware that I stand on the shoulders of giants (cherished innovators and early adopters 😉).

cubiclesocial

I'm much the same way. Every single time I've early adopted, I've been bitten for having done so. I have this seemingly rare skill where I manage to find all of the obscure, random bugs/issues that no one else runs/ran into. The more obscure it is, the more likely I'll run into it. If I early adopt, I not only run into those bugs, but I also manage to find the hundreds of other bugs that everyone else runs/ran into as well. It's never a pleasant experience because I'm the type of person to file detailed bug reports, track down the cause to the line of code, and suggest a couple of fixes, so I usually just wait a while to avoid that pain and suffering.

On the upside, being a later adopter is also more budget friendly. What cost $250 at launch is just $50 or less two years later and has fewer bugs/issues too thanks to applied bug fixes. So you get to save thousands of dollars that you can put to use on a greater quantity of higher quality items (hardware, software, goods and services) when later adopting.

Just as an example of a weird, obscure bug: I recently ran into a bizarre issue with Windows 10 where, after about 90 days of system uptime, the icons in the Task Manager Details tab suddenly go "wonky." The OS starts displaying random icons from the icon cache for each running process, or even no icon at all. I can't find anyone else who experiences the problem because, for most people who run Windows, Windows automatically updates long before 90 days is up. I'm at 142 days of uptime right now. Would rebooting "fix" the problem? Sure. For about 90 days.

The real concern is that this bug might be exploitable. How does a corrupted OS application icon cache happen in the first place? If it's due to a buffer overflow somewhere, then some code could trigger privilege escalation and might even be remotely exploitable (e.g. via favicons in the web browser). See what I mean though? This is the kind of bug that only I seem to run into. The sort of stuff that no one else has paid much attention to, even if they have run into it themselves and didn't think twice about it.

Kostas Kalafatis

Mindless tech-trend followers. It's not a single current trend per se, but a mindset I've seen developing over the years. We've had the clean code cults, then the whole microservices cargo-culting frenzy, and now people are asking whether microservices are dead and monoliths should make a comeback. It's like we're stuck in a cycle of blind obedience to whatever some self-proclaimed tech guru endorses. I still have conversations where people say, "The unit tests should be all the documentation of the code".

Seriously, do people even bother to do their freaking research anymore? It seems like critical thinking has become an endangered species. Instead of diving into the nitty-gritty and figuring out what works best for their projects, they mindlessly latch onto the latest trend just because some influencer says so.

Rense Bakker

I think I get what you mean, but imho we shouldn't dismiss trends too easily either. There are some merits to clean coding principles, for example. It's when people get really religious about something that it starts to derail completely. That happened with microservices for sure. Everything had to be a microservice at some point, which was just ridiculous. So yes, critical thinking for sure, but also keep an open mind toward new trends. You don't want to become religious about dismissing everything just because it's trendy!

Kostas Kalafatis

I agree: dismissing everything out of hand is just as bad as blindly accepting the latest trends. At the end of the day we are toolsmiths. We create tools to solve problems. The problem begins when we lose sight of the problem and get enamored with a specific tool. After all, if your only tool is a hammer, all of your problems will look like nails. :)

Ingo Steinke, web developer • Edited

Well said, but I'd like to add a third one: a mixed bag of excitement, skepticism, and boredom. I've seen so many trends come and go. VR seemed to be all the hype in the 1990s, when Jaron Lanier was on magazine covers wearing an immersive headset and data gloves. The metaverse, or "the 3D internet" back then, was another 1990s trend with VRML-based worlds, succeeded by "Second Life", where most people replicated the boredom of their first lives. Artificial Intelligence (AI) was another nineties fad: everyone (well, not really everyone) got alarmed, upset, or positively excited when the IBM supercomputer Deep Blue beat the human world chess champion Garry Kasparov. What we didn't have back then: cryptocurrencies, NFTs, and Web3, as we were still excited about Web 1.0, the World Wide Web (www), one of the few things that didn't disappoint at all. But what happened to Usenet? I'd still prefer that over today's social media, although I also miss MySpace somehow.

I wonder whether someone of a different age feels the same about their exciting decade, like "we had it all back in the 1960s: spacecraft, air taxis, robots, so boring", and whether the same discussions will repeat again and again in the upcoming decades.

Corentin Bettiol • Edited

Missing MySpace? SpaceHey is a clone of MySpace created by a young German developer. You can customize your profile using CSS, and there's a ton of features on the website :P

Mike Willbanks

tl;dr: the latest tech trends are worth watching, but investing in them early often won't yield a return; waiting for maturity and/or the next generation is usually the better decision.

I like to skip a generation of tech. I keep my eye on it, and that has served me well over the course of my career. When we talk about programming languages and frameworks, the trends come and go. There are times when a newer trend makes sense in that it solves a specific problem for you. Remember when Scala was all the rage? Now it's rarely discussed; instead you hear about Rust. Then there was Ruby [on Rails], where the industry shifted toward Rails, but now the vast majority of that work has moved over to JavaScript/TypeScript on Node.js.

When we're talking about programming practices, infrastructure, etc., there are similar trends: monolith vs. microservices. Both have their advantages and disadvantages, but more often than not the monolith has won out unless a specific use case makes microservices worthwhile.

I liken this to the evolution of how server computing has changed over the years: bare metal -> bean counters -> kvm -> docker / kubernetes -> serverless (faas). Often I've skipped a generation. That's not to say the intermediary generation wasn't useful, but the next generation is often built upon the shoulders of the generation prior, and skipping a generation affords you the ability to focus more in-depth and buck the trend rather than cycling back and forth.

Many times we need to do more research from a fundamentals point of view rather than deeply investing ourselves in the ever-changing landscape of languages, frameworks, tools, infrastructure, etc. If we focus on what our solution needs to deliver and look for solutions to specific problems, we will naturally gravitate to where our needs are best served. Sometimes that is the latest and greatest, but often it's not necessary to adopt the latest trends.

Now, that's not to completely toss out new trends: they should be followed as the industry shifts, along with the pool of job candidates. If you want to be able to hire, you also need to be aware that certain trends are going to be easier to hire for. If you're a super-early adopter of new programming languages and/or frameworks, your hiring pool is only going to shrink; likewise, if you never adopt new tech and are still programming in Fortran, you're not going to find many qualified candidates available for the position.

Luis Zárate

[Translated from Spanish] May JavaScript frameworks for the front end keep appearing, and may they finally do away with the unnecessary complexity that front-end development has been dragging along.

Kaamkiya

JavaScript & its frameworks