Technology is fast-moving. It seems there's something new to try every day.
Trying something out yourself is one thing. Piloting technology at a company is another. And putting it into production is a big deal.
How do you know when a technology is ready for you, and worth your company's time and effort?
This general question was inspired by a more specific one, which started life as a Twitter poll.
Top comments (3)
I'm often reminded of Choose Boring Technology when I think about this.
It makes the point that this isn't just a technology question but an org question.
If you have a ton of people proficient in PHP and your legacy systems are PHP, it may cost more to rewrite than to make PHP work better (like Facebook creating HHVM).
If you have native Android and iOS teams of more than 5 people each and you apps have built up complexity over the years, but you are feeling the pain of duplicate efforts, incremental refactoring with Kotlin Multiplatform will be a better choice than choosing a solution outside the standard native mobile ecosystem (in terms of time, effort, reskilling, people leaving, hiring for the new technology).
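To make "incremental" concrete, here is a minimal sketch of the kind of sharing Kotlin Multiplatform enables: one piece of business logic moves into a shared module while the native UIs stay untouched. The file layout and names are hypothetical, illustration only.

```kotlin
// commonMain/kotlin/DiscountCalculator.kt
// Shared business logic, compiled for both Android (JVM) and iOS (Kotlin/Native).
class DiscountCalculator {
    fun discountedPrice(price: Double, percentOff: Int): Double {
        require(percentOff in 0..100) { "percentOff must be between 0 and 100" }
        return price * (100 - percentOff) / 100.0
    }
}

// commonMain/kotlin/Platform.kt
// expect/actual lets shared code call platform-specific APIs.
expect fun platformName(): String

// androidMain/kotlin/Platform.kt
actual fun platformName(): String = "Android ${android.os.Build.VERSION.SDK_INT}"

// iosMain/kotlin/Platform.kt
import platform.UIKit.UIDevice
actual fun platformName(): String =
    UIDevice.currentDevice.systemName + " " + UIDevice.currentDevice.systemVersion
```

The shared module builds to a regular Android library and an iOS framework, so the existing apps consume it like any other dependency. That is what makes the refactor incremental rather than a rewrite.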
Whether the tech itself is ready depends on your org's risk aversion, too. Is there a culture of early adoption, experimentation, moving fast and breaking things? That significantly increases the "ready" surface area. A more risk-averse culture will wait until the technology is "proven" by adoption at other orgs, or by the reassurance that paying a vendor will take care of the scaling, rollout, and customization issues that crop up.
Technology adoption should be less a matter of timing and more a matter of pragmatism.
The right time to start using a new technology is when you need it, and precisely when your old tools aren't cutting it anymore.
There are plenty of industries and workflows that demand very new tech and push it straight into deployment fresh out of the womb, so to speak, usually because the new thing was made for their specific needs. But these are usually obvious cases, and they're often fraught with problems over time. Hasty adoption creates weird stacks that don't modernize well.
In general I think devs and their employers are both overeager to adopt new technology, because doing so signals forward thinking and connection with the zeitgeist. These decisions are more mimetic than pragmatic, and stem from raucous conversation, hype, and controversy in tech-news echo chambers. Being able to show off the new tech is pleasing to investors and sexy at dinner parties, but decisions guided by social factors don't actually make the technology better. If timing is ever a factor in an adoption decision, it probably means you're being pushed by the zeitgeist and not being honest about your own needs.
Adopting young technology can put you out of touch with the standards and consensus that take longer to crystallize, and should be done cautiously. My experience has been that software selection is a constant exercise in bandwagon hopping, each bandwagon with its own problems and benefits. The right technology is neither the hot new one nor the old tested dog, but the one that balances its explicit strengths and weaknesses with your needs: maturation is rarely a factor.
We ran into this when working on j2objc a few years back. It was stable and used in production by Google Sheets, Gmail, and other key products, so it was mature and very low-risk for orgs to adopt. However, neither Java nor Objective-C was exciting, so we couldn't sell it.
Luckily, Kotlin Multiplatform seems to strike a better "balance of strengths and weaknesses of the technology and the needs of orgs" (as you put it).