I see more and more job offers around DevOps, I see a lot of infrastructure providers, the staple cloud providers AWS, Azure, Google Cloud, even Di...
I don't think Linux is going anywhere anytime soon.
The internet runs on top of Linux, and all those managed services offered by AWS (ECS Fargate, Lambda, Elastic Beanstalk, etc.) run on Linux servers.
They are just making it easier for developers to quickly release their ideas by creating abstraction layers.
Amazon Linux already exists, and no, AWS isn't killing Linux. ECS and EC2, for example, run container and virtual machine environments respectively, both running Linux in the vast majority of cases.
I'll be honest, this article is really poorly researched.
The cloud is built on Linux - so the answer is no, AWS is not killing Linux. But it's killing Linux admin jobs. If you absolutely don't want to work on anything but Linux you might find a good job at any cloud provider.
You are right. Mariano Rentería is perhaps saying that Linux administrator work will be less required due to AWS.
I think this is just a fear of Mariano's. For example, on AWS, to perform a disaster recovery you need to know how GNU/Linux works, at least a little.
Most of these sweet services cloud providers are offering are lock-in features: there is no standard, no interoperability, and no way to reproduce these environments locally for development or testing.
And you need to learn Azure, and AWS, and Digital Ocean, and all of them offer the same stuff just differently... quite annoying.
It's a fragmentation issue, coupled with the fact that they are trying to lock developers into their own cloud-based ecosystems.
I'm in the middle of trying to transition from 10 years of Systems Engineering + 5 years of BI Dev w/semi-Ops work into...whatever my next phase is...and as someone who loves working on computers in general, the cloud shift still feels disconcerting. Even with VM-level abstraction, we were still managing our systems and making sure things were running well and were tuned correctly to their workloads. But I noticed specifically when "don't treat your servers as pets, treat them as cattle" started to become accepted as the norm that more and more of the jobs out there aren't interested in fine-tuning, aren't interested in careful engineering. There's so much focus on scalability (often for systems and apps that don't really need to scale) and on designing for micro-services when so many organizations don't need the extra complexity of a micro-service architecture, that it's feeling like a few steps backwards to me.
I'm interested in doing more cloud work obviously because there's no doubt that many companies aren't going to go back to having to maintain their own infrastructure at a physical level, but there's still a gap somewhere that feels like something was lost, and maybe as time moves on the "sysadmin" will go back to treating their resource pools and pods as their new "pets" and we can stop pretending that careful management isn't still necessary. Along those lines, I don't think Linux is going anywhere for a while because we still need people who know how to take care of these systems (Windows too, really!)
Also, the making of "apps" themselves has become so wrong that it's true that most apps don't really need to scale, but also the majority of the others need to scale because the app is badly written and engineered.
There is a lot of hate against CS degrees lately, but not everyone realizes that skipping university doesn't mean you get to skip knowing most of the things a good university would teach you.
Much appreciated comment & emotion!!
While I agree that infrastructure is better off being treated as cattle than pets, it still needs to be kept equally clean, well maintained, and sustainable. (There's no point for a small org to run its apps as multiple Kubernetes pods with GB-sized containers, in the name of autoscaling, for a problem that can be solved with a few VMs.)
The example of Elastic given by the author sounds incorrect, as many of their good features are only available in their enterprise versions, and it costs Amazon a whole lot of effort/time even to keep OpenDistro close to the upstream.
Coming to the point of Linux being obsolete: I believe it's based on strong foundations and has thus thrived even amidst fierce competition from Windows/macOS for servers. This'd be more apparent as more people realise the power of configurable community systems available for free, versus paying a bomb for proprietary tech.
Great comment @jefmes !!
The problem with cloud is that customers are surrendering all their data (aka business) to providers. Ransomware might be considered child's play compared to cloud business, in a few years. :-p
Seriously: the open source movement was about freeing your systems. With the cloud, you don't even have systems. How dumb is this?
I understand the need for more flexible systems, though.
The open source movement will always live on. But I don't get why cloud services are being so heavily pushed.
"cloud services are being so heavily pushed"
recurring payments are better than one time payments :)
More problems to solve... there will be work to do.
The article is misleading: it talks about ditching the OS level, but AWS doesn't care about the OS. They only care about the hypervisor used to run these OSes, and they want to remain open to hosting a large (and growing) number of OSes.
Linux alone is still not a hypervisor. Windows now has its own hypervisor, Hyper-V (still optional, but increasingly required if you intend to use some of its features; Microsoft also partnered with Amazon, Google, and a few others so that hypervisors other than Hyper-V can be used as alternatives).
On Linux, there is no hypervisor at the kernel level (remember that Linux fundamentally is just the kernel, not its many environments and distributions). But Linux offers some services in the kernel (most of them inside optional drivers) to allow running several hypervisors with good performance (for now it supports Xen, VirtualBox, and Hyper-V well, with some limitations on the Windows guest; a few others will come). And I'm quite sure that Amazon won't kill this Linux kernel capability; it may just add a few other drivers for improved performance, management, or security.
Then AWS can choose whatever they want for their very large AWS hypervisor. It will still support various VMs running Linux, Windows, or Android, or specific VMs tuned for containers, or for lightweight "functions" and "no-code"/"low-code" solutions (most of them not running VMs or multiple processes or threads, but very thin "fibers" that you can install and instantiate in a few microseconds, that can stay "dormant" for extended periods of time while costing almost nothing, and that can be hosted anywhere dynamically: a perfect choice for Amazon in AWS, Microsoft in Azure, or Google as well, as this will also be very cost-effective and attractive to many more customers than they currently have).
So what is the problem? Linux support for hypervisors is still not integrated enough, and there's a lack of a common base that can be used with all hypervisors for all types of deployments (soft or hard, at the VM level, or containers and threads like Kubernetes, or fibers for functions and the many event-driven web apps and REST API services).
Microsoft is working on this, and so is Amazon; nothing bad there. But the Linux project still lags a bit behind and still focuses on a standalone full VM with its kernel, which is not easily extensible for modern architectures and deployment needs: full VMs are not the best option for everyone; they're fine for small servers managed and owned by a single person or a small team.
Linux is definitely not dead; it effectively scales across many more different architectures (as long as the universal worldwide cloud is not involved). We still lack a web-centered OS: Linux/Windows/macOS/iOS/Android are still host-centered, just like most databases and storage. OSes must rethink what an "app" is. Is it still necessarily a "process"? Should web services replace apps and, instead of managing resources per host, manage user environments from any access point, using arbitrary computing resources deployed on demand and running nowhere in particular, only inside an environment centered on individual identities owned by users, hosted anywhere and accessible from anywhere?
But for now we still need an OS on at least one device; we still live in a binary world of the client/server split. We need more tiers, and a refocus on users (and on the capability offered to users to create as many identities as they want and isolate them when interacting with other "identities" on the web; this would be great for users' privacy). But for that we still lack a real "network OS", where everything is virtualized and the only "host" is the Internet as a whole.
I'd like to comment on some of questions you pose, and your points.
"Will AWS build its own OS and ditch Linux?" -- let me answer a question with another question - do you see new kernels built each day? Linux had about 30M LOC at the beginning of 2020, with about 4k contributors each year. It's being developed since early 1990s. Do you think Amazon, or anybody else, are gonna be able to pull that off, and build a proprietary kernel (which still has to adhere to POSIX standards), in say less than 10-20 years? The main question here is - why they would even do that? What's wrong with Linux from their perspective?
"Nearly all the cloud providers have an academia and offer certification so that you can demonstrate that you can use their services." -- well, supply meets demand. In my experience, these certificates only bring money to the issuers and don't reflect real knowledge in a slightest bit.
Also, what is a Linux architect? What is a Cloud architect? Linux and Cloud are like apples and oranges, and I cannot even remotely imagine what a Linux architect is supposed to do. Most probably she doesn't architect Linux itself; probably it's some business application architecture. So why call her a Linux architect? How do fundamental architecture problems tie to the OS in a generic sense? Same with Cloud architect: sure, using AWS or GCP or whatever has its quirks. But if one can only do something in the cloud, I doubt one understands how the cloud works.
"I see less useful to know Linux in a cloud first era" -- it's just, the complexity is being buried under abstractions. The fact that abstraction hides implementation details from you doesn't mean that the implementation details are not gonna bite you. I saw far too many people that, when asked "How would you plan capacity for your service?" in a system design interview, would say "I'd check this tickbox in AWS. It would do everything automatically for me". And that works... except when it doesn't, and then you're left completely clueless without understanding the underlying abstractions.
I'm really frustrated by the fact that people see no benefits in knowing how the Big Magic Black Box works anymore. That just means you can't operate that Black Box, and eventually the knowledge would fade away. And this is, at least to me, a bad thing.
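To make the capacity-planning point concrete, here is a toy back-of-envelope estimate of the kind the "autoscaling tickbox" hides. Everything here is hypothetical (the function name, the traffic numbers, the 70% headroom rule); it only illustrates the arithmetic, not any real service.

```python
# Toy back-of-envelope capacity estimate. All numbers and the
# function itself are hypothetical, illustrating the arithmetic
# an autoscaling checkbox does for you behind the scenes.
import math

def servers_needed(requests_per_day: int,
                   peak_factor: float,
                   req_per_server_per_sec: float,
                   headroom: float = 0.7) -> int:
    """Rough server count for peak load, leaving 30% headroom."""
    avg_rps = requests_per_day / 86_400          # seconds in a day
    peak_rps = avg_rps * peak_factor             # traffic is bursty
    usable = req_per_server_per_sec * headroom   # don't run servers hot
    return max(1, math.ceil(peak_rps / usable))

# e.g. 50M requests/day, 3x peak bursts, 500 req/s per server
print(servers_needed(50_000_000, 3.0, 500))  # -> 5
```

Whether you then provision those instances by hand or let an autoscaler do it, being able to sanity-check the number is exactly the "underlying abstraction" knowledge the comment is talking about.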
Thanks for this long, thoughtful comment. This blog post is here to talk about it, and think about it... I want Linux to win... I don't want us to rely too much on the abstractions of vendors...
Unfortunately, one way or another, the moment your product starts to rely on a certain cloud provider, you're locked-in. If you're using multiple clouds, well, you're locked in multiple vendors. The only way out of it is to use self-built systems on self-owned hardware in a self-built datacenter. Starting from there, everything depends on the money you're able to spend on h/w and engineers, and thus on your scale. Then, take one step backward at a time until you reach the point when you can afford it. Rent a datacenter, rent hardware, rent EC2 instances/generic compute, and if you can't afford that - rent managed services.
AWS is not killing Linux; if anything, it's solidifying its market share with services like ECS. If you mean managing mutable VMs, then yes, that is going away with or without AWS.
The truth of the matter is that not all programming is web programming.
Linux is an essential part of many industries - for instance embedded software, where it is a lot easier to use Linux than write your own OS.
That bit about using Linux rather than writing your own OS also works for the cloud -- the companies with the resources you mentioned are more likely to invest in custom kernels, keeping everything else they might need, than to build a whole new environment, which would be a waste of time and money because Linux already has decades of functional code behind it.
Finally, even if we stick to more traditional web and network uses, many companies will prefer to use their own servers or emulate the cloud due to IP, privacy, or plain convenience -- imagine trying to edit terabytes of video material off of the cloud at a movie studio, for instance. I've also read developer accounts of companies trying serverless and then going back because the flow just didn't suit their pipeline.
All in all, the new serverless approach has its uses and is undoubtedly gaining more and more traction; however, this does not mean Linux will die off anytime soon.
Linux usage in servers may vary over time according to needs, but will likely not disappear, because it is versatile and mutable -> thank you open source and forking!
Linux usage in personal computers will continue to grow as new features, better GUIs, and more tools make distros more accessible to the average user.
Also, it's Elastic Search, not Elasting Search. AWS stole the IP of Elastic Search, and have the legal budget to get away with IP theft. Microsoft and Oracle are in the cloud fight too.
Why do so many tech writers call Python (released 1991) a "new language" when comparing it to Java, which came out in 1996?
I've recently been in the job market as a 5+ year Linux Admin/Sys Eng. There are some niche jobs I've interviewed for, at unique startups or small orgs, where you can still manage physical infrastructure.
The reality is any good high paying job today is going to be DevOps and it's going to be highly automated using python or go + a cavalcade of buzzwordy tools. It's really not great for those of us who love Linux and want to use it to the fullest.
The cloud is really the top choice for most businesses, who would rather have it standardized, automated by a 3rd party, and easy to use...
The times they are a changin'
Linux will not be ditched by AWS or any other cloud provider for sure, BUT the jobs related to the system have already lost the battle to Cloud things.
In general, there are more positions for Cloud than for systems. In some businesses or sectors, it's really rare to find a company that has recently deployed/designed infrastructure on-prem or in a private cloud.
Pretty sure Canonical's Ubuntu has a lot of ties with Amazon, given the placement of the Amazon link by default on the GNOME 3 desktop environment.
But Linux is not going anywhere. It's a core dependency for a lot of web services throughout the world.
After reading the title I was ready to say "no" straight away, but after reading the post itself, I'm not that sure anymore. Very good points actually!
Thank you @bobbyiliev, I was not trying to say it's dead or something, just that it seems to be less relevant than it used to be...
I'd bet Docker killed more Linux admin jobs than AWS. I've been using CI/CD automation to build Docker images for half a decade, and I can't imagine going back to a world before ECS/Kubernetes. In my situation the sysadmin role is non-existent; system health is controlled by container management and is in the domain of DevOps.
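For context, the kind of CI-built image this workflow revolves around is usually defined in just a few lines. A minimal sketch (the base image, port, and entrypoint are illustrative assumptions, not any real project):

```dockerfile
# Hypothetical minimal image a CI pipeline might build and push.
FROM python:3.12-slim
WORKDIR /app
# Install dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8080
CMD ["python", "app.py"]
```

A CI job then runs something like `docker build -t <registry>/<app>:<git-sha> .` and pushes the result, which is why no per-host sysadmin work is left: the "server" is rebuilt from scratch on every change.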
In 5-10 yrs your robot assistant will write better code than you ever could.
Everything in IT can and will be automated away. It's already happening.
Not quite. In 5-10 years a junior level developer will be more powerful than any other time in history, and experienced devs might use powerful code completion tools as if they were syntax linters (they will just tell you that there is a more efficient option). Nothing will change for the masters, if you're willing to learn vim for fun, you probably won't be affected by the robot invasion.
Maybe, but I think you are underestimating the speed of innovation and the amount of greed out in the world.
Meh, I think most people are overestimating what machine learning can achieve. Computers are great at logic, but possess zero creativity. They can beat you at chess because there are highly defined rules, but in the real world there are too many variables to "just try all of the options and see what works". The difference between a developer and a great developer is creativity, and knowing how to turn complex things into simple things so they can be reasoned about.
For most of the DevOpses -- maybe.
For advanced / good DevOpses? Of course NO
good DevOpses will have to keep maintaining, and hopefully bring some light to this...