Author: Vishal Ghariwala, SUSE (Platinum sponsor, KCD Chennai 2022)
The proliferation of distributed and data-intensive workloads is spurring rapid growth in edge computing. According to Gartner, around 10% of enterprise-generated data is created and processed outside a traditional centralized data center or cloud, and this figure is expected to reach 75% by 2025. The adoption of intelligent edge applications is already transforming a range of industries through autonomous vehicles, industrial robotics, and industrial IoT devices, with the number of global IoT connections set to reach 83 billion by 2024.
In SUSE’s recent global survey of more than 800 IT leaders, 93% said they are excited about or interested in the possibilities that edge computing presents, and 79% agreed that COVID-19 has accelerated the move to edge computing. In our report, Why Today’s IT Leaders are Choosing Open, these IT leaders told us the most compelling benefits of edge computing are:
- reliability (45%)
- enhanced security and privacy (43%)
- enhancing the entire IT ecosystem (41%)
- fostering the use of innovative new IT services (40%)
- real-time insights (39%)
As edge computing becomes more widespread, IT leaders will need to ask several questions in order to reap its benefits:
- Where should edge solutions replace or augment public clouds that are too far away to deliver the required latency?
- How can I make cloud-native development and deployment consistent across traditional and edge environments?
- Is my infrastructure platform future-ready for the rapid growth of both edge computing and hybrid and multi-cloud use cases?
Success criteria for edge-related innovations
Innovation in edge computing is driving digital transformation across industries with autonomous vehicles and equipment, industrial robotics, and IoT devices, each connecting operational functions and outputs with organizational management technologies to create an intelligent, interconnected network. Three key factors underpin the success of many modern edge-related innovations.
The first factor is defining how the edge infrastructure will intersect with existing on-premises and cloud infrastructure. This largely depends on the use case and can be segmented into three logical tiers: the near edge, the far edge, and the tiny edge. The near edge sits closest to centralized services such as university compute facilities. The far edge is furthest from the data center and close to the edge device. The tiny edge is the edge device itself, including sensors and actuators. As an example, the rapid rise in remote working and remote learning over the past year has clearly shown the importance of good internet latency and bandwidth, especially for those living in rural areas. In such a scenario, service providers can deploy near-edge infrastructure as close as possible to their customers.
The second factor is having a consistent way to develop, deploy, and maintain your cloud-native applications regardless of where they live, be it in the data center, in the cloud, or at the edge. You should also be able to perform routine maintenance functions such as patches, updates, configuration changes, and rollbacks seamlessly. Kubernetes is a central technology here, as it abstracts the underlying heterogeneous infrastructure and provides common APIs for managing software across these environments.
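As a minimal sketch of what that common API looks like in practice, the Go snippet below uses the standard client-go library to create a small nginx Deployment. The kubeconfig path, namespace, and resource names are illustrative; the point is that the same call works whether the target cluster runs in a data center, a public cloud, or an edge site.

```go
package main

import (
	"context"
	"fmt"

	appsv1 "k8s.io/api/apps/v1"
	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Load cluster credentials from a kubeconfig file. Only this file changes
	// when targeting a data center, cloud, or edge cluster; the API calls
	// below stay the same. The path is a placeholder.
	config, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}

	replicas := int32(2)
	deployment := &appsv1.Deployment{
		ObjectMeta: metav1.ObjectMeta{Name: "edge-web"},
		Spec: appsv1.DeploymentSpec{
			Replicas: &replicas,
			Selector: &metav1.LabelSelector{
				MatchLabels: map[string]string{"app": "edge-web"},
			},
			Template: corev1.PodTemplateSpec{
				ObjectMeta: metav1.ObjectMeta{
					Labels: map[string]string{"app": "edge-web"},
				},
				Spec: corev1.PodSpec{
					Containers: []corev1.Container{{
						Name:  "web",
						Image: "nginx:1.21",
						Ports: []corev1.ContainerPort{{ContainerPort: 80}},
					}},
				},
			},
		},
	}

	// Create the Deployment through the same AppsV1 API that every
	// conformant Kubernetes cluster exposes.
	result, err := clientset.AppsV1().Deployments("default").
		Create(context.TODO(), deployment, metav1.CreateOptions{})
	if err != nil {
		panic(err)
	}
	fmt.Printf("created deployment %q\n", result.GetName())
}
```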
The final factor is building an infrastructure that can support a diverse set of edge use cases in addition to existing hybrid and multi-cloud applications. Compute resources at the edge are generally scarce and not always connected to the Internet, so we need lightweight cloud-native technology stacks that fit resource-constrained environments and can operate in remote locations.
SUSE edge solutions
SUSE provides Kubernetes-ready, open source edge solutions for full-lifecycle edge infrastructure management. These include:
- A secure and lightweight Linux operating system
- A secure and lightweight edge-ready Kubernetes distribution
- A distributed, software-defined storage platform for Kubernetes that can run anywhere
- GitOps tooling for continuous delivery of containerized applications at the edge
With SUSE edge solutions, customers can orchestrate containerized workloads at the edge. The solutions employ a unified management layer that gives technology leaders visibility into where and how their containerized applications are running across traditional, cloud, and edge infrastructures. Finally, SUSE edge solutions are architected with lightweight technologies fit for resource-constrained and remote environments.
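The post does not name specific components, but as one hedged illustration of the GitOps piece, the Go sketch below registers a Git repository with Fleet (the open source GitOps project used in SUSE Rancher) by creating its GitRepo custom resource through the Kubernetes dynamic client. The repository URL, paths, namespace, and cluster labels are placeholder assumptions; Fleet then keeps matching edge clusters in sync with whatever manifests live at those paths.

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/apis/meta/v1/unstructured"
	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/client-go/dynamic"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Connect to the management cluster where Fleet runs. The kubeconfig
	// path is a placeholder.
	config, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		panic(err)
	}
	dyn, err := dynamic.NewForConfig(config)
	if err != nil {
		panic(err)
	}

	// A GitRepo resource tells Fleet which repository and paths to watch and
	// which downstream (edge) clusters to deploy them to. All values below
	// are hypothetical and for illustration only.
	gitRepo := &unstructured.Unstructured{
		Object: map[string]interface{}{
			"apiVersion": "fleet.cattle.io/v1alpha1",
			"kind":       "GitRepo",
			"metadata": map[string]interface{}{
				"name":      "edge-apps",
				"namespace": "fleet-default",
			},
			"spec": map[string]interface{}{
				"repo":   "https://github.com/example/edge-apps", // hypothetical repo
				"branch": "main",
				"paths":  []interface{}{"manifests/"},
				"targets": []interface{}{
					map[string]interface{}{
						"clusterSelector": map[string]interface{}{
							"matchLabels": map[string]interface{}{"env": "edge"},
						},
					},
				},
			},
		},
	}

	// Create the custom resource; from here on, Fleet reconciles the edge
	// clusters against the repository contents.
	gvr := schema.GroupVersionResource{
		Group:    "fleet.cattle.io",
		Version:  "v1alpha1",
		Resource: "gitrepos",
	}
	created, err := dyn.Resource(gvr).Namespace("fleet-default").
		Create(context.TODO(), gitRepo, metav1.CreateOptions{})
	if err != nil {
		panic(err)
	}
	fmt.Printf("registered GitRepo %q with Fleet\n", created.GetName())
}
```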
As you plan your own strategy, here are a few questions worth reflecting on:
- Where are you on your edge computing adoption journey?
- Which edge computing use cases are relevant to your organization?
- How is edge computing influencing your cloud strategy?
About the author
Vishal Ghariwala is the Chief Technology Officer for SUSE for the APJ and Greater China regions. In this capacity, he engages with customer and partner executives across the region and is responsible for growing SUSE’s mindshare by being the executive technical voice to the market, press, and analysts. He also supports the global Office of the CTO in assessing relevant industry, market, and technology trends and identifying opportunities aligned with the company’s strategy.
Prior to joining SUSE, Vishal was the Director for Cloud Native Applications at Red Hat, where he led a team of senior technologists responsible for driving the growth and adoption of the Red Hat OpenShift, API Management, Integration, and Business Automation portfolios across the Asia Pacific region.
Vishal has over 20 years of experience in the IT industry and holds a Bachelor’s Degree in Electrical and Electronic Engineering from the Nanyang Technological University in Singapore.