Opinion: Meeting the edge computing challenge

22 Mar 2019

Article by Alan Conboy, Office of the CTO at Scale Computing

The fastest-growing segment of IT is edge computing – but what is it?

Edge computing is a physical computing infrastructure that is intentionally located outside the four walls of the data centre. The purpose of edge computing is to place applications, compute and storage resources near to where they are needed and used – and to where the data is collected.

With only a small, possibly tiny hardware footprint, infrastructure at the edge collects, processes and reduces vast quantities of data that can be uploaded to a centralised data centre or the cloud. Edge computing acts as a high-performance bridge from local compute to private and public clouds.
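The data-reduction step described above can be pictured as a small aggregation routine running on the edge node: rather than uploading every raw sensor reading, the node summarises a window of data and sends only the summary upstream. This is an illustrative sketch, not code from any particular edge platform; the function name and summary fields are assumptions.

```python
from statistics import mean

def summarise_readings(readings):
    """Reduce a window of raw sensor readings to a compact summary.

    An edge node might call this once per minute, uploading only the
    resulting dictionary instead of every raw data point.
    """
    if not readings:
        return None
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": mean(readings),
    }

# Five raw samples become a four-field summary payload for upload.
raw = [21.0, 21.4, 20.9, 35.2, 21.1]
payload = summarise_readings(raw)
print(payload["count"])  # 5
```

The same pattern scales from a single sensor to thousands: bandwidth to the central data centre or cloud grows with the number of summaries, not the number of raw readings.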

The importance of edge computing

It is particularly useful for any industry that has remote sites, such as retail, finance, industrial, remote office branch office (ROBO) and the internet of things (IoT).

Retailers, for example, need reliable computing that can provide maximum uptime for point-of-sale, inventory management and security applications across the numerous store locations on the edges of their networks. Banks and other financial institutions with multiple branch offices also require reliable computing to support rapid, business-critical transactions.

Edge computing is also set to play a prominent role in the continuing deployment of IoT devices as the most effective means to process the vast amount of data they produce quickly and effectively. This requirement will only become more pronounced where communication of that data to the cloud is not reliable or fast enough to be effective.

In the case of ROBO deployments, small branch locations are increasingly running core, mission-critical applications, and the infrastructure those applications reside on needs to evolve to match the critical nature of the workloads.

It’s obvious that most edge computing sites have very specific computing needs and require much smaller deployments than the primary data centre site. Many organisations may have dozens or hundreds of smaller edge computing sites and they cannot afford to roll out complex, expensive IT infrastructure to each site.

How to fix the great edge mismatch

But with many applications running on the edge becoming as critical as those in the data centre, how can organisations match the resiliency, scalability, security, high-availability and human IT resources found in the data centre? How can they address the growing mismatch between the importance of the applications and the infrastructure and IT that supports them at the edge?

To support critical applications with little or no onsite IT staff, edge computing systems have to be more reliable, easy to deploy and use, highly available, efficient, high performance, self-healing and affordable. In many instances, to keep applications running without dedicated IT staff onsite, systems require automation that eliminates mundane manual IT tasks where human error can cause problems.

Automation also keeps the systems running by monitoring for complex system failure conditions and by taking automatic actions to correct those conditions. This eliminates the downtime that would take a system offline and require an IT staffer to come onsite to bring it back online. Even when hardware components fail, automation can shift application workloads to redundant hardware components to continue operating.
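The failover behaviour described above can be sketched as a simple rebalancing routine: given the current placement of workloads and the health of each node, any workload on a failed node is reassigned to a surviving one. This is a minimal illustration of the idea, not the algorithm of any specific product; the data structures and round-robin placement are assumptions.

```python
def rebalance(workloads, node_health):
    """Reassign workloads from failed nodes to healthy ones.

    workloads:   dict mapping workload name -> node name
    node_health: dict mapping node name -> True (up) / False (down)
    Returns a new placement with every workload on a healthy node,
    spreading displaced workloads across survivors round-robin.
    """
    healthy = [node for node, up in node_health.items() if up]
    if not healthy:
        raise RuntimeError("no healthy nodes available")
    placement = {}
    displaced = 0
    for workload, node in workloads.items():
        if node_health.get(node, False):
            placement[workload] = node  # node is up: leave it alone
        else:
            # Node is down: migrate to the next healthy node in turn.
            placement[workload] = healthy[displaced % len(healthy)]
            displaced += 1
    return placement
```

A monitoring loop would call a routine like this whenever a health check fails, so applications keep running without an IT staffer travelling to the site.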

Edge computing infrastructure systems need to be easy to deploy and manage because businesses with hundreds of sites cannot afford to spend weeks deploying complex hardware to each site. They need to be able to plug in the infrastructure, bring systems online and remotely manage the sites going forward. The more complex the infrastructure, the more time they will spend deploying and managing it.

Edge computing systems should also run with as little management as possible. They need to be self-healing to provide high availability for applications without requiring IT staff resources, with automated error detection, mitigation and correction. Management tasks should be able to be performed remotely and with ease.

In addition, these systems should be scalable up and down, dependent on the requirement of the edge location, to ensure organisations are not saddled with excessive overheads for resources they don’t need.

A data centre in a box

Virtualisation, converged infrastructure (CI) and hyperconverged infrastructure (HCI) are helping organisations deploy the compute, storage and network resources at the edge to match the growing requirements placed on dispersed locations. The "data centre in a box" approach makes it easy to deploy IT infrastructure that can be managed individually or centrally, to add resources without downtime, and to deliver built-in self-healing, local high availability, remote disaster recovery and hybrid cloud capabilities.

These systems can be deployed in minutes and preconfigured to minimise on-site work during initial deployment. No specialised training or certification is required because these platforms are designed to be powerful but intuitive, which also negates the need for on-site IT expertise.

CI and HCI help to make edge computing much more accessible and affordable for organisations of all shapes and sizes. These systems can transform isolated locations into unified, self-managing micro data centres that meet the changing needs of computing at the edge for so many organisations. CI and HCI can give businesses the capabilities they need to meet the developing challenges of computing on the edge without tipping them over it.
