Interview: Edge computing - the force powering hyperconverged infrastructure
Scale Computing is a company on a mission to bring edge computing innovation to the world. Working with partners like Lenovo to deliver solutions to companies across the globe, Scale Computing is undoubtedly on the leading edge.
We quizzed CEO Jeff Ready on what Scale Computing offers, its plans for the future, and why edge computing has 'the edge' on cloud infrastructure.
Could you explain a little about Scale Computing and what it offers?
Scale Computing is an edge computing innovation company that integrates storage, servers, and virtualisation software into an all-in-one appliance-based system that is scalable, self-healing, and as easy to manage as a single server. Patented technology transforms isolated locations into unified, self-managing micro-data centers for edge computing. When ease of use, high availability, and total cost of ownership (TCO) matter, Scale Computing HC3 is the ideal infrastructure solution.
Using industry-standard components, the HC3 Edge appliances install in under an hour and can be expanded and upgraded with no downtime. High availability insulates the user from any disk or server failure, and a unified management capability driven by the HyperCore software efficiently integrates all functionality. The result is a data center solution that reduces operational complexity, allows a faster response to business issues, and dramatically reduces costs.
How do you work with clients across different sectors?
One example: Edge computing plays a critical role in the retail industry. The Ahold Delhaize Group needed a future-proof system that could support new edge-based applications with hyperconverged infrastructure (HCI) technology. Delhaize deployed the Scale Computing HC3 Edge Solution on Lenovo servers and benefited from a 75% reduction in management time and a 99.9% reduction in recovery time, as well as the stability, support and simplicity needed to modernise its stores with IoT for optimising freezer and heating systems and the customer experience. It has since deployed the new platform at over 100 stores, with plans to expand this across all 800 stores in Belgium.
What are Scale Computing's plans for the year ahead?
Having closed out a record-breaking sales year in 2018 and a significant round of new funding, in 2019 our focus remains on international growth and our mission to deliver market-leading edge and HCI solutions for global organisations.
As a result, we are continuing to make significant investment in products, partners and within the Scale team.
Our OEM partnerships with technology providers including Lenovo, NEC, Schneider, and Google continue to be a key focus. Our existing partnerships are very important to us, and we are always open to partnering with other providers who share the same values as Scale Computing.
We have also recently been named a 'Notable Vendor' by Gartner in its Magic Quadrant for HCI for the second year running, and we believe that the company is in a really strong position to take advantage of the mainstream adoption of HCI technology this year and into the future. It's a very exciting time to be at Scale.
Edge computing is a hot topic at the moment – what are some of the key developments that you have seen?
Many of the key developments in edge focus on IoT, and IoT will, by its very nature, need edge computing to work effectively and realise its long-term potential. The inherent latency of cloud is no longer cutting it when it comes to deploying intelligent automation and getting real-time results.
Edge computing is here to solve that problem, and by mitigating the latency associated with cloud, it ensures that the latest IoT developments are available to businesses across every industry. IoT and the growing global network of sensors are generating more data than the average cloud has had to handle in the past. According to a study from IDC, 45 percent of all data created by IoT devices will be stored, processed, analysed and acted upon close to or at the edge of a network by 2020. In the process, edge computing will take on workloads that struggle in hosted cloud environments, passing the torch over to HCI platforms.
Looking at some specific markets and use cases as examples, retailers need reliable computing across the edges of their networks: think smart thermometers for a grocery store's freezer section, interactive signage, customer recognition and improved quality control.
For industrial applications, IoT sensors require on-site computing performance. And in maritime, shipping vessels, ocean liners and offshore platforms all have computing needs that can go far beyond the edge of most networks.
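Purely as an illustrative sketch (not a description of Scale Computing's HC3 product), the kind of logic that benefits from living at the edge might look like the Python below: a polling loop on an in-store node that checks freezer temperatures and raises an alert locally, with no cloud round trip in the critical path. The sensor read and alert functions are hypothetical stand-ins, simulated so the example runs on its own.

```python
import random
import time

ALERT_THRESHOLD_C = -15.0  # alert if a freezer warms above this temperature


def read_freezer_temp(sensor_id: str) -> float:
    # Hypothetical stand-in for a real sensor read (e.g. via an in-store
    # IoT gateway); simulated here so the sketch is self-contained.
    return random.uniform(-22.0, -10.0)


def trigger_local_alert(sensor_id: str, temp_c: float) -> None:
    # Local action only: no dependency on the WAN link or the cloud.
    print(f"ALERT: freezer {sensor_id} at {temp_c:.1f} C")


def monitor(sensor_ids, cycles=3, poll_seconds=1):
    # Runs on the edge node, so alerts still fire if internet connectivity drops.
    for _ in range(cycles):
        for sensor_id in sensor_ids:
            temp_c = read_freezer_temp(sensor_id)
            if temp_c > ALERT_THRESHOLD_C:
                trigger_local_alert(sensor_id, temp_c)
        time.sleep(poll_seconds)


if __name__ == "__main__":
    monitor(["freezer-01", "freezer-02"])
```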
How does edge computing aim to solve the common problem of processing massive amounts of data and helping customers make sense of it?
In a world that is increasingly data-driven, and where that data is being generated outside of traditional data centers, edge computing places the physical computing infrastructure at the edges of the network where the data is being generated. In many cases, those sites are where the data is needed most.
It doesn't always make sense to send this data all the way back to the central data center, or the cloud, only for it to be processed and sent back to the same site where it was generated. Processing that data onsite is often quicker and more efficient, eliminating the latency associated with remote computing resources like the cloud.
Edge computing also allows organisations to process large amounts of raw data onsite before sending it on to the cloud or the central data center in a more refined state. This can dramatically reduce bandwidth and cloud computing ingress/egress costs.
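As a hedged illustration of that idea, the short Python sketch below collapses a window of raw, high-frequency readings into one compact summary before anything leaves the site; the upload() function is a hypothetical stand-in for whatever cloud or data center API an organisation actually uses.

```python
import json
import statistics
from datetime import datetime, timezone


def summarise(readings):
    # Collapse a window of raw readings into one small record.
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(statistics.mean(readings), 2),
    }


def upload(summary):
    # Hypothetical stand-in for sending data to a cloud or central data
    # center API; printed here so the sketch is self-contained.
    print("uploading", json.dumps(summary))


# e.g. one reading per second for five minutes -> 300 raw values locally...
raw_window = [20.0 + 0.01 * i for i in range(300)]
upload(summarise(raw_window))  # ...reduced to a single record sent upstream
```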
What are some of the benefits and challenges that organisations are facing as they understand and deploy edge solutions?
If you are imagining edge computing in the context of remote office/branch office computing, you aren't wrong, but edge computing can also sit adjacent to manufacturing systems, medical equipment, point-of-sale systems, IoT devices, and more. The needs of edge computing are widespread across every industry.
As a result, edge computing is fast becoming mission critical where on-premises IT infrastructure resources need to sit outside of the typical data center. In general, these edge computing resource needs are small, not requiring a full data center or even a small data center implementation. Edge computing may require infrastructure as small as an IoT device, or as large as a micro-data center of multiple compute appliances.
But why edge computing and not simply cloud? Cloud computing has many benefits, especially scalability and elasticity; however, cloud is not without its limits. Chief among these are internet connectivity and latency. On-premises infrastructure assets for edge computing provide more reliable performance and connectivity, keeping systems operational even if internet connectivity fails.
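To make that resilience point concrete, here is a minimal, hypothetical store-and-forward sketch in Python: records queue locally while the WAN link is down and drain once connectivity returns, so the edge workload keeps running through an outage. The send_to_cloud() transport is simulated and not any particular vendor's API.

```python
from collections import deque


class StoreAndForward:
    # Minimal sketch of keeping an edge workload running through WAN outages:
    # records queue locally and drain once connectivity returns.
    def __init__(self, send, max_buffered=10_000):
        self.send = send                    # callable returning True on success
        self.buffer = deque(maxlen=max_buffered)  # oldest records drop if full

    def submit(self, record):
        self.buffer.append(record)
        self.flush()

    def flush(self):
        while self.buffer:
            record = self.buffer[0]
            if not self.send(record):       # link down: keep data, retry later
                break
            self.buffer.popleft()


# Hypothetical transport: pretend the link is down, then recovers.
link_up = False


def send_to_cloud(record):
    return link_up


sf = StoreAndForward(send_to_cloud)
sf.submit({"sensor": "door-01", "open": True})  # buffered while offline
link_up = True
sf.flush()                                      # drained once the link returns
```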
Unlike full data center implementations, edge computing is usually small enough not to warrant dedicated IT staff. Because of this, the infrastructure needs to be easy to implement and manage, and easily connected back to the primary data center or even the cloud as needed. These requirements are what make HCI technology well suited to edge computing.
The challenge, however, is that not all HCI solutions are created equal. Some solutions are not economically viable given their high demand for resources such as CPU and RAM. These solutions are too expensive to buy and too expensive to operate for edge computing.
What advice would you give to those organisations?
Our advice is to work with technology vendors and partners who have solutions specifically designed for edge computing, and who have the experience of implementing edge infrastructure that delivers the benefits they are looking for.
More specifically, flexibility, overhead and cost are all key considerations. Because edge locations have previously only run a few critical applications, the existing infrastructure may be a mishmash of different point solutions, server types, and infrastructure software components deployed, modified, and added to over time. No good data center would be deployed in such a chaotic fashion, and for good reason: delivering high availability and efficiency with such a deployment is impossible.
A good edge deployment can be looked at as a micro-data center combined with intelligent automation. Data center functions such as compute, storage, backup, disaster recovery, and application virtualisation can be consolidated into a single, integrated platform. Infrastructure silos that are difficult to manage in a centralised data center become unmanageable at the edge, and thus, convergence of these into a single platform is both efficient and cost effective.
Similarly, a good edge infrastructure strategy is built around flexibility. After all, new applications, devices, data sources, and needs emerge continuously. Some edge applications may be resource- and storage-heavy, while others may only need to run a few very lightweight applications.
What is needed, then, is a solution that delivers the core functions of a data center but is scalable both up and down in size. Edge computing brings with it the need to deploy many micro-data centers of varying sizes, and a proper platform should be able to scale in both directions to accommodate these needs.