
The public cloud isn’t the only answer, says HPE

07 Jun 2017

It’s easy to get caught up in the hype surrounding the public cloud. It can feel almost inevitable that every enterprise application will eventually be delivered from it.

Oracle CEO Mark Hurd recently said he believes that within a decade, 70-80% of organisations will depend upon cloud-based IT infrastructure.

HPE takes a very different view. They strongly believe a hybrid of on-premises and public cloud data centre technology will dominate.

Meg Whitman pointed to research showing that 52% of enterprises are considering moving applications back from the public cloud to on-premises infrastructure.

HPE doesn’t benefit when customers move to public cloud infrastructure from companies like Amazon Web Services and Google Cloud Platform, as these hyper-scale providers often build their own servers, storage and networking hardware.

Given that, you’d expect HPE to preach a vision of the future involving their on-premises equipment.

They aren’t disputing the importance of public cloud; they just see that it hasn’t fulfilled many of its promises. Many migrations haven’t turned out to be as cost-effective as planned, and latency has become an issue for critical transactional data and systems.

HPE calls this experience of unfulfilled public cloud promises the “Cloud Cliff”.

Beyond the disappointment, IT departments are worrying about cyber security, privacy, regulatory requirements and a loss of control over their own applications.

The answer, HPE believes, is hybrid cloud: mission-critical applications with high availability and low latency requirements are kept on-premises, while less critical workloads run in the public cloud.

HPE also adds a third option: managed private cloud, where, for example, a local systems integrator hosts specialist applications like SAP.

Many factors will play into deciding which applications belong on-premises, in a managed private cloud or in the public cloud. HPE has also put together a set of modern tools to help clients dynamically manage where each application runs.
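The article doesn’t detail how HPE’s tooling actually works, but a minimal sketch can illustrate the kind of rules-driven placement decision being described. All names, fields and thresholds below are hypothetical, not taken from any HPE product.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    mission_critical: bool      # needs high availability?
    max_latency_ms: float       # tightest latency the app tolerates
    specialist_hosting: bool    # e.g. an SAP estate run by an integrator

def place(workload: Workload) -> str:
    """Pick a hosting tier the way a simple business-rules policy might."""
    if workload.mission_critical or workload.max_latency_ms < 10:
        return "on-premises"            # keep critical, latency-sensitive apps close
    if workload.specialist_hosting:
        return "managed private cloud"  # e.g. a local integrator hosting SAP
    return "public cloud"               # everything less critical

print(place(Workload("order-processing", True, 5.0, False)))   # on-premises
print(place(Workload("sap-erp", False, 50.0, True)))           # managed private cloud
print(place(Workload("dev-test", False, 200.0, False)))        # public cloud
```

In practice the rules would be far richer, but the shape is the same: each workload is scored against business objectives and assigned, and can be re-assigned, to one of the three tiers.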

HPE is calling this “the right hybrid mix”, part of its overarching strategy of “making hybrid IT simple”.

“We see the split between different hosting models as truly dynamic, with customers’ systems adjusting all the time based upon business rules and objectives,” says Paul Miller, Vice President of Marketing, Software-Defined & Cloud Group.

“Many CIOs have a perception that an on-premises or private cloud solution can’t be as inexpensive or flexible as massively scaled public cloud services. This isn’t the case: numerous studies have shown that when managed, software-defined automation is deployed, costs can be up to 50% lower on-premises than in the public cloud,” Miller says.

The mix of environments is going to be key, hence HPE’s focus on simplifying hybrid cloud. They’re doing this with an impressive array of tools that help customers analyse their existing cloud usage, dynamically distribute applications across environments, and even divide up spend by department.
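As a rough illustration of the “divide up spend by department” idea, here is a minimal sketch that rolls usage records from every environment into one per-department view. The data and field names are invented for illustration only.

```python
from collections import defaultdict

# Invented usage records: (department, environment, monthly cost in dollars)
usage_records = [
    ("finance",   "on-premises",           1200.0),
    ("finance",   "public cloud",           300.0),
    ("marketing", "public cloud",           950.0),
    ("marketing", "managed private cloud",  400.0),
]

# Sum cost per department regardless of where the workload ran
spend = defaultdict(float)
for department, environment, cost in usage_records:
    spend[department] += cost

for department, total in sorted(spend.items()):
    print(f"{department}: ${total:,.2f}")
```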

The growing use of hyper-converged infrastructure, where compute, storage and networking are purchased together and operate as a single system, will also help to drive down complexity.

The final card HPE has in its pocket is something called “Flexible Capacity”: it’s offering its solutions on new financing models that allow enterprises to pay only for what they use, even on-premises. This goes as far as analysing equipment usage, predicting growth and shipping out new equipment in anticipation of capacity constraints, all on pay-as-you-go usage contracts.
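To make the idea concrete, here is a minimal sketch of the kind of forecasting such a model implies: fit a linear trend to recent utilisation, estimate when capacity runs out, and flag when new equipment should ship. The figures and the six-month lead time are assumptions for illustration, not HPE’s actual method.

```python
def months_until_full(history, capacity):
    """Least-squares linear trend: months until usage reaches capacity."""
    n = len(history)
    mean_x = (n - 1) / 2
    mean_y = sum(history) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(history)) \
            / sum((x - mean_x) ** 2 for x in range(n))
    if slope <= 0:
        return float("inf")  # usage flat or shrinking: no action needed
    return (capacity - history[-1]) / slope

monthly_usage_tb = [40, 44, 47, 52, 55, 60]   # invented storage usage (TB)
headroom = months_until_full(monthly_usage_tb, capacity=80)
if headroom < 6:  # assumed shipping lead time, in months
    print(f"~{headroom:.1f} months of headroom left: ship more capacity now")
```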

Maybe HPE is correct: on-premises equipment, it seems, can be as cost-effective, scalable and flexible as public cloud solutions, without the latency and other drawbacks.
