
The public cloud isn’t the only answer, says HPE

07 Jun 17

It’s easy for all of us to get hooked into the hype surrounding public cloud. It feels almost inevitable that all enterprise applications will be delivered from the public cloud eventually.

Oracle CEO Mark Hurd said recently that he believes that within a decade 70-80% of organisations will depend upon cloud-based IT infrastructure.

HPE takes a very different view, strongly believing that a hybrid of on-premises and public cloud datacentre technology will dominate.

Meg Whitman pointed to research showing that 52% of enterprises are considering moving applications back from public cloud infrastructure to on-premises environments.

HPE doesn’t benefit when customers move to public cloud infrastructure from companies like Amazon Web Services and Google Cloud Platform, as these hyperscale providers often build their own servers, storage and networking hardware.

Given that, you’d expect HPE to preach a vision of the future involving their on-premises equipment.

They aren’t disputing the importance of public cloud. They just see that the public cloud hasn’t fulfilled many of its promises. Many migrations haven’t proved as cost-effective as planned, and latency has become an issue for more critical transactional data and systems.

HPE calls this experience of disappointment with public cloud promises the “Cloud Cliff”.

Beyond the disappointment, IT departments are worrying about cyber security, privacy, regulatory requirements and lack of control of one’s own applications.

The answer, HPE believes, is hybrid cloud: mission-critical applications with high availability and low latency requirements are kept on-premises, while less critical infrastructure runs in the public cloud.

HPE also adds a third option, managed private cloud: for example, a local systems integrator that hosts specialist applications like SAP.

Many factors will play into deciding which applications will sit on-premises, in managed private cloud or in public cloud. HPE has also put together a modern set of tools to help clients dynamically manage where each application runs.

HPE is calling this “the right hybrid mix”, part of its overarching strategy of “making hybrid IT simple”.

“We see the split between different hosting models to be truly dynamic, with customers’ systems adjusting all the time based upon business rules and objectives,” says Paul Miller, Vice President of Marketing, Software-Defined & Cloud Group.

“Many CIOs have a perception that an on-premises or private cloud solution can’t be as inexpensive or flexible as massively scaled public cloud services. This isn’t the case: numerous studies have shown that when managed, software-defined automation is deployed, costs can be up to 50% less on-premises than in the public cloud,” Miller says.

The mix of environments is going to be the key. Hence HPE’s focus on simplifying the hybrid cloud. They’re doing this with an impressive array of tools that will help customers analyse their existing cloud usage, as well as dynamically distribute applications across environments and even divide up spend by department usage.

The growing use of hyperconverged infrastructure, where compute, storage and networking are purchased together and operate as a single system, will also help to drive down complexity.

The final card HPE has in its pocket is something called “Flexible Capacity”. It’s offering its solutions on new financing models which allow enterprises to pay only for what they use, even on-premises. This goes as far as analysing equipment usage, predicting growth and shipping out new equipment in anticipation of capacity constraints, all on pay-as-you-go usage contracts.

Maybe HPE is correct: on-premises equipment can seemingly be as cost-effective, scalable and flexible as public cloud solutions, without the latency and other drawbacks.
