
How to avoid that next storage hardware refresh

11 Apr 2017

Customers tell me repeatedly that they dread the phone call from their hardware storage vendor. Every three or four years, the call is predictable:

“Great news John, due to economies of scale the cost per GB of storage has dropped and performance has increased – we can also provide you with some flash to help improve performance. Plus, we have extra options like deduplication, encryption and replication – we will need to charge you for these.”

Wait, there is more: “Yes, all you need to do is replace all that perfectly good working storage with our new array, and due to your explosive growth it will cost you slightly more than your last refresh. Our professional services team can help you with a migration project to move data from our old frame to our new frame.”

Sound familiar? How would you like to change the conversation?

“That is OK, Bill. You know how we virtualised our servers all those years ago, and how it changed the way we purchased servers?

“Well, we are doing the same to our storage, and we no longer need that hardware upgrade. In fact, we are adding 1TB of RAM for about A$12,000 and 5TB of high-performance flash for about A$10,000.

“For extra capacity, we are using multiple 10TB drives – they cost us only about A$800 each – and we can interconnect all our existing disparate SANs and NAS devices into a single storage pool using Objective-Defined Storage.

“Objective-Defined Storage manages data tiering in real time, moving data between hot, warm and cold tiers to free up RAM and SSD when an application no longer needs those expensive resources.

“All resources can be shared across all applications – even applications on different hosts. It can even create up to eight resilient live instances of critical data using our existing storage, including these inexpensive high-capacity drives – pretty cool. Oh, and thank you for your suggestion, but we will give it a miss.”
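As a rough sketch of the tiering idea described in that conversation, a policy engine might demote data as it cools. The tier names, thresholds and functions below are hypothetical illustrations, not ioFabric's actual implementation:

```python
import time

# Demotion thresholds in seconds since last access (illustrative values only).
TIER_THRESHOLDS = [
    ("hot", 60),              # RAM/flash: touched within the last minute
    ("warm", 3600),           # SSD: touched within the last hour
    ("cold", float("inf")),   # high-capacity disk or cloud: everything else
]

def pick_tier(last_access, now=None):
    """Return the tier a data block should live on, given its last access time."""
    now = time.time() if now is None else now
    idle = now - last_access
    for tier, limit in TIER_THRESHOLDS:
        if idle <= limit:
            return tier
    return "cold"

def rebalance(blocks):
    """Map each block id to its target tier; a real engine would then migrate
    blocks whose current tier differs, freeing RAM and SSD for hot data."""
    return {block_id: pick_tier(ts) for block_id, ts in blocks.items()}

# Example: a block untouched for a day is demoted to cold storage.
print(pick_tier(time.time() - 86_400))   # -> "cold"
```

In practice an engine like this would also promote data back to RAM or SSD when access patterns heat up again, so expensive resources are only occupied while they are earning their keep.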

Software is king! And your data is your kingdom.

Why are you replacing your perfectly good storage? Reliability? Performance? Scalability?

Objective-Defined Storage is a software-only solution that allows users to extend the life of existing storage assets, dramatically increase performance and deliver elastic capacity – scale up, scale out, scale any way you require, including into the cloud.

The technology extends the life of existing storage by a couple of years, allowing users to avoid those expensive forklift upgrades.

Objective-Defined Storage typically allows users to free up between 50 and 80 per cent of the capacity of a SAN or NAS by moving cold data to high-capacity, low-cost devices or the cloud; deliver dramatic performance improvements using commodity SSDs; and improve data availability.

The two major outcomes are the elimination of cost and the elimination of complexity from the storage ecosystem – together, they make this the storage solution for the new enterprise.

Just as server virtualisation changed the way we use and deploy servers, storage virtualisation decouples data from the underlying storage hardware and allows data to move freely across all storage, regardless of vendor, make or model.

This delivers data freedom to a business and puts an end to vendor lock-in, storage silos and forklift upgrades once and for all.
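To make the decoupling concrete, here is a minimal sketch of a placement layer that presents disparate arrays as one pool. The Backend and VirtualPool names are hypothetical illustrations, not ioFabric's API:

```python
from dataclasses import dataclass, field

@dataclass
class Backend:
    name: str         # e.g. an existing SAN, commodity 10TB drives, or cloud
    capacity_gb: int
    used_gb: int = 0

    @property
    def free_gb(self):
        return self.capacity_gb - self.used_gb

@dataclass
class VirtualPool:
    backends: list = field(default_factory=list)

    def write(self, size_gb):
        """Place data on whichever backend has the most free space; callers
        address one pool, never a specific vendor's array."""
        target = max(self.backends, key=lambda b: b.free_gb)
        if target.free_gb < size_gb:
            raise IOError("pool exhausted")
        target.used_gb += size_gb
        return target

# Example: the write lands on the emptiest device, regardless of make or model.
pool = VirtualPool([Backend("legacy SAN", 50_000, 45_000),
                    Backend("10TB commodity drives", 80_000, 10_000)])
print(pool.write(500).name)   # -> "10TB commodity drives"
```

Because applications only ever see the pool, hardware behind it can be added, retired or swapped between vendors without a migration project.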

Article by Greg Wyman, Vice President Asia Pacific, ioFabric
