How to avoid that next storage hardware refresh
Customers tell me repeatedly that they dread the phone call from their hardware storage vendor. Every three or four years, the call is predictable:
"Great news John, due to economies of scale the cost per GB of storage has dropped and performance has increased – we can also provide you with some flash to help improve performance. Plus, we have extra options like deduplication, encryption and replication – we will need to charge you for these.
Wait, there is more: "Yes, all you need to do is replace all that perfectly good working storage with our new array, and due to your explosive growth it will cost you slightly more than your last refresh. Our professional services team can help you with a migration project to move data from our old to our new frame."
Sound familiar? How would you like to change the conversation?
"That is OK Bill, you know how we virtualised our servers all those years ago, and it changed how we purchased servers?"
"Well we are doing the same to our storage, and now we no longer need that hardware upgrade. In fact we are adding 1TB of RAM for about A$12,000, 5TB of high performance flash for about $A10,000.
"For extra capacity, we are using multiple 10TB drives – they cost only about A$800 each – and we can interconnect all our existing, disparate SANs and NAS devices into a single storage pool using Objective-Defined Storage.
"Objective-Defined Storage manages data tiering in real time, moving data between hot, warm and cold tiers so that RAM and SSD are freed up as soon as an application no longer needs those expensive resources.
"All resources can be shared across all applications – even applications on different hosts. It can even create up to eight resilient live instances of critical data on our existing storage, including these inexpensive high-capacity drives – pretty cool. Oh, and thank you for your suggestion, but we will give it a miss."
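Strip away the sales pitch and the tiering idea is straightforward: data sits on the fastest media only while it is hot, and is demoted as it cools. The sketch below is purely illustrative – the thresholds, tier names and function are invented for this article, not taken from any product's API.

import time

# Illustrative thresholds (hypothetical – real policies are set per business objective)
HOT_WINDOW = 60 * 60          # touched within the last hour  -> RAM / flash
WARM_WINDOW = 24 * 60 * 60    # touched within the last day   -> commodity SSD

def choose_tier(last_access_ts, now=None):
    """Pick a tier for a piece of data based on how recently it was accessed."""
    now = time.time() if now is None else now
    age = now - last_access_ts
    if age <= HOT_WINDOW:
        return "ram_or_flash"            # hottest data earns the most expensive media
    if age <= WARM_WINDOW:
        return "ssd"
    return "capacity_hdd_or_cloud"       # cold data drops to cheap 10TB drives or the cloud

# Data untouched for a week is demoted, freeing RAM and SSD for busier workloads
print(choose_tier(time.time() - 7 * 24 * 60 * 60))   # -> capacity_hdd_or_cloud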
Software is king! And your data is your kingdom.
Why are you replacing your perfectly good storage? Reliability? Performance? Scalability?
Objective-Defined Storage is a software-only solution that extends the life of existing storage assets, dramatically increases performance and delivers totally elastic scale – scale up, scale out, scale any way you require, including into cloud capacity.
The technology typically extends the life of existing storage by a couple of years, allowing users to avoid those expensive forklift upgrades.
Objective-Defined Storage typically allows users to free up between 50 and 80 per cent of a SAN or NAS's capacity by moving cold data to high-capacity, low-cost devices or the cloud; to deliver substantial performance improvements using commodity SSDs; and to improve data availability.
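To make that 50 to 80 per cent claim concrete, here is some back-of-the-envelope arithmetic. All the numbers are assumptions chosen for illustration – a 100TB SAN, 65 per cent cold data, and the roughly A$800 10TB drives mentioned earlier – and it deliberately ignores the overhead of keeping multiple resilient copies.

san_capacity_tb = 100          # assumed size of the existing SAN
cold_fraction = 0.65           # assume ~65% of the data is cold (mid-range of 50-80%)
capacity_drive_tb = 10         # high-capacity commodity drive
drive_cost_aud = 800           # approximate price quoted earlier

cold_data_tb = san_capacity_tb * cold_fraction          # 65 TB of primary capacity freed
drives_needed = -(-cold_data_tb // capacity_drive_tb)   # ceiling division -> 7 drives
offload_cost_aud = drives_needed * drive_cost_aud       # ~A$5,600 to hold all of it

print(f"Primary capacity freed: {cold_data_tb:.0f} TB")
print(f"Cost of the capacity tier: ~A${offload_cost_aud:,.0f}")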
The two major outcomes are lower cost and less complexity across the storage ecosystem – in effect, the storage solution for the new enterprise.
Just as server virtualisation changed the way we use and deploy servers, storage virtualisation decouples data from the underlying storage hardware and allows data to move freely across all storage, regardless of vendor, make or model.
This delivers data freedom to a business and puts an end to vendor lock-in, storage silos and forklift upgrades once and for all.
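To show what that decoupling looks like in practice, here is a toy sketch – not any vendor's implementation, and every class and method name is invented. A virtual volume keeps a map from logical extents to whichever backend currently holds them, so data can migrate between arrays while the application's view of it never changes.

class Backend:
    """Stand-in for any physical array – vendor, make and model are irrelevant here."""
    def __init__(self, name):
        self.name, self.blocks, self.next_id = name, {}, 0

    def allocate(self, data):
        self.next_id += 1
        self.blocks[self.next_id] = data
        return self.next_id

    def fetch(self, physical_id):
        return self.blocks[physical_id]

    def free(self, physical_id):
        del self.blocks[physical_id]

class VirtualVolume:
    """The application addresses logical extents; where they physically live is policy."""
    def __init__(self):
        self.extent_map = {}    # logical extent -> (backend, physical id)

    def write(self, extent, data, backend):
        self.extent_map[extent] = (backend, backend.allocate(data))

    def read(self, extent):
        backend, physical_id = self.extent_map[extent]
        return backend.fetch(physical_id)

    def migrate(self, extent, new_backend):
        # Move the data; the logical address never changes, so no forklift required
        old_backend, old_id = self.extent_map[extent]
        self.write(extent, old_backend.fetch(old_id), new_backend)
        old_backend.free(old_id)

old_san, capacity_tier = Backend("legacy SAN"), Backend("10TB capacity drives")
vol = VirtualVolume()
vol.write(0, b"critical data", old_san)
vol.migrate(0, capacity_tier)            # data moves; the application never notices
print(vol.read(0))                       # b'critical data'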