
How storage virtualization delivers data: Always on, always available, always fast

03 Aug 2017

Virtualization has dramatically changed how companies manage their servers and applications. Software-defined networking (SDN) is removing the complexity from networking infrastructure. The remaining challenge is storage.

In the beginning, everyone was reluctant, or even scared, to be the first to move to server virtualization. However, the companies that did move immediately gained strong competitive advantages and substantial reductions in the cost and complexity of managing their environments.

Traditional SAN and NAS storage products were developed for physical servers and quite simply can’t deliver the scalability, reliability, performance or value that the new enterprise requires.

Consider this: 50 to 80% of all data is inactive, cold or stale, yet it sits on expensive storage. Why?

Because storage vendors lock us into their storage silo and make it hard to move or migrate data.

Ask a SAN, NAS, all-flash or hyper-converged vendor if you can buy the same commodity SSDs or spinning disks that they use and add them to their array yourself.

Of course not.

That is called vendor lock-in and a storage silo.

If you need more performance, doesn’t it make sense to add more RAM and flash/SSD? And for extra capacity, why not simply leverage AWS, Azure or Google to deliver web-scale storage on a low monthly subscription?

Inevitably, companies will make the transition to storage virtualization solutions that remove storage silos and create an open, single storage pool utilising any available storage – RAM, SSD, DAS, SAN, NAS, JBOD, or even any public or private cloud provider.

Active data is moved to RAM and SSD, warm data to SSD and the fastest spinning disks, and cold data is eventually migrated to the slowest storage or the cloud regardless of storage vendor, make or model – all without any user intervention.

The latest in storage virtualization solutions are totally hardware agnostic and tier data in real time according to application requirements or objectives.
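The tiering behaviour described above can be sketched in a few lines. This is a minimal illustration, not the vendor's actual algorithm: the tier names and age thresholds are assumptions chosen to mirror the active/warm/cold split in the text.

```python
from datetime import datetime, timedelta

# Illustrative tier ladder: recently touched data stays on the fastest
# media, and data is demoted as its last access grows older.
# Thresholds and tier names are hypothetical.
TIERS = [
    (timedelta(hours=1), "ram-ssd"),    # active data: RAM and SSD
    (timedelta(days=1), "ssd"),         # warm data: SSD
    (timedelta(days=30), "fast-disk"),  # cooler data: fastest spinning disks
]
COLD_TIER = "cloud"                     # cold data: slowest storage or cloud

def choose_tier(last_access: datetime, now: datetime) -> str:
    """Return the tier a piece of data should live on, by access age."""
    age = now - last_access
    for max_age, tier in TIERS:
        if age <= max_age:
            return tier
    return COLD_TIER
```

In a real system this decision would run continuously per block or per object, with data migrating between tiers in the background and without user intervention.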

An example of why this is so valuable: if you use VDI (Virtual Desktop Infrastructure), you will almost certainly have an issue with boot storms – the period when everyone logs in at once. To solve that, you have probably purchased an all-flash array and dedicated, say, 2TB of that expensive flash resource to the VDI boot storm, 24 x 7.

The boot storm only lasts from 07:30 to 09:30 – so doesn’t it make sense to have a storage virtualization solution that moves the VDI images into RAM and flash during the boot storm, then releases all that expensive performance storage for another application – for example, an Oracle report – once the storm is over, all without any user intervention? Again, once the report has finished running, that expensive storage is freed up for the next application.
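The time-windowed policy in this example can be sketched as follows. This is a hypothetical illustration of the idea, not the product's API: the window boundaries come from the text, while the tier names are assumptions.

```python
from datetime import time

# Boot-storm window from the example above: 07:30 to 09:30.
BOOT_STORM_START = time(7, 30)
BOOT_STORM_END = time(9, 30)

def tier_for_vdi(now: time) -> str:
    """Pin VDI images to flash during the boot storm; release it after."""
    if BOOT_STORM_START <= now <= BOOT_STORM_END:
        return "ram-flash"   # VDI images promoted for the login rush
    return "capacity"        # flash freed for other workloads, e.g. reports
```

The point is that the expensive tier is reserved for roughly two hours a day rather than 24 x 7, so the same flash can serve other applications for the remaining 22 hours.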

Storage virtualization eradicates expensive storage silos and eliminates complexity by providing one dashboard from which you can view and manage all storage assets, vastly simplifying capacity planning. If you need more performance, it is as easy as moving a slider, without downtime or disruption to users or applications.

If there’s not enough RAM or flash, the system will tell you; you can then invest in commodity RAM and flash and add it to the network, where it is instantly available to all applications as needed.

The latest storage virtualization solutions are built using an Objective-Defined Storage platform. Data from all existing storage hardware is absorbed into the single virtual storage pool.

For each application, simply set the performance (IOPS or latency), capacity and protection (number of live instances, snap copies) objectives for that application or group of applications; everything else is fully orchestrated, driven by the latest in artificial intelligence and swarm technologies.
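A per-application objective set of this kind might look like the sketch below. The field names and values are illustrative assumptions – the actual policy schema of any given product is not described here – but they show the shape of the performance/capacity/protection triple the text refers to.

```python
# Hypothetical per-application objectives: performance (IOPS, latency),
# capacity, and protection (live instances, snap copies).
objectives = {
    "oracle-prod": {
        "performance": {"max_latency_ms": 2, "min_iops": 50_000},
        "capacity_gb": 4_096,
        "protection": {"live_instances": 2, "snap_copies": 24},
    },
    "vdi-desktops": {
        "performance": {"max_latency_ms": 5, "min_iops": 20_000},
        "capacity_gb": 2_048,
        "protection": {"live_instances": 1, "snap_copies": 8},
    },
}

def meets_latency_objective(app: str, measured_latency_ms: float) -> bool:
    """Check a measured latency against an application's stated objective."""
    target = objectives[app]["performance"]["max_latency_ms"]
    return measured_latency_ms <= target
```

An orchestration layer would continuously compare measurements like these against the objectives and move data between tiers to close any gap, rather than requiring an administrator to place data by hand.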

The conclusion is simple: if data is important and required to be always on, always available and always fast, it is time to consider an open storage virtualization platform such as Objective-Defined Storage.

Deployment is simple – typically less than one hour – and can be done one application at a time, without any changes to the application or to how users access their data.

Note: when looking at storage virtualization solutions, it is critical to ensure that the platform is open and can manage any storage from any vendor; otherwise you may find yourself locked into a storage hardware platform and a storage silo. Just remember your server virtualization deployments – it didn’t matter whether you had HP, Dell, Lenovo or IBM servers; the virtualization platform worked across all of them.

Finally, storage virtualization and Objective-Defined Storage deliver data freedom from storage hardware, and enable digital transformation.

Article by Greg Wyman, VP Asia Pacific, ioFabric
