
How storage virtualization delivers data: Always on, always available, always fast

03 Aug 2017

Virtualization has dramatically changed how companies manage their servers and applications. Software-defined networking (SDN) is removing the complexity from networking infrastructure. The remaining challenge is storage.

In the beginning, everyone was reluctant – even scared – to be the first to move to server virtualization. However, the companies that did immediately gained strong competitive advantages and substantial reductions in the cost and complexity of managing their environments.

Traditional SAN and NAS storage products were developed for physical servers and quite simply can’t deliver the scalability, reliability, performance or value that the new enterprise requires.

Consider this: 50% to 80% of all data is inactive, cold or stale – yet it sits on expensive storage. Why?

Because storage vendors lock us into their storage silo and make it hard to move or migrate data.

Ask a SAN, NAS, all-flash or hyper-converged vendor whether you can buy the same commodity SSDs or spinning disks they use and add them to the array or frame yourself.

Of course not.

That is called vendor lock-in and a storage silo.

If you need more performance - doesn’t it make sense to add more RAM and flash/SSD, or for extra capacity simply leverage AWS, Azure or Google to deliver web scale storage on a low monthly subscription?

Inevitably, companies will make the transition to storage virtualization solutions that remove storage silos and create a single, open storage pool from any available storage – RAM, SSD, DAS, SAN, NAS, JBOD, or even any public or private cloud provider.

Active data is moved to RAM and SSD, warm data to SSD and the fastest spinning disks, and cold data is eventually migrated to the slowest storage or the cloud regardless of storage vendor, make or model – all without any user intervention.
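As a rough illustration of that temperature-based tiering, consider the sketch below. The tier names and idle-time thresholds are purely hypothetical – ioFabric's actual policy engine is not public – but the shape of the decision is the same: the less recently data was touched, the further down the hierarchy it migrates.

```python
import time

# Hypothetical tiers, fastest first. A real product discovers actual devices.
TIERS = ["ram", "ssd", "fast_hdd", "slow_hdd", "cloud"]

def choose_tier(last_access_epoch, now=None):
    """Map a data block's last-access time to a storage tier.

    Thresholds are illustrative only: active data stays in RAM or SSD,
    warm data on SSD or the fastest spinning disks, and cold data is
    eventually migrated to the slowest storage or the cloud.
    """
    now = now if now is not None else time.time()
    idle_days = (now - last_access_epoch) / 86400
    if idle_days < 1:
        return "ram"
    if idle_days < 7:
        return "ssd"
    if idle_days < 30:
        return "fast_hdd"
    if idle_days < 180:
        return "slow_hdd"
    return "cloud"
```

A background process would re-evaluate this continuously and move blocks between tiers without user intervention.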

The latest in storage virtualization solutions are totally hardware agnostic and tier data in real time according to application requirements or objectives.

An example of why this is so valuable: if you use VDI (Virtual Desktop Infrastructure), you will almost certainly have an issue with boot storms – the surge that occurs when everyone logs in at once. To solve that, you have probably purchased an all-flash array and dedicated, say, 2TB of that expensive all-flash resource to the VDI boot storm 24 x 7.

The boot storm only lasts from 07:30 to 09:30 – so doesn’t it make sense to have a storage virtualization solution that moves the VDI images into RAM and flash during the boot storm, then, once it is finished, releases all that expensive performance storage for another application to use – for example, running an Oracle report – all without any user intervention? Again, once the report has finished running, that expensive storage is freed up for the next application.
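That schedule-driven behaviour can be sketched in a few lines. The window times come from the article's example; the tier labels and function name are hypothetical:

```python
from datetime import time as clock

# Pin VDI golden images to fast storage only during the 07:30-09:30
# boot storm (per the example above), freeing the flash tier afterwards.
BOOT_STORM_START = clock(7, 30)
BOOT_STORM_END = clock(9, 30)

def tier_for_vdi_images(now):
    """Return the tier the VDI images should occupy at this moment."""
    if BOOT_STORM_START <= now <= BOOT_STORM_END:
        return "ram+ssd"   # absorb the login surge
    return "hdd"           # release flash for other workloads, e.g. reports
```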

Storage virtualization eradicates expensive storage silos and eliminates complexity by providing one dashboard from which you can view and manage all storage assets, making capacity planning vastly simpler. If you need more performance, it is as easy as moving a slider, without downtime or disruption to users or applications.

If there’s not enough RAM or flash, the system will tell you; you can then buy commodity RAM and flash and add it to the network, where it is instantly available to any application that needs it.

The latest storage virtualization solutions are built using an Objective-Defined Storage platform. Data from all existing storage hardware is absorbed into the single virtual storage pool.

For each application or group of applications, simply set the objectives – performance (IOPS or latency), capacity, and protection (number of live instances, snap copies) – and everything else is fully orchestrated, driven by the latest in artificial intelligence and swarm technologies.
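To make the objective model concrete, here is a minimal sketch of what an objective set and a compliance check might look like. All field names and thresholds are hypothetical – each vendor exposes its own schema – and this is an illustration of the concept, not ioFabric's actual interface:

```python
# Hypothetical objective set for one application or application group.
oracle_objectives = {
    "application": "oracle-finance",
    "performance": {"min_iops": 50_000, "max_latency_ms": 2},
    "capacity": {"provisioned_tb": 10},
    "protection": {"live_instances": 2, "snap_copies": 24},
}

def meets_objectives(measured, objectives):
    """Check measured service levels against the declared objectives.

    An orchestration layer would run a check like this continuously and
    re-tier or replicate data whenever an objective is being missed.
    """
    perf = objectives["performance"]
    return (measured["iops"] >= perf["min_iops"]
            and measured["latency_ms"] <= perf["max_latency_ms"])
```

The point of the model is that the administrator declares *outcomes*; the platform decides which hardware delivers them.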

The outcome of migrating to an open storage virtualization platform such as Objective-Defined Storage is simple: if data is important, and required to be always on, always available and always fast, it is time to consider Objective-Defined Storage.

Deployment is simple – typically less than one hour – and can proceed one application at a time, without any changes to the application or to how users access their data.

Note: when looking at storage virtualization solutions, it is critical to ensure the platform is open and can manage any storage from any vendor; otherwise you may find yourself locked into a storage hardware platform and a storage silo. Just remember your server virtualization deployments – it didn’t matter whether you had HP, Dell, Lenovo or IBM servers; the virtualization platform worked across all of them.

Finally, storage virtualization and Objective-Defined Storage deliver data freedom from storage hardware, and enable digital transformation.

Article by Greg Wyman, VP Asia Pacific, ioFabric
