
Storage squeeze: Why 2016 is the year of consolidation in the storage industry

28 Apr 16

Article by Tim Jones, senior technical specialist at Tintri

Today’s storage market is a tussle between old and new. Enterprise storage technology has changed considerably since the late nineties, and the advent of both flash and VM-aware storage is shaking up the landscape further still.

So how will the industry cope?

Storage through the years

When I started out, enterprise applications predominantly ran on centralised compute platforms. Skip ahead several years and the market had changed significantly: mid-range and micro (PC)-based server systems were proliferating, and most used RAID (redundant array of inexpensive disks). At this time, each server was essentially purpose-built, with storage sized to meet the needs of its application.

In the late nineties and early 2000s, Storage Area Network (SAN) systems were introduced. These systems eased the management difficulties of decentralised servers and changed how enterprise storage was used and consumed. Shortly after this, Network Attached Storage (NAS) came onto the scene, providing storage to the masses and offering another option for shared server storage.

Having these systems meant the overall environment could be more efficient and still provide good throughput to client systems. If you required high speed data storage, the usual outcome was an entire rack filled with 73GB drives – the more disk spindles you had, the greater the performance. Hence for these systems, performance was expensive and capacity was cheap.
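The spindle arithmetic above can be sketched with some rough, illustrative figures (the per-drive IOPS number is an assumption typical of 15k RPM drives of that era, not a vendor spec): aggregate random I/O performance of a striped array scales roughly linearly with spindle count, which is why a high-IOPS target forced you to buy far more 73GB drives, and far more capacity, than the data itself needed.

```python
# Back-of-the-envelope sketch of spindle-era storage economics.
# Figures are illustrative assumptions, not vendor specifications.

PER_SPINDLE_IOPS = 180    # assumed random IOPS for one 15k RPM drive
DRIVE_CAPACITY_GB = 73    # the small drives mentioned in the article

def array_profile(spindles: int) -> dict:
    """Approximate aggregate performance and capacity of a striped array."""
    return {
        "spindles": spindles,
        "iops": spindles * PER_SPINDLE_IOPS,
        "capacity_gb": spindles * DRIVE_CAPACITY_GB,
    }

# To hit, say, 25,000 IOPS you needed on the order of 140 spindles,
# dragging along ~10TB of capacity whether the workload needed it or not.
needed = 25_000 // PER_SPINDLE_IOPS + 1
print(array_profile(needed))
```

Under these assumptions, performance (spindles) drives the purchase and capacity comes along almost as a by-product, which is exactly the "performance expensive, capacity cheap" dynamic described above.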

A new datacentre technology emerged around 2007: virtualisation. It brought abstraction not only at the storage level but at the server level too. As a result, storage design was no longer the simple matter it had been when purpose-building a server for SQL in 1998. We now needed storage that was fast and large enough for multiple systems at once, and a way to diagnose problems when one of those systems misbehaved.

But the real turning point for storage came in 2010, when SSD and flash systems reached the market and the traditional economics of high-speed storage were flipped on their head: performance was cheap, capacity expensive.

Where’s the market heading?

What we now have is a slew of new players in the enterprise storage marketplace and a revolution in SAN and NAS system architectures to support SSD. With performance as table stakes, it’s management effort that’s the current differentiator between vendors.

According to Tintri’s annual State of Storage report, which surveyed hundreds of datacentre professionals globally, manageability is now acknowledged as the biggest storage pain point, leapfrogging performance as the greatest thorn in a datacentre’s side.

When asked what steps they were taking to address these challenges, 68 percent of datacentre professionals said they were evaluating new technologies and 48 percent were evaluating new storage vendors.

We are also witnessing growing momentum behind VM-aware storage (VAS) arrays. VAS is specifically designed to overcome the shortcomings of the highly abstracted environment created by virtualisation, which now covers more than 90 percent of enterprise server workloads. Indeed, Tintri’s State of Storage report found that 52 percent of organisations are looking into virtualisation-specific VM-aware storage.

These macro trends are putting the squeeze on legacy storage providers that lack the agility to respond. Dell and EMC are merging into an even larger (and presumably less flexible) entity. NetApp has announced layoffs in the wake of declining product revenues. And other upstarts have struggled to find their footing amidst all the chop.

With all the change happening in the technology and in the market, now is the time to stay focused on the players who have the most compelling all-flash and VM-aware storage offerings. That’s the way to avoid the tussle and keep storage simple.

