
Schneider Electric: Water, water, everywhere, nor any drop to drink…

23 May 2017

The data center industry is sitting on a massive resource which could enable huge financial savings.

I’ve taken the title for this blog from a famous English poem, “The Rime of the Ancient Mariner” by Samuel Taylor Coleridge.

The verses relate the experiences of a sailor returned from a long sea journey. During his voyage the ship he is on is becalmed in uncharted waters near the equator.

Those aboard are tormented to death by thirst. At the same time, the ship itself is floating in the solution to the crew’s problems. Water everywhere, but nothing to drink.

I believe this is a potent metaphor for our industry. But this is not a blog about water use in the industry, as important as that is. This blog is about the ubiquity of a resource which could be instrumental in delivering important productivity and efficiency gains.

I’m talking, of course, about data. Even today, despite all the uses we put data to, there’s an ironic resistance to making the most of what is, for most data centers, a free resource.

The entire data center sector exists to support the communication, processing and storage of data. But besides the data we are handling, we also have data pouring out of the physical infrastructure which provisions a mountain of IT and network equipment.

Together with sensors monitoring everything from airflow to temperature and humidity, we already have a potential embarrassment of riches which can be readily tapped into.

When Data Center Infrastructure Management (DCIM) software was first defined and mooted as a way to improve how we design, operate and upgrade facilities, a perceived drawback to market adoption was the cost and complexity of instrumentation.

Without instrumentation, the argument went, the data required to deliver on the promises of DCIM could not be gathered. Less than a decade later, nothing could be further from the truth.

In my opinion, we already have enough temperature sensors installed in data center cabinets around the globe to make sweeping operational improvements and efficiency cost reductions.

What’s more, with a little analytics and very little effort, this information and guidance could be freely and securely accessed by any data center owner or operator within a very short timescale. I’m talking days or weeks, not months or years.
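To make the idea concrete, here is a minimal sketch of the kind of analytics I have in mind. The cabinet names, readings and function are hypothetical illustrations, not a real product API: it takes inlet-temperature readings already being collected in cabinets and flags any cabinet running below the ASHRAE-recommended lower bound of 18 °C, a common sign of overcooling and an easy efficiency win.

```python
# Minimal sketch: flag overcooled cabinets from existing sensor data.
# All names and readings below are hypothetical sample data.
from statistics import mean

ASHRAE_LOW_C = 18.0  # ASHRAE-recommended inlet-temperature lower bound

# Hypothetical readings: cabinet id -> recent inlet temperatures (deg C)
readings = {
    "cab-01": [16.2, 16.5, 16.1, 16.4],
    "cab-02": [21.0, 21.3, 20.8, 21.1],
    "cab-03": [17.0, 17.2, 16.8, 17.1],
}

def overcooled_cabinets(readings, low=ASHRAE_LOW_C):
    """Return cabinets whose average inlet temperature is below `low`."""
    return sorted(cab for cab, temps in readings.items()
                  if mean(temps) < low)

print(overcooled_cabinets(readings))  # -> ['cab-01', 'cab-03']
```

A few lines like these, pointed at data the sensors are already producing, are enough to tell an operator where cooling setpoints could safely be raised.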

In recent years it’s become fashionable to say that anything that appears too good to be true probably is.

However, big data and data analytics have already shown their worth in a variety of uses from disaster response and pest control, to crop yields and drug interactions.

Researchers at Trinity College, Dublin are even using big data to solve challenges created by the demand for big data.

A recent white paper from Schneider Electric’s Data Center Science Center discusses big data analytics alongside six other trends influencing data center monitoring: cloud computing, mobile computing, machine learning, embedded-system performance and cost improvements, automation for labor savings, and cybersecurity.

If the data center industry is going to thrive sustainably, then we need to take advantage of the very technology which we provision. That’s not exactly rocket science.

The commercial and environmental drivers for increased efficiency at lower cost will only harden in the coming decade. We need to make good decisions by getting the right data into the hands of data center professionals.

If we don’t respond in the immediate future, I fear we’ll share the fate of the ancient mariner’s crew.

Article by Kim Povlsen, Schneider Electric Data Center Blog
