
Opinion: 5 data center ideas that would never have been imagined 50 years ago

28 Jun 2018

Article by Schneider Electric global marketing vice president Abby Gabriel

If you could go back in time, how would you possibly explain modern data center concepts to a late-1960s “data processing programmer”? In those days it was not uncommon to see a paper sign posted in the data center reading “The computer is down today”. Computer systems were big and finicky, and not many people knew how to run or fix them. In fact, the building that housed THE computer was not called a data center at all; “processing center” was the more familiar nomenclature.

Most of these processing centers were owned, operated and maintained by big banks. Trucks loaded with paper would arrive in the evening, and the data would be “crunched” overnight in the processing center. Printouts would be created and sent back to bank branches the next day. State of the art. Right?

Given the realities of that bygone era, how would our late-1960s processing operator react to the following list of modern data center “unimaginables”?

1. “You don’t need the utility to keep your data center running.”

Data centers used to rely on utility power alone; if utility power failed, you were out of luck. Now, in addition to elaborate backup power plans (supported by UPSs and generators), power devices within racks are modular and hot-swappable. If one module fails, the others take on the added load, and the failed module can be replaced by simply sliding it out and sliding in a new one. All without interruption, and invisible to the end user who may be sitting thousands of miles away.

2. “You need to cool your data center? Use outside air.”

Computer equipment was highly sensitive in the old days, and internal environmental conditions had to be precisely controlled. Today, eco-mode cooling techniques that deploy a variety of economizer technologies allow data center owners to save a great deal of money by using the power of Mother Nature to cool their data centers, all without risking downtime.

3. “If you’re short of qualified staff, just outsource.”

The concept of outsourcing was virtually unknown in the late 1960s. Trusting your computer operations to an outside organization was unthinkable. Today, even the most specialized aspects of data center operations can be outsourced to any number of highly qualified experts.

4. “If you’re short on space, just order a data center in a box.”

Today, the popularity of prefabricated data centers is on the rise. The power, cooling, and racks required are all pre-configured and preassembled for rapid delivery and immediate “plug and play” upgrades, or for quickly commissioning “edge” data centers that support bandwidth-intensive remote applications.

5. “To avoid downtime due to component failure, practice predictive maintenance.”

As the “Internet of Things” (IoT) revolution accelerates into full swing, it is now possible to gather far more precise data on data center and facility equipment performance, and to analyze that data to predict failures before they occur. Such practices can save millions of dollars each year compared with the break/fix and preventative approaches many businesses still rely on to maintain their data centers.
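The idea behind predictive maintenance is simple: watch a telemetry stream for a drifting trend and schedule service before the trend crosses an alarm threshold. The sketch below is a minimal, hypothetical illustration (the battery-temperature readings and the 45 °C threshold are invented for the example, not drawn from any real product): it fits a least-squares line to recent readings and projects when the alarm level would be reached.

```python
# Minimal predictive-maintenance sketch. Assumes hypothetical hourly
# temperature readings from a UPS battery cabinet; real deployments use
# many sensors and far more sophisticated models.

def hours_until_threshold(readings, threshold):
    """Fit a least-squares line to (hour, reading) pairs and return the
    projected hours from now until the trend crosses `threshold`,
    or None if the trend is flat or falling (nothing to predict)."""
    n = len(readings)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    if slope <= 0:
        return None  # temperature steady or cooling: no upward drift
    intercept = mean_y - slope * mean_x
    # Solve threshold = slope * t + intercept, relative to "now" (t = n - 1).
    t_cross = (threshold - intercept) / slope
    return max(0.0, t_cross - (n - 1))

# Example: temperature creeping up about 0.5 °C per hour toward a 45 °C alarm,
# leaving roughly a day to dispatch a technician before the alarm trips.
temps = [30.0, 30.5, 31.1, 31.4, 32.0, 32.6]
print(hours_until_threshold(temps, 45.0))
```

The point of the projection is the lead time: instead of reacting to the alarm (break/fix) or swapping parts on a fixed calendar (preventative), maintenance is dispatched only when, and just before, the data says it is needed.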

Nostalgia is nice, but in this day and age, technology advancements are too good to ignore.

Data centers have come a long way, especially in the area of integrated data center architectures.

Schneider Electric’s EcoStruxure IT architecture, for example, can be delivered to end users through reference designs, pre-configured solutions, and prefabricated solutions. It can be configured as an entire data center or it can start out as an infrastructure product, like an Uninterruptible Power Supply (UPS) that is managed through the cloud and supported with a 24/7 service bureau.

It can be deployed all at once or it can be built in stages or pieces. EcoStruxure IT consists of three layers (connected products, edge control, and analytics) that are integrated to facilitate IoT connectivity and mobility, cloud analytics, and cybersecurity.

What is the benefit of embracing such an open data center architecture? Your next big idea can be delivered over a compressed time period, helping your data center produce business value and drive your organization’s competitive advantage.
