
Opinion: 5 data center ideas that would never have been imagined 50 years ago

28 Jun 18

Article by Schneider Electric global marketing vice president Abby Gabriel

If you could go back in time, how would you possibly explain modern data center concepts to a late-1960s “data processing programmer”? In those days it was not uncommon to see a paper sign posted in the data center reading “The computer is down today”. Computer systems were big and finicky. Not many people knew how to run them or fix them. In fact, the building that housed THE computer was not called a data center at all; “processing center” was the more familiar nomenclature.

Most of these processing centers were owned, operated and maintained by big banks. Trucks loaded with paper would arrive in the evening, and the data would be “crunched” overnight in the processing center. Printouts would be created and sent back to bank branches the next day. State of the art. Right?

Given the realities of that bygone era, how would our late-1960s processing operator react to the following list of modern data center “unimaginables”?

1. “You don’t need the utility to keep your data center running.”

Data centers used to rely on utility power alone. If utility power failed, you were out of luck. Now, in addition to elaborate power backup plans (supported by UPS and generators), power devices within racks are modular and hot-swappable. If one fails, the other power modules take on the added load, and the failed module can be replaced by simply sliding it out and slotting in a new one. All without interruption, and invisible to the end user, who may be sitting thousands of miles away.

2. “You need to cool your data center? Use outside air.”

Computer equipment was highly sensitive in the old days, and internal environmental conditions had to be precisely controlled. Today, eco-mode cooling techniques that deploy a variety of economizer technologies allow data center owners to save significant money by harnessing Mother Nature to cool their data centers, all without risking downtime.

3. “If you’re short of qualified staff, just outsource.”

The concept of outsourcing was virtually unknown in the late 1960s. Trusting your computer operations to any outside organization was unthinkable. Today, even the most specialized aspects of data center operations can be outsourced to any number of highly qualified experts.

4. “If you’re short on space, just order a data center in a box.”

Today, the popularity of pre-fab data centers is on the rise.  The power, cooling, and racks required are all pre-configured and preassembled for rapid delivery and for immediate “plug and play” upgrades, or for quickly commissioning “edge” data centers that help support bandwidth-intensive remote applications.

5. “To avoid downtime due to component failure, practice predictive maintenance.”

As the “Internet of Things” (IoT) revolution accelerates into full swing, it is now possible to gather much more precise data on data center and facility equipment performance, and to analyze that data to predict impending failures with far greater accuracy. Such practices can save millions of dollars each year compared with the break/fix and preventative approaches many businesses still rely on for maintaining their data centers.
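To make the idea concrete, predictive maintenance in its simplest form fits a trend to equipment sensor readings and flags a component before it crosses a failure threshold, so a technician can swap it on a schedule rather than after an outage. The sketch below is purely illustrative (the UPS battery temperatures and the 40°C threshold are invented for the example), not any vendor's actual implementation:

```python
# Illustrative trend-based predictive maintenance sketch.
# All sensor values and thresholds below are invented for the example.

def predict_hours_to_threshold(readings, threshold):
    """Fit a least-squares line to hourly sensor readings and estimate
    how many hours remain until the trend crosses `threshold`.
    Returns None if the trend is flat or improving."""
    n = len(readings)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    # Least-squares slope and intercept.
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
    den = sum((x - mean_x) ** 2 for x in xs)
    slope = num / den
    intercept = mean_y - slope * mean_x
    if slope <= 0:
        return None  # Not trending toward failure.
    crossing = (threshold - intercept) / slope   # Hour index at threshold.
    hours_left = crossing - (n - 1)              # Hours from the last sample.
    return max(hours_left, 0.0)

# Hypothetical UPS battery temperatures (°C), sampled hourly.
temps = [30.0, 30.5, 31.1, 31.4, 32.0, 32.6]
hours = predict_hours_to_threshold(temps, threshold=40.0)
if hours is not None:
    print(f"Estimated {hours:.0f} hours until threshold - schedule a swap.")
```

Real deployments replace the straight-line fit with far richer models over many sensors, but the principle is the same: act on the predicted crossing, not the failure itself.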

Nostalgia is nice, but in this day and age, technology advancements are too good to ignore.

Data centers have come a long way, especially in the area of integrated data center architectures.

Schneider Electric’s EcoStruxure IT architecture, for example, can be delivered to end users through reference designs, pre-configured solutions, and prefabricated solutions. It can be configured as an entire data center or it can start out as an infrastructure product, like an Uninterruptible Power Supply (UPS) that is managed through the cloud and supported with a 24/7 service bureau.

It can be deployed all at once or it can be built in stages or pieces. EcoStruxure IT consists of three layers (connected products, edge control, and analytics) that are integrated to facilitate IoT connectivity and mobility, cloud analytics, and cybersecurity.

What is the benefit of embracing such an open data center architecture? Your next big idea can be delivered in a compressed time frame, helping your data center produce business value and drive your organization’s competitive advantage.
