
Data center modernization: A plan for how much to outsource

24 May 2017

To remain competitive, businesses must modernize their data centers to reflect changing marketplace conditions and to accommodate shifting business goals. An aging data center can quickly become a drag on business growth.

According to Gartner, up to 74% of IT expenses in aging data centers are dedicated to operations and maintenance, while only 26% go to business innovation, and the disparity widens as the data center ages.

The spectrum for how to modernize an existing data center is quite wide, and the cost/benefit scenarios differ.

Points on the spectrum include:

  • upgrading an existing facility;
  • consolidating small server rooms and/or wiring closets into a larger data center;
  • a complete new build-out;
  • rapidly adding capacity via prefabricated power and cooling modules;
  • migrating the data center to a colocation facility;
  • offloading certain applications and platforms to private or public cloud;
  • or some combination of these options.

One critical decision point involves how much of the data center operation remains “in house” and how much is outsourced to the cloud and/or colocation facilities.

At this stage of the decision-making process, both quantitative and qualitative differences need to be considered. A 10-year total cost of ownership (TCO) analysis may favor upgrading an existing data center, or building a new one, over outsourcing.

However, the TCO business case may be overwhelmed by a business’ sensitivity to cash flow, cash cross-over point, deployment timeframe, data center life expectancy, regulatory requirements, and other strategic factors.

When trying to decide where you are on the modernization spectrum (and where you’d like to be), three important categories of variables need to be identified: data center physical characteristics, build cost variables and outsourcing cost variables.

Let’s explore these three elements.

  • Data center physical / environmental characteristics – These variables include electricity rate values, current IT load, expected final IT load (consider known business risks, potential market changes, and capacity plan), expected time interval between current state and anticipated state of development, power density (impacts the amount of space needed for IT equipment in the data center), and cost of capital.
  • Build cost variables – This category includes the cost per watt of building out in an identified location, building lease cost per month, bandwidth costs (which vary with redundancy, geography and carrier pricing), the yearly percentage increase in operational expense (electricity, maintenance and staffing), and data center efficiency, measured as Power Usage Effectiveness (PUE).
  • Outsourcing cost variables – These include rack space cost (monthly cost of leasing a single rack), space reservation cost (reserving space in case you want to expand rack footprint in the future), space cost increase, power circuit cost (this cost increases with additional redundancy and density), bandwidth cost (this cost depends more on the application and less on the power capacity).

Once these variables are identified, it becomes easier to evaluate cash flow and total cost of ownership scenarios, and to make the decision that best suits your business.
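As a rough illustration of how these variables feed a 10-year TCO comparison, the sketch below discounts yearly costs for an in-house build versus a colocation lease using the cost of capital. Every figure (load, cost per watt, PUE, rack rates, escalation) is a hypothetical assumption for the example, not data from the article, and a real model would also cover staffing, space reservation, bandwidth and migration costs.

```python
# Hypothetical 10-year TCO sketch: in-house build vs. colocation lease.
# All input figures are illustrative assumptions, not vendor data.

def npv(cashflows, discount_rate):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + discount_rate) ** year
               for year, cf in enumerate(cashflows))

def build_tco(it_load_kw, cost_per_watt, pue, electricity_rate_kwh,
              opex_escalation, discount_rate, years=10):
    """TCO of building out capacity in-house: up-front capex plus
    escalating electricity-driven opex."""
    capex = it_load_kw * 1000 * cost_per_watt        # one-time build cost
    annual_kwh = it_load_kw * pue * 8760             # facility draw incl. cooling losses
    flows = [capex]
    opex = annual_kwh * electricity_rate_kwh
    for _ in range(years):
        flows.append(opex)
        opex *= 1 + opex_escalation                  # yearly opex increase
    return npv(flows, discount_rate)

def colo_tco(racks, rack_cost_month, power_circuit_month,
             cost_escalation, discount_rate, years=10):
    """TCO of leasing colocation space: no build capex, recurring
    rack-space and power-circuit fees that escalate yearly."""
    annual = racks * 12 * (rack_cost_month + power_circuit_month)
    flows = [0.0]
    for _ in range(years):
        flows.append(annual)
        annual *= 1 + cost_escalation
    return npv(flows, discount_rate)

# Example scenario (all numbers hypothetical):
build = build_tco(it_load_kw=500, cost_per_watt=10.0, pue=1.6,
                  electricity_rate_kwh=0.10, opex_escalation=0.03,
                  discount_rate=0.08)
colo = colo_tco(racks=50, rack_cost_month=1200, power_circuit_month=800,
                cost_escalation=0.03, discount_rate=0.08)
print(f"Build 10-yr TCO: ${build:,.0f}")
print(f"Colo  10-yr TCO: ${colo:,.0f}")
```

Discounting matters here because, as the article notes, a TCO winner on paper can still lose on cash flow: the build option front-loads capital, while colocation spreads cost over the lease term.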

When undertaking a data center modernization project, keep in mind that the various options on the spectrum need not be treated as separate and distinct approaches.

Data center stakeholders may want to combine options in order to better accommodate a particular migration timeline. Or cautious executives may want to simply dabble with the outsourcing option by piloting only a few select applications while still maintaining a core corporate data center.

The key success factor is recognizing that data center modernization is not a one-time fix, but a critical piece of an ongoing strategy to better serve customers.

Article by Martin Heller, Schneider Electric Data Center Blog
