
Speak like a data center geek: Virtualization

09 Dec 2016

Our blog series, “How to Speak Like a Data Center Geek,” works to demystify the complexities of the data center, and this time, we’re taking on virtualization. This is a topic that has managed to stay on the cutting edge for more than five decades, ever since virtualization technology first emerged in the 1960s.

Virtualization is, in some sense, about illusion, though not the kind that involves, um, spitting out live frogs. It can create what the O’Reilly Media book “Virtualization: A Manager’s Guide” calls “the artificial view that many computers are a single computing resource or that a single machine is really many individual computers.”

Or: “It can make a single large storage resource appear to be many smaller ones or make many smaller storage devices appear to be a single device.”

The goals of virtualization include:

  • Higher performance levels
  • Improved scalability and agility
  • Better reliability/availability
  • A unified security and management domain

Whatever the goal, odds are virtualization technology is at work in your data center right now. In this “Data Center Geek” entry, we’ll look at a few different layers of virtualization.

Virtualization

First, we start with a baseline definition. Virtualization is a way to abstract applications and their underlying components from the hardware supporting them and present a logical, or virtual, view of these resources. This logical view may be strikingly different from the physical view.

Consider a virtually partitioned hard drive, for example. Physically, it’s plainly just one hard drive. But virtualization allows us to impose a logical division that creates two separate drives that operate independently, making more flexible use of the same physical device.
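That illusion can be sketched in a few lines. Here’s a toy model (the class and method names are ours for illustration, not any real disk API) that presents one physical byte store as two independent logical drives, simply by translating each partition’s offsets onto its own slice of the disk:

```python
# Toy sketch: one physical "disk" (a byte array) presented as two
# independent logical partitions via offset translation.
# Hypothetical names for illustration only -- not a real disk API.

class PhysicalDisk:
    def __init__(self, size):
        self.blocks = bytearray(size)

class LogicalPartition:
    """A virtual view: its own offset 0, mapped onto a slice of the disk."""
    def __init__(self, disk, start, size):
        self.disk, self.start, self.size = disk, start, size

    def write(self, offset, data):
        assert offset + len(data) <= self.size, "out of partition bounds"
        base = self.start + offset
        self.disk.blocks[base : base + len(data)] = data

    def read(self, offset, length):
        base = self.start + offset
        return bytes(self.disk.blocks[base : base + length])

disk = PhysicalDisk(1024)              # one physical drive
c = LogicalPartition(disk, 0, 512)     # "drive C"
d = LogicalPartition(disk, 512, 512)   # "drive D"
c.write(0, b"hello")
d.write(0, b"world")
print(c.read(0, 5), d.read(0, 5))      # each partition is independent
```

Each partition believes it starts at offset zero; only the translation layer knows they share one device.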

Access virtualization

This layer allows individuals to work from wherever they are, while using whatever networking media and whatever endpoint device is available. Access virtualization technology makes it possible for nearly any type of device to access nearly any type of application without forcing the individual or the application to know too much about the underlying technology.

Application virtualization

This technology works above the operating system, making it possible for applications to be encapsulated and allowing them to execute on older or newer operating systems that would normally pose incompatibilities. Some forms of this technology allow applications to be “streamed” down to remote systems, execute there and then be removed. This approach can increase levels of security and prevent data loss.
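The “stream down, execute, remove” idea can be captured in a deliberately tiny sketch. This is not how any real product does it; it just illustrates the lifecycle, with the function and variable names being our own assumptions:

```python
# Toy sketch of "streamed" application virtualization: the application
# arrives as code, runs in its own isolated namespace, and is removed
# afterward -- nothing persists on the endpoint. Illustrative only.

def stream_and_run(app_source, entry="main"):
    sandbox = {}                 # isolated namespace for the streamed app
    exec(app_source, sandbox)    # "stream" the app in and load it
    result = sandbox[entry]()    # execute its entry point
    sandbox.clear()              # remove the app when finished
    return result

app = """
def main():
    return "report generated"
"""
print(stream_and_run(app))       # -> report generated
```

The security benefit the paragraph mentions comes from that last step: once the run finishes, the application and its data are gone from the remote system.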

Processing virtualization

This technology is the current media darling. This layer hides the physical hardware configuration from system services, operating systems or applications. One type makes it possible for one system to appear to be many, so it can support many independent workloads; the other makes it possible for many systems to be viewed as a single computing resource.
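Both directions can be sketched in miniature. The model below is a rough assumption of ours (a real hypervisor or cluster scheduler is vastly richer): one-to-many carves a big host into smaller virtual machines, and many-to-one pools several hosts into a single apparent resource:

```python
# Toy sketch of both directions of processing virtualization.
# Hypothetical names; real hypervisors and schedulers are far richer.

class Host:
    def __init__(self, name, cpus, ram_gb):
        self.name, self.cpus, self.ram_gb = name, cpus, ram_gb

def carve_vms(host, vm_specs):
    """One-to-many: one physical host appears as several smaller machines."""
    vms, cpus, ram = [], host.cpus, host.ram_gb
    for name, c, r in vm_specs:
        if c <= cpus and r <= ram:          # only admit a VM that still fits
            cpus, ram = cpus - c, ram - r
            vms.append(Host(name, c, r))
    return vms

def pool(hosts):
    """Many-to-one: several hosts appear as a single computing resource."""
    return Host("pool", sum(h.cpus for h in hosts),
                sum(h.ram_gb for h in hosts))

big = Host("server1", 32, 256)
vms = carve_vms(big, [("vm-a", 8, 64), ("vm-b", 8, 64)])
cluster = pool([Host("n1", 16, 128), Host("n2", 16, 128)])
print(len(vms), cluster.cpus, cluster.ram_gb)   # 2 32 256
```

The workloads on `vm-a` and `vm-b` never see each other, and a job submitted to `pool` never needs to know which physical node runs it.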

Network virtualization

This layer can hide the actual hardware configuration from systems, making it possible for many groups of systems to share a single, high-performance network while thinking each of those groups has a network all to itself. See? Illusion.

Network virtualization can also use system memory for caching, or system processors for compression and the elimination of redundant data, to enhance network performance.
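The “each group thinks the network is all its own” trick works roughly like VLAN tagging: every frame carries a virtual-network ID, and each tenant only ever sees frames with its own tag. A hypothetical, bare-bones sketch (our names, not a real networking API):

```python
# Toy sketch of network virtualization via VLAN-style tagging: many
# tenant groups share one physical wire, each seeing only its own
# traffic. Hypothetical model, not a real networking API.

def send(wire, vlan_id, payload):
    wire.append((vlan_id, payload))     # tag each frame with its virtual LAN

def receive(wire, vlan_id):
    """A tenant sees only the frames tagged for its own virtual network."""
    return [p for vid, p in wire if vid == vlan_id]

wire = []                               # one shared physical network
send(wire, 10, "blue: hello")
send(wire, 20, "green: hi")
send(wire, 10, "blue: bye")
print(receive(wire, 10))                # ['blue: hello', 'blue: bye']
```

The blue and green groups share the same wire, yet neither ever observes the other’s traffic. See? Illusion.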

Storage virtualization

Like the network virtualization layer, this layer hides where storage systems are and what type of device is actually storing applications and data. It allows many systems to share the same storage devices without knowing that others are also accessing them. This technology also makes it possible to take a snapshot of a live system so that it can be backed up without hindering online or transactional applications.
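The live-snapshot trick is typically done with copy-on-write: the backup reads a frozen point-in-time view while the live volume keeps accepting writes. A minimal sketch, with names of our own invention rather than any real storage API:

```python
# Toy sketch of a copy-on-write snapshot: a backup reads a frozen
# view while the live volume keeps changing. Illustrative names only.

class Volume:
    def __init__(self, blocks):
        self.blocks = dict(blocks)
        self.snapshots = []

    def snapshot(self):
        snap = {}                        # starts empty: nothing copied yet
        self.snapshots.append(snap)
        return snap

    def write(self, block_id, data):
        for snap in self.snapshots:
            if block_id not in snap:     # copy-on-write: preserve the old
                snap[block_id] = self.blocks.get(block_id)  # block first
        self.blocks[block_id] = data

    def read_snapshot(self, snap, block_id):
        # Unchanged blocks are read straight from the live volume.
        return snap.get(block_id, self.blocks.get(block_id))

vol = Volume({0: "jan", 1: "feb"})
snap = vol.snapshot()            # point-in-time view for the backup
vol.write(0, "mar")              # the live system keeps writing
print(vol.blocks[0], vol.read_snapshot(snap, 0))   # mar jan
```

The backup job walks `snap` at its leisure and always sees January’s data, while transactions continue against the live volume undisturbed.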

Article by Jim Poole, Equinix blog network
