
Speak like a data center geek: Virtualization

09 Dec 2016

Our blog series, “How to Speak Like a Data Center Geek,” works to demystify the complexities of the data center, and this time, we’re taking on virtualization. It’s a topic that has managed to stay on the cutting edge for more than five decades, ever since virtualization technology was first applied to software in the 1960s.

Virtualization is, in some sense, about illusion, though not the kind that involves, um, spitting out live frogs. It can create what the O’Reilly Media book “Virtualization: A Manager’s Guide” called “the artificial view that many computers are a single computing resource or that a single machine is really many individual computers.”

Or: “It can make a single large storage resource appear to be many smaller ones or make many smaller storage devices appear to be a single device.”

The goals of virtualization include:

  • Higher performance levels
  • Improved scalability and agility
  • Better reliability/availability
  • A unified security and management domain

Whatever the goal, odds are virtualization technology is at work in your data center right now. In this “Data Center Geek” entry, we’ll look at a few different layers of virtualization.

Virtualization

First, we start with a baseline definition. Virtualization is a way to abstract applications and their underlying components from the hardware supporting them, and to present a logical, or virtual, view of these resources. This logical view may be strikingly different from the physical view.

Consider a virtually partitioned hard drive. Physically, it’s plainly just one drive. But virtualization lets us impose a logical division that presents it as two separate drives, each operating independently of the other.
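To make the illusion concrete, here’s a minimal Python sketch of the idea (all names are invented for illustration; a real partition table is managed by the operating system, not by application code):

    # Toy model: one physical disk presented as two independent logical drives.
    class PhysicalDisk:
        def __init__(self, num_blocks):
            self.blocks = [b""] * num_blocks  # the single real device

    class LogicalDrive:
        """A virtual drive: a window onto one slice of the physical disk."""
        def __init__(self, disk, offset, size):
            self.disk, self.offset, self.size = disk, offset, size

        def write(self, block, data):
            assert 0 <= block < self.size  # each drive sees only its own blocks
            self.disk.blocks[self.offset + block] = data

        def read(self, block):
            assert 0 <= block < self.size
            return self.disk.blocks[self.offset + block]

    disk = PhysicalDisk(1000)
    c_drive = LogicalDrive(disk, offset=0, size=500)    # looks like its own disk
    d_drive = LogicalDrive(disk, offset=500, size=500)  # so does this one
    c_drive.write(0, b"hello")
    print(d_drive.read(0))  # b'' -- the two drives never see each other's data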

Access virtualization

This layer lets individuals work from wherever they are, over whatever network and on whatever endpoint device is available. Access virtualization technology makes it possible for nearly any type of device to reach nearly any type of application without forcing either the individual or the application to know much about the underlying technology.

Application virtualization

This technology works above the operating system, encapsulating applications so they can run on older or newer operating systems that would normally be incompatible. Some forms of it allow applications to be “streamed” to remote systems, executed there and then removed, an approach that can raise security levels and prevent data loss.
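The “stream, execute, remove” pattern can be illustrated with a toy Python sketch (a deliberately simplified model, not how any particular product actually works):

    import os, runpy, tempfile

    def stream_and_run(app_source: str):
        """'Stream' an application to this machine, run it, then remove it."""
        with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
            f.write(app_source)   # the encapsulated app arrives over the wire
            path = f.name
        try:
            runpy.run_path(path)  # execute it in its own namespace
        finally:
            os.remove(path)       # nothing persists on the endpoint

    stream_and_run('print("running remotely, leaving no trace")')

Because the application never permanently lands on the endpoint, there is less for an attacker to steal and less for a user to accidentally leak.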

Processing virtualization

This technology is the current media darling. This layer hides the physical hardware configuration from system services, operating systems or applications. One type makes a single system appear to be many, so it can support many independent workloads. The other makes many systems appear to be a single computing resource.
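Both directions can be sketched in a few lines of Python (a toy model with invented names, not a real hypervisor or cluster manager):

    # One-to-many: a "hypervisor" slices one host into several virtual machines.
    class Host:
        def __init__(self, cpus, ram_gb):
            self.cpus, self.ram_gb = cpus, ram_gb

    def carve_vms(host, vm_specs):
        """Hand out slices of a single host; this toy refuses to oversubscribe."""
        cpus_left, ram_left = host.cpus, host.ram_gb
        vms = []
        for name, cpus, ram in vm_specs:
            assert cpus <= cpus_left and ram <= ram_left, "host exhausted"
            cpus_left -= cpus
            ram_left -= ram
            vms.append({"name": name, "cpus": cpus, "ram_gb": ram})
        return vms

    # Many-to-one: a "cluster" presents several hosts as one computing resource.
    def pool(hosts):
        return {"cpus": sum(h.cpus for h in hosts),
                "ram_gb": sum(h.ram_gb for h in hosts)}

    big = Host(cpus=32, ram_gb=256)
    print(carve_vms(big, [("web", 4, 16), ("db", 8, 64)]))  # one looks like many
    print(pool([Host(8, 64), Host(8, 64), Host(8, 64)]))    # many look like one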

Network virtualization

This layer can hide the actual hardware configuration from systems, making it possible for many groups of systems to share a single, high-performance network while thinking each of those groups has a network all to itself. See? Illusion.

Network virtualization can also use system memory to provide caching, or system processors to provide compression or deduplication, to enhance network performance.
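The trick can be sketched in the spirit of VLAN tagging, where a tag on each frame keeps tenants apart on a shared wire (a toy Python model, names invented for illustration):

    # Toy model: one shared wire, many virtual networks separated by a tag.
    class SharedNetwork:
        def __init__(self):
            self.endpoints = []  # (vnet_tag, name, inbox) triples

        def attach(self, vnet_tag, name):
            inbox = []
            self.endpoints.append((vnet_tag, name, inbox))
            return inbox

        def send(self, vnet_tag, payload):
            # Every frame crosses the same physical network, but is delivered
            # only to endpoints carrying the same virtual-network tag.
            for tag, name, inbox in self.endpoints:
                if tag == vnet_tag:
                    inbox.append(payload)

    net = SharedNetwork()
    a = net.attach(vnet_tag=10, name="tenant-A-host")
    b = net.attach(vnet_tag=20, name="tenant-B-host")
    net.send(vnet_tag=10, payload="hello, tenant A")
    print(a, b)  # ['hello, tenant A'] [] -- tenant B never sees A's traffic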

Storage virtualization

Like the network virtualization layer, this layer hides where storage systems sit and what type of device actually holds applications and data. It lets many systems share the same storage devices without knowing that others are accessing them, and it makes it possible to take a snapshot of a live system so it can be backed up without hindering online or transactional applications.
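Both ideas, pooling many devices behind one volume and snapshotting a live volume, can be sketched in Python (a toy copy-on-read model, not any vendor's implementation):

    # Toy model: one logical volume striped across several devices, plus snapshots.
    class StoragePool:
        def __init__(self, devices):
            self.devices = devices  # list of dicts standing in for real disks
            self.snapshots = []

        def write(self, logical_block, data):
            dev = self.devices[logical_block % len(self.devices)]  # striping
            dev[logical_block] = data

        def read(self, logical_block):
            dev = self.devices[logical_block % len(self.devices)]
            return dev.get(logical_block)

        def snapshot(self):
            # Freeze a point-in-time copy; live writes carry on unhindered.
            frozen = [dict(d) for d in self.devices]
            self.snapshots.append(frozen)
            return frozen

    vol = StoragePool([{}, {}, {}])  # three devices appear as one volume
    vol.write(0, "version 1")
    snap = vol.snapshot()            # back this up while the system stays live
    vol.write(0, "version 2")        # the live volume keeps changing
    print(vol.read(0), snap[0][0])   # version 2 version 1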

Article by Jim Poole, Equinix blog network
