
How to converse in cloud: Cloud balancing

09 Feb 2017

As with many cloud technologies, cloud balancing not only changes the nature of traditional load balancing, it also changes the way applications are delivered to end users.

The Server Traffic Cop

To understand cloud balancing, it’s important to understand load balancers. Anyone in IT has most likely been working with load balancers (a.k.a. application delivery controllers) for years. Load balancers represent one of the best ways to ensure fast application response and consistent uptime.

Load balancers act like traffic cops, standing in front of a bank of web or application servers and routing each incoming request to the server or virtual machine best equipped to quickly and efficiently fulfill it, while taking care not to overload any single server.
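
A minimal sketch of that routing decision, assuming a hypothetical least-connections strategy (real load balancers offer several algorithms, such as round robin or weighted distribution), might look like this:

```python
# Minimal sketch of the "traffic cop" decision: send each request to the
# healthy backend with the fewest active connections, so no single server
# is overloaded. Server names and connection counts are illustrative.
from dataclasses import dataclass

@dataclass
class Backend:
    name: str
    active_connections: int = 0
    healthy: bool = True

def pick_backend(pool: list[Backend]) -> Backend:
    """Least-connections selection among the servers currently in service."""
    candidates = [b for b in pool if b.healthy]
    if not candidates:
        raise RuntimeError("no healthy backends available")
    return min(candidates, key=lambda b: b.active_connections)

pool = [Backend("web-1", 12), Backend("web-2", 4), Backend("web-3", 9)]
target = pick_backend(pool)      # web-2, the least-loaded server
target.active_connections += 1   # the request is now being served there
```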

Load balancers also continually monitor server health, so if one server drops out for maintenance or because of a hardware or software failure, the load balancer redistributes future requests among the remaining functioning servers.

Similarly, if an application server is added to the server farm, the load balancer will start routing some requests to the new server.
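
A rough sketch of that health-monitoring behaviour, assuming an illustrative /healthz endpoint and a simple HTTP probe (real load balancers support many probe types and intervals), could look like this:

```python
# Sketch of the health-monitoring loop described above: probe each server and
# keep only the ones that respond, so new or recovered servers join the
# rotation and failed ones drop out. The server addresses, /healthz path and
# two-second timeout are illustrative assumptions.
import urllib.request

SERVERS = ["10.0.0.11", "10.0.0.12", "10.0.0.13"]   # hypothetical server farm

def is_healthy(host: str) -> bool:
    """Return True if the server answers its health-check URL."""
    try:
        with urllib.request.urlopen(f"http://{host}/healthz", timeout=2) as resp:
            return resp.status == 200
    except OSError:
        return False

def healthy_pool(servers: list[str]) -> list[str]:
    """The load balancer only routes requests to servers that pass the probe."""
    return [host for host in servers if is_healthy(host)]
```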

More recently, application delivery controllers (ADCs) have added a bevy of optional capabilities, including SSL termination, access control, DDoS attack protection and application firewalls.

They’ve also evolved from mostly hardware solutions to a combination of hardware and more standardized virtual application delivery controller software (vADC) that can be deployed globally and in the cloud.

These global, cloud-based load balancing environments require direct and secure interconnection to support such critical capabilities reliably and efficiently.

Going global

Global server load balancers (GSLBs) take the load balancing concept to the next level and really set the stage for cloud balancing. GSLBs present a single virtual IP address to the client while distributing web and application requests among globally dispersed data centers.

GSLBs also take into account the user’s location, the health of network and data center resources, and any number of other configurable variables.
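
As a hedged sketch of that decision, the following answers a client with the nearest healthy data center; the site names, coordinates and health flags are made-up assumptions, and real GSLBs weigh far more signals:

```python
# Simplified GSLB-style selection: pick the closest data center that is
# currently healthy. All sites and coordinates below are illustrative.
import math

DATA_CENTERS = {
    "us-east":  {"lat": 39.0, "lon": -77.5, "healthy": True},
    "eu-west":  {"lat": 53.3, "lon": -6.3,  "healthy": True},
    "ap-south": {"lat": 1.35, "lon": 103.8, "healthy": False},  # in maintenance
}

def distance(lat1, lon1, lat2, lon2) -> float:
    """Great-circle distance in kilometres (haversine formula)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 6371 * 2 * math.asin(math.sqrt(a))

def resolve(client_lat: float, client_lon: float) -> str:
    """Pick the geographically closest data center that is healthy."""
    healthy = {name: dc for name, dc in DATA_CENTERS.items() if dc["healthy"]}
    return min(healthy, key=lambda n: distance(client_lat, client_lon,
                                               healthy[n]["lat"], healthy[n]["lon"]))

print(resolve(48.85, 2.35))   # a client near Paris is sent to "eu-west"
```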

This is a great way to provide robust disaster recovery, business continuity and performance. Directing users to the geographically closest data center ensures low latency, which is increasingly important in a streaming, mobile world.

Into the cloud

The emerging category of cloud balancing takes global load balancing further into the hybrid, multi-cloud world, where applications may reside in many different private and public clouds. As with global server load balancers, cloud balancers distribute requests among the relevant private and public cloud services.

With cloud balancing, a new set of decision variables may come into play, including the following potentially configurable cloud balancing policy items:

  • Compliance
  • Time of day
  • Cost of delivery
  • Cloud service capacity
  • Service level agreements
  • Contractual obligations
  • Energy consumption

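As a hedged sketch of how such policy variables might be combined, the following scores each candidate cloud service and routes to the best fit; every field name, weight and threshold here is an assumption for illustration only:

```python
# Illustrative cloud balancing policy: treat compliance as a hard requirement,
# then trade off SLA, spare capacity, delivery cost and time of day.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class CloudService:
    name: str
    cost_per_request: float     # cost of delivery
    spare_capacity: float       # 0.0 - 1.0, cloud service capacity
    meets_compliance: bool      # e.g. data-residency rules
    sla_uptime: float           # contractual SLA, e.g. 0.9999

def score(svc: CloudService, hour_utc: int) -> float:
    if not svc.meets_compliance:
        return float("-inf")                      # never route non-compliant traffic
    off_peak_bonus = 0.1 if hour_utc < 6 else 0.0 # time-of-day factor
    return svc.sla_uptime + svc.spare_capacity - svc.cost_per_request + off_peak_bonus

def choose(services: list[CloudService]) -> CloudService:
    hour = datetime.now(timezone.utc).hour
    return max(services, key=lambda s: score(s, hour))
```
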
Ideally, cloud load balancing solutions should be well integrated, able to communicate with each other and easily managed.

This can be achieved by deploying a single integrated solution that leverages an Interconnection Oriented Architecture (IOA), a repeatable model for interconnecting people, locations, clouds and data.

For example, Equinix partners with F5 Networks to enable organizations to deploy F5 BIG-IP load balancing, global server load balancing and cloud balancing solutions through the Equinix Cloud Exchange and Performance Hub.

Fast, direct multi-cloud interconnections and F5 cloud balancing add up to a very powerful combination for ramping up application performance and security.

Article by Ryan Mallory, Equinix blog network
