
Microservices: A game changer for application migration to the cloud

10 Feb 2017

Reports about “microservices” are becoming increasingly common, covering what they are and how enterprises are using them to run applications in the cloud and in on-premises data centers.

Microservices resemble the rapid development and deployment approaches used by DevOps teams, and their adoption can be seen as a catalyst for both service-oriented architecture (SOA) and cloud migration.

What are microservices?

Analysts, consultants and suppliers of applications and development tools have been discussing the concept of a common architectural style for years, but it wasn’t until May 2012 that a workshop of software architects coined the name “microservices.”

The best way to think of them is as an application architecture based on small, independent services that combine to create larger, more complex applications and products.

Complex applications are being broken down into small pieces, or components, each of which may be developed, documented and tested in parallel by multiple distributed development teams.

This approach helps both business and development teams make products more feature-rich in a shorter amount of time. Each of these component features is considered a microservice, tailored to work independently and cross-functionally with other components.

This enables teams to execute, test, deploy and troubleshoot quickly.

Once all the functions required to build a specific application are available in the form of microservices, the application can be assembled from those functions, quickly tested, documented and moved into live production.
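
To make the idea concrete, here is a minimal sketch of a single microservice, written in Python using only the standard library. The service name, port and “/score” route are hypothetical, chosen purely for illustration.

```python
# A minimal sketch of one microservice: a tiny HTTP service that owns a
# single function. The port and route are illustrative assumptions.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class ScoreHandler(BaseHTTPRequestHandler):
    """One small, independently deployable function: report a score."""

    def do_GET(self):
        if self.path == "/score":
            body = json.dumps({"home": 28, "away": 28}).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()


if __name__ == "__main__":
    # Each microservice listens on its own port and can be developed,
    # tested and redeployed without touching the rest of the application.
    HTTPServer(("0.0.0.0", 8001), ScoreHandler).serve_forever()
```

In a real deployment, dozens of such services would each own one narrow function like this, and the larger application would be assembled by calling them over the network.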

These individual services communicate with one another over high-speed interconnections and can be deployed on-premises, off-premises or in a geographically distributed hybrid environment.

Issues in a specific application function can be monitored, replicated and fixed, and updates can be rolled into production with a short turnaround time.

Microservices complement the testing process because the entire application can still function while individual components or services are being fixed and deployed.
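
A short sketch of what that resilience can look like in practice: the consuming application calls each microservice with a timeout and a fallback, so a service that is mid-repair degrades one feature instead of bringing the whole application down. The endpoints and fallback values here are hypothetical.

```python
# Consuming two independently deployed microservices over HTTP, with a
# fallback so a failing service degrades one feature, not the whole app.
import json
from urllib.request import urlopen


def fetch(url, fallback):
    """Call a microservice; return a fallback value if it is unavailable."""
    try:
        with urlopen(url, timeout=0.5) as resp:
            return json.load(resp)
    except OSError:  # covers connection errors and timeouts
        return fallback


# Hypothetical endpoints for two separately deployed services.
score = fetch("http://localhost:8001/score", fallback={"home": "-", "away": "-"})
stats = fetch("http://localhost:8002/stats", fallback={})  # may be mid-redeploy
print("scoreboard:", score, "| player stats:", stats)
```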

By using microservices, organizations can bring up new bare-bones applications quickly and flesh them out as new functions are developed. In large-scale systems, where new feature rollouts can happen daily, microservices are a blessing that accelerates the delivery process.

How microservices are changing the game of football

Microservices played a huge role in this year’s Super Bowl by speeding player data to NFL officials, coaches, fans and broadcast networks. Zebra Technologies’ sports data tracking system calculates NFL players’ speed, distance, closing distance, routes and formations, and then quickly adds “eventing data,” which translates all the information into a football context.

Microservices enable the capture, processing and delivery of this data with incredible speed. They capture the information about a play as it is unfolding and send it to broadcasters immediately for use in TV replays.

The Zebra system relies on RFID hardware installed as part of the official NFL setup: 22 receivers placed around the stadium, two RFID tags in each player’s shoulder pads (part of the official NFL uniform), two on each referee and one in the ball.

The RFID tags track movements up to 25 times a second and deliver the information, including latitudinal and longitudinal data, in about half a second. Microservices and high-speed interconnections between the Zebra systems, IoT devices and the recipients of the data can make a Super Bowl replay even more interesting to watch than the live play itself.
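
As a rough illustration of the kind of arithmetic involved, the sketch below turns 25 Hz position samples into distance covered and instantaneous speed. The coordinates, given in meters on a flat field, are invented for the example; Zebra’s actual processing is not public, so treat this as illustrative only.

```python
# Turning 25 Hz RFID position samples into speed and distance for one
# player's tag. Sample values are hypothetical.
import math

SAMPLE_RATE_HZ = 25          # tags report position up to 25 times a second
DT = 1.0 / SAMPLE_RATE_HZ    # 40 ms between samples

# Hypothetical (x, y) samples in meters on the field.
positions = [(10.0, 5.0), (10.3, 5.1), (10.7, 5.15), (11.2, 5.2)]

distance = 0.0
speeds = []
for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
    step = math.hypot(x1 - x0, y1 - y0)  # straight-line distance per sample
    distance += step
    speeds.append(step / DT)             # instantaneous speed in m/s

print(f"distance covered: {distance:.2f} m")
print(f"peak speed: {max(speeds):.1f} m/s")
```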

Where VMs and containers fit in

We have seen growth in virtual computing environments since 2014, first with virtual machines (VMs) and now with containers. We also believe that the trend to develop and deploy applications based on microservices is closely related to the move to deploy encapsulated virtual environments.

VMs make it possible for operating systems, combined with application dependencies, data management tools and application components, to be provisioned and deployed quickly. This approach works well even when the applications and components have each been written for different operating systems, since they can all be hosted on a single physical server without incompatibilities between them.

Containers are similar to VMs, but they assume that all of the functions are designed to execute on a single operating system, whether on one host or across multiple hosts and clouds.

Containers are flexible enough to provide multiple independent partitions, each containing an application or one of its components, all deployed under a single host operating system.

This reduces the amount of system memory and processing power required to support those independent components, enabling more efficient use of the infrastructure. Switching from component to component as the aggregate application executes is also much faster, which allows the applications to scale seamlessly.

As with microservices, containers are lightweight. And when microservices are packaged in containers, migration across data centers and clouds is a cakewalk for DevOps and infrastructure teams.
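
As a sketch of how lightweight this is in practice, the snippet below uses the Docker SDK for Python (installed with pip install docker) to launch two containerized microservices on one host; both share the host’s kernel rather than each booting a full VM. The image names are hypothetical, and a running Docker daemon is assumed.

```python
# Launching two containerized microservices on a single host with the
# Docker SDK for Python. Image names are hypothetical placeholders.
import docker

# Connect to the local Docker daemon (assumed to be running).
client = docker.from_env()

# Each container packages one microservice with its dependencies, yet all
# containers share the host operating system's kernel.
score_svc = client.containers.run(
    "example/score-service:latest", detach=True, ports={"8001/tcp": 8001}
)
stats_svc = client.containers.run(
    "example/stats-service:latest", detach=True, ports={"8002/tcp": 8002}
)

# List what is running on this host.
for c in client.containers.list():
    print(c.name, c.status)
```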

On-premises, cloud and hybrid deployments

The evolution of containers enables enterprise applications and other distributed applications (e.g., IoT in the NFL) to be constructed using microservices, and these packaged microservices enjoy a platform independence that separates deployment from legacy dependencies (e.g., the OS, libraries and connection string properties).
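
One common way to achieve that separation, sketched below, is to have each service read its connection string and other settings from environment variables, so the same container image can run on-premises, in a cloud or in a hybrid setup without code changes. The variable names and defaults are hypothetical.

```python
# Separating deployment from environment-specific dependencies: settings
# come from environment variables injected at deploy time, not from code.
import os

DB_CONNECTION = os.environ.get(
    "DB_CONNECTION",                      # injected by the deployment environment
    "postgresql://localhost:5432/appdb",  # local development default
)
LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")

print(f"connecting with {DB_CONNECTION!r}, log level {LOG_LEVEL}")
```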

Enterprises can easily choose to deploy all or selected microservices locally or in any data center, cloud service or hybrid environment. Equinix has been on a journey with microservices and containers, and we see them increasing efficiency, ease of deployment, manageability and on-demand scale in our own and our customers’ application deployments.

Article by Ramchandra Koty and Balasubramaniyan Kannan, Equinix blog network
