
Data domination: Nokia Bell Labs on optimization for the future

16 Feb 17

Telecommunication experts estimate that the amount of data stored "in the cloud" — that is, in remote data centers around the world — will quintuple in the next five years.

New research from Nokia Bell Labs may offer a way to capitalize on this growth by improving data transfer rates for cloud computing traffic.

“The challenge for legacy systems that rely on fixed-rate transmission is that they lack flexibility,” says Dr. Kyle Guan, a research scientist at Nokia Bell Labs.

“At shorter distances, it is possible to transmit data at much higher rates, but fixed-rate systems lack the capability to take advantage of that opportunity.”

Guan says he worked with a newly emerged transmission technology called distance-adaptive transmission, where the equipment that receives and transmits these light signals can change the rate of transmission depending on how far the data must travel.
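The core idea of distance-adaptive transmission can be illustrated with a small sketch: a transceiver selects the highest data rate whose maximum optical reach covers the link distance. The rate/reach figures below are made-up placeholders for illustration only, not Bell Labs specifications.

```python
# Hypothetical illustration of distance-adaptive transmission: pick the
# highest data rate whose maximum reach covers the link distance.
# The rate/reach pairs are placeholder values, not real equipment specs.

RATE_REACH_TABLE = [
    # (data rate in Gb/s, maximum reach in km)
    (400, 500),
    (300, 1200),
    (200, 2500),
    (100, 5000),
]

def adaptive_rate(distance_km: float) -> int:
    """Return the highest supported rate for a link of the given length."""
    for rate, max_reach in RATE_REACH_TABLE:
        if distance_km <= max_reach:
            return rate
    raise ValueError(f"No modulation format reaches {distance_km} km")

print(adaptive_rate(400))   # short link: highest rate available
print(adaptive_rate(3000))  # long link: falls back to a lower rate
```

A fixed-rate system, by contrast, would have to provision every link at the rate dictated by its longest possible path, wasting capacity on short links.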

With this in mind, he set about building a mathematical model to determine the optimal layout of network infrastructure for data transfer.

“The question that I wanted to answer was how to design a network that would allow for the most efficient flow of data traffic,” says Guan.

“Specifically, in a continent-wide system, what would be the most effective [set of] locations for data centers and how should bandwidth be apportioned?

"It quickly became apparent that my model would have to reflect not just the flow of traffic between data centers and end users, but also the flow of traffic between data centers."

According to Guan, other research suggests that this second type of traffic, between the data centers, represents about one-third of total cloud traffic.

It includes activities such as data backup and load balancing, whereby workloads are spread across multiple servers to maximize application performance.

After accounting for these factors, Guan ran simulations with his model of how data traffic would flow most effectively in a network.

"My preliminary results showed that in a continental-scale network with optimized data center placement and bandwidth allocation, distance-adaptive transmission can use 50 percent fewer wavelength resources — that is, light transmission and reception equipment — compared to fixed-rate transmission," explains Guan.

“On a functional level, this could allow cloud service providers to significantly increase the volume of traffic supported on the existing fiber-optic network with the same wavelength resources.”
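The kind of optimization Guan describes can be sketched as a toy model (this is an illustration, not his actual methodology): brute-force search over candidate data-center placements on a small set of cities, counting how many wavelengths are needed when the rate each wavelength carries falls with distance. All coordinates, demands, and rate thresholds below are invented for the example.

```python
# Toy sketch of joint data-center placement and wavelength counting.
# Not Guan's model: positions, demands, and rate thresholds are made up.

from itertools import combinations
from math import ceil, dist

# Hypothetical city coordinates (km) and per-city traffic demand (Gb/s).
NODES = {
    "A": (0, 0), "B": (800, 100), "C": (1600, 0),
    "D": (400, 900), "E": (1200, 800),
}
DEMAND = {"A": 400, "B": 200, "C": 400, "D": 100, "E": 300}

def rate_per_wavelength(distance_km: float) -> int:
    """Distance-adaptive rate: shorter links carry more per wavelength."""
    if distance_km <= 500:
        return 400
    if distance_km <= 1200:
        return 200
    return 100

def wavelengths_needed(centers) -> int:
    """Serve each city from its nearest data center; count wavelengths."""
    total = 0
    for city, pos in NODES.items():
        d = min(dist(pos, NODES[c]) for c in centers)
        total += ceil(DEMAND[city] / rate_per_wavelength(d))
    return total

# Exhaustively try every pair of data-center sites.
best = min(combinations(NODES, 2), key=wavelengths_needed)
print(best, wavelengths_needed(best))
```

A fixed-rate version of the same model would use the long-haul rate (here, 100 Gb/s) on every link regardless of distance, which is where the wavelength savings come from; a real study would of course use an actual network topology and optimize bandwidth allocation jointly rather than by nearest-center assignment.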
