
Data domination: Nokia Bell Labs on optimization for the future

16 Feb 17

Telecommunication experts estimate that the amount of data stored “in the cloud,” that is, in remote data centers around the world, will quintuple in the next five years.

New research from Nokia Bell Labs may offer a way to capitalize on this growth and deliver improved data transfer rates for cloud computing traffic.

“The challenge for legacy systems that rely on fixed-rate transmission is that they lack flexibility,” says Dr. Kyle Guan, a research scientist at Nokia Bell Labs.

“At shorter distances, it is possible to transmit data at much higher rates, but fixed-rate systems lack the capability to take advantage of that opportunity.”

Guan says he worked with a recently developed technology called distance-adaptive transmission, in which the equipment that transmits and receives light signals can change the rate of transmission depending on how far the data must travel.
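The mechanism is simple enough to sketch in a few lines of Python. The reach and rate figures below are illustrative assumptions, not values from Guan's work; the point is only that an adaptive transceiver picks the highest rate the link distance permits, while a fixed-rate system must provision every link for the worst case.

```python
# Hypothetical reach/rate table for a distance-adaptive transceiver.
# The article gives no numbers; these pairs are assumptions chosen only
# to illustrate the mechanism: shorter reach permits denser modulation.
REACH_TABLE_KM_GBPS = [
    (600, 400),
    (1200, 300),
    (2500, 200),
    (4000, 100),
]

def adaptive_rate_gbps(distance_km: float) -> int:
    """Return the highest rate the link distance permits (illustrative)."""
    for max_reach_km, rate_gbps in REACH_TABLE_KM_GBPS:
        if distance_km <= max_reach_km:
            return rate_gbps
    return 50  # fallback for very long spans

# A fixed-rate system must provision every link at its worst-case rate,
# so a 500 km link is stuck at, say, 100 Gb/s even though the hardware
# could sustain 400 Gb/s at that distance.
print(adaptive_rate_gbps(500))   # 400
print(adaptive_rate_gbps(3000))  # 100
```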

With this in mind, he set about building a mathematical model to determine the optimal layout of network infrastructure for data transfer.

“The question that I wanted to answer was how to design a network that would allow for the most efficient flow of data traffic,” says Guan.

“Specifically, in a continent-wide system, what would be the most effective [set of] locations for data centers and how should bandwidth be apportioned?

“It quickly became apparent that my model would have to reflect not just the flow of traffic between data centers and end users, but also the flow of traffic between data centers.”

According to Guan, other research suggests that this second type of traffic, between the data centers, represents about one-third of total cloud traffic.

It includes activities such as data backup and load balancing, in which workloads are spread across multiple servers to maintain application performance.
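While the article does not publish Guan's actual formulation, the flavor of the optimization can be conveyed with a toy brute-force search. Every city, distance, demand, and per-wavelength rate below is invented for illustration; what matters is that the objective prices both kinds of traffic, user-to-data-center and data-center-to-data-center.

```python
from itertools import combinations

# All cities, distances, and demands below are invented for illustration.
CITIES = ["NYC", "CHI", "DAL", "DEN", "SEA"]
DIST_KM = {
    ("NYC", "CHI"): 1150, ("NYC", "DAL"): 2200, ("NYC", "DEN"): 2600,
    ("NYC", "SEA"): 3900, ("CHI", "DAL"): 1300, ("CHI", "DEN"): 1500,
    ("CHI", "SEA"): 2800, ("DAL", "DEN"): 1100, ("DAL", "SEA"): 2700,
    ("DEN", "SEA"): 1650,
}
USER_DEMAND_GBPS = {"NYC": 40, "CHI": 25, "DAL": 20, "DEN": 10, "SEA": 15}

def dist(a: str, b: str) -> float:
    return 0.0 if a == b else DIST_KM.get((a, b), DIST_KM.get((b, a), 0.0))

def wavelengths(distance_km: float, gbps: float) -> float:
    # Distance-adaptive cost: a longer link carries less per wavelength,
    # so the same demand consumes more wavelengths (rates assumed).
    per_wl = 400 if distance_km <= 600 else 200 if distance_km <= 2500 else 100
    return gbps / per_wl

def network_cost(sites: tuple) -> float:
    # User-to-data-center traffic: each city attaches to its nearest site.
    access = sum(
        wavelengths(min(dist(city, s) for s in sites), demand)
        for city, demand in USER_DEMAND_GBPS.items()
    )
    # Inter-data-center traffic (backup, load balancing), assumed here to
    # be one third of total demand, split evenly across site pairs.
    inter_total = sum(USER_DEMAND_GBPS.values()) / 3
    pairs = list(combinations(sites, 2))
    sync = sum(wavelengths(dist(a, b), inter_total / len(pairs))
               for a, b in pairs)
    return access + sync

# Brute-force the best pair of sites; a real model would be a far larger
# optimization, but the objective has the same two terms.
best = min(combinations(CITIES, 2), key=network_cost)
print(best, round(network_cost(best), 2))
```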

After accounting for these factors, Guan used the model to simulate how data traffic would flow most efficiently through such a network.

“My preliminary results showed that in a continental-scale network with optimized data center placement and bandwidth allocation, distance-adaptive transmission can use 50 percent fewer wavelength resources, that is, light transmission and reception equipment, compared to fixed-rate transmission,” explains Guan.

“On a functional level, this could allow cloud service providers to significantly increase the volume of traffic supported on the existing fiber-optic network with the same wavelength resources.”
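The arithmetic behind a saving of that kind is straightforward to illustrate. In the hypothetical four-link comparison below (all figures invented), the fixed-rate design provisions every link for the rate its longest span supports, while the adaptive design matches each link's rate to its own reach; these toy numbers land near, though not exactly at, the reported 50 percent.

```python
# Toy comparison of wavelength counts (all numbers invented): a fixed-rate
# system runs every link at the rate its longest span can support, while
# an adaptive one matches the rate to each link's reach.

links_km = [400, 800, 1500, 3000]   # hypothetical link lengths
demand_gbps = 400                   # same demand on every link

def rate_at(distance_km: float) -> int:
    return 400 if distance_km <= 600 else 200 if distance_km <= 2500 else 100

fixed_rate = rate_at(max(links_km))  # worst-case rate used everywhere
fixed = sum(demand_gbps / fixed_rate for _ in links_km)
adaptive = sum(demand_gbps / rate_at(km) for km in links_km)

print(f"fixed-rate wavelengths: {fixed:.0f}")    # 16
print(f"adaptive wavelengths:   {adaptive:.0f}") # 9
print(f"saving: {1 - adaptive / fixed:.0%}")     # ~44% with these toy inputs
```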
