Technical challenges of expanding edge networks closer to end-users
Tue, 25th Sep 2018

A number of opinions surfaced about the technical challenges of tomorrow's networks and data centers.

Some of the questions included: what are the technical challenges of expanding edge networks closer to end-users? What and where is the edge? Who are the end-users? Is the edge really expanding, and why does it need to be closer to users? And how are those technical challenges being resolved?

The four principles of data centers

There are four key principles that drive design and urbanism, which also seem to apply to various data center types.

The first is centralised versus decentralised power. Think of the powerful but centralised Roman Empire versus the smaller but rich and highly networked Greek city-states. Throughout history, mankind appears to have continuously alternated between centralised and decentralised structures.

The second principle is that if we think of data as power, then our first assumption about data center design and its past, present, and future evolution would be that, in any free market, we should expect the same constant swing between centralised and decentralised types.

The third principle is that mankind must progress. Humans are generally quite restless, especially when bored, and will often come up with productive things to do with their time during those lulls, such as creating new art or inventing new technologies. We must assume that we are somehow hard-wired this way, so it seems safe to assume that this trend will continue ad infinitum.

The fourth principle is that people need storage. People always seem to run out of storage space, whether it's in their garden shed, on their computer's hard drive, or in their walk-in closet. Applied to data centers, the answer to this quandary probably lies in improving the speed of information and access to it.

The dawn of edge computing

Edge computing enables data produced by Internet of Things (IoT) devices to be processed closer to where the data is created, rather than in a centralised data center. This allows data to be analysed in near real time before it is sent on to a data center.

Also, because data does not have to travel across the network to a central facility, latency is reduced. This matters for applications that need low latency to work effectively, or latency as close to zero as possible, such as financial services.
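As a rough illustration of why distance matters (a back-of-envelope Python sketch with assumed distances, not figures from this article), light travels through fibre at roughly 200,000 km per second, so propagation delay alone scales with how far the data has to go:

    # Back-of-envelope propagation delay estimate (illustrative only).
    # Assumes signals travel through fibre at roughly 200,000 km/s;
    # real-world latency also includes routing, queuing, and processing time.
    FIBRE_SPEED_KM_PER_S = 200_000

    def one_way_delay_ms(distance_km: float) -> float:
        """Approximate one-way propagation delay in milliseconds."""
        return distance_km / FIBRE_SPEED_KM_PER_S * 1000

    # Hypothetical distances: a remote centralised data center vs. a nearby edge site.
    for label, km in [("centralised, 1,500 km away", 1500), ("edge, 50 km away", 50)]:
        print(f"{label}: ~{one_way_delay_ms(km):.2f} ms one way")

On those assumed distances, the fixed propagation cost drops from about 7.5 ms to about 0.25 ms each way, before any routing or processing overhead is even considered.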

Edge computing can also be set up so that only important data is sent across the network immediately, while non-mission-critical data is aggregated and sent at a set interval, such as once a day, rather than the network being constantly overwhelmed with data. In the end, the business requirement is the driving force behind the technology.
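As a minimal sketch of that pattern (all names, thresholds, and intervals here are hypothetical, not taken from the article), an edge gateway might forward readings it classifies as important straight away, while buffering everything else and flushing the buffer once a day:

    import time

    ALERT_THRESHOLD = 90.0            # hypothetical: readings above this count as "important"
    BATCH_INTERVAL_S = 24 * 60 * 60   # hypothetical: flush non-critical data once a day

    _buffer = []                      # held locally as (timestamp, value) pairs
    _last_flush = time.time()

    def send_to_data_center(value: float) -> None:
        # Placeholder for the real-time path (e.g. an HTTPS or MQTT call).
        print(f"sent immediately: {value}")

    def upload_batch(batch: list) -> None:
        # Placeholder for the bulk path (e.g. a compressed daily upload).
        print(f"uploaded batch of {len(batch)} readings")

    def handle_reading(value: float) -> None:
        """Forward important readings now; buffer the rest for a periodic upload."""
        global _last_flush
        if value >= ALERT_THRESHOLD:
            send_to_data_center(value)            # low-latency path for critical data
        else:
            _buffer.append((time.time(), value))  # keep non-mission-critical data local
        if time.time() - _last_flush >= BATCH_INTERVAL_S:
            upload_batch(_buffer)                 # one bulk transfer instead of a constant stream
            _buffer.clear()
            _last_flush = time.time()

The point of the sketch is simply that the network carries only the critical stream in real time; everything else travels in one scheduled batch.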

A network built around certain functions is one likely shape for the edge. Smart cities and smart grids come to mind, but it remains to be seen whether citizens will be happy to go down the full automation route. It is possible that at some point people will say “enough!”. Political constraints, in the form of new regulations, might also present challenges at the edge.

Form tends to follow function, so perhaps one challenge with the edge will be to start with the right business or design brief: identify the nature of the data and its functional requirements so that it can be warehoused or otherwise handled appropriately. Different types of data and their functional requirements will likely call for different data center shapes and locations.

Robot-operated data centers?

According to security experts we have engaged with on real projects, the number one threat to safety in many incidents is people. So, in some cases, it may be prudent for people never to enter data centers at all, which would create the need for a new type of data center, operated by robots.

There are also technical challenges, such as security: controlling how people access and service the network in an environment where you want to minimise hacking risks.

Customers that are willing to pay a premium not to move to co-location facilities and the cloud include law enforcement, defence, universities, banks, hospitals, and large multinationals.

Customer service models will likely evolve ahead of technology, and some of those successful business cases will likely drive newer technology that will shape the physical world around them, including the different looks the edge may take.

Other technical challenges to moving the edge closer to end-users include planning requirements, availability of land and power, weighing risks vs benefits, education, and looking at problems objectively rather than emotionally.

A new needs-based data center design

One size does not always fit all. Different types of data centers are starting to emerge at the edge for various reasons, including access to data, control, or any other requirement identified in an end-user's business case, such as the need for low or zero latency in certain applications.

In the not-so-distant future, it is highly probable that customers will choose to host with multiple data center providers, or types, simultaneously. One could imagine signing up with separate providers for processing versus storage. Paying a premium to keep a share of your data off the grid and safe, for example, makes sense. So, if the offering is right, cost is not the only driver.

Performance and the level of service required by different end-user business cases can be as important as cost, if not more so, in driving what the edge is going to look like. Each bespoke edge solution will generate its own set of technical challenges, but we are confident in people, and in the fact that technology has always managed to catch up in the service of mankind.