As Pokémon GO exploded this past month, millions of people worldwide took to the streets in an effort to “catch ‘em all”. But in the search for Pikachus and Charmanders, the massive amount of computing power needed to support the number of users caused server interruptions that meant the game didn’t run as smoothly as it could have. While there is no easy resolution to these kinds of issues, it has been suggested that Edge Computing may have helped the game run more smoothly.
For those that haven’t heard, Pokémon GO is a mobile game from Nintendo and Niantic Labs that sends users outdoors with their smartphones to try to ‘catch’ different Pokémon characters in an “augmented reality” format generated by the app on the phone. By all accounts, the game has been wildly successful, so much so that the servers supporting it have struggled to keep up with demand.
It’s not easy to determine what the exact problem was, either. Some experts have speculated that the game is hosted on Google Cloud Platform, based on job listings that spell out that Google is looking for developers to “create the server infrastructure to support our hosted AR/Geo platform underpinning projects such as Pokémon GO using Java and Google Cloud.”
Whether or not that’s true, it’s easy to see how Pokémon GO can suffer performance problems. The game requires a constant back and forth of data between the user and the servers supporting the game. That often includes location information from hundreds if not thousands of users in close proximity, messages back to them that prompt the virtual images to pop up on their phones, data on how many Pokémon each one catches and more.
It’s a perfect example of where Edge data centre solutions could be of great benefit. Generally regarded as the architecture of the future, Edge Computing is gaining popularity as an alternative to conventional approaches where the data centre can be remote and geographically distant from the user. Edge places computing power, control, storage and applications closer to the end users who rely on them.
For Pokémon GO in Australia, this would look like a series of Edge data centres spread around the country that could handle the back-and-forth of location data with individual users. They would also handle messages going out to players, as well as gather stats and generally keep score. Only occasionally would they need to send data up to a central data centre, and only a subset of what they collect, such as scoring data.
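The pattern described above can be sketched in a few lines. This is a minimal illustration of the edge-aggregation idea, not Niantic’s actual architecture; the class, region names and data fields are all hypothetical.

```python
# A sketch of the edge-aggregation pattern: frequent, latency-sensitive
# updates terminate at the edge node, and only a small summary (scoring
# data) is forwarded to the central data centre.

class EdgeNode:
    """Hypothetical edge data centre handling per-player game traffic."""

    def __init__(self, region):
        self.region = region
        self.catch_counts = {}   # player_id -> Pokémon caught (kept locally)
        self.last_location = {}  # player_id -> (lat, lon), never sent upstream

    def handle_update(self, player_id, location, caught=0):
        # High-frequency location chatter is absorbed here, close to the user.
        self.last_location[player_id] = location
        self.catch_counts[player_id] = self.catch_counts.get(player_id, 0) + caught

    def summary_for_core(self):
        # Only aggregated scoring data travels to the central site.
        return {
            "region": self.region,
            "players": len(self.catch_counts),
            "total_caught": sum(self.catch_counts.values()),
        }

node = EdgeNode("south-australia")
node.handle_update("ash", (-34.93, 138.60), caught=2)
node.handle_update("misty", (-34.92, 138.59), caught=1)
node.handle_update("ash", (-34.94, 138.61), caught=1)
print(node.summary_for_core())
# {'region': 'south-australia', 'players': 2, 'total_caught': 4}
```

The design choice worth noting is that the raw location stream never leaves the edge node; the central data centre only ever sees the compact summary.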
Such a setup would dramatically reduce the latency of each back-and-forth interaction. Instead of data going from a user’s phone near Adelaide to a central data centre in Sydney, for example, it would go to a much closer Edge data centre in South Australia. This could mean the app would load faster on a phone and would have less chance of going down.
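A back-of-the-envelope calculation shows why distance matters. The figures below are rough assumptions: an approximate straight-line Adelaide–Sydney distance, a hypothetical metro edge site, and the rule of thumb that light travels about 200 km per millisecond in fibre. Real network paths, routing and processing add considerably more, so these are lower bounds on round-trip time.

```python
# Propagation-delay comparison for the Adelaide example above.
# Assumptions: ~200 km/ms signal speed in fibre, straight-line distances.

SPEED_IN_FIBRE_KM_PER_MS = 200.0

def round_trip_ms(distance_km):
    # Out and back, propagation delay only (no routing or processing time).
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_MS

adelaide_to_sydney_km = 1160    # approximate straight-line distance
adelaide_to_local_edge_km = 25  # hypothetical metro edge data centre

print(f"Central DC round trip: {round_trip_ms(adelaide_to_sydney_km):.1f} ms")
print(f"Edge DC round trip:    {round_trip_ms(adelaide_to_local_edge_km):.2f} ms")
# Central DC round trip: 11.6 ms
# Edge DC round trip:    0.25 ms
```

Even on propagation delay alone, the local edge site is faster by more than an order of magnitude, and that gap compounds over the constant back-and-forth a game like this generates.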
In many respects, Pokémon GO mimics an Internet of Things environment, with lots of devices feeding data to a central site. As augmented reality (AR) solutions become more popular and accessible, we can expect similar latency issues to arise across the board. For that reason, it’s important to seek expertise early about how Edge Computing can be implemented and how it can support the growth of AR technology.
By Adam Wilkinson, National Application Manager – Data Centres, Edge Computing & IoT, Schneider Electric