
How to speak like a data center geek: Artificial intelligence

11 Jan 18

It’s time to take on artificial intelligence (AI) in our “How to Speak Like a Data Center Geek” series.

Why now? Because even though it’s been around for six decades, AI has rarely been hotter.

In Equinix’s 2018 predictions, we forecast an imminent AI breakthrough into the mainstream. Already, AI powers advances like algorithmic trading, autonomous cars, even online shopping. And as digital capabilities advance, so will the capabilities of AI.

As fast as it’s moving, AI isn’t approaching the dark vision presented by popular culture (HAL 9000, meet The Matrix) or the expectations of more optimistic thinkers who say it will usher in a new era of human civilization … yet.

But who knows what’s ahead? Maybe that’s a question for Siri. In the meantime, this Geek entry is here to bring you a little AI knowledge.

Algorithm

Just like in the definition of HAL (Heuristically programmed ALgorithmic computer), algorithms are at the foundation of AI.

At the most fundamental level, they are mathematical instructions that tell computers what to do.

But in AI, those instructions aren’t always explicit, “Do X or Y”-style commands. Instead, algorithms can be used to set up rules and systems that enable a computer to learn.
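
To make the distinction concrete, here’s a minimal Python sketch (the cooling scenario, names and numbers are invented for illustration). The first function is an explicit “Do X or Y” command; the second derives its rule from example data rather than having it spelled out:

```python
def explicit_rule(temperature):
    # An explicit "Do X or Y" instruction: the threshold is hard-coded.
    return "cooling on" if temperature > 25 else "cooling off"

def learn_threshold(samples):
    # A rule set up to be learned: derive the threshold from labeled
    # examples of (temperature, cooling_was_needed) instead of coding it.
    on = [t for t, label in samples if label]
    off = [t for t, label in samples if not label]
    return (min(on) + max(off)) / 2  # split the two groups of examples

# Made-up training data: (temperature, whether cooling was actually needed).
samples = [(18, False), (21, False), (27, True), (30, True)]
threshold = learn_threshold(samples)

print(explicit_rule(26))  # rule written by a human: "cooling on"
print(26 > threshold)     # rule inferred from the data: True (threshold is 24.0)
```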

Neural networks

Before we discuss neural networks, it’s useful to refer to the quick definition of artificial intelligence from Merriam-Webster: “The capability of a machine to imitate intelligent human behavior.”

To imitate human behavior, AI needs to be able to solve problems, and a neural network is one way to do it.

A neural network is patterned after the workings of the human brain: it breaks computer learning into tiers of comprehension that connect and build on each other. Say the task is to tell the difference between the written numerals 1, 5 and 9.

The first tier receives the raw data and makes findings, perhaps noting very basic similarities and differences. Then, it passes on the relevant information to the second tier.

That second tier works with those findings – rather than the raw data used at the first tier – to draw more complex conclusions. It then passes that information on to the next tier, and so on until the task is completed.
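
Here’s a toy Python/NumPy sketch of that tier-by-tier flow. The image size, layer widths and weights are placeholders, not a trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Raw data: a flattened 4x4 image of a handwritten digit (invented input).
x = rng.random(16)

# Placeholder weights; in a real network these would be learned.
W1 = rng.standard_normal((8, 16))  # tier 1: notes basic similarities/differences
W2 = rng.standard_normal((3, 8))   # tier 2: combines them into evidence for 1, 5, 9

h = np.maximum(W1 @ x, 0.0)  # tier 1 works on the raw data
scores = W2 @ h              # tier 2 works on tier 1's findings, not the raw data
print("predicted digit:", [1, 5, 9][int(np.argmax(scores))])
```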

Machine learning

Machine learning and AI are often used interchangeably, but machine learning is an AI subcategory, not the whole.

Specifically, machine learning uses algorithms to analyze data, draw conclusions and make a prediction or finding. As discussed in the algorithm section above, a key here is that the machine learns without being specifically programmed for a result, but by learning from “experience.”

This experience is gained by starting with a model that predicts various results, analyzing related data, and then letting the machine adjust (learn) the model’s initial assumptions based on the actual patterns and results uncovered.
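
That loop can be sketched in a few lines of Python. The data and the plain gradient-descent update below are invented for illustration; the point is the predict-analyze-adjust cycle:

```python
# Fit a single parameter w so that predictions w * x match observed values y.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # made-up (x, y) observations

w = 0.0    # the model's initial assumption
lr = 0.05  # how strongly each adjustment corrects that assumption
for _ in range(200):
    # Analyze the data: measure how far predictions are from actual results.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    # Adjust (learn): revise the assumption based on what was uncovered.
    w -= lr * grad

print(f"learned w = {w:.2f}")  # settles near 2.0 for this data
```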

A famous example comes from machine learning pioneer Arthur Samuel, who programmed a computer to play tens of thousands of games of checkers against itself and learn which board positions were advantageous. The program then used that knowledge to beat human opponents.

Deep learning

Just as machine learning is a subcategory of AI, deep learning is a subcategory of machine learning.

Key differences between machine learning and deep learning are the far greater volumes of data processed and the far more complex algorithms used in deep learning.

It’s fair to say the advent of big data and unprecedented computer processing capabilities enabled the development of deep learning.

As its name implies, deep learning is, well, deeper than plain old machine learning. By using large neural networks, enormous data sets and hugely powerful computers, deep learning takes on real-time tasks that are far more complex than playing checkers.
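
Structurally, “deeper” just means more tiers. Here’s the earlier neural-network sketch stretched into a stack of layers (again, all sizes and weights are invented placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)
layer_sizes = [64, 32, 32, 16, 3]  # input, three hidden tiers, output

# One placeholder weight matrix per tier; a real network learns these.
weights = [rng.standard_normal((m, n))
           for n, m in zip(layer_sizes, layer_sizes[1:])]

x = rng.random(layer_sizes[0])  # raw input data
for W in weights:
    x = np.maximum(W @ x, 0.0)  # each tier builds on the one before it
print("output scores:", x)
```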

In a deep-learning demo at a lecture in China, a Microsoft executive used a speech program that transcribed his spoken words into English text, translated them into Chinese text, then simulated his voice saying the words in Mandarin. That’s deep.

Interconnection in AI

To function, AI needs direct and secure interconnection between a range of users, cloud applications (analytics, storage, etc.), machines, and data sources.

And this interconnection needs to happen instantly, with the various counterparties as geographically close to each other as possible. The Equinix Cloud Exchange (ECX) Fabric enables this kind of interconnection in more than 25 metros in North America and EMEA, and it will be expanding across Equinix global locations over time.

Article by Jim Poole, Equinix Blog Network 
