Commvault: Know your data in the cloud…to be agile
Wed, 4th Oct 2017

I hear this quote, often attributed to Charles Darwin, a lot lately: “It is not the strongest of the species that survives, nor the most intelligent that survives. It is the one that is most adaptable to change.”

We have seen this happen to IT infrastructure over time: it has evolved into the cloud and software-as-a-service, moving rapidly away from on-premises virtualisation and physical systems.

At the same time, some things never change.

This is true when it comes to data growth. In this digital era, data only grows in one direction, and that's up. What has changed about data, though, is that it has become an increasingly strategic business asset. So, if businesses need to be able to adapt to change, while also knowing that some things never change, then they need to be agile.

What is agility anyway?

There are many interpretations of agility; it's mostly associated with the evolution of software development, which ultimately had to adapt to meet our ever-growing demands of technology.

But agility has increasingly come back to its original definition: the ability to move quickly and easily, and to think and understand quickly.

At the same time, it's increasingly associated with being able to fail fast and fail often. This is one of the key principles that makes agile businesses so successful.

The role of data in an agile business

Data has to be agile in the digital age - it now lives everywhere and is available anytime to whoever has access to it. As a result, data is increasingly abstracted away from the infrastructure it lives on.

It's widely acknowledged that we are moving away from “servers as pets” and towards “virtual machines as cattle”.

The concept of pets vs cattle isn't new. IT departments have traditionally treated servers as pets, nursing them back to health when they get sick - that is, when they experience data loss, a security attack or even system failure.

Historically this has made a lot of sense - servers are proprietary (and typically expensive) physical resources. They are also the main repository of an organisation's critical data assets, which is why we treat them as pets.

However, with the advent of virtualisation and cloud, infrastructure dependencies have changed significantly. Servers have become virtual machines (or even containers), which are treated more like cattle: they are easy to provision quickly, so they are typically replaced rather than nursed back to health.
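To make the contrast concrete, here is a minimal sketch in Python - using an entirely made-up, in-memory "fleet" rather than any real cloud API - of the cattle mindset: an unhealthy virtual machine is simply discarded and re-provisioned from a standard image, rather than repaired.

    import uuid

    def provision_instance(image):
        """Stamp out a new, identical instance from a standard image."""
        return {"id": str(uuid.uuid4()), "image": image, "healthy": True}

    def reconcile(fleet, desired_count, image):
        """Discard unhealthy instances and re-provision back to the desired count."""
        survivors = [vm for vm in fleet if vm["healthy"]]
        while len(survivors) < desired_count:
            survivors.append(provision_instance(image))
        return survivors

    fleet = [provision_instance("web-v1") for _ in range(3)]
    fleet[1]["healthy"] = False                      # simulate a failed VM
    fleet = reconcile(fleet, desired_count=3, image="web-v1")
    print(len(fleet), "healthy instances")           # -> 3 healthy instances

The point of the pattern is that nothing about the failed instance is worth saving - which is exactly why the conversation shifts to protecting the data rather than the machine.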

Data management (and protection) in these new environments is still very important, but the conversation is fundamentally different.

Retention, copy management, auditing and risk - these requirements don't go away in newer architectures.

A whole new series of unrealised needs appears when a business shifts into this new world of VMs as cattle. Among other things, it now needs to protect data residing in non-traditional storage silos, such as public cloud object storage, and it needs programmatic ways to access - not just restore - that data.
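The programmatic-access point is worth illustrating. The sketch below assumes, purely for illustration, that protected copies land in an Amazon S3 bucket (the bucket and prefix names are hypothetical); it enumerates and reads the copies in place with boto3, with no traditional restore-to-a-server step.

    import boto3

    s3 = boto3.client("s3")

    def list_backup_objects(bucket, prefix):
        """Enumerate protected copies directly in object storage."""
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
            for obj in page.get("Contents", []):
                yield obj["Key"], obj["Size"]

    def read_object(bucket, key):
        """Access the data in place, rather than restoring it to a server."""
        return s3.get_object(Bucket=bucket, Key=key)["Body"].read()

    # Hypothetical bucket and prefix, for illustration only.
    for key, size in list_backup_objects("example-backup-bucket", "vm-copies/2017/"):
        print(key, size)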

There's also the data now delivered by serverless architectures and by Software-as-a-Service, where infrastructure is removed from the equation entirely.

This makes it increasingly difficult to maintain control and visibility over distributed data silos.

In this agile world, with agile apps, on agile infrastructure, creating ever-increasing data, it's more important than ever for businesses to know their data in the (multi-)cloud.

Knowing your data in the cloud, and being agile with it, means being able to fail fast but also to recover fast. This can only be achieved by taking the approach of an intelligent, software-defined, highly scalable data platform, built on a combination of three key factors:

A distributed dynamic index - allowing businesses to know everything about their data, in every location simultaneously, irrespective of infrastructure or cloud type

A single virtual repository - bringing all of an organisation's data together without needing to physically move it, giving access to any piece of information across “fractured data center” silos and providing a truly federated, instant search capability (a rough sketch of this idea follows this list)

A management and operations layer - providing automation, orchestration and provisioning to cope with the massive growth in data volumes and types across these multiple environments
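As promised above, here is a rough, illustrative sketch of how the first two factors might fit together. It is not Commvault's implementation, just the general idea: metadata from every silo is indexed, queries are federated across locations, and the underlying data never moves. All names and paths here are invented.

    from dataclasses import dataclass

    @dataclass
    class DataItem:
        name: str
        location: str       # e.g. "on-prem-vmware", "aws-s3", "saas-app"
        path: str           # reference to where the data actually lives
        tags: tuple

    class VirtualRepository:
        """A single logical view over many physical silos."""

        def __init__(self):
            self._index = []                  # the "distributed dynamic index"

        def register(self, item):
            self._index.append(item)          # index metadata only, never the content

        def search(self, keyword):
            """Federated search across every silo; returns references, not copies."""
            keyword = keyword.lower()
            return [i for i in self._index
                    if keyword in i.name.lower()
                    or any(keyword in t.lower() for t in i.tags)]

    repo = VirtualRepository()
    repo.register(DataItem("payroll-2017.csv", "on-prem-vmware",
                           "/vols/finance/payroll-2017.csv", ("finance",)))
    repo.register(DataItem("orders-backup.json", "aws-s3",
                           "s3://example-bucket/orders-backup.json", ("sales",)))
    print([i.location for i in repo.search("payroll")])   # -> ['on-prem-vmware']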

Having a software-defined, agile data platform helps to address the challenges associated with the modern distributed workloads that need to be able to shift seamlessly between clouds.

It also introduces new opportunities for managing data in a hybrid, multi-cloud environment by providing true workload portability and freedom from both hardware vendor and cloud lock-in.

Ultimately, a single data platform allows businesses to move data and information quickly and easily; to know and understand their data better; and to fail fast, fail often and recover even faster.

This is the very nature of agility and will allow organisations to continue to adapt to change while keeping on top of the things that never change.