Recently we conducted a Q&A with Scality chief product officer Paul Speciale to gather his thoughts on where cloud is going and how organisations will store their data in the future.
The first topic we raised was the much-discussed question: is the on-premises data center really dead?
“Surprisingly, no. In 2016 and 2017, there was a noticeable shift toward public clouds in discussions with IT influencers and buyers. Public clouds promised so many advantages in rapid-start, time-to-deploy, reduced capital expenditures and just overall simplicity, that it was really hard to argue against,” says Speciale.
“As a vendor of on-premises IT infrastructure software, we were obviously concerned that this was an unstoppable force. It has always been the case, though, that customers with use cases demanding high levels of data control, performance and security (financial, government, and healthcare, for example) see on-premises as their primary deployment target.”
Speciale says a noticeable change of thinking occurred in 2018.
“There was a significant uptick in cloud users saying 'maybe we were too hasty in going to cloud', 'we didn't realise the disadvantages in committing and locking in to one vendor's cloud', and of course the dreaded 'wow, our monthly cloud bills really surprised us',” says Speciale.
“What has happened is that users have now realised there is a smarter way to use the cloud: to leverage specific services that add value to their businesses. Rather than blindly 'storing all data in the cloud', for example, it makes more sense to use the public cloud for selected data, and for a bounded period of time, as part of their overall workflows. This delivers the benefits the cloud offers without incurring the full cost burden or lock-in of public cloud usage.”
Another frequently discussed topic is hybrid cloud, which Speciale is unsurprisingly quite knowledgeable about.
“So this is the common IT model of today, or at least it is how many users describe it. Given the differentiation across the various cloud vendors, multi-cloud will happen for more businesses over time. Today, what customers want is a smart blend of the control and cost-effectiveness of on-premises IT (and yes, it is significantly cheaper to keep data in an on-premises object store for three years than it is to keep it in the public cloud), plus specific use of value-added services in a public cloud,” says Speciale.
“This might be a need for a content-delivery network (CDN) service in AWS, a transcription service in Azure, or running a microservices-based app in Google Cloud. The net of it is that the best architecture may prove to be a hybrid architecture, split smartly between on-prem and cloud, to deliver the service in the best, fastest and most cost-effective way.”
At the heart of all this is the humble data center, so I asked Speciale how he thinks it will evolve in the coming years.
“We see that business workflows will continue to span locations and clouds. This is already happening in media and entertainment, where content must be delivered to millions of users on a wide range of devices. Content may originate on-premises (think of a news broadcasting organisation), but it has to be transcoded into a dozen or more formats and then delivered to consumers in dozens of countries,” says Speciale.
“Leading broadcast companies are already solving this by blending on-premises compute and storage with compute-bursting in AWS, Azure and GCP, as well as the use of regional clouds. We now see the same 'active business workflow' trend spanning multiple data centers (really private clouds) and public clouds in financial services, insurance, biopharma, manufacturing and healthcare.”
So that's data centers, but what about storage more generally?
“The ability to manage data will be valued more highly than the specific place, solution or cloud in which it is stored. Once businesses embrace the ideas above, that no single public cloud is the answer to everything and that on-premises IT is indeed required for cost-effectiveness and control, what becomes critical is the ability to manage all of that data,” says Speciale.
“What does data management mean here? It means, first of all, having visibility into your data: where is it stored, how much of it is accessed and when, and how much does it cost me to store it in each place? Once data is stored in many places, the ability to locate it through intelligent search will be critical. And of course, ensuring that data is near the apps and users that consume it means data management will also entail data mobility, via migration, replication or other workflows. The future trend is indeed toward multi-cloud data management.”
Speciale had alluded to multi-cloud earlier, so I asked him to elaborate.
“Clouds are becoming specialised in several ways. An obvious one is that each cloud vendor will offer its own unique services, and some will have advantages for particular applications. For example, Azure has a unique Video Indexer not yet seen in the other clouds, and Google is certainly leading the way on container and Kubernetes-based services,” says Speciale.
“The other dimension is regionality and sovereignty: as businesses realise they must keep data located in specific geographies such as China, we'll see the local clouds in those countries become part of global business workflows.”
Finally, to wrap things up, I asked Speciale what would matter most looking ahead: software or hardware.
“Software will be the key driver in managing applications and data across multiple clouds, whether private or public. Hardware will always be an enabler of more cost-effective and faster deployment, but key business workflows will be enabled through intelligence in software,” concludes Speciale.