Dispelling the myths of public cloud
Cloud computing has grown exponentially in recent years, with businesses of all sizes now able to capitalise on this innovative technology as the range of services on offer continues to grow. In particular, the rise of public cloud adoption has placed immense pressure on private cloud companies, and as a result a number of misconceptions have been disseminated, casting a cloud of confusion over the benefits of public cloud.
Early adopters of cloud technology will be familiar with the typical private cloud setup: the client rents dedicated hardware in a rack within the service provider's data center – and this model has remained largely unchanged.
Whilst these facilities can provide basic services such as virtual servers, storage and backups, industry giants including Microsoft, Amazon and Google now offer a significantly wider range of cloud computing services. Traditional data centers face a challenge as dedicated hardware becomes a redundant concept – clients no longer need to be constrained by the limits of rented hardware, or even by the locations of private cloud providers.
With public cloud now outpacing private cloud infrastructure – a trend businesses should look to continue – here are some of the primary considerations when weighing public against private cloud.
Stability and security
With the number of sophisticated data breaches increasing year on year, the stability and security of data should be a primary factor for businesses and cloud providers alike. Public cloud providers have made huge investments across all areas of their platforms, but given the threat posed by cyber attacks, a particular focus has been placed on security protocols.
As part of this strategy, public cloud providers allocate huge resources to achieving and maintaining key security certifications from governments and industry bodies across the world. One example of how Microsoft stays ahead of the game is its 'Red Team': an elite team of hackers tasked with simulating real-world security breaches to test the resilience of the platform and ensure that incident responses are fine-tuned. This, coupled with constant security monitoring, means that Microsoft can continually test, validate and improve security across all of its services.
The same cannot be said for the vast majority of private cloud providers, which simply do not have the resources and scale to safeguard their systems in this manner. Whilst systems and data will be managed and protected locally in a private environment, do these providers have the capacity and expertise to keep their security technology updated in line with the latest threats?
Despite the obvious focus on digital security, physical security is equally important in ensuring that data is not stolen or destroyed, whether through malicious intent or unfortunate accident.
Typically, private data centers provide rented rack space for clients, with security protocols in place to keep systems and data under lock and key – but the level of security can vary significantly from provider to provider. This again cannot be compared with the major public cloud providers, which wholly own their data centers and deploy sophisticated measures including CCTV, security staff and full-body screening of all personnel entering and leaving the site.
This dedication to the highest levels of security is one of the many reasons public cloud is so appealing. In particular, it is why critical service industries such as government, the military and health providers have adopted public cloud – it is imperative that their services remain stable and resilient to security risks.
Hardware costs
Within a private cloud setup there is a strong reliance on hardware, so when it comes to resilience, single points of failure must be avoided – all adding to the cost and complexity of the environment. Furthermore, if Disaster Recovery also needs to be incorporated, all of the hardware and data must be replicated in a completely separate facility – multiplying those costs and complexity even further.
Whilst hardware is still required in a public cloud environment, the user or administrator never deals with it directly, as all of the services are virtualised. As such, there is no need to duplicate hardware for resilience, because resilience is built in at the core of the platform.
For example, within Microsoft Azure, virtual 'disks' are replicated at least three times within the same facility, and data can also be replicated to a separate location in the same region or in another country. If a physical disk fails, the data remains available and the workload transitions seamlessly, with no impact on the user.
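As an illustration of how little of this a client has to build themselves, the short Python sketch below uses the azure-mgmt-storage SDK to request built-in replication simply by choosing a redundancy SKU when creating a storage account. The resource names and region are hypothetical placeholders, not values from this article.

# A minimal sketch, assuming the azure-identity and azure-mgmt-storage
# packages are installed and the caller is authenticated to Azure.
# "demo-rg" and "demodata001" are hypothetical names.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

subscription_id = "<your-subscription-id>"  # placeholder
client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

poller = client.storage_accounts.begin_create(
    resource_group_name="demo-rg",
    account_name="demodata001",
    parameters={
        "location": "uksouth",
        "kind": "StorageV2",
        # Standard_LRS keeps three copies in one facility;
        # Standard_GRS additionally replicates them to a paired region.
        "sku": {"name": "Standard_GRS"},
    },
)
account = poller.result()
print(account.name, account.sku.name)

The point is the single "sku" line: the replication and failover machinery described above is selected, not engineered.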
In a private cloud setting, the hardware requirement is the main contributing factor in determining each client's costs: the more devices provisioned for a client, the higher the associated cost. This is not the case within a public cloud environment, where resources are flexible and can be scaled up or down to suit real-time requirements.
For example, if a client's workload increases, additional capacity can be provided in minutes to meet demand. Of course, this can also be achieved within a private cloud infrastructure, but to do so would typically require more hardware to be deployed, which could take weeks, and may also require a minimum term contract commitment – a far cry from the real-time elasticity of public cloud.
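To make that elasticity concrete, the sketch below shows the kind of call involved in adding capacity: raising a hypothetical virtual machine scale set to ten instances via the azure-mgmt-compute Python SDK. The resource names are placeholders, and using a direct SDK call (rather than the portal or autoscale rules) is an assumption made for illustration.

# A minimal sketch, assuming azure-identity and azure-mgmt-compute are
# installed; "demo-rg" and "web-vmss" are hypothetical names.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

subscription_id = "<your-subscription-id>"  # placeholder
client = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

# begin_update patches only the fields supplied - here, just the instance count.
poller = client.virtual_machine_scale_sets.begin_update(
    resource_group_name="demo-rg",
    vm_scale_set_name="web-vmss",
    parameters={"sku": {"capacity": 10}},
)
scale_set = poller.result()
print(scale_set.sku.capacity)  # extra capacity arrives in minutes, not weeks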
Making the move
As is often the case when deploying significant process changes within a business, it's unlikely that a company will have the necessary skills and experience in-house to make the jump to the cloud. Mistakes can be costly, so it pays to seek out an expert who can advise on the move and take on the heavy lifting.