AI for Regional Cloud Providers
US-based tech giants have leveraged their first-mover advantage to build a commanding lead, in both revenue and market share, in the fast-growing global cloud computing market, now worth more than $250 billion. As these Tier-1 giants continue to expand internationally, they pose a significant threat to regional and independent cloud service providers.
That said, we believe cloud services remain a strong and defensible growth business for forward-thinking, well-positioned Tier-2 providers, regardless of location or current market share.
Significant near-term growth opportunities do exist for international public cloud service providers, but these players must position themselves quickly to defend and grow their market share as hyperscale cloud providers continue to expand both their global data center footprint and their enterprise service offerings.
According to Gartner, countries such as Poland, Brazil and Australia already devote among the highest percentages of total IT spending to public cloud services, while countries such as Russia, Indonesia and India show some of the highest growth rates in cloud IT spending.
There are several factors driving demand for independent Tier-2 providers – which we define as regional, country-level and niche cloud service providers, primarily outside of the US. Chief among these are data security, performance and customer preference. We would add to this list Artificial Intelligence (AI) and Machine Learning (ML), an underappreciated driver that will be key to maintaining competitiveness and unlocking future growth.
As AI technology evolves, AI workloads demand ever greater amounts of specialized compute for both training and inference. These workloads often come with significant data privacy and data location requirements, all of which present natural opportunities for well-placed regional cloud providers. But to capture the opportunity presented by AI, service offerings must be comprehensive and designed to ensure customer success – otherwise regional cloud providers could find themselves losing out to fast-growing Tier-1 competitors.
The Local Advantage: Data Privacy, Performance and Preferences
As more and more businesses opt to migrate their legacy applications and data to the cloud, or choose cloud-native services to host their core business platforms, they are confronting a range of regulatory and compliance obligations regarding data security, which can limit their options in choosing a cloud service and storage provider.
For many businesses, it is critical to choose a cloud solutions partner who can offer in-country data storage, with the benefits data sovereignty brings.
The UK Financial Conduct Authority (FCA), for example, requires that firms ensure their cloud providers do not store data in jurisdictions that may inhibit effective access for UK regulators.
Transferring data between the US and EU also comes with significant legal and budgetary requirements, as US companies must self-certify annually under the EU–US ‘Privacy Shield’ framework and designate a contact point to respond directly to consumer information requests.
In Russia, an amendment to the Personal Data Protection Act went into effect in 2015 requiring that the personal data of Russian citizens be stored on servers located in Russia.
Brazil, Mexico, Argentina and Colombia all require that cross-border data transfers be restricted to countries whose data protection laws are of a similar quality to those of the home country – potentially creating legal issues as regulations regularly change in various data domiciles abroad.
All of these issues can be avoided by domiciling sensitive data in its country of origin.
Another key issue for clients choosing among cloud and hybrid service options is the performance of their applications in the cloud environment. For a number of services, including data streaming and voice applications, network latency is the primary performance concern. In many cases, same-country providers have a distinct advantage over Tier-1 providers in this area.
Furthermore, in data-centric AI/ML applications, co-locating large training datasets with ML-accelerated hardware, such as GPUs and TPUs, can be crucial to scaling production applications efficiently. Again, same-country independent providers have an advantage over Tier-1 giants half a world away.
Finally, selecting a cloud provider is a mission-critical decision for clients, one associated with high costs and a long time horizon. Here, forward-looking cloud service providers can outperform Tier-1 providers with localized relationship management, customer support and technical support.
AI in the Cloud
According to Gartner’s 2020 CIO Survey, enterprises expect to double the number of AI projects in place within the next year, and over 40% of them plan to deploy new AI solutions by the end of 2020.1
As early as 2017, an MIT survey showed that ML and AI were the fastest growing cloud-based workloads.2 More recently, technology market research firm Omdia forecast that by 2025 AI will account for as much as 50 percent of total public cloud services revenue.3
As Omdia notes in their 2019 AI Market Forecasts report, AI adoption in the cloud means that, “essentially, another public cloud services market will be added on top of the current market.”
And these workloads are by no means locked in to Tier-1 cloud providers. GPU maker NVIDIA’s CEO, Jensen Huang, recently noted that the “flood” of AI workloads headed for the world’s data centers is increasingly spreading beyond the hyperscale cloud giants. According to Huang, ML will continue to grow as a share of the total computing power used inside cloud data centers, and he expects to see ML in data centers operated in every country and industry in the near future – from managed service providers to banks. “It’s going to be everywhere.”4
IT market researcher IDC estimates that global spending on AI systems will exceed US$79 billion by 2022, growing at an annual rate of 38 percent.5 Looking further ahead, market research firm Omdia projects that the AI software market will reach $118.6 billion by 2025.
Furthermore, researchers see this spending spread broadly across 30 major industries, including telecommunications, energy, manufacturing, banking and insurance – and, over time, across more geographically dispersed and disruption-prone industries such as advertising, education, legal, sports, consumer, entertainment, gaming, retail, fashion and healthcare.
According to Gartner VP David Cearley, “Over the next 10 years, virtually every app, application and service will incorporate some level of AI.”
“The global AI market is entering a new phase in 2020 where the narrative is shifting from asking whether or not AI is viable, to declaring that AI is now a requirement for most enterprises that are trying to compete on a global level,” according to Gartner. They also add that “through 2023, AI will be one of the top workloads that drive IT infrastructure decisions in large enterprises.”6
And while AI implementation is still accelerating within large enterprises as they move from pilot projects to broader deployment, the next wave of adoption will see a long tail of small and medium-sized businesses, local governments, startups and nonprofits begin to move the needle on overall spending in the sector. As in previous enterprise technology cycles, AI adoption was initially driven by Fortune 100 companies, major global banks and hyperscale technology leaders. But as AI-related technologies mature and use cases continue to be proven out, the scope of AI adoption is now broadening significantly.
This next leg of growth in AI adoption will have especially outsized implications for cloud service providers outside of the traditional big three of AWS, Azure and GCP.
Cloud AI Adoption: High volume, high performance and high sensitivity
In a recent survey of enterprise AI practitioners by supercomputer maker Cray, 32% of respondents said that data locality and security was their top priority in choosing infrastructure for enterprise AI workloads, but an even higher 39% named performance as the top priority.7
As we look at the requirements of developers seeking to move their AI/ML pipelines to the cloud, we must look beyond basic deployment questions and speed metrics to the broader ML workflow: data preparation and analysis, model training and evaluation, and tracking and understanding of ML artifacts and dependencies.
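To make these stages concrete, the minimal Python sketch below walks through one pass of such a workflow – data preparation, model training and evaluation, and artifact tracking. The dataset, model choice and file names are illustrative assumptions rather than a prescribed stack; in a production pipeline each stage would typically run as a separate, orchestrated step.

```python
# Minimal sketch of the ML workflow stages described above.
# Dataset, model and file names are placeholders, not a recommended stack.
import json
import joblib
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import Pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# 1. Data preparation and analysis: load and split a sample dataset.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# 2. Model training and evaluation: scale features, fit, score held-out data.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
accuracy = accuracy_score(y_test, model.predict(X_test))

# 3. Artifact and dependency tracking: persist the model plus run metadata.
joblib.dump(model, "model.joblib")
with open("run_metadata.json", "w") as f:
    json.dump(
        {
            "accuracy": accuracy,
            "n_train_samples": len(X_train),
            "pipeline_steps": [name for name, _ in model.steps],
        },
        f,
        indent=2,
    )
```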
Creating a cloud-based AI pipeline requires significant MLOps (DevOps for AI/ML) expertise and manpower to create, manage and maintain the environments for these ML workflows. Abstracting away the specifics of the physical and network infrastructure typically relies on orchestration systems such as Kubernetes or Docker Swarm, which themselves require specialized expertise to set up and maintain.
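As an illustration of the kind of infrastructure plumbing MLOps teams take on, the sketch below uses the official Kubernetes Python client to submit a single GPU-backed training job. The container image, namespace and resource sizes are hypothetical placeholders; a real environment would add storage volumes, secrets, monitoring and scheduling policies on top of this.

```python
# Sketch: submit one GPU training job to a Kubernetes cluster.
# Image name, namespace and resource sizes are hypothetical examples.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside the cluster

job = client.V1Job(
    api_version="batch/v1",
    kind="Job",
    metadata=client.V1ObjectMeta(name="train-demo"),
    spec=client.V1JobSpec(
        backoff_limit=1,
        template=client.V1PodTemplateSpec(
            spec=client.V1PodSpec(
                restart_policy="Never",
                containers=[
                    client.V1Container(
                        name="trainer",
                        image="registry.example.com/team/trainer:latest",  # placeholder image
                        command=["python", "train.py"],
                        resources=client.V1ResourceRequirements(
                            # Request one GPU and enough memory for the training run.
                            limits={"nvidia.com/gpu": "1", "memory": "16Gi"},
                        ),
                    )
                ],
            )
        ),
    ),
)

# Create the job in a dedicated namespace for ML workloads.
client.BatchV1Api().create_namespaced_job(namespace="ml-jobs", body=job)
```

Even this simplified example assumes a working cluster, GPU device plugins, image registries and credentials – exactly the operational burden that a well-designed regional cloud offering can take off customers' hands.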
Another key consideration that is often overlooked by companies early in their AI journey is co-location of data with the AI/ML-focused computational power needed for training.
How Independent Cloud Providers Get Their Piece of the Growing AI Market
Up to this point, second-tier cloud providers have seen their greatest market successes by operating and competing with the majors on the principles of regional presence, openness and simplicity. And these same principles will also be the key to cloud success in the AI space.
- Regional presence will be even more crucial in the AI cloud world than it was for traditional enterprise applications and storage, given data security regulations and the need for co-location of massive data storage and AI-accelerated hardware.
- Openness is a key value required to allow AI developers to use the tools of their choice in developing the applications of the future, and to allow for AI workloads to be transported between on-premise, cloud and hybrid environments. Major cloud providers fail in this respect as they continue to push their in-house solutions, tools and languages in order to force vendor lock-in.
- Simplicity will be essential in providing a robust and easy-to-use MLOps environment that frees expensive and limited developer talent to spend time creating AI solutions rather than managing cloud environments, operations and ML pipelines.
As more and more companies, from startups to enterprises, scale AI into production, there will be massive demand for AI/ML infrastructure and services.
While cloud giants seek to conquer this space with proprietary platforms and services that lock users into their offerings and toolsets, local cloud providers have a unique opportunity to allay data sovereignty and security fears while providing the openness, simplicity and regional presence their customers need. It is essential that forward-thinking local providers do not surrender this enormous market to foreign Tier-1 cloud giants. AI can and should be the game where the home team wins.