Featuring:

Mike Gualtieri, VP, Principal Analyst, and Lee Sustar, Principal Analyst

Show Notes:

Who’s the biggest beneficiary of the ongoing AI boom? It’s not developers, marketing professionals, or customer service agents. It’s cloud providers. Much of the current AI and generative AI (genAI) boom wouldn’t be possible without the cloud, but is that relationship on solid ground? In this episode, Vice President and Principal Analyst Mike Gualtieri and Principal Analyst Lee Sustar discuss the benefits and challenges of AI in the cloud.

The episode starts with a discussion of the importance of data quality in AI workloads and how cloud providers can help address quality and readiness issues. Sustar says one trend he’s noticed recently from cloud providers is the evolution of genAI enablement platforms, which integrate data analytics and AI in a semi-packaged, workload-specific way.

The conversation then shifts to the opportunities and challenges for cloud providers and hyperscalers in the AI space. Sustar says cloud providers are making massive investments and developing infrastructure to catch up with the first mover in the cloud AI market, Amazon Web Services (AWS). “We know how expensive it is because Google reported forthrightly that they would lose hundreds of millions of dollars … because they had to run around the world matching AWS rack for rack,” he says. 

Will any new players enter this market? Gualtieri highlights one provider that focuses on AI and scientific workloads demanding high GPU capacity but agrees with Sustar’s assessment that it would be very difficult for a startup launched today to compete with the established cloud hyperscalers.

The mention of chips and processors briefly turns the conversation toward how the various cloud players, such as Google and Microsoft, are addressing the demand for AI chips, which are in short supply. Some are partnering with chipmakers such as NVIDIA, while others are developing their own chips, like Google’s new Axion CPU, announced at Google Cloud Next.

From there, the conversation turns to the cost of cloud AI. Gualtieri says that while many users equate cloud with cost savings, the truth is that cloud can be expensive, especially for “hot” workloads that leverage AI around the clock. Another factor that affects cloud cost is the location of the data the AI relies on: if the data resides in a data center while the AI inferencing happens in the cloud, that separation can add cost to the process. Both analysts offer examples of more efficient ways to run demanding AI workloads in the cloud.

After each analyst shares some pitfalls to avoid when leveraging AI in the cloud, the episode closes with predictions for the future of cloud AI. Sustar gives an update on the prediction that Oracle will take a more prominent place in the cloud market, while Gualtieri proposes a unique approach to managing cloud workloads in the future that you won’t want to miss.