Sales in Amazon's cloud division, Amazon Web Services, rose 12% to $23.1 billion in the third quarter, the company said on Thursday, in line with expectations from analysts polled by FactSet. During Amazon's earnings call, CEO Andrew Jassy emphasized the work Amazon has been doing in AI.
Generative AI is "top of mind for most companies," Jassy said. The company is investing in the computing power behind large language models at three layers of the stack: the compute used to train and run LLMs, LLMs hosted as a managed service, and applications built on top of LLMs.
At the compute layer, Jassy pointed to two products, AWS Inferentia and AWS Trainium, which the company markets as "accelerators" for machine learning tasks. The products are custom chips designed to run inference (Inferentia) and training (Trainium) workloads efficiently.
In the example of a chatbot, an inference task is the computation that happens each time the model generates a response to a prompt. A training task, as the name suggests, is the computation required to process large amounts of training data to create the model in the first place.
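To make that distinction concrete, the minimal PyTorch sketch below contrasts the two kinds of computation. It is not Amazon code; the tiny model and random data are placeholders standing in for a real language model and real training examples.

```python
# Minimal sketch contrasting a training step with an inference step.
# The tiny linear model and random tensors are placeholders only.
import torch
import torch.nn as nn

model = nn.Linear(16, 4)                       # stand-in for a language model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Training task: forward pass, loss, backward pass, weight update.
inputs = torch.randn(8, 16)                    # a batch of training examples
targets = torch.randint(0, 4, (8,))            # their labels
loss = loss_fn(model(inputs), targets)
loss.backward()                                # compute gradients
optimizer.step()                               # update the model's weights
optimizer.zero_grad()

# Inference task: a single forward pass to produce an output, no gradients.
with torch.no_grad():
    prompt = torch.randn(1, 16)                # stand-in for an encoded prompt
    response = model(prompt).argmax(dim=-1)    # stand-in for a generated reply
```

Chips like Trainium target the first kind of workload, which is gradient-heavy and runs over huge datasets, while Inferentia targets the second, which has to be cheap and fast for every user request.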
At the level of hosting LLMs, which Jassy described as "large language models as a service," the company offers Amazon Bedrock. The managed service gives customers access to foundation models from providers such as Cohere, Anthropic and, soon, Meta (the company that owns Facebook).
"Customers can take those models, customize them using their own data — without leaking that data back into the generalized LLM — and have access to the same security, access control and features that they have with the rest of their applications within AWS, all through a managed service," Jassy said.
As companies, including banks, continue to weigh which models to use, which best fit their various purposes and how to fine-tune them, Amazon's managed AI services serve as a useful testing ground.
On the earnings call, Jassy emphasized new and existing AI models that banks and other customers could use to help developers, customer service agents and workers in general become more efficient.
"Bedrock helps customers with this fluidity," Jassy said, allowing them to "rapidly experiment with and move between model types and sizes and enabling them to pick the right tool for the right job."
At the layer of applications built on top of LLMs, Amazon offers its own AI coding companion, Amazon CodeWhisperer. The product can securely access an enterprise's code base, allowing it to provide better suggestions and write better code, Jassy said.
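CodeWhisperer is delivered through IDE integrations rather than a single callable API, so the snippet below is only a hypothetical illustration of the comment-driven completion style such assistants offer: a developer writes the comment and function signature, and the assistant proposes a body along these lines.

```python
# Hypothetical illustration of assistant-style completion; not actual
# CodeWhisperer output. The developer writes the comment and signature,
# and the assistant suggests the implementation.
import boto3

# Upload a local file to an S3 bucket with server-side encryption enabled.
def upload_encrypted(bucket: str, key: str, path: str) -> None:
    s3 = boto3.client("s3")
    s3.upload_file(
        Filename=path,
        Bucket=bucket,
        Key=key,
        ExtraArgs={"ServerSideEncryption": "AES256"},
    )
```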
During his comments, Jassy made clear that while generative AI has become a major focus for AWS, all parts of the company are working on generative AI applications to transform their customers' experiences, including providing better product recommendations; forecasting inventory at various locations; making it easier for sellers to create new product pages by entering less information and letting the models do the rest; and more.
Overall, Amazon beat expectations in the third quarter, driven by higher-than-expected profits. Analysts had expected an average of $7.8 billion in earnings before interest and taxes (EBIT), according to S&P Global Market Intelligence; actual EBIT was $11.2 billion. Amazon's revenue during the quarter was $143 billion, roughly in line with analyst expectations.
AWS's third-quarter results also do not reflect several cloud deals the company signed late in the quarter, according to Jassy.
"We signed several new deals in September with an effective date in October that won't show up in any GAAP reported number for Q3, but the collection of which is higher than our total reported deal volume for all of Q3," he said. "Deal signings are always lumpy and the revenue happens over several years, but we like the recent deal momentum we're seeing."