The AI growth potential of Amazon.com's cloud service AWS is severely underestimated: the API business of its close partner Anthropic is contributing significant revenue to AWS.
An analyst report published on September 3 argues that Anthropic's deep cooperation with Amazon.com's AWS is giving the cloud giant significant growth momentum that the market has not yet fully recognized. If AWS can retain Anthropic's training workloads, fourth-quarter revenue growth could come in above expectations.
Analysts estimate that Anthropic currently (as of Q2 2025) contributes roughly 1 percentage point of growth to AWS, but with Claude 5 training and existing inference revenue as dual drivers, that contribution could rise to about 4 percentage points per quarter. Crucially, Anthropic's API business has already surpassed OpenAI's in scale and is growing considerably faster.
The report states that Anthropic will bring approximately $1.6 billion in inference revenue to AWS in 2025, with Anthropic's annual recurring revenue (ARR) expected to surge from $1 billion at the start of the year to $9 billion by year-end. However, industry complaints about restrictions on accessing Anthropic models through AWS's Bedrock platform suggest the partnership may face some friction.
Barclays maintains its "Overweight" rating on Amazon.com with a $275 price target, implying 21.7% upside from the current share price.
**Anthropic API Business Scale Already Surpasses OpenAI**
According to Barclays analysis, Anthropic has established a significant advantage over OpenAI in the API business sector.
Data shows that 90% of Anthropic's revenue comes from its API business, versus only 26% for OpenAI, whose revenue still relies mainly on the consumer ChatGPT product.
Specifically, Anthropic's API revenue was $512 million in 2024 and is expected to surge to $3.907 billion in 2025, a 662% year-over-year increase. By comparison, OpenAI's API revenue was $1 billion in 2024 and is expected to grow 80% to $1.8 billion in 2025.
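The growth rates above can be recomputed from the report's rounded revenue figures; a small sketch (the slight gap versus the stated 662% likely reflects unrounded underlying data):

```python
# Recompute the year-over-year API growth rates from the report's
# (rounded) revenue figures, in millions of dollars.

def yoy_growth_pct(prev: float, curr: float) -> float:
    """Year-over-year growth in percent."""
    return (curr / prev - 1) * 100

anthropic = yoy_growth_pct(512, 3907)   # 2024 -> 2025
openai = yoy_growth_pct(1000, 1800)

print(f"Anthropic API YoY: {anthropic:.0f}%")  # ~663%; report cites 662%
print(f"OpenAI API YoY:    {openai:.0f}%")     # 80%
```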
This difference mainly stems from the explosive growth of AI Integrated Development Environment (IDE) applications.
AI programming tools like Cursor and Lovable license models through Anthropic's direct API, paying per million tokens consumed. Barclays estimates that each Cursor Pro user contributes roughly $5 in monthly revenue to AWS.
The launch of models like Claude 3.5 (released in June 2024) and Claude 3.7 (released in February 2025), particularly the latter's incorporation of reasoning and chain-of-thought capabilities, has driven Anthropic's leading position in the API field.
The report points out that AI Integrated Development Environments as a category are expected to exceed $1 billion ARR in 2025, while this figure was almost zero in 2024.
**AWS Expected to Achieve Above-Expectation Growth in Q4**
Barclays expects that if AWS can maintain its cooperation with Anthropic, fourth-quarter revenue growth will come in roughly 2 percentage points above market expectations.
Market consensus currently calls for 18% AWS revenue growth in the fourth quarter, but Anthropic's contribution could push actual growth meaningfully above that.
Analysts note that Anthropic may begin pre-training Claude 5 in the fourth quarter, which would contribute roughly 1.5 percentage points of growth to AWS. Combined with inference revenue, Anthropic's total growth contribution to AWS could reach 4 percentage points.
The research report states that Barclays' model assumes AWS's non-AI workloads keep growing 14%-16% year over year, consistent with recent quarters. On that base, Anthropic's additional contribution would significantly lift overall growth.
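The growth math above can be sketched in a rough back-of-the-envelope calculation. This is an illustration consistent with the report's stated figures, not Barclays' actual model, and it treats Anthropic as the only incremental AI contribution:

```python
# Figures from the report: consensus Q4 growth, Claude 5 training
# contribution, and Anthropic's total contribution (percentage points).
consensus_growth = 18.0
training_contrib = 1.5
total_anthropic_contrib = 4.0
inference_contrib = total_anthropic_contrib - training_contrib  # ~2.5pp

# Barclays assumes non-AI workloads grow 14-16% year over year.
for non_ai in (14.0, 16.0):
    implied = non_ai + total_anthropic_contrib
    print(f"non-AI {non_ai:.0f}% + Anthropic {total_anthropic_contrib:.0f}pp "
          f"= {implied:.0f}% (beat vs consensus: "
          f"{implied - consensus_growth:+.0f}pp)")
```

At the top of the assumed 14%-16% non-AI range, the implied growth of ~20% lines up with the roughly 2-point beat the report describes.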
**AI Capacity Expansion Supports Long-term Growth Prospects**
To support the rapid growth of AI business, AWS is significantly expanding AI computing capacity.
Barclays estimates that AWS could have over 1 million H100-equivalents of AI compute capacity by the end of 2025, helped by Blackwell GPU deployments and 400,000 Trainium chips under "Project Rainier".
However, whether AWS has enough AI capacity to support all of Anthropic's growth remains an open question. On the second-quarter earnings call, Amazon.com management repeated the standard line that demand exceeds supply, without offering much new reassurance.
Barclays' analysis shows that AI capacity added since ChatGPT's launch is expected to exceed 1 million H100-equivalents of compute by the end of 2025, an expansion crucial to supporting fast-growing partners like Anthropic.
**Deep Cooperation Faces Potential Challenges**
Barclays states that despite the significant benefits brought by Anthropic's cooperation with AWS, the relationship between both parties also faces some tests.
According to earlier media reports, there have been industry complaints about access to Anthropic models through AWS's Bedrock platform, suggesting some degree of friction in the AWS-Anthropic relationship.
More notably, important customers such as Cursor (one of Anthropic's largest IDE clients) have begun adopting OpenAI's GPT-5 API as their default model. Users can manually switch back to Anthropic, but such defaults are far less sticky than white-label paid-search defaults in the consumer sector.
Despite some uncertainties, Barclays remains optimistic about AWS's long-term prospects in the AI field.
Analysts point out that while the "broad expansion" phase of AI deployment has not truly arrived in 2025, a handful of large AI labs currently generate most of the AI revenue flowing to hyperscale cloud providers, and AWS's deep cooperation with Anthropic places it at the center of that trend.
It's worth noting that Barclays' analysis is based on the assumption that 70% of Anthropic's revenue is hosted on AWS, with the remaining 30% on Google Cloud Platform (GCP).
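If that 70/30 split is assumed to apply uniformly to inference revenue (an extrapolation beyond what the report states explicitly), the ~$1.6 billion Barclays attributes to AWS implies totals along these lines:

```python
# Assumption from the report: 70% of Anthropic's revenue is hosted on
# AWS, 30% on GCP. Applying that split to inference revenue is an
# illustrative extrapolation, not a figure from the report.
aws_share, gcp_share = 0.70, 0.30
aws_inference_rev = 1.6   # $B attributed to AWS in 2025

total_inference_rev = aws_inference_rev / aws_share   # implied total
gcp_inference_rev = total_inference_rev * gcp_share   # implied GCP share

print(f"implied total inference revenue: ${total_inference_rev:.2f}B")
print(f"implied GCP-hosted share:        ${gcp_inference_rev:.2f}B")
```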
Anthropic is the only AI lab with three separate infrastructure teams managing training and inference across three architectures: GPU, TPU, and Trainium, giving it flexibility on cost and availability.