04/22/2026 | Press release | Distributed by Public on 04/22/2026 06:44
Since last year's Cloud Next, the pace of technological change has only accelerated, and Google Cloud has incredible momentum.
Our first-party models now process more than 16 billion tokens per minute via direct API use by our customers, up from 10 billion last quarter. To support and drive this growth, we expect just over half of our overall machine-learning compute investment in 2026 to go toward the Cloud business, benefiting our cloud customers and partners.
You can read all about our momentum and the extraordinary range of partnerships and innovations we're announcing at Cloud Next.
I want to highlight just four key areas.
Last fall we introduced Gemini Enterprise, the end-to-end system for the agentic era - the connective tissue between your data, your people and your goals.
It has great momentum: In Q1, we saw 40% growth in paid monthly active users quarter-over-quarter.
Through this rapid growth, we've seen how every employee in every organization can become a builder. This is an incredible shift, but it comes with complexity. The conversation has gone from "Can we build an agent?" to "How do we manage thousands of them?"
That's why we're introducing our new Gemini Enterprise Agent Platform. It provides the secure, full-stack connective tissue you need to build, scale, govern and optimize your agents with confidence - a mission control for the agentic enterprise.
While AI can increase security risks, our Cloud customers now have AI on their side to protect their organizations. Today we are unveiling a range of new agentic solutions for threat detection, as part of an AI-powered cybersecurity platform that combines Google's Threat Intelligence and Security Operations with Wiz's Cloud and AI Security Platform.
In addition, we are launching Wiz's new AI Application Protection Platform (AI-APP), which provides autonomous protection, from code to cloud to runtime, across multicloud, hybrid and AI environments.
In the era of AI agents, infrastructure needs to evolve to take on the most demanding AI workloads. This year, we're bringing the eighth generation of our Tensor Processing Units to market with a dual-chip approach.
We'll offer these to Cloud customers as a core part of our selection of compute processors, along with a portfolio of NVIDIA GPU instances. Read more in our blog post.
To be the best partner, we always want to be "customer zero" for our own technologies. This helps us imagine, test, build and scale the best Google technologies for our cloud customers, for today and tomorrow. Our database service Bigtable, which powers so many Google services, and our TPUs, which have been so important in training and powering our Gemini models, are great examples.
Here are a few more recent ones:
First, coding.
Second, security.
Third, our operations.
Congratulations to our Google Cloud team, and a huge thanks to our partners who are building the future with us. We'll have a lot more to share on how we're bringing the latest technology to everyone at Google I/O on May 19.