
New tools in Google AI Studio to explore, debug and share logs

We're introducing a new logs and datasets feature in Google AI Studio to help developers assess the quality of AI outputs and build with more confidence.

A key challenge in developing AI-first applications is getting consistent, high-quality results - especially as you iterate and grow. These new tools improve observability and streamline your debugging workflows, giving you quick, clear insight into how your application is working for you and your end users. They also lay the groundwork for a broader set of evaluation capabilities.

The new logging and datasets tool in Google AI Studio

Easily track everything, no new code required

Setup is simple: just click "Enable logging" in the AI Studio dashboard to make API calls for your billing-enabled project visible there. This automatically tracks all supported GenerateContent API calls from that Cloud project - whether they're successful or not - creating a user interaction history for your AI systems without requiring any code changes.
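For context, here is a minimal sketch of the kind of GenerateContent call that would start appearing in your logs once logging is enabled. It uses the google-genai Python SDK; the model name and prompt are placeholders, not recommendations.

# Minimal sketch: a GenerateContent call captured by AI Studio logging once it
# is enabled for the billing-enabled project. Model and prompt are placeholders.
from google import genai

client = genai.Client(api_key="YOUR_API_KEY")  # key from the billing-enabled project

response = client.models.generate_content(
    model="gemini-2.5-flash",  # any supported Gemini model
    contents="Summarize this support ticket in one sentence: ...",
)

print(response.text)  # the request and its outcome are captured in the logs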

Logging is available at no monetary cost in all regions where the Gemini API is available. Use the logs table to view response codes and filter by status to quickly identify the logs you need to debug. You can also dive into specific log attributes, like inputs, outputs, and API tool usage, to trace a user complaint back to the exact model interaction. This makes debugging, testing, and refining your app much more effective.

Click "Enable Logging" and get interaction history for all API calls from there on

Turn insights into product excellence

Every user interaction is a chance to improve your product and the model's ability to deliver better responses. You can export your logs as specific datasets (in CSV or JSONL format) for testing and offline evaluation. By identifying examples in your logs where quality and performance dipped (or excelled), you can build a reliable and reproducible baseline of expected results.
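As a rough illustration, here is a small Python sketch that turns an exported JSONL dataset into a baseline of expected results. The file name and field names ("input", "output", "status_code") are assumptions about the export schema, so adjust them to match your own export.

# Rough sketch: build a baseline set from an exported JSONL dataset.
# File name and field names are assumptions - match them to your actual export.
import json

baseline = []
with open("ai_studio_dataset.jsonl", "r", encoding="utf-8") as f:
    for line in f:
        record = json.loads(line)
        # Keep successful interactions as expected-result examples;
        # failed calls can be set aside for debugging instead.
        if record.get("status_code") == 200:
            baseline.append({
                "input": record.get("input"),
                "expected_output": record.get("output"),
            })

print(f"Baseline of {len(baseline)} expected results")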

You can use these datasets for prompt refinement, performance tracking, and more. For example, you can use the Gemini Batch API to run batch evaluations against datasets built up over time; see the Datasets Cookbook for an example. This lets you test changes to your Gemini model selection or application logic before you deploy them to users.
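Below is a hedged sketch, in the spirit of the Datasets Cookbook, of how such a dataset could feed a batch evaluation with the Gemini Batch API. It reuses the baseline from the previous sketch, and the request schema, file name, model, and upload/batch calls are assumptions to verify against the current Batch API docs.

# Hedged sketch: re-run baseline inputs through the Gemini Batch API so new
# outputs can be compared against expected ones before deploying a change.
# Verify the request schema and the upload/batch calls against the Batch API docs.
import json
from google import genai

# Write one batch request per baseline example (inline text prompts).
with open("batch_requests.jsonl", "w", encoding="utf-8") as f:
    for i, example in enumerate(baseline):  # `baseline` from the previous sketch
        request = {
            "key": f"example-{i}",
            "request": {"contents": [{"parts": [{"text": example["input"]}]}]},
        }
        f.write(json.dumps(request) + "\n")

client = genai.Client(api_key="YOUR_API_KEY")
uploaded = client.files.upload(file="batch_requests.jsonl")
job = client.batches.create(model="gemini-2.5-flash", src=uploaded.name)
print(job.name)  # poll the job, then diff its outputs against expected_output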

You also have the option to share specific datasets with Google to provide feedback on end-to-end model behavior for your specific use case. Shared datasets will be used to improve and develop Google products and services, including improving and training our models.

Create datasets and filter by status, share feedback with Google and export as needed

Get started today

Start prototyping and building AI-first apps in the Google AI Studio Build mode. Once you enable logging at the project level, you can monitor your application from its first prototype all the way to production. Read more about the tools in our docs and join our Developer Forum to share feedback.

POSTED IN:
  • Developers
  • AI