Google Gen AI

Adds instrumentation for the Google Gen AI SDK.

Import name: Sentry.googleGenAIIntegration

The googleGenAIIntegration adds instrumentation for the @google/genai SDK. It automatically wraps Google Gen AI client calls to capture spans and records LLM interactions, with configurable input and output recording.
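On server SDKs, the integration is registered through Sentry.init. The snippet below is a minimal sketch of that setup; the @sentry/node package, the placeholder DSN, and the gemini-2.0-flash model name are assumptions for illustration:

import * as Sentry from "@sentry/node"; // assumption: any server SDK that ships this integration
import { GoogleGenAI } from "@google/genai";

Sentry.init({
  dsn: "https://examplePublicKey@o0.ingest.sentry.io/0", // placeholder DSN
  // Sending default PII enables input/output recording by default
  sendDefaultPii: true,
  integrations: [Sentry.googleGenAIIntegration()],
});

// Calls made through the client are then traced automatically
const genAI = new GoogleGenAI({ apiKey: process.env.API_KEY });
const result = await genAI.models.generateContent({
  model: "gemini-2.0-flash", // assumption: any supported model
  contents: "Hello!",
});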

For Cloudflare Workers, you need to manually instrument the Google Gen AI client using the instrumentGoogleGenAIClient helper:

import * as Sentry from "@sentry/cloudflare";
import { GoogleGenAI } from "@google/genai";

const genAI = new GoogleGenAI({ apiKey: process.env.API_KEY });
const client = Sentry.instrumentGoogleGenAIClient(genAI, {
  recordInputs: true,
  recordOutputs: true,
});

// Use the wrapped client instead of the original genAI instance
const result = await client.models.generateContent({
  model: "gemini-2.0-flash", // any supported model
  contents: "Hello!",
});

Options

recordInputs

Type: boolean

Records inputs to Google Gen AI SDK method calls (such as prompts and messages).

Defaults to true if sendDefaultPii is true.

Sentry.init({
  integrations: [Sentry.googleGenAIIntegration({ recordInputs: true })],
});

recordOutputs

Type: boolean

Records outputs from Google Gen AI SDK method calls (such as generated text and responses).

Defaults to true if sendDefaultPii is true.

Sentry.init({
  integrations: [Sentry.googleGenAIIntegration({ recordOutputs: true })],
});

Supported Methods

By default, this integration adds tracing support to Google Gen AI SDK method calls, including:

  • models.generateContent() - Makes an API request to generate content with a given model.
  • models.generateContentStream() - Makes an API request to generate content with a given model and yields the response in chunks.
  • chats.create() - Creates chat sessions.
  • sendMessage() - Sends messages in chat sessions.
  • sendMessageStream() - Streams messages in chat sessions.

The integration will automatically detect streaming vs non-streaming requests and handle them appropriately.
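As a rough illustration of those call sites, the sketch below exercises the chat and streaming paths. The gemini-2.0-flash model name, the prompts, and the surrounding setup are assumptions; the exact @google/genai call shapes may vary between SDK versions:

import { GoogleGenAI } from "@google/genai";

const genAI = new GoogleGenAI({ apiKey: process.env.API_KEY });

// chats.create() and sendMessage() are among the wrapped methods,
// so each call produces a span
const chat = genAI.chats.create({ model: "gemini-2.0-flash" });
const reply = await chat.sendMessage({ message: "Summarize this release." });

// Streaming requests are detected automatically; consume the stream as usual
const stream = await genAI.models.generateContentStream({
  model: "gemini-2.0-flash",
  contents: "Write a haiku about tracing.",
});
for await (const chunk of stream) {
  console.log(chunk.text);
}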

Supported Versions

  • @google/genai: >=0.10.0 <2