
Conversation

@RulaKhaled (Member)

Adds support for Google GenAI manual instrumentation in @sentry/cloudflare and @sentry/vercel-edge.
To instrument the Google GenAI client, wrap it with Sentry.instrumentGoogleGenAIClient and pass the desired recording options.

import * as Sentry from '@sentry/cloudflare';
import { GoogleGenAI } from '@google/genai';

const genAI = new GoogleGenAI({ apiKey: 'your-api-key' });
const client = Sentry.instrumentGoogleGenAIClient(genAI, {
  recordInputs: true,
  recordOutputs: true,
});

// Use the wrapped client with the Models API
const generateResponse = await client.models.generateContent({
  model: 'gemini-1.5-pro',
  contents: [{ role: 'user', parts: [{ text: 'Hello!' }] }],
});

// Or use the chat functionality
const chat = client.chats.create({ model: 'gemini-1.5-flash' });
const chatResponse = await chat.sendMessage({ message: 'Tell me a joke' });
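For additional context, here is a minimal sketch of how the wrapped client might be used end to end inside a Cloudflare Worker that is initialized with Sentry.withSentry from @sentry/cloudflare. The Env bindings (SENTRY_DSN, GOOGLE_API_KEY) and the response handling are illustrative assumptions, not part of this PR.

import * as Sentry from '@sentry/cloudflare';
import { GoogleGenAI } from '@google/genai';

// Assumed Worker bindings; names are placeholders for this sketch.
// Requires @cloudflare/workers-types for the ExportedHandler type.
interface Env {
  SENTRY_DSN: string;
  GOOGLE_API_KEY: string;
}

export default Sentry.withSentry(
  (env: Env) => ({
    dsn: env.SENTRY_DSN,
    tracesSampleRate: 1.0,
  }),
  {
    async fetch(request, env): Promise<Response> {
      // Wrap the GenAI client so its calls are recorded on the request's trace.
      const genAI = new GoogleGenAI({ apiKey: env.GOOGLE_API_KEY });
      const client = Sentry.instrumentGoogleGenAIClient(genAI, {
        recordInputs: true,
        recordOutputs: true,
      });

      const result = await client.models.generateContent({
        model: 'gemini-1.5-flash',
        contents: [{ role: 'user', parts: [{ text: 'Hello!' }] }],
      });

      return new Response(JSON.stringify(result), {
        headers: { 'content-type': 'application/json' },
      });
    },
  } satisfies ExportedHandler<Env>,
);

For @sentry/vercel-edge, the same instrumentGoogleGenAIClient call applies, with the client wrapped inside the edge function instead.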
@github-actions (Contributor)

node-overhead report 🧳

Note: This is a synthetic benchmark with a minimal express app and does not necessarily reflect the real-world performance impact in an application.

| Scenario | Requests/s | % of Baseline | Prev. Requests/s | Change % |
| --- | --- | --- | --- | --- |
| GET Baseline | 9,026 | - | 8,714 | +4% |
| GET With Sentry | 1,424 | 16% | 1,412 | +1% |
| GET With Sentry (error only) | 6,107 | 68% | 6,096 | +0% |
| POST Baseline | 1,202 | - | 1,205 | -0% |
| POST With Sentry | 518 | 43% | 537 | -4% |
| POST With Sentry (error only) | 1,064 | 89% | 1,042 | +2% |
| MYSQL Baseline | 3,344 | - | 3,297 | +1% |
| MYSQL With Sentry | 491 | 15% | 437 | +12% |
| MYSQL With Sentry (error only) | 2,708 | 81% | 2,709 | -0% |

View base workflow run

@RulaKhaled RulaKhaled merged commit 8298495 into develop Sep 22, 2025
188 checks passed
@RulaKhaled RulaKhaled deleted the export-google-genai-vercel-and-cloudflare branch September 22, 2025 12:16
mydea added a commit that referenced this pull request Sep 24, 2025
Rewording this as it is not actually correct, this is not auto-instrumented but needs to be done manually - see #17723
