
Conversation

@Dima-Mediator
Contributor

Title

Capture Gemini reasoning tokens usage in streaming mode

Relevant issues

Fixes #10554

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory. Adding at least one test is a hard requirement - see details
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible, it only solves 1 specific problem

Type

🐛 Bug Fix

Changes

Captures Gemini (vertex_ai) usage tokens in streaming mode, reading thoughtsTokenCount from usageMetadata.
Currently (before this PR), Gemini reasoning tokens are ignored in streaming mode.

This is another attempt after #10666 (closed).
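A minimal sketch of the idea behind the change, not LiteLLM's actual implementation: pull the reasoning-token count out of a Gemini streaming chunk's usageMetadata instead of dropping it. The field names (promptTokenCount, candidatesTokenCount, thoughtsTokenCount) follow the Gemini API; the helper name and chunk shape below are illustrative assumptions.

```python
def extract_usage(chunk: dict) -> dict:
    """Map a Gemini streaming chunk's usageMetadata to token counts.

    Illustrative sketch only; LiteLLM's real code path differs.
    """
    meta = chunk.get("usageMetadata") or {}
    return {
        "prompt_tokens": meta.get("promptTokenCount", 0),
        "completion_tokens": meta.get("candidatesTokenCount", 0),
        # The point of this PR: thoughtsTokenCount was previously
        # ignored when responses arrived in streaming mode.
        "reasoning_tokens": meta.get("thoughtsTokenCount", 0),
    }


# The final chunk of a stream typically carries the usage metadata:
final_chunk = {
    "candidates": [{"content": {"parts": [{"text": "..."}]}}],
    "usageMetadata": {
        "promptTokenCount": 12,
        "candidatesTokenCount": 40,
        "thoughtsTokenCount": 85,
    },
}
print(extract_usage(final_chunk))
```

Intermediate chunks may omit usageMetadata entirely, which is why the helper falls back to an empty dict and zero counts.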

[Screenshot: new test passing locally, 2025-05-13 01:45:35]

@Dima-Mediator
Contributor Author

@krrishdholakia please review. You already reviewed a previous PR (#10666); this one just applies the corrections you indicated.

@krrishdholakia krrishdholakia merged commit 11740ce into BerriAI:main May 15, 2025
6 checks passed
@matannahmani

awesome work guys

