
Conversation

@jsonbailey jsonbailey commented Nov 4, 2025

Note

Renames the metrics helper to getAIMetricsFromResponse across LangChain, OpenAI, and Vercel providers, updating code, tests, and docs; adds deprecated aliases for LangChain/OpenAI.

  • Providers:
    • LangChain: Replace createAIMetrics with getAIMetricsFromResponse in invokeModel; add deprecated createAIMetrics wrapper.
    • OpenAI: Replace createAIMetrics with getAIMetricsFromResponse in invokeModel; add deprecated createAIMetrics wrapper.
    • Vercel: Update usage to getAIMetricsFromResponse (no code changes shown beyond docs/tests).
  • Tests:
    • Update tests to target getAIMetricsFromResponse for all three providers.
  • Docs:
    • Update README examples to use getAIMetricsFromResponse in tracking calls.
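The rename-plus-deprecated-alias pattern described above can be sketched as follows. This is an illustrative assumption, not the providers' actual code: the response and metric shapes below are invented for the example, and only the two function names mirror the PR.

```typescript
// Hypothetical sketch of the rename pattern from this PR: the new helper
// does the work, and the old name survives as a deprecated wrapper so
// existing callers keep compiling. All field names here are assumptions.

interface LDAIMetrics {
  success: boolean;
  usage?: { total: number; input: number; output: number };
}

interface ProviderResponse {
  usage_metadata?: {
    total_tokens: number;
    input_tokens: number;
    output_tokens: number;
  };
}

// New name: extract AI metrics from a provider response.
function getAIMetricsFromResponse(response: ProviderResponse): LDAIMetrics {
  const u = response.usage_metadata;
  return {
    success: true,
    usage: u
      ? { total: u.total_tokens, input: u.input_tokens, output: u.output_tokens }
      : undefined,
  };
}

/** @deprecated Use getAIMetricsFromResponse instead. */
function createAIMetrics(response: ProviderResponse): LDAIMetrics {
  return getAIMetricsFromResponse(response);
}
```

Both names return the same value, so the rename is source-compatible for existing callers; editors surface the old name through the `@deprecated` JSDoc tag.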

Written by Cursor Bugbot for commit bfad20e.

@jsonbailey jsonbailey requested a review from a team as a code owner November 4, 2025 22:08

github-actions bot commented Nov 4, 2025

@launchdarkly/browser size report
This is the brotli compressed size of the ESM build.
Compressed size: 169118 bytes
Compressed size limit: 200000 bytes
Uncompressed size: 789399 bytes


github-actions bot commented Nov 4, 2025

@launchdarkly/js-sdk-common size report
This is the brotli compressed size of the ESM build.
Compressed size: 24988 bytes
Compressed size limit: 26000 bytes
Uncompressed size: 122411 bytes


github-actions bot commented Nov 4, 2025

@launchdarkly/js-client-sdk-common size report
This is the brotli compressed size of the ESM build.
Compressed size: 17636 bytes
Compressed size limit: 20000 bytes
Uncompressed size: 90259 bytes


github-actions bot commented Nov 4, 2025

@launchdarkly/js-client-sdk size report
This is the brotli compressed size of the ESM build.
Compressed size: 21721 bytes
Compressed size limit: 25000 bytes
Uncompressed size: 74698 bytes

@jsonbailey jsonbailey merged commit 05b4667 into main Nov 4, 2025
33 checks passed
@jsonbailey jsonbailey deleted the jb/rename-metric-method-providers branch November 4, 2025 22:18
@github-actions github-actions bot mentioned this pull request Nov 4, 2025
jsonbailey added a commit that referenced this pull request Nov 5, 2025
🤖 I have created a release *beep* *boop*

---

<details><summary>server-sdk-ai: 0.13.0</summary>

## [0.13.0](server-sdk-ai-v0.12.3...server-sdk-ai-v0.13.0) (2025-11-04)

### Features

* Add support for trackStreamMetricsOf method ([#971](#971)) ([e18979e](e18979e))

### Bug Fixes

* Deprecated toVercelAISDK and trackVercelAISDKStreamTextMetrics; use the `@launchdarkly/server-sdk-ai-vercel` package ([e18979e](e18979e))

</details>

<details><summary>server-sdk-ai-langchain: 0.2.0</summary>

## [0.2.0](server-sdk-ai-langchain-v0.1.3...server-sdk-ai-langchain-v0.2.0) (2025-11-04)

### Features

* Renamed createAIMetrics to getAIMetricsFromResponse ([#977](#977)) ([05b4667](05b4667))

### Dependencies

* The following workspace dependencies were updated
  * devDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.3 to ^0.13.0

</details>

<details><summary>server-sdk-ai-openai: 0.2.0</summary>

## [0.2.0](server-sdk-ai-openai-v0.1.2...server-sdk-ai-openai-v0.2.0) (2025-11-04)

### Features

* Renamed createAIMetrics to getAIMetricsFromResponse ([#977](#977)) ([05b4667](05b4667))

### Dependencies

* The following workspace dependencies were updated
  * devDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.3 to ^0.13.0

</details>

<details><summary>server-sdk-ai-vercel: 0.2.0</summary>

## [0.2.0](server-sdk-ai-vercel-v0.1.2...server-sdk-ai-vercel-v0.2.0) (2025-11-04)

### ⚠ BREAKING CHANGES

* VercelProvider now requires type-safe parameters for Vercel models

### Features

* Add support for tracking streaming text metrics ([28d3650](28d3650))
* Add toVercelAISDK method to support easy model creation ([#972](#972)) ([28d3650](28d3650))
* Renamed createAIMetrics to getAIMetricsFromResponse ([#977](#977)) ([05b4667](05b4667))

### Bug Fixes

* Check finishReason for an error when determining model success ([28d3650](28d3650))
* Prefer totalUsage over usage when mapping to LDTokenUsage ([28d3650](28d3650))
* Properly convert LD model parameters to Vercel model parameters ([28d3650](28d3650))
* VercelProvider now requires type-safe parameters for Vercel models ([28d3650](28d3650))

### Dependencies

* The following workspace dependencies were updated
  * devDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.3 to ^0.13.0

</details>

---

This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).

---

> [!NOTE]
> Release bumps: server AI SDK to 0.13.0 with stream metrics; LangChain/OpenAI/Vercel providers to 0.2.0 including the metric API rename and Vercel type-safe params plus fixes.
>
> - **AI SDK (`packages/sdk/server-ai`) — 0.13.0**
>   - Feature: add `trackStreamMetricsOf`.
>   - Fix: deprecate `toVercelAISDK` and related helpers (moved to the Vercel provider).
> - **AI Providers — 0.2.0**
>   - `server-ai-langchain`/`server-ai-openai`:
>     - Rename `createAIMetrics` to `getAIMetricsFromResponse`.
>   - `server-ai-vercel`:
>     - Breaking: require type-safe params for Vercel models.
>     - Features: streaming text metrics tracking; `toVercelAISDK` helper.
>     - Fixes: check `finishReason` for errors; prefer `totalUsage`; correct LD→Vercel param mapping.
> - **Examples/Manifest**
>   - Update versions to `@launchdarkly/server-sdk-ai@0.13.0` and providers `@0.2.0` in examples and `.release-please-manifest.json`.
>
> <sup>Written by Cursor Bugbot for commit e2b5498.</sup>

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: jsonbailey <jbailey@launchdarkly.com>
