Conversation
@jsonbailey jsonbailey commented Oct 29, 2025

Note

Introduces invokeStructuredModel for structured outputs and wraps invokeModel with try/catch to return consistent failure responses with logging; updates tests accordingly.

  • LangChainProvider (src/LangChainProvider.ts):
    • Structured Output: Add invokeStructuredModel(messages, responseStructure) using withStructuredOutput(...), returning { data, rawResponse, metrics } and handling errors with warnings and failure metrics.
    • Error Handling: Wrap invokeModel in try/catch; on failure, log warning and return empty assistant message with success=false.
    • Types: Import StructuredResponse type.
    • Model Init: Minor option reordering in createLangChainModel (spread parameters before modelProvider).
  • Tests (__tests__/LangChainProvider.test.ts):
    • Add tests for invokeModel error path (logs and failure response).
    • Add tests validating invokeStructuredModel success and error behaviors.
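The error-handling pattern the summary describes can be sketched roughly as follows. This is an illustrative stand-in, not the SDK's actual implementation: the type names (`LDMessage`, `ChatResponse`, `StructuredResponse`) and the minimal `Model` interfaces are hypothetical placeholders, and the real provider delegates to LangChain's `withStructuredOutput(...)` on an actual chat model.

```typescript
// Hypothetical sketch of the behavior described above; names and shapes are
// illustrative, not the actual LangChainProvider implementation.

interface LDMessage {
  role: 'assistant' | 'user' | 'system';
  content: string;
}

interface ChatResponse {
  message: LDMessage;
  metrics: { success: boolean };
}

interface StructuredResponse<T> {
  data: T;
  rawResponse: string;
  metrics: { success: boolean };
}

// Minimal stand-ins for a LangChain chat model's surface.
type Model = { invoke: (messages: LDMessage[]) => Promise<string> };
type StructuredModel = {
  withStructuredOutput: (schema: object) => {
    invoke: (messages: LDMessage[]) => Promise<unknown>;
  };
};

// invokeModel: on failure, log a warning and return a consistent
// failure response (empty assistant message, success=false).
async function invokeModel(model: Model, messages: LDMessage[]): Promise<ChatResponse> {
  try {
    const content = await model.invoke(messages);
    return { message: { role: 'assistant', content }, metrics: { success: true } };
  } catch (err) {
    console.warn('LangChainProvider: model invocation failed', err);
    return { message: { role: 'assistant', content: '' }, metrics: { success: false } };
  }
}

// invokeStructuredModel: bind the response schema, invoke, and return
// { data, rawResponse, metrics }; on failure, warn and report failure metrics.
async function invokeStructuredModel<T>(
  model: StructuredModel,
  messages: LDMessage[],
  responseStructure: object,
): Promise<StructuredResponse<T | undefined>> {
  try {
    const structured = model.withStructuredOutput(responseStructure);
    const data = (await structured.invoke(messages)) as T;
    return { data, rawResponse: JSON.stringify(data), metrics: { success: true } };
  } catch (err) {
    console.warn('LangChainProvider: structured model invocation failed', err);
    return { data: undefined, rawResponse: '', metrics: { success: false } };
  }
}
```

The key design point is that neither method lets a provider exception escape: callers always receive a response object whose `metrics.success` flag distinguishes the two paths, which is what the updated tests exercise.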

Written by Cursor Bugbot for commit 7560025. This will update automatically on new commits.

@jsonbailey jsonbailey requested a review from a team as a code owner October 29, 2025 19:33
@jsonbailey (Contributor, Author) commented:

This PR depends on #969 being merged first.

@github-actions commented:

@launchdarkly/browser size report
This is the brotli compressed size of the ESM build.
Compressed size: 169118 bytes
Compressed size limit: 200000 bytes
Uncompressed size: 789399 bytes

@github-actions commented:

@launchdarkly/js-sdk-common size report
This is the brotli compressed size of the ESM build.
Compressed size: 24988 bytes
Compressed size limit: 26000 bytes
Uncompressed size: 122411 bytes

@github-actions commented:

@launchdarkly/js-client-sdk size report
This is the brotli compressed size of the ESM build.
Compressed size: 21721 bytes
Compressed size limit: 25000 bytes
Uncompressed size: 74698 bytes

@github-actions commented:

@launchdarkly/js-client-sdk-common size report
This is the brotli compressed size of the ESM build.
Compressed size: 17636 bytes
Compressed size limit: 20000 bytes
Uncompressed size: 90259 bytes

@jsonbailey jsonbailey changed the title from "feat!: Add invokeStructuredModel method to support new Judge online evals" to "feat!: Support invoke with structured output in LangChain provider" Nov 5, 2025
@jsonbailey jsonbailey merged commit 0427908 into main Nov 5, 2025
32 checks passed
@jsonbailey jsonbailey deleted the jb/sdk-1522/structed-model-provider-langchain branch November 5, 2025 21:29
@github-actions github-actions bot mentioned this pull request Nov 5, 2025
jsonbailey added a commit that referenced this pull request Nov 6, 2025
🤖 I have created a release *beep* *boop*

---

<details><summary>server-sdk-ai: 0.14.0</summary>

## [0.14.0](server-sdk-ai-v0.13.0...server-sdk-ai-v0.14.0) (2025-11-06)

### ⚠ BREAKING CHANGES

* Removed deprecated Vercel methods ([#983](#983))
* Add support for real time judge evals ([#969](#969))
* AI Config defaults require the "enabled" attribute
* Renamed LDAIAgentConfig to LDAIAgentConfigRequest for clarity
* Renamed LDAIAgent to LDAIAgentConfig (note the previous use of this name)
* Renamed LDAIAgentDefault to LDAIAgentConfigDefault for clarity
* Renamed LDAIDefaults to LDAICompletionConfigDefault for clarity

### Features

* Add support for real time judge evals ([#969](#969)) ([6ecd9ab](6ecd9ab))
* Added createJudge method ([6ecd9ab](6ecd9ab))
* Added judgeConfig method to AI SDK to retrieve an AI Judge Config ([6ecd9ab](6ecd9ab))
* Added trackEvalScores method to config tracker ([6ecd9ab](6ecd9ab))
* Chat will evaluate responses with configured judges ([6ecd9ab](6ecd9ab))
* Include AI SDK version in tracking information ([#985](#985)) ([ef90564](ef90564))
* Removed deprecated Vercel methods ([#983](#983)) ([960a499](960a499))

### Bug Fixes

* AI Config defaults require the "enabled" attribute ([6ecd9ab](6ecd9ab))
* Renamed LDAIAgent to LDAIAgentConfig (note the previous use of this name) ([6ecd9ab](6ecd9ab))
* Renamed LDAIAgentConfig to LDAIAgentConfigRequest for clarity ([6ecd9ab](6ecd9ab))
* Renamed LDAIAgentDefault to LDAIAgentConfigDefault for clarity ([6ecd9ab](6ecd9ab))
* Renamed LDAIDefaults to LDAICompletionConfigDefault for clarity ([6ecd9ab](6ecd9ab))

</details>

<details><summary>server-sdk-ai-langchain: 0.3.0</summary>

## [0.3.0](server-sdk-ai-langchain-v0.2.0...server-sdk-ai-langchain-v0.3.0) (2025-11-06)

### ⚠ BREAKING CHANGES

* Support invoke with structured output in LangChain provider ([#970](#970))

### Features

* Support invoke with structured output in LangChain provider ([#970](#970)) ([0427908](0427908))

### Dependencies

* The following workspace dependencies were updated
  * devDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.13.0 to ^0.14.0
  * peerDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.2 to ^0.14.0

</details>

<details><summary>server-sdk-ai-openai: 0.3.0</summary>

## [0.3.0](server-sdk-ai-openai-v0.2.0...server-sdk-ai-openai-v0.3.0) (2025-11-06)

### ⚠ BREAKING CHANGES

* Support invoke with structured output in OpenAI provider ([#980](#980))

### Features

* Support invoke with structured output in OpenAI provider ([#980](#980)) ([515dbdf](515dbdf))

### Dependencies

* The following workspace dependencies were updated
  * devDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.13.0 to ^0.14.0
  * peerDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.2 to ^0.14.0

</details>

<details><summary>server-sdk-ai-vercel: 0.3.0</summary>

## [0.3.0](server-sdk-ai-vercel-v0.2.0...server-sdk-ai-vercel-v0.3.0) (2025-11-06)

### ⚠ BREAKING CHANGES

* Support invoke with structured output in VercelAI provider ([#981](#981))

### Features

* Support invoke with structured output in VercelAI provider ([#981](#981)) ([d0cb41d](d0cb41d))

### Dependencies

* The following workspace dependencies were updated
  * devDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.13.0 to ^0.14.0
  * peerDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.2 to ^0.14.0

</details>

---

This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).

> [!NOTE]
> Release server-ai 0.14.0 (judge evals, breaking renames/removals) and update LangChain/OpenAI/Vercel providers to 0.3.0 with structured output; refresh examples and manifests to new versions.
>
> - **SDK (`packages/sdk/server-ai`) — `0.14.0`**
>   - Adds real-time judge evaluations and related APIs (`createJudge`, `judgeConfig`, `trackEvalScores`); includes SDK version in tracking.
>   - Breaking: removes deprecated Vercel methods; requires `enabled` in AI Config defaults; renames several AI config types.
> - **AI Providers — `0.3.0`**
>   - `@launchdarkly/server-sdk-ai-langchain`, `-openai`, `-vercel`: add structured output support for `invoke` (breaking changes).
>   - Bump peer/dev dependency on `@launchdarkly/server-sdk-ai` to `^0.14.0`.
> - **Examples**
>   - Update example apps to use `@launchdarkly/server-sdk-ai@0.14.0` and provider packages `^0.3.0`.
> - **Release metadata**
>   - Update `.release-please-manifest.json` with new versions.
>
> <sup>Written by Cursor Bugbot for commit 00cd808. This will update automatically on new commits.</sup>

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: jsonbailey <jbailey@launchdarkly.com>