[Bug]: Losing tool name in tool_calls when streaming choice contains both text and tool_calls #18238

@a5223594

Description

What happened?

In litellm/llms/anthropic/experimental_pass_through/adapters/transformation.py, the method _translate_streaming_openai_chunk_to_anthropic has a bug: when a streaming choice contains both text content and tool_calls, only the text is handled and the tool_calls are ignored, so the tool name inside tool_calls is lost. For example, when the streaming choice contains:

```python
StreamingChoices(finish_reason=None, index=0, delta=Delta(provider_specific_fields=None, content='首先初始化项目:', role=None, function_call=None, tool_calls=[ChatCompletionDeltaToolCall(id='toolu_bdrk_013xRVejhv3ybmLEGCoZib2b', function=Function(arguments='', name='Bash'), type='function', index=0)], audio=None), logprobs=None)
```

the code only processes the text and ignores the tool_calls, so the tool name is not captured as expected.
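To make the failure mode concrete, here is a standalone reproduction sketch. It rebuilds the chunk shown above and runs it through the same if/elif control flow as the adapter; the tool name 'Bash' is never captured. The import path litellm.types.utils is an assumption about where these classes live, and tool_name is an accumulator added only for illustration.

```python
# Hedged reproduction sketch; the import location is an assumption based on
# the class names appearing in the chunk repr above.
from litellm.types.utils import (
    ChatCompletionDeltaToolCall,
    Delta,
    Function,
    StreamingChoices,
)

choices = [
    StreamingChoices(
        index=0,
        delta=Delta(
            content="首先初始化项目:",  # "First, initialize the project:"
            tool_calls=[
                ChatCompletionDeltaToolCall(
                    id="toolu_bdrk_013xRVejhv3ybmLEGCoZib2b",
                    function=Function(arguments="", name="Bash"),
                    type="function",
                    index=0,
                )
            ],
        ),
    )
]

# Same if/elif shape as the adapter: content is non-empty, so the elif
# branch never runs and the tool name is silently dropped.
text = ""
tool_name = None
for choice in choices:
    if choice.delta.content is not None and len(choice.delta.content) > 0:
        text += choice.delta.content
    elif choice.delta.tool_calls is not None:
        for tool in choice.delta.tool_calls:
            if tool.function is not None and tool.function.name:
                tool_name = tool.function.name

print(text)       # 首先初始化项目:
print(tool_name)  # None -- the tool name was lost
```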

Relevant log output

Relevant code location:

File: litellm/llms/anthropic/experimental_pass_through/adapters/transformation.py
Method: _translate_streaming_openai_chunk_to_anthropic

```python
for choice in choices:
    if choice.delta.content is not None and len(choice.delta.content) > 0:
        text += choice.delta.content
    elif choice.delta.tool_calls is not None:
        partial_json = ""
        for tool in choice.delta.tool_calls:
            if (
                tool.function is not None
                and tool.function.arguments is not None
            ):
```

Bug: only `text` is handled, while `tool_calls` are ignored when both are present.
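One possible direction for a fix, shown as a hedged sketch rather than the project's actual patch: check content and tool_calls independently instead of chaining them with elif, so a chunk carrying both keeps the tool name. The function name translate_chunk_sketch and the tool_name accumulator are hypothetical; choices is the same list of StreamingChoices the adapter iterates over.

```python
def translate_chunk_sketch(choices):
    """Sketch of a possible fix: content and tool_calls are checked
    independently instead of via if/elif, so a chunk carrying both
    keeps the tool name."""
    text = ""
    partial_json = ""
    tool_name = None  # hypothetical accumulator for the recovered tool name
    for choice in choices:
        if choice.delta.content is not None and len(choice.delta.content) > 0:
            text += choice.delta.content
        if choice.delta.tool_calls is not None:  # plain `if`, no longer `elif`
            for tool in choice.delta.tool_calls:
                if tool.function is not None:
                    if tool.function.name:
                        tool_name = tool.function.name  # name is preserved
                    if tool.function.arguments is not None:
                        partial_json += tool.function.arguments
    return text, partial_json, tool_name
```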

What part of LiteLLM is this about?

SDK (litellm Python package)

What LiteLLM version are you on?

latest

Twitter / LinkedIn details

No response
