fix: omit parallel_tool_calls when not explicitly enabled (COM-406) #10671
Summary
Fixes COM-406: parallel_tool_calls=false breaks LiteLLM/Bedrock routes with native tools.

Problem

The OpenAI provider was always sending parallel_tool_calls=false with native tools (via metadata.parallelToolCalls ?? false). This caused errors with non-OpenAI backends accessed through LiteLLM, which fail on parallel_tool_calls=false when multiple tools are provided.
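For illustration, a minimal sketch of the old behavior, assuming a helper-style shape; buildToolParams and RequestMetadata are hypothetical names, not the actual code in openai.ts:

```ts
// Hypothetical sketch of the previous logic (not the actual openai.ts code).
type RequestMetadata = { parallelToolCalls?: boolean }

function buildToolParams(metadata?: RequestMetadata) {
	return {
		// The flag was always present, defaulting to false when the caller did not opt in:
		parallel_tool_calls: metadata?.parallelToolCalls ?? false,
	}
}

// buildToolParams(undefined) -> { parallel_tool_calls: false }, which some backends reject.
```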
Solution

Changed the logic to only include parallel_tool_calls: true when metadata.parallelToolCalls is explicitly set to true. The parameter is now omitted entirely when not explicitly enabled, which allows LiteLLM routes to work with any backend (see the sketch below).
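A minimal sketch of the new behavior, using the same hypothetical names as above:

```ts
// Hypothetical sketch of the fixed logic (not the actual openai.ts code).
type RequestMetadata = { parallelToolCalls?: boolean }

function buildToolParams(metadata?: RequestMetadata) {
	return {
		// Only spread the flag in when explicitly enabled; otherwise the key is omitted
		// entirely, so LiteLLM/Bedrock routes never receive parallel_tool_calls: false.
		...(metadata?.parallelToolCalls === true ? { parallel_tool_calls: true as const } : {}),
	}
}

// buildToolParams(undefined)                   -> {}
// buildToolParams({ parallelToolCalls: true }) -> { parallel_tool_calls: true }
```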
Changes

- src/api/providers/openai.ts - Updated 4 locations where parallel_tool_calls was being set (streaming, non-streaming, and O3 family models)
- src/api/providers/__tests__/openai-native-tools.spec.ts - Updated test to verify the parameter is NOT included when parallelToolCalls is not explicitly true (an illustrative test sketch follows this list)
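For context, a self-contained, vitest-style sketch of the kind of assertion the updated test makes; buildToolParams and the fixture names are assumptions, not the actual spec code:

```ts
import { describe, it, expect } from "vitest"

// Reuses the hypothetical buildToolParams helper from the sketches above.
type RequestMetadata = { parallelToolCalls?: boolean }

function buildToolParams(metadata?: RequestMetadata) {
	return {
		...(metadata?.parallelToolCalls === true ? { parallel_tool_calls: true } : {}),
	}
}

describe("parallel_tool_calls handling (illustrative)", () => {
	it("omits the flag unless parallelToolCalls is explicitly true", () => {
		expect(buildToolParams(undefined)).not.toHaveProperty("parallel_tool_calls")
		expect(buildToolParams({ parallelToolCalls: false })).not.toHaveProperty("parallel_tool_calls")
		expect(buildToolParams({ parallelToolCalls: true })).toEqual({ parallel_tool_calls: true })
	})
})
```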
Testing

All 121 OpenAI provider tests pass, including the updated test that verifies the fix.
Important
Fixes the parallel_tool_calls logic in openai.ts to only include it when explicitly enabled, resolving LiteLLM/Bedrock compatibility issues.

- parallel_tool_calls is now only included when metadata.parallelToolCalls is explicitly true in openai.ts.
- The parameter is omitted entirely when not enabled.
- Updated openai-native-tools.spec.ts to verify parallel_tool_calls is omitted when not explicitly set to true.
- openai.ts: Updated logic in 4 locations for streaming, non-streaming, and O3 family models.
- openai-native-tools.spec.ts: Added test verification for the fix.
Closes #10553
(from Linear COM-406)