
Conversation


@daniel-lxs daniel-lxs commented Jan 13, 2026

Summary

Fixes COM-406: parallel_tool_calls=false breaks LiteLLM/Bedrock routes with native tools.

Problem

The OpenAI provider was always sending parallel_tool_calls=false with native tools (via metadata.parallelToolCalls ?? false). This caused errors with non-OpenAI backends accessed through LiteLLM:

  • Bedrock/Claude: does not support the parameter at all; any value causes an error
  • Gemini: supports parallel tool calls, but rejects parallel_tool_calls=false when multiple tools are provided
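For illustration, the old default meant every request with native tools carried the parameter. A simplified sketch of the previous behavior (the request shape and model name are assumptions for illustration, not the actual openai.ts code):

```typescript
// Sketch of the request body the provider used to send (simplified).
// metadata.parallelToolCalls defaulting via "?? false" meant the parameter
// was always present, even for backends that reject it.
const metadata: { parallelToolCalls?: boolean } = {}

const oldRequestBody = {
  model: "bedrock/claude-sonnet", // hypothetical LiteLLM route name
  tools: [{ type: "function", function: { name: "get_weather", parameters: {} } }],
  // Always sent, false by default: Bedrock/Claude errors on any value,
  // and Gemini rejects false when multiple tools are provided.
  parallel_tool_calls: metadata.parallelToolCalls ?? false,
}
```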

Solution

Changed the logic to only include parallel_tool_calls: true when metadata.parallelToolCalls is explicitly set to true. The parameter is now omitted entirely when not explicitly enabled, which allows LiteLLM routes to work with any backend.
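The new logic can be sketched as a conditional spread (a simplified sketch, not the actual openai.ts code; the request shape and helper name are assumptions):

```typescript
// Simplified request body type; only the parallel_tool_calls handling
// reflects the change described in this PR.
type RequestBody = {
  model: string
  tools?: unknown[]
  parallel_tool_calls?: boolean
}

// Hypothetical helper illustrating the new behavior: the parameter is
// included only when metadata.parallelToolCalls is explicitly true, and
// omitted entirely otherwise (instead of defaulting to false).
function buildRequest(
  model: string,
  tools: unknown[],
  metadata?: { parallelToolCalls?: boolean },
): RequestBody {
  return {
    model,
    tools,
    ...(metadata?.parallelToolCalls === true ? { parallel_tool_calls: true } : {}),
  }
}
```

Because the key is absent rather than false, LiteLLM can forward the request to Bedrock or Gemini without tripping over an unsupported or rejected parameter.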

Changes

  • src/api/providers/openai.ts - Updated 4 locations where parallel_tool_calls was being set (streaming, non-streaming, and O3 family models)
  • src/api/providers/__tests__/openai-native-tools.spec.ts - Updated test to verify parameter is NOT included when parallelToolCalls is not explicitly true

Testing

All 121 OpenAI provider tests pass, including the updated test that verifies the fix.


Important

Fixes parallel_tool_calls logic in openai.ts to only include it when explicitly enabled, resolving LiteLLM/Bedrock compatibility issues.

  • Behavior:
    • parallel_tool_calls is now only included when metadata.parallelToolCalls is explicitly true in openai.ts.
    • Fixes compatibility issues with LiteLLM/Bedrock routes by omitting parallel_tool_calls when not enabled.
  • Testing:
    • Updated test in openai-native-tools.spec.ts to verify parallel_tool_calls is omitted when not explicitly set to true.
  • Files Affected:
    • openai.ts: Updated logic in 4 locations for streaming, non-streaming, and O3 family models.
    • openai-native-tools.spec.ts: Added test verification for the fix.

This description was created by Ellipsis for dbb85f8.

Closes #10553
(from Linear COM-406)

@daniel-lxs daniel-lxs requested review from cte, jr and mrubens as code owners January 13, 2026 00:08
@dosubot dosubot bot added size:M This PR changes 30-99 lines, ignoring generated files. bug Something isn't working labels Jan 13, 2026

roomote bot commented Jan 13, 2026


Review complete. No issues found.

The PR correctly fixes the parallel_tool_calls compatibility issue with LiteLLM/Bedrock backends by only including the parameter when explicitly enabled. The changes are consistent across all 4 locations in openai.ts and properly tested.

