fix: OpenAI Responses API parallel tool calls losing call_ids #9659
base: main
Conversation
When GPT-5 makes parallel tool calls, each `function_call` has a unique `call_id` (`fc_*`). These IDs were being overwritten instead of accumulated, causing the error: "No tool call found for function call output with call_id".

Changes:
- `sessionSlice.ts`: Accumulate IDs in a `responsesOutputItemIds[]` array
- `openaiTypeConverters.ts`: Emit a `function_call` for EACH toolCall with its matching ID

Ref: https://platform.openai.com/docs/guides/function-calling
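A minimal sketch of the accumulation side of the change, assuming an assistant-message metadata object; the type and function names here are illustrative, not the exact Continue internals:

```ts
// Illustrative metadata shape: accumulate streamed Responses API item IDs
// instead of overwriting a single field.
interface AssistantMessageMetadata {
  responsesOutputItemId?: string; // before: a single slot, last writer wins
  responsesOutputItemIds?: string[]; // after: one entry per parallel function_call
}

function recordOutputItemId(
  metadata: AssistantMessageMetadata,
  outputItemId: string,
): AssistantMessageMetadata {
  return {
    ...metadata,
    // Append instead of overwrite, so every parallel call keeps its own ID.
    responsesOutputItemIds: [
      ...(metadata.responsesOutputItemIds ?? []),
      outputItemId,
    ],
  };
}
```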
No issues found across 3 files
RomneyDa left a comment
@houssemzaier thanks for this contribution. I don't fully understand why the call IDs are being lost, and I wonder if it's a bug in the conversion logic that wouldn't require adding additional metadata. Our data model and many of our providers support parallel tool calls without issues. Could you clarify why this approach is needed rather than just fixing the overwriting bug?
Thanks for the review @RomneyDa! To reproduce the issue:

- Prompt: ask GPT-5 something that makes it issue multiple tool calls in a single turn.
- What happens without the fix: the follow-up request fails with a 400: "No tool call found for function call output with call_id".
- Why this happens: the overwriting bug exists because Continue merges parallel tool calls into a single assistant message, and `metadata.responsesOutputItemId` holds only one value, so each streamed `function_call` overwrites the ID of the previous one.
- Why this fix: accumulating the IDs in `responsesOutputItemIds[]` and emitting one `function_call` per tool call keeps every pairing intact (see the sketch below).
- Why not change the data model? If you'd prefer a different approach, I'm happy to align, but I'd like to keep this PR scoped to the minimal/compatible fix so we don't have to rework the whole tool-call pipeline.
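A rough sketch of the converter side of that fix, emitting one `function_call` input item per tool call with positional ID matching; the names and signatures are hypothetical, not the real `openaiTypeConverters.ts` API:

```ts
// Illustrative conversion: one Responses API function_call per tool call,
// pairing each with the item id stored at the same position during streaming.
interface ToolCall {
  id: string; // used as call_id for pairing with function_call_output
  name: string;
  arguments: string;
}

function toResponsesFunctionCalls(
  toolCalls: ToolCall[],
  responsesOutputItemIds: string[] = [],
) {
  return toolCalls.map((toolCall, index) => ({
    type: "function_call" as const,
    // Positional ID matching: the i-th stored item id belongs to the i-th call.
    id: responsesOutputItemIds[index],
    call_id: toolCall.id,
    name: toolCall.name,
    arguments: toolCall.arguments,
  }));
}
```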
@houssemzaier I might be missing something, but is the `responsesOutputItemId` any different from the tool call ID we already store? If there is a difference in the nature of the IDs being stored, then I would agree that this is the right approach.
@RomneyDa Yep! There are two different identifiers on a Responses API `function_call`:

- the `id` of the streamed `function_call` output item
- the `call_id` that pairs the call with its `function_call_output`

For returning the tool result, only `call_id` is used for pairing: the `function_call_output` item references the `call_id` of a pending `function_call` (see the sketch below). The 400 happens when OpenAI can't find a pending `function_call` matching that `call_id`, which usually means the next request doesn't include the original `function_call` item(s) in the replayed history (or they got altered during conversion). That's why we keep the streamed `function_call` item ids around for parallel calls.
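For reference, a sketch of what that pairing looks like in a replayed request, with purely illustrative values and abbreviated item shapes:

```ts
// Hypothetical replayed input for two parallel tool calls. Only call_id links
// a function_call_output back to its pending function_call, so both original
// function_call items must be present with their ids intact.
const replayedInput = [
  {
    type: "function_call",
    id: "fc_123", // streamed output item id
    call_id: "call_123", // pairing key
    name: "get_weather",
    arguments: '{"city":"Paris"}',
  },
  {
    type: "function_call",
    id: "fc_456",
    call_id: "call_456",
    name: "get_weather",
    arguments: '{"city":"Tokyo"}',
  },
  // Results reference call_id; a call_id with no matching pending
  // function_call is what triggers the 400 described above.
  { type: "function_call_output", call_id: "call_123", output: '{"tempC":18}' },
  { type: "function_call_output", call_id: "call_456", output: '{"tempC":22}' },
];
```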
@houssemzaier thanks for explaining! Then looks good to me.
Looks like a linting check is failing; you may need to split up a function or something. Our nesting rule is a bit strict.
Fixes #8773
When GPT-5 makes parallel tool calls via the Responses API, each `function_call` has a unique `call_id` (`fc_*`). These IDs were being overwritten instead of accumulated, causing:

"No tool call found for function call output with call_id"

Changes

- `gui/src/redux/slices/sessionSlice.ts`: Accumulate IDs in a `responsesOutputItemIds[]` array during streaming
- `core/llm/openaiTypeConverters.ts`: Emit a `function_call` for each tool call with positional ID matching
- `core/llm/openaiTypeConverters.test.ts`: Added comprehensive tests for parallel tool call scenarios

How it works

- Previously, `fromResponsesChunk` set a single `message.metadata.responsesOutputItemId`; we now accumulate these into an array, `responsesOutputItemIds[]`
- Each stored item ID is matched back to its tool call's `call_id` when the conversation is sent back to OpenAI

Test plan
Related: #8935
Summary by cubic
Fixes lost call_id mapping for parallel tool calls in the OpenAI Responses API, preventing “No tool call found for function call output” errors and restoring correct pairing between function_call and tool outputs.
Written for commit 4ad4a6e.