feat: migrate Conversations API to FastAPI router #4342
Conversation
Force-pushed from 0e9e2e2 to 57175d2
This pull request has merge conflicts that must be resolved before it can be merged. @leseb please rebase it. https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork
Force-pushed from 57175d2 to 0173317
Conformance test is RED!
Yeah, but the "breaking change" is actually the spec now correctly reflecting what the API has always returned. The old @webmethod system was likely not properly expanding the Union type in the schema. The old spec showed only OpenAIResponseMessage, which was incomplete. We are still returning the same `ConversationItem` type.

Previously we had:

```python
@webmethod(route="/conversations/{conversation_id}/items/{item_id}", method="GET", level=LLAMA_STACK_API_V1)
async def retrieve(self, conversation_id: str, item_id: str) -> ConversationItem:
```

With the resulting spec:

```yaml
/v1/conversations/{conversation_id}/items/{item_id}:
  get:
    responses:
      '200':
        description: The conversation item.
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/OpenAIResponseMessage'
```

The new code is:

```python
@router.get(
"/conversations/{conversation_id}/items/{item_id}",
response_model=ConversationItem,
summary="Retrieve an item.",
description="Retrieve a conversation item.",
responses={200: {"description": "The conversation item."}},
)
async def retrieve_item(
conversation_id: Annotated[str, Path(description="The conversation identifier.")],
item_id: Annotated[str, Path(description="The item identifier.")],
) -> ConversationItem:
request = RetrieveItemRequest(conversation_id=conversation_id, item_id=item_id)
return await impl.retrieve(request)With the resulting spec: get:
  responses:
    '200':
      description: The conversation item.
      content:
        application/json:
          schema:
            oneOf:
              - $ref: '#/components/schemas/OpenAIResponseMessage-Output'
              - $ref: '#/components/schemas/OpenAIResponseOutputMessageWebSearchToolCall'
              - $ref: '#/components/schemas/OpenAIResponseOutputMessageFileSearchToolCall'
              - $ref: '#/components/schemas/OpenAIResponseOutputMessageFunctionToolCall'
              - $ref: '#/components/schemas/OpenAIResponseInputFunctionToolCallOutput'
              - $ref: '#/components/schemas/OpenAIResponseMCPApprovalRequest'
              - $ref: '#/components/schemas/OpenAIResponseMCPApprovalResponse'
              - $ref: '#/components/schemas/OpenAIResponseOutputMessageMCPCall'
              - $ref: '#/components/schemas/OpenAIResponseOutputMessageMCPListTools'
            discriminator:
              propertyName: type
              mapping:
                message: '#/components/schemas/OpenAIResponseMessage-Output'
                web_search_call: '#/components/schemas/OpenAIResponseOutputMessageWebSearchToolCall'
                file_search_call: '#/components/schemas/OpenAIResponseOutputMessageFileSearchToolCall'
                function_call: '#/components/schemas/OpenAIResponseOutputMessageFunctionToolCall'
                function_call_output: '#/components/schemas/OpenAIResponseInputFunctionToolCallOutput'
                mcp_approval_request: '#/components/schemas/OpenAIResponseMCPApprovalRequest'
                mcp_approval_response: '#/components/schemas/OpenAIResponseMCPApprovalResponse'
                mcp_call: '#/components/schemas/OpenAIResponseOutputMessageMCPCall'
                mcp_list_tools: '#/components/schemas/OpenAIResponseOutputMessageMCPListTools'
            title: Response Retrieve Item V1 Conversations Conversation Id Items Item Id Get
```

As a reminder, `ConversationItem` is defined as:

```python
ConversationItem = Annotated[
    OpenAIResponseMessage
    | OpenAIResponseOutputMessageWebSearchToolCall
    | OpenAIResponseOutputMessageFileSearchToolCall
    | OpenAIResponseOutputMessageFunctionToolCall
    | OpenAIResponseInputFunctionToolCallOutput
    | OpenAIResponseMCPApprovalRequest
    | OpenAIResponseMCPApprovalResponse
    | OpenAIResponseOutputMessageMCPCall
    | OpenAIResponseOutputMessageMCPListTools,
    Field(discriminator="type"),
]
register_schema(ConversationItem, name="ConversationItem")
```

This is likely a bug in our generator and another reason why the migration is important :). The other conformance message is also a consequence of the Union not being handled properly: since we now have a oneOf, there is no guarantee that any given field is present across all variants. Makes sense?
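For reference, here is a minimal standalone sketch (model and route names are illustrative, not from the llama-stack codebase) showing that FastAPI/Pydantic emit exactly this kind of oneOf + discriminator schema when a discriminated union is used as a response model:

```python
# Illustrative sketch: a discriminated union as a response_model produces a
# oneOf with a "type" discriminator in the generated OpenAPI, analogous to
# ConversationItem above. All names here are made up for the example.
from typing import Annotated, Literal, Union

from fastapi import FastAPI
from pydantic import BaseModel, Field


class Message(BaseModel):
    type: Literal["message"] = "message"
    content: str


class WebSearchCall(BaseModel):
    type: Literal["web_search_call"] = "web_search_call"
    query: str


Item = Annotated[Union[Message, WebSearchCall], Field(discriminator="type")]

app = FastAPI()


@app.get("/items/{item_id}", response_model=Item)
async def get_item(item_id: str) -> Item:
    return Message(content=f"item {item_id}")


if __name__ == "__main__":
    import json

    # The generated response schema contains oneOf plus a discriminator
    # mapping on "type", mirroring the spec excerpt quoted above.
    schema = app.openapi()["paths"]["/items/{item_id}"]["get"]["responses"]["200"]
    print(json.dumps(schema, indent=2))
```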
Force-pushed from 0173317 to c10d251
This pull request has merge conflicts that must be resolved before it can be merged. @leseb please rebase it. https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork
Convert the Conversations API from the legacy @webmethod decorator system to FastAPI routers, following the established pattern from the Batches API. The only notable difference is that now the conversations API is properly listed when inspecting routes. Fixes: llamastack#4341 Signed-off-by: Sébastien Han <seb@redhat.com>
Force-pushed from c10d251 to f825d6b
Yes, I agree with @leseb's comment above; this failing conformance check seems like a false positive rather than something to worry about. We might see more of these failures with these router conversions as they "correct" the API types.
What does this PR do?
Convert the Conversations API from the legacy @webmethod decorator system to FastAPI routers, following the established pattern from the Batches API.
The only notable difference is that now the conversations API is properly listed when inspecting routes.
Fixes: #4341
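As a rough sketch of the router pattern being adopted (module layout, prefixes, and handler names here are illustrative assumptions, not the actual code in this PR):

```python
# Illustrative only: declaring a FastAPI router for conversations and mounting
# it on the app, following the same general pattern the Batches API migration
# used. Mounting via include_router is what makes the routes show up in the
# app's route table (app.routes) and hence when inspecting routes.
from fastapi import APIRouter, FastAPI

router = APIRouter(prefix="/v1/conversations", tags=["Conversations"])


@router.get("/{conversation_id}")
async def get_conversation(conversation_id: str) -> dict:
    # Placeholder handler; the real implementation delegates to the
    # Conversations provider implementation.
    return {"id": conversation_id}


app = FastAPI()
app.include_router(router)
```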
Test Plan
Unit and integration tests:
Also, inspect endpoint:
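The actual inspect output is not reproduced here. As a hypothetical sketch (the server URL, the /v1/inspect/routes path, and the response field names are assumptions, not taken from this PR), one way to confirm the Conversations routes now appear would be:

```python
# Hypothetical sketch: query the running server's route listing and print the
# Conversations routes. URL, path, and JSON field names ("data", "method",
# "route") are assumptions for illustration only.
import httpx

resp = httpx.get("http://localhost:8321/v1/inspect/routes")
resp.raise_for_status()
for route in resp.json().get("data", []):
    if "/conversations" in route.get("route", ""):
        print(route.get("method"), route.get("route"))
```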