[agentserver] azure-ai-agentserver -core, -invocation, and -responses packages #45925
| Original file line number | Diff line number | Diff line change |
|---|---|---|
| @@ -1,7 +1,16 @@ | ||
| # Release History | ||
|
|
||
| ## 1.0.0b1 (2025-11-07) | ||
| ## 2.0.0b1 (Unreleased) | ||
|
|
||
| ### Features Added | ||
|
|
||
| First version | ||
| - Renamed package from `azure-ai-agentserver-hosting` to `azure-ai-agentserver-core`. | ||
| - `AgentHost` host framework with health probe, graceful shutdown, and port binding. | ||
| - `TracingHelper` for OpenTelemetry tracing with Azure Monitor and OTLP exporters. | ||
| - Auto-enable tracing when Application Insights or OTLP endpoint is configured. | ||
| - W3C Trace Context propagation and `leaf_customer_span_id` baggage re-parenting. | ||
| - `error_response()` utility for standard error envelope responses. | ||
|
Member
Looks like the actual API is |
||
| - `get_logger()` for library-scoped logging. | ||
| - `StructuredLogFilter` and `LogScope` for per-request structured logging. | ||
|
Member
These two don't exist in the code base. |
||
| - `register_routes()` for pluggable protocol composition. | ||
| - Hypercorn-based ASGI server with HTTP/1.1 support. | ||
| Original file line number | Diff line number | Diff line change |
|---|---|---|
| @@ -1,105 +1,143 @@ | ||
| # Azure AI Agent Server Adapter for Python | ||
| # Azure AI AgentHost Core for Python | ||
|
|
||
| The `azure-ai-agentserver-core` package provides the foundation host framework for building Azure AI Hosted Agent containers. It handles the protocol-agnostic infrastructure — health probes, graceful shutdown, OpenTelemetry tracing, and ASGI serving — so that protocol packages can focus on their endpoint logic. | ||
|
|
||
| ## Getting started | ||
|
|
||
| ### Install the package | ||
|
|
||
| ```bash | ||
| pip install azure-ai-agentserver-core | ||
| ``` | ||
|
|
||
| To enable OpenTelemetry tracing with Azure Monitor and OTLP exporters: | ||
|
|
||
| ```bash | ||
| pip install azure-ai-agentserver-core[tracing] | ||
| ``` | ||
|
|
||
| ### Prerequisites | ||
|
|
||
| - Python 3.10 or later | ||
|
|
||
| ## Key concepts | ||
|
|
||
| This is the core package for Azure AI Agent server. It hosts your agent as a container in the cloud. | ||
| ### AgentHost | ||
|
|
||
| `AgentHost` is the host process for Azure AI Hosted Agent containers. It provides: | ||
|
|
||
| - **Health probe** — `GET /healthy` returns `200 OK` when the server is ready. | ||
| - **Graceful shutdown** — On `SIGTERM` the server drains in-flight requests (default 30 s timeout) before exiting. | ||
| - **OpenTelemetry tracing** — Automatic span creation with Azure Monitor and OTLP export when configured. | ||
| - **Hypercorn ASGI server** — Serves on `0.0.0.0:${PORT:-8088}` with HTTP/1.1. | ||
|
|
||
| You can talk to your agent using the azure-ai-projects SDK. | ||
| Protocol packages (e.g. `azure-ai-agentserver-invocations`) plug into `AgentHost` by calling `register_routes()` to add their endpoints. | ||
|
|
||
| ### Environment variables | ||
|
Member
Should |
||
|
|
||
| | Variable | Description | Default | | ||
| |---|---|---| | ||
| | `PORT` | Listen port | `8088` | | ||
| | `FOUNDRY_AGENT_NAME` | Agent name (used in tracing) | `""` | | ||
| | `FOUNDRY_AGENT_VERSION` | Agent version (used in tracing) | `""` | | ||
| | `FOUNDRY_PROJECT_ENDPOINT` | Azure AI Foundry project endpoint | `""` | | ||
| | `APPLICATIONINSIGHTS_CONNECTION_STRING` | Azure Monitor connection string | — | | ||
| | `OTEL_EXPORTER_OTLP_ENDPOINT` | OTLP collector endpoint | — | | ||
| | `AGENT_GRACEFUL_SHUTDOWN_TIMEOUT` | Shutdown drain timeout (seconds) | `30` | | ||
| | `AGENT_LOG_LEVEL` | Log level (`DEBUG`, `INFO`, etc.) | `INFO` | | ||
|
|
||
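The variables in the table above could be resolved with plain `os.environ` lookups. A sketch of that resolution, with the names and defaults taken from the table (the `HostConfig` helper itself is illustrative, not part of the SDK):

```python
import os
from dataclasses import dataclass


@dataclass
class HostConfig:
    port: int
    agent_name: str
    shutdown_timeout: float
    log_level: str


def load_host_config(env=os.environ) -> HostConfig:
    # Defaults mirror the environment-variable table in the README.
    return HostConfig(
        port=int(env.get("PORT", "8088")),
        agent_name=env.get("FOUNDRY_AGENT_NAME", ""),
        shutdown_timeout=float(env.get("AGENT_GRACEFUL_SHUTDOWN_TIMEOUT", "30")),
        log_level=env.get("AGENT_LOG_LEVEL", "INFO"),
    )
```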
| ## Examples | ||
|
|
||
| If your agent is not built with a supported framework such as LangGraph or Agent Framework, you can still make it compatible with Azure AI Foundry by implementing the predefined interface manually. | ||
| `AgentHost` is typically used with a protocol package. The simplest setup with the invocations protocol: | ||
|
|
||
| ```python | ||
| import datetime | ||
| from azure.ai.agentserver.core import AgentHost | ||
| from azure.ai.agentserver.invocations import InvocationHandler | ||
| from starlette.responses import JSONResponse | ||
|
|
||
| from azure.ai.agentserver.core import FoundryCBAgent | ||
| from azure.ai.agentserver.core.models import ( | ||
| CreateResponse, | ||
| Response as OpenAIResponse, | ||
| ) | ||
| from azure.ai.agentserver.core.models.projects import ( | ||
| ItemContentOutputText, | ||
| ResponsesAssistantMessageItemResource, | ||
| ResponseTextDeltaEvent, | ||
| ResponseTextDoneEvent, | ||
| ) | ||
| server = AgentHost() | ||
| invocations = InvocationHandler(server) | ||
|
|
||
| @invocations.invoke_handler | ||
| async def handle(request): | ||
| body = await request.json() | ||
| return JSONResponse({"greeting": f"Hello, {body['name']}!"}) | ||
|
|
||
| server.run() | ||
| ``` | ||
|
|
||
| ### Using AgentHost standalone | ||
|
|
||
| For custom protocol implementations, use `AgentHost` directly and register your own routes: | ||
|
|
||
| def stream_events(text: str): | ||
| assembled = "" | ||
| for i, token in enumerate(text.split(" ")): | ||
| piece = token if i == len(text.split(" ")) - 1 else token + " " | ||
| assembled += piece | ||
| yield ResponseTextDeltaEvent(delta=piece) | ||
| # Done with text | ||
| yield ResponseTextDoneEvent(text=assembled) | ||
|
|
||
|
|
||
| async def agent_run(request_body: CreateResponse): | ||
| agent = request_body.agent | ||
| print(f"agent:{agent}") | ||
|
|
||
| if request_body.stream: | ||
| return stream_events("I am mock agent with no intelligence in stream mode.") | ||
|
|
||
| # Build assistant output content | ||
| output_content = [ | ||
| ItemContentOutputText( | ||
| text="I am mock agent with no intelligence.", | ||
| annotations=[], | ||
| ) | ||
| ] | ||
|
|
||
| response = OpenAIResponse( | ||
| metadata={}, | ||
| temperature=0.0, | ||
| top_p=0.0, | ||
| user="me", | ||
| id="id", | ||
| created_at=datetime.datetime.now(), | ||
| output=[ | ||
| ResponsesAssistantMessageItemResource( | ||
| status="completed", | ||
| content=output_content, | ||
| ) | ||
| ], | ||
| ) | ||
| return response | ||
|
|
||
|
|
||
| my_agent = FoundryCBAgent() | ||
| my_agent.agent_run = agent_run | ||
|
|
||
| if __name__ == "__main__": | ||
| my_agent.run() | ||
| ```python | ||
| from azure.ai.agentserver.core import AgentHost | ||
| from starlette.requests import Request | ||
| from starlette.responses import JSONResponse | ||
| from starlette.routing import Route | ||
|
|
||
| async def my_endpoint(request: Request): | ||
| return JSONResponse({"status": "ok"}) | ||
|
|
||
| server = AgentHost() | ||
| server.register_routes([Route("/my-endpoint", my_endpoint, methods=["POST"])]) | ||
| server.run() | ||
| ``` | ||
|
|
||
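The delta/done streaming pattern from the removed example above (yield each text piece, then the assembled whole) can be sketched without the SDK models. The `TextDelta`/`TextDone` classes here are stand-ins for the real event types:

```python
from dataclasses import dataclass


@dataclass
class TextDelta:
    delta: str


@dataclass
class TextDone:
    text: str


def stream_text(text: str):
    """Yield one delta event per token, then a final done event with the full text."""
    tokens = text.split(" ")
    assembled = ""
    for i, token in enumerate(tokens):
        # Re-attach the separating space except on the last token.
        piece = token if i == len(tokens) - 1 else token + " "
        assembled += piece
        yield TextDelta(delta=piece)
    yield TextDone(text=assembled)
```

The invariant worth noting: concatenating every delta reproduces exactly the text carried by the final done event, so a client can render incrementally and then reconcile.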
| ### Shutdown handler | ||
|
|
||
| Register a cleanup function that runs during graceful shutdown: | ||
|
|
||
| ```python | ||
| server = AgentHost() | ||
|
|
||
| @server.shutdown_handler | ||
| async def on_shutdown(): | ||
| # Close database connections, flush buffers, etc. | ||
| pass | ||
| ``` | ||
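Conceptually, graceful shutdown runs every registered handler under the drain timeout. An illustrative sketch of that mechanic (not the SDK's actual implementation):

```python
import asyncio


async def run_shutdown_handlers(handlers, timeout: float = 30.0) -> bool:
    """Run all shutdown handlers concurrently; False if the drain timeout elapsed."""
    try:
        await asyncio.wait_for(
            asyncio.gather(*(h() for h in handlers)),
            timeout=timeout,
        )
        return True
    except asyncio.TimeoutError:
        # Handlers still running are cancelled; the host exits anyway.
        return False
```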
|
|
||
| ### Configuring tracing | ||
|
|
||
| Tracing is enabled automatically when an Application Insights connection string is available: | ||
|
|
||
| ```python | ||
| server = AgentHost( | ||
| application_insights_connection_string="InstrumentationKey=...", | ||
| ) | ||
| ``` | ||
|
|
||
| Or via environment variable: | ||
|
|
||
| ```bash | ||
| export APPLICATIONINSIGHTS_CONNECTION_STRING="InstrumentationKey=..." | ||
| python my_agent.py | ||
| ``` | ||
|
|
||
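The auto-enable rule described above boils down to a presence check on the two exporter settings. A sketch of that decision (the helper name is illustrative):

```python
import os


def tracing_enabled(env=os.environ) -> bool:
    """Tracing auto-enables when either exporter destination is configured."""
    return bool(
        env.get("APPLICATIONINSIGHTS_CONNECTION_STRING")
        or env.get("OTEL_EXPORTER_OTLP_ENDPOINT")
    )
```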
| ## Troubleshooting | ||
|
|
||
| First, run your agent locally with azure-ai-agentserver-core. | ||
| ### Logging | ||
|
|
||
| If it works locally but fails in the cloud, check the logs in the Application Insights resource connected to your Azure AI Foundry project. | ||
| Set the log level to `DEBUG` for detailed diagnostics: | ||
|
|
||
| ```python | ||
| server = AgentHost(log_level="DEBUG") | ||
| ``` | ||
|
|
||
| ### Reporting issues | ||
| Or via environment variable: | ||
|
|
||
| ```bash | ||
| export AGENT_LOG_LEVEL=DEBUG | ||
| ``` | ||
|
|
||
| To report an issue with the client library, or request additional features, please open a GitHub issue [here](https://github.com/Azure/azure-sdk-for-python/issues). Mention the package name "azure-ai-agents" in the title or content. | ||
| ### Reporting issues | ||
|
|
||
| To report an issue with the client library, or request additional features, please open a GitHub issue [here](https://github.com/Azure/azure-sdk-for-python/issues). | ||
|
|
||
| ## Next steps | ||
|
|
||
| Please visit the [Samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/agentserver/azure-ai-agentserver-core/samples) folder. It contains several examples of building your agent with azure-ai-agentserver. | ||
|
|
||
| - Install [`azure-ai-agentserver-invocations`](https://pypi.org/project/azure-ai-agentserver-invocations/) to add the invocation protocol endpoints. | ||
| - See the [container image spec](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/agentserver) for the full hosted agent contract. | ||
|
|
||
| ## Contributing | ||
|
|
||
|
|
@@ -117,3 +155,5 @@ This project has adopted the | |
| [Microsoft Open Source Code of Conduct][code_of_conduct]. For more information, | ||
| see the Code of Conduct FAQ or contact opencode@microsoft.com with any | ||
| additional questions or comments. | ||
|
|
||
| [code_of_conduct]: https://opensource.microsoft.com/codeofconduct/ | ||
| Original file line number | Diff line number | Diff line change |
|---|---|---|
| @@ -1 +1 @@ | ||
| __path__ = __import__("pkgutil").extend_path(__path__, __name__) # type: ignore | ||
| __path__ = __import__("pkgutil").extend_path(__path__, __name__) |
| Original file line number | Diff line number | Diff line change |
|---|---|---|
| @@ -1 +1 @@ | ||
| __path__ = __import__("pkgutil").extend_path(__path__, __name__) # type: ignore | ||
| __path__ = __import__("pkgutil").extend_path(__path__, __name__) |
| Original file line number | Diff line number | Diff line change |
|---|---|---|
| @@ -1 +1 @@ | ||
| __path__ = __import__("pkgutil").extend_path(__path__, __name__) # type: ignore | ||
| __path__ = __import__("pkgutil").extend_path(__path__, __name__) |
| Original file line number | Diff line number | Diff line change |
|---|---|---|
| @@ -1,14 +1,35 @@ | ||
| # --------------------------------------------------------- | ||
| # Copyright (c) Microsoft Corporation. All rights reserved. | ||
| # --------------------------------------------------------- | ||
| """Azure AI AgentHost core framework. | ||
|
|
||
| Provides the :class:`AgentHost` host and shared utilities for | ||
| building Azure AI Hosted Agent containers. | ||
|
|
||
| Public API:: | ||
|
|
||
| from azure.ai.agentserver.core import ( | ||
| AgentLogger, | ||
| AgentHost, | ||
| Constants, | ||
| ErrorResponse, | ||
| TracingHelper, | ||
| ) | ||
| """ | ||
| __path__ = __import__("pkgutil").extend_path(__path__, __name__) | ||
|
|
||
| from ._base import AgentHost | ||
| from ._constants import Constants | ||
| from ._errors import ErrorResponse | ||
| from ._logger import AgentLogger | ||
| from ._tracing import TracingHelper | ||
| from ._version import VERSION | ||
| from .logger import configure as config_logging | ||
| from .server.base import FoundryCBAgent | ||
| from .server.common.agent_run_context import AgentRunContext | ||
|
|
||
| config_logging() | ||
|
|
||
| __all__ = ["FoundryCBAgent", "AgentRunContext"] | ||
| __all__ = [ | ||
| "AgentLogger", | ||
| "AgentHost", | ||
| "Constants", | ||
| "ErrorResponse", | ||
| "TracingHelper", | ||
| ] | ||
| __version__ = VERSION |
Don't think this line is necessary. Will just need to add details to a breaking changes section regarding the new APIs.