Commits
58 commits
5da12d4
create starlette package (#45754)
lusu-msft Mar 17, 2026
57c63a3
Lusu/response 0317 (#45757)
lusu-msft Mar 17, 2026
df45208
Lusu/response 0317 rename (#45764)
lusu-msft Mar 18, 2026
3d93cf2
response server apis (#45807)
lusu-msft Mar 19, 2026
682afbc
remove old path
lusu-msft Mar 19, 2026
11f7747
add samples
lusu-msft Mar 19, 2026
9d03cf8
update pyproject
lusu-msft Mar 19, 2026
e040678
refining host
lusu-msft Mar 20, 2026
89e351e
refined builders and hosting
lusu-msft Mar 20, 2026
f8c8cec
refined package structur
lusu-msft Mar 20, 2026
ee6ee6b
remove unused code
lusu-msft Mar 20, 2026
d787997
add keep alive
lusu-msft Mar 20, 2026
8559738
fix minors
lusu-msft Mar 20, 2026
6b0bf7c
fix pytests
lusu-msft Mar 20, 2026
490a8ef
add file headers
lusu-msft Mar 20, 2026
d4c2800
add docstring for hosting
lusu-msft Mar 20, 2026
b4b290c
add conftest
lusu-msft Mar 20, 2026
93128da
add docstring
lusu-msft Mar 20, 2026
84592d6
exclude scripts
lusu-msft Mar 20, 2026
2d56d1b
add dev_requirement
lusu-msft Mar 20, 2026
d6efaf7
fix build
lusu-msft Mar 20, 2026
bc7ee65
fix build
lusu-msft Mar 22, 2026
d4acc08
fix generate models
lusu-msft Mar 23, 2026
1174338
fix pylint build
lusu-msft Mar 23, 2026
a44525d
add ResponseIncompleteReason
lusu-msft Mar 23, 2026
a98b3ea
refined stream
lusu-msft Mar 23, 2026
594728b
fix mypy
lusu-msft Mar 23, 2026
5afc89f
add ci
lusu-msft Mar 23, 2026
1c72226
fix build
lusu-msft Mar 23, 2026
0c68c5a
fix keyword
lusu-msft Mar 23, 2026
90c1a9b
fix stream with background
lusu-msft Mar 23, 2026
55ddab6
ensure all contract cases covered
lusu-msft Mar 23, 2026
534a435
refined _rounting.py
lusu-msft Mar 24, 2026
9000a31
fix stream payload
lusu-msft Mar 24, 2026
9fc9e17
fix pylint
lusu-msft Mar 24, 2026
8f6525c
fix build
lusu-msft Mar 24, 2026
831f6cf
refining orchestrator
lusu-msft Mar 24, 2026
fb0407c
refining storing
lusu-msft Mar 24, 2026
7395b71
add foundry storage provider
lusu-msft Mar 25, 2026
48ac861
update response handler to decorator
lusu-msft Mar 25, 2026
5d5fabf
remove dataclass
lusu-msft Mar 25, 2026
36d234a
fix build
lusu-msft Mar 25, 2026
cb3098b
fix sphinx
lusu-msft Mar 25, 2026
d015835
try fix langchain-azure-ai version
lusu-msft Mar 25, 2026
c0db1d8
Merge branch 'main' into agentserver/invoke-reponses
lusu-msft Mar 25, 2026
5556989
Add agent server hosting and invocation packages (#45916)
zhiyong-gayang Mar 25, 2026
923e18d
Refactor responses package to leverage hosting for server hosting and…
zhiyong-gayang Mar 25, 2026
4bd38ab
[agentserver-responses] refining response orchestration (#45923)
lusu-msft Mar 26, 2026
f29f8bb
[agentserver][responses] fix replay stuck (#45930)
lusu-msft Mar 26, 2026
ec36581
Lusu/response refining (#45936)
lusu-msft Mar 26, 2026
69e3c20
Lusu/response refining (#45944)
lusu-msft Mar 26, 2026
342b0b9
rename -hosting to -core (#45946)
lusu-msft Mar 26, 2026
fcab75c
rename AgentHost (#45948)
lusu-msft Mar 26, 2026
637a5d0
Merge branch 'main' into agentserver/invoke-reponses
lusu-msft Mar 26, 2026
e5cc1a4
Remove redundant default_fetch_history_count_value from ResponsesServ…
Copilot Mar 26, 2026
63c15c2
remove old -af and -lg from ci
lusu-msft Mar 26, 2026
c36ec43
Merge branch 'agentserver/invoke-reponses' of https://github.com/Azur…
lusu-msft Mar 26, 2026
85351f9
fix multiprotocol sample
lusu-msft Mar 26, 2026
Expand Up @@ -24,6 +24,7 @@ dependencies = [
"agent-framework-azure-ai==1.0.0b251007",
"agent-framework-core==1.0.0b251007",
"opentelemetry-exporter-otlp-proto-grpc>=1.36.0",
"opentelemetry-semantic-conventions-ai==0.4.13"
]

[build-system]
Expand Down
13 changes: 11 additions & 2 deletions sdk/agentserver/azure-ai-agentserver-core/CHANGELOG.md
@@ -1,7 +1,16 @@
# Release History

## 1.0.0b1 (2025-11-07)
## 2.0.0b1 (Unreleased)

### Features Added

First version
- Renamed package from `azure-ai-agentserver-hosting` to `azure-ai-agentserver-core`.
> **Reviewer comment:** Don't think this line is necessary. Will just need to add details to a breaking changes section regarding the new APIs.

- `AgentHost` host framework with health probe, graceful shutdown, and port binding.
- `TracingHelper` for OpenTelemetry tracing with Azure Monitor and OTLP exporters.
- Auto-enable tracing when Application Insights or OTLP endpoint is configured.
- W3C Trace Context propagation and `leaf_customer_span_id` baggage re-parenting.
- `error_response()` utility for standard error envelope responses.
> **Reviewer comment:** Looks like the actual API is `ErrorResponse.create()`. Same with `AgentLogger.get()` for `get_logger()`. A module-level function instead of a class with a single static method is probably preferred in Python. Did you get feedback regarding this?

- `get_logger()` for library-scoped logging.
- `StructuredLogFilter` and `LogScope` for per-request structured logging.
> **Reviewer comment:** These two don't exist in the code base.

- `register_routes()` for pluggable protocol composition.
- Hypercorn-based ASGI server with HTTP/1.1 support.
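The review discussion above contrasts `ErrorResponse.create()` with a module-level `error_response()`. The sketch below illustrates the two API shapes under discussion; the error-envelope payload and both helper names here are illustrative, not the package's actual contract.

```python
# Illustrative only: two equivalent shapes for a small error-envelope helper.


class ErrorResponse:
    """Class exposing a single static factory method."""

    @staticmethod
    def create(code: str, message: str) -> dict:
        return {"error": {"code": code, "message": message}}


def error_response(code: str, message: str) -> dict:
    """Module-level function, the shape named in the changelog entry."""
    return {"error": {"code": code, "message": message}}


# Both produce the same envelope; the module-level function is the more
# conventional Python shape when there is no state to carry.
assert ErrorResponse.create("BadRequest", "x") == error_response("BadRequest", "x")
```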
4 changes: 2 additions & 2 deletions sdk/agentserver/azure-ai-agentserver-core/LICENSE
Expand Up @@ -12,10 +12,10 @@ furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED *AS IS*, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
SOFTWARE.
1 change: 0 additions & 1 deletion sdk/agentserver/azure-ai-agentserver-core/MANIFEST.in
Expand Up @@ -2,7 +2,6 @@ include *.md
include LICENSE
recursive-include tests *.py
recursive-include samples *.py *.md
recursive-include doc *.rst *.md
include azure/__init__.py
include azure/ai/__init__.py
include azure/ai/agentserver/__init__.py
Expand Down
178 changes: 109 additions & 69 deletions sdk/agentserver/azure-ai-agentserver-core/README.md
@@ -1,105 +1,143 @@
# Azure AI Agent Server Adapter for Python
# Azure AI AgentHost Core for Python

The `azure-ai-agentserver-core` package provides the foundation host framework for building Azure AI Hosted Agent containers. It handles the protocol-agnostic infrastructure — health probes, graceful shutdown, OpenTelemetry tracing, and ASGI serving — so that protocol packages can focus on their endpoint logic.

## Getting started

### Install the package

```bash
pip install azure-ai-agentserver-core
```

To enable OpenTelemetry tracing with Azure Monitor and OTLP exporters:

```bash
pip install azure-ai-agentserver-core[tracing]
```

### Prerequisites

- Python 3.10 or later

## Key concepts

This is the core package for the Azure AI Agent server. It hosts your agent as a container in the cloud.
### AgentHost

`AgentHost` is the host process for Azure AI Hosted Agent containers. It provides:

- **Health probe** — `GET /healthy` returns `200 OK` when the server is ready.
- **Graceful shutdown** — On `SIGTERM` the server drains in-flight requests (default 30 s timeout) before exiting.
- **OpenTelemetry tracing** — Automatic span creation with Azure Monitor and OTLP export when configured.
- **Hypercorn ASGI server** — Serves on `0.0.0.0:${PORT:-8088}` with HTTP/1.1.

You can talk to your agent using the `azure-ai-projects` SDK.
Protocol packages (e.g. `azure-ai-agentserver-invocations`) plug into `AgentHost` by calling `register_routes()` to add their endpoints.

### Environment variables
> **Reviewer comment:** Should `FOUNDRY_PROJECT_ARM_ID` or `FOUNDRY_AGENT_SESSION_ID` also be added to this table?


| Variable | Description | Default |
|---|---|---|
| `PORT` | Listen port | `8088` |
| `FOUNDRY_AGENT_NAME` | Agent name (used in tracing) | `""` |
| `FOUNDRY_AGENT_VERSION` | Agent version (used in tracing) | `""` |
| `FOUNDRY_PROJECT_ENDPOINT` | Azure AI Foundry project endpoint | `""` |
| `APPLICATIONINSIGHTS_CONNECTION_STRING` | Azure Monitor connection string | — |
| `OTEL_EXPORTER_OTLP_ENDPOINT` | OTLP collector endpoint | — |
| `AGENT_GRACEFUL_SHUTDOWN_TIMEOUT` | Shutdown drain timeout (seconds) | `30` |
| `AGENT_LOG_LEVEL` | Log level (`DEBUG`, `INFO`, etc.) | `INFO` |

## Examples

If your agent is not built using a supported framework such as LangGraph or Agent Framework, you can still make it compatible with Microsoft AI Foundry by manually implementing the predefined interface.
`AgentHost` is typically used with a protocol package. The simplest setup with the invocations protocol:

```python
from azure.ai.agentserver.core import AgentHost
from azure.ai.agentserver.invocations import InvocationHandler
from starlette.responses import JSONResponse

server = AgentHost()
invocations = InvocationHandler(server)

@invocations.invoke_handler
async def handle(request):
    body = await request.json()
    return JSONResponse({"greeting": f"Hello, {body['name']}!"})

server.run()
```

### Using AgentHost standalone

For custom protocol implementations, use `AgentHost` directly and register your own routes:

```python
# Previous FoundryCBAgent-based example, shown here with its imports:
import datetime

from azure.ai.agentserver.core import FoundryCBAgent
from azure.ai.agentserver.core.models import (
    CreateResponse,
    Response as OpenAIResponse,
)
from azure.ai.agentserver.core.models.projects import (
    ItemContentOutputText,
    ResponsesAssistantMessageItemResource,
    ResponseTextDeltaEvent,
    ResponseTextDoneEvent,
)


def stream_events(text: str):
    assembled = ""
    tokens = text.split(" ")
    for i, token in enumerate(tokens):
        piece = token if i == len(tokens) - 1 else token + " "
        assembled += piece
        yield ResponseTextDeltaEvent(delta=piece)
    # Done with text
    yield ResponseTextDoneEvent(text=assembled)


async def agent_run(request_body: CreateResponse):
    agent = request_body.agent
    print(f"agent:{agent}")

    if request_body.stream:
        return stream_events("I am mock agent with no intelligence in stream mode.")

    # Build assistant output content
    output_content = [
        ItemContentOutputText(
            text="I am mock agent with no intelligence.",
            annotations=[],
        )
    ]

    response = OpenAIResponse(
        metadata={},
        temperature=0.0,
        top_p=0.0,
        user="me",
        id="id",
        created_at=datetime.datetime.now(),
        output=[
            ResponsesAssistantMessageItemResource(
                status="completed",
                content=output_content,
            )
        ],
    )
    return response


my_agent = FoundryCBAgent()
my_agent.agent_run = agent_run

if __name__ == "__main__":
    my_agent.run()
```
```python
from azure.ai.agentserver.core import AgentHost
from starlette.requests import Request
from starlette.responses import JSONResponse
from starlette.routing import Route

async def my_endpoint(request: Request):
return JSONResponse({"status": "ok"})

server = AgentHost()
server.register_routes([Route("/my-endpoint", my_endpoint, methods=["POST"])])
server.run()
```

### Shutdown handler

Register a cleanup function that runs during graceful shutdown:

```python
server = AgentHost()

@server.shutdown_handler
async def on_shutdown():
# Close database connections, flush buffers, etc.
pass
```

### Configuring tracing

Tracing is enabled automatically when an Application Insights connection string is available:

```python
server = AgentHost(
application_insights_connection_string="InstrumentationKey=...",
)
```

Or via environment variable:

```bash
export APPLICATIONINSIGHTS_CONNECTION_STRING="InstrumentationKey=..."
python my_agent.py
```
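The auto-enable behavior described above (tracing turns on when either Azure Monitor or an OTLP endpoint is configured) can be sketched as follows. This is a minimal illustration of the decision logic, not the package's actual implementation; the `tracing_enabled` name is assumed.

```python
import os
from typing import Optional


def tracing_enabled(connection_string: Optional[str] = None) -> bool:
    """Tracing auto-enables when an explicit connection string is passed,
    or when Azure Monitor / OTLP settings are present in the environment."""
    return bool(
        connection_string
        or os.environ.get("APPLICATIONINSIGHTS_CONNECTION_STRING")
        or os.environ.get("OTEL_EXPORTER_OTLP_ENDPOINT")
    )
```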

## Troubleshooting

First, run your agent with azure-ai-agentserver-core locally.
### Logging

If it works locally but fails in the cloud, check the logs in the Application Insights resource connected to your Azure AI Foundry project.
Set the log level to `DEBUG` for detailed diagnostics:

```python
server = AgentHost(log_level="DEBUG")
```

### Reporting issues
Or via environment variable:

```bash
export AGENT_LOG_LEVEL=DEBUG
```

To report an issue with the client library, or request additional features, please open a GitHub issue [here](https://github.com/Azure/azure-sdk-for-python/issues). Mention the package name "azure-ai-agents" in the title or content.
### Reporting issues

To report an issue with the client library, or request additional features, please open a GitHub issue [here](https://github.com/Azure/azure-sdk-for-python/issues).

## Next steps

Please visit the [Samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/agentserver/azure-ai-agentserver-core/samples) folder, which includes several examples of building your agent with azure-ai-agentserver.

- Install [`azure-ai-agentserver-invocations`](https://pypi.org/project/azure-ai-agentserver-invocations/) to add the invocation protocol endpoints.
- See the [container image spec](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/agentserver) for the full hosted agent contract.

## Contributing

Expand All @@ -117,3 +155,5 @@ This project has adopted the
[Microsoft Open Source Code of Conduct][code_of_conduct]. For more information,
see the Code of Conduct FAQ or contact opencode@microsoft.com with any
additional questions or comments.

[code_of_conduct]: https://opensource.microsoft.com/codeofconduct/
@@ -1 +1 @@
__path__ = __import__("pkgutil").extend_path(__path__, __name__) # type: ignore
__path__ = __import__("pkgutil").extend_path(__path__, __name__)
@@ -1 +1 @@
__path__ = __import__("pkgutil").extend_path(__path__, __name__) # type: ignore
__path__ = __import__("pkgutil").extend_path(__path__, __name__)
@@ -1 +1 @@
__path__ = __import__("pkgutil").extend_path(__path__, __name__) # type: ignore
__path__ = __import__("pkgutil").extend_path(__path__, __name__)
@@ -1,14 +1,35 @@
# ---------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# ---------------------------------------------------------
"""Azure AI AgentHost core framework.

Provides the :class:`AgentHost` host and shared utilities for
building Azure AI Hosted Agent containers.

Public API::

from azure.ai.agentserver.core import (
AgentLogger,
AgentHost,
Constants,
ErrorResponse,
TracingHelper,
)
"""
__path__ = __import__("pkgutil").extend_path(__path__, __name__)

from ._base import AgentHost
from ._constants import Constants
from ._errors import ErrorResponse
from ._logger import AgentLogger
from ._tracing import TracingHelper
from ._version import VERSION
from .logger import configure as config_logging
from .server.base import FoundryCBAgent
from .server.common.agent_run_context import AgentRunContext

config_logging()

__all__ = ["FoundryCBAgent", "AgentRunContext"]
__all__ = [
"AgentLogger",
"AgentHost",
"Constants",
"ErrorResponse",
"TracingHelper",
]
__version__ = VERSION