Conversation
Pull Request Overview
This PR adds a comprehensive example demonstrating how to use NVIDIA NIM models with the Agent Framework through Azure AI Foundry by configuring the OpenAI Chat Client to point to Azure AI Foundry endpoints.
- Adds a complete Python example showing both streaming and non-streaming responses with tool calling capabilities
- Includes detailed setup documentation with prerequisites and environment variable configuration
- Provides comprehensive README documentation for the NVIDIA examples folder
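The mechanism the example relies on is that NVIDIA NIM deployments on Azure AI Foundry expose an OpenAI-compatible chat completions endpoint, so an OpenAI-style client can target it simply by swapping the base URL and API key. A minimal stdlib-only sketch of the request shape (the endpoint URL, model name, and environment-variable names are illustrative assumptions, not taken from the sample):

```python
import json
import os
import urllib.request

# Hypothetical values; the real endpoint and key come from your Azure AI Foundry deployment.
base_url = os.environ.get("OPENAI_BASE_URL", "https://<your-resource>.services.ai.azure.com/v1")
api_key = os.environ.get("OPENAI_API_KEY", "<your-api-key>")

def build_chat_request(model: str, messages: list[dict]) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completions request."""
    payload = {"model": model, "messages": messages}
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = build_chat_request(
    "meta/llama-3.1-8b-instruct",  # example NIM model name; substitute your deployment
    [{"role": "user", "content": "Hello!"}],
)
print(req.full_url)
```

Because the wire format is the standard OpenAI one, the same request works through any OpenAI SDK or client configured with that base URL.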
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| python/samples/getting_started/agents/nvidia/nvidia_nim_with_openai_chat_client.py | Complete example implementation showing NVIDIA NIM integration with OpenAI Chat Client, including tool functions and both streaming/non-streaming usage patterns |
| python/samples/getting_started/agents/nvidia/README.md | Documentation explaining setup prerequisites, environment variables, API compatibility, and available models for NVIDIA NIM integration |
@ekzhu @markwallace-microsoft @eavanvalkenburg - Please hold off on merging this PR. I have a few updates to make (working with @azeltov) and will push the changes within the next couple of hours.

Simplified the abstractions and example. Thanks to @azeltov. We will be adding more to this.

We are good to merge now. Tested and confirmed with @azeltov.
Introduces an example demonstrating the integration of NVIDIA NIM models deployed on Azure AI Foundry with the Agent Framework, leveraging the OpenAI Chat Client. This allows users to interact with NVIDIA NIM models using a familiar OpenAI-compatible API, enabling features like tool calling, streaming responses, and standard chat completion. Includes setup instructions and environment variable configuration for seamless deployment and usage.
Updates the environment variable names to align with OpenAI naming conventions. This change ensures consistency and clarity when configuring the NVIDIA NIM with OpenAI Chat Client.
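Aligning with OpenAI naming conventions typically means reusing the variable names the OpenAI SDK already reads. A plausible configuration sketch (the values are placeholders, and the model-id variable name is an assumption; the exact names used by the sample may differ):

```shell
# Point the OpenAI-compatible client at the Azure AI Foundry NIM endpoint.
export OPENAI_BASE_URL="https://<your-foundry-resource>.services.ai.azure.com/v1"
export OPENAI_API_KEY="<your-api-key>"
# Hypothetical variable for the deployed NIM model name.
export OPENAI_CHAT_MODEL_ID="<your-nim-deployment-name>"
```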
Refactors the NVIDIA NIM agent example to use a custom chat client for handling the specific message format requirements of NVIDIA NIM models. This change streamlines the example and makes it more focused on demonstrating how to use NVIDIA NIM models with the Agent Framework on Azure AI Foundry.
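The refactor's idea, as described, is a thin subclass that adapts outgoing messages to the format a NIM deployment expects before delegating to the OpenAI-compatible client. A self-contained sketch of that adaptation step (the class name mirrors the sample, but the base class here is a local stand-in rather than the real `OpenAIChatClient`, and the normalization rule is an illustrative assumption):

```python
from dataclasses import dataclass, field

@dataclass
class ChatClientStub:
    """Stand-in for OpenAIChatClient; records what would be sent."""
    sent: list = field(default_factory=list)

    def send(self, messages: list[dict]) -> None:
        self.sent.append(messages)

class NVIDIANIMChatClient(ChatClientStub):
    """Adapts messages before delegating to the OpenAI-compatible client."""

    def send(self, messages: list[dict]) -> None:
        # Illustrative rule: flatten content-part lists into plain strings,
        # since some model servers expect string content rather than arrays.
        normalized = []
        for msg in messages:
            content = msg["content"]
            if isinstance(content, list):
                content = "".join(part.get("text", "") for part in content)
            normalized.append({"role": msg["role"], "content": content})
        super().send(normalized)

client = NVIDIANIMChatClient()
client.send([{"role": "user", "content": [{"type": "text", "text": "Hi"}]}])
print(client.sent[0])  # [{'role': 'user', 'content': 'Hi'}]
```

Keeping the adaptation in an override like this leaves the rest of the agent code unchanged, which is what makes the example read as a focused demonstration rather than a new abstraction layer.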
Force-pushed from b517b5d to 9ce3e7e.
```python
class NVIDIANIMChatClient(OpenAIChatClient):
```
Directed to @eavanvalkenburg: we have other clients based on the OpenAIChatClient. Why wouldn't we want to bring this NVIDIANIMChatClient into its own package, instead of it living purely in the samples?
Good question, @moonbox3. I've asked Shawn for his opinion.
Closing because this is stale; feel free to reopen. I wouldn't mind having a native NVIDIA NIM client, but it would require some work with the new structure (raw clients, etc.).
Adds an example demonstrating how to use NVIDIA NIM models with the Agent Framework through Azure AI Foundry.
The example shows how to configure the OpenAI Chat Client to use NVIDIA NIM models deployed on Azure AI Foundry, including both streaming and non-streaming responses with tool calling capabilities.
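To make the tool-calling and streaming parts concrete: in an OpenAI-compatible API, tools are advertised as JSON schemas inside the request body, and streaming is toggled with a single flag. A stdlib-only sketch of what such a request body looks like (the weather tool and model name are illustrative, not the sample's actual tools):

```python
import json

def make_chat_body(model: str, user_text: str, stream: bool) -> dict:
    """Assemble an OpenAI-style chat completions body with one tool."""
    weather_tool = {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool, for illustration
            "description": "Get the current weather for a location.",
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        },
    }
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
        "tools": [weather_tool],
        "stream": stream,  # True -> server-sent chunks; False -> one full response
    }

body = make_chat_body("meta/llama-3.1-8b-instruct", "Weather in Paris?", stream=True)
print(json.dumps(body, indent=2))
```

When the model decides to call the tool, the response (or stream of chunks) carries a `tool_calls` entry with the function name and JSON-encoded arguments, which the agent framework executes and feeds back as a `tool` role message.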