The Feature
Thanks @krrishdholakia for this integration!
Would it be possible to dynamically set the Arize Phoenix prompt ID in LiteLLM Proxy calls?
This would help when users have a large number of prompts to manage in Phoenix, since adding each prompt to the LiteLLM config is cumbersome.
Concretely, in the proxy curl call shown below, could the Arize Phoenix prompt ID be passed directly instead of a config-registered alias?
Motivation, pitch
Currently, the Arize Phoenix prompt ID appears to need to be hardcoded in the proxy config:
```yaml
prompts:
  - prompt_id: "simple_prompt"
    litellm_params:
      prompt_id: "UHJvbXB0VmVyc2lvbjox"
      prompt_integration: "arize_phoenix"
      api_base: https://app.phoenix.arize.com/s/your-workspace
      api_key: os.environ/PHOENIX_API_KEY
      ignore_prompt_manager_model: true # optional: use model from config instead
      ignore_prompt_manager_optional_params: true # optional: ignore temp, max_tokens from prompt
```
**Make request**
```bash
curl -X POST 'http://0.0.0.0:4000/chat/completions' \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer sk-1234' \
  -d '{
    "model": "gpt-3.5-turbo",
    "prompt_id": "simple_prompt",
    "prompt_variables": {
      "question": "Explain quantum computing"
    }
  }'
```

Could `prompt_id` here directly take the Arize Phoenix prompt ID instead?
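For illustration, the requested behavior might look something like the sketch below, where the request passes the Phoenix prompt version ID directly and no per-prompt config entry is needed. This is a sketch of the proposed API, not existing LiteLLM behavior; whether the existing `prompt_id` field should accept Phoenix IDs, or a separate field or prefix is needed, is an open design question.

```bash
# Hypothetical (proposed, not yet supported): pass the Arize Phoenix
# prompt version ID directly, skipping the per-prompt config entry.
curl -X POST 'http://0.0.0.0:4000/chat/completions' \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer sk-1234' \
  -d '{
    "model": "gpt-3.5-turbo",
    "prompt_id": "UHJvbXB0VmVyc2lvbjox",
    "prompt_variables": {
      "question": "Explain quantum computing"
    }
  }'
```

Presumably the Phoenix connection details (`api_base`, `api_key`) could still live in a single integration-level config entry, so only the per-prompt mappings would no longer need to be enumerated.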
LiteLLM is hiring a founding backend engineer, are you interested in joining us and shipping to all our users?
No
Twitter / LinkedIn details
No response