
[Feature]: Allow dynamic prompts for Arize Phoenix through the gateway/proxy #17905

@metalshanked

The Feature

Thanks @krrishdholakia for this integration!
Would it be possible to dynamically set the Arize Phoenix prompt id in LiteLLM Proxy calls?

This would help when users have a large number of prompts to manage in Phoenix, since adding each prompt to the LiteLLM config would be cumbersome.

So, in the proxy curl call below, could we pass the Arize Phoenix prompt id directly as well?

Motivation, pitch

Currently it seems the Arize Phoenix prompt ID needs to be hardcoded into the proxy config:

```yaml
prompts:
  - prompt_id: "simple_prompt"
    litellm_params:
      prompt_id: "UHJvbXB0VmVyc2lvbjox"
      prompt_integration: "arize_phoenix"
      api_base: https://app.phoenix.arize.com/s/your-workspace
      api_key: os.environ/PHOENIX_API_KEY
      ignore_prompt_manager_model: true # optional: use model from config instead
      ignore_prompt_manager_optional_params: true # optional: ignore temp, max_tokens from prompt
```
**Make request:**

```bash
curl -X POST 'http://0.0.0.0:4000/chat/completions' \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer sk-1234' \
-d '{
  "model": "gpt-3.5-turbo",
  "prompt_id": "simple_prompt",  # <-- can this directly be the Arize Phoenix prompt id?
  "prompt_variables": {
    "question": "Explain quantum computing"
  }
}'
```
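For illustration, the requested behavior might look something like the sketch below. This is purely hypothetical: passing a Phoenix prompt id and integration directly in the request body is exactly what this issue asks for, so the request-level `prompt_integration` field shown here is not supported today.

```bash
# Hypothetical request shape -- not supported today; sketches the proposed feature.
curl -X POST 'http://0.0.0.0:4000/chat/completions' \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer sk-1234' \
-d '{
  "model": "gpt-3.5-turbo",
  "prompt_id": "UHJvbXB0VmVyc2lvbjox",
  "prompt_integration": "arize_phoenix",
  "prompt_variables": {
    "question": "Explain quantum computing"
  }
}'
```

With something like this, the proxy would resolve the prompt against Phoenix at request time instead of looking it up in the static `prompts:` block of the config.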

LiteLLM is hiring a founding backend engineer, are you interested in joining us and shipping to all our users?

No

Twitter / LinkedIn details

No response
