### What happened?
Description
The Presidio Analyzer and Anonymizer services are working fine on their own; however, the LiteLLM setup fails when the Presidio guardrail is enabled.
Error: see the relevant log output below.
Steps to Reproduce
- Ensure Presidio Analyzer and Anonymizer services are running successfully.
- Configure LiteLLM with the configuration below.
- Run LiteLLM with guardrail: presidio and mode: pre_call (a direct check of the Analyzer endpoint is sketched after this list).
- Observe the ContentTypeError and 404 response.
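For reference, this is roughly how the failure can be reproduced outside of LiteLLM, using aiohttp (the HTTP client named in the traceback). It is only a sketch: it assumes the standard Presidio Analyzer REST contract (POST /analyze with text and language), and the base URL is a placeholder rather than the real Cloud Run service.

```python
# Sketch only: call the Presidio Analyzer directly with aiohttp to show
# where the ContentTypeError comes from. ANALYZER_BASE is a placeholder.
import asyncio
import aiohttp

ANALYZER_BASE = "https://<presidio-analyzer>.us-west1.run.app"

async def main() -> None:
    payload = {"text": "My name is John, call me at 212-555-1234", "language": "en"}
    async with aiohttp.ClientSession() as session:
        async with session.post(f"{ANALYZER_BASE}/analyze", json=payload) as resp:
            print(resp.status, resp.headers.get("Content-Type"))
            # aiohttp raises ContentTypeError here when the service replies
            # with an HTML error page (e.g. a 404) instead of JSON.
            print(await resp.json())

asyncio.run(main())
```

When the endpoint is reachable, resp.json() returns the detected entities as JSON; a 404 HTML page produces the same ContentTypeError as in the log below.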
Expected Behavior
LiteLLM should successfully call the Presidio Analyzer and then the Anonymizer and receive a valid JSON response.
Actual Behavior
LiteLLM attempts to call the Presidio Analyzer, but the request fails with a 404 and an unexpected mimetype of text/html instead of JSON.
LiteLLM Configuration
```yaml
model_list:
  - model_name: claude-sonnet-4@20250514
    litellm_params:
      model: vertex_ai/claude-3-5-sonnet-v2@20241022
      vertex_ai_project: os.environ/PROJECT_ID
      vertex_ai_location: "global"
  - model_name: gemini-2.5-pro
    litellm_params:
      model: vertex_ai/gemini-2.5-pro
      vertex_ai_project: os.environ/PROJECT_ID
      vertex_ai_location: "global"
  - model_name: gemini-2.5-flash
    litellm_params:
      model: vertex_ai/gemini-2.5-flash
      vertex_ai_project: os.environ/PROJECT_ID
      vertex_ai_location: "global"
  - model_name: gemini-3-pro-preview
    litellm_params:
      model: vertex_ai/gemini-3-pro-preview
      vertex_ai_project: os.environ/PROJECT_ID
      vertex_ai_location: "global"
  - model_name: gpt-40
    litellm_params:
      model: azure/gpt-4o-chatgpt
      api_base: https://openai-poc-useast2.openai.azure.com/
      api_version: "2024-08-01-preview"
      api_key: os.environ/AZURE_API_KEY
  - model_name: gpt-4.1
    litellm_params:
      model: azure/gpt-4.1-*****
      api_base: https://openai-poc-useast2.openai.azure.com/
      api_version: "2024-08-01-preview"
      api_key: os.environ/AZURE_API_KEY
  - model_name: gpt-5
    litellm_params:
      model: azure/gpt-5-chat-*****
      api_base: https://openai-poc-useast2.openai.azure.com/
      api_version: "2025-01-01-preview"
      api_key: os.environ/AZURE_API_KEY
  - model_name: gemini-2.5-pro-preview-05-06
    litellm_params:
      model: vertex_ai/gemini-2.5-pro
      vertex_ai_project: os.environ/PROJECT_ID
      vertex_ai_location: "global"
  - model_name: gemini-2.5-flash-preview-04-17
    litellm_params:
      model: vertex_ai/gemini-2.5-flash
      vertex_ai_project: os.environ/PROJECT_ID
      vertex_ai_location: "global"
  - model_name: gemini-2.0-flash-exp
    litellm_params:
      model: vertex_ai/gemini-2.5-flash
      vertex_ai_project: os.environ/PROJECT_ID
      vertex_ai_location: "global"

guardrails:
  - guardrail_name: "presidio-pii"
    litellm_params:
      guardrail: presidio
      mode: "pre_call"
      default_on: true
      presidio_language: "en"

general_settings:
  master_key: ********

litellm_settings:
  drop_params: true
```
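For completeness, a small sanity-check sketch of the endpoint wiring. It assumes the guardrail reads the service locations from the PRESIDIO_ANALYZER_API_BASE and PRESIDIO_ANONYMIZER_API_BASE environment variables (the names given in the LiteLLM Presidio guardrail docs) and simply prints the URLs that would be called:

```python
# Sketch only: print the Presidio URLs the guardrail would call, assuming it
# reads PRESIDIO_ANALYZER_API_BASE / PRESIDIO_ANONYMIZER_API_BASE (env var
# names taken from the LiteLLM Presidio guardrail docs).
import os

analyzer_base = os.environ.get("PRESIDIO_ANALYZER_API_BASE", "<unset>").rstrip("/")
anonymizer_base = os.environ.get("PRESIDIO_ANONYMIZER_API_BASE", "<unset>").rstrip("/")

print("analyze   ->", f"{analyzer_base}/analyze")
print("anonymize ->", f"{anonymizer_base}/anonymize")
```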
### Relevant log output
```shell
aiohttp.client_exceptions.ContentTypeError: 404, message='Attempt to decode JSON with unexpected mimetype: text/html; charset=utf-8', url='https://****-presidio-analyzer-******.us-west1.run.app/analyze'
```

### Are you a ML Ops Team?
No
### What LiteLLM version are you on?
main branch
### Twitter / LinkedIn details
No response