What happened?
When using bedrock/amazon.nova-2-lite-v1:0 through LiteLLM, requests fail with a Bedrock API error stating that textGenerationConfig is not a valid parameter for Nova models.
Configuration:

```yaml
- model_name: nova-2-lite
  litellm_params:
    model: bedrock/amazon.nova-2-lite-v1:0
    model_id: ${INFERENCE_PROFILE_ARN}
```

Request:
```shell
curl --location 'http://localhost:4000/chat/completions' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer sk-1234' \
--data '{
  "model": "nova-2-lite",
  "messages": [
    {"role": "user", "content": "Hello"}
  ]
}'
```

Relevant log output
```
litellm.ServiceUnavailableError: litellm.MidStreamFallbackError: litellm.BadRequestError: BedrockException - {"message":"Malformed input request: #: extraneous key [textGenerationConfig] is not permitted, reformat your input and try again."}. Received Model Group=nova-2-lite
Available Model Group Fallbacks=None
```
Root Cause:
LiteLLM's Bedrock handler sends textGenerationConfig (the request shape used by other Bedrock text models) instead of inferenceConfig, which Nova models require. According to the AWS Nova API documentation, Nova models use a different request structure that nests generation parameters under inferenceConfig.
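For illustration, the mismatch can be sketched as two request bodies. The top-level keys come from the error message and the AWS Nova documentation; the nested parameter names (maxTokens, temperature) and the surrounding message shape are assumptions for this sketch, not LiteLLM's exact output:

```python
# What the Bedrock handler currently sends -- rejected by Nova with
# "extraneous key [textGenerationConfig] is not permitted":
rejected_body = {
    "inputText": "Hello",
    "textGenerationConfig": {"maxTokens": 512, "temperature": 0.7},
}

# What Nova models expect instead -- generation parameters nested
# under "inferenceConfig" alongside a messages array:
nova_body = {
    "messages": [{"role": "user", "content": [{"text": "Hello"}]}],
    "inferenceConfig": {"maxTokens": 512, "temperature": 0.7},
}
```

The generation parameters themselves are the same in both cases; only the wrapping key differs, which is why the Bedrock API rejects the request outright rather than ignoring the unknown field.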
Expected Behavior:
The Bedrock provider should detect Nova models and use inferenceConfig instead of textGenerationConfig.
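A minimal sketch of the expected detection logic, assuming the handler can branch on the model ID (the helper name and structure are hypothetical, not LiteLLM's actual internals):

```python
def wrap_generation_config(model: str, params: dict) -> dict:
    """Hypothetical helper: wrap generation params in the key the
    target Bedrock model family expects."""
    # Nova model IDs contain "nova", e.g. amazon.nova-2-lite-v1:0,
    # amazon.nova-pro-v1:0 -- these require "inferenceConfig".
    if "nova" in model.lower():
        return {"inferenceConfig": params}
    # Other Bedrock text models keep the existing key.
    return {"textGenerationConfig": params}
```

For example, `wrap_generation_config("bedrock/amazon.nova-2-lite-v1:0", {"temperature": 0.7})` would nest the parameters under inferenceConfig, while other model IDs keep the current behavior.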
Additional Context:
- PR #17351 ([New Model] Add Amazon Nova as first party provider for chat completions) added first-party amazon-nova/ provider support, but the bedrock/ handler still needs updating
- Other Nova models (nova-pro, nova-lite, nova-micro) likely have the same issue
- This affects users on Bedrock inference profiles, who cannot switch to the direct Nova API
Are you a ML Ops Team?
No
What LiteLLM version are you on?
v1.80.7
Twitter / LinkedIn details
No response