
[Bedrock] service_tier rejected for openai.gpt-oss-120b-1:0 unless allowed_openai_params is manually set #17911

@kuldeepjain

Description

What happened? (Also tell us what you expected to happen)

When the Bedrock model openai.gpt-oss-120b-1:0 is called through the Chat Completions API, LiteLLM rejects the service_tier parameter with an UnsupportedParamsError.

Expected behavior:
LiteLLM should automatically forward service_tier to Bedrock for models that support AWS Service Tiers, without requiring the caller to explicitly whitelist it through allowed_openai_params.

AWS Bedrock does support service tiers for this model.


Request Payload (fails)

{
  "messages": [
    { "role": "user", "content": "Hi" }
  ],
  "max_tokens": 100,
  "service_tier": "priority"
}

Response – 400 Bad Request

UnsupportedParamsError: bedrock does not support parameters: ['service_tier'], for model=openai.gpt-oss-120b-1:0.
To drop these, set drop_params=True or for proxy: litellm_settings: drop_params: true.
If you want to use these params dynamically send allowed_openai_params=['service_tier'] in your request.
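
For completeness, the same rejection can be reproduced directly against the LiteLLM SDK. This is a minimal sketch: the bedrock/ model prefix and AWS credentials configured in the environment are assumptions, since the report above goes through a proxy service.

import litellm
from litellm.exceptions import UnsupportedParamsError

try:
    litellm.completion(
        model="bedrock/openai.gpt-oss-120b-1:0",  # assumed SDK-style provider prefix
        messages=[{"role": "user", "content": "Hi"}],
        max_tokens=100,
        service_tier="priority",  # rejected during parameter mapping, before any AWS call
    )
except UnsupportedParamsError as err:
    # "bedrock does not support parameters: ['service_tier'], ..."
    print(err)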

Request Payload (works only when service_tier is manually whitelisted)

{
  "messages": [
    { "role": "user", "content": "Hi" }
  ],
  "max_tokens": 100,
  "service_tier": "priority",
  "allowed_openai_params": ["service_tier"]
}

Response Payload

{
  "id": "chatcmpl-b402c763-fda9-42bb-9894-84e5f2351662",
  "model": "openai.gpt-oss-120b-1:0",
  "choices": [
    {
      "message": {
        "role": "assistant",
        "content": "",
        "reasoning_content": "The user says \"Hi\". Lik",
        "thinking_blocks": [
          { "type": "thinking", "thinking": "The user says \"Hi\". Lik" }
        ]
      }
    }
  ]
}
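
The equivalent workaround through the SDK looks roughly like the following (a sketch, assuming the bedrock/ model prefix and valid AWS credentials in the environment):

import litellm

response = litellm.completion(
    model="bedrock/openai.gpt-oss-120b-1:0",  # assumed SDK-style provider prefix
    messages=[{"role": "user", "content": "Hi"}],
    max_tokens=100,
    service_tier="priority",
    # Without this explicit per-request whitelist the call fails with UnsupportedParamsError.
    allowed_openai_params=["service_tier"],
)
print(response.choices[0].message.content)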

Steps to Reproduce

  1. Call Bedrock model openai.gpt-oss-120b-1:0 with a standard Chat Completions payload that includes "service_tier": "priority"
  2. Observe UnsupportedParamsError
  3. Add "allowed_openai_params": ["service_tier"]
  4. Observe that the request now succeeds

Expected Behavior

  • LiteLLM should automatically allow service_tier for supported Bedrock models
  • The caller should not need to manually set allowed_openai_params
  • The parameter should not be treated as unsupported when AWS Bedrock accepts it (see the sketch below for checking LiteLLM's current supported-parameter list)
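
One way to check what LiteLLM currently treats as supported for this model is the public get_supported_openai_params helper (a sketch; the exact contents of the returned list depend on the installed LiteLLM version):

import litellm

# List the OpenAI-compatible params LiteLLM maps for this Bedrock model.
params = litellm.get_supported_openai_params(
    model="openai.gpt-oss-120b-1:0",
    custom_llm_provider="bedrock",
)
# Expectation from this issue: "service_tier" should appear here for Bedrock
# models that support AWS service tiers; as of 1.79.1 it does not, hence the error.
print("service_tier" in (params or []))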

Relevant Log Output

  File "/app/app/service/v3/services/chat_service.py", line 541, in _acompletion_with_retry
    return await acompletion(**params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.11/site-packages/litellm/utils.py", line 1637, in wrapper_async
    raise e
  File "/app/.venv/lib/python3.11/site-packages/litellm/utils.py", line 1483, in wrapper_async
    result = await original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.11/site-packages/litellm/main.py", line 614, in acompletion
    raise exception_type(
  File "/app/.venv/lib/python3.11/site-packages/litellm/main.py", line 587, in acompletion
    init_response = await loop.run_in_executor(None, func_with_context)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.11/site-packages/litellm/utils.py", line 1098, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.11/site-packages/litellm/main.py", line 3767, in completion
    raise exception_type(
  File "/app/.venv/lib/python3.11/site-packages/litellm/main.py", line 1318, in completion
    optional_params = get_optional_params(
                      ^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.11/site-packages/litellm/utils.py", line 3469, in get_optional_params
    _check_valid_arg(
  File "/app/.venv/lib/python3.11/site-packages/litellm/utils.py", line 3452, in _check_valid_arg
    raise UnsupportedParamsError(
litellm.exceptions.UnsupportedParamsError: litellm.UnsupportedParamsError: bedrock does not support parameters: ['service_tier'], for model=openai.gpt-oss-120b-1:0. To drop these, set `litellm.drop_params=True` or for proxy:
`litellm_settings:
 drop_params: true`

Full logs included above.


LiteLLM version

1.79.1


Are you an ML Ops Team?

Yes
(We operate the ML Platform that exposes LLM models internally.)
