Description
Certain LLM APIs (notably https://ai.ezif.in/ and likely other kivest endpoints) default to streaming Server-Sent Events (SSE) responses instead of returning a standard JSON body. The OpenEvolve LLM client expects a non-streaming JSON response with a choices array, so it fails with:
'str' object has no attribute 'choices'
Root Cause
The API at https://ai.ezif.in/chat/completions returns SSE data by default:
data: {"id":"chatcmpl-xxx","object":"chat.completion.chunk",...}
data: [DONE]
The OpenAI client ends up holding this body as a plain string rather than a parsed response object, so accessing .choices raises the AttributeError above.
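A minimal sketch of the failure mode (the payload below is a stand-in modeled on the SSE output above, not a captured response):

```python
import json

# Stand-in for the raw SSE body a streaming-by-default endpoint returns.
raw_sse = (
    'data: {"id":"chatcmpl-xxx","object":"chat.completion.chunk",'
    '"choices":[{"delta":{"content":"Hi"}}]}\n'
    "data: [DONE]\n"
)

# Treated as a single response, the body is just a string --
# accessing .choices reproduces the logged error exactly.
try:
    raw_sse.choices
except AttributeError as exc:
    print(exc)  # 'str' object has no attribute 'choices'

# Each "data:" line (minus the [DONE] sentinel) is an independent JSON
# chunk, which is why the whole body is not parseable as one JSON object.
chunks = [
    json.loads(line[len("data: "):])
    for line in raw_sse.splitlines()
    if line.startswith("data: ") and line != "data: [DONE]"
]
print(chunks[0]["object"])  # chat.completion.chunk
```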
Fix
Add "stream": false to the API request parameters in openevolve/llm/openai.py:
# In generate_with_context method, after building params:
# Always request non-streaming responses
params["stream"] = False
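A sketch of where the line might sit in the request path; `generate_with_context`, the `params` dict, and the fake client below are assumptions modeled on the issue text, not verbatim openevolve code:

```python
def generate_with_context(client, model, messages, **kwargs):
    params = {"model": model, "messages": messages, **kwargs}
    # Always request non-streaming responses; endpoints that default
    # to SSE then return a single JSON body with a choices array.
    params["stream"] = False
    return client.chat.completions.create(**params)


# Minimal fake client that echoes the request params for demonstration.
class FakeCompletions:
    def create(self, **params):
        return params

class FakeChat:
    completions = FakeCompletions()

class FakeClient:
    chat = FakeChat()

sent = generate_with_context(
    FakeClient(), "gpt-4o-mini", [{"role": "user", "content": "hi"}]
)
print(sent["stream"])  # False
```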
I think this is standard OpenAI API behavior and doesn't harm any other API endpoints.
Reproduction Steps
- Configure OpenEvolve with an API that defaults to streaming (e.g., kivest/ezif)
- Run any evolution iteration
- Error appears in logs after successful HTTP request
Additional Notes
- The fix is in openevolve/llm/openai.py line ~170
- This is a one-line fix that makes the behavior explicit
- The `stream` parameter is part of the OpenAI Chat Completions API spec and I believe it is supported by all compatible endpoints