Open
Labels: Enhancement (New feature or request), Issue/PR - Triage (New issue. Needs quick review to confirm validity and assign labels.)
Description
Problem (one or two sentences)
Users cannot take advantage of the prompt caching feature newly supported by Cerebras' zai-glm-4.7 model.
The provider's documentation clearly states that prompt caching is supported: https://inference-docs.cerebras.ai/models/zai-glm-47
Context (who is affected and when)
Whenever the zai-glm-4.7 model is selected with the Cerebras provider, prompt caching is disabled.
Desired behavior (conceptual, not technical)
Enable prompt caching by default.
Constraints / preferences (optional)
No response
Request checklist
- I've searched existing Issues and Discussions for duplicates
- This describes a specific problem with clear context and impact
Roo Code Task Links (optional)
No response
Acceptance criteria (optional)
No response
Proposed approach (optional)
Set supportsPromptCache to true for the zai-glm-4.7 model. The current model definition contains:

    supportsPromptCache: false,
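A minimal sketch of the proposed change. The ModelInfo interface and the cerebrasModels map below are illustrative assumptions, not the actual Roo Code source; only the supportsPromptCache field name comes from the quoted snippet.

```typescript
// Hypothetical model-definition shape for illustration only.
interface ModelInfo {
  supportsPromptCache: boolean;
}

const cerebrasModels: Record<string, ModelInfo> = {
  // Before the proposed fix this field read `supportsPromptCache: false`,
  // which prevented cache-eligible requests for this model.
  "zai-glm-4.7": { supportsPromptCache: true },
};

console.log(cerebrasModels["zai-glm-4.7"].supportsPromptCache); // true
```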
Trade-offs / risks (optional)
No response