Add cautionary language about resource-based tuner #4237

Merged
brianmacdonald-temporal merged 2 commits into main from resource-tuner-caveats on Feb 25, 2026
Conversation

@Sushisource (Member) commented Feb 24, 2026

What does this PR do?

Notes to reviewers

Attachments: EDU-5944 Add cautionary language about resource-based tuner

@Sushisource Sushisource requested a review from a team as a code owner February 24, 2026 22:29
vercel bot commented Feb 24, 2026

temporal-documentation deployment: Ready (Preview updated Feb 25, 2026 2:12pm UTC)

github-actions bot commented Feb 24, 2026

📖 Docs PR preview links

@brianmacdonald-temporal brianmacdonald-temporal merged commit f46ef7b into main Feb 25, 2026
10 checks passed
@brianmacdonald-temporal brianmacdonald-temporal deleted the resource-tuner-caveats branch February 25, 2026 14:31

The following use cases are particularly well suited to resource-based auto-tuning slot suppliers:

Scenarios with tasks that have variable, or very high, resource needs should rely on fixed-size slot suppliers instead.
A Contributor commented on the line above:

I'm not sure this is scary enough. Do we want to communicate that resource-based tuning can introduce performance regressions with highly variable resource needs? IMO, encountering this in the wild, I would still try resource-based tuning because hey, it can offer reasonable performance.
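The caveat being debated can be illustrated with a small conceptual sketch. This is plain Python and deliberately not the Temporal SDK API (all class and method names here are invented for illustration): a resource-based supplier samples usage *before* admitting a task, so tasks with large or variable footprints can push the worker past its target after admission, while a fixed-size supplier simply caps concurrency.

```python
class FixedSizeSlotSupplier:
    """Always allows up to a fixed number of concurrent slots."""

    def __init__(self, num_slots):
        self.num_slots = num_slots
        self.in_use = 0

    def try_reserve(self):
        if self.in_use < self.num_slots:
            self.in_use += 1
            return True
        return False

    def release(self):
        self.in_use -= 1


class ResourceBasedSlotSupplier:
    """Issues a slot only while observed memory usage is below a target.

    With highly variable per-task memory needs, the usage measured before
    a task starts can badly underestimate its eventual footprint, so the
    supplier may admit too many tasks -- the caveat the PR documents.
    """

    def __init__(self, target_mem_fraction, read_mem_fraction):
        self.target = target_mem_fraction
        self.read_mem = read_mem_fraction  # callable returning current usage

    def try_reserve(self):
        return self.read_mem() < self.target


# Simulate tasks that each consume 0.3 of memory once running, while the
# supplier samples usage before each admit: 0.0, 0.3, and 0.6 are all
# below the 0.8 target, so three tasks are admitted and usage lands at
# 0.9 -- over the target.
usage = [0.0]
supplier = ResourceBasedSlotSupplier(0.8, lambda: usage[0])
admitted = 0
for _ in range(5):
    if supplier.try_reserve():
        admitted += 1
        usage[0] += 0.3  # the task's real footprint lands after admission
print(admitted, round(usage[0], 1))  # → 3 0.9
```

This overshoot is exactly why the doc change steers variable or very high resource needs toward fixed-size slot suppliers, and the reviewer's point is that the current wording may not make the regression risk explicit enough.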
