Description
Add performance benchmarks to CI to track model instantiation speed and prevent performance regressions.
Context
From discussion in #640, there's concern about the performance overhead of HashModel/JsonModel instantiation compared to plain Pydantic BaseModel. While some overhead is expected due to Redis OM's additional functionality (metaclass processing, PK generation, field validation), we should track this to:
- Prevent performance regressions
- Measure improvements from optimizations
- Provide visibility into instantiation costs
Proposed Benchmarks
- Model instantiation timing (HashModel, JsonModel, EmbeddedJsonModel vs BaseModel)
- Bulk save operations
- Query performance (find, filter operations)
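As a starting point, instantiation timing can be measured with the standard library alone before any CI tooling is wired in. The sketch below is illustrative: PlainModel and per_instance_us are hypothetical names, and in the real benchmark the placeholder class would be swapped for a HashModel/JsonModel subclass and a plain Pydantic BaseModel for comparison.

```python
import timeit

class PlainModel:
    """Placeholder standing in for a HashModel/JsonModel subclass."""
    def __init__(self, name: str, age: int):
        self.name = name
        self.age = age

def per_instance_us(cls, number: int = 50_000) -> float:
    """Average instantiation time in microseconds over `number` runs."""
    total = timeit.timeit(lambda: cls(name="Ada", age=36), number=number)
    return total / number * 1e6

print(f"{PlainModel.__name__}: {per_instance_us(PlainModel):.2f} us/instance")
```

Running the same harness against each model class gives the relative overhead figures the benchmarks are meant to track.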
Implementation Options
- Use pytest-benchmark for benchmark tests
- Run benchmarks in CI and report results
- Consider using GitHub Actions for benchmark tracking (e.g., github-action-benchmark)
Reference
- Original performance analysis: Creating a model performance issue #640