fix(run-engine): fix queue cache memory leak and replace MemoryStore with LRU cache #2945
Conversation
Walkthrough

Introduces a new LRU-backed in-memory cache implementation (LRUMemoryStore, createLRUMemoryStore) and exports it from internal-packages/cache, adds the lru-cache dependency and a Vitest config, and adds comprehensive tests for the LRUMemoryStore. Replaces many in-process MemoryStore usages across the codebase with createLRUMemoryStore(...) calls (webapp services, run-engine, queues, rate limiting, idempotency, etc.). Also updates one private method signature in runAttemptSystem to change how queue cache keys are derived and used.

Estimated code review effort: 4 (Complex), ~45 minutes.
Force-pushed from 8bd2b77 to 2871f4a.
Replaces the @unkey/cache MemoryStore with a new LRUMemoryStore for O(1) operations and better memory bounds.

Problem
Cache Key Bug

The queue cache in `#resolveTaskRunExecutionQueue` was keyed by `runId`, creating one cache entry per run instead of one per queue. With 1–2 hour TTLs and a 5000-entry soft cap, these entries accumulated, causing memory growth.

MemoryStore Performance
The @unkey/cache MemoryStore uses O(n) synchronous iteration for eviction, blocking the event loop at high throughput.

Solution
Cache Key Fix

Changed the cache key from `params.runId` to the queue identifier:
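The actual diff isn't included in this excerpt; a minimal sketch of the kind of change described, where the parameter names `environmentId` and `queueName` are assumptions rather than the real run-engine API, might look like:

```typescript
// Hypothetical sketch of the cache-key fix described above.
// Before, the key was params.runId, so every run created its own entry.
// After, the key identifies the queue, so runs on the same queue share one entry.
type QueueKeyParams = { runId: string; environmentId: string; queueName: string };

function queueCacheKey(params: QueueKeyParams): string {
  // Keyed per queue: two runs on the same queue resolve to the same key.
  return `${params.environmentId}:${params.queueName}`;
}
```

With a per-queue key, the number of cache entries is bounded by the number of distinct queues instead of growing with every run.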
Created an LRUMemoryStore adapter using the lru-cache package:
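The adapter's code isn't shown in this excerpt. To illustrate why an LRU replaces the O(n) eviction sweep, here is a dependency-free sketch of the underlying idea using a plain `Map` (the real implementation wraps the lru-cache package and conforms to the @unkey/cache store interface; this class is only an illustration):

```typescript
// Sketch of the LRU idea behind LRUMemoryStore: bounded size with
// O(1) get/set/evict. A Map iterates keys in insertion order, so the
// first key is always the least recently used one.
class TinyLRU<V> {
  private map = new Map<string, V>();
  constructor(private max: number) {}

  get(key: string): V | undefined {
    const v = this.map.get(key);
    if (v === undefined) return undefined;
    // Re-insert to mark this key as most recently used.
    this.map.delete(key);
    this.map.set(key, v);
    return v;
  }

  set(key: string, value: V): void {
    if (this.map.has(key)) {
      this.map.delete(key);
    } else if (this.map.size >= this.max) {
      // Evict the least recently used entry (the first key) in O(1),
      // instead of scanning every entry as an O(n) sweep would.
      this.map.delete(this.map.keys().next().value!);
    }
    this.map.set(key, value);
  }
}
```

Because eviction touches only one entry per insert, the event loop is never blocked by a full-cache scan, which is the performance problem described above.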
Test Results: