Package.json file
Not applicable - issue identified via static code analysis of the core repository
Node.js version
v24.14.0
Database and its version
Not applicable - issue is at application layer, not database-specific
Operating system name and version
Windows 11
Browser name
No response
What happened?
Medusa version: v2.x
Module(s) affected: @medusajs/product, @medusajs/cart
Summary
When updateProducts() is called with a selector object (not a string ID),
it internally calls productService_.list(selector, {}, ctx) with an empty
config — no take, no skip. This loads all matching products into memory
before updating them.
The same pattern exists in updateCarts() when called with a selector.
Affected Code
packages/modules/product/src/services/product-module-service.ts ~L1613:
// When idOrSelector is a FilterableProductProps object:
const products = await this.productService_.list(
  idOrSelector,
  {}, // no take, no skip
  sharedContext
)
packages/modules/cart/src/services/cart-module.ts ~L403:
const carts = await this.cartService_.list(
  { ...dataOrIdOrSelector },
  { select: ["id"] }, // fetches only ids but still unbounded
  sharedContext
)
Impact
In production stores with large datasets, calling these methods with a broad
selector (e.g. { status: "active" } or { collection_id: "x" }) runs an
unbounded query and loads every matching row into Node.js memory. This can cause:
- Memory exhaustion / OOM crashes
- Degraded performance during bulk update operations
- No upper bound on DB query cost
Suggested Fix
Add batching logic similar to how updateProducts_() at L1748 handles it:
// Safe pattern already used in the codebase:
const products = await this.productService_.list(
  { id: productIds },
  { take: productIds.length },
  sharedContext
)
For selector-based updates, either:
- Process results in batches using take/skip pagination
- Or, at minimum, document the risk and add a warning log
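The take/skip batching approach could look like the following sketch (illustrative only: `list`, `BATCH_SIZE`, `updateInBatches`, and the mock data are assumptions, not Medusa internals; the real fix would live inside the module service):

```typescript
// Illustrative sketch: process selector matches in fixed-size pages
// instead of loading every match at once. The mock `list` stands in
// for productService_.list(selector, config, ctx).
type Product = { id: string; status: string }

const DB: Product[] = Array.from({ length: 2500 }, (_, i) => ({
  id: `prod_${i}`,
  status: "active",
}))

// Mock of a paginated list(selector, { take, skip }) call.
async function list(
  selector: Partial<Product>,
  config: { take: number; skip: number }
): Promise<Product[]> {
  const matches = DB.filter((p) =>
    Object.entries(selector).every(([k, v]) => p[k as keyof Product] === v)
  )
  return matches.slice(config.skip, config.skip + config.take)
}

const BATCH_SIZE = 1000 // hypothetical per-query upper bound

async function updateInBatches(
  selector: Partial<Product>,
  update: (batch: Product[]) => Promise<void>
): Promise<number> {
  let skip = 0
  let processed = 0
  // Loop until a page comes back short, so memory stays O(BATCH_SIZE).
  for (;;) {
    const batch = await list(selector, { take: BATCH_SIZE, skip })
    if (batch.length === 0) break
    await update(batch)
    processed += batch.length
    if (batch.length < BATCH_SIZE) break
    skip += BATCH_SIZE
  }
  return processed
}

updateInBatches({ status: "active" }, async () => {}).then((n) =>
  console.log(n) // 2500 processed, but at most 1000 in memory at once
)
```

One caveat with skip-based pagination: if the update itself changes which rows match the selector, pages can shift mid-loop. A safer variant fetches only the matching ids first (bounded, with select: ["id"]) and then updates by id in batches.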
Related
This is the runtime consequence of the TSDoc inaccuracy fixed in PR #14920 —
the docs claimed a limit of 15; in reality there is no limit at all.
Expected behavior
When updateProducts() or updateCarts() is called with a broad selector object,
the internal list() query should be bounded — either by batching results using
take/skip, or by enforcing a maximum fetch size — to prevent unbounded memory
consumption regardless of dataset size.
Actual behavior
The internal list() call uses an empty config object {} with no take or skip.
On large datasets (e.g. 100k+ products or millions of carts), this loads all
matching records into Node.js heap memory simultaneously, risking OOM crashes
and severe performance degradation.
Link to reproduction repo
No reproduction repo needed — the issue is directly observable in the source at the referenced line numbers in the core repository. Confirmed via static analysis on the develop branch (commit: current HEAD).