npm @medusajs/medusa 2.11.0
v2.11.0: Caching primitives, manual refunds, and improved promotion limits

Highlights

Caching layer

This release adds new caching primitives to Medusa's toolbox.

1. New caching module, @medusajs/caching

We have added a new caching module, available in early preview, that provides the tooling to integrate a cache into your Medusa application. The module comprises three key elements:

  • Cache service: the main module service with a simple API for interacting with the cache
  • Cache provider: the provider service that carries out cache operations against the configured storage backend (e.g. Redis)
  • Cache strategy: the strategy responsible for computing cache keys and invalidation tags

Cache invalidation
Knowing when and how to invalidate cache entries is a challenging task. With the new caching module, we have baked this logic directly into the tooling, so you don't have to manage the complexity yourself.

We plan to share a technical deep dive covering the invalidation mechanism in detail, but here's the short version: whenever entries are inserted, the cache strategy computes tags based on the data. Invalidation of these entries relies on internal module events emitted whenever mutational service methods are called, e.g. service.update({ ... }). The cache strategy intercepts these events, computes the relevant tags using the event data, and instructs the cache provider to clear all entries associated with those tags.

Read more about this in our documentation.
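
To make the flow concrete, here is a minimal sketch of the idea. The names (CacheProvider, computeTags, onMutationEvent) are hypothetical and only illustrate the mechanism; they are not the actual API of @medusajs/caching:

// Hypothetical sketch of tag-based invalidation; not the @medusajs/caching API.
type CacheProvider = {
  set(key: string, value: unknown, tags: string[]): Promise<void>
  invalidateTags(tags: string[]): Promise<void>
}

// The strategy derives tags from the cached data, e.g. one tag per entity id.
const computeTags = (entity: string, rows: { id: string }[]): string[] =>
  rows.map((row) => `${entity}:${row.id}`)

// On insert, entries are stored together with the tags computed from the data.
async function cacheQueryResult(
  provider: CacheProvider,
  key: string,
  entity: string,
  rows: { id: string }[]
): Promise<void> {
  await provider.set(key, rows, computeTags(entity, rows))
}

// When a mutational method emits an internal event (e.g. after service.update),
// the strategy maps the event data to tags and clears all entries holding them.
async function onMutationEvent(
  provider: CacheProvider,
  event: { entity: string; ids: string[] }
): Promise<void> {
  await provider.invalidateTags(event.ids.map((id) => `${event.entity}:${id}`))
}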

2. New Redis caching provider, @medusajs/caching-redis

As described above, the new caching module follows our existing provider architecture, allowing different technologies to be used interchangeably without affecting the core application. In this release, we’ve introduced a Redis caching provider, which integrates directly with the caching module and serves as the underlying storage layer for the cached data.

3. New cache option on Medusa Query

Finally, to use the cache, we have introduced a new cache option in our query.graph API:

const { data } = await query.graph({
  entity: "product",
  filters: { id: "prod_1234" },
  options: { cache: { enable: true } }
})

This new option allows you to leverage the full power of the cache without having to think about computing keys and invalidation tags. These are automatically computed when query.graph is configured to use the cache.

Installation
Since the caching module is still in early preview, you need to enable a feature flag and register the Redis cache provider in your medusa-config.ts to use it. The new packages come pre-installed in the latest version of @medusajs/medusa.

// medusa-config.ts
module.exports = defineConfig({
  projectConfig: { ... },
  modules: [
    {
      resolve: "@medusajs/medusa/caching",
      options: {
        providers: [
          {
            id: "caching-redis",
            resolve: "@medusajs/caching-redis",
            options: {
              redisUrl: process.env.REDIS_URL,
            },
          },
        ],
      },
    },
  ],
  featureFlags: {
    caching: true,
  },
});

By default, the cache is integrated across several business-critical APIs to boost performance and throughput. This includes all cart-related operations, where we cache the following data:

  • Regions
  • Promotion codes
  • Variant price sets
  • Variants
  • Shipping options
  • Sales channels
  • Customers

With caching enabled, we’ve observed performance improvements of up to 70% across these operations. Read our blog post for detailed benchmarks across different Medusa versions.

Bear in mind, our new caching tools are in preview, so they should be used with caution.

Bundle core Medusa dependencies

Breaking change

This release restructures the required dependencies of Medusa projects. We are consolidating all non-Medusa dependencies into a new @medusajs/deps package, which is installed internally by @medusajs/medusa. This change gives us control of our core dependencies, enabling us to add new dependencies and perform minor/major upgrades of core dependencies without breaking users' applications.

This is a breaking change, and the following steps are required to use version 2.11.0 and above:

  1. Remove the following packages from the dependencies and devDependencies in your package.json:
"dependencies": {
    "@medusajs/admin-sdk": "2.11.0",
    "@medusajs/cli": "2.11.0",
    "@medusajs/framework": "2.11.0",
    "@medusajs/medusa": "2.11.0",
-    "@mikro-orm/core": "6.4.3",
-    "@mikro-orm/knex": "6.4.3",
-    "@mikro-orm/migrations": "6.4.3",
-    "@mikro-orm/postgresql": "6.4.3",
-    "awilix": "^8.0.1",
-    "pg": "^8.13.0"
  },
"devDependencies": {
    "@medusajs/test-utils": "2.11.0",
-    "@mikro-orm/cli": "6.4.3",
  ...
  2. Install dependencies using your preferred package manager
  3. Run the included codemod (see section below)
  4. Start your application. If you have issues, please open a GitHub Issue and we will help you out as soon as possible

Codemod

As a side effect, this update also requires you to update explicit imports of these dependencies in your own project. To help with this, we have included a codemod that automatically replaces the imports throughout your codebase.

To use the codemod, create a file named replace-imports.js in the root of your Medusa application with the following content:

#!/usr/bin/env node

const fs = require("fs")
const path = require("path")
const { execSync } = require("child_process")

/**
 * Script to replace imports and require statements from mikro-orm/{subpath}, awilix, pg,
 * and the targeted @opentelemetry packages with their @medusajs/framework equivalents
 */

// Define the replacement mappings
const replacements = [
  // MikroORM imports - replace mikro-orm/{subpath} with @medusajs/framework/mikro-orm/{subpath}
  {
    pattern: /from\s+['"]@?mikro-orm\/([^'"]+)['"]/g,
    // eslint-disable-next-line quotes
    replacement: 'from "@medusajs/framework/mikro-orm/$1"',
  },
  // Awilix imports - replace awilix with @medusajs/framework/awilix
  {
    pattern: /from\s+['"]awilix['"]/g,
    // eslint-disable-next-line quotes
    replacement: 'from "@medusajs/framework/awilix"',
  },
  // PG imports - replace pg with @medusajs/framework/pg
  {
    pattern: /from\s+['"]pg['"]/g,
    // eslint-disable-next-line quotes
    replacement: 'from "@medusajs/framework/pg"',
  },
  // OpenTelemetry imports - replace @opentelemetry/instrumentation-pg, @opentelemetry/resources, 
  // @opentelemetry/sdk-node, and @opentelemetry/sdk-trace-node with @medusajs/framework/opentelemetry/{subpath}
  {
    pattern: /from\s+['"]@?opentelemetry\/(instrumentation-pg|resources|sdk-node|sdk-trace-node)['"]/g,
    // eslint-disable-next-line quotes
    replacement: 'from "@medusajs/framework/opentelemetry/$1"',
  },
  // MikroORM require statements - replace require('@?mikro-orm/{subpath}') with require('@medusajs/framework/mikro-orm/{subpath}')
  {
    pattern: /require\s*\(\s*['"]@?mikro-orm\/([^'"]+)['"]\s*\)/g,
    // eslint-disable-next-line quotes
    replacement: 'require("@medusajs/framework/mikro-orm/$1")',
  },
  // Awilix require statements - replace require('awilix') with require('@medusajs/framework/awilix')
  {
    pattern: /require\s*\(\s*['"]awilix['"]\s*\)/g,
    // eslint-disable-next-line quotes
    replacement: 'require("@medusajs/framework/awilix")',
  },
  // PG require statements - replace require('pg') with require('@medusajs/framework/pg')
  {
    pattern: /require\s*\(\s*['"]pg['"]\s*\)/g,
    // eslint-disable-next-line quotes
    replacement: 'require("@medusajs/framework/pg")',
  },
  // OpenTelemetry require statements - replace require('@opentelemetry/instrumentation-pg'), 
  // require('@opentelemetry/resources'), require('@opentelemetry/sdk-node'), and 
  // require('@opentelemetry/sdk-trace-node') with require('@medusajs/framework/opentelemetry/{subpath}')
  {
    pattern: /require\s*\(\s*['"]@?opentelemetry\/(instrumentation-pg|resources|sdk-node|sdk-trace-node)['"]\s*\)/g,
    // eslint-disable-next-line quotes
    replacement: 'require("@medusajs/framework/opentelemetry/$1")',
  },
]

function processFile(filePath) {
  try {
    const content = fs.readFileSync(filePath, "utf8")
    let modifiedContent = content
    let wasModified = false

    replacements.forEach(({ pattern, replacement }) => {
      const newContent = modifiedContent.replace(pattern, replacement)
      if (newContent !== modifiedContent) {
        wasModified = true
        modifiedContent = newContent
      }
    })

    if (wasModified) {
      fs.writeFileSync(filePath, modifiedContent)
      console.log(`āœ“ Updated: ${filePath}`)
      return true
    }

    return false
  } catch (error) {
    console.error(`āœ— Error processing ${filePath}:`, error.message)
    return false
  }
}

function getTargetFiles() {
  try {
    // Get the current script's filename to exclude it from processing
    const currentScript = path.basename(__filename)
    
    // Find TypeScript/JavaScript files, excluding common directories that typically don't contain target imports
    const findCommand = `find . -name node_modules -prune -o -name .git -prune -o -name dist -prune -o -name build -prune -o -name coverage -prune -o -name "*.ts" -print -o -name "*.js" -print -o -name "*.tsx" -print -o -name "*.jsx" -print`
    const files = execSync(findCommand, {
      encoding: "utf8",
      maxBuffer: 50 * 1024 * 1024, // 50MB buffer
    })
      .split("\n")
      .filter((line) => line.trim())

    console.log(files)

    const targetFiles = []
    let processedCount = 0

    console.log(`šŸ“„ Scanning ${files.length} files for target imports and require statements...`)

    for (const file of files) {
      try {
        // Skip the current script file
        const fileName = path.basename(file)
        if (fileName === currentScript) {
          processedCount++
          continue
        }
        const content = fs.readFileSync(file, "utf8")
        if (
          /from\s+['"]@?mikro-orm\//.test(content) ||
          /from\s+['"]awilix['"]/.test(content) ||
          /from\s+['"]pg['"]/.test(content) ||
          /from\s+['"]@?opentelemetry\//.test(content) ||
          /require\s*\(\s*['"]@?mikro-orm\//.test(content) ||
          /require\s*\(\s*['"]awilix['"]/.test(content) ||
          /require\s*\(\s*['"]pg['"]/.test(content) ||
          /require\s*\(\s*['"]@?opentelemetry\//.test(content)
        ) {
          targetFiles.push(file.startsWith("./") ? file.slice(2) : file)
        }
        processedCount++
        if (processedCount % 100 === 0) {
          process.stdout.write(
            `\ršŸ“„ Processed ${processedCount}/${files.length} files...`
          )
        }
      } catch (fileError) {
        // Skip files that can't be read
        continue
      }
    }

    if (processedCount > 0) {
      console.log(`\ršŸ“„ Processed ${processedCount} files.                    `)
    }

    return targetFiles
  } catch (error) {
    console.error("Error finding target files:", error.message)
    return []
  }
}

function main() {
  console.log("šŸ”„ Finding files with target imports and require statements...")

  const targetFiles = getTargetFiles()

  if (targetFiles.length === 0) {
    console.log("ā„¹ļø  No files found with target imports or require statements.")
    return
  }

  console.log(`šŸ“ Found ${targetFiles.length} files to process`)

  let modifiedCount = 0
  let errorCount = 0

  targetFiles.forEach((filePath) => {
    const fullPath = path.resolve(filePath)
    if (fs.existsSync(fullPath)) {
      if (processFile(fullPath)) {
        modifiedCount++
      }
    } else {
      console.warn(`āš ļø  File not found: ${filePath}`)
      errorCount++
    }
  })

  console.log("\nšŸ“Š Summary:")
  console.log(`   Files processed: ${targetFiles.length}`)
  console.log(`   Files modified: ${modifiedCount}`)
  console.log(`   Errors: ${errorCount}`)

  if (modifiedCount > 0) {
    console.log("\nāœ… Import replacement completed successfully!")
    console.log("\nšŸ’” Next steps:")
    console.log("   1. Review the changes with: git diff")
    console.log("   2. Run your tests to ensure everything works correctly")
    console.log("   3. Commit the changes if you're satisfied")
  } else {
    console.log(
      "\nāœ… No modifications needed - all imports are already correct!"
    )
  }
}

// Run if called directly
if (require.main === module) {
  main()
}

module.exports = { processFile, getTargetFiles, main }

This script scans your project for files that import the removed dependencies and replaces those imports with their @medusajs/framework equivalents.

Next, execute the script with the following command:

node replace-imports.js

This will scan your project files, apply the necessary import replacements, and provide a summary of the changes made.
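
For reference, this is the kind of rewrite the codemod applies, derived from the replacement patterns in the script above (the specific imports are just examples):

// Before running the codemod
import { MikroORM } from "@mikro-orm/core"
import { asClass } from "awilix"
import { Pool } from "pg"

// After running the codemod
import { MikroORM } from "@medusajs/framework/mikro-orm/core"
import { asClass } from "@medusajs/framework/awilix"
import { Pool } from "@medusajs/framework/pg"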

If you still have issues with imports after running the codemod, please create an issue on GitHub, so we can help.

Deprecated cache modules

The release of the new caching module deprecates the old cache modules:

  • @medusajs/cache
  • @medusajs/cache-redis
  • @medusajs/cache-inmemory

The deprecation is part of moving our caching tooling to the general provider architecture. These modules will be removed in a future release, so we recommend replacing existing usage as soon as possible.

Manual order refunds

This release adds manual order refunds, making it possible to issue refunds that are not tied to returned items. Previously, refunds were only possible as part of a return or exchange, where returned items made the difference due negative. This version introduces full support for ad-hoc refunds, along with new Refund Reasons.

The key changes are:

  • Ad-hoc refunds: issue refunds on orders regardless of the difference due
  • Refund Reasons UI: create and manage custom refund reasons to categorize refunds for accounting purposes
  • Refund drawer update: the refund drawer on the order details page now includes an option to select a Refund Reason
  • Credit lines: when issuing ad-hoc refunds, Medusa creates corresponding credit lines on the order for accurate financial tracking

Promotion limits per customer

This release adds support for defining promotion usage limits per customer. Campaign budgets can now track usage based on an attribute (e.g., customer_id, email) and enforce limits accordingly.

The key changes are:

  • New campaign budget type USE_BY_ATTRIBUTE added to support per-attribute usage tracking
  • New entity CampaignBudgetUsage added to track usage counts per attribute
  • Extended CampaignBudget entity:
    • Added an attribute field (e.g. customer_id) to define the usage key
    • Added a usages relation to CampaignBudgetUsage
    • limit on campaign budgets now applies per attribute for USE_BY_ATTRIBUTE budgets

Promotion usage is registered in the cart completion flow.

Admin + API support: the admin has been updated to support creating and managing customer attribute budgets.
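
As an illustration of the shape this enables, the sketch below shows a per-customer budget as described above. The field names follow the release notes, but the exact enum values and types in the Promotion module may differ:

// Hypothetical example of a per-attribute campaign budget; exact values and
// types in the Promotion module may differ from this sketch.
const perCustomerBudget = {
  type: "use_by_attribute", // the new USE_BY_ATTRIBUTE budget type
  attribute: "customer_id", // the usage key each redemption is tracked against
  limit: 3, // each customer_id may use promotions in this campaign up to 3 times
}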

New once promotion allocation method

This release adds a new once allocation strategy to promotions that limits application to a maximum number of items across the entire cart, rather than per line item.

Example cases:

  • "Get $10 off, applied to one item only"
  • "20% off up to 2 items in your cart"

The behavior is as follows (a short sketch of the distribution logic appears after this list):

  • Applies the promotion to at most max_quantity items across the entire cart
  • Always prioritizes the lowest-priced eligible items first
  • Distributes sequentially across items until the quota is exhausted
  • Requires the max_quantity field to be set
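
Here is a minimal sketch of the distribution described above. It is illustrative only, not the Promotion module's implementation:

// Illustrative sketch of the "once" distribution; not the Promotion module's code.
type CartItem = { id: string; unit_price: number; quantity: number }

// Apply the promotion to at most maxQuantity units across the whole cart,
// starting with the lowest-priced eligible items.
function allocateOnce(items: CartItem[], maxQuantity: number) {
  const cheapestFirst = [...items].sort((a, b) => a.unit_price - b.unit_price)
  const allocations: { item_id: string; quantity: number }[] = []
  let remaining = maxQuantity

  for (const item of cheapestFirst) {
    if (remaining <= 0) break
    const take = Math.min(item.quantity, remaining)
    allocations.push({ item_id: item.id, quantity: take })
    remaining -= take
  }

  return allocations
}

// "20% off up to 2 items": for items priced 10, 25, and 40 (one unit each),
// allocateOnce(items, 2) targets the 10- and 25-priced items.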

Other noteworthy changes

Updated variants response in GET /store/products

Breaking change

When you fetch variants from the Store Products API, we compute inventory based on the locations associated with the specified sales channel. Until now, we returned inventory_quantity: 0 when those locations had no inventory levels for the variant. Going forward, we return inventory_quantity: null. Read more in the PR.
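
If your storefront does arithmetic or comparisons on this field, a small guard preserves the previous behavior (a minimal sketch):

// Variants may now come back with inventory_quantity: null instead of 0 when the
// sales channel's locations have no inventory levels for the variant.
const isInStock = (variant: { inventory_quantity: number | null }) =>
  (variant.inventory_quantity ?? 0) > 0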

Updated refetch entity utilities

Breaking change

The utilities refetchEntity and refetchEntities have had their method signatures changed to improve usability.

Before

async refetchEntities(entryPoint: string, idOrFilter: string | object, scope: MedusaContainer, fields: string[], pagination?: MedusaRequest["queryConfig"]["pagination"], withDeleted?: boolean);

async refetchEntity(entryPoint: string, idOrFilter: string | object, scope: MedusaContainer, fields: string[]);

Now

async refetchEntities({ entity, idOrFilter, scope, fields, pagination, withDeleted, options })

async refetchEntity({ entity, idOrFilter, scope, fields, options })
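
A call-site migration might look like this, with illustrative arguments:

// Before: positional arguments
await refetchEntity("product", productId, req.scope, ["id", "title"])

// Now: a single object argument
await refetchEntity({
  entity: "product",
  idOrFilter: productId,
  scope: req.scope,
  fields: ["id", "title"],
})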

Updated Locations & Shipping settings page

We have changed the Locations & Shipping page to use a regular datatable to allow browsing more than 20 locations.

Other Changes

  • feat: add metadata to shipping options api endpoints by @bqst in #13554
  • fix(core-flows,types,medusa): pass /store/shipping-options fields to workflow by @leobenzol in #13527
  • fix(dashboard): add offset and limit to query parameters in useTableConfiguration by @docloulou in #13565
  • Revert "fix(types): pluralize settings" by @willbouch in #13573
  • fix(stripe): add StripePromptPayService to Stripe module provider by @techpowerdev in #13572
  • Fixed premature teardown in medusa-test-runner by @trevster344 in #13038
  • Revert "fix(types): pluralize words ending in s like status" by @willbouch in #13574
  • feat(store): add id filtering to store collections endpoint by @bqst in #13555
  • fix(product): Correctly fetch category descendants by handle by @saoudi-h in #13579
  • feat(dashboard): support RTL in dashboard by @MEClouds in #11252
  • fix: Correctly type Float properties by @aldo-roman in #13585
  • fix(dashboard): copy phone on order customer info by @bqst in #13596
  • fix(index): preserve existing fields configuration (#13639) by @martinerko in #13640
  • fix(medusa): plugin:db:generate skip modules with no data models by @leobenzol in #13652
  • chore(core-flows): only allow published products in addToCartWorkflow by @leobenzol in #13182
  • fix(inventory): delete reservation item inventory level required quantity by @NicolasGorga in #13633
  • feat(admin-bundler): forward env vars to plugin admin extensions by @leobenzol in #13634
  • feat(medusa): include user_metadata in auth routes jwt by @NicolasGorga in #13597
  • docs(resources): Improve third party login by @NicolasGorga in #13606
  • docs(): Add local development instructions in CONTRIBUTING.md by @NicolasGorga in #13651
  • chore(): Downgrade mikro orm (performance regression) by @adrien2p in #13680
  • fix(core-flows): fix shipping_total showing as 0 in createFulfillment method by @NicolasGorga in #13704
  • fix(js-sdk): pass headers to auth.refresh() by @NicolasGorga in #13690
  • fix(dashboard): replace native select Element in CountrySelect & ProvinceSelect with Select(Medusa UI). by @patelaryan0914 in #13521
  • feat(medusa): export feature flag configs by @NicolasGorga in #13714
  • feat(medusa,dashboard): Add support for configurable additional columns in entity views by @docloulou in #13566

Full Changelog: v2.10.3...v2.11.0
