
Power BI Refresh Error:
Resource governing command memory limit in Premium

What does this error mean?

A single semantic model operation via the XMLA endpoint has exceeded the per-operation memory limit enforced by Power BI Premium resource governance. The limit is determined by your capacity SKU (e.g., P1 has a lower effective memory limit than P3).

Common causes

  • Running a large DAX query or processing operation against a semantic model that exceeds the SKU's per-command memory ceiling
  • Semantic model size on disk already near or above the effective memory limit before the command executes (as seen in error details: 'database size before command execution 26000 MB, memory limit 25600 MB')
  • Executing bulk XMLA write operations (e.g., partition refresh, table processing) on an undersized Premium SKU
  • Multiple large concurrent operations competing for the same capacity memory pool
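The quickest diagnostic is to pull the two numbers out of the error text itself. A minimal sketch, assuming the error message follows the wording quoted above (the exact phrasing may vary by Power BI version):

```python
import re

def parse_memory_error(message: str):
    """Extract the memory limit and the pre-command database size (both in MB)
    from a Premium resource-governance error message.

    Returns (limit_mb, size_mb); either value is None if not found.
    """
    limit = re.search(r"memory limit (\d+) MB", message)
    size = re.search(r"database size before command execution (\d+) MB", message)
    return (
        int(limit.group(1)) if limit else None,
        int(size.group(1)) if size else None,
    )

# Example message modeled on the error details shown above
msg = ("Resource governing: database size before command execution 26000 MB, "
       "memory limit 25600 MB")
limit_mb, size_mb = parse_memory_error(msg)
print(limit_mb, size_mb)   # 25600 26000
print(size_mb > limit_mb)  # True: the model cannot fit under the ceiling
```

If the database size already exceeds the limit before the command runs, no retry will succeed; the fixes below target the model size or the ceiling itself.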

How to fix it

  1. Check the error details for 'memory limit' and 'database size before command execution' values to confirm the model size exceeds your SKU's ceiling; if so, either upgrade the capacity SKU or reduce the model size.
  2. Reduce the semantic model's memory footprint by limiting imported data: apply incremental refresh policies, remove unused columns/tables, or aggregate data before import.
  3. Break large XMLA processing operations into smaller batches (e.g., refresh individual partitions instead of the full table) to stay within per-command memory limits.
  4. Schedule heavy XMLA operations during off-peak hours to avoid contention with other workloads consuming capacity memory.
  5. If the model is legitimately large and batching is not feasible, consider migrating the semantic model to a higher Premium SKU (e.g., P2 or P3) or a Fabric capacity with a larger memory allocation.
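The batching step can be done by issuing one TMSL refresh command per partition rather than a single table-wide refresh. A sketch that only constructs the command bodies (the database, table, and partition names are hypothetical; sending the commands over the XMLA endpoint, e.g. via SQL Server Management Studio or a scripting tool, is left to the caller):

```python
def partition_refresh_commands(database, table, partitions):
    """Build one TMSL 'refresh' command per partition so that each XMLA
    operation processes a small slice of the table instead of the whole
    table at once, keeping every command under the per-operation memory
    ceiling."""
    return [
        {
            "refresh": {
                "type": "dataOnly",
                "objects": [
                    {"database": database, "table": table, "partition": p}
                ],
            }
        }
        for p in partitions
    ]

# Hypothetical model: refresh two quarterly partitions as separate commands
cmds = partition_refresh_commands("Sales", "FactSales", ["2023Q4", "2024Q1"])
print(len(cmds))  # 2
```

Executing the commands sequentially (rather than in one batch) also avoids stacking their memory cost on top of each other within the same operation.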

Frequently asked questions

How do I find the exact memory limit for my Premium SKU?

The effective per-operation memory limit is SKU-dependent — for example, P1 is approximately 25 GB. Microsoft publishes the memory per node for each SKU in the Power BI Premium capacity documentation. Check the error message itself, as it typically states 'memory limit X MB' explicitly.
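As a rough planning aid, the published per-SKU figures can be encoded in a small lookup. The values below reflect Microsoft's documented memory-per-node figures at the time of writing and should be confirmed against the current capacity documentation before sizing decisions:

```python
# Approximate per-model memory limits for Premium SKUs, in GB
# (assumed from Microsoft's published capacity figures; verify
# against the current documentation before relying on them).
SKU_MEMORY_GB = {"P1": 25, "P2": 50, "P3": 100, "P4": 200, "P5": 400}

def fits_in_sku(model_size_gb: float, sku: str) -> bool:
    """Rough check: can a model of this size load under the SKU's ceiling?"""
    return model_size_gb <= SKU_MEMORY_GB[sku]

print(fits_in_sku(26, "P1"))  # False: a ~26 GB model exceeds P1's ~25 GB limit
print(fits_in_sku(26, "P2"))  # True
```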

Will retrying the operation automatically succeed after this error?

Not unless the underlying cause changes (e.g., other models are evicted from memory). Simply retrying against the same oversized model will produce the same error. You must either reduce the model size, batch the operation, or increase capacity before retrying.
