High severity · Capacity
Power BI Refresh Error:
Uncompressed data limits for refresh
What does this error mean?
The semantic model's uncompressed data volume exceeds the Power BI service limit during refresh processing. On shared capacity, the limit is 10 GB of uncompressed data, even though the compressed model on disk may be well under 1 GB.
Common causes
- Importing high-cardinality columns (e.g., long text, GUIDs, or free-form strings) that compress poorly and expand significantly in memory during refresh
- Importing more data than necessary — no query folding, missing filters, or overly wide tables with many unused columns
- Semantic model size approaching or exceeding the 1 GB compressed limit for shared capacity, indicating the uncompressed data is far larger
- Running large refreshes on Power BI shared capacity (Pro) rather than Premium or Embedded capacity, which have higher uncompressed data allowances
How to fix it
1. Audit your data model in Power BI Desktop using the Model view — identify and remove unused columns, tables, and measures to reduce unnecessary data import.
2. Apply aggressive query filtering in Power Query to import only the rows and columns required for reporting — push WHERE clauses and column selection to the source query for maximum query folding.
3. Replace high-cardinality text columns with integer keys where possible — integer columns compress far more efficiently than string columns in the VertiPaq engine.
4. Implement incremental refresh to partition the data and process smaller chunks during each refresh cycle, reducing the peak uncompressed data size in memory.
5. If the dataset legitimately requires more than 10 GB of uncompressed data, upgrade to Power BI Premium Per User (PPU) or Premium capacity, which supports larger semantic models and higher uncompressed data limits.
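Steps 2 and 4 above can be sketched in a single Power Query (M) source query. This is a minimal illustration, not your actual query: the server, database, table, and column names (FactSales, OrderDate, and so on) are hypothetical, and it assumes you have already defined the DateTime parameters RangeStart and RangeEnd that Power BI's incremental refresh feature requires.

```powerquery-m
// Hypothetical source query: import only the columns and rows needed,
// written so both filters fold back to the SQL source.
let
    Source = Sql.Database("myserver.example.com", "SalesDb"),
    FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],

    // Column selection folds to a narrower SELECT list at the source,
    // trimming wide tables before any data reaches the service.
    SelectedColumns = Table.SelectColumns(
        FactSales,
        {"OrderDate", "CustomerKey", "ProductKey", "SalesAmount"}
    ),

    // Filtering on the RangeStart/RangeEnd parameters folds to a WHERE
    // clause; during incremental refresh, Power BI substitutes each
    // partition's boundaries so only that slice is processed in memory.
    FilteredRows = Table.SelectRows(
        SelectedColumns,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    FilteredRows
```

Verify folding by right-clicking the final step and checking that View Native Query is available; if it is greyed out, a step upstream has broken folding and the full table may still be read during refresh.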