MetricSign
Severity: High | Category: resource

Databricks Workspace Error:
WORKSPACE_STORAGE_QUOTA_EXCEEDED

What does this error mean?

The Databricks workspace has reached its DBFS or workspace storage quota, preventing new files, notebooks, libraries, or Delta tables from being written.

Common causes

  • Accumulated checkpoint files, Delta transaction log history, or MLflow artifacts are filling DBFS
  • Temporary files from previous failed jobs were not cleaned up
  • Library packages installed at the cluster level duplicate storage across many clusters
  • The workspace tier has a lower storage quota than the data volume requires

How to fix it

  1. Use dbutils.fs.ls and dbutils.fs.rm to identify and delete unused files in DBFS.
  2. Run VACUUM on Delta tables to remove data files that are no longer referenced and are older than the retention period.
  3. Clean up MLflow experiment artifacts that are no longer needed.
  4. Move large datasets to external cloud storage (S3, ADLS, GCS) and reference them as external Delta tables.
  5. Contact Databricks support to request a quota increase if the workspace legitimately requires more storage.
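Step 1 can be sketched as follows. The helper below is a plain-Python sketch that sums file sizes per top-level directory so you can see where the quota is going; in a Databricks notebook you would feed it (path, size) pairs built from dbutils.fs.ls output (its FileInfo objects expose path and size attributes). The sample paths are hypothetical.

```python
# Sketch: find which DBFS directories consume the most space.
# In a notebook, build `entries` from dbutils.fs.ls, e.g.:
#   entries = [(f.path, f.size) for f in dbutils.fs.ls("dbfs:/tmp/")]
# The paths below are hypothetical examples.

from collections import defaultdict

def size_by_top_dir(entries):
    """Aggregate (path, size_in_bytes) pairs by top-level directory."""
    totals = defaultdict(int)
    for path, size in entries:
        # Strip the dbfs scheme and take the first path component.
        stripped = path.replace("dbfs:/", "", 1)
        top = stripped.split("/", 1)[0]
        totals[top] += size
    # Largest consumers first.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

entries = [
    ("dbfs:/tmp/job_123/part-0001.parquet", 500_000_000),
    ("dbfs:/tmp/job_123/part-0002.parquet", 400_000_000),
    ("dbfs:/FileStore/report.csv", 1_000_000),
]
print(size_by_top_dir(entries))
# -> [('tmp', 900000000), ('FileStore', 1000000)]
```

Once the largest directories are known, targeted dbutils.fs.rm(path, recurse=True) calls remove them.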

Frequently asked questions

Does VACUUM help reclaim DBFS quota?

Yes — VACUUM removes data files that are no longer referenced by the Delta table's transaction log and are older than the retention threshold, freeing space in the table's storage location. Transaction log entries themselves are cleaned up automatically by Delta's log retention setting, not by VACUUM.
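For example, in a SQL cell (the table name is a hypothetical placeholder; 168 hours is the default 7-day retention):

```sql
-- List the files that would be deleted, without removing anything.
VACUUM my_db.events RETAIN 168 HOURS DRY RUN;

-- Actually delete unreferenced data files older than 7 days.
VACUUM my_db.events RETAIN 168 HOURS;
```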

Is it better to store large datasets in DBFS or external cloud storage?

External cloud storage (S3, ADLS, GCS) is preferred for large datasets — it is cheaper, not subject to workspace quotas, and can be shared across workspaces.
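Moving a dataset out of DBFS can look like the following sketch (the bucket path and table name are hypothetical placeholders):

```sql
-- Create a Delta table whose files live in external cloud storage,
-- so the data is not stored in (or counted against) workspace DBFS.
CREATE TABLE my_db.events_external
USING DELTA
LOCATION 's3://my-bucket/delta/events';
```

Because the table is defined by its LOCATION, other workspaces with access to the same bucket can reference the same data.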
