Severity: Critical · Category: SQL

Power BI Refresh Error:
DELTA_LOG_DAMAGED

What does this error mean?

The Delta transaction log (_delta_log) is corrupted, incomplete, or contains entries that cannot be parsed. Delta cannot reconstruct the table state and refuses to read or write the table.

Common causes

  • A JSON log entry in _delta_log was partially written and truncated due to a job crash or storage failure
  • Files in _delta_log were deleted manually or by an overly aggressive VACUUM call with too short a retention period
  • An object storage replication or sync tool corrupted or truncated a checkpoint file
  • A bug in an older Delta/Spark version produced a malformed log entry
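The first cause above, a partially written commit file, can often be confirmed without Spark: each `_delta_log/*.json` entry is newline-delimited JSON, so a truncated write leaves a final line that no JSON parser accepts. A minimal sketch in plain Python, assuming direct filesystem access to the table's `_delta_log` directory (the helper name is illustrative):

```python
import json
from pathlib import Path

def find_damaged_log_entries(delta_log_dir: str) -> list[str]:
    """Return names of *.json commit files in _delta_log that fail to parse.

    Each commit file is newline-delimited JSON; a partially written file
    typically ends with a truncated line that json.loads rejects.
    """
    damaged = []
    for path in sorted(Path(delta_log_dir).glob("*.json")):
        try:
            for line in path.read_text().splitlines():
                if line.strip():
                    json.loads(line)
        except (json.JSONDecodeError, UnicodeDecodeError):
            damaged.append(path.name)
    return damaged
```

Any file this reports is a candidate for the repair steps below; an empty result points the investigation toward checkpoint files or missing entries instead.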

How to fix it

  1. Do not run VACUUM on the table until the log is repaired — it may delete files needed for recovery.
  2. Attempt to read from the last known good version: `spark.read.format('delta').option('versionAsOf', <n>).load('<path>')`.
  3. Use `DESCRIBE HISTORY <table>` to identify the last clean version before the corruption.
  4. If the checkpoint file is damaged, delete the latest checkpoint file from _delta_log and allow Delta to rebuild from prior JSON entries.
  5. Contact Databricks support with the table path and _delta_log contents if recovery attempts fail.
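Steps 2 and 3 both need a version number, but `DESCRIBE HISTORY` itself can fail when the log is unreadable. In that case the last intact commit can be estimated by scanning the numbered JSON entries upward until one fails to parse — a hedged sketch in plain Python, assuming filesystem access to `_delta_log` and the standard `<20-digit version>.json` naming:

```python
import json
from pathlib import Path

def last_intact_version(delta_log_dir: str):
    """Return the highest commit version whose JSON entry parses cleanly,
    scanning upward and stopping at the first damaged entry.

    Commit files are named <version zero-padded to 20 digits>.json,
    so lexicographic sort order matches version order.
    """
    best = None
    for path in sorted(Path(delta_log_dir).glob("*.json")):
        try:
            for line in path.read_text().splitlines():
                if line.strip():
                    json.loads(line)
        except (json.JSONDecodeError, UnicodeDecodeError):
            break  # everything at or after this version is suspect
        best = int(path.stem)
    return best
```

The result feeds the time-travel read from step 2, e.g. `spark.read.format('delta').option('versionAsOf', last_intact_version(...)).load('<path>')`.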

Frequently asked questions

Can Delta log corruption affect other tables in the same catalog?

No — each Delta table has its own _delta_log. Corruption is isolated to the affected table's path.

How do I prevent this in the future?

Never manually delete files from _delta_log. Always use VACUUM with the default 7-day retention. Monitor for partial-write failures and enable cloud storage versioning on the bucket.
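The VACUUM advice above can also be enforced in code rather than by convention. A minimal sketch of a wrapper that refuses retention periods below Delta's 7-day default of 168 hours; `safe_vacuum` and the `spark` argument are illustrative names, not a Databricks API:

```python
DEFAULT_RETENTION_HOURS = 168  # Delta's 7-day default retention window

def safe_vacuum(spark, table: str, retain_hours: int = DEFAULT_RETENTION_HOURS) -> None:
    """Run VACUUM only when the retention period is at least 7 days.

    Shorter windows can delete data files that concurrent readers,
    in-flight writers, or a future log repair still need.
    """
    if retain_hours < DEFAULT_RETENTION_HOURS:
        raise ValueError(
            f"Refusing VACUUM with {retain_hours}h retention; "
            f"minimum is {DEFAULT_RETENTION_HOURS}h (7 days)"
        )
    spark.sql(f"VACUUM {table} RETAIN {retain_hours} HOURS")
```

Routing all cleanup jobs through a guard like this prevents the "too short a retention period" failure mode listed under common causes.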
