Severity: Critical
Power BI Refresh Error: DELTA_LOG_DAMAGED
What does this error mean?
The Delta transaction log (`_delta_log`) is corrupted, incomplete, or contains entries that cannot be parsed. Delta cannot reconstruct the table state and refuses to read or write the table.
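For context, Delta reconstructs table state by replaying the newline-delimited JSON commit files in `_delta_log` in version order. A minimal sketch (file names and contents below are synthetic, and `replay_log` is an illustrative helper, not a Delta API) shows why a single unparseable entry blocks the whole read:

```python
import json
import pathlib
import tempfile

def replay_log(log_dir):
    """Replay _delta_log commit files in version order.

    Raises on the first entry that cannot be parsed -- Delta behaves
    analogously: one bad entry and the table state cannot be rebuilt.
    """
    state = []  # accumulated actions (add/remove files, metadata, ...)
    for commit in sorted(pathlib.Path(log_dir).glob("*.json")):
        for line in commit.read_text().splitlines():
            if line.strip():
                state.append(json.loads(line))  # raises on a truncated line
    return state

# Synthetic log: two good commits, then one truncated mid-write.
log = pathlib.Path(tempfile.mkdtemp()) / "_delta_log"
log.mkdir()
(log / "00000000000000000000.json").write_text('{"metaData": {"id": "t1"}}\n')
(log / "00000000000000000001.json").write_text('{"add": {"path": "part-0.parquet"}}\n')
(log / "00000000000000000002.json").write_text('{"add": {"path": "part-')  # truncated

try:
    replay_log(log)
except json.JSONDecodeError as e:
    print(f"log replay failed: {e}")
```

The two intact commits replay fine on their own; the truncated third file is what turns the whole table unreadable.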
Common causes
1. A JSON log entry in `_delta_log` was partially written and truncated by a job crash or storage failure
2. Files in `_delta_log` were deleted manually or by an overly aggressive VACUUM call with too short a retention period
3. An object storage replication or sync tool corrupted or truncated a checkpoint file
4. A bug in an older Delta/Spark version produced a malformed log entry
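A quick way to check for the first cause (a truncated JSON entry) is to try parsing every line of every commit file yourself, against a local copy of `_delta_log`. A minimal stdlib sketch (the `scan_log` helper is hypothetical, not part of Delta):

```python
import json
import pathlib

def scan_log(log_dir):
    """Report _delta_log commit files containing unparseable JSON lines.

    Returns a list of (file name, line number, error message) tuples.
    An empty list means every entry parsed cleanly.
    """
    bad = []
    for commit in sorted(pathlib.Path(log_dir).glob("*.json")):
        for lineno, line in enumerate(commit.read_text().splitlines(), start=1):
            if not line.strip():
                continue  # blank trailing lines are harmless
            try:
                json.loads(line)
            except json.JSONDecodeError as e:
                bad.append((commit.name, lineno, str(e)))
    return bad
```

Any flagged file pinpoints where the corruption sits, which helps decide between the recovery steps below.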
How to fix it
1. Do not run VACUUM on the table until the log is repaired; it may delete files needed for recovery.
2. Attempt to read the table at the last known good version: `spark.read.format('delta').option('versionAsOf', <n>).load('<path>')`.
3. Use `DESCRIBE HISTORY <table>` to identify the last clean version before the corruption.
4. If the checkpoint file is damaged, delete the latest checkpoint file from `_delta_log` and allow Delta to rebuild state from the prior JSON entries.
5. If recovery attempts fail, contact Databricks support with the table path and the contents of `_delta_log`.
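To know which version to pass to `versionAsOf` and which checkpoint files step 4 refers to, you can inspect a local copy of the log directory directly. A hedged stdlib sketch (both helper names are hypothetical; it assumes the standard Delta layout where commit files are named by zero-padded version and `_last_checkpoint` is a JSON pointer to the newest checkpoint):

```python
import json
import pathlib

def last_good_version(log_dir):
    """Highest commit version whose JSON entries all parse cleanly."""
    last = None
    for commit in sorted(pathlib.Path(log_dir).glob("*.json")):
        try:
            for line in commit.read_text().splitlines():
                if line.strip():
                    json.loads(line)
        except json.JSONDecodeError:
            break  # stop at the first corrupted commit
        last = int(commit.stem)  # file name is the zero-padded version
    return last

def latest_checkpoint_files(log_dir):
    """Checkpoint parquet files for the version named in _last_checkpoint.

    These are the files step 4 suggests removing if they are damaged.
    """
    log = pathlib.Path(log_dir)
    pointer = log / "_last_checkpoint"
    if not pointer.exists():
        return []
    version = json.loads(pointer.read_text())["version"]
    return sorted(log.glob(f"{version:020d}*.checkpoint*.parquet"))
```

Run it against a downloaded copy of `_delta_log`, not in place; only delete checkpoint files from the live log once you have a backup.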