Severity: High | Category: Fabric

Power BI Refresh Error:
No Delta Transaction Log Entries Were Found

What does this error mean?

Fabric Lakehouse cannot read a directory as a Delta table because the _delta_log folder is missing or empty. This usually means the directory contains raw Parquet files that were copied in directly rather than written through the Delta protocol, or that the Delta table was never properly initialized.
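To see why the error fires: a directory only counts as a Delta table if its _delta_log folder contains at least one commit file. A minimal sketch of that check in plain Python (the helper name is illustrative, not part of any Fabric API):

```python
from pathlib import Path

def looks_like_delta_table(table_dir: str) -> bool:
    """Return True if the directory has a non-empty _delta_log folder.

    This mirrors the precondition whose failure produces the
    'no Delta transaction log entries were found' error.
    """
    log_dir = Path(table_dir) / "_delta_log"
    # Delta commits are JSON files named like 00000000000000000000.json;
    # a folder of raw Parquet files copied in directly has none.
    return log_dir.is_dir() and any(log_dir.glob("*.json"))
```

A folder full of .parquet files but with no _delta_log fails this check, which is exactly the situation the causes below produce.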

Common causes

  • Parquet files were copied directly to the table path (e.g., via azcopy or a data pipeline) without being converted to Delta format
  • Querying a folder created by a Spark write before the Delta format was enabled (plain Parquet output)
  • The _delta_log directory was accidentally deleted
  • Using spark.read.parquet() and registering the result as a table without converting it to Delta

How to fix it

  1. Convert the existing Parquet files to Delta format in place: CONVERT TO DELTA parquet.`<abfss-path-to-folder>`
  2. Alternatively, rewrite the data as a new Delta table: spark.read.parquet('<path>').write.format('delta').save('<new-path>')
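In a Fabric notebook, the two fixes above can be sketched roughly as follows. This assumes `spark` is the preconfigured SparkSession with Delta Lake support that Fabric notebooks provide; the paths remain placeholders:

```python
# Option 1: convert the existing Parquet folder to Delta in place.
# CONVERT TO DELTA generates the missing _delta_log from the
# Parquet files already present at the path.
spark.sql("CONVERT TO DELTA parquet.`<abfss-path-to-folder>`")

# Option 2: read the raw Parquet files and rewrite them as a
# Delta table at a new location.
df = spark.read.parquet("<path>")
df.write.format("delta").mode("overwrite").save("<new-path>")
```

Option 1 avoids rewriting the data files, so it is usually faster for large tables; Option 2 is useful when you also want to repartition or clean the data while converting.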

Beyond the docs

Common practitioner solutions not covered in the official documentation.

  1. Verify that the _delta_log directory exists in the table folder using the Lakehouse Explorer or an ABFSS ls command
  2. If the Delta log was accidentally deleted, restore it from the Recycle Bin in the Fabric portal, if available
  3. When copying data into a Lakehouse, always write via the Delta API or a pipeline Copy Data activity; avoid copying files directly into the Tables area

Official documentation: https://learn.microsoft.com/en-us/fabric/data-engineering/troubleshoot-lakehouse
