Severity: Critical (execution error)

Databricks Pipeline Error:
DELTA_LIVE_TABLES_PIPELINE_FAILED

What does this error mean?

A Delta Live Tables pipeline stopped with a terminal failure, meaning one or more datasets could not be computed and the pipeline must be restarted manually.

Common causes

  • An upstream dataset threw an unhandled exception during an update cycle
  • A DLT expectation set to 'fail' mode triggered on too many rows
  • The pipeline compute cluster ran out of memory processing a large table
  • A Python or SQL syntax error was introduced in a new pipeline version
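The second cause above, a 'fail' mode expectation, aborts the update as soon as a violating row appears. The behavior can be sketched in plain Python; in a real pipeline this role is played by the `@dlt.expect_or_fail` decorator, and the function, rule name, and sample rows here are illustrative only:

```python
# Simplified stand-in for a DLT fail-mode expectation: validate each row
# against a predicate and abort the whole update on the first violation.

class ExpectationFailedError(Exception):
    """Raised when a fail-mode expectation is violated, stopping the update."""

def expect_or_fail(rows, name, predicate):
    """Return rows unchanged, but abort on the first violating row."""
    for row in rows:
        if not predicate(row):
            raise ExpectationFailedError(
                f"Expectation '{name}' violated by row: {row!r}"
            )
    return rows

# Example: every record must carry a non-null id.
good = [{"id": 1}, {"id": 2}]
bad = [{"id": 1}, {"id": None}]

expect_or_fail(good, "valid_id", lambda r: r["id"] is not None)  # passes
try:
    expect_or_fail(bad, "valid_id", lambda r: r["id"] is not None)
except ExpectationFailedError as exc:
    print(exc)
```

Expectations in 'warn' or 'drop' mode record or discard bad rows instead of raising, which is why only 'fail' mode produces this terminal error.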

How to fix it

  1. Open the DLT pipeline in the Databricks UI and click the failed update to view dataset-level error details.
  2. Check the event log for the root dataset that failed first; downstream datasets inherit the failure.
  3. Fix the underlying code or data issue, then click 'Start' to run a new update (use a full refresh if tables must be recomputed from scratch).
  4. If the failure was memory-related, increase the cluster worker count or move to a larger instance type under Pipeline Settings.
  5. Add expectation monitoring to surface data quality issues before they escalate into terminal failures.
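The restart in step 3 can also be triggered programmatically via the Databricks REST API (`POST /api/2.0/pipelines/{pipeline_id}/updates`). A minimal sketch, assuming you supply your own workspace host, personal access token, and pipeline ID:

```python
# Sketch: restart a failed DLT pipeline through the Databricks REST API.
# Host, token, and pipeline_id below are placeholders, not real values.
import json
from urllib import request

def build_start_update_request(host, pipeline_id, full_refresh=False):
    """Build the URL and JSON body that trigger a new pipeline update."""
    url = f"{host}/api/2.0/pipelines/{pipeline_id}/updates"
    body = json.dumps({"full_refresh": full_refresh})
    return url, body

def start_update(host, token, pipeline_id, full_refresh=False):
    """Issue the POST request. Requires real credentials and network access."""
    url, body = build_start_update_request(host, pipeline_id, full_refresh)
    req = request.Request(
        url,
        data=body.encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

# Inspect the request that would be sent, without calling the API:
url, body = build_start_update_request(
    "https://example.cloud.databricks.com", "1234", full_refresh=True
)
print(url)
print(body)
```

Setting `full_refresh` to true recomputes all tables from scratch rather than processing only new data, which is the safer choice after fixing corrupted source data.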

Frequently asked questions

Will DLT retry automatically after a terminal failure?

No. Terminal failures require a manual restart from the Databricks pipeline UI or API. Continuous pipelines stop; triggered pipelines remain in a failed state.

How do I find which dataset caused the pipeline failure?

In the DLT pipeline event log, filter for 'FAILED' events. The first dataset to fail is the root cause; all others fail because their upstream dependency failed.
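The "earliest FAILED event wins" rule above is easy to apply mechanically once the event log is exported. A sketch in plain Python; the flat event shape used here (timestamp, dataset, state) is a simplified stand-in for the real event log schema, which nests these fields under `details`:

```python
# Sketch: identify the root-cause dataset from exported DLT event log entries
# by picking the earliest dataset that reached a FAILED state.

def root_cause_dataset(events):
    """Return the dataset of the earliest FAILED event, or None if none failed."""
    failed = [e for e in events if e["state"] == "FAILED"]
    if not failed:
        return None
    # ISO-8601 timestamps of identical format sort chronologically as strings.
    return min(failed, key=lambda e: e["timestamp"])["dataset"]

events = [
    {"timestamp": "2024-05-01T10:00:03Z", "dataset": "silver_orders", "state": "FAILED"},
    {"timestamp": "2024-05-01T10:00:01Z", "dataset": "bronze_orders", "state": "FAILED"},
    {"timestamp": "2024-05-01T10:00:00Z", "dataset": "bronze_customers", "state": "COMPLETED"},
]
print(root_cause_dataset(events))  # bronze_orders failed first: the root cause
```

Here `silver_orders` also shows FAILED, but only because its upstream dependency `bronze_orders` had already failed; fixing the earliest failure typically clears the downstream ones.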
