Severity: Medium · Category: job

Databricks Job Error:
SUCCESS_WITH_FAILURES

What does this error mean?

A Databricks multi-task job completed, but one or more tasks failed. The overall job run is considered partially successful because the failed tasks were not on the critical path or were configured with 'continue on failure' behavior.

Common causes

  • One or more tasks in a multi-task job were configured to allow failure without blocking downstream tasks
  • A non-critical validation or notification task failed while the core data processing tasks succeeded
  • A task exhausted its configured retries without ever succeeding, while the rest of the job completed
  • A job was designed with failure-tolerant branches and some branches failed
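A failure-tolerant dependency of this kind can be expressed with the `run_if` field on a task in a Databricks Jobs API 2.1 settings payload. The sketch below is illustrative (the job and task names are hypothetical): the notification task runs even when validation fails, which is exactly the setup that produces SUCCESS_WITH_FAILURES.

```python
# Sketch of a Jobs API 2.1 job settings payload. Job and task names
# are hypothetical; 'run_if' is the real per-task dependency condition.
job_settings = {
    "name": "nightly_etl",  # hypothetical job name
    "tasks": [
        {"task_key": "ingest"},
        {
            "task_key": "validate",
            "depends_on": [{"task_key": "ingest"}],
        },
        {
            "task_key": "notify",
            "depends_on": [{"task_key": "validate"}],
            # Run whether or not 'validate' succeeded, so a validate
            # failure does not block this task (or fail the run):
            "run_if": "ALL_DONE",
        },
    ],
}
```

With this configuration, a failed `validate` task leaves the run in SUCCESS_WITH_FAILURES rather than FAILED, because nothing downstream was blocked.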

How to fix it

  1. Open the run detail in the Databricks UI and identify which specific tasks failed
  2. Review the failed task logs to understand the root cause
  3. Determine whether the failed task's output is required for downstream correctness
  4. If the task failure is significant, update the job dependency config to make it blocking
  5. Fix the underlying error in the failed task and re-run if needed
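Steps 1 and 2 can also be scripted against the Jobs API 2.1 `runs/get` endpoint. The helper below extracts the failed task keys from a run payload; the sample payload is illustrative and truncated to the fields the helper reads.

```python
def failed_tasks(run: dict) -> list[str]:
    """Return the task_keys in a Jobs API 2.1 runs/get response
    whose result_state is anything other than SUCCESS."""
    return [
        t["task_key"]
        for t in run.get("tasks", [])
        if t.get("state", {}).get("result_state") != "SUCCESS"
    ]

# Illustrative runs/get payload (truncated to the fields used here):
sample_run = {
    "state": {"result_state": "SUCCESS_WITH_FAILURES"},
    "tasks": [
        {"task_key": "ingest", "state": {"result_state": "SUCCESS"}},
        {"task_key": "validate", "state": {"result_state": "FAILED"}},
    ],
}

print(failed_tasks(sample_run))  # → ['validate']
```

From there, the logs for each failed task are available on the run detail page, or via the task-level run output endpoints.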

Frequently asked questions

Why does the job show as 'succeeded' if tasks failed?

Databricks reports the overall job state as SUCCESS_WITH_FAILURES when failed tasks were not configured as blocking. The job graph completed, but not all tasks succeeded.
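The state logic can be mimicked in a few lines. This is an illustrative model of the semantics described above, not Databricks' actual implementation: a failure in a blocking task fails the run, a failure in a non-blocking task only downgrades it.

```python
def overall_state(task_results: dict[str, str], blocking: set[str]) -> str:
    """Illustrative model of the overall run state: task_results maps
    task_key -> result_state; 'blocking' names the tasks whose failure
    should fail the whole run."""
    failed = {k for k, v in task_results.items() if v != "SUCCESS"}
    if failed & blocking:
        return "FAILED"
    return "SUCCESS_WITH_FAILURES" if failed else "SUCCESS"

print(overall_state({"ingest": "SUCCESS", "notify": "FAILED"}, {"ingest"}))
# → SUCCESS_WITH_FAILURES
```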

How do I make a task failure block the overall job?

In the job configuration, set each downstream task's "Run if dependencies" condition to "All succeeded" (run_if: ALL_SUCCESS in the Jobs API). If all remaining tasks depend on the failing task under that condition, the overall job will fail instead of reporting SUCCESS_WITH_FAILURES.
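As a sketch, the change can be applied to a job settings payload like so. The helper and the sample settings are hypothetical; the `run_if`/`depends_on` fields follow the Jobs API 2.1 shape.

```python
def make_blocking(job_settings: dict, task_key: str) -> dict:
    """Set run_if to ALL_SUCCESS on every task that depends on
    task_key, so a failure of task_key blocks them and fails the run."""
    for task in job_settings.get("tasks", []):
        deps = {d["task_key"] for d in task.get("depends_on", [])}
        if task_key in deps:
            task["run_if"] = "ALL_SUCCESS"
    return job_settings

# Hypothetical job where 'notify' currently tolerates a 'validate' failure:
settings = {
    "tasks": [
        {"task_key": "validate"},
        {
            "task_key": "notify",
            "depends_on": [{"task_key": "validate"}],
            "run_if": "ALL_DONE",
        },
    ],
}

make_blocking(settings, "validate")
print(settings["tasks"][1]["run_if"])  # → ALL_SUCCESS
```

The updated settings would then be pushed back with the Jobs `update` or `reset` endpoint.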
