Severity: low · Category: configuration

Databricks Cluster Error:
CLUSTER_LOG_DELIVERY_FAILED

What does this error mean?

Databricks could not write cluster driver or executor logs to the configured log destination (DBFS, S3, ADLS, or GCS) because of missing permissions or an unreachable storage path.
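The log destination is set in the cluster specification's cluster_log_conf field. A minimal sketch of what a correctly configured S3 destination looks like in a cluster spec is below; the bucket name, prefix, and region are placeholders for your own values:

```json
{
  "cluster_log_conf": {
    "s3": {
      "destination": "s3://my-log-bucket/cluster-logs",
      "region": "us-west-2"
    }
  }
}
```

For DBFS delivery, the equivalent field would be "dbfs": { "destination": "dbfs:/cluster-logs" } instead of the "s3" block.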

Common causes

  • The cluster IAM role or managed identity lacks write permission on the target storage location
  • The target storage path or container does not exist
  • The log destination URL was specified incorrectly when the cluster was created
  • Network connectivity to the storage endpoint is blocked
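For the first cause on AWS, the instance profile attached to the cluster needs write access to the log prefix. A minimal IAM policy sketch is shown below; the bucket name and prefix are placeholders, and the exact set of actions your setup requires may differ:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetBucketLocation",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::my-log-bucket",
        "arn:aws:s3:::my-log-bucket/cluster-logs/*"
      ]
    }
  ]
}
```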

How to fix it

  1. Open the cluster configuration and confirm the log delivery path (DBFS, S3 prefix, ADLS container, or GCS bucket).
  2. Grant write permissions on the storage location to the cluster instance profile or managed identity.
  3. Create the storage container or bucket if it does not exist.
  4. Verify the path syntax: S3 paths must be s3://bucket/prefix/, ADLS paths must be abfss://container@account.dfs.core.windows.net/path/.
  5. Restart the cluster after fixing permissions and confirm the log directory is populated.
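The path-syntax check in step 4 can be sketched as a small validator. This is a hypothetical helper (not part of any Databricks SDK) that only checks the URI shapes described above:

```python
import re

def validate_log_destination(uri: str) -> bool:
    """Return True if the URI matches one of the expected
    cluster log destination formats (DBFS, S3, ADLS Gen2, GCS)."""
    patterns = [
        r"^dbfs:/\S+$",                                          # DBFS path
        r"^s3://[a-z0-9.\-]+/\S*$",                              # s3://bucket/prefix/
        r"^abfss://[^@]+@[^.]+\.dfs\.core\.windows\.net/\S*$",   # ADLS Gen2
        r"^gs://[a-z0-9.\-_]+/\S*$",                             # GCS bucket
    ]
    return any(re.match(p, uri) for p in patterns)
```

For example, validate_log_destination("s3://my-logs/cluster-logs/") passes, while a malformed "s3:/my-logs" (missing a slash) does not.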

Frequently asked questions

Will log delivery failure cause my cluster jobs to fail?

No — log delivery is asynchronous and a delivery failure does not affect job execution. However, logs may not be available after the cluster terminates.

How long are cluster logs available without a custom log destination?

Without a configured log destination, Databricks retains cluster logs for 30 days on the cluster details page, then they are deleted.
