Severity: Medium · Category: execution

Power BI Refresh Error:
DELTA_CONCURRENT_APPEND

What does this error mean?

A Delta Lake write operation — such as INSERT or MERGE — failed because a concurrent append operation added new files to the same table partition or root directory that the current transaction did not account for in its conflict check.
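The mechanism behind this error is optimistic concurrency control: a transaction snapshots the table version it read, does its work, and at commit time checks whether any other writer added files it did not see. The following is a minimal pure-Python sketch of that check, not Delta Lake source code; the class and exception names are illustrative, and the real conflict check also considers which partitions and predicates each transaction touched.

```python
# Sketch of the optimistic-concurrency conflict check behind
# DELTA_CONCURRENT_APPEND. All names here are illustrative, not Delta's.

class ConcurrentAppendError(Exception):
    """Raised when another writer committed files our transaction did not see."""

class MockDeltaLog:
    def __init__(self):
        self.version = 0   # latest committed table version
        self.files = []    # (version, filename) pairs in the commit log

    def start_txn(self):
        # Each transaction records the snapshot version it read.
        return {"read_version": self.version, "new_files": []}

    def commit(self, txn):
        # Conflict check: did anyone append files after our snapshot?
        conflicting = [f for v, f in self.files if v > txn["read_version"]]
        if conflicting:
            raise ConcurrentAppendError(f"files added concurrently: {conflicting}")
        self.version += 1
        self.files += [(self.version, f) for f in txn["new_files"]]
        return self.version

log = MockDeltaLog()
t1 = log.start_txn()                     # both writers read version 0
t2 = log.start_txn()
t1["new_files"].append("part-001.parquet")
t2["new_files"].append("part-002.parquet")
log.commit(t1)                           # first commit wins the race
try:
    log.commit(t2)                       # second commit sees t1's files and fails
except ConcurrentAppendError as e:
    print("conflict:", e)
```

In the real protocol a blind append does not read the table, so two plain appends can both succeed; the conflict arises when an operation that reads before writing, such as MERGE, races with an append that changes the files it read.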

Common causes

  • Multiple concurrent INSERT or streaming micro-batch writes target the same unpartitioned Delta table or the same partition simultaneously
  • A MERGE INTO statement with an INSERT-only path conflicts with a concurrent blind append to the same table
  • Two streaming pipelines writing to the same Delta table trigger simultaneous micro-batch commits

How to fix it

  1. Retry the failed INSERT or MERGE. DELTA_CONCURRENT_APPEND is typically safe to retry, and the retry will succeed once the conflicting append has committed.
  2. Partition the table on a key that separates the concurrent writers (for example, date or region) so that each writer targets distinct partitions, and include that partition column in each job's predicates so appends to other partitions no longer conflict.
  3. For streaming writers, ensure each stream has exclusive ownership of its write path, or use Delta's multi-writer support with an appropriate isolation level.
  4. If the table is small and frequently written, consider enabling Delta auto compaction to reduce the number of small files that create conflict opportunities.
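Step 1 is usually implemented as a retry loop with backoff around the write. The sketch below is a generic pattern, not a Delta API: `ConcurrentAppendError` stands in for the engine's actual exception (in Spark this surfaces as a ConcurrentAppendException), and `write_batch` is a placeholder for your real INSERT or MERGE.

```python
import random
import time

# Illustrative stand-in for the engine's concurrent-append exception.
class ConcurrentAppendError(Exception):
    pass

def write_with_retry(write_batch, max_attempts=5, base_delay=0.5):
    """Rerun the write when a concurrent append wins the commit race."""
    for attempt in range(1, max_attempts + 1):
        try:
            return write_batch()
        except ConcurrentAppendError:
            if attempt == max_attempts:
                raise
            # Exponential backoff with jitter, so retrying writers do not
            # all collide again on the next attempt.
            time.sleep(base_delay * 2 ** (attempt - 1) * random.uniform(0.5, 1.5))

# Demo: fail twice with a simulated conflict, then succeed on attempt 3.
attempts = {"n": 0}
def flaky_write():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConcurrentAppendError("concurrent commit won the race")
    return "committed"

print(write_with_retry(flaky_write, base_delay=0.01))
```

Because a retried append simply re-reads the latest table version and commits on top of it, the retry succeeds as soon as the competing writer's commit is in the log.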

Frequently asked questions

Does this only happen with streaming?

No. DELTA_CONCURRENT_APPEND can occur with any concurrent writers — batch INSERT jobs, streaming micro-batches, or MERGE statements all compete under the same optimistic concurrency protocol.

Will enabling Photon reduce concurrent append conflicts?

No. Photon improves execution performance but does not change Delta Lake's concurrency conflict detection logic.
