High severity
Power BI Refresh Error: DELTA_SCHEMA_CHANGED
What does this error mean?
The schema of the Delta table changed between the time the write transaction started and the time it attempted to commit. Delta uses optimistic concurrency control: the writer validates at commit time that the schema it started from is still current, and aborts the transaction if the data it staged no longer matches the committed schema.
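The commit-time check can be pictured with a toy model: a writer snapshots the schema when it begins, and the commit aborts if the table's schema has moved in the meantime. A minimal pure-Python sketch; the class and error names are illustrative, not Delta's internals:

```python
class SchemaChangedError(Exception):
    """Raised when the table schema changed between txn start and commit."""

class DeltaTableSim:
    # Toy stand-in for a Delta table: tracks the committed schema and rows.
    def __init__(self, schema):
        self.schema = dict(schema)  # column name -> type
        self.rows = []

    def begin_write(self, rows):
        # Snapshot the schema the writer believes it is writing against.
        return {"rows": rows, "schema_at_start": dict(self.schema)}

    def commit(self, txn):
        # Commit-time validation: abort if the schema moved under us.
        if txn["schema_at_start"] != self.schema:
            raise SchemaChangedError(
                "DELTA_SCHEMA_CHANGED: schema changed during write")
        self.rows.extend(txn["rows"])

table = DeltaTableSim({"id": "int", "name": "string"})
txn = table.begin_write([{"id": 1, "name": "a"}])
# A concurrent writer adds a column before our commit:
table.schema["email"] = "string"
try:
    table.commit(txn)
except SchemaChangedError as e:
    print(e)  # the transaction is aborted, nothing is written
```

A retry that calls `begin_write` again picks up the new schema snapshot, which is why a simple retry usually resolves the error.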
Common causes
1. A column was added or removed from the table between job start and commit
2. Another job ran with `mergeSchema=true` and added columns, changing the committed schema
3. A dbt model ran `CREATE OR REPLACE TABLE` with a modified column list during the write window
4. Upstream Auto Loader schema inference detected new fields and updated the table schema mid-stream
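Cause 2 is the most common: a write with `mergeSchema=true` commits the union of the existing schema and the incoming one, so any writer that started under the old schema fails at its own commit. A toy sketch of that schema union in pure Python (the function name is illustrative; real Delta schema merging also handles nested fields and type widening):

```python
def merge_schema(existing, incoming):
    """Union of two schemas (column -> type), mimicking mergeSchema=true.
    Existing column types win; new columns from the incoming data are appended."""
    merged = dict(existing)
    for col, typ in incoming.items():
        if col not in merged:
            merged[col] = typ
        elif merged[col] != typ:
            # Incompatible types still fail even with schema merging enabled.
            raise TypeError(
                f"incompatible type for column {col!r}: {merged[col]} vs {typ}")
    return merged

old = {"id": "int", "name": "string"}
new = merge_schema(old, {"id": "int", "email": "string"})
print(new)  # {'id': 'int', 'name': 'string', 'email': 'string'}
```

Any job that snapshotted `old` before this commit will see a schema mismatch when it tries to commit.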
How to fix it
1. Retry the failed job. On retry it reads the current table schema and the write proceeds against the new snapshot.
2. Enable `spark.databricks.delta.schema.autoMerge.enabled = true` on the cluster if controlled schema evolution is acceptable.
3. Coordinate schema changes across jobs: use a maintenance window or job dependencies so DDL and writes do not overlap.
4. For streaming jobs, set `mergeSchema` explicitly in the `writeStream` options rather than relying on cluster-level defaults.
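Since the error is transient, the retry from fix 1 can be automated. A minimal retry wrapper with exponential backoff, assuming the job raises an exception whose message contains `DELTA_SCHEMA_CHANGED`; the helper is an illustrative sketch, not a Databricks API:

```python
import time

def retry_on_schema_change(job, max_attempts=3, base_delay=1.0):
    """Re-run `job` when it fails with DELTA_SCHEMA_CHANGED.
    Each retry re-resolves the table's current schema, so a transient
    concurrent schema change usually succeeds on the next attempt."""
    for attempt in range(1, max_attempts + 1):
        try:
            return job()
        except Exception as e:
            # Re-raise immediately for other errors or on the final attempt.
            if "DELTA_SCHEMA_CHANGED" not in str(e) or attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))  # exponential backoff

# Usage: a job that fails once with the schema-change error, then succeeds.
calls = {"n": 0}
def flaky_job():
    calls["n"] += 1
    if calls["n"] == 1:
        raise RuntimeError("[DELTA_SCHEMA_CHANGED] schema changed during write")
    return "ok"

print(retry_on_schema_change(flaky_job, base_delay=0.1))  # ok
```

Matching on the error message is a pragmatic assumption; if your client surfaces a typed exception or error class, match on that instead.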