MetricSign
Severity: High · dbt

dbt Error:
Incremental Model Schema Change Error

What does this error mean?

A dbt incremental model failed because the schema of the new data does not match the schema of the existing table in the warehouse. By default, dbt does not automatically add or remove columns on incremental runs.

Common causes

  • A new column was added to the model's SELECT statement, but the existing table does not have that column
  • A column was removed from the model's SELECT statement, but the existing table still has it, so the merge fails
  • A column's data type changed between runs and the warehouse rejects the implicit cast
  • The unique_key used for the incremental merge was changed, causing key conflicts with existing rows
  • An upstream source added a column that the model now exposes, creating a mismatch

How to fix it

  1. Run with `--full-refresh` to drop and rebuild the target table with the current schema: `dbt run --select <model_name> --full-refresh`.
  2. If a full refresh is too expensive, manually alter the table to add the missing column before the next run.
  3. Configure `on_schema_change` in the model config to control the behavior: `ignore` (the default), `fail`, `append_new_columns`, or `sync_all_columns`.
  4. Set `on_schema_change: 'append_new_columns'` in the incremental model config to automatically add new columns without a full refresh.
  5. Review the model's column list and compare it to the current warehouse table schema to identify exactly which columns differ.
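Steps 3 and 4 can be sketched as a dbt model config. The model name, unique key, and columns below are hypothetical; `order_status` stands in for a newly added column:

```sql
-- models/orders_incremental.sql (hypothetical model and columns)
{{
    config(
        materialized='incremental',
        unique_key='order_id',
        on_schema_change='append_new_columns'
    )
}}

select
    order_id,
    customer_id,
    order_total,
    order_status,  -- newly added column: appended to the target table on the next run
    updated_at
from {{ ref('stg_orders') }}

{% if is_incremental() %}
where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

With this config, the next `dbt run` adds `order_status` to the existing table before merging instead of failing; rows already in the table get NULL for the new column until they are reprocessed.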

Frequently asked questions

Is `on_schema_change: 'sync_all_columns'` safe to use?

It adds new columns and removes deleted ones automatically, but removing a column deletes that column's data from the existing table. Use it only when the dropped columns are genuinely no longer needed.
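The trade-off can be shown in a model config sketch (model and column names are hypothetical):

```sql
-- models/orders_incremental.sql (hypothetical)
{{
    config(
        materialized='incremental',
        unique_key='order_id',
        -- adds columns that are new in the SELECT, and drops columns
        -- no longer in it; dropping deletes that column's existing data
        on_schema_change='sync_all_columns'
    )
}}

select order_id, customer_id, order_total
from {{ ref('stg_orders') }}
```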

How do I change a column's data type in an incremental model without losing data?

Run with `--full-refresh` to drop and rebuild the table with the new type; the data survives as long as the model's full history can still be rebuilt from its sources. Avoid altering the column type directly in the warehouse: the table then no longer matches what dbt expects, and the next incremental merge can fail on the cast.

Will `on_schema_change: 'append_new_columns'` backfill existing rows with the new column's value?

No. dbt adds the column, but existing rows get NULL; only subsequent incremental runs populate it. For a historical backfill, run with `--full-refresh` so the whole table is rebuilt with the column populated.

Official documentation: https://docs.getdbt.com/docs/build/incremental-models#what-if-the-columns-of-my-incremental-model-change
