Severity: medium · Category: data source

Azure Data Factory Data Flow Error:
DF-Delta-InvalidProtocolVersion

What does this error mean?

The ADF Mapping Data Flow Delta Lake connector cannot read the target Delta table because the table's Delta protocol version requires a reader version higher than ADF currently supports.

Common causes

  • The Delta table was created or last written by Databricks Runtime 12+ or Delta Lake 2.0+, which can set a newer reader version (e.g., reader version 3) that ADF does not yet support
  • An OPTIMIZE, VACUUM, or table feature command run on the Delta table upgraded its protocol to a version ADF cannot read
  • The Delta table uses a newer Delta table feature (e.g., column mapping, deletion vectors) that requires an upgraded reader protocol
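
If you cannot run DESCRIBE DETAIL, the protocol version can also be read straight from the table's transaction log, where each commit is a file of newline-delimited JSON actions. A minimal Python sketch (simplified: it ignores checkpoint files, which a complete Delta reader must also consult):

```python
import json
from pathlib import Path

def min_reader_version(table_path: str):
    """Return the minReaderVersion from the most recent protocol action
    in a Delta table's JSON commit log, or None if no protocol action
    is found. Sketch only: checkpoint files are not consulted."""
    log_dir = Path(table_path) / "_delta_log"
    version = None
    # Commit files sort lexicographically in version order
    # (00000000000000000000.json, 00000000000000000001.json, ...).
    for commit in sorted(log_dir.glob("*.json")):
        for line in commit.read_text().splitlines():
            action = json.loads(line)
            if "protocol" in action:
                version = action["protocol"]["minReaderVersion"]
    return version
```

If this reports 3 or higher, ADF's Delta connector is the likely source of the error.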

How to fix it

  1. Check the Delta table's protocol version by running DESCRIBE DETAIL your_table in Databricks or a compatible SQL engine, and note the minReaderVersion column.
  2. If the reader version exceeds what ADF supports (currently reader versions 1 and 2 for most operations), consider creating a copy of the table without the advanced features that require the higher protocol version.
  3. Alternatively, if you only need read access from ADF, create a view or export the data to Parquet, which ADF can read without protocol version restrictions.
  4. Check the ADF documentation for the current maximum supported Delta Lake reader version before upgrading a table's protocol.
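
The decision in the steps above can be sketched as follows. ADF_MAX_READER_VERSION is an assumption for illustration only; verify the actual ceiling against the ADF documentation, as the last step says:

```python
# Assumed ceiling for illustration -- confirm against the ADF docs.
ADF_MAX_READER_VERSION = 2

def remediation(min_reader_version: int) -> str:
    """Map a table's minReaderVersion to the remediation outlined above."""
    if min_reader_version <= ADF_MAX_READER_VERSION:
        return "no change needed: ADF can read the table"
    return ("copy the table without the advanced features, "
            "or export to Parquet for read-only access")
```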

Frequently asked questions

How do I check which Delta protocol version my table requires?

Run DESCRIBE DETAIL tableName in Databricks or Synapse Spark. If minReaderVersion is 3 or higher, ADF cannot read the table directly.

Does enabling column mapping or deletion vectors on a Delta table cause this error?

Yes — column mapping requires minReaderVersion 2 and deletion vectors require minReaderVersion 3. If ADF doesn't support these versions, it cannot read tables with those features enabled.
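
The version requirements in this answer can be expressed as a small lookup. The keys are the Delta table feature identifiers; the check is a sketch of the compatibility rule, not the connector's actual logic:

```python
# Reader protocol versions required by the features named above:
# column mapping -> 2, deletion vectors -> 3.
REQUIRED_READER_VERSION = {
    "columnMapping": 2,
    "deletionVectors": 3,
}

def readable_by(max_supported: int, features: list[str]) -> bool:
    """True if a reader supporting up to `max_supported` can read a
    table that uses the given features (unknown features assumed 1)."""
    return all(REQUIRED_READER_VERSION.get(f, 1) <= max_supported
               for f in features)
```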

What is the workaround if I cannot downgrade the Delta table protocol?

Export data to Parquet (e.g., via a Databricks notebook) and point the ADF data flow to the Parquet export instead of the Delta table.

Official documentation: https://learn.microsoft.com/en-us/azure/data-factory/data-flow-troubleshoot-guide
