Medium severity · data quality

Power BI Refresh Error:
DF-SQLDW-DataTruncation

What does this error mean?

A source value exceeds the target Synapse Analytics column's defined length or precision and cannot be written without truncation.
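To see what the target actually allows, you can query the metadata views in the Synapse dedicated SQL pool. A minimal sketch, assuming a target table named dbo.FactSales (schema, table, and column names here are placeholders for your own):

```sql
-- Inspect the declared widths and precision of the target columns (placeholder names).
SELECT COLUMN_NAME,
       DATA_TYPE,
       CHARACTER_MAXIMUM_LENGTH,  -- NULL for non-string types; -1 means (MAX)
       NUMERIC_PRECISION,
       NUMERIC_SCALE
FROM   INFORMATION_SCHEMA.COLUMNS
WHERE  TABLE_SCHEMA = 'dbo'
  AND  TABLE_NAME   = 'FactSales';
```

Any source value longer than CHARACTER_MAXIMUM_LENGTH, or with more digits than NUMERIC_PRECISION/NUMERIC_SCALE permit, will trigger this error at the sink.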

Common causes

  • A string column in the source contains values longer than the VARCHAR(n) or NVARCHAR(n) length defined in the Synapse target table
  • A numeric value has more integer or decimal digits than the DECIMAL(p,s) target column allows
  • The Synapse target table was created with narrower column types than the actual data in the source
  • Data values have grown over time (e.g., longer descriptions or codes) and now exceed the original column size
  • A derived column expression produces longer strings than expected — for example, string concatenations or formatted dates
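When the source is itself a SQL database, the first cause is quick to confirm by comparing the longest actual value against the declared target width. A sketch, assuming an illustrative source table dbo.SourceProducts with a suspect ProductDescription column:

```sql
-- Find the longest actual value in the suspect source column (illustrative names).
SELECT MAX(LEN(ProductDescription)) AS MaxActualLength
FROM   dbo.SourceProducts;
-- Compare the result against CHARACTER_MAXIMUM_LENGTH of the matching Synapse target column;
-- if MaxActualLength is larger, truncation is guaranteed for at least one row.
```

For non-SQL sources, the equivalent check is the max(length(columnName)) aggregate in a data flow debug preview, as described in the fix steps below.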

How to fix it

  1. Read the error message in the ADF activity run output — it names the column where truncation occurred.
  2. Check the maximum length of that column's values in the source using a data preview in debug mode: add a max(length(columnName)) aggregate transformation to find the true maximum length.
  3. In the Synapse database, alter the column to a wider type: ALTER TABLE [schema].[table] ALTER COLUMN [column_name] VARCHAR(new_size); — run this via the Synapse Studio SQL editor.
  4. If altering the table is not possible immediately, add a Derived Column transformation in the data flow before the sink to truncate or clean the value: left(columnName, max_allowed_length).
  5. If only a few rows cause truncation, configure error row handling in the sink Settings tab — set 'Truncation of string data' to Skip or Redirect to capture offending rows without failing the entire run.
  6. After fixing, run the data flow in debug mode with the Data Preview tab on the sink to confirm the mapping succeeds.
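Step 3 can be sketched end to end; the schema, table, column, and new size are placeholders to adapt to your own target:

```sql
-- Widen the target column so the longest source value fits (placeholder names and size).
-- Note: restate NULL/NOT NULL explicitly — omitting it in ALTER COLUMN makes the column nullable.
ALTER TABLE [dbo].[FactSales]
ALTER COLUMN [ProductDescription] NVARCHAR(500) NOT NULL;

-- Verify the change took effect before re-running the pipeline.
SELECT CHARACTER_MAXIMUM_LENGTH
FROM   INFORMATION_SCHEMA.COLUMNS
WHERE  TABLE_SCHEMA = 'dbo'
  AND  TABLE_NAME   = 'FactSales'
  AND  COLUMN_NAME  = 'ProductDescription';
```

Size the new width with headroom above the observed maximum from step 2, so organic growth does not reproduce the error.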

Frequently asked questions

Why does this error not appear every run — only sometimes?

Truncation is data-driven — it only occurs when a row with an oversized value is processed. Use a max(length()) aggregate in debug mode to find the true maximum column length.

Can I configure the data flow to skip rows that would cause truncation instead of failing?

Yes — in the Synapse sink Settings, set 'Truncation of string data' to 'Skip incompatible rows' (rows are dropped silently) or 'Redirect incompatible rows' (rows are written to an error output for inspection). Neither fixes the root cause, but both prevent a full run failure.

Is this the same as the DF-SQLDW-ErrorRowsFound error?

Related but distinct. DataTruncation fires before any rows are written — the connector detects overflow and throws immediately. ErrorRowsFound fires after Synapse rejects rows during COPY INTO, which can include truncation or constraint violations.

Will downstream Power BI datasets be affected?

Yes — unless error row handling is active, the entire pipeline run fails and the target table receives no new data. Dependent Power BI datasets then refresh against stale figures until the pipeline succeeds again.

Official documentation: https://learn.microsoft.com/en-us/azure/data-factory/data-flow-troubleshoot-guide
