Severity: High | Category: Data source

Azure Data Factory Copy Activity Error:
UserErrorDataOverflow

What does this error mean?

A value in the source data is too large to fit in the destination column's data type — for example, a numeric value exceeding DECIMAL precision, a string longer than VARCHAR(n), or a date outside the supported range. By default, Azure Data Factory (ADF) fails the copy activity at the first violating row.
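The numeric case can be made concrete with a small check. The sketch below (the `fits_decimal` helper is hypothetical, not part of ADF) tests whether a value fits a DECIMAL(p, s) column, which allows at most p − s digits before the decimal point and s digits after it:

```python
from decimal import Decimal

def fits_decimal(value: Decimal, precision: int, scale: int) -> bool:
    """Return True if `value` fits a DECIMAL(precision, scale) column."""
    # Round to `scale` fractional digits, as the sink column would store it.
    quantized = value.quantize(Decimal(1).scaleb(-scale))
    # Count digits to the left of the decimal point.
    integer_digits = len(quantized.as_tuple().digits) + quantized.as_tuple().exponent
    return integer_digits <= precision - scale

# A value that needs DECIMAL(12,4) precision overflows DECIMAL(10,2):
print(fits_decimal(Decimal("12345678.90"), 10, 2))   # True: 8 integer digits fit
print(fits_decimal(Decimal("123456789.01"), 10, 2))  # False: 9 digits exceed 10 - 2
```

This is the same arithmetic the sink database performs; ADF surfaces the database's rejection as UserErrorDataOverflow.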

Common causes

  • A numeric source value exceeds the precision or scale of the sink column (e.g., DECIMAL(10,2) receiving a value that needs DECIMAL(12,4))
  • A string column value is longer than the VARCHAR(n) defined in the sink
  • An implicit type cast during copy silently widens a value beyond the target type's range
  • Unexpected data in the source (e.g., a test row with an extreme value) that was never present during schema design

How to fix it

  1. Identify the specific column and row causing the overflow from the copy activity's error details in ADF monitoring.
  2. Widen the sink column definition to accommodate the observed maximum value (e.g., increase DECIMAL precision or VARCHAR length).
  3. Add a data flow transformation upstream to truncate or cap values that exceed the allowed range, with logging for out-of-range rows.
  4. If source data quality is the root cause, add validation in the upstream system to prevent extreme values.
  5. Use the copy activity's fault-tolerance settings (enableSkipIncompatibleRow) to skip overflowing rows and redirect them to a rejected-rows log.
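The truncate-or-cap step can be sketched as a small row transform. The column names and limits below are hypothetical placeholders for your sink schema, not anything ADF defines:

```python
import logging
from decimal import Decimal

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger("overflow-guard")

# Hypothetical limits matching an assumed sink schema.
MAX_AMOUNT = Decimal("99999999.99")  # largest value a DECIMAL(10,2) column holds
MAX_NAME_LEN = 50                    # VARCHAR(50)

def cap_row(row: dict) -> dict:
    """Cap or truncate out-of-range values, logging each adjustment."""
    fixed = dict(row)
    if fixed["amount"] > MAX_AMOUNT:
        log.warning("amount %s capped to %s (row id=%s)",
                    fixed["amount"], MAX_AMOUNT, fixed.get("id"))
        fixed["amount"] = MAX_AMOUNT
    if len(fixed["name"]) > MAX_NAME_LEN:
        log.warning("name truncated to %d chars (row id=%s)",
                    MAX_NAME_LEN, fixed.get("id"))
        fixed["name"] = fixed["name"][:MAX_NAME_LEN]
    return fixed

row = {"id": 1, "amount": Decimal("123456789.01"), "name": "x" * 60}
print(cap_row(row))
```

In a real pipeline the equivalent logic would live in a mapping data flow (e.g., a derived-column transformation) rather than external Python; the point is that capping should always be paired with logging so silently altered rows remain auditable.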

Frequently asked questions

Can I skip the overflowing rows instead of failing the whole pipeline?

Yes. In the copy activity's Settings tab, enable 'Fault tolerance' and set 'Incompatible row handling' to 'Skip incompatible rows'. ADF will log skipped rows to a storage path you specify.
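In the pipeline's JSON definition, the same setting appears under the copy activity's typeProperties. A minimal sketch is below; the linked service name and path are placeholders you would replace with your own:

```json
{
  "name": "CopyWithFaultTolerance",
  "type": "Copy",
  "typeProperties": {
    "source": { "type": "SqlSource" },
    "sink": { "type": "SqlSink" },
    "enableSkipIncompatibleRow": true,
    "redirectIncompatibleRowSettings": {
      "linkedServiceName": "AzureBlobStorageLinkedService",
      "path": "rejected-rows"
    }
  }
}
```

With redirectIncompatibleRowSettings configured, ADF writes each skipped row and its error to the given storage path so the overflows can be reviewed and reprocessed.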

Official documentation: azure-data-factory
