Severity: Medium | Category: Data source

Power BI Refresh Error: ParquetInvalidDecimalPrecisionScale

What does this error mean?

A DECIMAL column in the Parquet file has a precision or scale outside the range ADF accepts. ADF supports a precision from 1 to 38, and the scale must be between 0 and the precision.
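As a quick way to see the bound in practice, the sketch below uses pyarrow (an assumption on our part; any Parquet library with decimal support behaves similarly), whose 128-bit decimal type enforces the same 1–38 precision limit:

```python
import pyarrow as pa

# Valid under ADF's rules: 1 <= precision <= 38 and 0 <= scale <= precision.
ok = pa.decimal128(38, 10)
print(ok)  # decimal128(38, 10)

# pyarrow rejects precision outside 1-38 for decimal128,
# mirroring the limit ADF's Parquet reader enforces.
try:
    pa.decimal128(39, 0)
except ValueError as err:
    print(err)
```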

Common causes

  • The Parquet writer used a precision or scale value outside the supported range (precision > 38 or scale > precision), as shown in the sketch below
  • The decimal column precision was set to an invalid value such as 0 or a negative number
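To make the first cause concrete, here is a minimal, hypothetical sketch (assuming pyarrow and a file name of our choosing). Parquet's 256-bit decimal type allows precision up to 76, so the file below is legal Parquet yet unreadable for ADF:

```python
from decimal import Decimal

import pyarrow as pa
import pyarrow.parquet as pq

# decimal256 permits precision up to 76, so a writer can legally emit
# a DECIMAL(50, 10) column that ADF later rejects with
# ParquetInvalidDecimalPrecisionScale.
amounts = pa.array(
    [Decimal("1234567890.0123456789")],
    type=pa.decimal256(50, 10),
)
pq.write_table(pa.table({"amount": amounts}), "wide_decimal.parquet")

# The file itself is valid Parquet; only the reader's limit is exceeded.
print(pq.read_schema("wide_decimal.parquet"))  # amount: decimal256(50, 10)
```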

How to fix it

  1. Check the DECIMAL column definition in the Parquet file schema using a Parquet viewer (see the sketch after this list).
  2. If the precision exceeds 38, cast the column to a string type and handle it as a high-precision numeric string.
  3. Update the Parquet writer to use a valid precision (1–38) and scale (0 to precision).
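These steps can be scripted. The sketch below is a minimal example, assuming recent pyarrow and the hypothetical wide_decimal.parquet file from the previous snippet: it prints the schema (step 1), then rewrites any decimal column with precision over 38 as a string (step 2), with the recast for step 3 noted in a comment for when the values actually fit in 38 digits:

```python
import pyarrow as pa
import pyarrow.parquet as pq

path = "wide_decimal.parquet"  # hypothetical path; adjust for your data

# Step 1: inspect the schema without loading any row data.
print(pq.read_schema(path))

# Steps 2-3: rewrite columns whose decimal precision exceeds ADF's limit.
table = pq.read_table(path)
fields, columns = [], []
for field in table.schema:
    column = table.column(field.name)
    if pa.types.is_decimal(field.type) and field.type.precision > 38:
        # Step 2: preserve full precision as a string ADF can ingest.
        # Step 3 alternative, if the values fit in 38 digits:
        #     column = column.cast(pa.decimal128(38, field.type.scale))
        column = column.cast(pa.string())
        field = pa.field(field.name, pa.string())
    fields.append(field)
    columns.append(column)

pq.write_table(pa.table(columns, schema=pa.schema(fields)),
               "fixed_decimal.parquet")
```

Casting to string is the lossless option; recasting to decimal128(38, scale) only works when no value needs more than 38 total digits, otherwise the cast itself will fail.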

Frequently asked questions

Does this error affect all pipeline runs or just the current one?

It depends on the root cause: a persistent misconfiguration fails every run, while a transient issue may resolve on retry. Check the run history to see which pattern you have.

Can this error appear in Azure Data Factory and Microsoft Fabric pipelines?

Yes — the same connector errors appear in both ADF and Fabric Data Factory pipelines.

How do I see the full error detail for an ADF pipeline failure?

In ADF Monitor, click the failed run, then the failed activity. The detail pane shows the error code, message, and sub-error codes.

Will downstream Power BI datasets be affected when an ADF pipeline fails?

Yes. A dataset that refreshes after the failed pipeline will pick up stale data, or fail outright if the target table was cleared. Note that the Power BI refresh itself may succeed while serving out-of-date data.

Official documentation: https://learn.microsoft.com/en-us/azure/data-factory/connector-troubleshoot-parquet
