Severity: Medium · Category: Data source

Power BI Refresh Error:
ParquetDateTimeExceedLimit

What does this error mean?

A datetime value in the Parquet file falls outside the range that Azure Data Factory (ADF) can represent. Boundary values such as 0001-01-01 can shift outside the valid ticks range when they are converted between Julian and proleptic Gregorian calendar representations.
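ADF's exact limits are not spelled out in this article, but the same class of failure is easy to reproduce with pandas, a common Parquet consumer, whose default nanosecond timestamps cover roughly 1677-09-21 through 2262-04-11:

```python
import pandas as pd

# A boundary value like 0001-01-01 cannot be stored as a
# signed 64-bit nanosecond timestamp.
try:
    # .as_unit("ns") forces nanosecond precision (pandas >= 2.0);
    # older pandas versions raise from the constructor itself.
    pd.Timestamp("0001-01-01 00:00:00").as_unit("ns")
except (pd.errors.OutOfBoundsDatetime, OverflowError) as exc:
    print("out of range:", exc)
```

The connector error is the same idea: the value is a legal Parquet datetime, but the engine reading it has a narrower representable window.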

Common causes

  • The Parquet file contains datetime values at the boundary of the valid range (e.g., 0001-01-01 00:00:00)
  • Conversion between Julian and proleptic Gregorian calendar representations shifts values outside the valid ticks range
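The calendar cause can be sketched in plain Python. Around year 1 CE the Julian calendar runs about two days behind the proleptic Gregorian calendar (an assumed illustrative offset), so a value recorded as Julian 0001-01-01 maps to Gregorian 0000-12-30, below the representable minimum:

```python
from datetime import date, timedelta

# Assumption for illustration: Julian-to-Gregorian offset near year 1 CE.
JULIAN_TO_GREGORIAN_YEAR_1 = timedelta(days=-2)

try:
    date(1, 1, 1) + JULIAN_TO_GREGORIAN_YEAR_1
except OverflowError:
    # Python's proleptic-Gregorian `date` cannot go below 0001-01-01;
    # this is the same kind of lower-bound violation the connector reports.
    print("shifted value falls below the representable minimum")
```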

How to fix it

  1. Filter out or replace the out-of-range datetime values in the source system before writing to Parquet.
  2. Add a data flow transformation to clamp or replace out-of-range datetime values with a valid default.
  3. Use a string type for the datetime column if preserving the exact original values is important.
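Fix 2 can be prototyped outside ADF with pandas before the file is written. The column name, bounds, and default below are illustrative assumptions, not ADF settings; adjust them for your sink's real limits:

```python
import pandas as pd

# Assumed safe window and sentinel value -- illustrative only.
LOWER = pd.Timestamp("1753-01-01")
UPPER = pd.Timestamp("2262-04-11")
DEFAULT = pd.Timestamp("1900-01-01")

def clamp_datetimes(df: pd.DataFrame, column: str) -> pd.DataFrame:
    """Replace unparseable or out-of-range datetimes with DEFAULT."""
    parsed = pd.to_datetime(df[column], errors="coerce")  # out-of-range -> NaT
    bad = parsed.isna() | (parsed < LOWER) | (parsed > UPPER)
    out = df.copy()
    out[column] = parsed.where(~bad, DEFAULT)
    return out

df = pd.DataFrame({"event_time": ["2024-05-01", "0001-01-01", "1700-06-15"]})
cleaned = clamp_datetimes(df, "event_time")
# cleaned can now be written safely, e.g. cleaned.to_parquet("out.parquet")
```

For fix 3, the equivalent one-liner is `df[column] = df[column].astype("string")`, which preserves the original text at the cost of datetime semantics downstream.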

Frequently asked questions

Does this error affect all pipeline runs or just the current one?

That depends on the root cause. A persistent misconfiguration fails every run, while a transient issue may resolve on retry; check the run history to tell the two apart.

Can this error appear in Azure Data Factory and Microsoft Fabric pipelines?

Yes — the same connector errors appear in both ADF and Fabric Data Factory pipelines.

How do I see the full error detail for an ADF pipeline failure?

In ADF Monitor, click the failed run, then the failed activity. The detail pane shows the error code, message, and sub-error codes.

Will downstream Power BI datasets be affected when an ADF pipeline fails?

Yes. A dataset that refreshes after the failed pipeline will either serve stale data or fail outright if the target table was cleared. Note that the Power BI refresh itself can succeed while serving wrong data.

Official documentation: https://learn.microsoft.com/en-us/azure/data-factory/connector-troubleshoot-parquet
