Severity: Medium · Category: pipeline

Power BI Refresh Error:
FlowRunSizeLimitExceeded

What does this error mean?

The Azure Data Factory (ADF) pipeline run payload exceeded the service's size limit: large activity inputs, activity outputs, or pipeline variable values grew beyond the allowed maximum.

Common causes

  • A Lookup activity returns too many rows and passes the full result set to a downstream activity
  • Large dynamic expressions or pipeline variables carry oversized payloads
  • Activity output exceeds ADF's 4 MB activity output size limit
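As an illustration, a Lookup whose entire result set flows into the run payload might look like this in pipeline JSON (the activity name, dataset name, and query are hypothetical):

```json
{
  "name": "LookupAllOrders",
  "type": "Lookup",
  "typeProperties": {
    "source": {
      "type": "AzureSqlSource",
      "sqlReaderQuery": "SELECT * FROM dbo.Orders"
    },
    "dataset": { "referenceName": "OrdersDataset", "type": "DatasetReference" },
    "firstRowOnly": false
  }
}
```

When a downstream activity references `@activity('LookupAllOrders').output.value`, every row travels inside the pipeline run payload, so an unbounded query can push the run past the size limit.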

How to fix it

  1. Reduce the size of the data passed between activities; avoid passing large datasets through pipeline variables.
  2. Use intermediate storage (Blob Storage, ADLS) to hand off large payloads between activities instead of activity output.
  3. Add a top-N limit to the Lookup activity to shrink the result set.
  4. Split the pipeline into smaller sub-pipelines using Execute Pipeline activities.
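Step 3 can be sketched as a bounded Lookup query; the names and the row cap below are illustrative, not prescribed values:

```json
{
  "name": "LookupTopOrders",
  "type": "Lookup",
  "typeProperties": {
    "source": {
      "type": "AzureSqlSource",
      "sqlReaderQuery": "SELECT TOP 1000 OrderId, Status FROM dbo.Orders"
    },
    "dataset": { "referenceName": "OrdersDataset", "type": "DatasetReference" },
    "firstRowOnly": false
  }
}
```

For payloads that cannot be trimmed this way, apply step 2 instead: stage the data in Blob/ADLS with a Copy activity and pass only the file path downstream, keeping the run payload small.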

Frequently asked questions

Does a pipeline start failure affect all activities in the pipeline?

Yes — if the pipeline fails to start, no activities run. If the failure occurs mid-run, only activities that ran before the failure complete; downstream activities are skipped.

How do I see why a pipeline run failed to start?

In ADF Monitor, click the failed pipeline run. If it shows 'Failed' before any activity started, check the trigger details — trigger errors appear in the trigger run history, not the pipeline run history.

Can pipeline size limits be increased?

Some limits (like the 50 MB pipeline JSON size limit) are hard limits that cannot be increased. If a pipeline hits this limit, restructure it by extracting sub-pipelines using Execute Pipeline activities.
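Extracting a sub-pipeline looks roughly like the following Execute Pipeline activity (both pipeline names are hypothetical):

```json
{
  "name": "RunLoadStage",
  "type": "ExecutePipeline",
  "typeProperties": {
    "pipeline": { "referenceName": "LoadStagePipeline", "type": "PipelineReference" },
    "waitOnCompletion": true
  }
}
```

Setting `waitOnCompletion` to `true` makes the parent pipeline block until the sub-pipeline finishes, preserving the original ordering of the extracted steps.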

How does an ADF pipeline failure affect downstream Power BI reports?

If the ADF pipeline loads data that Power BI datasets depend on, a failure means Power BI refreshes against the previous run's data. The Power BI refresh may show as Successful while serving stale numbers — a silent data quality issue.

Official documentation: https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-fault-tolerance