MetricSign
Severity: Medium · Category: Data source

Power BI Refresh Error:
COPY INTO File Size Limit Exceeded

What does this error mean?

A COPY INTO command failed because an individual data value exceeded Snowflake's per-value size limit. For VARIANT, OBJECT, and ARRAY types, the limit is 16 MB per value; large semi-structured documents or very wide rows trigger this error.
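
A minimal sketch of a load that can hit this limit (the stage and table names `my_stage` and `raw_events` are hypothetical):

```sql
-- Hypothetical setup: a VARIANT column receiving parsed JSON documents.
CREATE OR REPLACE TABLE raw_events (payload VARIANT);

-- If any single parsed JSON document in the staged files exceeds 16 MB,
-- this COPY fails with an error like: "Max LOB size (16777216) exceeded".
COPY INTO raw_events
  FROM @my_stage/events/
  FILE_FORMAT = (TYPE = 'JSON');
```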

Common causes

  • A JSON document in the staged file exceeds 16 MB after parsing into a VARIANT column
  • A CSV row has a string value longer than the maximum VARCHAR(16777216) length
  • A deeply nested JSON array or object expands beyond the 16 MB VARIANT limit during COPY
  • Source data was not validated before staging and contains abnormally large records
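
For the last cause, one quick sanity check is to inspect staged file sizes before loading; a sketch assuming a stage named `my_stage` (a very large file is not proof of an oversized record, but it is a good candidate to inspect first):

```sql
-- List staged files with their compressed sizes in bytes.
LIST @my_stage/events/;

-- Filter the LIST output to files above ~16 MB compressed;
-- RESULT_SCAN re-reads the previous statement's result set.
SELECT "name", "size"
FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()))
WHERE "size" > 16 * 1024 * 1024;
```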

How to fix it

  1. Identify the oversized rows: use `VALIDATE(<table>, JOB_ID=>'<copy_job_id>')` to see which files and rows failed.
  2. For VARIANT: pre-process large JSON documents to split or truncate oversized arrays/objects before staging.
  3. Use the `STRIP_OUTER_ARRAY=TRUE` option in the file format if the JSON is a large outer array containing many smaller objects.
  4. Set `ON_ERROR=SKIP_FILE` or `ON_ERROR=CONTINUE` in the COPY options to load valid rows and log the failures.
  5. For VARCHAR: increase the column size in the target table or truncate long values in a staging step.
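
Steps 1, 3, and 4 can be sketched as follows (again assuming the hypothetical `raw_events` table and `my_stage` stage):

```sql
-- Step 1: inspect which rows/files failed in the most recent COPY.
-- JOB_ID => '_last' validates the last load; a specific query ID also works.
SELECT * FROM TABLE(VALIDATE(raw_events, JOB_ID => '_last'));

-- Steps 3 and 4: unwrap a large outer JSON array into one row per element,
-- and keep loading past bad records instead of aborting the whole job.
COPY INTO raw_events
  FROM @my_stage/events/
  FILE_FORMAT = (TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE)
  ON_ERROR = CONTINUE;
```

Note that `STRIP_OUTER_ARRAY` only helps when the oversized value is a large outer array of smaller objects; a single 16 MB+ object still requires pre-processing before staging (step 2).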

Frequently asked questions

Is the 16 MB limit per-file or per-row?

Per-row (per VARIANT value, specifically). File size is not directly limited by Snowflake — the 16 MB constraint applies to the parsed value that would be stored in a VARIANT column.
