Medium severity · Data quality

Power BI Refresh Error:
000639 — VARIANT value too large

What does this error mean?

A VARIANT, OBJECT, or ARRAY column contains a value that exceeds Snowflake's maximum VARIANT value size of 16 MB. Snowflake rejects inserts and updates containing oversized VARIANT values.

Common causes

  • A PARSE_JSON operation on an API response string that exceeds 16 MB (e.g. a paginated endpoint returning a large array)
  • An ARRAY_AGG or OBJECT_CONSTRUCT aggregation producing a VARIANT output larger than 16 MB
  • A nested JSON document with deeply recursive structures or extremely long string fields
  • Uploading large raw JSON files to a VARIANT column without splitting them first
  • A FLATTEN operation creating intermediate VARIANT objects that exceed the limit before being projected
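The aggregation case is easy to hit. As a hypothetical sketch (the table name, row count, and payload size are all illustrative), the following builds a single VARIANT value well past 16 MB, so the INSERT is rejected with error 000639:

```sql
-- Illustrative only: ~10,000 elements of ~4 KB each is roughly 40 MB
-- packed into one VARIANT value, which exceeds the 16 MB per-value limit.
CREATE OR REPLACE TABLE big_variant (v VARIANT);

INSERT INTO big_variant
SELECT ARRAY_AGG(OBJECT_CONSTRUCT('id', SEQ4(), 'payload', RANDSTR(4000, RANDOM())))
FROM TABLE(GENERATOR(ROWCOUNT => 10000));
```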

How to fix it

  1. Check the size of the problematic value: SELECT LENGTH(TO_JSON(variant_col)) to identify oversized rows.
  2. Split large JSON arrays into individual rows before loading: use LATERAL FLATTEN(INPUT => PARSE_JSON(col)) to unnest one row per array element.
  3. Truncate oversized string fields or hash large blobs before storing them in VARIANT columns.
  4. For large API responses, paginate at the source and load in chunks rather than as a single VARIANT value.
  5. Store large binary payloads in an external stage (S3/ADLS) and keep only a reference URL in the VARIANT column.
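When the oversized array arrives as a file, the splitting in steps 2 and 4 can happen during the load itself via Snowflake's JSON file format options. A minimal sketch (the stage, file, and table names are assumptions):

```sql
-- STRIP_OUTER_ARRAY loads each element of the file's top-level JSON array
-- as its own row, so no single VARIANT value holds the whole document.
COPY INTO raw_events (v)
FROM @my_stage/events.json
FILE_FORMAT = (TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE);
```

This keeps each stored value small even when the source file itself is far larger than 16 MB.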

Frequently asked questions

Is there a way to detect oversized VARIANT values before they cause a write failure?

Yes — use a pre-load quality check: SELECT COUNT(*) FROM staging WHERE LENGTH(TO_JSON(variant_col)) > 16000000, which flags rows approaching the 16,777,216-byte (16 MB) limit. Add this as a dbt test or a SQL check in your pipeline before the write step.
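Wired into dbt, the same check can be expressed as a singular test, which fails whenever the query returns any rows (the model and column names here are illustrative):

```sql
-- tests/assert_no_oversized_variants.sql
-- A dbt singular test passes only when this query returns zero rows.
SELECT *
FROM {{ ref('staging_events') }}
WHERE LENGTH(TO_JSON(variant_col)) > 16000000
```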

Does the 16 MB limit apply to ARRAY and OBJECT types as well?

Yes. VARIANT, OBJECT, and ARRAY are all stored internally as VARIANT and share the same 16 MB per-value limit.
