Severity: Medium · Category: query

Power BI Refresh Error:
DATATYPE_MISMATCH

What does this error mean?

An expression or function received an argument of an incompatible type. Spark SQL's strict type checking detected the mismatch during query analysis, before execution began.
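As an illustration (the table and column names below are hypothetical), a comparison between incompatible column types can surface this error:

```sql
-- Hypothetical schema: events.event_time is TIMESTAMP, logs.ts_text is STRING.
-- With strict type checking, the analyzer may reject this comparison with
-- a DATATYPE_MISMATCH error instead of silently coercing the STRING column.
SELECT e.*
FROM events e
JOIN logs l
  ON e.event_time = l.ts_text;
```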

Common causes

  • Comparing or joining columns of incompatible types, e.g. STRING vs TIMESTAMP
  • Passing a DOUBLE argument to a function that expects INTEGER
  • Using UNION or INSERT INTO where source and target column types differ
  • ARRAY or MAP element types not matching an expected schema
  • Implicit casting disabled via `spark.sql.ansi.enabled=true` surfacing previously silent coercions
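The INSERT INTO case can be sketched as follows (table and column names are hypothetical):

```sql
-- Hypothetical target table: metrics(id BIGINT, value DOUBLE).
-- Inserting a STRING into the DOUBLE column can be rejected under ANSI
-- store-assignment rules with a type-mismatch error, where legacy mode
-- would have attempted a silent coercion.
INSERT INTO metrics VALUES (1, 'not-a-number');
```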

How to fix it

  1. Use CAST() to explicitly convert the column to the expected type: `CAST(col AS BIGINT)`
  2. Run `DESCRIBE TABLE <table>` to verify the actual data types of all columns involved
  3. For UNION queries, align column types in each SELECT branch using explicit CASTs
  4. If ANSI mode is enabled, check whether existing queries relied on implicit coercion that is now disallowed
  5. Use TRY_CAST() to safely handle conversion failures and return NULL instead of erroring
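The UNION fix in step 3 can be sketched like this (table and column names are hypothetical):

```sql
-- Before: the branches return BIGINT and STRING, which may fail type checking.
-- After: align both branches on BIGINT with an explicit CAST.
SELECT user_id                        FROM new_users
UNION ALL
SELECT CAST(legacy_user_id AS BIGINT) FROM old_users;

-- If some legacy values may not parse, TRY_CAST yields NULL instead of failing:
SELECT TRY_CAST(legacy_user_id AS BIGINT) FROM old_users;
```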

Frequently asked questions

Can I turn off strict type checking?

Set `spark.sql.ansi.enabled=false` to revert to legacy coercion behaviour, but this is not recommended for production SQL warehouses as it can silently produce wrong results.
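For a session-scoped experiment, the setting is applied like this:

```sql
-- Session-scoped; reverts to legacy implicit coercion. Not recommended
-- in production, since silent coercions can produce wrong results.
SET spark.sql.ansi.enabled = false;
```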

What is TRY_CAST and when should I use it?

TRY_CAST attempts the conversion and returns NULL on failure instead of raising an error — useful for ingesting messy source data where some values may not convert cleanly.
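A minimal sketch, assuming a Spark version where TRY_CAST is available (Spark 3.2+):

```sql
-- TRY_CAST returns NULL when the value cannot be converted,
-- instead of raising an error.
SELECT TRY_CAST('42'  AS INT) AS ok_value,   -- 42
       TRY_CAST('4x2' AS INT) AS bad_value;  -- NULL
```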
