The same DAX query returns different row counts depending on where you run it
A pattern shows up regularly in Power BI developer forums: someone builds a DAX query in DAX Studio, confirms it returns the full result set — say 11,000 rows — then wraps the same query in VBA using an ADODB connection to the XMLA endpoint. The Excel sheet fills with roughly 6,000 rows. No error. No warning. The recordset simply stops short.
This happens because VBA's ADODB layer and DAX Studio use fundamentally different client stacks. DAX Studio uses ADOMD.NET, which speaks the XMLA protocol natively and streams results as a complete rowset. VBA's ADODB uses the MSOLAP OLE DB provider, which translates between OLE DB semantics and XMLA under the hood. That translation introduces three failure surfaces: cursor behavior, string-length handling, and authentication token scope.
The connection string looks deceptively simple:
```
Provider=MSOLAP;Data Source=powerbi://api.powerbi.com/v1.0/myorg/[Workspace];Initial Catalog=[Dataset]
```
But the MSOLAP provider is doing heavy lifting behind that string. It acquires an OAuth2 token via the Microsoft Authentication Library, negotiates the XMLA session, and maps the returned rowset into an OLE DB-compatible format. Any version mismatch or misconfiguration at these layers truncates data silently rather than raising an error. The VBA developer sees a partial result and assumes the query is wrong. It isn't.
Forward-only cursors discard rows the provider can't buffer
The most common cause of row truncation is the default cursor type in ADODB. When you open a recordset in VBA without specifying cursor parameters, ADODB defaults to adOpenForwardOnly (CursorType = 0) with adLockReadOnly (LockType = 1). This is efficient for small results but problematic for XMLA connections.
The MSOLAP provider streams data from the Power BI engine in chunks. A forward-only cursor reads each chunk once and discards it. If the provider's internal buffer fills before all rows are consumed — which happens when individual rows contain long strings or many columns — the cursor moves forward and the unconsumed rows are gone. No exception is raised.
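The problematic pattern looks harmless in code. A minimal sketch of the default-cursor open (workspace, dataset, and table names are placeholders):

```vb
' Implicit defaults: adOpenForwardOnly cursor, server-side location.
' Rows the provider cannot buffer are silently discarded.
Dim conn As New ADODB.Connection
Dim rs As New ADODB.Recordset
Dim strDAX As String

conn.Open "Provider=MSOLAP;" & _
          "Data Source=powerbi://api.powerbi.com/v1.0/myorg/Sales Workspace;" & _
          "Initial Catalog=SalesModel"

strDAX = "EVALUATE 'Sales'"   ' placeholder query
rs.Open strDAX, conn          ' no cursor settings: forward-only by default
```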
The fix is to switch to a static cursor:
```vb
Dim rs As New ADODB.Recordset
rs.CursorLocation = adUseClient   ' Forces client-side buffering
rs.CursorType = adOpenStatic      ' CursorType = 3
rs.LockType = adLockReadOnly      ' LockType = 1
rs.Open strDAX, conn
```
Setting CursorLocation = adUseClient forces the entire result set into client memory before any rows are read. This eliminates the streaming truncation issue at the cost of higher memory usage. For an 11,000-row result set with typical BI columns, expect 20–50 MB of memory overhead — trivial on modern machines.
There is a secondary trap: CopyFromRecordset itself truncates string fields longer than 8,203 characters. If your DAX query returns text columns with long values (descriptions, concatenated lists, JSON blobs), those fields get silently clipped. The row appears in Excel but the data is incomplete. Looping through the recordset field by field avoids this, at the cost of slower writes:
```vb
Dim row As Long, col As Long
row = 1
Do While Not rs.EOF
    For col = 0 To rs.Fields.Count - 1
        Cells(row, col + 1).Value = rs.Fields(col).Value
    Next col
    rs.MoveNext
    row = row + 1
Loop
```
MSOLAP version determines whether authentication works or just appears to
The XMLA endpoint requires MSOLAP version 17.0.40.18 or higher and ADOMD 19.104.2.0 or higher. These are not suggestions — older versions establish connections that partially function. They authenticate, they accept queries, and they return results. But the OAuth2 token negotiation may fall back to a restricted scope that limits the result set size the engine is willing to return.
You can check your installed MSOLAP version from VBA itself:
```vb
Dim conn As New ADODB.Connection
conn.Provider = "MSOLAP"
conn.Open "Data Source=powerbi://api.powerbi.com/v1.0/myorg/[Workspace];Initial Catalog=[Dataset]"
Debug.Print conn.Properties("Provider Version").Value
```
If the version is below 17.0.40.18, install the latest Analysis Services client libraries from Microsoft's download page. The MSI package updates both the MSOLAP provider and the ADOMD.NET libraries simultaneously.
Multi-factor authentication adds another wrinkle. When your Azure AD tenant enforces MFA, the MSOLAP provider must trigger an interactive browser prompt. In VBA, this works only if you leave credentials out of the connection string entirely — no User ID, no Password properties. Including either property causes MSOLAP to attempt basic authentication, which MFA-enabled tenants reject. The connection may still succeed using a cached token from a previous interactive session, but that token expires (typically after 60–90 minutes), and the next VBA execution fails with a generic Automation error rather than a clear authentication message.
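For interactive MFA sign-in, the connection string should therefore carry no credentials at all. A sketch, with placeholder workspace and dataset names:

```vb
' No User ID / Password properties: MSOLAP raises the interactive
' Azure AD browser prompt instead of attempting basic authentication.
Dim conn As New ADODB.Connection
conn.Open "Provider=MSOLAP;" & _
          "Data Source=powerbi://api.powerbi.com/v1.0/myorg/Sales Workspace;" & _
          "Initial Catalog=SalesModel"
```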
For unattended scenarios — scheduled Excel macros, Task Scheduler jobs — service principal authentication is the only reliable path. This requires registering an Azure AD app, granting it workspace-level permissions, and passing the app's client ID and secret through the connection string's User ID and Password properties with Persist Security Info=True. Few VBA implementations get this right on the first attempt.
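A sketch of the service principal variant. The GUID, tenant, and secret below are placeholders; the `app:<clientId>@<tenantId>` form follows Microsoft's documented pattern for XMLA service principal connections:

```vb
' Service principal auth for unattended runs (all identifiers are placeholders).
' The app registration must have workspace-level permissions in Power BI.
Dim conn As New ADODB.Connection
conn.Open "Provider=MSOLAP;" & _
          "Data Source=powerbi://api.powerbi.com/v1.0/myorg/Sales Workspace;" & _
          "Initial Catalog=SalesModel;" & _
          "User ID=app:00000000-0000-0000-0000-000000000000@contoso.onmicrosoft.com;" & _
          "Password=<client-secret>;" & _
          "Persist Security Info=True"
```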
Dynamic DAX construction in VBA invites injection and silent type mismatches
The community thread that prompted this post uses VBA to inject Excel cell values into a DAX query string — building a TREATAS filter dynamically from spreadsheet inputs. This pattern works but introduces two risks that don't exist when the same query runs in DAX Studio.
First, DAX has no parameterized query mechanism. Every filter value must be concatenated directly into the query string. If a cell contains a single quote, the DAX parser throws a syntax error. If it contains a comma in a numeric locale that uses commas as decimal separators, the DATE or VALUE function receives an argument it doesn't expect. VBA doesn't sanitize these inputs, and the resulting errors surface as The expression contains a syntax error with no indication of which token caused it.
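With no parameter mechanism, escaping is the only defense. DAX string literals are delimited by double quotes, and an embedded double quote is escaped by doubling it, so a small helper (hypothetical name `EscapeDaxString`) can wrap and escape each cell value before concatenation:

```vb
' Escape a value for use as a DAX double-quoted string literal.
' DAX escapes an embedded double quote by doubling it ("" inside "...").
Function EscapeDaxString(ByVal value As String) As String
    EscapeDaxString = """" & Replace(value, """", """""") & """"
End Function

' Usage: build a TREATAS filter value from a cell (names are placeholders)
' strFilter = "TREATAS({" & EscapeDaxString(Range("A2").Value) & "}, 'Product'[Name])"
```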
Second, date handling across this boundary is fragile. A cell formatted as a date in Excel passes to VBA as a Double (the OLE Automation date serial number). Converting it to a DAX DATE() call requires explicit year, month, and day extraction:
```vb
Dim d As Date: d = Range("B2").Value
Dim daxDate As String
daxDate = "DATE(" & Year(d) & "," & Month(d) & "," & Day(d) & ")"
```
Concatenating the formatted value — Format(d, "yyyy-mm-dd") — into the query without surrounding double quotes often returns zero rows rather than an error, because DAX parses the bare token as an identifier, not a string literal. The query executes, returns an empty table, and the VBA macro writes nothing to the sheet. The developer assumes the dataset has no data for that date.
Using TOPN in the DAX query (as the original poster did, with a 500,000-row cap) adds a misleading safety net. TOPN without an orderBy expression returns an arbitrary subset. If the engine's internal sort order happens to place the filtered rows at the end, TOPN clips them. The result set looks complete — it has the expected number of rows — but contains the wrong rows. Always supply TOPN's orderBy argument on the column you're filtering, and add a query-level ORDER BY so the output order is deterministic as well.
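A sketch of building such a query string in VBA; `'Sales'` and `[OrderDate]` are placeholder names:

```vb
' Deterministic TOPN: the third argument tells the engine which rows to keep,
' and the query-level ORDER BY makes the output order reproducible.
Dim strDAX As String
strDAX = "EVALUATE TOPN(500000, 'Sales', 'Sales'[OrderDate], ASC)" & _
         " ORDER BY 'Sales'[OrderDate]"
```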
When the XMLA endpoint is not the right tool for this job
Using VBA to query Power BI via XMLA works. It is also the most friction-heavy path available. Every component in the chain — Excel's VBA runtime, ADODB, the MSOLAP provider, Azure AD token acquisition, the XMLA protocol, the Power BI engine — adds a potential failure point that produces no useful error message.
For pulling filtered subsets of a Power BI dataset into Excel, three alternatives reduce that surface area significantly. The first is the native Excel integration with Power BI semantic models, available in Excel builds 16.0.18129.x and higher. This uses MSOLAP 160.139.29 internally but handles authentication, cursor management, and result buffering through Excel's own connection infrastructure rather than through ADODB. The user experience is Insert → PivotTable → From Power BI, and the connection is managed, not scripted.
The second is the Power BI REST API's executeQueries endpoint, which accepts DAX queries over HTTPS and returns JSON. A VBA macro can call this via MSXML2.XMLHTTP, passing a bearer token obtained through MSAL. The advantage is that the result format is predictable — JSON arrays don't truncate silently — and the error responses include specific codes like PowerBINotAuthorizedException or DatasetRefreshInProgressException.
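A sketch of that call from VBA. The dataset ID and bearer token are placeholders, and acquiring the token via MSAL is a separate step not shown here:

```vb
' POST a DAX query to the executeQueries endpoint and print the raw JSON.
Dim http As Object
Set http = CreateObject("MSXML2.XMLHTTP")

Dim strToken As String
strToken = "<bearer-token>"   ' obtained via MSAL, not shown

Dim url As String
url = "https://api.powerbi.com/v1.0/myorg/datasets/<dataset-id>/executeQueries"

http.Open "POST", url, False
http.setRequestHeader "Content-Type", "application/json"
http.setRequestHeader "Authorization", "Bearer " & strToken
http.send "{""queries"":[{""query"":""EVALUATE TOPN(100, 'Sales')""}]}"

Debug.Print http.Status, http.responseText
```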
The third option is Power Automate, which can run a DAX query against a Power BI dataset and write the results to an Excel file in OneDrive or SharePoint. This eliminates VBA entirely and delegates authentication to the Power Automate connector's managed identity.
MetricSign monitors the Power BI datasets underneath these queries and fires a refresh_delayed signal when a scheduled refresh misses its expected window. If your VBA macro pulls stale data because the underlying dataset hasn't refreshed, the query returns successfully — but with yesterday's numbers. A delayed-refresh alert closes that gap before a stakeholder notices.
Debugging the silent truncation when it's already in production
If you have a VBA macro in production that's already returning partial results, here is a concrete diagnostic sequence.
First, isolate whether the issue is at the query layer or the write layer. Replace CopyFromRecordset with a row count check:
```vb
rs.MoveLast
Debug.Print "Total rows: " & rs.RecordCount
rs.MoveFirst
```
If RecordCount matches your expected count (e.g., 11,000), the XMLA query returned the full result set and the problem is in how VBA writes it to the sheet. If RecordCount is lower, the truncation happened during data retrieval.
For retrieval-side truncation, check three things in order. One: is CursorLocation set to adUseClient? Without it, RecordCount may return -1 (unknown) on a forward-only cursor, and MoveLast will fail. Two: is the MSOLAP provider version current? Run the version check from the previous section. Three: has the OAuth token expired mid-query? For queries that take longer than a few seconds to return, a token acquired at connection time can expire before the result set finishes streaming. The symptom is a partial result with no error.
For write-side truncation, check whether the target sheet has enough rows. Excel's row limit is 1,048,576 — unlikely to be the issue for 11,000 rows — but if the macro writes to a range that starts partway down the sheet, the available space shrinks. Also verify that no On Error Resume Next statement is swallowing a runtime error from CopyFromRecordset. This is the single most common VBA antipattern: the macro hits a string-length or data-type error on row 6,001, the error handler suppresses it, and execution continues past the write loop.
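A minimal replacement pattern, with placeholder sheet names, that surfaces the write error instead of suppressing it:

```vb
' Trap and report the failure rather than silencing it with
' On Error Resume Next; the error number and description identify
' which row or field broke the write.
Sub WriteResults(rs As ADODB.Recordset)
    On Error GoTo WriteFailed
    Worksheets("Results").Range("A1").CopyFromRecordset rs
    Exit Sub
WriteFailed:
    Debug.Print "Write failed: " & Err.Number & " - " & Err.Description
End Sub
```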
Finally, compare results deterministically: add ORDER BY to your DAX query and run it in both DAX Studio and VBA. If the row counts diverge, the rows present in both results will at least be the same rows, making it possible to identify exactly where truncation begins.