This explainer is for Excel professionals who are familiar with Power Query and Power Pivot, but who may not be aware of Excel’s native bidirectional data flow capabilities — nor how the architectural choices behind features like Power Query and OLAP cubes can introduce avoidable performance bottlenecks.

Let’s start with a foundational clarification.

❌ “Isn’t that just VBA?”

No, it’s not.
What we’re talking about here, ActiveX Data Objects (ADO) and other “hidden” components of Excel, is not VBA. These are components of Excel itself, just like formulas, charts, or data validation.

Excel is built on COM, the Component Object Model: a framework that lets software components expose objects and methods to one another. When you use a formula, click a ribbon button, or apply data validation, you are triggering actions on these objects. VBA simply lets you automate and interact with the same components, much as Power Query does, but with more control and less overhead.

Some components are visible (charts, slicers, pivot tables), others — like ActiveX Data Objects — are invisible, but just as fundamental. They allow direct communication between Excel and external databases, enabling real-time reading and writing of data without round-trips or staging layers. This is the key to Excel’s bidirectional power.
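
To make that concrete, here is a minimal VBA sketch of ADO being used directly from Excel, with no add-ins and nothing beyond what ships with Windows and Office. The server, database, and table names are placeholders for illustration only.

    Sub ReadOneValueViaADO()
        ' Minimal sketch: open an ADO connection and pull one value.
        ' Server, database, and table names below are placeholders.
        Dim conn As Object, rs As Object

        Set conn = CreateObject("ADODB.Connection")
        conn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
                  "Initial Catalog=MyDatabase;Integrated Security=SSPI;"

        Set rs = conn.Execute("SELECT SUM(Amount) FROM dbo.BudgetLines")
        Range("A1").Value = rs.Fields(0).Value   ' straight onto the sheet

        rs.Close
        conn.Close
    End Sub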


✅ What is Bidirectional Data Flow in Excel?

It’s the ability for Excel not only to pull in data from an external source but also to write changes back, just like any true client in a client-server architecture. Think of it like this:

  • Excel = the client (what the user interacts with)
  • SQL Server = the server (where data lives)
  • ADO = the plumbing (how they talk)

With a setup like this, you can build enterprise-grade budgeting and reporting tools that feel like ordinary spreadsheets: fast, intuitive, and instantly responsive, because they run on an architecture that mirrors how real systems are supposed to work.
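
The write-back half is the same plumbing in the other direction. The sketch below pushes one edited value back to the server; the table and key column are invented for the example, and a production version would use an ADODB.Command with parameters rather than concatenated SQL.

    Sub WriteOneChangeBack()
        ' Illustrative only: send the value the user typed in B2 back to
        ' the database row identified by the key in A2.
        ' Table and column names are placeholders; use parameterised
        ' commands in real code rather than string concatenation.
        Dim conn As Object

        Set conn = CreateObject("ADODB.Connection")
        conn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
                  "Initial Catalog=MyDatabase;Integrated Security=SSPI;"

        conn.Execute "UPDATE dbo.BudgetLines SET Amount = " & _
                     CStr(Range("B2").Value) & _
                     " WHERE LineID = " & CStr(Range("A2").Value)

        conn.Close
    End Sub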


🚧 What’s the Problem with Power Query, Cubes, and OLAP?

Let’s look at a real case: A large public company scrapped their £1M SAP BPC project after discovering it introduced 40-second delays every time a user updated a cell in Excel. Why? Because the spreadsheet front-end was connected to a cube. That cube had to refresh after every input, even for minor changes. Result? Unusable. You couldn’t even type fluidly.

Now ask yourself — why the cube? What benefit was it bringing? The answer: None that mattered for that scenario. There was no animation, no need for instant graphics or pre-aggregated visualizations. The cube was a detour — completely unnecessary for their budgeting process.


🚦When Are Cubes Appropriate?

Cubes are great when the dataset is static, or when you need pre-aggregated data for high-performance visuals:

  • Financial dashboards updated overnight
  • Sales leaderboards
  • KPI animations

But if you’re in a live, user-driven environment — like budgeting, reconciliation, or forecasting — then live input speed is everything. And cubes are a bottleneck.


🔁 Power Query’s Hidden Detour

Power Query is, in effect, a lightweight OLAP layer. It pre-processes data in memory before letting you use it. But here’s the catch: if you add one new transaction, it may need to refresh the entire dataset, right back to day one. This is not a bug; it’s by design. Power Query’s memory-resident model means you’re always working with a cached snapshot, not a live, streaming connection.

Five years ago, Celia Alves highlighted this problem on LinkedIn — questioning why Power Query forces a full refresh just to add one new month’s data. The reality? That’s how this architecture works. It’s a cube detour, even if we don’t call it that.


🛩️ The Submarine Analogy

Imagine someone proposes: “Instead of flying over the Atlantic, why not fly underwater in a submarine plane? That way, passengers can look at fish instead of clouds.”

Sounds exciting… until you realize how aircraft are designed and why submarines don’t fly. It’s absurd — not because the goal is silly, but because the suggestion misunderstands the engineering realities.

Same with Power Query. Asking it to behave like a real-time bidirectional engine is like asking an Airbus A380 to dive into the Atlantic. That’s not what it was designed to do.


💡 The Smarter Approach: Native Excel + ADO

In the replacement project I delivered (after the SAP failure), we reverted to what works:

  • Keep Excel as the input client
  • Let the user update as normal — no delays
  • Only commit the changes back to the database after the user is finished

Just like saving a file. One click, done in milliseconds. No round trips. No cube detour. And certainly no 40-second lag per cell.
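
As a rough sketch of that save step (assuming, purely for illustration, that the edited lines sit in columns A and B of the active sheet and that column A holds the key), the whole batch can go back in one transaction:

    Sub CommitEditsOnSave()
        ' One round trip: wrap all the user's edits in a single
        ' transaction and send them only when the user clicks Save.
        ' Sheet layout, table, and column names are assumptions.
        Dim conn As Object, r As Long, lastRow As Long

        Set conn = CreateObject("ADODB.Connection")
        conn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
                  "Initial Catalog=MyDatabase;Integrated Security=SSPI;"

        conn.BeginTrans
        lastRow = Cells(Rows.Count, "A").End(xlUp).Row
        For r = 2 To lastRow
            conn.Execute "UPDATE dbo.BudgetLines SET Amount = " & _
                         CStr(Cells(r, 2).Value) & _
                         " WHERE LineID = " & CStr(Cells(r, 1).Value)
        Next r
        conn.CommitTrans

        conn.Close
    End Sub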


📈 Demonstration: 10 Million Records, No Problem

A well-known influencer once claimed you could import 10 million records into Excel via Power Query. I recreated their report without Power Query — and showed that by using native ADO connections, I could update the model just as fast, even when adding new data.

The issue isn’t volume. It’s architecture.
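
For a flavour of how that can look without Power Query (a sketch, not the exact demo): let the server aggregate the millions of rows and stream only the result onto the report sheet. The connection details, query, and sheet name are placeholders.

    Sub RefreshReportViaADO()
        ' Let the database do the heavy lifting over millions of rows,
        ' then stream only the aggregated result into the report.
        ' Connection string, query, and sheet name are placeholders.
        Dim conn As Object, rs As Object

        Set conn = CreateObject("ADODB.Connection")
        conn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
                  "Initial Catalog=MyDatabase;Integrated Security=SSPI;"

        Set rs = conn.Execute( _
            "SELECT Region, SUM(Amount) AS Total " & _
            "FROM dbo.Transactions GROUP BY Region ORDER BY Region")

        With Worksheets("Report")
            .Range("A2:B1048576").ClearContents
            .Range("A2").CopyFromRecordset rs   ' one call, no refresh cycle
        End With

        rs.Close
        conn.Close
    End Sub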


🔁 Final Thought

When a solution is slow, clunky, or breaks under pressure, it’s often because of a detour — a misguided attempt to force a tool into a role it wasn’t meant for. Just like asking why planes can’t snorkel with the fish.

Power Query is a detour. Cubes are a detour. If you don’t need that detour, don’t take it.

Instead, look at Excel for what it really is: a client interface in a client-server system. And when you treat it that way — using ADO and native data components — it becomes the most powerful, flexible, and responsive enterprise tool you already own.


TL;DR Summary:

  • Excel has built-in bidirectional data flow via COM and ADO.
  • Power Query and cubes introduce unnecessary detours for many use cases.
  • These detours make Excel sluggish and impractical for live input workflows.
  • Enterprise Excel is not about flashy features. It’s about the right architecture.
  • Don’t use submarine logic for an aircraft problem.

Choose your tools with understanding — not because they’re trending.

Hiran de Silva
