Most Qlik to Power BI migration guides stop at "plan your migration and test your reports." That's not a guide — that's a checklist. This post goes deeper: the architectural decisions that actually matter, where enterprise migrations break down, and what a realistic execution looks like from someone who has run these projects at scale.
At KPI Partners, we've migrated organizations with hundreds of Qlik apps, complex Set Analysis logic, and QVD-based pipelines built up over 5–8 years. Here's what we've learned.
Why Organizations Are Moving, And Why Now
The trigger is usually licensing. Qlik's per-user model made sense when BI was centralized in a team of 15 analysts. As organizations push self-service analytics to finance, operations, HR, and executive teams, per-user costs compound fast. We've seen companies spending millions annually on Qlik coverage they could replicate on Power BI Pro for a fraction of that — sometimes 60–70% less.
But cost is the trigger, not the full reason. What sustains the decision is Microsoft ecosystem fit. Organizations running Azure, Microsoft 365, and Teams find that Power BI isn't a tool they need to integrate — it's already part of the stack. Reports embed in Teams channels without configuration. Governance flows through Microsoft Purview. Identity and access management runs through Entra ID. Managing Qlik alongside that stack creates friction that compounds over time.
The third driver is AI readiness. Microsoft Fabric — which Power BI sits inside — is the path toward unified data engineering, real-time analytics, and AI-driven reporting. Qlik does not have an equivalent roadmap in a Microsoft-first environment. Organizations building toward intelligent analytics are making this migration now rather than later because the gap will only widen.
The Core Technical Problem: Two Different Engines
This is where most migration guides gloss over the hard part.
Qlik's associative engine works by loading all data into memory and creating dynamic associations between tables at query time. You don't define relationships — Qlik infers them. A user clicking a filter in one chart instantly propagates that selection across every connected dataset, regardless of how the tables relate. This is what makes Qlik feel so fluid to analysts. It also means the data models built on top of it often have no explicit structure — they rely on Qlik doing the association work automatically.
Power BI's VertiPaq engine is also in-memory, but it works completely differently. Relationships must be explicitly defined. The engine performs best with a star schema: one central fact table connected to dimension tables via clearly defined keys, with single-direction relationships wherever possible. Many-to-many relationships are supported but come with performance costs.
The practical consequence: you cannot migrate a Qlik data model directly into Power BI. The model needs to be redesigned. And in organizations where Qlik has been in production for years, those data models often contain synthetic keys — which Qlik generates automatically when two tables share multiple common field names. Synthetic keys are a signal that the data model was never explicitly designed; Qlik handled the ambiguity for you. In Power BI, you have to resolve it yourself.
Resolving synthetic keys typically means one of three things: renaming fields so tables join on a single unambiguous key, creating a bridge table to handle the many-to-many relationship explicitly, or restructuring which table owns the foreign key. None of these is difficult individually. In a data model with 40 tables and 15 synthetic keys, working through them systematically takes time.
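The bridge-table option above can be sketched outside of Power BI to make the mechanics concrete. This is an illustrative pandas example with made-up table and column names (`sales`, `targets`, `RegionProductKey`), not the actual migration tooling: two tables sharing two field names are joined through a single surrogate key instead of the ambiguous pair Qlik would have resolved with a synthetic key.

```python
import pandas as pd

# Hypothetical tables that share two field names ("Region", "Product"),
# which Qlik would resolve automatically with a synthetic key.
sales = pd.DataFrame({
    "Region": ["EMEA", "EMEA", "APAC"],
    "Product": ["A", "B", "A"],
    "Amount": [100, 250, 75],
})
targets = pd.DataFrame({
    "Region": ["EMEA", "APAC"],
    "Product": ["A", "A"],
    "Target": [120, 90],
})

# Explicit resolution: collapse the ambiguous field pair into a single
# surrogate key and join both tables through a bridge table.
bridge = (
    pd.concat([sales[["Region", "Product"]], targets[["Region", "Product"]]])
    .drop_duplicates()
    .reset_index(drop=True)
)
bridge["RegionProductKey"] = bridge.index + 1

sales = sales.merge(bridge, on=["Region", "Product"]).drop(columns=["Region", "Product"])
targets = targets.merge(bridge, on=["Region", "Product"]).drop(columns=["Region", "Product"])

print(sales.columns.tolist())  # ['Amount', 'RegionProductKey']
```

In the Power BI model, the bridge table becomes a dimension with single-direction relationships to both fact tables, which is exactly the star-schema shape VertiPaq prefers.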
Set Analysis to DAX: The Expression Problem
Qlik's Set Analysis is the feature migration projects consistently underestimate. It lets analysts write measures that calculate across custom subsets of data, completely independent of whatever filters the user has applied to the dashboard. It's powerful, it's widely used, and it has no direct equivalent in Power BI.
DAX handles the same problem through CALCULATE, which modifies the filter context a measure evaluates in. The logic is equivalent — but the syntax, the mental model, and the edge cases are different enough that you can't automate a simple search-and-replace across all expressions.
A simple Set Analysis expression like summing sales for a fixed year maps cleanly to a CALCULATE with a filter condition. An expression that uses set operators to union or intersect multiple data subsets, or that references variables defined elsewhere in the Qlik script, requires careful analysis before you can write the DAX equivalent. Nested Set Analysis expressions — where a set modifier references the result of another set expression — are genuinely complex and need to be handled case by case.
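For the simple end of that spectrum, the mapping looks like this. Table and column names here (`FactSales`, `Year`) are illustrative, not from any specific environment:

```
// Qlik Set Analysis: sum Sales for 2023, regardless of the user's Year selection
Sum({< Year = {2023} >} Sales)

// A DAX equivalent: CALCULATE replaces any existing filter on the Year
// column with Year = 2023, mirroring the set modifier's override behaviour
Sales 2023 = CALCULATE ( SUM ( FactSales[Sales] ), FactSales[Year] = 2023 )
```

The equivalence holds because a column filter argument inside CALCULATE replaces existing filters on that column, just as a set modifier replaces the current selection on that field. It is the set operators, variable references, and nested sets that have no such one-line translation.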
In environments we've assessed, it's common to find 200–400 distinct Set Analysis expressions spread across reports. At that volume, manual conversion doesn't scale. This is one of the primary reasons we built automated expression parsing into our migration utility — the tool identifies each expression, classifies its complexity, generates a DAX equivalent where the pattern is clear, and flags the complex cases for human review.
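A minimal sketch of what expression classification can look like. The rules and labels below are illustrative assumptions, not the logic of the actual utility: it keys off a few syntactic signals (set operators, `$(variable)` references, multiple set modifiers) to decide whether an expression is safe to auto-convert or needs human review.

```python
import re

# Illustrative complexity classification for Qlik Set Analysis expressions.
# Rules and labels are assumptions for the sketch, not the real tool's logic.
def classify(expr: str) -> str:
    body = expr.strip()
    if "{<" not in body:
        return "plain"                        # no set modifier at all
    if re.search(r"[+\-*/]\s*\{", body):      # set operators (union, intersect, ...)
        return "complex: set operators"
    if "$(" in body:                          # references a Qlik variable
        return "complex: variable reference"
    if body.count("{<") > 1:
        return "complex: nested sets"
    return "simple: single set modifier"

print(classify("Sum({<Year={2023}>} Sales)"))  # simple: single set modifier
print(classify("Sum({<Year={2023}>} * {<Region={'EMEA'}>} Sales)"))  # complex: set operators
```

At a few hundred expressions, even a crude classifier like this turns an undifferentiated pile of work into a prioritized queue: bulk-convert the simple bucket, route the complex buckets to engineers.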
Understanding Your Qlik Assets Before You Start
Qlik stores data and logic in three formats you'll encounter during any migration.
QVD files are binary data extracts — optimised for fast reads within Qlik but not natively readable by Power BI. The migration path is to extract the underlying data to a staging layer, typically parquet files on Azure Data Lake Storage or Fabric Lakehouse tables, and connect Power BI to those sources directly.
QVF files are Qlik Sense app files — they contain the data model, all load scripts, all measures and calculated dimensions, and the dashboard layouts. These are the primary objects you're migrating.
QVW files are QlikView documents — the legacy format, typically with more complex scripting and older data models. They're often found in organizations that started on QlikView before Qlik Sense existed and never fully completed the move to Qlik Sense.
Before starting a migration, you need a complete inventory across all three. How many objects exist, how complex they are, which ones are actively used, and what dependencies exist between them. Organizations consistently discover during this phase that a significant portion of their Qlik environment is redundant — reports that haven't been opened in over a year, duplicate dashboards built by different teams for the same purpose, data models that load the same source tables multiple times. Migration is an opportunity to rationalize, not just copy.
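A first-pass inventory can be as simple as walking the Qlik file shares. The sketch below is illustrative: it counts assets by type and flags files untouched for over a year as decommissioning candidates. Last-modified time is only a proxy for usage; real usage data should come from Qlik's own audit logs, and the one-year threshold is an assumption.

```python
import time
from pathlib import Path

# Illustrative inventory pass over a Qlik file share: count assets by type
# and flag anything not modified in over a year. Threshold is an assumption.
def inventory(root: str, stale_days: int = 365):
    counts = {"qvd": 0, "qvf": 0, "qvw": 0}
    stale = []
    cutoff = time.time() - stale_days * 86400
    for path in Path(root).rglob("*"):
        ext = path.suffix.lower().lstrip(".")
        if ext in counts:
            counts[ext] += 1
            if path.stat().st_mtime < cutoff:
                stale.append(str(path))
    return counts, stale
```

Even this crude pass usually shrinks the migration scope noticeably before any engineering work starts.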
Security: Section Access to Row-Level Security
Qlik implements data-level security through Section Access — a separate section of the load script that defines which users or groups can see which rows of data. The logic lives in the script, tied to user identifiers.
Power BI implements Row-Level Security through DAX roles, integrated with Microsoft Entra ID. Each role contains a DAX expression that filters the data model for users assigned to that role. For simple region-based or entity-based access, the conversion is straightforward: the DAX expression filters the relevant dimension table, and the relationship propagation handles the rest.
Where it gets complex is group-based access, hierarchical security, or Section Access tables with multiple fields controlling access at different levels. These need to be carefully mapped before migration — document every access rule in your current Qlik environment, define the equivalent RLS roles in Power BI, and validate access behaviour with test accounts for each role before decommissioning Qlik.
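The two most common role shapes look like this. All table and column names below (`Region`, `UserEmail`, the mapping table) are illustrative:

```
-- Static role: members of an "EMEA Analysts" role see only EMEA rows.
-- Applied as a filter expression on the Region dimension table.
[Region] = "EMEA"

-- Dynamic role: filter a user-to-region mapping table by the signed-in
-- identity; relationship propagation then restricts the fact table.
[UserEmail] = USERPRINCIPALNAME()
```

The dynamic pattern is usually the right target for converted Section Access tables, because the Qlik script's user-to-value mappings translate directly into rows of the mapping table rather than into dozens of static roles.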
One important difference: Qlik's Section Access can control visibility at the sheet level. Power BI RLS only controls data visibility, not page or visual visibility — hiding a report page hides it for everyone, not for specific users. If your current Qlik deployment uses Section Access to hide entire sheets from certain users, you'll need to handle that differently in Power BI — typically through separate reports per audience, or by distributing a single report through a Power BI app with multiple audiences, combined with access control at the workspace level.

Running the Migration: A Realistic Sequence
Phase 1 — Assessment
Before writing any DAX or touching any data model, run a complete inventory of your Qlik environment. Every app, every object, every expression, every data source, every scheduled reload, every Section Access rule. Complexity-score each object. Identify what gets migrated, what gets decommissioned, and what gets redesigned rather than rebuilt.
Phase 2 — Architecture
Design the target state before building anything. Define your Power BI semantic model structure. Decide what lives in Fabric Lakehouse versus Power BI Import mode versus DirectQuery. Design your workspace structure and deployment pipeline. Define your governance model — who owns which datasets, how refresh schedules work, what the certification process looks like for promoted datasets.
Phase 3 — Data model migration
Move QVD data to the staging layer. Resolve synthetic keys and circular references. Rebuild data models as star schemas. Define explicit relationships. Validate row counts and schema accuracy before building any reports on top.
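The validation step at the end of Phase 3 can be mechanical. A minimal sketch, using DataFrames to stand in for the real source and target connections; the checks shown (row counts, lost columns) are the floor, not the full validation suite:

```python
import pandas as pd

# Minimal staging validation: compare row counts and column sets between
# the original extract and the rebuilt star-schema table before any
# report is built on top. DataFrames stand in for real connections.
def validate(source: pd.DataFrame, target: pd.DataFrame) -> list[str]:
    issues = []
    if len(source) != len(target):
        issues.append(f"row count mismatch: {len(source)} vs {len(target)}")
    missing = set(source.columns) - set(target.columns)
    if missing:
        issues.append(f"columns lost in migration: {sorted(missing)}")
    return issues
```

An empty issue list is the gate for moving a table into Phase 5; anything else goes back to the modeling step.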
Phase 4 — Expression conversion
Convert Set Analysis to DAX systematically. Start with the high-frequency simple patterns, then work through the complex cases. Document every conversion decision for future maintainability.
Phase 5 — Report reconstruction
Rebuild dashboards in Power BI. Preserve drill-through paths, cross-filter behaviour, bookmarks, and filter pane logic. Test with real user scenarios, not just visual spot-checks.
Phase 6 — Parallel validation
Run Qlik and Power BI simultaneously for 2–4 weeks on all critical reports. Compare aggregated values, test every filter combination that matters, verify RLS access controls, and get explicit sign-off from report owners before any decommissioning begins.
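The aggregate comparison in the parallel run benefits from a small tolerance, since the two engines can legitimately differ in floating-point rounding. A sketch with illustrative metric names and an assumed 0.5% tolerance:

```python
# Parallel-run reconciliation sketch: compare key aggregates exported from
# Qlik and Power BI for the same period, allowing a small relative tolerance
# for floating-point differences. Metric names and tolerance are assumptions.
def reconcile(qlik: dict, powerbi: dict, tol: float = 0.005) -> dict:
    results = {}
    for metric, q_val in qlik.items():
        p_val = powerbi.get(metric)
        if p_val is None:
            results[metric] = "missing in Power BI"
        elif q_val == 0:
            results[metric] = "match" if p_val == 0 else "mismatch"
        else:
            rel = abs(q_val - p_val) / abs(q_val)
            results[metric] = "match" if rel <= tol else f"mismatch ({rel:.2%})"
    return results
```

Anything flagged as a mismatch or missing blocks sign-off for that report until the discrepancy is explained and resolved.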
How Automation Changes the Scale Problem
A team of three engineers manually migrating 200 Qlik apps — doing discovery, expression conversion, data source mapping, and report reconstruction — is looking at 4–6 months of work at minimum. That assumes no rework, which is optimistic.
At KPI Partners, we built our Qlik to Power BI Migration Utility to compress that timeline. The utility scans QVD, QVF, and QVW assets automatically, produces the full inventory, handles data source mapping to Power BI semantic models, automates expression conversion for the patterns it can handle reliably, and reconstructs report structures. It brings total migration time down by up to 90%.
What we've found in practice: automation reliably handles the repeatable 60–70% of the work. The remaining 30–40% — complex Set Analysis, unusual data model patterns, security edge cases — benefits from experienced engineers making judgment calls. The utility doesn't replace that judgment; it clears the repetitive work so engineers can focus where it actually matters.
We offer a free migration assessment: run the utility against your Qlik environment and get a full inventory, complexity breakdown, and realistic effort estimate before you commit to a project plan or a vendor.
The Mindset That Separates Good Migrations from Bad Ones
Every migration project we've seen struggle had the same root cause: it was treated as a technical task rather than an architectural redesign. Engineers rebuilt Qlik reports in Power BI as faithfully as possible — same layout, same logic, same structure — and ended up with Power BI dashboards that underperformed because the underlying model was Qlik-shaped, not Power BI-shaped.
The migrations that go well start with the question: given what we know now, how would we build this in Power BI from scratch? The answer is usually different from what exists in Qlik. Better data models, cleaner report structures, fewer duplicates, clearer governance. Migration done right leaves the organization in a better analytics state than they were in before — not just on a different platform.
Start with an honest inventory. Design before you build. Validate before you decommission. That sequence, applied consistently, is what makes Qlik to Power BI migration succeed at enterprise scale.