How Companies Can Use Composite Models in Power BI as a Game Changer for Self-Service Business Intelligence
Data & Analytics, Microsoft Power BI
With Composite Models, Microsoft significantly expands the range of applications of Power BI for business departments. At the same time, central business intelligence and IT departments gain a very powerful self-service BI tool that gives the departments new freedom and possibilities while respecting their governance guidelines. This article is therefore aimed both at advanced users of Power BI Desktop in controlling and at those responsible in central IT or business intelligence. Used correctly, Composite Models open up a new world for self-service BI.
The use case
Every Microsoft Power BI report developer or "power user" is familiar with the following situation: a new Power BI dashboard or report is to be created on the basis of a Power BI dataset provided centrally by the BI or IT department. In the best case, this dataset is already well suited to the reporting question at hand and, in addition to the required data at the appropriate granularity, contains centrally maintained and agreed logic for the key figures to be evaluated. Nevertheless, it often happens that a few additional pieces of information are missing from the dataset for the requested report. There are many conceivable use cases for such small extensions to a central dataset, which are required at short notice for certain reports and ad-hoc analyses and may even be needed only temporarily:
- Individual groupings of customers, products, etc., either as calculated columns in the corresponding dimension tables of the central model or as additional tables related to specific keys of an existing dimension table of the central model.
- Plan and forecast values from planning tools that are to be imported and related to the central dimensions and thus to the actual values of the central model.
- Benchmark values from the internet, web-service-based sources, etc.
- Individually maintained threshold values that can be used in DAX logic for ABC analyses, Pareto analyses, etc.
- …
As a power user, however, you are not authorized to extend the central dataset, for data governance reasons. Although the required extension might be relatively easy to implement (e.g. as a simple additional calculated column in an existing table, or as an additional fact table for benchmark values), it is not added to the central dataset (fast enough) for reasons such as these:
- IT staff or specialists have no free capacity to extend the central dataset at short notice.
- The extension would be irrelevant to most other dataset users, would only confuse them, and would jeopardize the acceptance of the central dataset.
- The extension is only temporarily relevant for a few ad-hoc analyses and becomes obsolete after some time. It would then either have to be removed again with additional effort or remain as dead weight in the analysis model.
- According to the IT experts, the extension must be implemented "cleanly", which would take several days or weeks, including the work in the central data warehouse. Here again, a conflict arises over priorities and resource availability in the central BI department.
- …
This situation creates a major dilemma for the power user. One possible solution would be to agree with the requester of the report on a more limited scope without the necessary extension. If this is not a real option - which is the rule - the power user has to create their own data basis in self-service with Power BI Desktop, extracting data from the central dataset or importing it into their own model and linking it there with the required additional information. In addition to significantly higher effort, this procedure has many other serious disadvantages. All conceivable aspects regarding data governance, security, data protection, IT alignment, flexibility, etc. actually speak clearly for a temporary extension of the central model - in contrast to the self-service procedure now required, with its export/import mechanisms in Power BI Desktop or other tools. Especially if, in the worst case, Excel, Access, etc. are also used for this purpose, a shadow IT quickly flourishes in the departments instead of the desired, ideally integrated self-service business intelligence. A responsible power user is well aware of this, but in this case there is no other choice, is there? With Composite Models for Microsoft Power BI, there is an elegant solution to this dilemma...
What are Composite Models?
Since December 2020, a powerful feature has been available for Microsoft Power BI under the official name "DirectQuery for Power BI datasets and Analysis Services" (see the Microsoft documentation), with which companies can reach an unprecedented level of self-service BI.
This functionality is already included in Power BI Pro, so it is not(!) an exclusive Power BI Premium feature. Alberto Ferrari and Marco Russo of SQLBI call its introduction a "milestone in Business Intelligence" and, instead of the somewhat unwieldy official name, simply refer to it as "(new) Composite Models" (see the SQLBI article). We at noventum want to follow the catchier SQLBI name, although Microsoft officially uses this term for models that combine import and DirectQuery data sources (cf. Microsoft documentation). So when we talk about "Composite Models" in the following, we mean the feature Microsoft officially calls "DirectQuery for Power BI datasets and Analysis Services".
Enough of the confusing terminology... What exactly are these Composite Models? In short, they allow you to extend existing Power BI datasets and (Azure) Analysis Services databases (i.e. "tabular models" based on the in-memory engine VertiPaq, cf. Microsoft documentation). This sounds very simple at first. If you think about it in more detail, however, you realize that this is a very powerful tool with complex logic working in the background. Extensions of existing analysis models were previously only possible at a dedicated aggregation level - e.g. by importing the results of DAX queries, or via comparable mechanisms in other tools. Composite Models, by contrast, allow an extension of the entire (complex) analysis model with all its tables, columns, aggregation rules, etc., down to the data-warehouse granularity contained therein. Exactly this fact makes the feature, from our point of view, a real game changer for self-service business intelligence!
Typically, when using a Live Connection to a tabular model (Power BI dataset or Analysis Services database), Microsoft Power BI Desktop becomes a pure report-building tool. All modeling options, the connection of additional data sources via "Get Data", and the table view are grayed out or deactivated; only the creation of new, local DAX measures is possible. With the new preview feature, this finally changes for Live Connections to Power BI datasets, Azure Analysis Services (AAS) and, starting with the SQL Server 2022 release, SQL Server Analysis Services (SSAS).
A quick review: shortly after the birth of Microsoft Power BI in 2015, the tabular in-memory engine for Analysis Services (VertiPaq) was also made available in the Standard Edition of SQL Server with the release of SQL Server 2016 - it was no longer an expensive Enterprise Edition feature. Even in our first projects with the tabular SSAS version of SQL Server 2016 and Power BI as the reporting tool, the following question came up every time after establishing the Live Connection to a central SSAS model: "This is great. Now I can easily create reports on the central SSAS analysis model. Can I also extend it in self-service with Power BI functionality, for example with data from a local file? Can I add a calculated column or similar?" Our answer was always somewhat meek: "Unfortunately, in this case only the creation of new, local DAX measures is possible. But we are convinced that Microsoft's development will go in that direction, so that such extensions should be possible (soon)." Since December 2020, we can proudly claim that we were not lying at the time. True, we had hoped we would not have to wait that long. But better late than never.
How do composite models work?
Since Composite Models are still a preview feature despite their introduction in December 2020, they must first be activated in the Power BI Desktop options:
If you then open a Live Connection to an existing Power BI dataset in the cloud (or to an Analysis Services database in the cloud or on-premises), the trained eye will immediately notice that the "Get Data" button is no longer grayed out, and that the modeling tab and the status bar (bottom right) offer the option to make changes to the analysis model just connected via Live Connection. A confirmation follows that this step creates a local model in which the Live Connection is replaced by a DirectQuery connection (this step cannot be undone for the PBIX file!), and afterwards you select which tables of the tabular model to include.
However, "make changes to this model" is a misleading phrase. The tabular model previously integrated via Live Connection - the so-called "Remote Model" - is not changed at all. It remains unchanged in the background and is merely linked into the Local Model that has just been created. Nor is the data of the Remote Model contained in the Local Model; it physically remains in the Remote Model - in the data view, for example, the tables of the Remote Model only show the text "This table uses Direct Query and cannot be shown". From this point on, you only make changes to the Local Model you have just created.
If you publish the Local Model to the Power BI Service, the lineage view shows its dependency on the Remote Model:
In the Local Model it is possible to change certain properties of Remote Model objects (e.g. names, visibility), but these are only "cosmetic" adjustments that affect the included objects in the Local Model and do not change the Remote Model itself. Real changes to the content of the Remote Model are not possible: no relationships in the Remote Model can be deleted or changed, no new relationships between Remote Model tables can be created, no DAX definitions of Remote Model measures can be modified, etc.
The Local Model is not there to change the included Remote Model, but to extend it! Besides new DAX measures (already possible before), new calculated columns can now be added to Remote Model tables, for example. The most important point for extensions is that other data can now be imported into the Local Model (or connected via DirectQuery) through the complete "Get Data" dialog with all its data sources - from relational SQL Server and flat-file connections through web services to individual interfaces for SAP, Hadoop, and so on! This information can then be combined with the tables and data of the Remote Model via relationships and in DAX logic.
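To make this concrete, suppose a small benchmark table has been imported into the Local Model and related to a dimension of the Remote Model. A local measure could then compare a centrally defined measure from the Remote Model against the imported benchmark. All names here ('Benchmark', [Revenue], the column names) are hypothetical examples, not objects of any real model:

```dax
// Local measure in the composite model (all names are illustrative):
// compares the centrally defined [Revenue] measure from the Remote Model
// with benchmark values imported into the local 'Benchmark' table.
Revenue vs. Benchmark % =
DIVIDE (
    [Revenue] - SUM ( 'Benchmark'[Benchmark Revenue] ),
    SUM ( 'Benchmark'[Benchmark Revenue] )
)
```

The central [Revenue] logic stays untouched in the Remote Model; only the comparison lives locally.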
Fields of application for composite models
The functionality of Composite Models will probably excite every advanced Microsoft Power BI user, and a multitude of interesting self-service BI use cases come to mind (see above).
But be careful! With great power comes great responsibility. It is important to use Composite Models correctly and sensibly right from the start. Despite his euphoric announcement of Composite Models, Marco Russo of SQLBI warns: "However, an easy prediction is that this feature will also be used in the wrong way. Beware of new architectures based on new composite models. I am already scared of diagrams showing tens of data sources fed into a "virtual" semantic model connected to smaller datasets." (cf. SQLBI article). The more of the following questions you can answer with "yes", the more likely your scenario is a suitable application for self-service BI using Composite Models. If you mostly answer "no", Composite Models tend to be the wrong choice.
- Do you really want to "extend" the central dataset only in the narrow sense (additional columns, tables, etc.)? Composite Models are not(!) suitable for changes to the Remote Model, nor are they a suitable means of virtually combining separate datasets.
- Can the additionally integrated data be related relatively easily to the data and structures of the Remote Model - if necessary with some Power Query preparation (uniform keys, granularities, etc.)?
- Is the extension of the central dataset only relevant for some/few users of this dataset? Is it therefore an individual extension that might even irritate other dataset users?
- Is the extension maybe only temporary? For example, is it only required for a few weeks for a short-term, quite operational question and then no longer relevant?
- Is the data volume of the extensions rather small? In other words: Are the cardinalities for relationships between the new tables of the local model and existing tables of the remote model rather small?
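Regarding the second question above: if a clean physical relationship is hard to establish, a virtual relationship via the DAX function TREATAS can sometimes bridge the gap, usually at some performance cost. A minimal sketch, assuming a hypothetical imported 'Targets' table and a 'Customer' dimension in the Remote Model:

```dax
// Local measure applying the imported 'Targets' table as a virtual
// filter against the Remote Model's 'Customer' dimension
// (all table and column names are illustrative assumptions).
Target Revenue =
CALCULATE (
    SUM ( 'Targets'[Target Value] ),
    TREATAS ( VALUES ( 'Customer'[CustomerKey] ), 'Targets'[CustomerKey] )
)
```

For anything beyond small, temporary extensions, a proper physical relationship remains the preferable design.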
Even if you answer "no" to most of the above questions, Composite Models are still often preferable to an alternative, individual export/import solution for self-service BI. This is especially true if the required extension would indeed be best located directly in the centrally provided dataset but cannot be implemented fast enough by the central BI department due to time and resource constraints. In that case, a development based on Composite Models is much easier to transfer to the central model later, because it is written in the "same language" as that of the central BI department.
If you want to learn more about how Composite Models can become a game changer for self-service BI in your company, too, just contact us. We will be happy to help and to advise you on the technical details Composite Models use behind the scenes, as well as the consequences for their use and their optimal integration into your central BI strategy.
noventum consulting GmbH
Münsterstraße 111
48155 Münster