We would like to walk through a Continuous Integration – Continuous Deployment (CICD) example for developing and deploying Power BI reports whose data source is CDS/Dataverse. MVPs and the Power BI community have demonstrated several approaches in the past few years, and the tooling is “continuously” being developed and refined by BI experts.
There are examples of CICD with Azure DevOps, and the Microsoft Power BI team has been developing Deployment Pipelines in the Power BI Service. We are heading toward the solution that best meets our needs, although an end-to-end automated CICD solution is complex to achieve. We see CICD with Azure DevOps as a great solution to implement; however, it has a steep learning curve and requires some development effort. Power BI Deployment Pipelines is a great “no code” option in the Power BI Service, with more features coming on the roadmap.
We experimented with the following steps in our CICD example:
- Power BI Reports with parameters to change data source connections from Dev to QA
- Power BI pbix file source control in Azure DevOps Repository with Git
- CICD automation in Azure DevOps triggered when pbix file is committed in a folder in repository
- Adding Workspaces created by CICD to Deployment Pipelines in Power BI Service
- Power BI report review request notification before promoting to the next stages
- Comparing Changes in Power BI reports between stages in Power BI Service Deployment Pipelines
Power BI Reports with parameters to change data source connections from Dev to QA
It is a best practice to use parameters to switch data sources in Power BI reports. We switch the data source from Development (Dev) to QA by updating the data source name through a parameter in Power Query. These parameters are recognized in the Azure DevOps release configuration, and Power BI Service Deployment Pipelines also recognize them in their parameter rules.
In our example, we have a Dynamics 365 Sales application development environment and a production environment. Each environment holds a different volume of data. The data source URL is “https://abc_dev.crm.dynamics.com” for the development environment and “https://abc.crm.dynamics.com” for the production environment. Please note that abc is a fictitious name.
Let’s take a look at a sample Power BI report using data source parameters. We are using the Common Data Service (CDS)/Dataverse connector and connecting to the Account, Opportunity, and User entities. Set the parameters as follows:
We added “Instance Name” as a parameter and referenced it in the source string of the M query. Below is an example for the Account entity; the same configuration is applied to Opportunity and User.
Refresh the data against the Dev environment and the report looks like the below:
Power BI pbix file source control in Azure DevOps Repository with Git
We use Git for source control and store Power BI report pbix file(s) in a repository in Azure DevOps. We created a file structure in Azure DevOps like the one below:
The Dev folder stores Power BI reports developed with Power BI Desktop on local machines or a VM development environment. We pull the dev pbix file to our local/VM environment before starting development, and we publish the report to My Workspace in the Power BI Service for unit testing. Then we commit the dev pbix file and create a pull request to merge it into the Dev folder in the repository.
We use the Dev and Publish folders in this example. The CICD YML folder contains the yml file for pipeline development.
A DevOps repository may not be the best storage location for pbix files. We cannot view the underlying model (TOM) to see what changed simply by opening the pbix file; it has to be downloaded to the local/VM environment and opened with a third-party tool such as Tabular Editor. Moreover, if the pbix file uses import mode and contains a large volume of data, it quickly consumes the storage space of your Azure DevOps account. Therefore, it is a best practice to add filters to the datasets and limit the number of records imported into the pbix file with parameters.
CICD automation in Azure DevOps triggered when pbix file is committed in a folder in repository
We may have a team of developers working on a single dev pbix file. As of the writing of this blog, the pbix file format does not allow a team of developers to work on the same file concurrently, so one developer may work on the data source and data model while another works on visualizations, and so on. They must therefore get the up-to-date dev pbix file onto their local machines or VM environment before they start working on the report. Once the dev pbix file is reviewed and ready to be published to the development workspace (Dev) in the Power BI Service, we remove the “_dev” suffix from the file name and move the pbix file to the “Publish” folder in the repository.
By design, this action triggers the pipeline and publishes the pbix file to the Dev workspace in the Power BI Service. The DevOps pipeline is configured to create the Dev workspace used by the Deployment Pipeline in the Power BI Service and to deploy the pbix file to it automatically.
Pipeline (Build)
We configured a build pipeline with a yml file and set up two tasks: one picks up the pbix files in the Publish folder of the repository, and the other moves them to the ‘drop’ artifact from which the pbix files are deployed to the Dev workspace later.
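The yml itself is essentially a copy-and-publish job; as a rough sketch of what those two tasks amount to (illustrative only, not the actual pipeline definition, using this example's folder names), the logic looks like this:

```python
# Illustrative sketch of the two build tasks; the real build uses the standard
# copy/publish tasks defined in the yml file in the CICD YML folder.
import glob
import os
import shutil

SOURCE_DIR = "Publish"  # repository folder that holds release-ready pbix files
DROP_DIR = "drop"       # staging folder that is published as the 'drop' artifact

os.makedirs(DROP_DIR, exist_ok=True)
for pbix in glob.glob(os.path.join(SOURCE_DIR, "*.pbix")):
    # Task 1 picks up each pbix file; task 2 publishes the staged copies as the build artifact.
    shutil.copy2(pbix, DROP_DIR)
    print(f"Staged {pbix} -> {DROP_DIR}")
```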
Release
We configured an artifact and stages in Releases. We give the artifact a name and set a trigger to deploy the pbix files to the Dev stage of the release pipeline in Azure DevOps.
We set up the Dev environment stage: click “+ Add” to create a stage, name it “Dev Env,” and create a job with four tasks in the stage.
We create an agent job and leave it at the defaults. Under “Artifact download” we specify the “publish” folder as the artifact location and pick up all the pbix files in that folder.
We add four tasks in sequence, using an extension called “Power BI Actions.” The Microsoft Power BI team released an extension called “Power BI automation tools (Preview)” in January 2022 that provides actions for Power BI Deployment Pipelines, but we focus only on the Power BI Actions extension in this blog.
We installed Power BI Actions from the Marketplace.
First, we add an action to create a workspace called “Sales Dev” in the Power BI Service. The extension provides a template, so we configure the connection to the Power BI Service and select a Power BI API for the action.
Please note that we need to configure a connection to the Power BI Service beforehand. It is a best practice to set up and use a *service principal for the tasks; however, as of writing this blog, we experienced errors in some actions, so we use both the service principal and an admin user for authentication. Our service principal name is sp-pbi-cicd-dpl.
*Resources: how to set up a service principal for Power BI Service
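For reference, here is a minimal sketch of what the workspace-creation action does under the hood when authenticating with the service principal and calling the Power BI REST API. The tenant id, client id, and secret placeholders are ours to fill in; this is not the extension's actual implementation.

```python
# Minimal sketch: acquire a token as the service principal (sp-pbi-cicd-dpl) and
# create the "Sales Dev" workspace via the Power BI REST API.
import msal
import requests

TENANT_ID = "<tenant-guid>"
CLIENT_ID = "<sp-pbi-cicd-dpl application id>"
CLIENT_SECRET = "<sp-pbi-cicd-dpl client secret>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"]
)["access_token"]
headers = {"Authorization": f"Bearer {token}"}

# Create the workspace. The tenant setting that allows service principals to
# use Power BI APIs must be enabled for this call to succeed.
resp = requests.post(
    "https://api.powerbi.com/v1.0/myorg/groups",
    params={"workspaceV2": "True"},
    headers=headers,
    json={"name": "Sales Dev"},
)
resp.raise_for_status()
workspace_id = resp.json()["id"]
print("Created workspace:", workspace_id)
```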
Secondly, we add an admin user to the Dev Workspace in Power BI Service. We configure as below:
Thirdly, we add a security group where the service principal and other users will be accessing the Sales Dev workspace in Power BI Service. We configure as below:
Note: the security group’s permission on the workspace is set to “Admin,” so we might want to create another security group with “Member” permission to manage a group of users, or additional groups for other permissions such as Contributor.
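These two tasks map to the workspace users endpoint of the Power BI REST API. Below is a sketch with placeholder identities; the admin address and the group's Azure AD object id are illustrative, not values from this example.

```python
# Sketch of the "add workspace user/group" calls; reuses the token and
# workspace id from the previous sketch.
import requests

headers = {"Authorization": "Bearer <access token from the previous sketch>"}
workspace_id = "<Sales Dev workspace id>"
users_url = f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/users"

# Second task: add the admin user with the Admin role.
requests.post(
    users_url,
    headers=headers,
    json={"emailAddress": "admin@abc.com", "groupUserAccessRight": "Admin"},
).raise_for_status()

# Third task: add the pbi-cicd-poc security group; "identifier" is the group's
# Azure AD object id. Use "Member" or "Contributor" here for a second group
# that should have a lower permission level.
requests.post(
    users_url,
    headers=headers,
    json={
        "identifier": "<pbi-cicd-poc object id>",
        "principalType": "Group",
        "groupUserAccessRight": "Admin",
    },
).raise_for_status()
```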
Lastly, we add an action to publish Power BI report(s) to the Sales Dev workspace.
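This last action corresponds to the Imports endpoint, which uploads a pbix file into the workspace. A sketch follows; the file name and path are illustrative.

```python
# Sketch of publishing a pbix file from the drop artifact to the Sales Dev workspace.
import requests

headers = {"Authorization": "Bearer <access token>"}
workspace_id = "<Sales Dev workspace id>"

with open("drop/Sales.pbix", "rb") as pbix:
    resp = requests.post(
        f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/imports",
        headers=headers,
        params={"datasetDisplayName": "Sales.pbix", "nameConflict": "CreateOrOverwrite"},
        files={"file": pbix},  # multipart upload of the pbix content
    )
resp.raise_for_status()
print("Import id:", resp.json()["id"])
```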
Once the tasks are configured, we set triggers on the Dev Env stage. We want the four tasks to start after the pbix file(s) become available in the build artifacts.
Adding Workspaces created by CICD to Deployment Pipelines in Power BI Service
The Sales Dev workspace is created with a report and a dataset by the CICD pipeline. The admin user and the security group “pbi-cicd-poc” are added to the workspace with the Admin role.
Security Group:
In this blog, we manually created a Power BI Deployment Pipeline called “CICD Sales” and added the Sales Dev workspace created by the DevOps pipeline.
We also used the Power BI automation tools (Preview) to create a pipeline and to assign capacity to the workspace. It worked when the pipeline was initially created; however, when running the series of tasks a second time, we received a bad request (400) error, so we removed the task from the Dev stage.
We are using a dedicated capacity in this tenant in order to use Power BI Deployment Pipelines. As of writing this blog, there is no action available in the Power BI Actions extension to assign Premium Per User capacity from the DevOps release pipeline.
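For completeness, the Deployment Pipelines REST API exposes the operations we ended up doing manually. The sketch below creates the pipeline and assigns the Dev workspace to its first stage; it assumes the workspace is already on a capacity, and the ids are placeholders.

```python
# Sketch of the Deployment Pipelines REST calls: create a pipeline, then assign
# the Sales Dev workspace to the Development stage (stage order 0).
import requests

headers = {"Authorization": "Bearer <access token>"}

resp = requests.post(
    "https://api.powerbi.com/v1.0/myorg/pipelines",
    headers=headers,
    json={"displayName": "CICD Sales", "description": "CICD example for Sales reports"},
)
resp.raise_for_status()
pipeline_id = resp.json()["id"]

requests.post(
    f"https://api.powerbi.com/v1.0/myorg/pipelines/{pipeline_id}/stages/0/assignWorkspace",
    headers=headers,
    json={"workspaceId": "<Sales Dev workspace id>"},
).raise_for_status()
```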
Power BI report review request notification before promoting to the next stages
We set a pre-deployment approval on the next stage, “QA Env,” in the DevOps pipeline so that deployment proceeds only once the Power BI reports in the Sales Dev workspace in the Power BI Service are approved. When the DevOps deployment to the Dev Env stage completes successfully, the approver (in this example, the admin user) receives an email notification in the Outlook mailbox to approve the Power BI report.
The admin user receives an email from Azure DevOps requesting approval of the QA stage deployment.
The admin can review the Power BI report in Sales Dev workspace in Power BI Service.
The admin clicks the “View approval” button in the email and that takes the approver to the release pipeline in DevOps to approve the deployment to QA Env stage.
Comparing Changes in Power BI reports between stages in Power BI Service Deployment Pipelines
We create a new stage, “QA Env,” by cloning the Dev stage, then update the names and other configurations of the four tasks carried over from Dev Env in the release pipeline. We add two more tasks in this stage: one is Update Datasource Connection and the other is Refresh Datasets.
We were not able to use the service principal to connect to the Power BI Service for this task, so we configured it with user authentication in Azure DevOps.
We set the data source connection string in the “Update parameter json” field, where ‘name’ is “Instance Name” and ‘newValue’ is “abc”, the production data source name. We are using the production application for QA because of the limited number of applications in this example.
We configure the Refresh Datasets task as below and use the service principal for the connection.
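Under the hood, these two tasks correspond to the dataset UpdateParameters and Refreshes endpoints of the Power BI REST API. A sketch with placeholder workspace and dataset ids:

```python
# Sketch of pointing the "Instance Name" parameter at the production instance
# and then refreshing the dataset in the Sales QA workspace.
import requests

headers = {"Authorization": "Bearer <access token>"}
workspace_id = "<Sales QA workspace id>"
dataset_id = "<Sales dataset id>"
base = f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/datasets/{dataset_id}"

# Update Datasource Connection: set the parameter to the production instance name.
requests.post(
    f"{base}/Default.UpdateParameters",
    headers=headers,
    json={"updateDetails": [{"name": "Instance Name", "newValue": "abc"}]},
).raise_for_status()

# Refresh Datasets: trigger a refresh so the report loads data from the new source.
requests.post(
    f"{base}/refreshes",
    headers=headers,
    json={"notifyOption": "NoNotification"},
).raise_for_status()
```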
The deployment to QA Env completes successfully after the approval.
The Sales QA workspace is created, and the report and dataset are deployed. The admin user and the security group are added to the workspace with the Admin role. The dataset refresh succeeds, and its time stamp matches the deployment time of the QA Env stage in this example.
The pie charts in the Sales Dev and Sales QA reports show that the two reports are connected to different data sources.
It is expected that the two workspaces, “Development” and “QA,” are not synced initially in the Deployment Pipeline after the QA workspace deployment completes. We ran the release pipeline in DevOps again and found that they were still not in sync, although the dataset refresh succeeded in the QA workspace.
We manually updated the data source connection from the dev instance to the production instance in the parameter rules of the QA workspace in Deployment Pipelines, setting the instance name to “abc” for the production data source. The “Update Datasource Connection” task in the Azure DevOps release pipeline did not update the data source connection in this example.
After running the release pipeline in DevOps again, Sales Dev and Sales QA showed as synced in Power BI Deployment Pipelines after a browser refresh. In this way, pbix files are always promoted from a single, source-controlled folder in the repository and approved manually in the DevOps pipeline; we do not need to use the “Deploy to test” button to trigger a deployment.
In this example, we walked through the Dev-to-QA deployment process. A QA-to-Production stage can be created by cloning the QA stage job and tasks in the Azure DevOps release pipeline. We managed the deployment with source control and an approval process in DevOps, and we compared reports and datasets between workspaces in the Power BI Deployment Pipeline.
Azure DevOps with “Power BI automation tools” supports deploying existing dataflows created in the Power BI Service, and Power BI Deployment Pipelines released that capability as GA in January 2022. Each tool can fill gaps in the other, and it is possible to use the two together for the end-to-end process. Selecting and combining the features of each is the practical solution today.
Happy Power BI’ing!