Professional chefs use a French kitchen organization approach called “Mise En Place,” which literally means “putting in place.” It ensures that ingredients and utensils are optimally placed for efficient cooking. Power BI report models require the same type of approach to remain maintainable over time. This applies whether you are reporting against Microsoft Project Online, Microsoft Project Server, or Office 365.
This week, we’ll focus on three techniques for the report developer’s back-end configuration. Next week, we’ll cover the front-end configuration as it relates to improving the end-user experience.
Fast is not always good
Power BI allows you to accomplish a lot quickly when it comes to transforming data. However, if you aren’t smart about how you approach your back-end configuration, you could be creating a maintenance nightmare for yourself.
Do you know what is what in a report model that you haven’t touched in months? These three techniques will enable you to easily understand and work with your transformation code over time.
What was I thinking when I did this?
This is a common problem for any developer who has ever worked in a rush. The reasoning behind those fifteen transformation steps was perfectly clear the day you created the model. However, the sands of time have worn away the memories, and now you are doing code archaeology to understand your previous work.
Power BI allows you to make your transformation steps self-documenting.
To make your transformation steps descriptive, do the following:
- Open the query editor
- Select a data set in the left pane. The list of transformation steps is shown in the Applied Steps pane on the right.
- Right-click on the transformation step
- Rename the step with something descriptive
Instead of accepting the default “Grouped Rows” step name, you can rename the step to read “Grouping by Year, Week, Resource to provide weekly work totals.” Doing so creates a coherent story of what is happening and why.
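Renamed steps carry through to the underlying M code: in the Advanced Editor, each step name becomes an identifier, so descriptive names document the query itself. Here is a minimal sketch of how that renamed step might look (the source table and column names are hypothetical):

```m
let
    // Hypothetical source table with Year, Week, Resource, and Work columns
    Source = Excel.CurrentWorkbook(){[Name = "Assignments"]}[Content],
    // The renamed step appears in M as a quoted identifier
    #"Grouping by Year, Week, Resource to provide weekly work totals" =
        Table.Group(
            Source,
            {"Year", "Week", "Resource"},
            {{"Weekly Work", each List.Sum([Work]), type number}}
        )
in
    #"Grouping by Year, Week, Resource to provide weekly work totals"
```

Anyone reading the Advanced Editor later sees the full story of the step, not just “Grouped Rows.”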
What is this data set again?
Power BI provides a default data set name, typically based on the source of the data retrieved. This may not be desirable: the data set may be an amalgamation of data from several sources, with a different intent than that indicated by the first data retrieved. Moreover, this name also appears to the end user and may carry no meaning for them.
To make your data set name descriptive:
- Open the query editor
- Select a data set in the left pane
- On the right side, in the Properties Name box above the Applied Steps, rename the data set to be more meaningful to the user
- If you are using our Conversation-centric design approach, you should already have some candidate names available
- To update, simply type over the value in the Name field and click off of the field to save
- When you save the model, the change will be saved
Below, we renamed AssignmentTimephasedDataSet to Resource Work Summary by Week.
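The new name also becomes the identifier that other queries use to reference this one, so a meaningful name pays off throughout the model. A sketch, assuming a hypothetical follow-on query that builds on the renamed data set:

```m
let
    // Reference the renamed query by its descriptive name
    Source = #"Resource Work Summary by Week",
    // Hypothetical follow-on step: keep only the current year's rows
    #"Filtered to current year" =
        Table.SelectRows(Source, each [Year] = Date.Year(DateTime.LocalNow()))
in
    #"Filtered to current year"
```

Compare that to a model where every reference reads `AssignmentTimephasedDataSet`.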
Where did I put that thing?
Another challenge that you may encounter is trying to find a query data set when you have several in a report model. Power BI allows you to create groups and assign your data sets and functions to a specific group.
If I am working on a model that is in production, I can create a Version 2 group, copy the data set, and move the copy to the Version 2 group. This way, I have both copies and can ensure I don’t accidentally change the wrong version.
In the example below, I used groups as a way of organizing my demos for a webinar. Each demo group number corresponded to a PowerPoint slide, so it was easy to keep my place.
To create a Power BI query group:
- Right-click anywhere in the Query Editor left pane
- Select New Group…
- Fill out the name and the description
- Click OK
The description can be seen when you hover over the group name. It is very helpful for providing additional context.
To move a data set to a Power BI query group:
- Right-click on the data set name
- Select Move to Group
- Select the Group
These three techniques will help you keep things organized and understandable. Next week, we’ll discuss best practices for enabling a great end user experience.