How to: Group Dates by Week in Power BI

This post shows you how to use the date hierarchy and the grouping function to easily group your data by year, month, or week in Power BI. This can be very handy when reporting against time-phased data, such as sales transactions or Project Online schedule and capacity data.

If you need to start your week on a specific day of the week, you can create a new column based on your date, using this DAX statement:

Week Beginning Date = 'TableName'[DateFieldName] - MOD('TableName'[DateFieldName] - 1, 7) + n

where n is 0 to start the week on Sunday, 1 for Monday, and so on.
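
If you'd rather compute the week start in Power Query instead of DAX, the M function Date.StartOfWeek does the same job. Here is a minimal sketch; the table and column names are placeholders, and the sample rows exist only to make it runnable:

    let
        // placeholder table standing in for your own query
        Source = #table(type table [DateFieldName = date],
            {{#date(2016, 11, 9)}, {#date(2016, 11, 13)}}),
        // Day.Sunday, Day.Monday, etc. choose the first day of the week
        WithWeek = Table.AddColumn(Source, "Week Beginning Date",
            each Date.StartOfWeek([DateFieldName], Day.Monday), type date)
    in
        WithWeek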

How to query Project Online multiple value fields in Power BI


This post addresses a need that arises when querying Project Online data via OData. It can also serve as a solution template for other data sources.

Synopsis

A number of projects in our portfolio have impacts in different countries. These countries are designated via a project-level multi-value (MV) custom field named Country. When multiple values are selected from the dropdown list, the field is returned as a comma-delimited list of values. For example, you could see Brazil, Canada, United States returned as a single field value from OData.

Challenge

One of the challenges is that this list can contain any number of values and will grow and contract over time. Therefore, the technique should automatically adjust to the underlying maximum number of specified values.

Approach

For each MV field, a new data set representing all values in the MV field will be added to the data model. This new MV field data set will be a child to the original Project master data set. The relationship between the Project set and the MV set will be 1:N.

An inline function will also be used so that it can be executed for each record in the data set. This function will perform the split of values into separate columns.

Once split, the values will be unpivoted, creating one record for each ProjectId and MV field value pair.
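
To make the approach concrete, here is a minimal Power Query (M) sketch of the idea, with one simplification: instead of splitting into separate columns and then unpivoting, it uses Text.Split and Table.ExpandListColumn to go straight from the comma-delimited value to one row per value, which automatically adjusts to any number of values. The Projects table, ProjectId, and Country names are placeholders; in the real model, the source would be your ProjectData OData feed.

    let
        // placeholder standing in for the Projects set from the OData feed
        Projects = #table(type table [ProjectId = text, Country = text],
            {{"P1", "Brazil, Canada, United States"}, {"P2", "Canada"}}),
        KeyAndMV = Table.SelectColumns(Projects, {"ProjectId", "Country"}),
        // split the comma-delimited MV value into a list of values
        ToList = Table.TransformColumns(KeyAndMV,
            {"Country", each Text.Split(_, ","), type list}),
        // expand the list: one row per ProjectId and Country value pair
        ToRows = Table.ExpandListColumn(ToList, "Country"),
        // trim the leading spaces left behind by the ", " delimiter
        Trimmed = Table.TransformColumns(ToRows, {"Country", Text.Trim, type text})
    in
        Trimmed

Loaded as its own query, this becomes the child MV data set; relate it back to the Projects set on ProjectId as 1:N.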

Potential Issues

  • Incorrect data set relationships. Initially, Power BI created the MV data set with a 1:1 relationship instead of 1:N. Once corrected, all errors went away.
  • Watch refresh performance, as you are essentially querying the same data multiple times within the model.

Procedure

Register here for this free session to see the step-by-step video and to download the PBIX file containing the data model with sample code: http://academy.tumbleroad.com/courses/how-to-parse-multi-value-columns-in-power-bi

One surprising challenge that can derail your Business Intelligence implementation

One challenge facing Business Intelligence implementations today relates to how data is shared within the organization. We've been working to make work more open and social over the last few years, but there's one place we've neglected along the way: how do we share ad hoc business intelligence data within the organization?


In years past, our Business Intelligence content was developed as individual reports. This disconnected content approach enabled data producers to act as data gatekeepers. The owner of the report would decide who to share the data with and would email it to the person as they saw fit.


In subsequent years, we migrated to a portal-based distribution strategy. Tools like SQL Server Reporting Services enabled us to post formal reports centrally, where they could be accessed. However, ad hoc reporting generally remained outside of the portal and continued to be shared via email.


With the advent of new tools like Power BI, ad hoc reporting is easier than ever. Everything on the screen is clickable and it enables interactive data exploration. The intended interaction with these tools is similar to how we use Facebook, Twitter and Pinterest in our personal lives today. We post information and others can choose how and when to consume it.


Imagine the confusion that arises when a BI implementation team gets puzzling questions from groups: How do I email this data model to others? How do I print a dashboard? Simply updating the tool set to the latest generation while ignoring the underlying organizational behaviors can undermine the long-term success of implementing the latest generation of BI tools.


An organizational assessment should be done as part of a Business Intelligence tool implementation to consider how people currently share data and whether those sharing habits are compatible with the new BI tools. If the organization is used to posting and forwarding links, it should have no issues adopting the new tools. If the organization is still email-centric in its information sharing, a concerted effort will be required to shift habits away from emailing content and toward posting and link sharing. These findings can be used to adjust the scope of work needed for training and ongoing support.


What has been your experience? Please share below in the comments.

Using Project Online, Power BI Alerts and Microsoft Flow for Custom Alerts

This is so cool! I just received an alert about a data condition in Project Online from Microsoft Flow. The best thing is that this condition isn't anything Microsoft provides out of the box. With the functionality now in Flow and Power BI, we can construct our own alerts.

With recent updates, Microsoft Power BI added the ability to set alerts on the card visual. You can read more about the details here. One of the settings allows an email to be sent with a distinct subject. That got me thinking, since I knew from playing with Microsoft Flow that you can connect it to Office 365 Outlook and drive flows from specific emails. Flow also provides a Push Notification function, so I can push a notification to the Flow app on my iPhone.

It seems like all of the pieces I need are there to create my own custom notification.

First, I have a Project Online instance with projects and people assigned to those projects. I'm using Project to manage capacity, so knowing when the total number of overallocations is growing helps me react accordingly.

Project Web App

Second, using our Marquee Project Dashboards product, I have a count of Overallocated resources already available. I’m using Power BI Pro, so my data model refreshes automatically multiple times a day.

Marquee Dashboard

Third, I created my own dashboard and pinned the count of Overallocated Resources to it. Once the card is on your dashboard, you can click the ellipsis menu (…) and access the alert function (bell icon).

When the Alert panel opens, I turned it on, changed the Alert title, and set my threshold. You can also choose whether alerts are sent at most once an hour or once a day. The key is to check the box that says to send an email; that email is what gives Flow something to act upon.

Now, go to http://flow.microsoft.com. Once you log in (because you already have an account, right?), go to My Flows and select Create a New Flow. You are going to create a two-step flow: the first step connects to your email and watches for a specific sender and subject, and the second step sends the notification.

  • Type Outlook into the Search window to see the Outlook events.
  • Select Office 365 Outlook – When a new email arrives
  • Ensure the connection is to the right account that will receive the alert email
  • Click Show Advanced Options
  • I filled in mine so the From filter matches the Power BI alert sender and the Subject Filter matches the alert title I set earlier.

  • Now click New Step
  • Click Add an Action
  • Type Push into the search window
  • Select Push Notification – Send a push notification
  • When filling out the notification, you could set the URL to bring you back to Power BI or Project. For simplicity, I chose to have it bring me to Outlook.

I would have loved to have an SMS message generated, but that would have required a Twilio account, which I don't have at the moment. Since I was just playing, I took the free route and loaded the Flow app on my iPhone; the push notification then shows up there on my phone.

Once everything was set up:

  • I updated a Project Online project plan to create a new resource overallocation and published it.
  • Power BI automatically refreshed the dataset, and the number of overallocated resources increased.
  • This value change triggered a Power BI alert and sent an alert email to my Outlook inbox.
  • Flow picked up the alert and fired off the Flow.
  • The push notification was sent to my phone and bam, there it is on my Apple Watch.

What do you think? What have you created with Flow and Power BI? Tell me in the comments below! I hope you found this useful.

Free Marquee™ for Google Sheets Template demo

Microsoft Power BI isn't just for getting data from Microsoft products. The PBIX demo file, available once you register below, lets you query the data from your Google Sheet into Power BI and then share the resulting reports and dashboards via PowerBI.com with co-workers, or with the world if you desire. If you have the Power BI mobile app, you now have Google Sheets data on the go.

Demo File Only

This PBIX is provided as a demo only, with no support or warranty offered. Testing was only sufficient for a demo, not for production use. You may encounter errors in your environment when using this model in its current state. You are welcome to expand the solution; if you do, please add to the comments below so that we can all share in your experience.

Note: the PBIX file only connects to the first tab in your Google Sheet.

Google Sheets API Oddities

This was an interesting project, as Google Sheets doesn't have the same concept of a table that Excel does. As a result, there are two conditions you may encounter for which we don't yet have a good solution.

First, you shouldn't have a blank column heading. This will cause the model to error out in the final data transformations, as Power BI expects column headings to be present.

Second, the Google Sheets API doesn't appear to return null cells that fall in the last column of your sheet. Since the cells come back as one flat list and we fold that list into rows every X cells, a missing cell throws off the row count and fold points. As a workaround, we recommend ensuring the last column of data has values in all cells.
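
To illustrate why that matters, here is a minimal M sketch of the fold described above, with a sample list standing in for the API response (CellValues and ColumnCount are hypothetical names). If the API drops a trailing null cell, the flat list comes up one item short and every fold point after it shifts:

    let
        ColumnCount = 2,
        // flat list of cell values as returned by the API, row after row
        CellValues = {"Name", "Region", "Alice", "West", "Bob", "East"},
        // fold the flat list into one sub-list per row
        Rows = List.Split(CellValues, ColumnCount),
        // the first folded row becomes the column headings
        Result = Table.FromRows(List.Skip(Rows, 1), List.First(Rows))
    in
        Result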

Send me the data model!


Setup

You need three pieces of data in order to use this PBIX file; the sketch after this list shows roughly where each one plugs in.

  • The number of columns in the Sheet
  • Your Spreadsheet ID
  • Your Browser API Key
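
For orientation, here is a hedged sketch of how those parameters might plug into a query. It uses Google's public v4 values endpoint, which returns values row by row, so no fold and no column count are needed here; the demo file's actual internals differ, as described above. Sheet1 stands in for your first tab:

    let
        SpreadsheetID = "YOUR_SPREADSHEET_ID",   // placeholder
        BrowserAPIKey = "YOUR_BROWSER_API_KEY",  // placeholder
        Source = Json.Document(Web.Contents(
            "https://sheets.googleapis.com/v4/spreadsheets/" & SpreadsheetID
                & "/values/Sheet1",
            [Query = [key = BrowserAPIKey]])),
        // a list of rows, each row itself a list of cell values
        Values = Source[values],
        // the first row becomes the column headings
        Result = Table.FromRows(List.Skip(Values, 1), List.First(Values))
    in
        Result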

Steps to get your SpreadsheetID

  • Navigate to your Sheets page.
  • The ID is the long token in the URL between /d/ and /edit. For example:
    • https://docs.google.com/spreadsheets/d/1gLzc8AxdlUl1MPY4t2ATKjc1UfBNj7iUaHRexwLYSKQ/edit#gid=0 (the ID here is 1gLzc8AxdlUl1MPY4t2ATKjc1UfBNj7iUaHRexwLYSKQ).

Steps to get your Browser API Key

  • Log into your Google account
  • Click this link to start the wizard.
  • Click Continue button to create a new project.
  • Unfortunately, you have to create the following client ID, even though we won't use it.
  • Click Go to credentials button.
  • Select Web Browser (Javascript) in the Where will you be calling the API from?
  • Select User Data under What data will you be accessing?
  • Click What credentials do I need? button
  • Click Create client ID button
  • Enter a Product name
  • Click Continue button.
  • Click Done button.
Now to create the credential we need for this to work.
  • Click the Create credentials button.
  • Select API key.
  • Select Browser key.
  • Give it a name and click the Create button.
  • Copy the API key and paste it into the BrowserAPIKey parameter.

Setting Up Your PBIX File for Use

Once you receive your PBIT file, do the following.
  • You must have Power BI Desktop installed prior to performing this procedure.
  • In File Explorer, double-click on the Google Spreadsheet Template – Final.pbit file.
  • Power BI Desktop will open, and you will be presented with a parameter dialog.
  • Fill in the values and click the OK button.
  • The model will refresh and it should load your Google data.

Setting Up Scheduled Refresh on PowerBI.com

Once you have saved the model, verified the data and built your reports, you can publish this model to PowerBI.com. Once there, you can set it up to automatically refresh the data so that any reports and dashboards are up to date.

Procedure for Scheduled Refresh

  • In Power BI Desktop, click File, Save to save the model
  • Click Publish
  • If you aren’t signed into Power BI, you’ll be prompted to do so.
  • You may be prompted for the location to publish; My Workspace is the default.
  • Once done, go to PowerBI.com.
  • Follow the procedure in the video below.
  • Navigate to the Datasets in the left navigation to start the process.
  • Note: the API key you entered earlier is embedded in the model and acts as your credential. This is why the data source is set to Anonymous in PowerBI.com.
Send me the data model!

The Truth Shall Make You Miserable


When companies begin making their data more accessible via self-serve Power BI, they soon reach a critical break point in those efforts: the Project dashboards tell them something that isn't pleasant or doesn't match the narrative being publicized.

The Reality in Your Project Dashboards

Performance indicators go red. The data shows the stellar progress that was planned isn’t happening. Operational demands for time are much higher in reality than assumed in planning. In short, it shows the harsh reality, as captured in the data.

This is a moment of truth for organizations. Are we going to embrace the transparency or will we attempt to control the narrative?

Data Quality Challenges

The first question is normally: is this data accurate? This is quite reasonable to ask, especially at the beginning, when the data stream may not be as clean as it should be.

The approach to answering that question can determine your success going forward. For some, questioning the data is a prelude to dismissing its use. For others, it's a starting point for improvement.

The data deniers will provide many reasons why "we can't use the data." They will complain that the data is inaccurate or incomplete and that, therefore, they can't trust it enough to integrate it into their daily work or use it to make decisions.

These data deniers may have other, hidden reasons for their position, such as politics or protecting a power base. Moving to a data-centric culture is a big change for many organizations, as you have to be open about your failures. No company is always above average in every endeavor.

Data deniers also fear how business intelligence might impact their careers. If the corporate culture metes out punishment when the numbers and updates aren't desirable, data transparency likely won't be welcome.

Change the Focus of How Data is Used to Succeed

The key to overcoming data fear is to change the intent behind its use, moving the focus from punishment to improvement.

Successful companies using data embrace two simple facts. One, the data is never perfect, and it doesn't have to be to effect positive change. Two, they've defined the level of granularity needed for the data to be used successfully.

How Imprecise Data is Changing the World

We see this approach in our personal lives. For example, the Fitbit device is not 100% accurate or precise. Yet millions are changing their behavior and becoming more active because of the feedback it provides, based on relatively decent data. You may also be carrying a smartphone, which also tracks your steps. Between the two, you would have a generally good idea of how many steps you took today.

From a granularity standpoint, we aren't generally worried about whether we took 4,103 or 4,107 steps today; we took roughly 4,100 steps. Hundreds are our minimum granularity. It could easily be at the thousands level, as long as that granularity meets your information needs.

Cost Benefit of a Minimum Level of Granularity

One area where we see this type of data accuracy dispute in the corporate world is cost data. It's been ingrained in our psyche that we have to balance to the penny, so our default data granularity is set to the cent.

While that may improve accuracy and precision, it doesn't make a material difference in impact. For example, if your average project budget is $2M, a 5-cent variance is a percentage variance of 0.0000025%. I've seen organizations get wrapped up in balancing to the penny and waste an inordinate amount of time each week getting there.

Instead, define a minimum granularity in the data such that a 1% variance is visible. For a $2M average budget, 1% is $20,000, so you would round at the $10,000 point. Doing so reduces the work spent trying to make the data perfect, and any variance of that size is significant enough to warrant attention and more likely to stand out.
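
As a quick illustration (a Power Query M sketch with hypothetical table and column names), rounding a cost column to a $10,000 granularity is a one-step transformation:

    let
        // sample project costs; at a $2M scale, cents are noise
        Costs = #table(type table [Project = text, Cost = number],
            {{"Alpha", 2004103.27}, {"Beta", 1996874.55}}),
        // round to the nearest $10,000 so a 1% (roughly $20,000) variance stands out
        Rounded = Table.TransformColumns(Costs,
            {"Cost", each Number.Round(_ / 10000) * 10000, type number})
    in
        Rounded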

Implementing self-serve BI using products like Microsoft Power BI and Marquee™ Project Dashboards will enable your organization to make great improvements, as long as it is willing to accept the assumptions above. The truth may make you miserable in the short term as you address underlying data and process challenges; in the long run, you and your company will be better served.

Please share your experiences in the comments below.

Death to the “Perfect Report”


When designing your Power BI solution, are you designing for maximum long-term effectiveness? One of the biggest mind shifts my Power BI students have to make when beginning their Power BI journey is to stop thinking in terms of reports when approaching their Business Intelligence (BI) needs. It's more effective to approach those needs from a holistic data model perspective.

Why the “perfect report?”

In years past, designers focused on writing the "perfect report" because the time and effort needed to create a report was substantial. This led to reports that were used for multiple purposes. This multi-use design created complexity in the presentation and increased maintenance costs. Each additional report also creates a long tail of maintenance effort to ensure it stays relevant over time.

With tools like Power BI, the focus is on rapid development of new reports and dashboards. The "perfect report" approach is no longer necessary, as it is very easy to create new, single-focus reports and dashboards. The value of any single report is diminished, and the importance of the data model design increases.

Think data models, not reports

The rationale for this design approach is that, on average, 50% of your annual reporting needs are ad hoc in nature; you can't design a report for every possible contingency. Having 200 reports that don't meet your immediate need creates noise rather than benefit, and simply slows down the ad hoc reporting process as the data consumer wastes time evaluating the existing content first. The flexibility offered by a well-designed data model is the expedient path to fulfilling your ad hoc needs effectively for the long term.

Power BI design for the business conversation

The purpose of business intelligence is to support business conversations with relevant data to enable decision making. Therefore, your data model should be designed to support a specific business scenario. The scenario provides the necessary context for scoping the type of data required to meet the most likely questions. It also provides the necessary context to the data consumer to identify the data model as the correct one for use in a given business conversation.

Conversation-centric design

The path to outlining the necessary data is an approach we at Tumble Road call "Conversation-Centric Design."

The first step is to identify the conversation to be supported. Standing meetings are usually a good starting point for the identification process. Meetings provide a lot of information for the design process as the timing, audience and typical agenda are known. The agenda can be used to identify the key questions and needed answers. From the answers, we can extract the core data elements for the data model. Each answer becomes a visualization in the eventual dashboard or report.

For example, suppose there's a weekly Vice President status meeting every Wednesday, during which project status for the overall portfolio is reviewed. The key questions for the portfolio review are:

  • Which projects have implementation milestones due this month?
  • Which projects have resource constraint issues?
  • Which projects have escalated issues to the meeting?

Each of these questions can be broken down into the data elements necessary to provide the needed answers. This forms the core data for the data model. The designer then adds the most likely secondary data, like custom fields and other key data to anticipate future needs.

Designing for the long term

There are three aspects to consider to support your long term Business Intelligence needs.

Reports are still needed but are not the focal point

First, you should still provide reports over the data, understanding that they provide an immediate starting point for the data consumer to use the data. The report represents the designer’s view of the data and is generally designed for a specific conversation need.

Ultimately, the data consumer is in control

Second, a well-designed data model supports the data consumer's needs for ad hoc information. The data consumer can create ad hoc views of the data using two methods within Power BI.

The data consumer can create personal dashboards, pinning visualizations from multiple reports to create a personalized view of the data to meet their specific needs. This pinning process supports a visualization being pinned to multiple personal dashboards so the data consumer can narrowly define their dashboards.


The data consumer can also create new reports on PowerBI.com via the browser. The primary difference between a report and a dashboard is intent: dashboards provide a view of the data; reports provide a way to explore it. Data consumers will create reports when slicing and dicing the data is the primary need. The effectiveness of their report development depends on the underlying data model adequately supporting the business scenario.

Business scenario focus means smaller, easier to use data models

Lastly, there is value in a less-is-more approach. Data models shouldn't be monolithic in design. A well-designed data model provides only the data frames and data elements necessary to support the business scenario.

For example, one of the challenges of Project reporting is the sheer magnitude of available data. Overwhelming the data consumer with too many data frames (aka data tables) to search through for the requisite data elements slows down the ad hoc reporting process and creates user frustration. In our Marquee™ designs, for example, we separate project reporting from timesheet reporting as separate business scenarios, which in turn reduces data consumer confusion.

A business-scenario-focused data model also reduces the risk of the data consumer inadvertently creating cross-join situations. A cross-join typically arises when the data consumer combines two data elements in a visualization that have no direct data relationship; the result is that the data consumer sees every potential combination of the two data elements. This may then lead to a support call asking why the tool is doing what it is doing.

Finally, keeping individual data models focused on a business scenario also enables you to better maintain each model as the business scenario changes over time.

The wrap up

My hope is that you now understand why a data-model-centric approach is superior to a report-centric approach when designing for Power BI. Data-model-centric design yields better long-term support for ad hoc reporting. Focusing the data model on the business conversation also yields many benefits for the data consumer experience, from the ability to select the correct model to improved speed in creating new reports.

How are you approaching your Power BI design today? What questions do you have? Please post your feedback in the comments below. I look forward to hearing about your design approaches.

Get our Business Intelligence white paper! Find out the 3 hidden reasons keeping you from effectively using Business Intelligence: http://eepurl.com/b6gtJ1

Are you ready for the changes that Power BI is bringing to your reporting? Join our classes at the Tumble Road Academy, where we'll teach you how to use the tools and then go into implementation best practices and organizational change management topics. Our lessons will save you time and headache! Register at http://academy.tumbleroad.com