How to: Group Dates by Week in Power BI

This post shows you how to use the date hierarchy and the grouping function to easily group your data by year, month, or week in Power BI. This can be very handy when reporting against time-phased data, such as sales transactions or Project Online schedule and capacity data.

If you need to start your week on a specific day of the week, you can create a new column based on your date, using this DAX statement:

Week Beginning Date = 'TableName'[DateFieldName] - MOD('TableName'[DateFieldName] - 1, 7) + [Add 0 for Sunday, 1 for Monday, etc. to start week accordingly]
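
For example, if your weeks start on Monday and your table and column were named, say, Sales and Order Date (placeholder names, substitute your own), the calculated column would look like this:

Week Beginning Date = Sales[Order Date] - MOD(Sales[Order Date] - 1, 7) + 1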

3 problems tracking operations in Project – Part 2

In part 1, we outlined the problems that many customers encounter managing operational work in Project. We need to capture this information for a higher-fidelity view of resource capacity. However, previous methods required more work on the part of Resource Managers than desired.

In this article, you’ll see how to set up an operational project plan template that enables you to treat each week as a distinct unit. You’ll also see how to set up the project in Project Professional and how to configure an Operational Project Enterprise Project Type.

Part 3 will also show how to set up the project using PWA only. The requirement is to enable the Resource Manager to manage the project easily via Project Web App. You’ll also be taken through how to use this setup in a day-to-day fashion and will see timesheet and approval configuration. Lastly, you’ll see how this information appears in Power BI reporting.

If you have comments or questions, please leave them below in the comments field.


3 problems tracking operations in Project, and how to fix them.

Many organizations struggle to manage resource capacity. If they are following the OPRA Resource Capacity model, tracking recurring operations work immediately becomes necessary. This article is based on real-world experience managing large Project implementations. Current tracking methods will be examined and some suggested approaches will be presented.

The old way of tracking Operations has issues.

In the past, the primary method used to track recurring operations work has been to create a project that contains a year-long task assigned to all members of a support team. The theory is that you can track all operations easily for the fiscal year, which many companies use as the boundary.

However, this approach makes three core assumptions, which cause numerous headaches for the operations manager.

  • Operations work is just like Project work
  • You will always perform the same amount of operations work every week
  • No one will join or leave your operations team during the year

Myth #1: Operations work is like Project work

Project work is scheduled for a given week, team members do the work, and status is reported. This is where the similarities between Project and Operations work end.

If you do less work than scheduled on true project work, the incomplete work is typically moved forward into the following week. If you do more work than scheduled, the finish date should pull in.

Operations work, however, is about reserving resource capacity for a type of activity. Thus, how time variation is treated is where Project work and Operations work diverge.

If you go over or under time on a given week for an Operations task, it has no impact on the future of the task. You don’t move unutilized operations time forward as you don’t get that unutilized capacity back. You don’t move the end date in if you use more than planned. You simply record what was used, usually for a given week.

Therefore, each reporting period for Operations work should be treated as a discrete tracking entity that has no forward schedule impact and that, preferably, can be closed.

Myth #2: Level of effort never varies

The reality is that the level of operations work varies week to week, sometimes greatly. There are times during the year when you know there will be more operations work. For example, a year-end close process might be extremely taxing for the Finance support team. The ability to capture this seasonality would tremendously improve your ability to manage capacity for project work.

Also, if you are burning through the planned hours on Operations work faster than originally expected, the one long task will result in support calls. You may enter October with no time remaining on the task, causing it to disappear from timesheets.

This again points to a need for discrete tracking entities that can be managed individually for a given time frame.

Myth #3: Teams never change

The year-long task has a serious user management issue when it comes to tracking team composition. Adding and removing team members on the task requires Project Professional and a fair bit of Project knowledge to do properly.

When Heather joins the team in August and the operations task started in January, how easy is it to add Heather in a way that doesn’t mess up the current team tracking? The same is true if Sanjay leaves the team in April. How do you easily remove his remaining time?

This process is typically beyond the training of most Operations Managers. They shouldn’t need to be a tool expert simply to manage their team; requiring that expertise detracts from the value of the data.

The one long task also doesn’t lend itself to adjusting operations assignments so that you can easily reflect greater project demands in key weeks.

All of these usability questions lead us to a requirement that the solution be usable entirely within Project Web App and not require a PMP to execute.

Requirements synopsis

Our desired Operations management solution should be:

  • Discretely managed, such that variances in time entered do not impact the overall timeline
  • Flexible, so that the time and team composition of each tracking period can be adjusted individually
  • Straightforward to manage, using only PWA

In our next post, a suggested solution that meets these three requirements will be presented. You’ll also see examples of how it can be used in real-world settings. If you have a question or comment, feel free to post it below.

The Benefit of One Simple Change in Project Online

What we typically do today.

This post is about what can be gained from making one simple change in your Resource Custom Field configuration. Many companies have an FTE Yes/No type resource custom field to distinguish between employees and contractors. This has been the approach for many years, as I have seen it many times in Project 2007/2010/2013 implementations.

A better approach.

I suggest using a Resource Company custom field instead. You would create a lookup table, Resource Companies, that includes the name of your company and the company names of all of your contractors. You then attach this lookup table to the new Resource Company custom field. Lastly, update the value for every resource, setting FTEs to your company name and contractors to their respective company names.

So why do this?

This simple change creates richer data for reporting. For example, say your vendor management team is in the process of renegotiating a contract with a specific vendor, and that vendor is supplying a number of contractors across many projects. With this approach, extracting how much work this vendor is doing and which projects they are working on is as easy as setting a simple filter.
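
As a rough sketch only, and not taken from any particular implementation, a measure along these lines could total that vendor's work, assuming a Resources table that carries the new Resource Company field and a related Assignments table with an AssignmentWork column (all names here are placeholders):

Vendor Work =
CALCULATE (
    SUM ( Assignments[AssignmentWork] ),                  -- total assigned work
    Resources[Resource Company] = "Fabrikam Consulting"   -- hypothetical vendor name
)

Slicing a measure like this by project then shows exactly where that vendor is engaged.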

For employees, you may even consider making the lookup value hierarchical, so that you can enable reporting at the "your company name.your division name" level.

Are you using this approach today? If so, please share your experiences in the comments.

How to query Project Online multiple value fields in Power BI

Power BI

This post addresses a need that arises when querying Project Online data via OData. It can also serve as a solution template for other data sources.

Synopsis

A number of projects in our portfolio have impacts in different countries. These countries are designated via a Country project-level multi-value (MV) custom field. When multiple values are selected from the dropdown list, the field becomes a comma-delimited list of values. For example, you could see Brazil, Canada, United States returned as a single field value from OData.

Challenge

One of the challenges is that this list can contain any number of values and will grow and contract over time. Therefore, the technique should automatically adjust to the underlying maximum number of specified values.

Approach

For each MV field, a new data set representing all values in the MV field will be added to the data model. This new MV field data set will be a child to the original Project master data set. The relationship between the Project set and the MV set will be 1:N.

An inline function will also be used so that it can be executed for each record in the data set. This function will perform the split of values into separate columns.

Once split, the values will be unpivoted, creating a record for each ProjectId, MV Field value pair.
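
A minimal Power Query (M) sketch of this split-and-unpivot step is shown below. The query, key, and field names (Projects, ProjectId, Country) are assumptions for illustration, and the approach described above wraps this logic in an inline function, which this sketch does not do:

let
    // Keep just the key and the multi-value field from the existing master query
    Source = Table.SelectColumns(Projects, {"ProjectId", "Country"}),
    // Find the maximum number of values any project carries, so the split adjusts automatically
    MaxValues = List.Max(List.Transform(Source[Country],
        each if _ = null then 0 else List.Count(Text.Split(_, ",")))),
    // Split the comma-delimited field into that many columns (Country.1, Country.2, ...)
    SplitCols = Table.SplitColumn(Source, "Country", Splitter.SplitTextByDelimiter(","), MaxValues),
    // Unpivot back to one row per ProjectId / Country value pair; null padding values drop out
    Unpivoted = Table.UnpivotOtherColumns(SplitCols, {"ProjectId"}, "Attribute", "Country"),
    Trimmed = Table.TransformColumns(Unpivoted, {{"Country", Text.Trim, type text}}),
    Result = Table.RemoveColumns(Trimmed, {"Attribute"})
in
    Result

Loaded as its own query, this table then relates back to the Project data set on ProjectId as the many side of the 1:N relationship described above.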

Potential Issues

  • Correct data set relationships. Initially, Power BI created the MV data set with a 1:1 relationship instead of a 1:N. Once corrected, all errors went away.
  • You will need to watch refresh performance as you are basically querying the same data multiple times within the model.

Procedure

Register here for this free session to see the step-by-step video and to download the PBIX file containing the data model with sample code. http://academy.tumbleroad.com/courses/how-to-parse-multi-value-columns-in-power-bi

One surprising challenge that can derail your Business Intelligence implementation

One challenge facing Business Intelligence implementations today is how data is shared within the organization. We’ve been working to make work more open and social over the last few years, but there’s one area we’ve neglected along the way. How do we share ad hoc business intelligence data within the organization?


In years past, our Business Intelligence content was developed as individual reports. This disconnected content approach enabled data producers to act as data gatekeepers. The owner of the report would decide who to share the data with and would email it to the person as they saw fit.


In subsequent years, we migrated to a portal-based distribution strategy. Tools like SQL Server Reporting Services enabled us to post formal reports centrally, where they could be accessed. However, ad hoc reporting generally remained outside of the portal and continued to be shared via email.


With the advent of new tools like Power BI, ad hoc reporting is easier than ever. Everything on the screen is clickable and it enables interactive data exploration. The intended interaction with these tools is similar to how we use Facebook, Twitter and Pinterest in our personal lives today. We post information and others can choose how and when to consume it.


Imagine the confusion that arises when a BI implementation team gets puzzling questions from groups: How do I email this data model to others? How do I print a dashboard? Simply updating the tool set to the latest generation while ignoring the underlying organizational behaviors can undermine the long-term success of the implementation.


An organizational assessment should be done as part of a Business Intelligence tool implementation to consider how people currently share data and whether those sharing habits are compatible with the new BI tools. If the organization is used to posting and forwarding links, it should have no issues adopting the new tools. If the organization is still email-centric in its information sharing, a concerted effort will be required to shift habits away from emailing content and toward posting and link sharing. These findings can be used to adjust the scope of work needed for training and ongoing support.


What has been your experience? Please share below in the comments.

Using Project Online, Power BI Alerts and Microsoft Flow for Custom Alerts

This is so cool! I just received an alert from Microsoft Flow about a data condition in Project Online. The best thing is that this condition isn't anything Microsoft provides out of the box. With the functionality now in Power BI and Flow, we can construct our own.

With recent updates, Microsoft Power BI added the ability to set alerts on the card visual. You can read more about the details here. One of the settings allows an email to be sent with a distinct subject. That got me thinking since I knew from playing with Microsoft Flow that you can connect it to Office 365 Outlook and drive flows from specific emails. Flow also provides a Push Notification function so that I can push a notification to the Flow app on my iPhone.

It seems like all of the pieces I need are there to create my own custom notification.

First, I have a Project Online instance where I’ve got projects and people assigned to those projects. I’m using Project to manage capacity, so knowing when the total number of overallocations is growing helps me react accordingly.

Project Web App

Second, using our Marquee Project Dashboards product, I have a count of Overallocated resources already available. I’m using Power BI Pro, so my data model refreshes automatically multiple times a day.

Marquee Dashboard

Third, I’ve created my own dashboard and I pinned the count of Overallocated Resources to it. Once the card is on your dashboard, you can click the ellipsis menu … and access the alert function (bell icon).

When the Alert panel opened, I turned it on, changed the Alert title, and set my threshold. You can also choose whether to be alerted at most once an hour or once a day. The key is to check the box that says to send an email; this gives us a way to use Flow to act upon the alert.

Now, we go to http://flow.microsoft.com. Once you log in (because you already have an account, right?), go to My Flows and select Create a New Flow. You are going to create a two-step flow: the first step connects to your email and watches for a specific subject and sender, and the second step sends the notification.

  • Type Outlook into the Search window to see the Outlook events.
  • Select Office 365 Outlook – When a new email arrives
  • Ensure the connection is to the right account that will receive the alert email
  • Click Show Advanced Options
  • I filled in mine to look like this. Note the From and Subject Filter values.

  • Now click New Step
  • Click Add an Action
  • Type Push into the search window
  • Select Push Notification – Send a push notification
  • I filled out mine to look like the following. You could have the URL set to bring you back to Power BI or Project. I chose for simplicity to bring me to Outlook.

I would have loved to have an SMS message generated, but that would have required a Twilio account, which I don’t have at the moment. Since I was just playing, I took the free route and loaded the Flow app on my iPhone, where the push notification then shows up.

Once set up,

  • I updated a Project Online project plan to create a new resource overallocation and published it.
  • Power BI automatically refreshed the dataset, and the number of overallocated resources increased.
  • This value change triggered a Power BI alert and sent an alert email to my Outlook inbox.
  • Flow picked up the alert and fired off the Flow.
  • The push notification was sent to my phone and bam, there it is on my Apple Watch.

What do you think? What have you created with Flow and Power BI? Tell me in the comments below! I hope you found this useful.

Free Marquee™ for Google Sheets Template demo

Microsoft Power BI isn’t just for getting data from Microsoft products. The PBIX demo file, which you can get once you register below, allows you to pull the data from your Google Sheet into Power BI and then share the resulting reports and dashboards via PowerBI.com with co-workers, or with the world if you desire. If you have the Power BI mobile app, you now have Google Sheets data on the go.

Demo File Only

This PBIX is provided as a demo only, with no support or warranty offered. Testing was only sufficient for a demo, not for production use. You may encounter errors in your environment when using this model in its current state. You are welcome to expand the solution; if you do, please add to the comments below so that we can all benefit from your experience.

Note: the PBIX file only connects to the first tab in your Google Sheet.

Google Sheets API Oddities

This was an interesting project, as Google Sheets doesn’t have the same concept of a table that Excel does. Therefore, there are two conditions you may encounter for which we don’t yet have a good solution.

First, you shouldn’t have a blank column heading. This will cause the model to error out on the last data transformations as Power BI expects column headings to be present.

Second, the Google Sheets API doesn’t appear to return null cells in the last column of your sheet. Since the cells are returned as a flat list that we fold into rows every X cells, missing cells throw off the row count and the fold points. As a workaround, we recommend having values in every cell of the last column of data.
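
To make the folding step concrete, here is a minimal Power Query (M) sketch. It assumes the API response has already been flattened into a single list of cell values called CellValues and that ColumnCount holds the number of columns you supply as a parameter; both names are placeholders rather than the ones used in the actual PBIX:

let
    // Break the flat list of cells into one sub-list per sheet row
    Rows = List.Split(CellValues, ColumnCount),
    // Turn the row lists into a table; the first sheet row still holds the column headings
    AsTable = Table.FromRows(Rows),
    Result = Table.PromoteHeaders(AsTable, [PromoteAllScalars = true])
in
    Result

If null cells are dropped from the response, the flat list is shorter than expected, so every fold point after the first missing cell shifts and the rows misalign; that is why the workaround above matters.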

Send me the data model!


Setup

You need three pieces of data in order to use this PBIX file.

  • The number of columns in the Sheet
  • Your Spreadsheet ID
  • Your Browser API Key

Steps to get your SpreadsheetID

  • Navigate to your Sheets page.
  • The Spreadsheet ID is the long string in the URL between /d/ and /edit, as in the example below.
    • https://docs.google.com/spreadsheets/d/1gLzc8AxdlUl1MPY4t2ATKjc1UfBNj7iUaHRexwLYSKQ/edit#gid=0

Steps to get your Browser API Key

  • Log into your Google account
  • Click this link to start the wizard.
  • Click Continue button to create a new project.
  • Unfortunately, you have to create a client ID via the following steps, even though we won’t use it.
  • Click Go to credentials button.
  • Select Web Browser (JavaScript) under Where will you be calling the API from?
  • Select User Data under What data will you be accessing?
  • Click What credentials do I need? button
  • Click Create client ID button
  • Enter a Product name
  • Click Continue button.
  • Click Done button.

Now to create the credential we need for this to work.

  • Click the Create credentials button.
  • Select API key.
  • Select Browser key.
  • Give it a name and click the Create button.
  • Copy the API key and paste it into the BrowserAPIKey parameter.

Setting Up Your PBIX File for Use

Once you receive your PBIT file, do the following. Note that you must have Power BI Desktop installed before performing this procedure.
  • In File Explorer, double-click on the Google Spreadsheet Template – Final.pbit file.
  • Power BI Desktop will open and present a dialog asking for the parameter values.
  • Fill in the values and click the OK button.
  • The model will refresh and it should load your Google data.

Setting Up Scheduled Refresh on PowerBI.com

Once you have saved the model, verified the data, and built your reports, you can publish the model to PowerBI.com. There, you can set it up to refresh the data automatically so that your reports and dashboards stay up to date.

Procedure for Scheduled Refresh

  • In Power BI Desktop, click File, Save to save the model
  • Click Publish
  • If you aren’t signed into Power BI, you’ll be prompted to do so.
  • You may be prompted for the location to publish. My Workspace is the default
  • Once done, go to PowerBI.com.
  • Follow the procedure in the video below.
  • Navigate to the Datasets in the left navigation to start the process.
  • Note: the API key you entered earlier in the model serves as your login, which is why the credentials are set to Anonymous in PowerBI.com.
Send me the data model!