The Truth Shall Make You Miserable

When companies begin making their data more accessible via Self-Serve Power BI, they soon reach a critical breaking point in those efforts. The project dashboards tell them something that isn’t pleasant or doesn’t match the narrative being publicized.

The Reality in Your Project Dashboards

Performance indicators go red. The data shows the stellar progress that was planned isn’t happening. Operational demands for time are much higher in reality than assumed in planning. In short, it shows the harsh reality, as captured in the data.

This is a moment of truth for organizations. Are we going to embrace the transparency or will we attempt to control the narrative?

Data Quality Challenges

The first question is normally, “Is this data accurate?” This is quite reasonable to ask; especially at the beginning, the data stream may not be as clean as it should be.

How you approach this question can decide your success going forward. For some, questioning the data is a prelude to dismissing its use. For others, it’s a starting point for improvement.

The data deniers will provide many reasons why “we can’t use the data.” They will complain that the data is inaccurate or incomplete. Therefore, they can’t trust the data enough to integrate it into their daily work or to use it to make decisions.

These data deniers may have other hidden reasons for their position, such as politics or protecting a power base. Moving to a data-centric culture is a big change for many organizations, as you have to be open about your failures. No company is always above average in every endeavor.

Data deniers also fear how business intelligence might impact their careers. If the corporate culture is one where punishment is meted out when the numbers and updates aren’t desirable, data transparency likely won’t be welcome.

Change the Focus of How Data is Used to Succeed

The key to overcoming the data fear is to change the intent for its use, moving the focus from punishment to improvement.

The successful companies using data embrace two simple facts. One, the data is never perfect, and it doesn’t have to be to effect positive change. Two, they’ve defined the level of granularity needed for the data to be used successfully.

How Imprecise Data is Changing the World

We see this approach in our personal lives. For example, the Fitbit device is not 100% accurate or precise. Yet millions are changing their behavior, becoming more active because of the feedback it provides based on relatively decent data. You may also be carrying a smartphone, which also tracks your steps. Between the two, you have a generally good idea of how many steps you took today.

From a granularity approach, we aren’t generally worried about whether we took 4,103 steps or 4,107 steps today. We took roughly 4,100 steps. Hundreds is our minimum granularity. It could easily be at the thousands level, as long as that granularity meets your information needs.

Cost Benefit of a Minimum Level of Granularity

One area we see this type of data accuracy dispute in the corporate world is with cost data. It’s been ingrained in our psyche that we have to balance to the penny. Our default data granularity is set to the cent.

While that may improve accuracy and precision, it doesn’t make a material difference in the impact. For example, if your average project budget is $2M, then a 5 cent variance is a percentage variance of 0.0000025%. I’ve seen organizations that get wrapped up in balancing to the penny and waste an inordinate amount of time each week getting there.

Instead, let’s define a minimum granularity in the data such that a 1% variance is visible. For a $2M average, you would round at the $10,000 level. Doing so reduces the work of attempting to make the data perfect. Any variances of that size are significant enough to warrant attention and are more likely to stand out.
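As a minimal sketch of this idea in Power Query M (the table and column names here are hypothetical, invented for illustration), rounding each cost to the nearest $10,000 keeps a 1% variance on a $2M budget visible while eliminating penny-level reconciliation:

```m
let
    // Hypothetical cost data; in practice this would come from your cost source
    Costs = #table({"Project", "ProjectCost"}, {{"Alpha", 2013450.27}, {"Beta", 1987622.84}}),
    // Round each cost to the nearest $10,000 so only material variances remain
    Rounded = Table.TransformColumns(Costs, {{"ProjectCost", each Number.Round(_ / 10000) * 10000, type number}})
in
    Rounded
```

With this transform in place, a $20,000 (1%) swing still shows up clearly, while nickel-and-dime differences wash out.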

Implementing Self-Serve BI using products like Microsoft Power BI and Marquee™ Project Dashboards will enable your organization to gain great improvements, as long as it is willing to accept the assumptions above. The truth may make you miserable in the short term as you address underlying data and process challenges. In the long run, you and your company will be better served.

Please share your experiences in the comments below.

Death to the “Perfect Report”

When designing your Power BI solution, are you designing your solution for maximum long term effectiveness? One of the biggest mind shifts that my Power BI students have to make when beginning their Power BI journey is to stop thinking in terms of reports when approaching their Business Intelligence (BI) needs. Rather, it’s more effective to approach those BI needs from a holistic data model perspective.

Why the “perfect report?”

In years past, designers were focused on writing the “perfect report” as the amount of time and effort needed to create a report was substantial. This led to reports that were used for multiple purposes. This multi-use design also created complexity in the presentation and increased maintenance costs. Each additional report also creates a long tail of maintenance effort to ensure that the report stays relevant over time.

With tools like Power BI, the focus is on rapid development of new reports and dashboards. The “perfect report” approach is no longer necessary as it is very easy to create new single focus reports and dashboards. The value of a single report is then diminished and the importance of the data model design increases.

Think data models, not reports

The rationale for this design approach is that, on average, 50% of your annual reporting needs are ad hoc in nature. Therefore, you can’t design a report for every possible contingency. Having 200 reports that don’t meet your immediate need creates noise rather than benefit for the organization. They simply slow down the ad hoc reporting process as the data consumer wastes time evaluating the existing content first. The flexibility offered by a well-designed data model is the expedient path to fulfilling your ad hoc needs effectively for the long term.

Power BI design for the business conversation

The purpose of business intelligence is to support business conversations with relevant data to enable decision making. Therefore, your data model should be designed to support a specific business scenario. The scenario provides the necessary context for scoping the type of data required to meet the most likely questions. It also provides the necessary context to the data consumer to identify the data model as the correct one for use in a given business conversation.

Conversation-centric design

The path to outlining the necessary data is using an approach we at Tumble Road call “Conversation-Centric Design.”

The first step is to identify the conversation to be supported. Standing meetings are usually a good starting point for the identification process. Meetings provide a lot of information for the design process as the timing, audience and typical agenda are known. The agenda can be used to identify the key questions and needed answers. From the answers, we can extract the core data elements for the data model. Each answer becomes a visualization in the eventual dashboard or report.

For example, there’s a weekly Vice President status meeting every Wednesday. Project status of the overall portfolio is reviewed during the meeting. The key questions for the portfolio review are:

  • Which projects have implementation milestones due this month?
  • Which projects have resource constraint issues?
  • Which projects have escalated issues to the meeting?

Each of these questions can be broken down into the data elements necessary to provide the needed answers. This forms the core data for the data model. The designer then adds the most likely secondary data, like custom fields and other key data to anticipate future needs.

Designing for the long term

There are three aspects to consider to support your long term Business Intelligence needs.

Reports are still needed but are not the focal point

First, you should still provide reports over the data, understanding that they provide an immediate starting point for the data consumer to use the data. The report represents the designer’s view of the data and is generally designed for a specific conversation need.

Ultimately, the data consumer is in control

Second, a well-designed data model supports the data consumer’s needs for ad hoc information. The data consumer can create ad hoc constructs of data using two methods within Power BI.

The data consumer can create personal dashboards, pinning visualizations from multiple reports to create a personalized view of the data to meet their specific needs. This pinning process supports a visualization being pinned to multiple personal dashboards so the data consumer can narrowly define their dashboards.

The data consumer can also create new reports on PowerBI.com via the browser. The primary difference between a report and a dashboard is intent. Dashboards provide a view of the data; reports provide a way to explore the data. Data consumers will create reports when slicing and dicing the data is the primary need. The effectiveness of their report development is dependent on the underlying data model adequately supporting the business scenario.

Business scenario focus means smaller, easier to use data models

Lastly, there is value in the less is more approach. Data models shouldn’t be monolithic in design. A well-designed data model provides only the data frames and data elements necessary to support the business scenario.

For example, one of the challenges of Project reporting is the sheer magnitude of available data. Overwhelming the data consumer with too many data frames (aka data tables) to search through for the requisite data elements slows down the ad hoc reporting process and creates user frustration. In our Marquee™ designs, for example, we separate project reporting from timesheet reporting as separate business scenarios. This in turn reduces data consumer confusion.

A business scenario focused data model also reduces the risk of the data consumer inadvertently creating cross-join situations. When a cross-join arises, typically the data consumer has combined two data elements together in a visualization that don’t have a direct data relationship. The result is the data consumer seeing every potential combination of the two data elements. This may then lead to you receiving a support call as to why the tool is doing what it is doing.

Finally, keeping individual data models business scenario focused also enables you to better maintain the data model as the business scenario changes over time.

The wrap up

My hope is that you now understand why a data model centric approach is superior to a report centric approach when designing Power BI. Data model centric design yields better long term support of ad hoc reporting. Focusing the data model on the business conversation also yields many benefits for the data consumer experience, from the ability to correctly select the right model to improved speed of creating new reports.

How are you approaching your Power BI design today? What questions do you have? Please post your feedback in the comments below. I look forward to hearing about your design approaches.


Microsoft Project Online, Power BI and Cortana, better together!

The Tumble Road Tribe has been hard at work, pursuing that dream that we should be able to ask our computers questions and get reasonable answers quickly. This post shows some of our recent work, integrating Microsoft Project Online with Power BI and Cortana to create a seamless experience across the products. The video demonstrates this integration, using a very common scenario for anyone who manages a team.

This particular demo is part of a larger, upcoming announcement. Stay tuned for more!

3 Ways to Rev Your Microsoft Project Online and Power BI Performance (Number 2 Is Our Favorite)

Are you tired of waiting through long refresh times with Power BI? Perhaps you are getting timeouts because you are retrieving too much data. There are three easy ways to avoid these issues.

WHY IS MY POWER BI DATA REFRESH SO SLOW?

Many Power BI models refresh slowly as the models are not structured to take advantage of query folding.

Query folding is a process by which specific Power BI transformations are executed by the data source instead of by Power BI. Allowing those actions to occur at the source means less data has to be sent to Power BI. Less data means faster data refresh times and smaller report models. Note, not all data sources support query folding, but OData for Project Online and SQL Server for Project Server do.

A sample of these foldable actions includes:

  • Column or row filtering
  • Group by and aggregation
  • Joins
  • Numeric calculations

For example, you need to find out the amount of work scheduled by week for the quarter. You are querying the time phased data from Project Online. If you aren’t careful, you may be retrieving hundreds of thousands of records. Query folding will make a huge difference in speed and the number of records retrieved. If you have the records filtered by Project Online before retrieving them, you may only receive a few thousand records instead.
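A minimal sketch of that source-side filter in M, assuming a hypothetical PWA tenant URL (adjust to your own) and a 2017 Q1 date range; because the row filter directly follows the OData source step, it folds and Project Online returns only the quarter’s rows:

```m
let
    // Hypothetical tenant URL; AssignmentTimephasedDataSet is the time phased feed
    Source = OData.Feed("https://contoso.sharepoint.com/sites/pwa/_api/ProjectData/AssignmentTimephasedDataSet"),
    // This filter folds back to Project Online, so only the quarter's records are retrieved
    QuarterOnly = Table.SelectRows(Source, each
        [TimeByDay] >= #datetime(2017, 1, 1, 0, 0, 0) and
        [TimeByDay] < #datetime(2017, 4, 1, 0, 0, 0))
in
    QuarterOnly
```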

ISSUE #1: NOT REFERENCING THE DATA SOURCE PROPERLY

This issue occurs primarily with OData sources, such as Project Online, Office 365 and other web-based sources. Query folding breaks if you don’t properly reference the OData feed.

In order for query folding to work properly, the transformations that have folding support need to be executed first. If a non-folding transformation is added, no subsequent transformation will be folded. If you don’t reference the OData feed properly, the step that navigates to the OData feed isn’t foldable, blocking all subsequent transformations from being folded.
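One folding-friendly pattern is to reference the entity directly in the feed URL, rather than pointing OData.Feed at the root feed and navigating afterward. A sketch, again with a hypothetical tenant URL:

```m
let
    // Referencing the Projects entity in the feed URL itself keeps the source step foldable
    Projects = OData.Feed("https://contoso.sharepoint.com/sites/pwa/_api/ProjectData/Projects"),
    // This filter can now fold back to the source
    Active = Table.SelectRows(Projects, each [ProjectPercentCompleted] < 100)
in
    Active
```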

ISSUE #2: DOING DYNAMIC DATE SELECTION IMPROPERLY

This issue causes more headaches with Project’s time phased data than any other action. In many cases, you are looking to create a range of dates for Power BI to query. If you use the Between filter in the Power BI Query Editor, there’s no option for Today’s date +/- a number of days. If you use many of the other Date/Time filters, like Is Latest, you still seem to pull back a lot of data.

Solving this particular issue requires getting familiar with the M query language so that you can understand how Power BI Desktop performs specific actions.

For example, let’s look at the Is Latest date selection. By default, your dataset is unsorted, so Power BI creates a list of the values in the selected column and takes the Max value. While this is correct, it could result in a lot of records being retrieved to perform a Max and get one record. The M code over a time phased table looks like this:

= Table.SelectRows(#"Filtered Rows", let latest = List.Max(#"Filtered Rows"[TimeByDay]) in each [TimeByDay] = latest)

To get much better performance, make the following two changes. First, sort the dataset in a descending manner on the column you are using to decide latest. In this example, that’s TimeByDay. Sorting is a foldable action, so the data source will do the work.

Next, change List.Max to List.First. Since the latest date is the first record in the sorted dataset, a lot less data is required to get the answer. The statement is now:

= Table.SelectRows(#"Filtered Rows", let latest = List.First(#"Filtered Rows"[TimeByDay]) in each [TimeByDay] = latest)

In testing, the original way required over 1 MB of data to be retrieved to answer the question. The new way only retrieves 127 KB.

ISSUE #3: URLS ARE TOO LONG

This issue comes into play when working with Project Online and SharePoint based data sources. Project, in particular, has a lot of long field names. SharePoint has a limit of 2048 characters for a URL. If you are manually selecting a lot of field names, you can accidentally go past the 2048 character limit.

What this means is that Power BI receives a URL that can’t be acted upon. The default Power BI behavior is to then simply retrieve the feed without modifiers. If this happens on a source with many rows, this could significantly impact performance.

In this case, break your dataset into two or more queries that each fit within the limitation, then merge them back together in Power BI.
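A sketch of that merge, assuming two previously defined queries named ResourcesPart1 and ResourcesPart2 (hypothetical names), each selecting a subset of the fields so that neither OData URL exceeds the 2048-character limit:

```m
let
    // Join the two partial queries back together on the shared key column
    Merged = Table.NestedJoin(ResourcesPart1, {"ResourceId"}, ResourcesPart2, {"ResourceId"}, "Part2", JoinKind.Inner),
    // Expand the second query's columns back into the combined table
    // (the field names here are illustrative)
    Expanded = Table.ExpandTableColumn(Merged, "Part2", {"ResourceEmailAddress", "ResourceBaseCapacity"})
in
    Expanded
```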

WANT TO KNOW MORE?

Join us for a free Power BI performance clinic on March 22! This session will do a deep dive into these issues as well as show you how to determine when folding is happening (and when it isn’t). Go here to register!

 


See Cortana and Power BI Integration In Action

My Star Trek moment has finally arrived!

Growing up, it always seemed like a pipe dream to be able to ask questions of the computer and have her answer. Microsoft has gotten us the closest to this dream with Cortana and Power BI integration. You can now summon answers from your Power BI dashboards with the power of your voice.

Attached is a very short demo and hopefully you can see this is just the beginning. I’m working on a more extensive demo and will be presenting this functionality next week at SharePointFest Chicago.


How To Use Your Free PowerBI.com license with Microsoft Project Online

You can use the free PowerBI.com license with Microsoft Project Online to do your basic reporting. There are caveats but I’ll show you ways that you can use the free license effectively.

What is Power BI?

Microsoft Power BI is a collection of online and client tools that enable organizations to visualize and explore data, share insights and allow consumption of these insights from multiple platforms.

PowerBI.com is the home site for the service and serves as the portal to see your dashboards. Power BI supports three primary entities:

  • Dashboards – A tile based way of showing you important information in a glance-able and easy to use fashion. Dashboards also support natural language querying, where the viewer can type in a question and have Power BI build a new visualization automatically.
  • Reports – Similar to dashboards in that they are tile based. However, reports are more interactive in that clicking on a bar in a chart filters everything on the report. A given Power BI project can have up to 10 reports.
  • Datasets – These are sets of data that have been transformed into business-usable formats.

Power BI is available on all mobile platforms via mobile applications in the platform app stores.

It’s Free?

Yes! Awesome, isn’t it? When Microsoft last updated the Power BI pricing plans, they came out with a free license tier. To get your license, go to http://PowerBI.com and log in with your Office 365 account. Note, you can sign up with other accounts, but if you want to use this license with Project Online, you have to sign up with the account with which you access Project.

Now What?

You need to download and install Power BI Desktop to create and publish your BI content. Think of Power BI Desktop as “Word for Reporting”. You put it all together in Desktop, but this isn’t where you would typically consume your BI content.

The Approach

Assumptions

For this approach to work, three things have to be true:

  • Your report consumer wants to consume the data but desires accessibility over the ability to customize the views
  • The total amount of data you need to make available is less than 1 GB
  • Automatic daily refresh of data is acceptable

The second point is important as the data stored counts against both the creator of the visualizations and the consumer. The last point is the norm for most Project clients as most information is updated weekly.

Tactics

Publish the Report Pack

  • If you aren’t already signed up for PowerBI.com, create your free account using the same account you access Project Online.
  • If you haven’t already done so, download and install Power BI Desktop.
  • Create your Project visualizations as you normally would.
  • When finished, publish it to PowerBI.com using the Publish button.

Configure Auto-Refresh

  • Open PowerBI.com
  • Hover over the dataset for your report, in this case, Project Report
  • You will see an ellipsis button appear on the right; click it
  • Select Schedule Refresh
  • Expand Data source credentials and enter your Office 365 credentials for Project Online
  • Expand the Schedule Refresh section and set up when you wish the data refresh to happen
  • Click Apply

Share the Report

Content Packs are a way of sharing report packs that allows the report consumer to customize the reports to their liking. However, creating and consuming Organizational Content Packs is a Pro-only feature. You can consume the Microsoft-supplied content packs with the Free license.

This is where the requirement that the end user is a simple consumer comes into play. Free license users can share their dashboards and reports with other free users. Sharing simply makes the dashboard read-only. For those tiles that support drill-through, you can still access the reports.

To Share

  • Select your dashboard in the Dashboards group on the left pane
  • In the upper right corner of the screen, select the Share button
  • Enter either individual emails or preferably, AD groups with which to share the dashboard. These email addresses have to be within your organization.
  • Click Share

By default, everyone who is on the share list will get an email. The Report consumer will now see the shared dashboard under their dashboard section when they log into PowerBI.com. If they are using any of the mobile PowerBI clients, they will see the new shared dashboard there as well.

Please note, if you change the dashboard, the people with which it is shared will see the updates immediately.

Microsoft Project Online Timesheet Reporting using Power BI Desktop

Visualizing Project Time

Project Online supports collaborative work management across your organization. It provides the ability to manage multiple project schedules and allows project team members to enter the time spent on assigned tasks via their timesheet. The timesheet information can then be used for many purposes.

Many Project Management Offices (PMOs) have dynamic information needs that require data interactivity to uncover the sought-after answer. Power BI Desktop is the perfect tool to enable the data exploration process. In this article, we’ll focus on the PMO need to audit timesheet data, exploring who has submitted time and what types of time were submitted. Below is an example of an overall timesheet status dashboard, created in Power BI.

 

For those unfamiliar with Project, timesheet submission is generally a weekly process. The Project Manager publishes the latest project version. At the end of the week, project team members log how much time they spent on assigned tasks and submit their timesheet for approval. Project Managers then review and approve the timesheet submissions. Once this process is complete, status reports and other key data visualizations use this updated timesheet information to support organizational needs.

Figure 1 Weekly Timesheet Process

Success takes the whole village

Getting everyone to submit their timesheet in a timely manner is key to getting full value from time tracking. Thus, auditing timesheet submissions is an integral part of a successful time tracking implementation.

In a perfect world, everyone would submit their timesheet on time. The team member would get value from the data that supports their work. The organization would also get value from having data in a timely manner with minimal capture effort.

The village faces challenges

Many organizations face three time tracking challenges.

First, it’s difficult to get the entire organization to fill out their timesheet in a timely manner. Sometimes, you aren’t in the office on Friday so the timesheet doesn’t get submitted. Some people will simply forget to submit their time. Others may be more passive aggressive in that they view time tracking as a tax with little individual benefit.

Second, Project has no included functionality for easily auditing timesheet submission. If you want to use time tracking, you have to build your own business intelligence views to meet your analytic needs.

Lastly, reporting tools of years past focused on solving a particular business problem. They were not optimized for ad hoc querying to meet immediate and specific data needs. Below is an example of a dashboard for investigating where timesheets have gone missing.

PowerBI Timesheet Investigate Dashboard

 

This article will take you through the process of defining your business need, show you how to leverage a social design approach to design the solution and take you through the steps to address the need in Power BI.

Designing for Business Intelligence

Business intelligence answers the questions that support specific organizational conversations. Experience in BI design indicates that mapping the social interactions first will increase the chances that the designed visualizations will meet the business need.

I use Tumble Road’s Conversation-Centric Design (CCD) approach for designing Business Intelligence solutions. CCD helps you derive the technical design details by starting with the modeling of the targeted social interactions. This process will identify and define three key aspects: the audience, the conversations and the key questions. Once these aspects are defined, you can use these to define the data scenarios, individual data elements, common user terminology, and supporting processes.

Figure 2 Conversation-Centric Design Process

CCD can also be used as a scoping framework. For example, you can choose not to support a conversation and be able to track all aspects impacted by that decision. It also makes it easier to create a project Work Breakdown Structure based on an Audience:Conversation:Key Questions structure, allowing the rollup of task progress to provide the health of each aspect automatically.

Requirements and Design

For this exercise, I’m limiting the design to one primary audience, the Project Management Office, and three key questions for the PMO.

Requirements

User Scenario

The previous week’s timesheet results for projects and administrative time are reviewed in the Monday morning PMO team meeting. Claire, the director, leads the meeting, where she reviews the results with Stephanie and Ian. The data is reviewed by open timesheet period. The outcome of the meeting is to follow up with the individual team managers to complete the process. Claire also uses the data to generate PMO reports to the company vice presidents. Therefore, she wants the data to be as complete as possible to generate the reports.

The organization uses administrative time categories so all timesheet eligible resources are expected to turn in a timesheet weekly even when the person is not currently assigned to a project. The PMO is only concerned with total time charged to a project. Only one time category is generally received, Actual Work Billable, but due to an intern incident earlier in the year, all Actuals will be summed together to ensure all time is captured. The timesheet periods are defined as running from Sunday to Saturday each week.

Jason and Sophie need the timesheet data for their own uses with Finance and HR but are passive observers to the review process.

Figure 3 Requirements in CCD format

Conversation

The business intelligence visualization is needed to support the PMO weekly review of timesheet data at 9AM on Mondays.

Audience

The PMO consists of five people, one director and four analysts. Claire, Stephanie and Ian are direct consumers of the data and are the primary audience of this exercise. Jason and Sophie are indirect consumers of the data. Claire, Stephanie and Ian, therefore, will be the best choices for assessing whether the visualization meets their needs. Stephanie and Ian can also verify the data.

Key Questions

From the scenario above, we can derive the following key questions. NOTE: There are typically more than three but I find three is a great starting point.

  • Have all timesheet eligible resources submitted their timesheet?
  • Have all submitted timesheets been approved?
  • Which teams have outstanding timesheets?

Technical Design

Supporting Data

The supporting data will denote three states for timesheet eligible resources. NOTE: Project refers to people as resources. Therefore, resources and project team members are interchangeable for this article.

  • Have not created a timesheet
  • Submitted a timesheet, but timesheet is still in process
  • Timesheet was submitted and approved

The data set will be aggregated to the weekly level of granularity by resource and project. Only the last 60 days of data will be reviewed.
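As a rough sketch, the 60-day window and weekly aggregation might look like the following in M. The table and column names here (Timephased, ActualWork) are placeholders for illustration, not the article's final queries.

```m
// Sketch: keep the last 60 days, tag each row with its week number,
// then aggregate hours to a weekly grain per resource and project.
let
    Last60  = Table.SelectRows(Timephased, each Date.IsInPreviousNDays([TimeByDay], 60)),
    AddWeek = Table.AddColumn(Last60, "WeekOfYear", each Date.WeekOfYear([TimeByDay]), type number),
    ByWeek  = Table.Group(AddWeek, {"ResourceId", "ProjectId", "WeekOfYear"},
        {{"Hours", each List.Sum([ActualWork]), type number}})
in
    ByWeek
```

The real transformation code later in the article uses the same Date.IsInPreviousNDays, Date.WeekOfYear, and Table.Group pattern.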

Data Sets

The primary data sets necessary to create the requisite visualizations are as follows.

Timesheet eligible resources are defined as work resources in Project that can log into the system, have an email address, and are capable of working in the analyzed time frame.

  • Identify and retrieve timesheet eligible resources using the following conditions
    • Work resources – Resource Type is 2
    • Resource is active in the system – ResourceIsActive = 1
    • Who can log into the system – ResourceNTAccount is not null
    • Are capable of working in the analyzed time frame – Resource Base Capacity is greater than zero.
  • Identify and retrieve the submitted timesheets with associated project information for the last 60 days for timesheet periods that are open.
    • Timesheet Period Start Date is within the last 60 days.
  • Identify all timesheet eligible resources who did not create a timesheet
    • This data will need to be derived using the other two data sets.
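The derivation in the last bullet can be sketched as a left outer join: any expected timesheet with no matching submission comes back with a null status, which is then flagged. Expected and Submitted are placeholder names for the two data sets described above.

```m
// Sketch: flag timesheet eligible resources who never created a timesheet.
// A null TimesheetStatus after the left join means no timesheet record exists.
let
    Joined   = Table.NestedJoin(Expected, {"PeriodId", "ResourceId"},
        Submitted, {"TimesheetPeriodId", "TimesheetOwnerId"}, "TS", JoinKind.LeftOuter),
    Expanded = Table.ExpandTableColumn(Joined, "TS", {"TimesheetStatus"}, {"TimesheetStatus"}),
    Flagged  = Table.ReplaceValue(Expanded, null, "Not Created",
        Replacer.ReplaceValue, {"TimesheetStatus"})
in
    Flagged
```

The Timesheet Analytics query later in the article performs exactly this join and null replacement.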

Visualizations

In order to answer the key questions, a timesheet status field is needed to hold the state. The following visualizations are needed to show the core data.

  • Listing of each resource by timesheet period with the current timesheet submission status
  • Stacked bar chart by timesheet period with a count of timesheets in each timesheet status
    • Provides a visual indication of period level submission completeness.
  • Stacked bar chart by team with a count of timesheets in each timesheet status
    • Provides a visual indication of team level submission completeness.
  • Stacked bar chart by resource with a count of timesheets in each timesheet status
    • Provides a visual indication of resource level submission completeness.
  • Amount of time logged per project per timesheet period
  • Slicers will be required for timesheet period, team and timesheet status.

Data Elements for Visualization

This is the list of fields needed to generate the final visualizations. Other data will be needed in the process to generate the underlying data sets.

Project OData feed | Field | Definition | Common Terminology
Resources | ResourceDepartments | The department to which the resource belongs | Department
Resources | ResourceEmailAddress | Work resource email address | Email
TimesheetLine | PeriodEndDate | End date of timesheet period | End
TimesheetLine | ProjectId | Project ID for linking to other project information | Hidden from user
Resources | ResourceId | Resource ID for linking to other resource information | Hidden from user
TimesheetLine | TimesheetApproverResourceName | The person responsible for approving the person’s timesheet | Manager
Resources | ResourceName | Name of work resource | Name
Calculated | Index | Needed to ensure a unique key | People
TimesheetLine | TimesheetPeriodName | In this example, the period name follows this pattern: 2015-W01. Timesheet periods are 7 days long. | Period
TimesheetLine | TimesheetPeriodStatus | Open or Closed to timesheet submission | Period Status
TimesheetLine | PlannedWork | Amount of time originally planned for the work | Planned
TimesheetLine | ProjectName | Name of the project to which time is billed | Project
TimesheetLine | PeriodStartDate | Start date of timesheet period | Start
TimesheetLine | TimesheetStatus | The process status for a specific timesheet | Status
Resources | TeamName | The team assignment pool to which a resource may be assigned | Team
TimesheetLine | [ActualOvertimeWorkBillable] + [ActualOvertimeWorkNonBillable] + [ActualWorkBillable] + [ActualWorkNonBillable] | Amount of time entered against a project | Time
TimesheetLine | TimesheetPeriodName | The year is derived from the period name (e.g., 2015 from 2015-W01) | Year

Supporting Processes

  • Weekly Time tracking process

Tools

  • Power BI Desktop
  • PowerBI.com

Building the Data

Data Feeds

There are four OData feeds necessary to build the timesheet audit report.

Resources

The resources feed provides the data that is visible via the Resource Center in Project Web App. This feed provides access to all resource custom fields.

ResourceTimephasedDataSet

This feed provides access to resource capacity data at a day level of granularity. The Max Units field on the resource determines the capacity available each day. If Max Units is set in PWA, all days will have the same capacity. Project Pro provides the additional capability to set different Max Units values for specific date ranges.

TimesheetPeriods

This feed provides access to the date ranges per week for which timesheets are expected.

TimesheetLines

This feed provides summary data on the time logged by each project team member.

Transformations

In order to build out the dataset for the visualization, the following transformations are required.

  1. Identify and retrieve timesheet eligible resources using the following conditions
    1. Work resources – Resource Type is 2
    2. Who can log into the system – ResourceNTAccount is not null
    3. The result of this step is the potential list of resources who can submit a timesheet
    4. NOTE: Typically, you could also filter on ResourceIsActive here. Be aware that the value changes if you turn on the new Resource Engagements feature. Feel free to change this in the code based on your individual configuration.
  2. Join the resources with the resource capacity records to determine whether a person worked during a specific week
    1. Are capable of working in the analyzed time frame – Resource Base Capacity is greater than zero.
    2. The result of this step is the actual list of resources expected to submit a timesheet for the selected timesheet periods
  3. Create a record for each timesheet eligible resource in step 2 for each timesheet period within the 60-day range. This is similar to a cross join in SQL.
    1. This is required because no timesheet record is created until the resource accesses the timesheet. Therefore, if a resource never clicks the timesheet link, there’s no way to directly see that the timesheet wasn’t created.
    2. The result of this step is the master list of all timesheets expected for the selected timesheet periods.
  4. Join the submitted timesheets with the master timesheet list from step 3.
    1. The result of this step is the status of all timesheet submissions and non-submissions.


Figure 4 Order in which to perform the data transformations

Power BI Transformation Code

Resources

This M code calls the Resources OData feed and filters the records for only those defined as timesheet eligible. The columns are then limited to only those necessary to support the end visualizations.

let
    Source = OData.Feed("https://YOURSERVERNAME.sharepoint.com/sites/YOURPROJECTWEBAPP/_api/ProjectData"),
    Resources_table = Source{[Name="Resources",Signature="table"]}[Data],
    #"Filter for Work Resources who are Users" = Table.SelectRows(Resources_table, each ([ResourceType] = 2) and ([ResourceIsActive] = 1) and ([ResourceNTAccount] <> null) and ([ResourceEmailAddress] <> null)),
    #"Get core fields" = Table.SelectColumns(#"Filter for Work Resources who are Users",{"ResourceId", "ResourceCreatedDate", "ResourceEmailAddress", "ResourceName", "TeamName", "ResourceDepartments"})
in
    #"Get core fields"

ResourceTimephasedDataSet

One of the requirements was to only expect a timesheet when the resource had available capacity. This requires a look at the data in a timephased manner.

This M code calls the ResourceTimephasedDataSet OData feed and performs the following transformations. Descriptive comments, denoted with //, precede the statements they describe. Each step has also been renamed to provide clues as to its function, making the process somewhat self-documenting.

let
    Source = OData.Feed("https://YOURSERVERNAME.sharepoint.com/sites/YOURPROJECTWEBAPP/_api/ProjectData"),
    ResourceTimephasedDataSet_table = Source{[Name="ResourceTimephasedDataSet",Signature="table"]}[Data],

    // Timephased datasets are usually huge; only get the last 60 days.
    #"Filter for Last 60 Days" = Table.SelectRows(ResourceTimephasedDataSet_table, each Date.IsInPreviousNDays([TimeByDay], 60)),

    // Since we only care about resources who could have worked, we want to
    // examine BaseCapacity for any instance greater than zero.
    // BaseCapacity is calculated using the resource's MaxUnits value.
    #"Filter for BaseCapacity > 0" = Table.SelectRows(#"Filter for Last 60 Days", each [BaseCapacity] > 0),

    // Need to group data by week, so insert a new column for week of year
    #"Inserted Week of Year" = Table.AddColumn(#"Filter for BaseCapacity > 0", "WeekOfYear", each Date.WeekOfYear([TimeByDay]), type number),

    // Once grouped, join with the Resources feed from the previous step
    #"Merge with Resources" = Table.NestedJoin(#"Inserted Week of Year",{"ResourceId"},Resources,{"ResourceId"},"NewColumn",JoinKind.Inner),

    // We need to expose the resource created date so that we can
    // eliminate any capacity prior to the date of resource creation
    #"Expand to see Resource Fields" = Table.ExpandTableColumn(#"Merge with Resources", "NewColumn", {"ResourceCreatedDate"}, {"ResourceCreatedDate"}),

    // Create a new column that uses Value.Compare to flag the nature
    // of the relationship. A value of 1 means TimeByDay > Created Date
    #"Flag Valid Capacity" = Table.AddColumn(#"Expand to see Resource Fields", "ValidCapacity", each Value.Compare([TimeByDay],[ResourceCreatedDate])),

    // Eliminate any rows which don't have a 1, which are all of the rows
    // that exist before the resource was created
    #"Filter for Valid Capacity Only" = Table.SelectRows(#"Flag Valid Capacity", each ([ValidCapacity] = 1)),

    // Group the result set by resource and week number
    #"Group by ResourceID_WeekNumber" = Table.Group(#"Filter for Valid Capacity Only", {"ResourceId", "WeekOfYear"}, {{"Count", each Table.RowCount(_), type number}, {"Capacity", each List.Sum([BaseCapacity]), type number}, {"Start", each List.Min([TimeByDay]), type datetime}})
in
    #"Group by ResourceID_WeekNumber"

Timesheet Lines

This M code retrieves the time logged against the individual projects.

let
    Source = OData.Feed("https://YOURSERVERNAME.sharepoint.com/sites/YOURPROJECTWEBAPP/_api/ProjectData"),
    TimesheetLines_table = Source{[Name="TimesheetLines",Signature="table"]}[Data],
    #"Removed Other Columns" = Table.SelectColumns(TimesheetLines_table,{"TimesheetLineId", "ActualOvertimeWorkBillable", "ActualOvertimeWorkNonBillable", "ActualWorkBillable", "ActualWorkNonBillable", "CreatedDate", "PeriodEndDate", "PeriodStartDate", "PlannedWork", "ProjectId", "ProjectName", "TimesheetApproverResourceId", "TimesheetApproverResourceName", "TimesheetName", "TimesheetOwner", "TimesheetOwnerId", "TimesheetPeriodId", "TimesheetPeriodName", "TimesheetPeriodStatus", "TimesheetPeriodStatusId", "TimesheetStatus", "TimesheetStatusId"}),

    // Only get the last 60 days of timesheet lines
    #"Filtered Rows" = Table.SelectRows(#"Removed Other Columns", each Date.IsInPreviousNDays([PeriodStartDate], 60))
in
    #"Filtered Rows"

Timesheet Periods (Renamed to Timesheet Analytics)

This becomes the primary dataset for the user-facing visualizations. Everything is predicated on the timesheet period, so it makes sense to orchestrate the rest of the data around this feed.

let
    Source = OData.Feed("https://YOURSERVERNAME.sharepoint.com/sites/YOURPROJECTWEBAPP/_api/ProjectData"),
    TimesheetPeriods_table = Source{[Name="TimesheetPeriods",Signature="table"]}[Data],

    // Filter records for the last 60 days
    #"Filter for last 60 days" = Table.SelectRows(TimesheetPeriods_table, each Date.IsInPreviousNDays([StartDate], 60)),

    // Cross join of timesheet periods and timesheet eligible resources
    // creates a master list of all expected timesheets
    #"Added Resource for every Period" = Table.AddColumn(#"Filter for last 60 days", "AllResources", each #"Resources"),

    // Show all relevant resource fields
    #"Expand to show Resource Fields" = Table.ExpandTableColumn(#"Added Resource for every Period", "AllResources", {"ResourceId", "ResourceCreatedDate", "ResourceEmailAddress", "ResourceName", "TeamName", "ResourceDepartments"}, {"ResourceId", "ResourceCreatedDate", "ResourceEmailAddress", "ResourceName", "TeamName", "ResourceDepartments"}),

    // Left join to match submitted timesheets with the
    // master list of expected timesheets
    #"Merge with TimesheetLines" = Table.NestedJoin(#"Expand to show Resource Fields",{"PeriodId", "ResourceId"},TimesheetLines,{"TimesheetPeriodId", "TimesheetOwnerId"},"NewColumn",JoinKind.LeftOuter),

    // Show the timesheet status
    #"Show Timesheet Status" = Table.ExpandTableColumn(#"Merge with TimesheetLines", "NewColumn", {"ActualOvertimeWorkBillable", "ActualOvertimeWorkNonBillable", "ActualWorkBillable", "ActualWorkNonBillable", "PlannedWork", "ProjectId", "ProjectName", "TimesheetApproverResourceId", "TimesheetApproverResourceName", "TimesheetStatus"}, {"ActualOvertimeWorkBillable", "ActualOvertimeWorkNonBillable", "ActualWorkBillable", "ActualWorkNonBillable", "PlannedWork", "ProjectId", "ProjectName", "TimesheetApproverResourceId", "TimesheetApproverResourceName", "TimesheetStatus"}),

    // If the timesheet status is null, there are no matching timesheet lines;
    // therefore, the timesheet must not have been created
    #"Replace Timesheet Status null with Not Created" = Table.ReplaceValue(#"Show Timesheet Status",null,"Not Created",Replacer.ReplaceValue,{"TimesheetStatus"}),

    // Join with resources that were working during this time
    #"Merge with ResourceTimephasedData" = Table.NestedJoin(#"Replace Timesheet Status null with Not Created",{"ResourceId"},ResourceTimephasedDataSet,{"ResourceId"},"NewColumn",JoinKind.Inner),

    // Per requirement, combine all actuals columns into one
    #"Total up all actuals" = Table.AddColumn(#"Merge with ResourceTimephasedData", "Time", each [ActualOvertimeWorkBillable]+[ActualOvertimeWorkNonBillable]+[ActualWorkBillable]+[ActualWorkNonBillable]),

    // For missing timesheets, replace nulls with zero
    #"Replace Time nulls with zeros" = Table.ReplaceValue(#"Total up all actuals",null,0,Replacer.ReplaceValue,{"Time"}),

    // Rename columns to align with user terminology
    #"Renamed Columns" = Table.RenameColumns(#"Replace Time nulls with zeros",{{"Description", "Period Status"}, {"EndDate", "End"}, {"PeriodName", "Period"}, {"StartDate", "Start"}, {"ResourceEmailAddress", "Email"}, {"ResourceName", "Name"}, {"TeamName", "Team"}, {"ResourceDepartments", "Department"}, {"PlannedWork", "Planned"}, {"ProjectName", "Project"}, {"TimesheetApproverResourceName", "Manager"}, {"TimesheetStatus", "Status"}}),

    // Narrow the data set down to only the fields required
    #"Removed Other Columns" = Table.SelectColumns(#"Renamed Columns",{"Period Status", "End", "Period", "Start", "Email", "Name", "Team", "Department", "Planned", "Project", "Manager", "Status", "Time"}),

    // Change dates to show only the date, not the datetime
    #"Changed dates to only dates" = Table.TransformColumnTypes(#"Removed Other Columns",{{"End", type date}, {"Start", type date}}),

    // Add a unique index that will provide additional
    // functionality in Q&A
    #"Added Index" = Table.AddIndexColumn(#"Changed dates to only dates", "Index", 1, 1),

    // Rename the index to use the People term
    #"Renamed Index for Q and A support" = Table.RenameColumns(#"Added Index",{{"Index", "People"}})
in
    #"Renamed Index for Q and A support"

Want More? Let us teach you Power BI.

‘); // ]]>