
PPM - Using Program Predictability Measure report



Prerequisites

Ensure that you meet the following prerequisites:

 

Reading the chart

Your Program Predictability Measure Gadget displays the following information:



Selecting data to display on the chart

The interface enables you to customize your report view by showing and hiding the data depending on your current focus. For instance, if you are focused on sprint estimates (commitments) and the estimation error, you can hide the sprint actuals by clicking the name in the chart legend as shown below. To display hidden data again, click its name in the legend again.


Tooltips

To get instant insight into what certain parts of the chart mean and how much work was delivered at a given point in time, hover over the chart. The tooltip shows details for the selected point on the report.


Zoom feature

If too many sprints are displayed on the report, the data might become difficult to read (see the screenshot), so you can zoom in to see only a few sprints of interest. To zoom in, hover over the chart area and use the mouse wheel or your trackpad. You can also pan across the chart by clicking and dragging on a section of it. To zoom out, use your mouse wheel or click the Refresh button.


Program Predictability Measure details in tabs

When working with the report, you might want more insight than the chart view alone provides. For this purpose, detailed information, insights, and warnings are shown in a table view below the chart.

The report insights are organized into the following tabs:

Summary tab

Here you can view information about planned and delivered work measured in the selected metrics, the average sprint length and average velocity, and warnings in case the actual progress is behind the plan. Note: if the source data is not available, the panel is hidden.

Stats and their descriptions:

Total BV committed
The sum of all planned BV for all PIs in the report. The sum of all planned BV in the PIs tab is equal to the Total BV committed in the Summary tab, which means that a feature planned in several PIs can be counted several times in the Total BV committed (see the worked example below).

Total BV delivered
The sum of all delivered BV for all PIs within the reporting period. If a feature was planned in several PIs but got delivered in the last one, it is counted only for the PI where it was delivered.

Lower point
  • Field type: single-select dropdown
  • Possible values: 10% - 100% in 10% increments
  • Default value: 80%
  • Dependencies: none
  • Error handling: none
  • Tooltip: Define a lower point where a delivered Business Value percentage should be considered to be still within the norm. The default is 80%.

This setting defines the static area on the chart, shown in green.

% of BV delivered
Total BV delivered divided by total BV committed for the period of the report.

Total features
The sum of all features examined for the report, including those that were not delivered.


These statistics can help you plan future sprints. For example, you can use the average estimated or average accomplished value as the ideal team velocity and plan sprints with it in mind.
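
For reference, here is a minimal worked example, assuming hypothetical PI names and BV numbers, of how the Summary stats combine into % of BV delivered and how the result compares against the Lower point setting. This is an illustration, not the gadget's actual code.

# Hypothetical worked example (not the gadget's implementation): how the
# Summary stats combine. PI names and BV numbers are made up.
pis = [
    {"name": "PI 1", "committed_bv": 50, "delivered_bv": 45},
    {"name": "PI 2", "committed_bv": 60, "delivered_bv": 40},
]

total_bv_committed = sum(pi["committed_bv"] for pi in pis)         # 110
total_bv_delivered = sum(pi["delivered_bv"] for pi in pis)         # 85
percent_delivered = 100 * total_bv_delivered / total_bv_committed  # ~77%

lower_point = 80  # the "Lower point" setting, default 80%
within_norm = percent_delivered >= lower_point                     # False here
print(f"% of BV delivered: {percent_delivered:.0f}% (within the norm: {within_norm})")

Note that a feature planned in several PIs contributes to Total BV committed once per PI, but to Total BV delivered only in the PI where it was actually delivered.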

PIs tab

The PIs tab contains details on the PIs that are included in the report. The PI names are clickable to make navigation faster for you.

The tab has the following columns (top to bottom in this list corresponds to left to right in the table):

  • PI NAME - up to 5 PIs are shown; there is no pagination
  • COMMITTED BV
  • DELIVERED BV
  • % DELIVERED
  • START DATE
  • END DATE

Other details

  • The tab name also shows the number of PIs in the report (for example, 4).
  • Columns are sortable. The default sorting is by PI end date, from the latest at the top to the oldest at the bottom.
  • Elements inside the columns are left-aligned.
  • Percentages are shown without decimals.
  • Dates use the MM/DD/YYYY format (see the sketch below for how these rules come together).
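
As a rough sketch of how these presentation rules could be applied to a PIs tab row, the snippet below sorts hypothetical PIs by end date (latest first), rounds percentages to whole numbers, and formats dates as MM/DD/YYYY. The sample data and field names are made up for illustration.

from datetime import date

pis = [
    {"name": "PI 1", "committed_bv": 50, "delivered_bv": 45,
     "start": date(2021, 1, 4), "end": date(2021, 3, 26)},
    {"name": "PI 2", "committed_bv": 60, "delivered_bv": 40,
     "start": date(2021, 3, 29), "end": date(2021, 6, 18)},
]

# Default sorting: by PI end date, latest on top.
for pi in sorted(pis, key=lambda p: p["end"], reverse=True):
    percent = round(100 * pi["delivered_bv"] / pi["committed_bv"])  # no decimals
    print(pi["name"], pi["committed_bv"], pi["delivered_bv"], f"{percent}%",
          pi["start"].strftime("%m/%d/%Y"), pi["end"].strftime("%m/%d/%Y"))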

Warnings tab

Warnings are generated to help you identify cases when the chart's data might not show the actual picture, for instance, when an issue is not estimated or when a ticket is resolved outside of the sprint and is not marked as a duplicate or clone. The tickets in the Warnings tab are not included in the chart calculations, so to ensure the chart's accuracy, click the issue in the warning and take action, for instance, fill out the necessary fields in that ticket.



  1. Feature wasn't marked as done
    • Feature completion is indicated by a workflow status whose status category is Done (green).
    • Text: "Feature wasn't marked as done; report data might be incorrect."
  2. Feature contains tickets that are not estimated (the Original Estimate field is empty)
    • This warning is shown both for completed features and for those that are planned but not completed yet. This can be helpful for PI planning to indicate that some stories have yet to be planned and that the overall feature estimate may not be accurate.
    • Text: "Feature contains unestimated tickets; feature estimate might be inaccurate."
  3. Feature doesn't have BV indicated (the BV field is empty)
    • This warning is shown both for completed features and for those that are planned but not completed yet.
    • Text: "Feature doesn't have BV indicated; report data might be incorrect."
  4. Any of the children of the Feature are tagged to a different PI
    • When the chart is built and shown, the tickets tied to the feature / capability / portfolio epic are checked. If at least one of these tickets is linked to a PI (fixVersion) that is different from the PI the feature / capability / portfolio epic is linked to, a warning is displayed (see the sketch after this list). The warning text differs per issue type:
      • for a feature (renamed epic): "Check tickets for this Feature, at least one ticket is linked to another PI."
      • for a capability: "Check tickets for this Capability, at least one ticket is linked to another PI."
      • for a portfolio epic: "Check tickets for this Portfolio Epic, at least one ticket is linked to another PI."
    • If the Feature is committed in a PI, all child stories should also be completed in that PI. The feature should not be marked complete/done until all children are done, and if some children are not done until a later PI, that Feature should only be credited as "complete" in the later PI.
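
Below is a hedged sketch of the warning conditions listed above, expressed in Python. The field names (status_category, original_estimate, business_value, fix_version, children, issue_type) are assumptions made for the example and are not the gadget's actual data model.

def feature_warnings(feature):
    """Return the warning texts that apply to a feature-level issue."""
    warnings = []
    # 1. Feature wasn't marked as done (status category is not Done/green).
    if feature["status_category"] != "Done":
        warnings.append("Feature wasn't marked as done; report data might be incorrect.")
    # 2. At least one child ticket has an empty Original Estimate.
    if any(child.get("original_estimate") is None for child in feature["children"]):
        warnings.append("Feature contains unestimated tickets; feature estimate might be inaccurate.")
    # 3. The BV field is empty.
    if feature.get("business_value") is None:
        warnings.append("Feature doesn't have BV indicated; report data might be incorrect.")
    # 4. A child ticket is linked to a different PI (fixVersion) than the feature.
    if any(child.get("fix_version") != feature["fix_version"] for child in feature["children"]):
        # Issue type determines the wording: Feature, Capability, or Portfolio Epic.
        warnings.append(f"Check tickets for this {feature['issue_type']}, "
                        "at least one ticket is linked to another PI.")
    return warnings

For example, a done feature whose BV field is empty, but whose children are all estimated and linked to the same PI, would produce only the third warning.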

Other details:

  • The tab name also shows the number of warnings in the report (for example, 7).
  • The FEATURE column is sortable. The default sorting is by issue key in ascending order, e.g. ABC-123, then KLM-354, then ZYX-867.
  • Features are clickable and open the corresponding tickets, in the same way as in the Sprints tab of other gadgets.
  • Once you resolve a warning and come back to the report, the warning goes away. Conditions are checked and data is refreshed according to the same logic as in other gadgets.
  • A maximum of 5 warnings is shown per page. If there are 6 or more, pagination appears, as in other AR gadgets.


Exporting report data

To save the chart as an image, right-click the chart area and select Save image. You can also export the statistics and the PI details into an XLS file by clicking the Export button.

Other tips and tricks

To get meaningful data on the gadget, check the following items:

  • You have at least one completed sprint in the Scrum board that you filter by.
  • You check the Warnings tab from time to time and triage the tickets listed there. The tickets shown in the Warnings tab are not included in the chart calculations, so to see an accurate picture on the chart, you might want to triage those tickets first.

See also

BD - Using Burndown report
