First you need to add this report to your dashboard. To do that, click Add new gadget. In the Add new gadget dialog, click Load all gadgets. Search for the Program Predictability Measure report and add it to your dashboard.
For more details about adding a gadget to your dashboard, see the Atlassian documentation.
Configuring the report
To configure the report, fill out the fields in its settings (see fields description below).
Terminology
PI — program increment
This is the Fix Version field in Jira. In most cases the PI will equal three months, but this should not affect how the report is built.
BV — business value
BV is entered in the BV field on one of the following issue types: Feature (a renamed Epic), Capability, or Portfolio Epic. All of these are issue types that consultants configure; they are not provided by a third-party app or by the gadget. The gadget should be configurable so that a user can select which issue type to use for the report. For example, all teams inside a particular Jira instance use the same BV field, but some teams use it on the Capability issue type and some on the Feature issue type. In the gadget settings, we select which issue type it should look for.
The gadget should only count BV as delivered when a ticket of the selected issue type (Feature, Capability, or Portfolio Epic) is marked as Done. Until then we assume the work is planned (counted in the planned BV) but not delivered yet (not counted in the delivered BV). For example, while a Feature should be completed within a single PI, it may remain incomplete and get re-planned into the next PI; in that case the Feature should only show as delivered in the PI in which it is completed. Its committed BV counts for both the first PI and the second, but its delivered BV counts only in the second PI.
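This rule can be sketched as follows. The dictionary keys (`bv`, `planned_in`, `done_in`) are illustrative stand-ins for the real Jira fields, not actual field IDs; the sketch only shows how committed and delivered totals diverge for a re-planned Feature.

```python
# Sketch of the committed vs. delivered BV rule described above.
# Field names (bv, planned_in, done_in) are illustrative, not real Jira field IDs.
from collections import defaultdict

def bv_per_pi(issues):
    """Aggregate committed and delivered BV per program increment (PI).

    Each issue is a dict like:
      {"key": "FEAT-1", "bv": 8, "planned_in": ["PI-1", "PI-2"], "done_in": "PI-2"}
    where planned_in lists every PI the issue was committed to, and
    done_in is the PI in which it reached Done (None if not Done yet).
    """
    committed = defaultdict(int)
    delivered = defaultdict(int)
    for issue in issues:
        for pi in issue["planned_in"]:
            committed[pi] += issue["bv"]   # counted in every PI it was planned for
        if issue["done_in"] is not None:
            delivered[issue["done_in"]] += issue["bv"]  # counted only where completed
    return committed, delivered

issues = [
    # A Feature re-planned from PI-1 into PI-2 and completed there:
    {"key": "FEAT-1", "bv": 8, "planned_in": ["PI-1", "PI-2"], "done_in": "PI-2"},
    {"key": "FEAT-2", "bv": 5, "planned_in": ["PI-1"], "done_in": "PI-1"},
]
committed, delivered = bv_per_pi(issues)
# FEAT-1's BV counts as committed in both PIs but as delivered only in PI-2.
```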
ART - Agile Release Train
An ART is conceptually a team of Delivery Teams (DTs). It can include up to 120 people (the maximum per SAFe's recommendations; if each team has 5 people, that would be 24 Delivery Teams), broken down into Scrum, Kanban, or XP teams of ideally 5 to 7 individuals. Every DT follows the exact same cadence within the PI; for example, the entire ART runs the same two-week sprints, all starting and stopping on the same dates. Every two weeks some or even all of the DTs deliver software to their ART. Once included on the ART, further testing and related activities take place.
PPM - field descriptions
Field | Description |
---|---|
Program (ART) | Select the Agile Release Train (ART) for which you want to view the report. A prerequisite is to have an Agile Release Train custom field created in Jira; this Jira field corresponds to the Program (ART) field in the report. For example, if ART1, ART2, and ART3 are configured for this field in Jira and you select "ART2" here, the report shows only tickets that have "ART2" in the Agile Release Train custom field, subject to the conditions set in the other settings. |
Issue type | Select the issue type where the business value is entered. These issue types are configured by consultants; they are not provided by a third-party app. The gadget should be configurable so that a user can select which issue type is used for capturing the BV. The field should display only the issue types that are set up in Jira; for example, if Feature and Capability exist but Portfolio Epic does not, the list in this field should display those two options only. |
Start reporting at | Select a completed program increment (PI) from which you want to start the reporting. The latest five PIs are available. |
Lower point | Define the lower bound at which the delivered business value percentage is still considered within the norm. The default is 80%. |
Refresh Interval | The interval at which you want the gadget to be updated. |
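Putting the settings together, a plausible per-PI check is to divide delivered BV by committed BV and compare the result against the lower point. This is a sketch under that assumption; the function name and the exact rounding are illustrative, and the 80% default comes from the Lower point setting above.

```python
def predictability(committed_bv, delivered_bv, lower_point=80.0):
    """Return the delivered-BV percentage for a PI and whether it is within the norm.

    lower_point is the threshold from the gadget settings (default 80%).
    """
    if committed_bv == 0:
        return 0.0, False  # nothing was committed, so nothing to measure
    pct = delivered_bv / committed_bv * 100
    return pct, pct >= lower_point

pct, ok = predictability(committed_bv=13, delivered_bv=8)
# 8 of 13 committed BV delivered is roughly 61.5%, below the 80% lower point
```

A PI whose delivered percentage falls below the lower point would be flagged as outside the norm on the chart.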
Tips for the Program Predictability Measure report
To get meaningful data on the report, check the following items:
- You have at least one completed sprint to filter by on the Scrum board.
- You check the Warnings tab from time to time and triage the tickets there. Tickets shown in the Warnings tab are not included in the chart calculations, so to see an accurate picture on the chart, you might want to triage those tickets first.