How to Avoid API Rate Limiting
Every system has its limits. At 7pace, there are limits on the number of API requests that you and your team are allowed to make, which ensures quality of service for all users. Please see API Request Limits and Lifetime Policies for more information. This article contains techniques that may help you interact with 7pace's API less frequently and more effectively.
Incremental refresh
Incremental refresh is a great way to decrease the load on the API. Instead of repeatedly polling the entire history of data, only the data created, changed, or deleted during a specified period is requested (refreshed) from the API. Data created, changed, or deleted before this period stays stored in the dataset and is no longer requested from the API, and data older than a specified archive period is removed from the dataset altogether. The archive period is usually set to years, while the refresh period is set to days or single months (more on this later).
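As an illustration only (the function and parameter names below are hypothetical, not part of the 7pace API or Power BI), the split between the archive period and the refresh period can be sketched as:

```python
from datetime import date, timedelta

def refresh_windows(today, archive_days, refresh_days):
    """Compute the incremental-refresh boundaries.

    Rows older than archive_cutoff are dropped from the dataset,
    rows edited on or after refresh_start are re-requested from the
    API, and everything in between is kept without any API call.
    """
    archive_cutoff = today - timedelta(days=archive_days)
    refresh_start = today - timedelta(days=refresh_days)
    return archive_cutoff, refresh_start

# Example: keep two years of data, re-fetch only the last 7 days.
cutoff, start = refresh_windows(date(2022, 6, 9), archive_days=730, refresh_days=7)
```

Only the narrow slice between `start` and today hits the API, which is what makes this technique effective against rate limits.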
How do you set this up with the 7pace Reporting API? Use the EditedTimestamp attribute of worklogs and only query data where EditedTimestamp falls within the specified period. The EditedTimestamp attribute is updated whenever a worklog is created, updated, or deleted. Make sure you use API version 3.3 or higher and use the endpoints ending with WithDeleted so that deleted worklogs appear in the responses.
Incremental refresh can be implemented manually, e.g., by using two reports (one for historical data, one for fresh data). The easiest way, though, is to use native Power BI functionality. We recommend reading the official Microsoft documentation first.
Incremental refresh via Power BI
This section describes how to connect Power BI to the 7pace data source and set up incremental refresh.
Prerequisites and recommendations
You need Power BI Desktop together with a Power BI Pro license or higher. The data model and incremental refresh are set up in Power BI Desktop, but incremental refresh only works once the model is published to the Power BI service. See the Power BI documentation for up-to-date information.
An OData feed data source is more suitable for incremental refresh because it supports automatic pagination (via the @odata.nextLink attribute in the response metadata). Therefore, we recommend using the 7pace Reporting API. For best results, use version 3.3 or higher because it can also provide information about deleted worklogs. Why does this help? If you process information about deleted worklogs, you can select shorter refresh periods because you no longer depend on locking past weeks for editing; you can simply get all changes, no matter how old the edited or removed worklogs are.
If you decide to connect the data source as regular JSON content in Power BI (New Source > Web), you must implement the pagination logic yourself. This will most likely prevent query folding, which is required for incremental refresh to work (more on this later).
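If you do end up implementing pagination yourself (for example, in a script outside Power BI), the @odata.nextLink loop looks roughly like this sketch. Here `fetch_page` is a stand-in for your own HTTP call returning one decoded JSON page; it is not a 7pace-provided function:

```python
def fetch_all(url, fetch_page):
    """Follow OData pagination: keep requesting the @odata.nextLink
    URL returned in each response until no further link is present.

    fetch_page(url) must return the decoded JSON body of one page.
    """
    rows = []
    while url:
        page = fetch_page(url)
        rows.extend(page.get("value", []))
        url = page.get("@odata.nextLink")  # absent on the last page
    return rows

# Stub pages standing in for real API responses:
pages = {
    "p1": {"value": [1, 2], "@odata.nextLink": "p2"},
    "p2": {"value": [3]},
}
result = fetch_all("p1", pages.__getitem__)
```

The OData feed connector does this loop for you, which is exactly why it folds queries more reliably than a hand-rolled Web source.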
Process
For incremental refresh to work, you need to:
1. Set up a data source that obtains data from the 7pace API for a certain date/time interval.
2. Set up incremental refresh parameters for this data source.
3. Publish the data source to the Power BI service and schedule its regular refresh interval.
Set up data source
First, connect Power BI to the 7pace OData data source (see our 7pace documentation for a step-by-step guide). For optimal performance, it is strongly recommended to use API v3.3 or higher and to use the endpoints that also provide deleted worklogs (they end with the WithDeleted suffix).
Once the data source is connected, you must set up the RangeStart and RangeEnd parameters and update the Power Query accordingly. This tells Power BI how to filter records from the specified refresh period.
Edit the query, e.g. via the context menu:
Add the RangeStart and RangeEnd parameters using the official Microsoft configuration guide. The parameters must be of the Date/Time type; otherwise, incremental refresh will not work. For the initial values, set RangeStart in the past (i.e., years back) and RangeEnd to the current date.
Next, choose the field that RangeStart and RangeEnd will filter on when obtaining data from the API. The field must contain a date and time. We recommend using the EditedTimestamp of a worklog; even if a user creates a worklog in the future or changes an old record, EditedTimestamp contains the system date of the moment when the change took place.
The OData source, however, stores timestamps in the DateTimeZone format, while RangeStart and RangeEnd are Date/Time, so you must create a function that converts DateTime to DateTimeZone. Create a blank query, call it DateKey, and paste the following code into its body:

= (x as datetime) => DateTimeZone.From(x)
The result looks like this:

As the final step, apply the filtering by RangeStart and RangeEnd to the field chosen in the previous step, with the DateKey conversion function applied to the parameters. An easy way to do this is to first filter the field with some arbitrary value:

Next, update the transformation step so that it works with the RangeStart and RangeEnd parameters with the DateKey function applied.
Make sure the interval is closed on one side (that is, >=) and open on the other side (that is, <); otherwise, the data in the report can get duplicated.
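The effect of half-open intervals can be checked with a small sketch (illustrative code, not Power Query): windows of the form [start, end) tile the timeline without overlap, so a record edited exactly on a boundary lands in exactly one refresh partition:

```python
from datetime import datetime

def in_window(ts, start, end):
    """Half-open membership test: start is included, end is excluded."""
    return start <= ts < end

# A worklog edited exactly at midnight on June 5 ...
boundary = datetime(2022, 6, 5)
w1 = (datetime(2022, 6, 1), datetime(2022, 6, 5))   # first partition
w2 = (datetime(2022, 6, 5), datetime(2022, 6, 9))   # second partition
# ... falls into exactly one partition, so no duplicates:
hits = sum(in_window(boundary, s, e) for (s, e) in (w1, w2))
```

With closed intervals on both sides, the boundary record would match both partitions and appear twice in the report.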
The final Power Query must look like this:
let
Source = OData.Feed("https://{replaceByYourURL}.timehub.7pace.com/api/odata/v3.3-beta", null, [Implementation="2.0"]),
workLogsWorkItems_table = Source{[Name="workLogsWorkItemsWithDeleted",Signature="table"]}[Data],
#"Filtered Rows" = Table.SelectRows(workLogsWorkItems_table, each [EditedTimestamp] >= DateKey(RangeStart) and [EditedTimestamp] < DateKey(RangeEnd))
in
#"Filtered Rows"
Further transformations, if needed, can be done after the previously mentioned steps, but only transformations that allow the query to be folded should be applied.
Now that the filtering is set, check that Power BI translates the Power Query into the API request properly. In other words, the filters must be applied in the API calls directly, instead of Power BI downloading all the data first and applying the filters afterwards.
Incorrect:
GET https://{sampleURL}.timehub.7pace.com/api/odata/v3.3-beta/workLogsWorkItemsWithDeleted
Correct:
GET https://{sampleURL}.timehub.7pace.com/api/odata/v3.3-beta/workLogsWorkItemsWithDeleted?$filter=EditedTimestamp ge 2022-06-01T00:00:00%2B02:00 and EditedTimestamp lt 2022-06-09T00:00:00%2B02:00
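Note the %2B in the correct request: the + in the timezone offset must be percent-encoded, or the server would read it as a space. A hypothetical sketch of building such a filtered URL (Power BI does this for you when folding works; this is not 7pace-provided code):

```python
from urllib.parse import quote

def build_filter_url(base, start_iso, end_iso):
    """Build an OData $filter on EditedTimestamp; the timestamps are
    percent-encoded so that '+' in a timezone offset survives transit."""
    flt = "EditedTimestamp ge {} and EditedTimestamp lt {}".format(
        quote(start_iso, safe=":-"), quote(end_iso, safe=":-"))
    return "{}?$filter={}".format(base, flt)

url = build_filter_url(
    "https://example.timehub.7pace.com/api/odata/v3.3-beta/workLogsWorkItemsWithDeleted",
    "2022-06-01T00:00:00+02:00",
    "2022-06-09T00:00:00+02:00",
)
```

If your own tooling ever sends the raw `+`, the filter silently matches nothing near the boundary, which looks like missing worklogs.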
One option to verify this is to use a tool for capturing and debugging network traffic (Wireshark, Fiddler); alternatively, use the Power BI built-in diagnostic tools as demonstrated below:
In the Power Query editor, choose Tools > Start Diagnostics.
Refresh the query.
In Tools, select Stop Diagnostics and check the Detailed results of the Query section.
Filter only those rows that have the Data Source Query column filled in and check the requests. In the example below, query folding works correctly and the EditedTimestamp filter is used in the API call directly.
Troubleshooting - start with the base URL
What if you still cannot see the filters applied in the URL? The issue may lie in how you set up the data source in the first place. For Power BI query folding to work, you must start with the base URL (e.g., https://{sampleURL}.timehub.7pace.com/api/odata/v3.3-beta) and then choose the endpoint you wish to use as a subsequent step in Power BI.
Incorrect URL when setting up data source:
https://{sampleURL}.timehub.7pace.com/api/odata/v3.3-beta/workLogsWorkItemsWithDeleted
Correct URL when setting up data source:
https://{sampleURL}.timehub.7pace.com/api/odata/v3.3-beta
The selection of a specific endpoint is represented by the second row of the Power Query example (the one starting with workLogsWorkItems_table).
let
Source = OData.Feed("https://{replaceByYourURL}.timehub.7pace.com/api/odata/v3.3-beta", null, [Implementation="2.0"]),
workLogsWorkItems_table = Source{[Name="workLogsWorkItemsWithDeleted",Signature="table"]}[Data],
#"Filtered Rows" = Table.SelectRows(workLogsWorkItems_table, each [EditedTimestamp] >= DateKey(RangeStart) and [EditedTimestamp] < DateKey(RangeEnd))
in
#"Filtered Rows"
Set up incremental refresh
After making sure that Power BI is calling the correct API queries, go ahead and set up incremental refresh of the data source.
The settings of the incremental refresh attributes can vary per organization, depending on how much past data your reports need (months or years), whether you process deleted worklogs, and the frequency of your reporting.
An example of the setting can be:
The archive period is set to two years, which means that data up to two years old is part of the dataset. Older data is no longer kept in the dataset in Power BI (which means it will not appear in your reports, but it still remains in the 7pace database).
Refresh period:
Recommended: set to one week if you fetch all changes including deleted worklogs (i.e., you are using an endpoint ending with WithDeleted). This way, Power BI only fetches the worklog changes (creation, modification, deletion) made during the past 7 days, while the rest of the data stays untouched.
Set to one month if you do not process deleted worklogs, so that you have more room to catch changes. A one-month refresh period means that data created, deleted, or updated within the past month is reflected in your report. This also includes updates of records that are much older than one month, because EditedTimestamp is replaced with the current date in that case. Only deletions of records older than the refresh period are not reflected.
Keeping the refresh period short makes the responses faster, because the API returns less data.
The warning about query folding does not necessarily mean that query folding will not work; it just means that Power BI was not able to verify the query. If Power BI calls the URLs with the correct filters, as described in the previous section, and if a refresh can be scheduled after the dataset is published to the Power BI service (see below), the warning can be ignored.
Will this create duplicate records?
Yes. If somebody updates a worklog older than your refresh period, a refresh fetches a new record alongside the original record created or updated before the refresh period. It is, therefore, recommended to set up a follow-up Power Query that keeps only the record with the latest EditedTimestamp for each id. Removing duplicates is not in the scope of this tutorial, but you can watch this video as a guideline.
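In Power BI this deduplication would be a follow-up Power Query step; as a language-neutral sketch (the field names `id` and `EditedTimestamp` match the API, everything else is illustrative), keeping only the latest edit per worklog looks like:

```python
def latest_per_id(rows):
    """Keep, for each worklog id, only the row with the newest
    EditedTimestamp; later edits win over earlier ones."""
    latest = {}
    for row in rows:
        kept = latest.get(row["id"])
        if kept is None or row["EditedTimestamp"] > kept["EditedTimestamp"]:
            latest[row["id"]] = row
    return list(latest.values())

rows = [
    {"id": 1, "EditedTimestamp": "2022-06-01T10:00:00Z", "hours": 2},
    {"id": 1, "EditedTimestamp": "2022-06-08T09:00:00Z", "hours": 3},  # later edit wins
    {"id": 2, "EditedTimestamp": "2022-06-02T12:00:00Z", "hours": 1},
]
deduped = latest_per_id(rows)
```

The equivalent Power Query approach is to sort by EditedTimestamp descending and remove duplicates on the id column.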
Publish to a service and schedule a regular refresh
With incremental refresh set up, the dataset must be published to the Power BI service. If any changes occur in Power BI Desktop afterwards, the data model must be published again for the changes to take effect in the service. Click Publish.
In the Power BI cloud, navigate to your dataset and refresh the data manually for the first time. You may need to enter credentials (this is described at the bottom of the article).
After the first refresh is done, you can schedule a regular refresh of the dataset.
Troubleshooting
When experimenting with incremental refresh ourselves, we ran into several potential issues. Use the following information to troubleshoot or avoid them altogether:
The refresh fails or asks for credentials if Power BI does not know the given API yet. Try updating the credentials in the data source settings and try again.
Datasets with dynamic data sources cannot be refreshed from the service (although they seem to work fine in Power BI Desktop). You have to rewrite your Power Query:
If using OData.Feed in your Power Query, as is recommended for the 7pace API, remember that you must obtain the feed via the base URL (e.g., https://{sampleURL}.timehub.7pace.com/api/odata/v3.3) without parameters and subsequently filter the results using Table.SelectRows, as described in this post. Go through the guide thoroughly to avoid this error.
If using Web.Contents in your Power Query, see this blog post for more details. However, you can encounter the dynamic data source issue when implementing pagination anyway.
Final thoughts
It is recommended to verify the refresh status in the Power BI service once the process is completed.
Template
A Power BI template file - 7pace Incremental Load Example.pbit - is available; it contains a simple example of incremental refresh. Just input the initial start and end dates, change the URL in the data source, and enter your credentials.
Incremental refresh in Excel
Currently, Excel does not support incremental refresh.
It is recommended to set up a service in Power BI (see above) and then use the result as a data source in Excel to save API calls.
Incremental refresh in other tools
Incremental refresh can be achieved in other tools as well, but you need to keep two separate reports: one with stable data and one with fresher 'delta' data.
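The two-report approach boils down to a merge in which delta rows override stable rows that share the same worklog id. A hedged sketch (field names are hypothetical, the merge rule is the point):

```python
def merge_reports(stable, delta):
    """Combine a stable historical report with a fresh 'delta' report:
    any id present in delta replaces the stable version of that row,
    and ids that only exist in delta are appended."""
    merged = {row["id"]: row for row in stable}
    merged.update({row["id"]: row for row in delta})
    return list(merged.values())

stable = [{"id": 1, "hours": 2}, {"id": 2, "hours": 1}]
delta = [{"id": 1, "hours": 3}, {"id": 3, "hours": 4}]
merged = merge_reports(stable, delta)
```

Whatever tool you use, the stable report is fetched rarely (or never re-fetched), while only the small delta report hits the API on each refresh.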
If you are used to loading 7pace Timetracker data into your own database and evaluating the changes there, you can miss deleted records (because they are currently deleted for good, without being indicated by a delete flag in the API responses). In this case, do add and/or upvote our feature request for webhooks.