Why is time waiting on customer longer than resolution time

Marion Vandeputte
I'm New Here
March 18, 2025

Hello, 

I’ve been tracking average resolution time and time spent waiting on customers for some specific issue types (using the same filter). However, I’ve noticed something odd: in February, the time spent waiting on the customer was actually longer than our resolution time. I’ve double-checked the filters and verified that the numbers are calculated for the exact same issues, yet the discrepancy remains.

Has anyone experienced something similar or can offer insights on what might be causing this?

Thanks in advance.

2 answers

Valeriia_Havrylenko_SaaSJet
Atlassian Partner
March 19, 2025

Hi @Marion Vandeputte 
Welcome to the Community!

Your observation makes sense, and I’ve encountered similar cases before. The discrepancy between Resolution Time and Time Waiting on Customer is likely due to the way these metrics are calculated and how Jira gadgets process data.

Possible reasons for the difference:

  1. Time Waiting on Customer accumulates over multiple transitions.
    • If an issue moves in and out of the "Waiting on Customer" status multiple times, the total waiting time will add up.
    • Meanwhile, Resolution Time is calculated once, from issue creation to resolution, regardless of how many times an issue transitions between statuses (the short sketch after this list illustrates the two calculations).
  2. Gadgets process data differently.
    • The Resolution Time gadget calculates time based on the difference between the created and resolution dates, ignoring intermediate status changes.
    • The Average Time in Status gadget tracks time spent in a particular status across all occurrences, meaning if an issue was in "Waiting on Customer" multiple times before resolution, the waiting time could exceed the total resolution time.
  3. Outlier issues may skew the results.
    • If one issue spent a long time in the "Waiting on Customer" status (e.g., reopened months later), it can significantly impact the average waiting time, even if most issues were resolved quickly.
    • The resolution time metric might not reflect this because it only considers the final resolution date.
  4. Filters and calculation periods might differ.
    • Even though both gadgets use the same filter, they may still interpret the data differently.
    • The Resolution Time gadget groups data by resolution date, while Time in Status might calculate all time spent in status regardless of when the issue was ultimately resolved.
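
To make the two calculations concrete, here is a minimal sketch in Python (the timestamps and status visits are invented for illustration, not pulled from Jira): the resolution figure is a single created-to-resolved delta, while the waiting figure is the sum of every interval the issue spent in "Waiting on Customer".

```python
from datetime import datetime, timedelta

# Hypothetical history for a single issue (all timestamps invented).
created  = datetime(2025, 1, 6, 9, 0)
resolved = datetime(2025, 2, 14, 17, 0)

# Each tuple is one visit to the "Waiting on Customer" status: (entered, left).
waiting_visits = [
    (datetime(2025, 1, 8, 10, 0), datetime(2025, 1, 20, 10, 0)),  # 12 days
    (datetime(2025, 1, 27, 9, 0), datetime(2025, 2, 10, 9, 0)),   # 14 days
]

# Resolution Time gadget logic: a single delta from creation to resolution date.
resolution_time = resolved - created

# Time-in-status logic: accumulate every visit to the status.
waiting_time = sum((left - entered for entered, left in waiting_visits), timedelta())

print(f"Resolution time: {resolution_time.days} days")                 # 39 days
print(f"Accumulated 'Waiting on Customer': {waiting_time.days} days")  # 26 days
```

For a single issue the accumulated waiting time normally stays within its own lifetime; the monthly averages can still end up in the order you observed because the two gadgets group and average over different data, and don't necessarily include exactly the same issues.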

How to investigate further:

  • Check the status history of a few affected issues (see the API sketch after this list). If an issue re-entered "Waiting on Customer" multiple times, the accumulated time will likely explain the discrepancy.
  • Compare the period of calculation in both gadgets. If the Resolution Time gadget is focusing on issues resolved in February, but the Time in Status gadget considers all time spent in that status (even if part of it happened in January), that could explain the longer waiting time.
  • Look for reopened issues. If some issues were reopened after initial resolution, their waiting time could accumulate across multiple periods.
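
If you'd rather not open each issue by hand, something along these lines should do that check against the Jira Cloud REST API. Treat it as a sketch only: the site URL, credentials, issue key and status name are placeholders, very long changelogs may be truncated by expand=changelog (the paginated changelog endpoint is safer there), and the field names reflect my understanding of the v3 API.

```python
import requests
from datetime import datetime, timedelta

SITE   = "https://your-site.atlassian.net"      # placeholder
AUTH   = ("you@example.com", "your-api-token")  # placeholder API token
STATUS = "Waiting on Customer"                  # adjust to your exact status name

def parse(ts):
    # Jira Cloud timestamps look like 2025-02-14T17:03:25.000+0100
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%f%z")

def waiting_vs_resolution(issue_key):
    resp = requests.get(
        f"{SITE}/rest/api/3/issue/{issue_key}",
        params={"expand": "changelog", "fields": "created,resolutiondate"},
        auth=AUTH,
    )
    resp.raise_for_status()
    issue = resp.json()

    created = parse(issue["fields"]["created"])
    resolved = issue["fields"].get("resolutiondate")
    resolution_time = parse(resolved) - created if resolved else None

    # Walk the changelog and sum every interval spent in STATUS.
    waiting, entered = timedelta(), None
    for history in sorted(issue["changelog"]["histories"], key=lambda h: h["created"]):
        for item in history["items"]:
            if item.get("field") != "status":
                continue
            when = parse(history["created"])
            if item.get("toString") == STATUS:
                entered = when
            elif item.get("fromString") == STATUS and entered is not None:
                waiting += when - entered
                entered = None
    return resolution_time, waiting

res, wait = waiting_vs_resolution("SUP-123")  # hypothetical issue key
print(f"Resolution time: {res}, accumulated time in '{STATUS}': {wait}")
```

Running this over a handful of the February-resolved issues should quickly show whether repeated visits to the status or one long-waiting outlier is driving the averages apart.
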
Walter Buggenhout
Community Champion
March 18, 2025

Hi @Marion Vandeputte and welcome to the Community!

Can you provide more details about the way you are trying to make these calculations? "Averages" are always quite tricky in calculations, and the tools you use to visualise results are not all equally 'smart' when it comes to processing the underlying data ...

Marion Vandeputte
I'm New Here
March 18, 2025

Thanks Walter! I am using the JIRA gadgets to calculate both metrics:

- the 'Resolution Time' gadget for the average resolution time.

- the 'Average Number of Times in Status' gadget with the 'Waiting on Customer' status for the time waiting on customer.

Walter Buggenhout
Community Champion
March 18, 2025

I am not sure if you have dug into the documentation of how these gadgets work. Assuming that you are referring to the Average Time in Status gadget as the second one, this KB article may be useful. Though it refers to the server version of Jira, I suspect that the main logic behind it is still the same on Cloud.

See this extract for a possible explanation of what might be happening:

This chart shows the average time spent in a status for all resolved issues over the past 30 days. It only shows the average time spent on the selected statuses for issues that are resolved with a resolution set on the same day. For example, if there are 10 issues resolved on March 16, the total time spent on each selected status will be divided by 10 issues, despite when these issues are created. The average hours will then be plotted on the graph for the date March 16.

The issues that get resolved on the same day may have a very different history for the Waiting on Customer status. Suppose you have one issue that was resolved in February but had been waiting for customer feedback for 3 months; that alone can have a major impact on the average time in that status overall.
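
To put rough numbers on that extract (all invented), here is the same averaging in a few lines of Python: ten issues resolved on the same day, one of which sat in the status for about three months, and that single issue dominates the value plotted for that day.

```python
from statistics import median

# Hypothetical hours spent in "Waiting on Customer" by 10 issues
# that all happen to be resolved on the same day (numbers invented).
waiting_hours = [4, 6, 2, 8, 4, 5, 7, 4, 6, 90 * 24]  # one issue waited ~3 months

print(f"Average plotted for that day: {sum(waiting_hours) / len(waiting_hours):.0f} h")  # 221 h
print(f"Median for comparison: {median(waiting_hours)} h")                               # 5.5 h
```

Nine issues that were handled within a working day are still plotted as roughly nine days of average waiting for that date.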

The resolution time gadget works from the following principles:

  • The report is based on your choice of project or issue filter, and your chosen units of time (i.e. hours, days, weeks, months, quarters or years).
  • The 'Resolution Time' is the difference between an issue's Resolution Date and Created date.
  • If a Resolution Date is not set, the issue won't be counted in this gadget.
  • The Resolution Date is the last date that the system Resolution field was set to any non-empty value.

I have seen confusion arise many times because different gadgets - due to their different underpinning logic - include different issues, map data to different time dimensions/dimension members, or run fundamentally different calculations. This may be because they serve different goals and may not have been designed to be used together. Hopefully the background on how they work helps you pinpoint the cause of the results you're seeing. Most likely, one of the gadgets is taking into account an outlier issue that the other one is not ...
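
One quick way to act on that last point is to pull the exact issue set your filter returns for February and compute the created-to-resolution delta yourself, so outliers stand out immediately. The sketch below is only an illustration: the site URL, credentials and filter name are placeholders, and the search endpoint and field names are how I understand the Jira Cloud REST API, so adjust them to your instance.

```python
import requests
from datetime import datetime

SITE = "https://your-site.atlassian.net"      # placeholder
AUTH = ("you@example.com", "your-api-token")  # placeholder API token
JQL  = ('filter = "My support filter" AND '   # hypothetical saved filter name
        'resolutiondate >= "2025-02-01" AND resolutiondate < "2025-03-01"')

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%f%z")

resp = requests.get(
    f"{SITE}/rest/api/3/search",
    params={"jql": JQL, "fields": "created,resolutiondate", "maxResults": 100},
    auth=AUTH,
)
resp.raise_for_status()

# Issues without a resolution date never match this JQL, which mirrors how the
# Resolution Time gadget simply skips unresolved issues.
issues = resp.json()["issues"]
issues.sort(key=lambda i: parse(i["fields"]["resolutiondate"]) - parse(i["fields"]["created"]),
            reverse=True)
for issue in issues:
    f = issue["fields"]
    delta = parse(f["resolutiondate"]) - parse(f["created"])
    print(f'{issue["key"]}: {delta.days} days '
          f'(created {f["created"][:10]}, resolved {f["resolutiondate"][:10]})')
```

Sorting by that delta makes it obvious whether one or two outliers are dragging either average, which is the most likely culprit here.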

DEPLOYMENT TYPE: CLOUD
PRODUCT PLAN: STANDARD
PERMISSIONS LEVEL: Product Admin