The Request Stats Report shows the Key Performance Indicators (KPIs), or metrics, for each element of a test script, allowing you to drill down and review the statistics (stats) for how each element of your test performed.
- View Request Stats Report
- KPIs per Element
- Aggregated Labels
- Download Report as CSV
- Download Report as CSV via the API
View Request Stats Report
Follow these steps:
- In the Performance tab, select Reports. The most recent reports are shown on top.
- Click Show all reports and select a report to view its details.
- Click the Request Stats tab.
- Move the timeline sliders to narrow down the time range. The results for your selected time period are shown in the results table. To reset your selection, click Reset Timeline.
- (Optional) Toggle the Show Baseline Comparison button. For more information, see Baseline Comparison.
- To Filter By Label, expand the drop-down list and check the boxes for the labels that you want to display.
Note: If you select ALL, values for all requests made during the test are displayed. If you used your own JMeter script, this table displays the labels you used in your script.
- Click Apply.
- To Select Table KPIs, click Edit Columns.
- Review what KPIs are available to display in the table.
Note: By default, times are shown in milliseconds. To show them in seconds instead, toggle the Show results in button.
The available KPIs include:
- Element Label
The name of the HTTP Request from the JMeter script.
- # Samples
The total number of samples executed.
- Avg. Response Time (ms)
The average response time for the request(s) executed. While the test is running, this shows the average of the requests executed so far; after test execution finishes, it shows the final value.
- Avg. Hits/s
The number of requests made per second. When the throughput is saved to a CSV file, it is expressed in requests per second; for example, 30.0 requests/minute is saved as 0.5. Once the test is done, the throughput shown is the actual throughput for the entire test duration.
- 90% line (ms)
90th Percentile. 90% of the samples were less than or equal to this time.
- 95% line (ms)
95th Percentile. 95% of the samples were less than or equal to this time.
- 99% line (ms)
99th Percentile. 99% of the samples were less than or equal to this time.
- Median Response Time (ms)
50th Percentile. 50% or half of the samples are smaller than the median, and half are larger.
- Min Response Time (ms)
The shortest time for the samples with the same label.
- Max Response Time (ms)
The longest time for the samples with the same label.
- Avg. Bandwidth (Kbytes/s)
The amount of traffic transferred per second, in kilobytes.
- Error Percentage
The error rate per label. While the test is running, it displays a value based on the samples already completed, and the final value after test execution finishes.
- Avg. Latency
The average Latency for the request(s) executed.
- StDev (ms)
The standard deviation (a measure of variation) of the sampled elapsed time.
- Error Count
The number of errors, including response codes and JMeter assertions.
- Duration (hh:mm:ss)
The sum of the duration for all the samples in that label.
Tip: You can find the count of requests whose response times, latency, or any other KPI are above or within certain limits by setting a baseline and comparing against it. To get an exact count, download the report as CSV.
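The tip above can be sketched in a few lines of scripting: once the CSV is downloaded, the per-label KPI columns can be compared against a limit to count (or list) the elements that exceed it. This is a minimal illustration; the inline sample data and the exact column names ("Element Label", "90% line (ms)") are assumptions standing in for a real downloaded report.csv.

```python
import csv
import io

# Tiny stand-in for a downloaded report.csv (assumed header names).
csv_text = """Element Label,# Samples,Avg. Response Time (ms),90% line (ms)
Home Page,1000,120.5,210
Login,500,340.2,610
Checkout,200,980.7,1500
"""

THRESHOLD_MS = 500  # hypothetical SLA limit

reader = csv.DictReader(io.StringIO(csv_text))
# Collect the labels whose 90th-percentile response time exceeds the limit.
over_limit = [row["Element Label"]
              for row in reader
              if float(row["90% line (ms)"]) > THRESHOLD_MS]
print(over_limit)  # ['Login', 'Checkout']
```

The same pattern works for any numeric KPI column: swap the column name and threshold, or use `len(over_limit)` to get an exact count.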
You may see a row named AGGREGATED LABELS. The Request Stats Report displays the first 300 element labels executed by your test (due to the limit of 300 labels per engine). If your test script has more than 300 labels, the AGGREGATED LABELS row appears and reports only the sample count for the labels beyond the first 300 displayed.
If your test script has more than 300 labels, the following will apply:
- The first 300 labels executed will be displayed normally.
- The ALL label will still track all 300+ labels and account for them in its calculations.
- Though the Request Stats Report can be filtered by scenario and location, it still will not display more than 300 labels.
- A multi-test can contain up to 900 labels before aggregating them. The underlying scenarios that make up the multi-test are still limited to 300 unique labels each.
Example: In a multi-test with two tests where one test generates the labels 1-300 and the second test generates the labels 301-600, you will see all the 600 labels in the report.
- Multi-tests with over 900 labels do not show an AGGREGATED LABELS row.
- A report displays at most 900 labels in total.
Download Report as CSV
To download the aggregated report data for the Request Stats Report in CSV format, click Download CSV.
Download Report as CSV via the API
Report data in CSV format can also be downloaded via BlazeMeter's API.
Once a test is executed, you can run the following API call to download the file:
curl -o report.csv -X GET https://a.blazemeter.com/api/v4/masters/<master_id>/reports/aggregatereport/data.csv --user '<id>:<secret>'
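For scripted workflows, the same call can be made from Python's standard library instead of curl. This is a sketch using the same placeholders as the curl example above (`<master_id>` is the master ID, and `<id>:<secret>` is your API key pair sent as HTTP Basic auth); the helper names are illustrative, not part of any BlazeMeter SDK.

```python
import base64
import urllib.request

def report_csv_url(master_id: str) -> str:
    # Same endpoint as the curl example above.
    return ("https://a.blazemeter.com/api/v4/masters/"
            f"{master_id}/reports/aggregatereport/data.csv")

def download_report_csv(master_id: str, key_id: str, key_secret: str,
                        out_path: str = "report.csv") -> None:
    # HTTP Basic auth with the API key id/secret, equivalent to curl's --user.
    token = base64.b64encode(f"{key_id}:{key_secret}".encode()).decode()
    req = urllib.request.Request(report_csv_url(master_id),
                                 headers={"Authorization": f"Basic {token}"})
    with urllib.request.urlopen(req) as resp, open(out_path, "wb") as out:
        out.write(resp.read())

# Example call (fill in real values):
# download_report_csv("<master_id>", "<id>", "<secret>")
```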