The Request Stats Report shows the Key Performance Indicators (KPIs), or metrics, for each element of a test script, allowing you to drill down and review the statistics (stats) for how every single element of your test performed.
KPIs per Element
The table in this report displays a first row labeled ALL, which shows values for all requests made during the test, followed by individual rows for each named request in your test.
Note: If you used your own JMeter script, then this table displays the labels you used in your script.
Click the gear icon to open Requests Stats Settings and review what KPIs are available to display in the table.
Note: All times are in milliseconds.
The KPIs available include:
- Element Label - The name of the HTTP Request from the JMeter script.
- #Samples - The total number of samples executed.
- Average Latency - The average latency for the request(s) executed.
- Average Response Time - The average response time for the request(s) executed. While the test is running, it will display the average of the requests already executed, and the final value once test execution is finished.
- Geo Mean RT - The geometric mean of the response times. This type of average is less sensitive to extreme values (e.g., spikes of unusually high or low values that can skew the regular arithmetic average). Calculation: the nth root of the product of the n response times.
- Standard Deviation - The standard deviation (a measure of variation) of the sampled elapsed times.
- 90% Line - 90th Percentile. 90% of the samples were smaller than or equal to this time.
- 95% Line - 95th Percentile. 95% of the samples were smaller than or equal to this time.
- 99% Line - 99th Percentile. 99% of the samples were smaller than or equal to this time.
- Minimum Response Time - The shortest time for the samples with the same label.
- Maximum Response Time - The longest time for the samples with the same label.
- Median Response Time - 50th Percentile. 50% or half of the samples are smaller than the median, and half are larger.
- Average Bandwidth (bytes/s) - The amount of traffic transferred per second, in bytes.
- Hits/s - The number of requests made per second. When the throughput is saved to a CSV file, it is expressed in requests/second (e.g., 30.0 requests/minute is saved as 0.5). Once the test is done, this is the actual throughput for the duration of the entire test.
- Error % - The error rate per label. While the test is running, it displays a value based on the samples already completed, and the final value after test execution finishes.
- Duration - The sum of the duration for all the samples in that label.
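Several of these KPIs can be reproduced from raw sample times. The following is a minimal Python sketch using only the standard library; the sample values and error count are made up for illustration, and the nearest-rank percentile method is an assumption that may differ from the exact method BlazeMeter uses:

```python
import statistics

# Hypothetical response times (ms) for one label; not real report data.
samples = [120, 135, 128, 900, 110, 140, 125, 130, 118, 122]
errors = 1  # assumed number of failed samples among them

average = sum(samples) / len(samples)
geo_mean = statistics.geometric_mean(samples)  # less skewed by the 900 ms spike
std_dev = statistics.pstdev(samples)
median = statistics.median(samples)

def percentile(data, p):
    """p% of samples are <= the returned value (nearest-rank method)."""
    ordered = sorted(data)
    k = max(0, -(-p * len(ordered) // 100) - 1)  # ceil(p*n/100) - 1
    return ordered[int(k)]

p90 = percentile(samples, 90)
error_pct = 100.0 * errors / len(samples)

print(f"avg={average:.1f} geo={geo_mean:.1f} p90={p90} "
      f"median={median} err={error_pct:.0f}%")
```

Note how the single 900 ms spike pulls the arithmetic average well above the geometric mean and the median, which is exactly why Geo Mean RT is reported alongside Average Response Time.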
You may or may not see a row named AGGREGATED LABELS. The Request Stats Report only displays the first 100 element labels executed by your test. If your test script has more than 100 labels, the following will apply:
- The first 100 labels executed will be displayed normally.
- The AGGREGATED LABELS row will appear and will only report the sample count for those labels beyond the first 100 displayed.
- The ALL label will still track all 100+ labels and account for them in its calculations.
- Though the Request Stats Report can be filtered by scenario and location, it still displays no more than 100 labels.
- A multi test may contain up to 300 labels before aggregating them, but the underlying scenarios that make up the multi test are still limited to 100 unique labels each.
- Multi tests with over 300 labels do not show an AGGREGATED LABELS row.
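The counting behavior described above can be sketched as follows; this is an illustrative model with hypothetical label names, not BlazeMeter's actual implementation:

```python
from collections import OrderedDict

# Hypothetical execution stream with 120 unique labels (over the limit).
executed = [f"request-{i % 120}" for i in range(600)]

LIMIT = 100  # per-scenario label limit described above
displayed = OrderedDict()    # first 100 labels keep full stats
aggregated_samples = 0       # labels beyond 100 only contribute a count

for label in executed:
    if label in displayed:
        displayed[label] += 1
    elif len(displayed) < LIMIT:
        displayed[label] = 1
    else:
        # Reported under AGGREGATED LABELS: sample count only.
        aggregated_samples += 1

print(len(displayed), aggregated_samples)
```

The ALL row, by contrast, would be computed over the entire stream, so its totals still include the samples counted under AGGREGATED LABELS.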
Download Report as CSV
You can download the aggregate report data for this report in CSV format via the "Download CSV" button, which can be found on the right side of the screen, directly above the report table, and to the left of the settings gear icon.
Download Report as CSV via the API
Report data can also be downloaded, in CSV format, via BlazeMeter's API.
Once a test is executed, you can run the following API call to download the file:
curl -o report.csv -X GET https://a.blazemeter.com/api/v4/masters/<master_id>/reports/aggregatereport/data.csv --user '<id>:<secret>'
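The same download can also be scripted without curl. The following is a minimal Python sketch using only the standard library; the CSV column names in the example data are illustrative assumptions, not the exact headers BlazeMeter emits:

```python
import base64
import csv
import io
import urllib.request

def fetch_report_csv(master_id, key_id, key_secret):
    """Download the aggregate report CSV via the same BlazeMeter API
    endpoint as the curl call above, using HTTP basic auth."""
    url = (f"https://a.blazemeter.com/api/v4/masters/"
           f"{master_id}/reports/aggregatereport/data.csv")
    token = base64.b64encode(f"{key_id}:{key_secret}".encode()).decode()
    req = urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

def parse_report(csv_text):
    """Parse the CSV body into one dict per labeled row."""
    return list(csv.DictReader(io.StringIO(csv_text)))

# Illustrative CSV body (assumed column names) showing the parse step:
sample = "labelName,samples,avgResponseTime\nALL,200,145.2\nHome,100,120.5\n"
rows = parse_report(sample)
print(rows[0]["labelName"])
```

Parsing into dicts keyed by the header row makes it easy to filter for a specific label or to feed the stats into a dashboard of your own.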