Create a Test - Detailed Scheme

Step 1: Write Your Script

There are several ways to create your script:

  1. Use the BlazeMeter Chrome Extension to record your scenario.
  2. Use the JMeter HTTP(S) Test Script Recorder. This sets up a proxy that you can run your traffic through, recording everything as you go (see the sketch below).
  3. Build everything manually from scratch. This is more common for functional/QA tests.
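
A rough sketch of option 2, assuming the defaults of JMeter's Recording template (any free port works; adjust to your setup):

  HTTP(S) Test Script Recorder:
    Port: 8888
  Browser proxy settings:
    HTTP proxy: localhost, port 8888

Start the recorder, click through your scenario in the browser, and the recorded samplers will appear under the recorder's target controller.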

If you get your script from a recording (options 1 and 2), keep in mind that:

  1. You'll need to change certain parameters, such as the username and password, or you may want to put those values in a CSV file so that each virtual user is unique (see the first sketch after this list).
  2. You might need to extract elements such as a token string or a form-build-id using the Regular Expression Extractor, JSON Path Extractor, or XPath Extractor. This will enable you to complete requests like "Add To Cart", "Login" and more (see the extractor sketch after this list). See this article regarding these procedures.
  3. You should keep your script parameterized and use configuration elements like HTTP Request Defaults to make your life easier when switching between environments.
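
A minimal sketch of item 1, using a CSV Data Set Config - the file name, column names, and credentials below are hypothetical:

  users.csv:
    username,password
    user1@example.com,secret1
    user2@example.com,secret2

  CSV Data Set Config:
    Filename: users.csv
    Variable Names: username,password

  Login request parameters:
    username = ${username}
    password = ${password}

Each thread reads the next line of the file, so every virtual user logs in with its own credentials.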
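
And a sketch of item 2, using a Regular Expression Extractor to pull a hypothetical form_build_id value out of an earlier response:

  Regular Expression Extractor (child of the request that returns the form):
    Reference Name: form_build_id
    Regular Expression: name="form_build_id" value="(.+?)"
    Template: $1$
    Match No.: 1

  Follow-up request (e.g. "Login"):
    form_build_id = ${form_build_id}

The JSON Path and XPath extractors work the same way, with a JSON path or an XPath query in place of the regular expression.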

Step 2: Testing Locally With JMeter

Start debugging your script with one thread and one iteration, using the View Results Tree listener, the Debug Sampler, and the Dummy Sampler. Keep the Log Viewer open in case JMeter reports any errors.
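
Once the script behaves in the GUI, you can also run it locally in non-GUI mode and inspect the log file directly - the file names here are hypothetical:

  jmeter -n -t my_script.jmx -l results.jtl -j jmeter.log

Here -n runs JMeter without the GUI, -t points at the test plan, -l writes the sample results, and -j writes the JMeter log.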

Go over the true and false responses of all the scenarios to make sure the script performs as you expect.

After the script has run successfully using one thread, raise it to 10-20 threads for ten minutes and check:

  1. Are the users coming up as unique (if that was your intention)?
  2. Are you getting any errors?
  3. If you're running a registration process, take a look at your backend - are the accounts created according to your template? Are they unique?
  4. Check the statistics in the Summary Report - do they make sense (in terms of average response time, errors, hits/s)?

Once your script is ready:

  1. Clean it up by removing any Debug/Dummy Samplers and deleting your script's listeners.
  2. If you do use listeners (such as "Save Responses to a file"), make sure you don't use any local paths! For listeners and for the 'CSV Data Set Config', don't use the path you used locally - use only the filename, as if the file were in the same folder as your script (see the sketch after this list).
  3. If you're using your own proprietary JAR file(s), upload them to BlazeMeter along with the JMX.
  4. If you're using more than one Thread Group (or not the standard one), set the values before uploading to BlazeMeter, and make sure to uncheck the 'Threads' and 'Users' scrollers on BlazeMeter's test configuration page so the values are pulled from your Thread Groups.
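
A sketch of items 2 and 4 - the file name and property name are hypothetical, and the ${__P()} approach is just one common way to keep Thread Group values parameterized:

  CSV Data Set Config:
    Filename: users.csv              (not C:\tests\users.csv)

  Thread Group:
    Number of Threads: ${__P(threads,1)}

The ${__P(threads,1)} function reads the 'threads' JMeter property and falls back to 1, so the same JMX can be driven locally with -Jthreads=20 or by values set elsewhere.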

Step 3: BlazeMeter SandBox Testing

If that's your first test, take a look at this article on how to create tests in BlazeMeter.

The SandBox is simply a test that runs with up to 20 users on the console only (0 engines) for up to 20 minutes.

The SandBox configuration allows you to test your script and your backend and to verify that everything works as intended logically.

Here are some common issues you might come across:

  1. Firewall - make sure your environment is open to BlazeMeter's IP ranges (the CIDR list) and whitelist them.
  2. Make sure all of your test files (e.g. CSVs, JARs, JSON files, user.properties) are present.
  3. Make sure you didn't use any local paths.

If you're still having trouble, look at the logs for errors (you should be able to download the entire log).

A SandBox configuration can be:

  • Engines: Console only (1 console, 0 engines)
  • Threads: 1-20
  • Ramp-up: 0-1200 seconds
  • Iterations: test continues forever
  • Duration: 1-20 minutes

This will allow you to gather enough data during the ramp-up period to analyze the results and ensure the script executed as you expected.

You should also check the Monitoring Report to see how much memory and CPU were used. This will help you with step four.
