Creating New Functional API Tests

API Test Maker lets you create tests for each operation within your API model. Once you create these tests, you can view and execute them from the Tests tab. You can also view these tests like any other functional test in BlazeMeter.

You can create the following types of tests:

  • Quick tests
  • Auto generated tests (positive, negative, and edge test cases)
  • Manual tests
  • SQL injection tests

A few things to know as you create your model (and before you create tests):

Most of the parameters and assertions that define your API tests are created at the operation level. But the following elements are defined at a model level, and are then applied to every test that you generate. You can define these elements from the fields at the top of the Edit API Tests page.

  • Test Execution Location
    You can use the default location for executing your tests, or you can select a specific execution location. This includes predefined geographic locations as well as any Private Locations that you defined in BlazeMeter to test applications behind a firewall. If you don't select a location, BlazeMeter automatically selects a default location for you at the time of execution. For more information about setting up on-premise testing for a private location, see Private Locations and Install a New On-Premise Agent.
  • API Key
    If you are testing an API that requires an API key, you can define your API key and then apply it to all of the tests that are generated for a specific model. For more information, see Managing Authentication in API Test Maker.
  • Authentication
    If you are testing an API that requires Basic Auth or OAuth credentials, you can define those credentials and then apply them to all of the tests that are generated for a specific model. For more information, see Managing Authentication in API Test Maker.
  • SQL Injection
    SQL injection tests check your API for vulnerability to SQL injection attacks, where a partial or complete SQL query is injected along with an API request. For more information about creating this type of test, see Creating SQL Injection Tests.

Note: Changes to Basic Auth, OAuth, and API Keys that are located in the request header are applied to your tests at the time of execution. If you make changes to any of these elements in your model, those changes are automatically applied to your tests the next time that you execute them. Changes to Test Execution Location, SQL Injection, or API Keys that are located in the query are applied to your tests at the time of test generation. If you make changes to these values, you must regenerate your tests to have these changes applied.
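
If it helps to visualize the difference the note describes, here is a minimal sketch in Python (using the third-party requests library) of how credentials are typically attached to an API request. The endpoint, header name, query parameter name, and credential values are hypothetical examples, not anything defined by API Test Maker.

```python
import requests

# Hypothetical endpoint and credential values, for illustration only.
BASE_URL = "https://api.example.com/v1/orders"
API_KEY = "my-api-key-value"

# API key carried in a request header (hypothetical header name).
header_key_response = requests.get(BASE_URL, headers={"X-API-Key": API_KEY}, timeout=30)

# API key carried in the query string (hypothetical parameter name).
query_key_response = requests.get(BASE_URL, params={"api_key": API_KEY}, timeout=30)

# Basic Auth credentials sent with the request.
basic_auth_response = requests.get(BASE_URL, auth=("user", "password"), timeout=30)

print(header_key_response.status_code,
      query_key_response.status_code,
      basic_auth_response.status_code)
```

This mirrors the note above: header-based credentials (the first and third requests) are applied at execution time, while a key placed in the query string becomes part of the generated request and requires regenerating your tests.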

Creating an API Quick Test

The Quick Test feature creates a quick reference test that verifies you can pass a request to your API and receive a valid response. Use a quick test to confirm that an operation returns the expected response, or to validate a change that you just made to the model.
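
Conceptually, a quick test is a single request sent with your model's reference values, followed by a check of the status code and the response time. The following minimal sketch in Python (using the third-party requests library) shows the same idea outside the UI; the endpoint and header are hypothetical examples.

```python
import requests

# Hypothetical operation from an API model: GET /v1/orders/{orderId}
url = "https://api.example.com/v1/orders/12345"

response = requests.get(
    url,
    headers={"Accept": "application/json"},  # optional additional header content
    timeout=30,
)

# A quick test boils down to two results: the status of the response and
# the execution time, reported in milliseconds.
print("Status:", response.status_code)
print("Time (ms):", response.elapsed.total_seconds() * 1000)
```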


Follow these steps:

  1. Create an API Model or edit an existing one to access the Edit API Tests page. 
  2. Click a resource in the left pane to view the operations under that resource.
  3. Click Quick Test next to the operation that you want to test.
    The Quick Test window opens. The parameters for the selected operation display in the Parameters tab.
  4. Enter the values that you want to use for your quick reference test.
    Parameter values are populated from the reference values defined in your model. If you change the values in the Quick Test window, those changes are not saved in your model. To change a value for all future tests, change the reference value in the Parameters tab of your API model.
    As you type values, the field is highlighted in green if the value is valid for the parameter. Invalid values are highlighted in yellow, but the quick test does not prevent you from using an invalid value in your test.
  5. If you want to include specific header content in your request, click the Additional Headers tab. Enter the header content in the text field.
  6. If you want to include specific body content in your request, click the Body tab. Enter the body content in the Request Body field.
  7. Click Send to execute the test and send the request.
    The response from your test displays in the Response tab. The status of the test and the execution time, in milliseconds, also display in this tab.
  8. Click Close to return to the Edit API Tests page.

Creating Auto Generated Tests

The Generate Tests function lets you automatically create the following types of test cases with a single click:

  • Positive Test Case
    A test case that includes parameter values that comply with all defined parameter constraints and returns an expected successful response code.
  • Negative Test Case
    A test case that includes parameter values that violate one or more defined parameter constraints and returns an expected failed response code.
  • Edge Test Case
    A test case that includes parameter values at the edges of defined parameter constraints and returns an expected successful or failed response code. For example, for parameters with minimum and maximum values, the test case includes values at the boundaries of the defined range to ensure that inclusive and exclusive values return the expected responses.
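
For example, for a hypothetical integer parameter constrained to values between 1 and 100, the three test-case types would exercise values like those in the following sketch. The parameter name and expected status codes are illustrative assumptions, not values produced by API Test Maker.

```python
# Hypothetical constraint from an API model: an integer "quantity"
# that must be between 1 and 100 (inclusive).
MIN_QUANTITY, MAX_QUANTITY = 1, 100

test_cases = [
    # Positive: the value satisfies the constraint; a successful response is expected.
    {"type": "positive", "quantity": 50,               "expected_status": 200},
    # Negative: the value violates the constraint; a failed response is expected.
    {"type": "negative", "quantity": -5,               "expected_status": 400},
    # Edge: values at the boundaries of the range, on both sides, to confirm
    # that inclusive and exclusive values return the expected responses.
    {"type": "edge",     "quantity": MIN_QUANTITY,     "expected_status": 200},
    {"type": "edge",     "quantity": MIN_QUANTITY - 1, "expected_status": 400},
    {"type": "edge",     "quantity": MAX_QUANTITY,     "expected_status": 200},
    {"type": "edge",     "quantity": MAX_QUANTITY + 1, "expected_status": 400},
]

for case in test_cases:
    print(case["type"], case["quantity"], "->", case["expected_status"])
```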

You can generate tests for a single operation or for all operations within the API model.

Note: When you create auto generated tests, either at the operation or the model level, existing auto generated tests are deleted before generating a new set of tests.

Follow these steps to generate tests for an operation:

  1. Create an API Model or edit an existing one to access the Edit API Tests page. 
  2. Click a resource in the left pane to view the operations under that resource.
  3. Click an operation name to view the details for that operation on the right side of the page.
  4. Click the Tests tab.
    The Tests tab opens. If you previously generated test cases for this operation, you can review the existing tests.
    Note: To regenerate a new set of test cases, you must first remove all existing generated tests. 
  5. Click Generate Tests at the top of the tab.
    A list of the automatically generated tests for the selected operation displays. The following values display for each test:
    • ID
      A system-generated ID for the test, created by API Test Maker.
    • Details
      Lists the defined parameters and values that are used in the test of the operation.
    • Type
      Indicates whether the test is defined as a positive, negative, or edge case test.
    • Status Code
      Defines the expected response code that determines whether the test passes or fails.
  6. To execute your tests, click Execute Tests.
    Note: Clicking Execute Tests executes all of the tests that are defined for this operation, regardless of whether their check boxes are selected. The check boxes are only used for removing tests.
    For more information about executing the tests, see Executing Tests in API Test Maker.
  7. To remove one or more tests:
    1. Select the check box for each test that you want to remove and click Remove Tests.
    2. Click OK in the confirmation window. 

Follow these steps to generate tests for all operations:

  1. Click Generate All Tests at the top of the left panel.
    A status bar at the top of the page shows the progress of the test generation process.
  2. Click an individual operation to view or execute the generated tests for a specific operation.
    For more information about executing the tests, see Executing Tests in API Test Maker.

Creating Manual Tests

The Add Manual Test function lets you create test cases for a specific operation with parameter values that you specify.

Follow these steps:

  1. Create an API Model or edit an existing one to access the Edit API Tests page. 
  2. Click a resource in the left pane to view the operations under that resource.
  3. Click an operation name to view the details for that operation on the right side of the page.
  4. Click the Tests tab.
    The Tests tab opens. If you previously created manual test cases for this operation, you can review the existing tests.
  5. Click Add Manual Test in the Manual section of the tab.
    The Add Test Case window opens. This window displays all of the parameters that are defined for this operation with their current values and constraints.
  6. To change one or more of the parameter values, type the desired value in the Value field.

    Parameter values are populated from the reference values defined in your model. If you change the values in the Add Test Case window, those changes are not saved in your model. To change a value for all future tests, change the reference value in the Parameters tab of your API model.

    As you type values, the field is highlighted in green if the value is valid for the parameter. Invalid values are highlighted in yellow, but you are not prevented from using an invalid value in your test.

  7. Select one of the following types for your test:
    • Positive Test Case
      A test case that includes parameter values that comply with all defined parameter constraints and returns an expected successful response code.
    • Negative Test Case
      A test case that includes parameter values that violate one or more defined parameter constraints and returns an expected failed response code.
  8. Click Save.
    Your test displays in the list of manually generated tests in the Manual section of the tab. The following values display for each test:
    • ID
      A system-generated ID for the test, created by API Test Maker.
    • Details
      Lists the defined parameters and values that are used in the test of the operation.
    • Type
      Indicates whether the test is defined as a positive or negative test.
    • Status Code
      Defines the expected response code that determines whether the test passes or fails.
  9. To execute your tests, click Execute Tests.
    Note: Clicking Execute Tests executes all of the tests that are defined for this operation, regardless of whether their check boxes are selected. The check boxes are only used for removing tests.
    For more information about executing the tests, see Executing Tests in API Test Maker.
  10. To edit an existing manual test:
    1. Click Edit Test Case for the test that you want to edit.
      The Edit Test Case window opens.
    2. Make the desired changes to the test case and click Save.

Creating SQL Injection Tests

SQL injection tests check your API for vulnerability to SQL injection attacks, where a partial or complete SQL query is injected along with an API request.

To create SQL injection tests, you define the full or partial query that you want to inject and the response code that you expect to be returned. Define the details for your SQL injection tests before you create auto generated tests. If you define these details after creating your tests, you must regenerate the tests so that the SQL injection test is included in the set.
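
As a rough illustration, a SQL injection test amounts to sending a request in which one parameter value carries a SQL fragment and then checking that the API returns the response code you defined. The sketch below uses Python and the third-party requests library; the endpoint, parameter name, payload, and expected response code are all hypothetical.

```python
import requests

# Hypothetical operation and parameter; the payload is a classic partial
# SQL query injected along with an otherwise normal API request.
url = "https://api.example.com/v1/orders"
injected_value = "1 OR 1=1; --"

response = requests.get(url, params={"orderId": injected_value}, timeout=30)

# The test passes if the API returns the response code defined for the
# SQL injection test (assumed here to be 400), meaning the injected
# input was rejected rather than passed through to the database.
EXPECTED_STATUS = 400
result = "pass" if response.status_code == EXPECTED_STATUS else "fail"
print(result, response.status_code)
```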

Follow these steps:

  1. Click Manage API Authentication in the menu bar at the top of the page.
    The Manage API Test page opens.
  2. Click the Name of the API model that you want to update.
    The Edit API Tests page opens.
  3. Click SQL Injection at the top of the page.
    The SQL Injection window opens.
  4. Enter the full or partial query that you want to inject in the Query field.
  5. Enter the expected response code in the Response Code field.
  6. Select or clear the Use SQL injection in all future tests check box.
    • Selecting this check box adds a SQL injection test to every set of auto generated tests that you create for a given operation.
    • Clearing this check box prevents a SQL injection test from being added to any test sets.
  7. Click Save.
  8. To remove an existing SQL injection query, click Delete next to the query that you want to remove.