Automated Testing of APIs

Behat is a great testing tool, and its capabilities keep expanding thanks to its extensible platform. I’ve been automating tests for a while now, including tests for APIs, and I thought I’d share that knowledge with you.

Test automation for APIs can be complex depending on the approach taken. This tutorial makes the following assumptions about your knowledge:

  • You have an understanding of PHP. You will need PHP 7.2 for this tutorial.
  • You know what Behat is.
  • You know what composer is and how to run it.

Great stuff. Now then, we’re looking at the following capabilities in the solution we’re about to put in place:

  • It’s fast
  • It’s easy to maintain
  • It’s simple to implement

Before we move on to the testing, we need a test subject. Here is a minimal index.php:

<?php

$data = ['success' => true, 'data' => ['message' => 'Post updated successfully.']];
die(json_encode($data));

Copy and paste the above into an index.php file, then serve it with php -S localhost:8000. If you visit http://localhost:8000 in the browser, you should see {"success":true,"data":{"message":"Post updated successfully."}}.

Now, it’s always a good idea to keep your test pack alongside your code so the two stay in sync as you move around your commit history (using a version control system). Create a composer.json file:

{
    "require-dev": {
       "behat/behat": "~3.0",
       "genesis/behat-api-spec": "~0.1"
    }
}

Then run composer install to install Behat and the related dependencies.

Create a behat.yml file and add the following:

default:
  suites:
    default:
      paths:
        - features
      contexts:
        - Genesis\BehatApiSpec\Context\ApiSpecContext 
  extensions:
    Genesis\BehatApiSpec\Extension:
      baseUrl: http://localhost:8000
      specMappings:
        endpoint: \Endpoint\
        path: ./features/bootstrap/Endpoint/

Create test files:

./vendor/bin/behat --init

This will create a features folder containing a FeatureContext class. Running ./vendor/bin/behat at this stage should execute cleanly, reporting that there are no scenarios yet.

If it does not, check what you’ve missed above.

We’re now in a position to test our API. Create a new feature file – features/api.feature – with the following content:

Feature:
  In order to know that the API works and returns expected responses
  As a developer
  I want to test its responses

  Scenario: Returns default response on GET request
    Given I make a GET request to the "Index" endpoint
    Then I expect a 200 "Index" response

At this point, running ./vendor/bin/behat will fail, because the "Index" endpoint class it references does not exist yet.

Good stuff. By default, Behat’s autoloading (PSR-0) mechanism looks for files to load in the features/bootstrap folder. In our behat.yml configuration we said that endpoint files are to be found in the Endpoint folder inside features/bootstrap, so let’s create a file there – features/bootstrap/Endpoint/Index.php:

<?php

namespace Endpoint;

use Genesis\BehatApiSpec\Contracts\Endpoint;

class Index implements Endpoint
{
    public static function getEndpoint()
    {
        return '/';
    }

    public static function getRequestHeaders(): array
    {
        return [];
    }
}
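Further endpoints follow the same pattern – one class per endpoint, referenced by name in your scenarios. Here is a purely hypothetical sketch: the Posts class, the /posts path and the header are illustrative, not part of this tutorial’s API, and a stub interface stands in for the vendor contract so the snippet is self-contained.

```php
<?php

namespace Endpoint;

// Stand-in for the real Genesis\BehatApiSpec\Contracts\Endpoint contract,
// included only so this sketch runs on its own; in the project you would
// `use` the vendor interface, as in Index.php above.
interface EndpointContract
{
    public static function getRequestHeaders(): array;
}

// Hypothetical second endpoint for illustration only.
class Posts implements EndpointContract
{
    public static function getEndpoint()
    {
        return '/posts';
    }

    public static function getRequestHeaders(): array
    {
        return ['Content-Type' => 'application/json'];
    }
}
```

A scenario would then refer to it the same way as before, e.g. Given I make a GET request to the "Posts" endpoint.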

Now run ./vendor/bin/behat and see the schema magically scaffold.

This is due to the “I expect …” step definition. The schema is generated in the features/bootstrap/Endpoint/Index.php file, and you can adjust it with various validation rules for flexible testing. Open the file to inspect what was generated. Great stuff!

Now running ./vendor/bin/behat will pass with flying colours if nothing changes.

At this stage your test only validates the data types; you can add specific value validation or regular-expression validation if necessary. To get a full list of the validation step definitions available out of the box, run:

./vendor/bin/behat -dl

Snapshot testing

A powerful tool in this extension’s arsenal is snapshot testing. It’s a self-maintaining form of testing and incredibly useful for detecting slight changes in API responses. The idea: the first time you run a test with a snapshot assertion, it stores the response; subsequent runs compare against it. If anything changes, the test fails and flags the difference. You can then pass a flag to auto-update the stored responses if you’re happy with the changes, or adjust the code – simple! Pull/merge requests can get convoluted when it comes to validating JSON, but here it stays fairly simple since the snapshots are isolated.

Feature:
  In order to know that the API works and returns expected responses
  As a developer
  I want to test its responses

  Scenario: Returns default response on GET request
    Given I make a GET request to the "Index" endpoint
    Then the response should match the snapshot

The Then the response should match the snapshot step above will produce a snapshot in the same folder, which behat-api-spec maintains, mostly through the --update-snapshots flag.

A new file will be created in a __snapshots__ folder, which should be committed to the repository. Rerunning the test will pass unless something changes in the API. This is a much stricter way of testing the API, but the cost is very low thanks to the auto-update flag. Say we change the success message in the actual API to “Thanks the post has been updated.” – rerunning the suite will then fail, showing the difference against the stored snapshot.
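For reference, the changed index.php for that example would simply be:

```php
<?php

// index.php with the altered success message; the stored snapshot no
// longer matches this response, so the snapshot test now fails.
$data = ['success' => true, 'data' => ['message' => 'Thanks the post has been updated.']];
echo json_encode($data); // equivalent to die(json_encode($data)) in the original
```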

Run ./vendor/bin/behat --update-snapshots to update the snapshot and pass the test.

Bonus

When you get failures, it is very common while drilling into the issue to want to exercise the actual implementation directly – you can use the --sample-request=curl flag to generate sample requests, pre-configured with data, that you can copy and paste to hit the implementation yourself.
