Speeding up Web App Test Automation Using Shishito

You know the drill. A new release is ready so you get your hands on it and try to break it. If you succeed, you file bugs (no shame in that!). Later on, you verify that they have been fixed and greenlight the release. In a week or two you get a new build and it starts all over again. Such is the lot of the humble QA Engineer.

As a software consultancy, we are always working on a number of projects simultaneously. We could easily be overwhelmed by all the test requests our developers are throwing at us. Our efforts to switch to continuous deployment help to redistribute testing more evenly over a project's lifetime. Still, keeping up with the release schedule can sometimes be tough. Luckily, test automation comes to the rescue.

Finding the Right One

We have been using Selenium WebDriver for quite a while to test web app projects. However, we soon started to wonder how to standardize our test development process so that we could start working on integration tests at the very beginning of each new project while benefiting from our accumulated test automation experience. We started looking for an existing framework that would serve our needs, but failed to find one that met all our requirements.

Our goals were:

  • to define a standardized process for test automation of our web app, mobile app and browser extension projects while keeping tests as flexible as possible.
  • to use the power of existing unit testing frameworks.
  • to run tests in the cloud.
  • to generate standalone test results independent of the continuous integration server used.

We wanted to avoid frameworks that, despite being quite powerful, would force on us the use of pseudo-code or other constraints. Robot Framework and FitNesse seem very capable, yet their high level of abstraction could be a problem when automating apps with different technology baselines, as we often do. We did not want to be hacking things in or spending a lot of time developing custom libraries just so that we could use those frameworks across the range of project types we work on.

We believe test automation should be adapted to individual project needs. Some projects may indeed benefit most from standard user interface acceptance tests. Others can, however, make better use of tests that evaluate the accuracy of the content the app serves to its users. Another test strategy may focus on checking that real user data stays intact after a database migration. Tests simply must bring value to the final product, but the best way to provide that value varies from project to project.

At Salsita, testers are expected to be able to code. Knowing this, we do not feel it is necessary to shelter ourselves from the technical aspects of testing. Using one of the frameworks mentioned above would limit us in our goal of finding the best ways to improve the quality of our apps.

We are aware that others have already faced a similar situation. Projects such as the WTF Framework or Mozilla's mozwebqa inspired us in many ways, but their configuration and test execution options seemed too limited to us. We considered contributing to these projects but realized we'd have to change their architecture too much to adapt them to our needs. So in the end we wrote our own simple test automation framework.


Enter Shishito

Shishito (https://github.com/salsita/shishito) is an open-source library for web app and browser extension integration testing with Selenium WebDriver and Python. It takes the pain out of our test development process. Assigned a new project to test? Just fetch a new project skeleton and start writing your tests; Shishito takes care of the rest. It runs tests locally in various browsers or in the cloud (including mobile browsers) using the excellent BrowserStack service. It provides libraries with the most common functions your tests can use, and it automatically installs browser extensions you might need to test.

To be able to test some projects effectively, tweaks to standard test configuration (e.g. setup and teardown functions) are sometimes required. Rather than trying to figure out the ultimate universal logic applicable to every project (which is clearly impossible anyway), the idea was to preserve flexibility by using the concept of inheritance. Simply create a custom project-specific library/test-runner and extend the standard behavior however you like.
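As a sketch of that inheritance idea (the class and method names below are hypothetical, not Shishito's actual API), a project-specific test case can reuse the standard setup and teardown while adding its own steps:

```python
# Hypothetical sketch: a project-specific test case extends a standard
# base class, keeping the common behavior and adding its own setup step.
class BaseTestCase:
    def setup_method(self):
        # stand-in for launching a WebDriver instance
        self.events = ["start_browser"]

    def teardown_method(self):
        self.events.append("quit_browser")

class ProjectTestCase(BaseTestCase):
    def setup_method(self):
        super().setup_method()            # keep the standard setup...
        self.events.append("login_user")  # ...then add a project-specific step

case = ProjectTestCase()
case.setup_method()
print(case.events)  # ['start_browser', 'login_user']
```

Because the base class is never modified, the standard behavior stays shared across projects while each project overrides only what it needs.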

Test Results

A handy HTML test result report is generated after test suite execution completes.
It contains detailed information about test outcomes. If BrowserStack is used for running the tests, the report is structured by operating system and browser combinations. The execution log and screenshots are automatically captured for each failed test.

The standard xUnit report format is also generated and can be imported into Jenkins or used for further data processing.
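As a sketch of such further processing (the report content below is illustrative, not an actual Shishito report), an xUnit-style XML file can be parsed with nothing but the standard library:

```python
# Hypothetical sketch: list the failed tests in an xUnit-style report.
import xml.etree.ElementTree as ET

REPORT = """<testsuite name="smoke" tests="2" failures="1">
  <testcase classname="tests.login" name="test_valid_login"/>
  <testcase classname="tests.login" name="test_invalid_login">
    <failure message="AssertionError"/>
  </testcase>
</testsuite>"""

root = ET.fromstring(REPORT)
# a testcase failed if it contains a <failure> child element
failed = [tc.get("name") for tc in root.iter("testcase")
          if tc.find("failure") is not None]
print(failed)  # ['test_invalid_login']
```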

Configuration and Execution

Shishito comes with configuration settings that let you define your own test variables and easily switch between multiple test environments. This is especially handy when, after trying your tests out locally against mock services, you want to quickly check how they perform against a web app deployed to the customer's server, where the real production API is used.

For example, a BrowserStack environments configuration might look like this (the entries below are illustrative):

# Browserstack Environments Configuration

# Desktop Browsers
[Chrome]
browser=Chrome
browser_version=36.0
os=Windows
os_version=8.1

# Mobile Browsers
[iPad]
browserName=iPad
platform=MAC
device=iPad Air

Continuous Integration

Integrating the library with your CI server is simple, as the following example demonstrates. This is a shell script for Jenkins that retrieves the latest Shishito build, adds it to the PYTHONPATH and runs the tests using a dedicated test runner.

git clone git@github.com:salsita/shishito.git
export PYTHONPATH=${PYTHONPATH}:$(pwd)/shishito
python project_test_runner.py

Getting Started

Here's how to give Shishito a try:

  1. Clone the library repository and sample project
  2. Make sure the library prerequisites are satisfied
  3. Add Shishito sources to your PYTHONPATH (PYTHONPATH=${PYTHONPATH}:/<shishito_path>/shishito)
  4. Run sample tests by executing python <sample_project_path>/google_test_runner.py
  5. View the generated HTML results in the <sample_project_path>/results folder.

For detailed setup instructions, see the GitHub README.

That's about it. We're keeping Shishito intentionally small and versatile. Nonetheless, if you feel you're missing some feature or you have found a bug (testing the testing framework, are we?), feel free to raise an issue on GitHub or even contribute to the project by submitting a pull request. That would be much appreciated!

And now you'll have to excuse me, a new project is starting and the tests won't write themselves (yet).

Vojtěch Burian