Speeding up Behat tests for Drupal on the Travis environment

Roomify, LLC · January 27, 2016

Background

Implementing continuous integration of behavior-driven tests is a fairly heavyweight process. To run a comprehensive battery of test cases, a complete testing environment must be set up for each commit. This involves things like:

  •  downloading:
    • a browser executable
    • drush
    • Drupal core 
    • all dependent modules
    • Behat itself
    • Selenium
  • installing Drupal
  • instantiating an HTTP server
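Concretely, the steps above might be sketched in a `.travis.yml` roughly as follows. This is a simplified illustration, not our exact configuration: the package versions, Selenium URL, make file name, and database URL are all assumptions.

```yaml
language: php
php:
  - 5.6

before_script:
  # Install drush and Behat via composer (versions are illustrative)
  - composer global require drush/drush:7.*
  - composer require behat/behat behat/mink-extension
  # Download a standalone Selenium server (version/URL is an assumption)
  - wget https://selenium-release.storage.googleapis.com/2.53/selenium-server-standalone-2.53.0.jar
  # Download Drupal core and dependent modules, then install a site
  - drush make build.make drupal
  - cd drupal && drush site-install standard -y --db-url=mysql://root:@localhost/drupal
  # Serve the freshly installed site for Behat to test against
  - drush runserver 127.0.0.1:8080 &
```

Every step here hits the network or the disk, which is exactly why the tactics below pay off.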

Making this process as efficient as possible has many benefits: it preserves shared resources for public repos (or your money, for private repos!) and speeds up your entire development workflow. Below we describe some of the tactics we employ to make testing on Travis faster.

Tactics

Caching

A great deal of a build's time is spent simply downloading and installing the various dependencies. One could speed things up considerably by generating an entire build environment and committing it to the repo for Travis to use. However, in addition to being an ugly solution, that would require manual intervention any time a dependency was updated. Instead, we get a sizable boost in performance by leveraging Travis' dependency and directory caching. In our case, for the Drupal BAT module, we cache the drush cache, the composer cache, and the behat bin directories. This prevents drush and composer from re-downloading remote packages that haven't changed. We also cache our download of Selenium.
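A cache stanza for this might look like the sketch below. The directory paths are assumptions for illustration; the ones you cache must match wherever your build actually writes.

```yaml
cache:
  directories:
    - $HOME/.composer/cache   # composer's package cache
    - $HOME/.drush/cache      # drush's download cache
    - vendor/bin              # behat binaries (path is an assumption)
    - $HOME/selenium          # previously downloaded Selenium jar
```

Travis restores these directories at the start of each build and uploads any changes at the end, so only the first build after a dependency bump pays the full download cost.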

Speeding up composer

When you use composer with Travis’ default settings, you will notice a few things that are… non-optimal. The first message you will see is the following:

You are running composer with xdebug enabled. This has a major impact on runtime performance.

Travis enables xdebug by default. This is necessary for code coverage reporting, but if you are only doing behavior-driven testing, you can improve things somewhat by following the instructions at the link.
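On Travis' PHP images, removing xdebug amounts to deleting its ini file before the build runs. A minimal sketch, assuming you don't need coverage in any job:

```yaml
before_install:
  # Drop the xdebug ini so PHP (and therefore composer) runs without it.
  # phpenv config-rm is the mechanism available on Travis' PHP images.
  - phpenv config-rm xdebug.ini
```

If some jobs do need coverage, this line can be made conditional per job rather than applied globally.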

The second message you will see will be something like:

Failed to download pear/console_table from dist: Could not authenticate against github.com
Now trying to download from source

When composer cannot obtain the packaged (dist) version of a library from GitHub (due to API rate limits), it falls back to cloning the entire repository. Depending on the size of the repository and the length of its history, the download can be an order of magnitude larger. To solve this problem, one needs to provide a GitHub OAuth token for Travis to use. This blog post by Cees Jan Kiewiet provides a great tutorial for doing so.
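The essence of the setup is to hand composer a token before it resolves dependencies. A sketch, where `GITHUB_TOKEN` is an assumed name for an encrypted Travis environment variable holding a GitHub personal access token:

```yaml
before_script:
  # Authenticate composer against GitHub so it can fetch dist archives
  # instead of cloning full repositories when rate-limited.
  - composer config -g github-oauth.github.com "$GITHUB_TOKEN"
  - composer install
```

The token itself should never appear in the repo; store it encrypted (e.g. via `travis encrypt`) or in the repository settings on Travis.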

Future optimizations

At this time, the actual test run is a small enough percentage of the total build time that we’re not particularly concerned with optimizing its speed. However, as our test suite grows, we will likely explore the following possibilities:

  • Enable the PHP APC extension
  • Enable memcache
  • Enable CSS/JS aggregation in the Drupal installation
  • Consider code-level optimization of specific Behat steps
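As one example, CSS/JS aggregation could be switched on from the build script during site setup. The variable names below assume a Drupal 7 site and a working drush install:

```yaml
script:
  # Turn on CSS/JS aggregation before running Behat
  # (Drupal 7 variable names; an assumption about the target version)
  - drush vset preprocess_css 1
  - drush vset preprocess_js 1
  - behat
```

Fewer asset requests per page load should shave time off every browser-driven scenario.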

Results

We performed three builds for each of the following scenarios on our bat_drupal repository. Each build ran in PHP 5.4, 5.5, and 5.6 environments, for a total of 9 builds per scenario. The results:

Un-optimized build

#1: 15 min 7 sec
#2: 34 min 47 sec
#3: 58 min 7 sec (The PHP 5.6 build failed while attempting to download a symfony package from source)

Average: 36 min 0 sec

All optimizations, with Travis’ cache cleared before each build

#1: 12 min 47 sec
#2: 13 min 31 sec
#3: 13 min 25 sec

Average: 13 min 14 sec

All optimizations, with cache warmed

#1: 9 min 43 sec
#2: 9 min 0 sec
#3: 11 min 49 sec

Average: 10 min 10 sec

Conclusion

Some of the biggest gains actually come from setting up a GitHub OAuth token; without one, your test execution times are at the mercy of Travis' connection to the outside world. Taking the time to implement these strategies for Travis builds is well worthwhile, with significant, measurable improvements available. Doing so helps preserve an important resource for the open source community in the case of public repos, and might just save you some money in the case of private repos!
