On Using Jasmine in xpcshell Tests

Tomas Brambora

Jasmine is a JavaScript framework for supporting behaviour-driven development in your projects.

Like most JS frameworks, when used for client-side development, Jasmine expects an environment that meets certain conditions (e.g. it assumes the existence of a global window object).

At Salsita, however, we're in the business of creating browser add-ons, which happens to mean that, most of the time, some of those conditions are not met.

In the world of XUL-based Firefox add-ons, code is (or should be, anyway) usually structured into code modules, which import each other and are themselves imported from the XUL window, much like scripts included from an HTML page. This provides a nice way to encapsulate the logic into individual files. Furthermore, Firefox add-ons can use a special console application, xpcshell, that allows for relatively convenient (and, more importantly, fast) automated testing of these modules. When it comes to BDD (and Jasmine in particular), however, there are some flies to be found in our agile testing ointment.
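
For readers unfamiliar with code modules, here is a minimal sketch of what one looks like and how it gets imported; the module name and resource:// URL are placeholders for illustration, not part of any real project:

// greeter.jsm - a minimal JavaScript code module.
// Only the names listed in EXPORTED_SYMBOLS are visible to importers.
var EXPORTED_SYMBOLS = ["greet"];

function greet(name) {
    return "Hello, " + name + "!";
}

// In another module (or in a XUL window script):
//   Components.utils.import("resource://myAddon/modules/greeter.jsm");
//   greet("world");   // "Hello, world!"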

First, as has already been said, there is no global window object. That should come as no surprise, since, well, xpcshell is a console application. Luckily enough, when we go through Jasmine's code, it is clear that we need just a handful of methods from the window object - namely those related to timers: setTimeout, setInterval and the respective clear* counterparts. Hence, the solution is easy - we make a fake window object and fob it off to Jasmine.


/**
 * Fake global window object and some global functions (used to allow us to
 * import scripts that require those).
 */
var EXPORTED_SYMBOLS = ["window", "setTimeout", "clearTimeout", "setInterval", "clearInterval"];

if (!Cc) var Cc = Components.classes;
if (!Cu) var Cu = Components.utils;
if (!Ci) var Ci = Components.interfaces;

var window = {
    document: {},
    location: {},
    setTimeout: setTimeout,
    setInterval: setInterval,
    clearTimeout: clearTimeout,
    clearInterval: clearInterval
};
var _timers = [];

function setTimer(fun, timeout, type) {
    var timer = Cc["@mozilla.org/timer;1"].createInstance(Ci.nsITimer);
    _timers.push(timer);
    var event = {
        notify: function (timer) {
            fun();
        }
    };
    timer.initWithCallback(event, timeout, type);
    return timer;
}
function setTimeout(fun, timeout) {
    return setTimer(fun, timeout, Ci.nsITimer.TYPE_ONE_SHOT);
}

function setInterval(fun, timeout) {
    return setTimer(fun, timeout, Ci.nsITimer.TYPE_REPEATING_SLACK);
}

function clearTimeout(timer) {
    if (!timer) {
        return;
    }
    timer.cancel();
    var i = _timers.indexOf(timer);
    if (i >= 0) {
        _timers.splice(i, 1);
    }
}

var clearInterval = clearTimeout;

We're going to import this fake window object in the head_init.js fixture file, which is run before the tests in our test directory (in newer Firefox versions, you have to specify the fixtures in the xpcshell.ini file).
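
As a rough sketch (the module path and file names are assumptions, not taken from a real project), head_init.js might simply pull in the fake window module, with xpcshell.ini pointing at the fixture:

// head_init.js - loaded by xpcshell before each test file in this directory.
// The module path is an assumption about where the fake window module lives;
// the resource://myAppId alias (set up in a snippet further below) must be
// registered before this URL resolves.
Components.utils.import("resource://myAppId/modules/fakeWindow.jsm");

// In newer Firefox versions, the fixture has to be declared in xpcshell.ini,
// roughly like this:
//
//   [DEFAULT]
//   head = head_init.js
//   tail =
//
//   [test_mySpecs.js]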

An important point to note is that we do not want to use the standard Components.utils.import call to import Jasmine within our tests. Doing that would require changes to the library code, because JS code modules expect exported symbols to be specified explicitly using EXPORTED_SYMBOLS. Instead, we use mozIJSSubScriptLoader and load the library into the test scope (after importing the fake window object).

var loader = Cc["@mozilla.org/moz/jssubscript-loader;1"].getService(Ci.mozIJSSubScriptLoader);
loader.loadSubScript("resource://myAppId/frameworks/jasmine.js");
loader.loadSubScript("resource://myAppId/frameworks/jasmineReporter.js");

Next, we create an alias so that resource://<addonid>/ URLs map to the correct path (and the Cu.import calls in our modules work when the code is run in xpcshell). The following snippet is taken from a recipe on MDC.

var file = do_get_file(".", false);
var ioService = Cc["@mozilla.org/network/io-service;1"].getService(Ci.nsIIOService);
var resProt = ioService.getProtocolHandler("resource").QueryInterface(Ci.nsIResProtocolHandler);
var aliasURI = ioService.newFileURI(file);
resProt.setSubstitution("myAppId", aliasURI);

And we're almost there! The last thing is to make sure the BDD specs are actually run. Jasmine runs all the specs asynchronously, whereas xpcshell expects synchronous tests. That means if we don't tell xpcshell to wait until the specs have finished, none of the specs will actually be run. Therefore, we use a little trick here.

// Initialize Jasmine BDD framework.
var jasmineEnv = jasmine.getEnv();
var reporter = new jasmine.ConsoleReporter(dump, function(runner) {
  var results = runner.results();
  if (results.failedCount > 0) {
    // throw using xpcshell do_throw to report an error (and make the test fail).
    do_throw("Test failed");
  }
  // Inform xpcshell that we're done.
  do_test_finished();
}, false);
jasmineEnv.addReporter(reporter);
function runSpecs(specFun) {
  // Tell xpcshell that we're doing asynchronous stuff.
  do_test_pending();
  // Load the test suite.
  specFun();
  // Run the test specs.
  jasmineEnv.execute();
} 

The runSpecs function is to be called from the test_* files that contain the actual test code (the parameter is a wrapper function for the Jasmine test suite). It sets a "pending" flag that tells xpcshell we're running something asynchronous here and it should not quit right away but rather wait for us to signal that we are done.
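
For illustration, a test file calling runSpecs could look roughly like the following; the spec and the names in it are made up for the example, not taken from a real suite:

// test_fakeWindow.js - a hypothetical spec file.
// run_test() is the standard xpcshell test entry point.
function run_test() {
  runSpecs(function() {
    describe("the fake window object", function() {
      it("runs setTimeout callbacks", function() {
        var called = false;
        window.setTimeout(function() { called = true; }, 10);
        // Jasmine 1.x asynchronous helpers.
        waitsFor(function() { return called; }, "timeout callback to fire", 500);
        runs(function() {
          expect(called).toBe(true);
        });
      });
    });
  });
}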

We're using the reporter's callback function to check whether all the specs have been run. If there were any failures, we call xpcshell's do_throw to make the test fail (which quits the tests, so there's no need to unset the pending flag). Otherwise, we just unset the flag and - we're done!
