Automated Firefox tests with add-ons installed

Mozmill has a feature that allows the tester to install an add-on during the test run. Until recently this was only used by one of our automated testruns, which specifically tested the installed add-on rather than Firefox itself.

With the recent development on the endurance tests project, it has been necessary to take more advantage of this feature. A bug was reported where, with Adblock Plus (our most popular add-on) installed, memory usage increased rapidly when navigating a web page, and none of the memory was released. To start investigating this I created a very basic (and specific) test for the site mentioned in the bug report, and then simply hacked something together based on the existing add-ons testrun. A short time later, the need to run the endurance tests with multiple add-ons installed came up, so I hacked some more to get that in place too. Rather than keep these hacks around, it made sense to allow testers to specify add-ons to be installed during any of our testruns, so I started work on the necessary patches.

As a result, testers can now run any of our automation scripts with one or more add-ons installed simply by specifying the --addons command line parameter. To install multiple add-ons, repeat the parameter. Each argument can be either a path on your machine or a URL on a web/FTP server. In the latter case the add-on will be downloaded to a temporary location before the testrun and removed at the end. The latest version of Mozmill (1.5.2) also disables the add-on compatibility check, meaning that we can run tests with add-ons that are not marked as compatible with the version of Firefox in use.

An example of running the endurance tests with two add-ons installed:

./testrun_endurance.py --addons=https://addons.mozilla.org/firefox/downloads/latest/748/addon-748-latest.xpi --addons=/Users/dave/Downloads/noscript.xpi --delay=1000 --iterations=10 /Applications/Firefox.app/
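Under the hood, handling a local path versus a URL for each --addons argument could look something like this sketch (illustrative only; the function name and logic here are my assumptions, not the actual automation script code):

```python
import os
import shutil
import tempfile
import urllib.request

def resolve_addon(spec):
    """Return (path, is_temporary) for an add-on spec.

    A spec that already exists on disk is used as-is; anything else is
    treated as a URL and downloaded to a temporary .xpi, which the caller
    should remove once the testrun has finished.
    """
    if os.path.exists(spec):
        return spec, False  # local file: nothing to clean up later
    handle, path = tempfile.mkstemp(suffix='.xpi')
    os.close(handle)
    with urllib.request.urlopen(spec) as response, open(path, 'wb') as target:
        shutil.copyfileobj(response, target)
    return path, True  # temporary download: remove after the testrun
```

The temporary-file bookkeeping is what allows downloaded add-ons to be cleaned up at the end of the run while locally supplied .xpi files are left untouched.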

Introducing Firefox Endurance Testing

Since late last year I have been working on a prototype of an Endurance Testing project for Firefox. The idea is to use our existing Mozmill framework for automating UI testing of Firefox to write tests that stress and strain the browser over time. I’ve heard many times from people that Firefox needs to be restarted once in a while because it’s become sluggish, and indeed I’ve experienced this myself. The problem is that there are rarely clear steps to reproduce these issues, as they are normally the accumulation of many actions over an extended period of time. What actions cause a degradation in performance, and why? This is what we hope to discover with the endurance tests project.

The initial implementation of endurance tests is rather simple: create a test snippet that exercises a function of Firefox, and execute it repeatedly whilst gathering details of resources in use. Ultimately we may come up with more elaborate tests, but it’s important to get a proof of concept first.
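The core loop amounts to something like the following (a minimal sketch; the function and parameter names are my assumptions rather than the project’s actual API):

```python
import time

def run_endurance_test(snippet, iterations, delay_ms, sample_resources):
    """Execute a test snippet repeatedly, sampling resource usage per iteration.

    `snippet` exercises one function of Firefox, `sample_resources` returns a
    snapshot of resources in use (e.g. memory), and `delay_ms` is an optional
    pause between iterations.
    """
    samples = []
    for _ in range(iterations):
        snippet()
        samples.append(sample_resources())
        if delay_ms:
            time.sleep(delay_ms / 1000.0)
    return samples
```

Collecting one resource sample per iteration is what makes gradual degradation visible: a snippet that leaks will show steadily climbing samples rather than a flat line.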

There are several components to the endurance tests:

  1. Command line automation script
  2. Test snippets
  3. Resource gathering
  4. Reporting

The number of iterations for each snippet can be set on the command line, as can an optional delay between iterations. The usual command line options allow for logging and for reporting to a Mozmill Dashboard instance.

Triggering the endurance tests currently looks something like this:

./testrun_endurance.py --delay=1000 --iterations=50 --report=http://davehunt.couchone.com/mozmill /Applications/Minefield.app

This will then launch Firefox, run through all of the endurance tests (each one iterating over its test snippet 50 times), and then close Firefox. Because I’ve included a report parameter, the report will also be sent to the Mozmill Dashboard instance. These reports are currently available here.

Here’s a short screencast that demonstrates running the endurance tests:

If you’re interested in following the progress of the endurance tests project, check out the project page or the tracking bug for phase one.

I’m (sort of) going to Selenium Conference 2011!

For a long time now I’ve been chasing the idea of a conference for Selenium. It probably started when I heard about the San Francisco Selenium users meetup group, and was certainly on my mind when I started the London version of the group.

I think it was back in 2009 when Adam Goucher started some initial discussions on when/where such a conference would be held, and although it’s taken a while, now here we are just 34 (there’s an inside joke there) days from Selenium Conference 2011! Okay, so it’s not the first Selenium conference[1], but it’s still pretty awesome!

As if it wasn’t exciting enough that I’ll be attending, and finally meeting some of the people I’ve had the pleasure of working with over the last few years, I will also be giving a presentation on automating canvas using Selenium! For me this is equal parts exciting and terrifying as I’m not a confident speaker, but it does help that I’ll be co-presenting with my good friend (and old colleague), Andy Smith. It also helps that together we have created a presentation and demo that I’m really pleased with and excited to show off!

Unfortunately, it turns out that the Mozilla all-hands will clash with the conference. I have to say I’m really frustrated about this, although I do understand organising these events is not easy, and that it would have been a struggle for me to fit in a longer trip. So the plan now is for me to hire a car and drive between San Francisco and Mountain View for the various planned activities.

Today the speaker schedule was announced, and it turns out that we will be the first talk after Jason Huggins gives the opening keynote. Part of this fills me with fear, as Jason will be a tough act to follow and any technical issues might not have been ironed out by then. On the other hand, at least we will get the nervous job of presenting out of the way early and can then relax into the event.

If you’re planning on going to Selenium Conference you can get your tickets here, or if you’re interested in our presentation then we’ll be giving a preview of it at the London Selenium users meetup group on March 23rd, which you can sign up for here.

[1] The first Selenium conference was Selenium Camp, which took place last weekend. Here’s a nice writeup from David Burns who was invited along to give the opening talk.

Mozilla in London for Selenium Meetup #3

On Wednesday, the third London Selenium meetup went ahead. Attendance was good considering there was a London Underground strike in action, although it had clearly made an impact. I arrived at midday and enjoyed a chilled out afternoon at Google’s offices – much more relaxed than my previous visit, as I wasn’t actually presenting this time!

The event started with a presentation on how Mozilla uses Selenium to test their sites. Stephen Donner described the infrastructure and technologies in place, and explained the benefits of a page object approach to testing.

This was followed by a much-anticipated update from Simon Stewart on the progress of Selenium 2 (beta soon!). We were also fortunate enough to have Jason (Mr Selenium) Huggins at the event, and he even stepped up to answer some questions on the advantages of moving to Selenium 2.

David Burns then demonstrated a much-updated version of his GTAC 2009 presentation on automating the gathering of client-side performance metrics – now using Selenium 2! There was also a peek at using the bleeding-edge WebTimings API.

In my personal favourite presentation of the evening (probably because it’s my new area of work), Henrik Skupin gave a demonstration of how Mozilla are approaching crowdsourcing their test automation with the upcoming MozMill Crowd Testing add-on for Firefox. There were definitely a few ‘wow’s from the audience when Henrik ran the tests.

Something I found very encouraging was the quality of questions coming from the audience. I’d like to thank those who came; it’s great to get everyone together, and really great to recognise people from previous events!

This was also the first event where the #ldnse hashtag was actively used on Twitter, which is also encouraging. After LDNSE2 I was considering trying to find someone else to continue the events, but I’m glad I didn’t as I’m now really looking forward to organising LDNSE4! Our frequency at the moment is about one every six months, so I’ll be looking to at least keep this regular.

Thanks again to everyone for coming, to all of our presenters and contributors, and to Google for hosting again!

Update: slides/videos/blogs available below.

YouTube channel (all the videos):
http://www.youtube.com/user/londonselenium

How Mozilla Uses Selenium – Stephen Donner
Slides: http://www.slideshare.net/stephendonner/selenium-londonmeetup-5671730
Video (Part 1): http://www.youtube.com/watch?v=Kvd_TIxLziI
Video (Part 2): http://www.youtube.com/watch?v=ATtXDuUlt9Q

Update on Selenium 2 – Simon Stewart & Jason Huggins
Video (Part 1): http://www.youtube.com/watch?v=AYJMct82YXg
Video (Part 2): http://www.youtube.com/watch?v=HYSJUSI3_VU

Client-side profiling with Selenium 2 – David Burns
Slides: http://prezi.com/dgqpq7bywuin/client-side-profiling-with-selenium-2/
Video (Part 1): http://www.youtube.com/watch?v=2TSJHJfbOHE
Video (Part 2): http://www.youtube.com/watch?v=NrvN8HwmpQ4

Crowd-sourcing Automated Firefox UI Testing – Henrik Skupin
Slides: http://www.slideshare.net/hskupin/crowdsourced-automated-firefox-ui-testing
Video (Part 1): http://www.youtube.com/watch?v=O8NaG07NoLc
Video (Part 2): http://www.youtube.com/watch?v=TIfH5Bku20U
Blog: http://www.hskupin.info/2010/11/19/mozmill-crowd-talk-at-selenium-meetup-3-in-london/