UrlDiff - Simple visual regression testing

A few days ago, a small CSS change in the Smartphone Section caused the 3D-Printers Section to go haywire in Chrome. I did not notice before pushing it to the production server, partly because I use Firefox while coding, and partly because I was focused on the Smartphone Section.

To prevent this in the future, I decided it's time for automated visual regression testing!

I took a look at diff.io, but that would cost $200/month and would still be somewhat limited at 258 page comparisons per day. If you have 50 pages in your test suite and have already run 5 tests today - then what? I also took a look at Ghost Inspector. While it has a lot of nice functionality, for some reason it failed on the Product Chart pages.

Existing self-hosted solutions like this one based on wraith come with a complex set of dependencies, and they run in two passes: first they render two sets of screenshots and write them to disk, then they compare the sets and report the number of differences.

Thinking about it, I decided that my favorite solution would be a simple shell script that visually compares every page of my development server with the corresponding page on the production server. One by one, without hitting the disk at all, and halting as soon as a difference is detected.

Not long ago, I read about cutycapt, a command line tool that renders websites via WebKit. Could it be used to compare two versions without much overhead?

Installation is easy:

$ apt-get install cutycapt
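
To check that the renderer works, you can capture a single page first. Both flags are standard CutyCapt options; the URL is just a placeholder, and on a headless server you may need to wrap the call in xvfb-run:

$ cutycapt --url=http://www.example.com --out=example.png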

Now let's pass the output of two cutycapt calls to cmp:

$ cmp <( cutycapt --out-format=bmp --out=/dev/stdout --url=google.com ) \
      <( cutycapt --out-format=bmp --out=/dev/stdout --url=yahoo.com  )

That outputs:

/dev/fd/63 /dev/fd/62 differ: byte 3, line 1

Wow, that's nice! Visual comparison from the shell with just one dependency and no temp files.
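
cmp's -s flag suppresses that message and reports only through the exit status: 0 if the inputs are identical, 1 if they differ. That is what the script below will rely on. A quick sketch, comparing a static page against itself (example.com is again just a placeholder):

if cmp -s <( cutycapt --out-format=bmp --out=/dev/stdout --url=example.com ) \
          <( cutycapt --out-format=bmp --out=/dev/stdout --url=example.com )
then
 echo "identical"
else
 echo "different"
fi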

Time to think up a little config file for the tests. I immediately knew I wanted something simple like this:

urldiff.conf:

www.server1.com
www.server2.com
/
/about
/blog
/animals/dogs
/animals/cats
...
Turns out the script to process it needs only a few lines of bash:

urldiff.sh:

#!/bin/bash
# Render a page to BMP on stdout; extra arguments are passed to cutycapt.
c() { cutycapt --out-format=bmp --out=/dev/stdout "$@"; }

{
 # The first two lines of the config hold the two server prefixes.
 read -r prefix1; read -r prefix2

 # Every following line is a path to compare on both servers.
 while read -r url
 do
  echo "$url"
  a="$prefix1$url"
  b="$prefix2$url"
  # Re-render and compare until the two screenshots are identical.
  while ! cmp -s <( c --url="$a" ) <( c --url="$b" )
  do
   echo " different. hit return to retry."
   read -rs input </dev/tty
   echo "$url"
  done
 done
} < urldiff.conf

Just put all your URLs in urldiff.conf and then run urldiff.sh anytime to assert that no visual regression has taken place.
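
Assuming both files live in the same directory, a run might look like this (each output line comes straight from the script's echo statements):

$ bash urldiff.sh
/
/about
/blog
 different. hit return to retry.
/blog
/animals/dogs
...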

When urldiff hits a page that differs, it will kindly ask you to fix your stuff and then check the page again.

Even though it is not explicitly coded into the script, some convenient additional functionality is automatically available.

That's it. Happy urldiffing!