Rails 6: parallel system tests with external Selenium Docker containers

In the previous post, I described my setup for Rails development with Docker and Kubernetes, using Telepresence so that my local containers appear as if they were part of the Kubernetes cluster. In this post, I will show how I now use separate Selenium containers to run Capybara/system tests, instead of Chrome and chromedriver installed in the app’s own Docker image. Getting Chrome, chromedriver and the relevant Capybara configuration set up correctly has caused me a few problems in the past, so switching to a ready-made, external Selenium setup actually simplifies things a lot.

Besides using Selenium containers to run tests, since my last post I have also switched from Alpine to the Debian-based Ruby image for my app. I ran into problems using Alpine with Rails 6, and it was simply easier to use the Debian variant and avoid the incompatibility headaches caused by musl instead of glibc, although this means the final Docker image is of course somewhat bigger.

This is what my current Dockerfile looks like:

ARG RUBY_VERSION=2.6.4

FROM ruby:$RUBY_VERSION-slim AS development

RUN apt-get update -qq && DEBIAN_FRONTEND=noninteractive apt-get install -yq --no-install-recommends curl wget

RUN curl -sL https://deb.nodesource.com/setup_11.x | bash -

RUN curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add - \
  && echo 'deb http://dl.yarnpkg.com/debian/ stable main' > /etc/apt/sources.list.d/yarn.list

RUN apt-get update -qq \
  && DEBIAN_FRONTEND=noninteractive apt-get -yq dist-upgrade \
  && DEBIAN_FRONTEND=noninteractive apt-get remove -yq cmdtest \
  && DEBIAN_FRONTEND=noninteractive apt-get install -yq --no-install-recommends \
    sudo \
    build-essential \
    default-mysql-client default-libmysqlclient-dev \
    nodejs yarn \
    git-core \
    imagemagick \
    tzdata \
  && apt-get clean \
  && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/* \
  && truncate -s 0 /var/log/*log

ENV RAILS_ENV=development
ENV RACK_ENV=development
ENV RAILS_LOG_TO_STDOUT=true
ENV RAILS_ROOT=/home/rails/app
ENV LANG=C.UTF-8
ENV GEM_HOME=/home/rails/bundle
ENV BUNDLE_PATH=$GEM_HOME
ENV BUNDLE_APP_CONFIG=$BUNDLE_PATH
ENV BUNDLE_BIN=$BUNDLE_PATH/bin
ENV PATH=$RAILS_ROOT/bin:$BUNDLE_BIN:$PATH

RUN gem update --system

RUN useradd --create-home -s /bin/bash rails \
  && echo "rails ALL=(ALL:ALL) NOPASSWD: ALL" | tee -a /etc/sudoers

WORKDIR $RAILS_ROOT

RUN mkdir -p $RAILS_ROOT/tmp/cache \
  && mkdir -p $RAILS_ROOT/node_modules \
  && mkdir -p $RAILS_ROOT/public/packs \
  && mkdir -p $BUNDLE_PATH \
  && chown -R rails:rails /home/rails

USER rails

COPY --chown=rails:rails Gemfile Gemfile.lock ./

RUN gem install bundler \
  && bundle install -j "$(getconf _NPROCESSORS_ONLN)"  \
  && rm -rf $BUNDLE_PATH/cache/*.gem \
  && find $BUNDLE_PATH/gems/ -name "*.c" -delete \
  && find $BUNDLE_PATH/gems/ -name "*.o" -delete  

COPY --chown=rails:rails package.json yarn.lock ./

RUN yarn install

COPY --chown=rails:rails . ./

EXPOSE 3000

CMD ["bundle", "exec", "puma", "-Cconfig/puma.rb"]

# Production

FROM ruby:$RUBY_VERSION-slim AS production

RUN apt-get update -qq && DEBIAN_FRONTEND=noninteractive apt-get install -yq --no-install-recommends curl

RUN curl -sL https://deb.nodesource.com/setup_11.x | bash -

RUN curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add - \
  && echo 'deb http://dl.yarnpkg.com/debian/ stable main' > /etc/apt/sources.list.d/yarn.list

RUN apt-get update -qq \
  && DEBIAN_FRONTEND=noninteractive apt-get -yq dist-upgrade \
  && DEBIAN_FRONTEND=noninteractive apt-get install -yq --no-install-recommends \
    default-mysql-client default-libmysqlclient-dev \
    git-core \
    imagemagick \
    tzdata \
    nodejs yarn \
    sudo

WORKDIR /app

ENV RAILS_ENV=production
ENV RACK_ENV=production
ENV RAILS_LOG_TO_STDOUT=true
ENV RAILS_SERVE_STATIC_FILES=true
ENV RAILS_ROOT=/home/rails/app
ENV LANG=C.UTF-8
ENV GEM_HOME=/home/rails/bundle
ENV BUNDLE_PATH=$GEM_HOME
ENV BUNDLE_APP_CONFIG=$BUNDLE_PATH
ENV BUNDLE_BIN=$BUNDLE_PATH/bin
ENV PATH=$RAILS_ROOT/bin:$BUNDLE_BIN:$PATH
ENV SECRET_KEY_BASE=blah

RUN useradd --create-home -s /bin/bash rails \
  && echo "rails ALL=(ALL:ALL) NOPASSWD: ALL" | tee -a /etc/sudoers

WORKDIR $RAILS_ROOT

COPY --from=development --chown=rails:rails $BUNDLE_PATH $BUNDLE_PATH
COPY --from=development --chown=rails:rails $RAILS_ROOT ./

RUN RAILS_ENV=production bundle exec rake assets:precompile

RUN rm -rf node_modules tmp/* log/* app/assets vendor/assets lib/assets test \
  && yarn cache clean

RUN mkdir -p $RAILS_ROOT/tmp/cache \
  && mkdir -p $RAILS_ROOT/tmp/pids \
  && mkdir -p $RAILS_ROOT/node_modules \
  && mkdir -p $RAILS_ROOT/public/packs \
  && mkdir -p $BUNDLE_PATH \
  && chown -R rails:rails /home/rails

USER rails

EXPOSE 3000

CMD ["bundle", "exec", "puma", "-Cconfig/puma.rb"]

As you can see, it’s more or less the same configuration as the Alpine version, but done the Debian way. The biggest difference is that I am no longer installing Chrome and chromedriver, since, as mentioned, I am now using an external Selenium setup for my tests.

While the development environment is integrated with Kubernetes thanks to Telepresence, for the test environment I set everything up with local containers so that tests run faster. I use Docker Compose for this, so this is what my docker-compose.yml looks like:

version: '3.7'

networks:
  myapp:

volumes:
  myapp_mysql:
  myapp_redis:
  myapp_rails_cache:
  myapp_bundle:
  myapp_node_modules:
  myapp_packs:

services:
  selenium:
    image: selenium/hub
    container_name: myapp-selenium
    ports:
      - 4444:4444
    networks:
      - myapp
    environment:
      GRID_MAX_SESSION: 10
  chrome:
    image: selenium/node-chrome
    networks:
      - myapp
    depends_on:
      - selenium
    environment:
      HUB_HOST: myapp-selenium
      NODE_MAX_INSTANCES: 5
      NODE_MAX_SESSION: 5

  mysql:
    container_name: myapp-mysql
    image: percona
    ports:
      - 3308:3306
    env_file:
      - ~/.secrets/myapp-test.env
    volumes:
      - myapp_mysql:/var/lib/mysql
    networks:
      - myapp

  redis:
    container_name: myapp-redis
    image: redis
    volumes:
      - myapp_redis:/var/lib/redis
    networks:
      - myapp

  app:
    container_name: myapp-test
    image: username/myapp-dev
    env_file:
      - ~/.secrets/myapp-test.env
    volumes:
      - ${PWD}:/home/rails/app:cached
      - myapp_rails_cache:/home/rails/app/tmp/cache
      - myapp_bundle:/home/rails/bundle
      - myapp_node_modules:/home/rails/app/node_modules
      - myapp_packs:/home/rails/app/public/packs
    depends_on:
      - selenium
      - mysql
      - redis
    networks:
      myapp:
        aliases: 
          - myapp.com

As with the development environment, I am using volumes to cache the bundle, node_modules, Rails cache and packs between builds/runs. Note the two services selenium and chrome: the first starts a Selenium hub that nodes connect to; the hub then load-balances sessions across any number of nodes. The second is the actual node running the Chrome browser. Also note that I am specifying a domain alias for the app container; the browser in the nodes will connect to this domain to run the tests.

To set up the test environment with Docker Compose, you can run:

docker-compose up -d

This will create a single Chrome node, so if you want to use parallel tests with Rails 6 you’ll need to scale the containers. For example, my Mac has 4 cores, so I run:

docker-compose up -d --scale chrome=4

(The older docker-compose scale chrome=4 also works, but it has been deprecated in favor of up --scale.)

This will create an additional 3 nodes, and all the nodes will register with the hub. You can verify by looking at the logs of the selenium container:

docker logs -f myapp-selenium

You should see something like this:

10:08:51.964 INFO [DefaultGridRegistry.add] - Registered a node http://172.28.0.5:5555
10:08:56.565 INFO [DefaultGridRegistry.add] - Registered a node http://172.28.0.9:5555
10:08:56.720 INFO [DefaultGridRegistry.add] - Registered a node http://172.28.0.7:5555
10:08:56.766 INFO [DefaultGridRegistry.add] - Registered a node http://172.28.0.8:5555

Selenium is now ready to accept connections and distribute sessions across the available nodes. Note that with this setup you can even add remote nodes! This lets you scale your test workers across multiple machines, if you wish. The important bit is to set the HUB_HOST environment variable correctly on the remote nodes.
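For example, a remote node on another machine could be pointed at the hub with something like the following; hub.example.com is a placeholder for whatever address exposes the hub’s port 4444 (the Grid 3 node images read the hub location from the HUB_HOST and HUB_PORT variables):

```shell
# Run on the remote machine. hub.example.com stands in for whatever
# address exposes the Selenium hub's port 4444 to this host.
docker run -d \
  -e HUB_HOST=hub.example.com \
  -e HUB_PORT=4444 \
  selenium/node-chrome
```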

Next, we need to enable parallel testing in Rails 6. This is super easy and it only requires a line in test/test_helper.rb:

class ActiveSupport::TestCase
  parallelize(workers: :number_of_processors)
  ...
end

I prefer leaving the workers argument set to automatically use the available cores, but you can override it if you wish.
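Rails also lets you override the worker count at run time via the PARALLEL_WORKERS environment variable. The little sketch below just illustrates that fallback logic; worker_count is a hypothetical helper, not Rails API:

```ruby
require "etc"

# A minimal sketch of the fallback logic: use PARALLEL_WORKERS when set
# (Rails 6 honors this env var), otherwise fall back to the core count.
# worker_count is a hypothetical helper, not part of Rails itself.
def worker_count
  (ENV["PARALLEL_WORKERS"] || Etc.nprocessors).to_i
end

ENV["PARALLEL_WORKERS"] = "2"
puts worker_count  # with the override set, prints 2
```

In practice, you would simply run PARALLEL_WORKERS=2 rails test to override the default for a single run.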

Finally, open test/application_system_test_case.rb and change the driven_by line as follows:

driven_by :selenium, using: :headless_chrome, screen_size: [1400, 1400], options: { url: "http://myapp-selenium:4444/wd/hub" }

The url option is what tells Rails/Capybara that we want to use an external Selenium setup instead of a local Chrome/chromedriver setup.
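For context, here is a sketch of what the whole test/application_system_test_case.rb might look like with this setup. The Capybara.server_host and Capybara.app_host lines are assumptions on my part, not from the original configuration: the test server must bind to an interface reachable from the browser containers, and the browser must visit the app through the myapp.com compose alias rather than localhost; adjust them to your own setup.

```ruby
require "test_helper"

class ApplicationSystemTestCase < ActionDispatch::SystemTestCase
  # Point Capybara at the external Selenium hub instead of a local chromedriver.
  driven_by :selenium, using: :headless_chrome, screen_size: [1400, 1400],
            options: { url: "http://myapp-selenium:4444/wd/hub" }

  # Assumed extra configuration (not shown in the post): bind the test server
  # to all interfaces so the Selenium nodes can reach it, and have the browser
  # reach the app through the compose network alias.
  Capybara.server_host = "0.0.0.0"
  Capybara.app_host = "http://myapp.com"
end
```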

That’s it! You should now be able to open a shell in the app container:

docker-compose exec app bash

and run system tests as usual:

rails test:system

If everything is set up correctly, the tests will be run by the Selenium nodes and will take a lot less time than with a single-worker setup.

Please note that for this to work you cannot set a static port with Capybara; otherwise, the multiple worker processes would all try to bind to the same port. So instead of something like:

Capybara.server_port = "3000"

you should have something like:

Capybara.always_include_port = true

and change your config/environments/test.rb as follows:

config.action_mailer.default_url_options = { host: 'myapp.com' }

without specifying the port. myapp.com should match the network alias specified in the docker-compose.yml file.

I really like the parallel tests feature in Rails 6. Compared to other solutions I have tried in the past, it makes parallel testing a lot easier to set up, and it does make test runs a lot quicker. Give it a try, and let me know in the comments if you run into any issues with this setup.