Testing Rails apps in Docker containers
An important part of a good development experience is a fast feedback loop from the tests. I like to be able to run the tests in the currently open file with just a keyboard shortcut. If I can also run the single test under the cursor, I feel at home.
In the previous post I described how to set up a Ruby on Rails development environment with Docker on OSX. But I didn't yet have a working solution for testing.
I have been a terminal guy for years and I feel comfortable with Vim when working on Rails projects. The vim-rspec plugin lets me quickly run the tests whenever I need to.
Recently I've been working more and more on JavaScript/TypeScript projects and I fell in love with Visual Studio Code. I'd like to see what it's like to work on a Rails project with it.
In this post I'll describe how to extend our Docker dev environment with a complete testing environment.
Separate testing environment
The app I'm currently working on depends heavily on environment variables for third-party service configuration. It uses a different set of variables for each environment: development, test, staging and production.
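For illustration, a trimmed-down .env.test could look something like this (the variable names below are made up, use whatever your app actually reads):
# ./.env.test - hypothetical example, not my real file
DATABASE_URL=postgres://postgres@postgres:5432/myapp_test
PAYMENT_GATEWAY_API_KEY=test-key
PAYMENT_GATEWAY_URL=https://sandbox.example.com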
Containers are way cheaper than VMs, so let's reuse our development image to create a container with a testing environment.
Here is our new ./docker-compose.yml:
# ./docker-compose.yml
version: "3"
services:
  app:
    build: .
    env_file:
      - .env
    ports:
      - 3000:3000 # app
      - 35729:35729 # live reload
    links:
      - postgres
    volumes:
      - app-sync:/app:nocopy
  test:
    build: .
    env_file:
      - .env.test
    ports:
      - 3005:3005 # Capybara listens here
    links:
      - postgres
      - selenium
    volumes:
      - app-sync:/app:nocopy
    command: bin/spring server
    environment:
      TEST_APP_HOST: test
      TEST_APP_PORT: 3005
      SELENIUM_HOST: selenium
      RAILS_ENV: test
    networks:
      default:
        aliases:
          - test
  postgres:
    image: postgres:9.4
    ports:
      - 5432
    volumes:
      - './postgres-data:/var/lib/postgresql/data'
  selenium:
    image: selenium/standalone-chrome-debug
    ports:
      - 5900:5900
volumes:
  app-sync:
    external: true
The new test container will reuse the same app-sync volume that we use for the app container, so all the changes that we make locally will be automatically synchronized with both of them.
Notice that we pass a different set of environment variables via the env_file option.
The test container doesn't have to run the Rails server, so we replace the default command with bin/spring server. That runs the Spring server in the foreground, keeping the container up and waiting for our testing instructions.
Setting up Selenium
To be able to run the "feature" specs inside the test container we need to configure Capybara to use an external Selenium server. You can use any of the Selenium Docker images. I chose selenium/standalone-chrome-debug as it allows me to connect to the container via VNC (Screen Sharing on Mac) on port 5900 and inspect the automated browser.
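On a Mac you can jump into that VNC session straight from the terminal; the -debug Selenium images run a VNC server and, if I remember correctly, the default password is secret:
$ open vnc://localhost:5900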
But we have a chicken-and-egg problem here. Capybara has to know about the Selenium server, and the Selenium server has to be able to connect to our app served by Capybara. Usually we could just use links to connect the services with docker-compose, but links don't allow a circular dependency. To solve this we use the networks.default.aliases option to expose the test container as test on the default network, which is shared between all the containers. With that the Selenium container is able to access the Capybara server at http://test:3005.
Capybara configuration
The next step is to configure Capybara to use the Selenium server. Here is the relevant fragment of my ./spec/features/feature_helper.rb:
# ./spec/features/feature_helper.rb
# ...
SELENIUM_HOST = ENV['SELENIUM_HOST']
TEST_APP_HOST = ENV['TEST_APP_HOST']
TEST_APP_PORT = ENV['TEST_APP_PORT']

Capybara.register_driver :selenium_remote do |app|
  Capybara::Selenium::Driver.new(
    app,
    browser: :remote,
    url: "http://#{SELENIUM_HOST}:4444/wd/hub",
    desired_capabilities: Selenium::WebDriver::Remote::Capabilities.chrome
  )
end

Capybara.javascript_driver = :selenium_remote
Capybara.server_port = TEST_APP_PORT
Capybara.server_host = '0.0.0.0'
Capybara.app_host = "http://#{TEST_APP_HOST}:#{TEST_APP_PORT}"
# ...
See how the variables exposed in ./docker-compose.yml are used: Capybara binds its test server to all interfaces on the port we picked, points app_host at the test alias, and talks to Selenium through its WebDriver endpoint on port 4444.
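With this in place, any spec tagged with js: true runs in the remote Chrome. Here is a minimal, made-up example just to show the driver in action (assuming feature_helper.rb pulls in rails_helper and capybara/rspec):
# ./spec/features/home_page_spec.rb - hypothetical example
require_relative 'feature_helper'

RSpec.feature 'Home page', js: true do
  scenario 'renders the landing page in the remote browser' do
    visit '/'

    expect(page).to have_css('body')
  end
end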
Running the tests
Spin up the stack with docker-sync:
$ docker-sync-stack start
If everything goes fine you should be able to run the tests with docker-compose:
$ docker-compose exec test bin/rspec spec
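If the run complains about a missing test database, you'll probably have to create and migrate it inside the test container first, something along these lines (adjust to your Rails version):
$ docker-compose exec test bin/rails db:create db:migrate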
Visual Studio Code integration
Now let's configure Visual Studio Code to run our tests on demand. All we need is the fantastic Rails Run Specs plugin by Peter Negrei. We only have to change its configuration to use our own spec command. In VS Code press ⌘ + , to open the settings editor and add this line there:
"ruby.specCommand": "docker-compose exec test bin/rspec"
Now you can open any of your specs and press:
- ⌘ + shift + t to run all the tests in the file
- ⌘ + l to run the test under the cursor (that's a lowercase L)
- ⌘ + y to run all the tests in the file
So cool!
It really looks like a solid setup to me. I'm gonna give it a try and do my future work with it. My good old MacBook Air seems to be happy about it too; even running multiple containers at the same time doesn't seem like a big deal to it. I especially like the fact that I can have a totally separate CI environment running next to my development one. No more surprises with leaked environment variables. And I can code in whatever editor I want.
The only annoyance is that Docker for Mac requires a lot of disk space, so I need to use an external HDD for it. But the smart guys are actively working on it and I'm convinced that soon this will not be a problem any more.
Back to work.