Ruby on Rails development environment on OSX with Docker
So far I have been happy using Vagrant to set up development environments for different clients' projects. In the era of containers I was curious how a Docker-based development environment would perform.
Putting a Rails app into a container
There are a lot of articles about how to put a new Rails app into a container for development purposes. I started with this one.
I wanted to test the idea on one of the projects I currently work on. It's a Rails 4 app. The objective was to have a fully working development environment with
guard-livereload refreshing my browser whenever I make changes to the files.
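For reference, a `Guardfile` along these lines drives guard-livereload (this is an illustrative sketch, not my exact project config; adjust the watch patterns to your app):

```ruby
# ./Guardfile -- illustrative sketch for guard-livereload
guard "livereload" do
  watch(%r{app/views/.+\.(erb|haml|slim)$})
  watch(%r{app/helpers/.+\.rb})
  watch(%r{public/.+\.(css|js|html)$})
  watch(%r{config/locales/.+\.yml})
end
```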
My current development machine is a MacBook Air with macOS Sierra and I already have Docker for Mac installed:
```shell
$ docker --version
Docker version 17.03.1-ce, build c6d412e
$ docker-compose --version
docker-compose version 1.11.2, build dfed245
```
Creating the image
We need to create a Docker image with our application inside.
In the root directory of the Rails project create a `Dockerfile`:
```dockerfile
# ./Dockerfile
FROM ruby:2.4

RUN apt-get update && apt-get install -y build-essential nodejs

RUN mkdir /app
WORKDIR /app

COPY Gemfile /app/Gemfile
COPY Gemfile.lock /app/Gemfile.lock
RUN gem install bundler && bundle install --jobs 20 --retry 5

COPY . ./

EXPOSE 3000
CMD ["sh", "docker-cmd.sh"]
```
Here is our `docker-cmd.sh`:
```shell
#!/bin/sh
set -ex

# run guard with livereload in background
bin/bundle exec guard -i &

# start the server
bin/rails server -b 0.0.0.0
```
Now we can create the Docker image:
```shell
$ docker build . -t my-app
```
You can test the image immediately:
```shell
$ docker run --rm -p 3000:3000 -t my-app
```
My app requires a Postgres database, so I see errors in the terminal when I access http://localhost:3000, but it is running! Now we need to connect it to a database.
You can stop the running app with `Ctrl+C`. (Check the output of `docker ps`, and if the container is still there, stop it with `docker stop <CONTAINER ID>`.)
Connecting the database
We will use another Docker image for our database and connect it to our app image with Docker Compose.
We need to create another file in the root of our project (`docker-compose.yml`):
```yaml
# ./docker-compose.yml
app:
  build: .
  ports:
    - 3000:3000
    - 35729:35729 # livereload port
  links:
    - postgres
  volumes:
    - './:/app'

postgres:
  image: postgres:9.4
  ports:
    - 5432
  volumes:
    - './postgres-data:/var/lib/postgresql/data'
```
This file defines a simple stack composed of two components: our app and a Postgres database.
We use Docker volumes to keep the database data on the host machine (in `postgres-data`). (Don't forget to add this folder to your `.gitignore`.) We also use volumes to mount our host's project root folder as the `/app` folder inside the app container, so that any change made locally is immediately visible inside the container.
Next we need to configure our app to talk to the database.
Open `./config/database.yml` and update the `host` and `username` settings:
```yaml
# ./config/database.yml
default: &default
  adapter: postgresql
  encoding: unicode
  pool: 5

development:
  <<: *default
  database: my_app_dev
  host: postgres     # <-----
  username: postgres # <-----

test:
  <<: *default
  database: my_app_test
  host: postgres     # <-----
  username: postgres # <-----

# ...
```
Now let's build our stack:
```shell
$ docker-compose build
```
And let's start it:
```shell
$ docker-compose up
```
Before we access the app, we should create the database(s). In another terminal window run:
```shell
$ docker-compose exec app bin/rake db:create
$ docker-compose exec app bin/rake db:migrate
$ docker-compose exec app bin/rake db:test:prepare
```
Now we should be able to see our app working at http://localhost:3000.
It is so sloooow!
The application works (LiveReload too), but it is just not usable. It is extremely slow: it takes about 20 seconds to load the front page! It looks like this problem affects only OSX. There is an issue #77 addressing it.
A quick Google search pointed me to docker-sync.
Making it fast
docker-sync promises to:
> Run your application at full speed while syncing your code for development, finally empowering you to utilize docker for development under OSX/Windows/Linux
Install it with:
```shell
$ gem install docker-sync
$ docker-sync-stack --version
0.4.6
```
The documentation is not great and it took me a while to set it up, but eventually I made it work using the `rsync` sync strategy.
Making LiveReload work was a bit tricky. The changes to the local files were properly synchronized with the container, but for some reason LiveReload wasn't detecting them. It turned out that `rsync` by default does not overwrite files directly. Instead it creates a new hidden file, removes the old copy, and only then renames the hidden file into place (or something like that). LiveReload doesn't pick up new files; it only reacts to modifications of existing files.
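You can see why this confuses a file watcher with plain Ruby (a standalone sketch, independent of rsync): an in-place write keeps the file's inode, while a write-then-rename update, like rsync's default, leaves the path pointing at a brand-new file:

```ruby
require "tmpdir"

Dir.mktmpdir do |dir|
  path = File.join(dir, "app.rb")
  File.write(path, "v1")
  inode = File.stat(path).ino

  # In-place edit: same file, same inode -- a watcher sees a modification.
  File.write(path, "v2")
  puts "in-place keeps inode: #{File.stat(path).ino == inode}"

  # Rename-style update (rsync's default): write a hidden temp file,
  # then rename it over the original -- the path now has a new inode.
  tmp = File.join(dir, ".app.rb.tmp")
  File.write(tmp, "v3")
  File.rename(tmp, path)
  puts "rename changes inode: #{File.stat(path).ino != inode}"
end
# prints:
#   in-place keeps inode: true
#   rename changes inode: true
```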
Fortunately we can tell `rsync` to change files in place with the `--inplace` option.
We have to create another YAML file, this one for docker-sync:
```yaml
# ./docker-sync.yml
version: "2"
syncs:
  app-sync:
    sync_strategy: 'rsync'
    src: './'
    sync_host_port: 10872
    sync_excludes: ['.gitignore', '.git/', 'tmp', 'log', 'README.md', 'postgres-data/', '.docker*']
    sync_args: '-v --inplace'
    notify_terminal: false
    watch_excludes: ['.*/.git', '.gitignore', 'docker-*.yml', 'Dockerfile', 'postgres-data', '.docker*']
    watch_args: '-v'
```
We also have to update our `docker-compose.yml`:
```yaml
# ./docker-compose.yml
app:
  build: .
  ports:
    - 3000:3000
    - 35729:35729
  links:
    - postgres
  volumes:
    - app-sync:/app:nocopy # <-- the only change

postgres:
  image: postgres:9.4
  ports:
    - 5432
  volumes:
    - './postgres-data:/var/lib/postgresql/data'
```
And here is how we run it:
```shell
$ docker-sync-stack start
```
The app loads now in a few seconds. Nice.
Here is how you can run the tests in the container:
```shell
$ docker-compose exec app bin/rspec spec/controllers
```
I don't know yet how to set up the integration testing. I have a lot of "feature" tests using Selenium and I'd like to run them inside the container. I will probably use another container to host a Selenium server and run the tests against that. Maybe I'll write about it once I figure it out.
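If I get there, the compose file would probably grow a service along these lines (an untested sketch; the service name and image are assumptions, and the test suite would still need to be pointed at the remote WebDriver endpoint):

```yaml
# hypothetical addition to ./docker-compose.yml (untested sketch)
selenium:
  image: selenium/standalone-firefox
  ports:
    - 4444 # WebDriver endpoint
```

The `app` service would also need `selenium` added to its `links:` so the tests can reach it by hostname.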
Overall the experience is quite positive. I can quickly spin up the development environment and start coding. The memory and CPU utilization look much better than with Vagrant. I'm gonna do some work in this setup to see how it feels in the longer run.
There are a few issues though:
- I had to configure Docker to keep its data on an external HDD (it eats too much disk space; see #371)
- `docker-sync` is a hack. I hope one day Docker will perform much better on mounted volumes.