Docker everything!

8 minutes · Updated on 14/6/2022


Use Docker and Docker Compose even in your development environment to improve developer experience and streamline your development workflow

#DevOps · #Developer experience · #Docker
Sulidi Maimaitiming, Tech Lead

With Docker and Docker Compose, you can get rid of the “Requirements” section of your software project, and I think that’s beautiful! What follows is a fictitious fable, though inspired by true events that I have experienced myself or observed up close, about a team failing in unexpected ways.

"Back when cars were a novelty, the instructions for starting a Model-T Ford were more than two pages long. With modern cars, you just push a button - the starting procedure is automatic and foolproof. A person following a list of instructions might flood the engine, but the automatic starter won't."

Andrew Hunt, Dave Thomas - The Pragmatic Programmer

Picture yourself as the software architect of a new project in a popular JavaScript framework, let’s say React. You have just spent a couple of days bootstrapping the codebase from scratch, inaugurating the CI/CD pipeline, and setting up a few pages that you consider perfect references for what you want the code to look like, so that your team can draw inspiration from them and have a quick reference once they start developing. Little did you know how much struggle you were about to go through before getting the team to deliver at full speed.

On the first day you meet your fellow team members, you spend a few hours getting to know each other. Of course you brought some croissants and chocolatines from the bakery on your way to the office, and you grab a nice cup of coffee fresh from the French press. James has been working as a developer the longest, and before you realize it, you are debating the pros and cons of Server Side Rendering.

What could possibly go wrong?

First, James comes up to you with a cryptic error message on his screen. He is not sure what to do, but he is sure that you are not going to be able to fix it.

You google the error message, read through a couple of posts on Stack Overflow, and come up with a solution quite quickly: James has the wrong version of Node.js installed. It never occurred to you that anyone could still be using version 12, but it just so happens that James had been maintaining a pretty old codebase for the last few weeks.

As a quick solution, you add Node.js 14 to the list of requirements in your README file and write a short paragraph on how to install the project locally.

Before long, you are once again solicited for a similar error message, this time by Sophia. You are able to spot the source of the issue pretty fast: Sophia is using Node 17, which is not compatible with the project either.

Since this is the second time you have issues with the version of Node.js, you decide to add nvm to the list of requirements, along with a .nvmrc file that specifies the required Node.js version. You install the project using nvm yourself and are surprised that it fails despite the correct version being specified in the .nvmrc file. A little investigation leads you to discover that you need to run nvm use every time you open a new terminal. A little more investigation leads you to the README of the nvm GitHub repository, which references a short script to add to your shell’s rc file (.bashrc, .zshrc, ...) to automatically run nvm use for you.
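For reference, the .nvmrc file is nothing more than the required version number on a single line (here, 14). As for the automatic switching, the snippet below is a hypothetical, simplified take on the kind of script the nvm README provides; the function and variable names are mine:

```sh
# Hypothetical addition to ~/.bashrc: run `nvm use` whenever the current
# directory contains a .nvmrc file and we haven't already switched here.
_auto_nvm_use() {
  if [ -f .nvmrc ] && [ "$PWD" != "$_AUTO_NVM_LAST_DIR" ]; then
    _AUTO_NVM_LAST_DIR="$PWD"
    nvm use
  fi
}
PROMPT_COMMAND="_auto_nvm_use;${PROMPT_COMMAND}"
```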

You add the link to the shell script to your Requirements section as well, a little surprised that this bit of knowledge is not shared across your company, and make a note to write a short tutorial for your colleagues.

A couple of days later, you welcome Jaden, the new trainee, to your team. You show him around the office and run him through the codebase to help him get started. Shortly after lunch, he comes back to you with a question about his setup. You are happy to see that the instructions you left have successfully led Jaden to the right version of Node.js. You are also slightly amused to see a distant yet familiar error: the yarn install command miserably fails with no such file or directory: 'install'. You encountered precisely the same error before, at the beginning of your career as a developer, and just to be sure you ask Jaden how he installed yarn. As you expected, he simply ran apt install yarn on his classic Ubuntu setup and ended up installing the cmdtest package instead. To prevent this from happening again, you make a mental note to add more detailed instructions to the Requirements section.

Jaden installs yarn the recommended way and hits yarn install again. The process starts running, only to fail in the middle with an error saying that one of your dependencies "does not yet support your current environment: OS X Unsupported architecture (arm64)"...

This list could go on; the point is that every workstation is a little different, and it shows in the development workflow. Environments and build processes that are not fully automated are allowed to drift, generating unexpected failures, extra work, and mental overhead for collaboration, because we have to keep checking that each environment has been set up correctly.

Is there a better way?

In my opinion, yes, and as you may expect, it is Docker and Docker Compose.

There are two major upsides to containerizing your application in the development environment.

First, it allows you to declaratively and predictably manage the environment in which your application runs. It is a nice and easy way to manage your application's infrastructure, dependencies, and configuration. When you tweak the configuration of a local machine, it takes an extra step, both cognitive and physical, for that change to be reflected on another machine: you need to update some form of documentation and explain the change to the rest of the team, and they in turn have to apply it. With a shared Docker Compose configuration, on the contrary, the change you want to make is expressed as code, and if it works for you, it simply works for the whole team.
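As a hypothetical illustration, upgrading the team's Node.js version becomes a one-line change to the shared development Dockerfile, picked up by everyone on their next build:

```dockerfile
# Hypothetical dev image: bumping node:14 to node:16 here propagates to the
# whole team the next time they rebuild, with no README update required.
FROM node:16
```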

Second, and more importantly, Dockerizing your development environment removes the cognitive load of maintaining the configuration of your local machine. One may argue that some of this configuration management can be scripted, but that raises the concern of running a script on someone else’s machine, which sounds very scary to me, especially if the members of the team don't all share the same basic computer setup. Running your application as a containerized process means that you don't have to worry about breaking someone else's computer, and you have the time and attention span to worry about more important things.

This is especially true if you are working in a multi-language environment, where you have to switch frequently between different projects, repos, and services. They can all share the same "Requirements" section, namely Docker and Docker Compose, and you can focus on more important things than environment configuration.

How do you know it’s right?

It’s ridiculously simple! Remove the tools your project relies on from your local machine and see if it still runs!

Sample project setup in Go

To get an application up and running, you don't really need much setup. In your directory of choice, create the two files sketched below:

  • a Dockerfile: dev.Dockerfile
  • a Compose file: compose.yml
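Here is a minimal sketch of what these two files could look like; the golang:1.18 base image and the /app working directory are assumptions of this sketch:

```dockerfile
# dev.Dockerfile: a minimal development image (the Go version is an assumption)
FROM golang:1.18

WORKDIR /app

# Run the server straight from source; restarting the container picks up changes
CMD ["go", "run", "server.go"]
```

And the Compose file, defining the app service used in the commands below and publishing Gin's default port:

```yaml
# compose.yml: a minimal sketch of the development setup
services:
  app:
    build:
      context: .
      dockerfile: dev.Dockerfile
    ports:
      - "8080:8080"
    volumes:
      # Bind-mount the source tree so edits on the host are visible in the container
      - .:/app
```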

In our case, with a minimal setup in Go and using the Gin framework, we will also need the two files sketched after this list:

  • a Go module file: go.mod
  • a source file for our server: server.go
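The module path in go.mod below is a placeholder; the server itself is the classic /ping example from the Gin README:

```go
// go.mod: the module path is a placeholder
module example.com/docker-everything

go 1.18
```

```go
// server.go: a minimal Gin server answering GET /ping with {"message":"pong"}
package main

import "github.com/gin-gonic/gin"

func main() {
	r := gin.Default()
	r.GET("/ping", func(c *gin.Context) {
		c.JSON(200, gin.H{"message": "pong"})
	})
	// Listens on 0.0.0.0:8080 by default, the port published in compose.yml
	r.Run()
}
```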

Next, let's install Gin by running

docker-compose run app bash -c "go get github.com/gin-gonic/gin"

This should create a go.sum file in your working directory after installing the dependencies; the content of that file doesn't really matter in our case.
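It should also have added a require directive for Gin to your go.mod, along these lines (the exact version depends on when you run the command):

```go
require github.com/gin-gonic/gin v1.8.1
```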

Tada, your development environment is ready! Try it for yourself: start the app, either in a dedicated terminal or in detached mode

docker-compose up --detach app

And then

curl localhost:8080/ping

You should receive the following response

{"message":"pong"}

See? You never even had to install Go or Gin! And neither will your colleagues; your environments will always be kept in sync because the build of your application is perfectly reproducible!

Caveats
Performance issues on macOS

If you are using macOS, you may experience performance issues when running Docker with Docker for Mac (see this article from the Docker blog on how they recently improved performance on macOS). I am by no means an expert in the specificities of Docker on macOS, but you may want to try Docker-Sync for better performance, or experiment with different volume mount options (see this Stack Overflow thread).
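One such option, as a hypothetical tweak, is relaxing the consistency of the bind mount in the Compose file; whether it helps depends on your Docker Desktop version and your workload:

```yaml
services:
  app:
    volumes:
      # "delegated" lets the container's view of the mount lag slightly behind
      # the host, which can speed up I/O-heavy workloads on macOS
      - .:/app:delegated
```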

Git hooks

If your project relies on Git hooks, especially pre-commit hooks to run a linter or minimal tests, my recommendation is to design your Makefile to run only from inside the container, and your Git hooks to call into it from the outside, with something resembling:

docker-compose exec my-app bash -c "make lint"
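Wrapped in a hook, this could look like the hypothetical pre-commit script below; the my-app service name and the lint target are assumptions, and the -T flag disables pseudo-TTY allocation, since hooks don't run in a terminal:

```sh
#!/bin/sh
# .git/hooks/pre-commit: run the linter inside the already-running container
docker-compose exec -T my-app bash -c "make lint"
```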

IDE integration

When your application runs inside a container instead of directly on your local machine, it may complicate IDE integration. I have not found a satisfying solution to this issue yet, but it is a sacrifice I am willing to make. Your ideas to improve this part of the experience are more than welcome!

Thanks to BaptisteG and OthmaneEB for their guidance on this article.