Why it helps to separate services into containers in Docker
Setting up an effective development environment using Docker requires a change of thinking when coming from regular VM tools such as Vagrant. This post discusses some of the concepts around separating services into different containers, and why you would want to do this. I'll show some actual examples using docker-compose in a future post.
When you start using Docker, it can be tempting to build the containers for your projects in the same way you would virtualise a project's machine environment in a tool such as Vagrant. That is, to create a single image/container that contains the entire set of software needed to run the project.
For instance, to run a Python website project, a developer may need a variety of different software: Python itself, Postgres, Ruby for Sass, and maybe even Node.js/Grunt for a front-end build process. The project may also rely on supporting services and libraries such as Memcached. Building a Vagrant environment for a project like this would involve installing all of these services together in one machine. Using VMs this way is sensible, as they are typically heavy and slow beasts; the CPU and RAM requirements mean that trying to run multiple VMs with separate services for your database server, web server and any other tools is often prohibitively taxing for the average desktop machine.
A Docker container can be built in the same way, but the speed and resource advantages of Docker effectively remove the limit on the number of virtualised containers you can run, allowing us to explore other, better options. In fact, the recommended Docker way to design containers is to keep services separated as much as you can. It involves changing your mindset from virtualising machines to virtualising processes.
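To make that concrete before the fuller examples in the later post, here is a rough docker-compose sketch of the idea, with the web application and the database running as separate services. The service names, images and version tags are placeholders rather than recommendations:

```yaml
# docker-compose.yml - illustrative sketch only; adjust names, images
# and ports to your own project.
version: "3"
services:
  web:
    build: .            # your application image, e.g. a Python web app
    ports:
      - "8000:8000"
    depends_on:
      - db              # the database is its own container, not baked in
  db:
    image: postgres:13  # official image pulled straight from the Docker Hub
    environment:
      POSTGRES_PASSWORD: example
```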
Separating processes/services like this has several advantages:
- Smaller container file size
- Easier to switch versions of software
- Easier to debug software conflicts
- Much easier re-use of services between projects
- Ability to use images from the Docker Hub
- Scalability
- Brings your dev environment closer to deployment
Let's look at each of these in a bit more detail:
Smaller container file size
Each container only needs to install the minimal OS and software dependencies for that service to work. This means it can often start from a much smaller base image, rather than requiring a full Ubuntu install, for instance, on the off-chance that you might want some other software later. The only disadvantage of this is that using many containers with different base images can occupy a lot of space. However, with clever container design and sharing common container ancestors, you can often keep the combined container size low.
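As a sketch, a single-service Dockerfile can start from a slim, purpose-built base rather than a full OS image. The Python tag and file layout here are just assumptions about a typical project:

```dockerfile
# Illustrative Dockerfile for one service: a slim language image,
# not a general-purpose OS install.
FROM python:3.9-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

CMD ["python", "app.py"]
```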
Easier to switch versions of software
Changing a version in a single large container would mean updating the Dockerfile and rebuilding the container, which is slow if it contains a lot of software and other services. Using separate containers for each service means rebuilding only that single service and its dependencies, which is much faster. If you are using well-tagged base images such as the official Docker Hub images, the change is as simple as bumping a version number in a config file. This is great for quickly testing dependency updates for bug fixes and the like.
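For example, with an official database image, switching versions is little more than editing a tag in your docker-compose file (the version numbers here are purely illustrative):

```yaml
# Only this service changes; the rest of the project is untouched.
db:
  image: postgres:12   # bump to postgres:13 and restart just this container
```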
Easier to debug software conflicts
Since services are separated and isolated, tracking down bugs is much faster. You don't have different pieces of software conflicting with each other and producing a hard-to-find bug. You can also more easily run services in isolation to determine where an error is being generated.
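For example, docker-compose lets you bring up and poke at a single service on its own. The service names below assume the earlier sketch, and the web image is assumed to ship a shell:

```sh
docker-compose up db            # run only the database container
docker-compose logs db          # inspect its output in isolation
docker-compose run --rm web sh  # open a shell in a throwaway web container
```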
Much easier re-use of services between projects
This is a big plus of separating services correctly. It is much easier to write a generic container for a single service, which can then be re-used on other projects. For example, if all your Python projects share a common setup, you can use a common container for them all, whilst varying other components in the project such as the database server. When you are working on many projects (e.g. in an agency environment), this can really cut your setup time for new projects, as well as reduce maintenance by sharing a common source.
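One way to do this is to publish a shared base image once and build each project on top of it. Something along these lines, where mycompany/python-base is a hypothetical image name standing in for your own shared base:

```dockerfile
# Each project only adds its own code on top of the shared base image.
FROM mycompany/python-base:1.0   # hypothetical, centrally maintained base

COPY . /app
CMD ["python", "/app/app.py"]
```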
Ability to use images from the docker hub
The Docker Hub registry provides community-driven, pre-written Dockerfiles and pre-built images for many common pieces of software. This makes the re-use of services even more useful: not only can you share common services between your own projects, you can make use of the many services already built by others.
If you had virtualised your entire development machine as a single container, it's unlikely that your exact configuration would be shared by anyone else. Even with a global database of images, there would need to be a huge number of them to account for all the differing versions of every minor piece of software involved. Encapsulating the services separately means that you can easily pick an existing runtime environment from the Docker Hub that will work exactly as you want, meaning less time writing your own configurations.
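For instance, rather than writing and maintaining your own Memcached setup, you can drop the official image straight into a project (the tag is just an example):

```yaml
# One extra service, zero configuration written by you.
cache:
  image: memcached:alpine
```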
Scalability & Bringing your dev environment closer to deployment
Chances are, your web deployment system doesn't run everything on one giant server. You've probably (sensibly) got separate database servers and web servers, and possibly other tools. We know this makes sense from an Ops perspective for many of the same reasons outlined above, but also for reliability, security and scalability. Separating services in our development environment allows us to bring it closer in line with the deployed setup, sharing the same layout and reducing the chance of errors arising from differences between development and production. Additionally, while this discussion is mostly concerned with using Docker for development environments, if you build services separately it becomes much easier to clone and scale just the database server, for example, if you ever use these containers in a deployment system.
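As a small example of where this leads, once services are split out, scaling one of them in a compose-based setup can be a one-liner (availability of the --scale flag depends on your docker-compose version):

```sh
# Run three copies of the web service while leaving the database alone.
docker-compose up -d --scale web=3
```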
These are just the benefits I've gleaned from using Docker and following the separate-services principle. I hope they help others think about how to structure their own Docker projects. In a later post I'll show some concrete examples of how I've been using docker-compose to separate services in front-end, Drupal and Python projects.