How I use docker with drupal
Why use docker for drupal?
As you may have seen from my earlier posts, I've been making use of Docker a lot recently for neatly packaging the dev environments of projects I'm working on.
I've already expounded the advantages of docker in other posts, but I thought it might be useful to share (and document for my own benefit) how I've been running my own personal drupal projects using docker. It's also useful to document and explain the reasoning behind this particular setup, and to note the things I found that didn't work for me.
To quickly recap, I'm using docker in my own projects to:
- Switch dev machines easily. I frequently move between Mac OS, Windows and Linux between different machines at home and work, so this minimises the differences in running projects between them.
- Pin software versions easily (e.g. mysql, php, etc.)
- Set up a new machine quickly. It's surprising how often you need to either set up a new machine or re-install an old one. Getting dev projects set up on these machines is now a hassle-free process.
- More easily deploy (in future). I hope to test some container based hosting systems in the future, which would mean my dev and live environments can be identical.
Lessons learned
There are already some great pre-built drupal images out there on the docker registry, but partly because none quite fitted my workflow, and partly to better understand how it all works, I set up my own.
When you first start building docker containers, it's very tempting to build a project container in the same way as a traditional VM. My first attempt was like this, with a single Dockerfile containing an installation of php, mysql and the frontend tools I was working with. This worked, but it was slow, and it didn't use many of the advantages of separating services that docker offers. My second attempt, and the setup I am using now, uses docker-compose to define separate containers for each of the services my drupal project requires.
Splitting the services out and using docker-compose was a big improvement. It was much easier to do environment-based debugging, and to set up and maintain the different containers. It also made it super easy to update software versions independently, especially when using images from the docker hub.
I can roughly split the main components of the project into the following services:
- php: needed to run drupal
- mySQL: the database backend for drupal
- nodeJS: used for compiling my front-end assets and some build scripts
So it's clear I need at least a separate container for each of these systems. Here's how I split these up using docker-compose; we can then elaborate a little on the requirements for each of them.
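In outline, the compose file defines one service per concern. The skeleton below just shows the overall shape; the exact builds, images and settings are covered in the sections that follow.

drupal:
  build: docker/drupal/            # php + apache, plus drush
  links:
    - db
db:
  image: mysql:5.5                 # database server
mysqltools:
  image: mysql:5.5                 # mysql client tools for dumps and imports
  links:
    - db
grunt:
  build: docker/grunt              # nodeJS + grunt for frontend builds
bower:
  image: blinkmobile/bower:1.3.12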
Drupal container
- Based on the php-apache image in the docker registry. I chose this over a pre-built drupal image, as I wanted to know exactly what was being installed and how it worked.
- I needed more than the very basic php image, since drupal requires some extra php extensions such as imagemagick. These are set up in a local Dockerfile which extends the php-apache registry image.
- The default up command starts an apache server listening on port 80 inside the container, which is mapped to port 8000 on the host.
- I also want to be able to run drush commands. I considered moving drush to its own container, but it requires exactly the same environment and database setup as drupal, so it didn't make sense to make another container for it.
Example commands:
- docker-compose up drupal to start a local server
- docker-compose run drupal drush pm-update to update drupal modules via drush
docker-compose.yml:
drupal:
  build: docker/drupal/
  volumes:
    - ./project:/var/www/html
  links:
    - db
  ports:
    - "8000:80"
docker/drupal/Dockerfile:
FROM php:5.6-apache
# Install the php extensions drupal needs, plus the libraries they require
RUN apt-get update && apt-get install -y \
libfreetype6-dev \
libjpeg62-turbo-dev \
libmcrypt-dev \
libpng12-dev \
&& docker-php-ext-install iconv mcrypt pdo_mysql mbstring \
&& docker-php-ext-configure gd --with-freetype-dir=/usr/include/ --with-jpeg-dir=/usr/include/ \
&& docker-php-ext-install gd
RUN a2enmod rewrite
# To avoid weird double iconv error
RUN rm /usr/local/etc/php/conf.d/ext-iconv.ini
# Install drush
RUN mkdir -p /usr/src/drush
WORKDIR /usr/src/drush
RUN curl -OL https://github.com/drush-ops/drush/archive/6.6.0.tar.gz
RUN tar -xvf 6.6.0.tar.gz --strip-components=1
RUN rm 6.6.0.tar.gz
RUN chmod u+x ./drush
RUN ln -s /usr/src/drush/drush /usr/bin/drush
WORKDIR /var/www/html
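One thing worth noting: drush's sql-* commands (such as sql-cli and sql-dump) shell out to the mysql client binaries, which aren't included in the php base image. I mostly use the separate mysqltools container described below for that, but if you want them available inside the drupal container too, an extra line along these lines in the Dockerfile should do it:

RUN apt-get update && apt-get install -y mysql-client \
    && rm -rf /var/lib/apt/lists/*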
mySQL container
- Uses the community base mysql image with no changes.
- Environment variables set in the docker-compose config determine the database settings.
- The default up command starts the DB server, which is exposed to the host on port 3306.
- I also added a second container, based on the same image, so I can use the mysql client tools to access the DB easily on the command line. It links to the server container, and environment variables are used to supply authentication.
- I also mount my db_dumps folder as a volume into the mysql tools container to allow easy db dumps or imports from the host machine.
Example commands:
- docker-compose up db to run the DB server
- docker-compose run mysqltools mysql for a mysql command line
- docker-compose run mysqltools mysqldump to dump the database to the mounted db_dumps folder on the host
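Since mysqldump writes to stdout, in practice I wrap the dump and import commands in a small shell invocation so the files end up in the mounted db_dumps folder. Something like the following should work (the filename is just an example, and the credentials match the compose config below):

docker-compose run mysqltools bash -c \
  "mysqldump -h db -u root -proot grahamgilchrist > /db_dumps/grahamgilchrist.sql"
docker-compose run mysqltools bash -c \
  "mysql -h db -u root -proot grahamgilchrist < /db_dumps/grahamgilchrist.sql"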
docker-compose.yml:
db:
  image: mysql:5.5
  environment:
    MYSQL_ROOT_PASSWORD: 'root'
    MYSQL_DATABASE: 'grahamgilchrist'
  ports:
    - "3306:3306"
mysqltools:
  image: mysql:5.5
  volumes:
    - ./db_dumps:/db_dumps
  working_dir: /db_dumps
  links:
    - db
  environment:
    MYSQL_DATABASE: 'grahamgilchrist'
    MYSQL_HOST: db
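For reference, drupal then connects to the linked db container by hostname. A minimal Drupal 7-style settings.php entry matching the compose config above would look roughly like this:

$databases['default']['default'] = array(
  'driver' => 'mysql',
  'database' => 'grahamgilchrist',
  'username' => 'root',
  'password' => 'root', // matches MYSQL_ROOT_PASSWORD on the db container
  'host' => 'db',       // the linked db service name from docker-compose
  'port' => 3306,
);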
NodeJS container
- Uses my own custom nodeJS image, which installs packages within the container at build time.
- Packages are installed from package.json in the project root.
- GruntJS is installed as a global node package using a custom Dockerfile in the project.
- The default up command starts the grunt watch development task, which auto-recompiles frontend assets as they change.
- I also added another container for installing bower packages needed by the frontend tasks. This is based on a community image in the docker registry. The bower container mounts the current directory and installs packages directly into the mounted project folder on the host.
Example commands:
- docker-compose up grunt starts the grunt watch command
- docker-compose run grunt grunt sass runs the grunt sass task
- docker-compose run bower install installs bower packages into the project folder
docker-compose.yml:
grunt:
  build: docker/grunt
  command: grunt watch
  volumes:
    - ./:/usr/src/app
bower:
  image: blinkmobile/bower:1.3.12
  environment:
    bower_allow_root: true
    bower_analytics: false
  working_dir: /data
  volumes:
    - ./:/data
docker/grunt/Dockerfile:
FROM grahamgilchrist/node-onbuild:0.12
# Install grunt-cli
RUN npm install -g [email protected] \
&& npm cache clear
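For completeness, the onbuild base image installs node packages from the project's package.json at build time, so the grunt plugins used by the build tasks live there. A minimal illustrative example (the project name, plugin list and versions here are hypothetical, not the exact contents of my project):

{
  "name": "my-drupal-project",
  "private": true,
  "devDependencies": {
    "grunt": "^0.4.5",
    "grunt-contrib-watch": "^0.6.1",
    "grunt-sass": "^1.0.0"
  }
}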
Future plans
By and large, I've been happy with how this setup has worked out, but there are a few things I'd still like to improve:
- Add a container for deployment scripts. Currently these still rely on specific programs being installed on the host machine, such as rsync, ftp etc., and it would be great to have these in containers also.
- Utilise drush build to build the entire site from a simple modules manifest. This will avoid having to check in a fully-built site, and with docker it will allow the site to be built from scratch using only a minimal source code repository.