Automated Drupal deploys with drush make and Wercker
I've recently updated all my Drupal sites to use a build system based on drush make, with continuous delivery via the Wercker service. Doing this has massively reduced the maintenance overhead, making updates to small sites as simple as editing a single file in the git repo. I thought I'd share the process and ideas in case it helps anyone else with similar goals.
Background
I maintain a bunch of Drupal sites for friends and family, but due to the nature of these projects, development on them is sporadic.
I've already mentioned how I use docker with Drupal in an earlier post. Due to the time lag between development changes, I usually need to set up my Drupal environment and database again each time I work on these sorts of projects. Docker simplifies this immensely, allowing me to get a guaranteed working environment with just a few commands.
However, another issue I've been facing recently is security and module maintenance. Like any modern web site, Drupal sites need regular maintenance, at minimum for security updates, but also for bug fixes and new features. The Drupal notification module is great for emailing alerts when critical security updates are released, but my workflow for dealing with these was quite slow, so sites would end up falling behind on these important updates. I wanted to improve and automate this process so that maintenance would be a breeze, and I hoped to use a site build/deploy system based on docker.
Existing workflow
My standard way of dealing with Drupal sites in version control was to commit the entire Drupal site folder, including files and all core/contrib modules. The only exception was a `local-settings.php` file which I created per environment to hold environment-specific settings such as database connection details. Deployment was then just a case of sending the folder via FTP or rsync to the target server, making sure not to overwrite the local settings and uploaded files.
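As a rough sketch, the old rsync deploy looked something like this (the user, host and docroot here are hypothetical):

```bash
# Push the committed site folder to the server, leaving the
# per-environment settings and uploaded files untouched
rsync -av \
  --exclude 'sites/default/local-settings.php' \
  --exclude 'sites/default/files/' \
  ./ user@example.com:/var/www/
```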
When I received an alert that a new security update for a module was available, my workflow would then be:
- Clone site to development machine
- Set up a Drupal instance with docker
- Run `drush pm-update` or download modules manually into the appropriate `sites/all/modules` folder
- Check and commit the changed files from the updated module
- Push to remote git repo
- Deploy the updates
- Run the database update script on the live site
This worked OK for me initially, but once the number of sites I was maintaining grew, updating them all became quite a long-winded chore, and I looked for ways to automate it. I wanted a way to tell a site to update to specific module versions whilst running the fewest possible manual commands. I also wanted this to happen remotely, as part of an automated build process not tied to the machine I was developing on.
Identifying the issues
After thinking about this problem a while, I identified two main parts of the process which needed automation.
- Build - I wanted to automate the two steps which downloaded new module versions via drush and then committed them to the repo.
- Deploy - Once a new repo state was created I wanted to automatically deploy this from the git source master branch to the destination server.
Automating both these things would reduce the maintenance time significantly.
Build
Whilst it would certainly be possible to write a shell script which does the drush module download and git commit, I wanted a way to specify the use of particular module versions. The solution I settled on was based on an excellent tool called drush make. Drush make allows you to specify the contents of your Drupal site as a YAML file of dependencies. Running `drush make project.make.yml ./project` then downloads the specified versions of Drupal core and contributed modules, and constructs a fully working Drupal site folder. This solution meant I no longer needed to store core and contrib module code in my repo, which has two advantages:
- Saving repo space - no longer storing duplicates of core and module code that is also hosted on drupal.org
- Easy version changes - just change the module version number in the yaml file
Here is an example `project.make.yml` file:
```yaml
core: 7.x
api: '2'
projects:
  drupal: '7.51'
  admin_menu: '3.0-rc5'
  ctools: '1.11'
  date: '2.9'
  link: '1.4'
  page_title: '2.7'
  pathauto: '1.3'
  token: '1.6'
  views: '3.14'
  wysiwyg: '2.2'
  site_theme:
    type: theme
    download:
      type: copy
      url: 'file://./project_source/themes/site_theme'
```
As you can see from the last entry in the YAML file, custom modules and themes can be easily integrated into this system. I kept these in a separate folder in the root of the repository so that they could be accessed by the drush make command, but you could also split them out into their own repositories and install from git via drush, as sketched below.
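For example, a theme split out into its own git repository could be pulled in with drush make's git download type; a minimal sketch, with a hypothetical repo URL:

```yaml
site_theme:
  type: theme
  download:
    type: git
    url: 'https://bitbucket.org/user/site_theme.git'
    branch: master
```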
Using this build system allows the repo to remain very small and light. A minimal file structure now looks something like this:
```
- project.make.yml
- project_source/
  - themes/
    - site_theme/
  - files/          # public files folder
    - ...
  - settings/
    - settings.php
- docker-compose.yml
```
We have a `docker-compose.yml` file which is used to set up the environment for the project during development, and the drush make file for creating the Drupal site folder. A `project_source` folder contains the files unique to this specific site, including any custom modules or themes, the uploaded file system and the Drupal `settings.php` file. Note that I keep a backup of the live server's files directory committed in the repo for use during development, but this folder is not modified during deployment.
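To refresh that backup, an rsync pull from the live server does the job; a sketch assuming the same hypothetical host and docroot as before:

```bash
# Pull the live uploaded files down into the repo's backup copy
rsync -av user@example.com:/var/www/sites/default/files/ project_source/files/
```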
The `docker-compose.yml` file follows the structure outlined in an earlier post, with the addition of a new drush section looking like this:
```yaml
drush:
  image: drush/drush:7
  volumes:
    - ./project:/app
    - ./project.make.yml:/project.make.yml
  links:
    - db
```
This means the only commands needed to get a local dev site up and running are now:
- Import the database:
  `docker-compose run mysqltools sh -c "mysql -uroot -proot db_name < db_dump_name.sql"`
- Build the site:
  `docker-compose run drush make /project.make.yml`
- Copy the uploaded files folder:
  `cp -R project_source/files/ project/sites/default/files`
- Add `settings.php` with DB connection details and site-specific settings:
  `cp -R project_source/settings/settings.php project/sites/default`
- Run the dev site:
  `docker-compose up drupal`
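To save typing, the build steps can be wrapped in a small shell script; a minimal sketch (the script name is my own, the commands are the ones above):

```bash
#!/bin/sh
# rebuild.sh - rebuild the local dev site from the make file
set -e

# Build the Drupal site folder from the make file
docker-compose run drush make /project.make.yml

# Copy in the uploaded files and environment-specific settings
cp -R project_source/files/ project/sites/default/files
cp -R project_source/settings/settings.php project/sites/default

# Start the dev site
docker-compose up drupal
```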
Importantly, these can all be very easily repeated by a build system, which leads me to...
Deploy
The last step in this automation procedure is deployment. Having created a Drupal site which can be automatically built from a list of module versions, we now also want to deploy it. I wanted a continuous delivery (CD) system which would monitor the master branch of the repository in github or bitbucket, and automatically build and deploy a site from that source. After some investigation, I settled on the Wercker service, due to its ability to work with private bitbucket repos on the free plan, and its extensive registry of community-supplied steps and build box environments. Integration with the service was very easy, just requiring that I add a `wercker.yml` file to the root of the repo and set up bitbucket integration via the web interface.
The `wercker.yml` contains two wercker "pipelines": one for building the site with drush, and one for deploying it. In this instance I only really need one pipeline for the entire process, but separating them out makes it easier to add a second or alternative deploy target later, or to test one part of the process in isolation.
Here's an example of the yaml file I am using:
```yaml
box: busybox
build:
  box: php
  steps:
    - install-packages:
        packages: unzip git
    - script:
        name: Install Drush
        code: |-
          php -r "readfile('https://s3.amazonaws.com/files.drush.org/drush.phar');" > drush
          php drush core-status
          chmod +x drush
          sudo mv drush /usr/local/bin
    - script:
        name: Drush make
        code: drush make project.make.yml build/
    - script:
        name: Copy settings.php and files
        code: |-
          cp project_source/settings/settings.php build/sites/default/settings.php
          cp -R project_source/files build/sites/default/files
```
The build section runs in a predefined PHP environment, and simply installs drush and replicates the same site build commands which we ran locally. The output of this build step contains a `build/` folder with a fully working Drupal site which is ready to be deployed.
The second pipeline in the `wercker.yml` is for deployment:
```yaml
# We now have a working Drupal site ready to be piped to deployment
deploy:
  box: debian
  steps:
    - install-packages:
        packages: ssh-client rsync
    - add-to-known_hosts:
        hostname: $SSH_HOST
    # Make a temp file, and store its location in env var PRIVATEKEY_PATH
    - mktemp:
        envvar: PRIVATEKEY_PATH
    # Write the contents of the private key env var to the temp file
    # $PRIVATEKEY is an env var containing the private key needed to ssh to the deploy server
    - create-file:
        name: write key
        filename: $PRIVATEKEY_PATH
        content: $PRIVATEKEY
        overwrite: true
        hide-from-log: true
    - script:
        name: rsync files
        code: rsync -rv --delete --compress --filter="exclude /sites/default/files/***" -e "ssh -i $PRIVATEKEY_PATH -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no" ./build/ $SSH_USER@$SSH_HOST:/var/www
```
The steps here essentially deal with using rsync to deploy the output of the build step to the appropriate server. This pipeline utilises three environment variables set inside wercker to determine the ssh user and host, and to provide a private key for ssh access to the server.
New workflow
After adding drush make and Wercker, the system is now totally automated and working very nicely. The new workflow is much quicker, involving only edits to the drush make file in the master git branch:
- Clone repo
- Update `project.make.yml` versions
- Commit the file
- Push to the remote git repo
- Drupal site auto-builds and deploys to remote server without any interaction
- Run the database update script on the live site
This process is now so simple that I can even edit the `project.make.yml` files directly in the github/bitbucket web interface, without ever touching the command line, reducing the update process to two steps - editing the file and running the db update. This also has the great advantage that I can make these updates remotely, when I am not at a machine with a git terminal. The builds are also stored by Wercker, so if something goes wrong, I can easily log in to Wercker and deploy an older build.
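For reference, the db update itself is a one-liner if drush is installed on the live server (the host and docroot here are hypothetical):

```bash
# Apply any pending database updates on the live site
ssh user@example.com "cd /var/www && drush updatedb -y"
```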
The disadvantage of this process is a new reliance on external systems. If Wercker or drupal.org go down, I won't be able to build any projects, but I'm hoping that will be rare enough that it won't be a problem.
Future improvements
This process has made updating Drupal sites so much easier, but there are definitely some future improvements I want to make, which I may cover in future blog posts:
- Backup DB archive before wercker rsync
- I'd like to be able to take a database backup with wercker before a deploy, so that in the unlikely event of anything going wrong, I can easily revert the database as well as the file system. This is usually easy if the remote host allows command line database access (see the first sketch after this list).
- Make wercker use the same docker containers as the dev project
- Currently I repeat the build steps and create a new environment in wercker for the build. It should be possible to use the same docker container environments as we do locally, which would keep the build environments DRY and consistent (see the second sketch after this list).
- Move wercker commands to separate wercker step repositories, so that they can be re-used between projects easily.
- Wercker allows you to publish steps yourself to its registry. Wrapping up the drush build process in a single step would mean the same step could be used more easily for multiple sites.
- Look into docker image deployment
- At the moment I copy the deployed site to a web server which is already provisioned. I'd like the servers to also be provisioned from a reproducible state, potentially using docker images.
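For the pre-deploy database backup, I imagine a script step before the rsync along these lines (the database name and backup path are hypothetical, and this assumes mysqldump access on the host):

```yaml
- script:
    name: backup database
    code: |-
      ssh -i $PRIVATEKEY_PATH -o StrictHostKeyChecking=no \
        $SSH_USER@$SSH_HOST \
        "mysqldump db_name | gzip > ~/db_name-predeploy.sql.gz"
```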
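And since wercker boxes can be pulled from Docker Hub, sharing the environment might be as simple as pointing the build pipeline at the same drush image that docker-compose uses locally; an untested sketch:

```yaml
build:
  # Reuse the same image as the local docker-compose drush service
  box: drush/drush:7
  steps:
    - script:
        name: Drush make
        code: drush make project.make.yml build/
```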