Node packages in docker and the node-onbuild container

I've recently been experimenting with ways to run node projects inside docker, which resulted in a new "onbuild" container I'm now using for node projects on GitHub.

A built image of this container is also available on the docker hub registry at grahamgilchrist/node-onbuild.

What is ONBUILD?

Many of the official docker hub images utilise the ONBUILD feature of docker. ONBUILD lets an image register instructions (such as copying files into the container or running commands) that are executed later, when another image is built from it. This means you can distribute a pre-built image for a particular environment, but still perform the common build tasks for that type of project.

For example, the official NodeJS docker hub repo offers an 'onbuild' tagged image, which is identical to the regular node image, but additionally copies a package.json file from your project working directory into the container and runs npm install when you build your own image from it. This means one Dockerfile/image can be used for many standard NodeJS projects. This pattern is common across the official docker hub images, for project environments like Python, NodeJS and PHP.
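To make this concrete, the onbuild variant of the node image is built from a Dockerfile along these lines (a sketch from memory of the docker-library/node onbuild Dockerfile; check the repo for the exact contents of a given version):

FROM node:0.12
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# These three instructions do not run when this image itself is built; they are
# recorded as triggers and run when a downstream image is built FROM it.
ONBUILD COPY package.json /usr/src/app/
ONBUILD RUN npm install
ONBUILD COPY . /usr/src/app
CMD [ "npm", "start" ]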

What's wrong with the official one?

This works great with containers which are meant to be built once. You run docker build and your project folder is happily copied into the container file system. It also works great for environments like Python, where running a pip install -r requirements.txt (the python equivalent of npm install) installs packages into a system directory outside of the project folder.
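For example, installing Python requirements inside a container writes the packages to the interpreter's site-packages directory, well away from the project folder (the exact path depends on the Python version and image):

pip install -r requirements.txt
# packages end up somewhere like /usr/local/lib/python2.7/site-packages,
# so mounting the project directory later does not hide them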

However, for development work in docker, we frequently want to mount a project directory as a volume inside our container to allow us to instantly see modifications to the source code without rebuilding. Since NodeJS projects store their packages in a node_modules folder within the project directory by default, any attempt to mount the project folder using the official docker hub repo will overwrite the node_modules directory from the build step.
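For example (my-node-app here is a hypothetical image built from the official onbuild image), bind-mounting the project over /usr/src/app shadows the node_modules directory the build created there:

# the host folder replaces the image's /usr/src/app, including node_modules,
# so node can no longer find the packages installed at build time
docker run -v "$(pwd)":/usr/src/app my-node-app npm start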

What about a container for npm?

One simple way to get around this problem is to use the base node image from the docker hub, and run npm inside the container. Using a docker-compose definition like the following, we can run docker-compose run npm install to install node packages into the mounted project folder as you would on a local machine.

npm:
  image: node:0.12
  volumes:
    - ./web:/data
  working_dir: /data
  entrypoint: ['npm', '--no-bin-links']
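With that service defined, npm commands run against the mounted folder much as they would locally, for example (express is just an illustrative package):

docker-compose run npm install
docker-compose run npm install express --save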

I experimented with this method for a while, but it had problems because:

  • Mac OS disk write speed was very slow (see boot2docker volume mounts on Mac OS)
  • The environment was partly stored outside the container, so it could be accidentally or intentionally modified by the user, and even a fresh docker build would not be guaranteed to produce a clean environment that could run the project. Keeping the environment entirely inside the container was much cleaner and less error-prone.

The node-onbuild container

What I really desired was a container which ran the npm install as part of the build process, and kept the packages internally within the container file system. The solution came when I realised that by default, node looks for node_modules directories recursively up the directory tree, starting at the current working directory. This meant we could install the packages one level up the directory tree inside of the container file system, and apps running inside the working directory would still see them and be able to require() them.

The node-onbuild container implements this idea by installing all node_modules into a directory one level above the mounted folder. The modules are thus still entirely contained inside the docker container and not the mounted directory.
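A minimal sketch of that idea in Dockerfile form (illustrative only, not necessarily the exact Dockerfile in the repo):

FROM node:0.12
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# install packages in the parent directory, so node_modules ends up at
# /usr/src/node_modules inside the image, one level above the mount point
ONBUILD COPY package.json /usr/src/
ONBUILD RUN cd /usr/src && npm install
CMD [ "npm", "start" ]

When /usr/src/app is later mounted from the host, a require('some-module') in the project still resolves, because node walks up from /usr/src/app to /usr/src/node_modules.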

This also has the side benefit that you can temporarily test changes to a module by checking a version of it out into the node_modules folder of the host working directory; modules there take priority over the ones one directory up inside the container.
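For example (the module name and repo URL are hypothetical), checking a working copy of a module out into the host's node_modules shadows the copy baked into the container:

# node resolves ./node_modules/some-module before /usr/src/node_modules/some-module
git clone https://github.com/example/some-module.git node_modules/some-module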

How to use the image

The repository structure on the node-onbuild repo follows the same directory convention as official docker hub repos (e.g. python: https://github.com/docker-library/python) whereby each version of node has its own directory with a Dockerfile.
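In other words, there is one directory per supported node version, each containing its own Dockerfile, roughly like this (the version directories shown are illustrative):

node-onbuild/
├── 0.10/
│   └── Dockerfile
└── 0.12/
    └── Dockerfile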

To use the image:

  • Mount a directory with a package.json file and anything else you want to be available (e.g. node project files) to /usr/src/app
  • The docker build step will copy package.json into the container and install the packages one directory above the project
  • docker run -v "$(pwd)":/usr/src/app grahamgilchrist/node-onbuild:0.12 node-command (docker run needs an absolute host path for the -v mount, hence $(pwd); see the example build and run below)
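Since the ONBUILD triggers fire when an image is built from this one, a typical project includes a one-line Dockerfile and is built and run along these lines (my-node-app and server.js are hypothetical names):

# Dockerfile in the project root
FROM grahamgilchrist/node-onbuild:0.12

# build the project image (this is the step that triggers the ONBUILD npm install)
docker build -t my-node-app .
# run it, mounting the source for live editing
docker run -v "$(pwd)":/usr/src/app my-node-app node server.js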

To use with docker compose:

docker-compose.yaml:

node:
  image: grahamgilchrist/node-onbuild:0.12
  volumes:
    - ./:/usr/src/app

Then run commands inside the container with:

docker-compose run node node-command
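Any node or npm command can be substituted for node-command, for example (server.js is a hypothetical entry point):

docker-compose run node npm test
docker-compose run node node server.js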

I hope others find this approach useful in their own node projects too!