Testing Django with docker-compose and PostgreSQL

At the moment it’s pretty common to have the following setup for local development: docker-compose simulating the production setup, with the application, the database and all the other containers needed to run the app, sometimes even nginx. This is a good pattern, because it keeps the local setup as close to production as possible. However, running unit/integration tests in this setup can be slow: the slowest part of testing is communication with the DB, and it’s usually avoided by testing against an in-memory sqlite database. But what if you can’t use in-memory sqlite because you’re tied to database-specific functions? You can mock away all DB access in your tests, and that is the most performant way, but it also means nothing returned by the database is ever tested, so integration tests can’t really be run this way.
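
For example, a query built on django.contrib.postgres only works against a real Postgres database; sqlite has no equivalent of its full-text machinery. A quick illustration (the Article model and the myapp module are hypothetical):

from django.contrib.postgres.search import SearchVector

from myapp.models import Article  # hypothetical model


def find_articles(term):
    # SearchVector is backed by Postgres' to_tsvector(), so this
    # queryset can't be evaluated against an in-memory sqlite DB
    return Article.objects.annotate(
        search=SearchVector('title', 'body'),
    ).filter(search=term)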

Docker-compose to the rescue

There is a very simple solution to this problem: use tmpfs for the test database. I assume that you’re testing with py.test (because you are, right? 😉). The docker-compose file (docker-compose-test.yml, for example) could look like this:

version: '3'

services:
  db_tmpfs:
    image: postgres
    environment:
      # recent postgres images refuse to start without a password;
      # trust auth keeps the passwordless connection URL below working
      POSTGRES_HOST_AUTH_METHOD: trust
    tmpfs:
      - /var/lib/postgresql/data

  test:
    build: .
    command: py.test /code/
    volumes:
      - .:/code
    depends_on:
      - db_tmpfs
    environment:
      DJANGO_DATABASE_URL: psql://postgres@db_tmpfs:5432/postgres
      DJANGO_SECRET_KEY: adsf

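The compose file doesn’t show how DJANGO_DATABASE_URL is consumed on the Django side; the psql:// scheme is the one django-environ understands, so the settings could read it roughly like this (a minimal sketch, assuming django-environ):

# settings.py (excerpt), a sketch assuming django-environ
import environ

env = environ.Env()

SECRET_KEY = env('DJANGO_SECRET_KEY')

DATABASES = {
    # env.db() parses psql://postgres@db_tmpfs:5432/postgres into the
    # usual ENGINE/NAME/USER/HOST/PORT dict for Django
    'default': env.db('DJANGO_DATABASE_URL'),
}
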
This way you can run your tests against a real Postgres DB with:

docker-compose -f docker-compose-test.yml run test

A Postgres DB will be created, but its data directory is kept in memory for the duration of the tests, which significantly reduces the time wasted on DB disk operations.
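
And because the tests now talk to a real Postgres, they exercise exactly the database-specific behaviour that mocks would hide. For instance, with pytest-django (an assumption, as are the module paths and the find_articles helper sketched earlier):

# test_search.py, a sketch assuming pytest-django
import pytest

from myapp.models import Article  # hypothetical model
from myapp.search import find_articles  # hypothetical helper from the sketch above


@pytest.mark.django_db
def test_full_text_search():
    # This round-trips through Postgres, exercising to_tsvector()
    # behaviour that neither sqlite nor mocks could verify
    Article.objects.create(title='Docker', body='tmpfs speeds up tests')
    assert find_articles('tmpfs').count() == 1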