Replacing Virtualenv with Docker

I hate working with Python on my local machine. Conflicting versions are a pain in the ass.

When I learned how to use Dockerized Python instead of my local Python installation, it changed my life. In this tutorial I’ll outline how you can change your workflows as well.

Find the code for this tutorial here.

Install Docker

Begin by installing Docker on your laptop.

Here are instructions for Mac.

Here are instructions for Windows.

Configure a Dockerfile

Docker works via a Dockerfile.

In layman’s terms, a Dockerfile defines another operating system with its own filesystem and set of installed programs.

When the Dockerfile is “built” it creates an “image”.

We then “run” programs in our built image. This allows us to run programs in an operating system that is separate from the one on our laptop.

In other words, each of our projects can have a completely separate, clean environment. Unlike virtualenv, which is Python-specific, this approach works for any language or project.

Place this Dockerfile in your project directory. The filename should just be Dockerfile.

FROM python:3.10

COPY ./requirements.txt /usr/local/src/app/requirements.txt
RUN pip install -r /usr/local/src/app/requirements.txt

ENTRYPOINT ["/bin/bash"]

It assumes your requirements.txt sits at the root of your project directory. If requirements.txt lives somewhere else, change the ./requirements.txt path to point at it relative to your project root.

It copies your requirements.txt from your laptop’s file system into the Docker image’s file system.

It then uses the pip program that already exists in the base image (defined by FROM python:3.10) to install the requirements.txt dependencies into your Docker image.
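For example, if your requirements file lived somewhere other than the project root — say a hypothetical requirements/base.txt — the same Dockerfile would just point at that path instead:

```dockerfile
FROM python:3.10

# The left-hand path is relative to the build context (your project root)
COPY ./requirements/base.txt /usr/local/src/app/requirements.txt
RUN pip install -r /usr/local/src/app/requirements.txt

ENTRYPOINT ["/bin/bash"]
```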

Configure docker-compose.yaml

Next we need to configure a docker-compose.yaml file in the same directory as your Dockerfile.

docker-compose is a tool that makes it easier to work with Docker.

I use a file like this:

version: "3"
services:
  app:
    build:
      context: .
    working_dir: /opt/project
    volumes:
      - .:/opt/project

Now we can run a command like this:

$ docker-compose run app
root@e543fff6db26:/opt/project# python --version
Python 3.10.5

And docker-compose will build our Dockerfile into an image and give us a shell into the running container.

The volumes section of the docker-compose file mounts the project directory on our laptop into our Docker container’s file system.

This allows us to work on our project files while using the programs installed in our container.
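For example, a file created inside the container shows up in the project directory on your laptop, and vice versa. Here’s an illustrative session (the container hash and file listing will differ on your machine):

```shell
$ docker-compose run app
root@e543fff6db26:/opt/project# ls          # same files as the project directory
Dockerfile  docker-compose.yaml  requirements.txt
root@e543fff6db26:/opt/project# touch hello.txt
root@e543fff6db26:/opt/project# exit
$ ls hello.txt                              # the file now exists on the laptop too
hello.txt
```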

This means that we can use a clean, separate python installation for each of our projects without dealing with conflicting versions or virtual environments.

As we can see by running python --version, the version matches that of the base image specified in the Dockerfile (python:3.10).

Gotchas

There are some gotchas here.

  1. if you add a requirement to requirements.txt, you will need to rebuild the image. You can achieve this with docker-compose build app. After doing this, the new dependency will be present when you run the container.
  2. There are probably more that I’m forgetting about right now.
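For instance, the rebuild workflow after adding a dependency looks like this (a sketch — requests here is just a stand-in for whatever package you add):

```shell
$ echo "requests" >> requirements.txt
$ docker-compose build app     # re-runs pip install with the updated requirements.txt
$ docker-compose run app       # the new dependency is now available in the container
root@e543fff6db26:/opt/project# python -c "import requests"
```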
