Is there a Python package that allows teams to share venvs through a git-like interface?

If the virtual environment is on a shared drive (group-readable), then your team members should be able to access it. A virtual environment is just a directory.

But my understanding is that this does not eliminate the need for each individual to install their own local version of Python. Is that correct?

Virtual environments have their own Python binaries, which you can see by running which python after the virtual environment is activated.
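For example, on a Linux machine the check might look like this (the paths are purely illustrative):

# Before activation, python resolves to the system interpreter:
$ which python
/usr/bin/python

# After activation, it resolves to the binary inside the venv:
$ source /path/to/venv/bin/activate
(venv) $ which python
/path/to/venv/bin/python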

So is there a special package to enable this? Or is it just a regular venv that is included in the git repository with the other files? If we do this, then we must all put the venvs in the same place on our file systems, OR we have to go in and manually change the VIRTUAL_ENV variable in activate.bat. Is that correct?
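For reference, the activation scripts do hard-code the environment's absolute path, which is what would need editing if the venv were moved; the paths below are made-up examples:

# venv/bin/activate (POSIX shells) contains a line like:
VIRTUAL_ENV="/home/alice/project/venv"

# venv\Scripts\activate.bat (Windows) contains the equivalent:
set "VIRTUAL_ENV=C:\Users\alice\project\venv"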

I would advise against committing a virtual environment directory to version control, since it contains binaries and configuration files that don't belong there. It's also unnecessary, because the dependencies are tracked in a requirements.txt file, which lists the pip dependencies and is committed to version control. Additionally, when the virtual environment is activated, the VIRTUAL_ENV environment variable is exported automatically, so there is no need to modify it.
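A minimal sketch of that workflow (the venv directory name is just an example):

# Record the current environment's pip dependencies:
pip freeze > requirements.txt

# Keep the environment itself out of version control:
echo "venv/" >> .gitignore

# Commit only the text files:
git add requirements.txt .gitignore
git commit -m "Track dependencies instead of the venv"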

Conclusion

For simplicity, it's probably best to have each user create their own virtual environment and install the dependencies from requirements.txt on their local machine. This also ensures users don't make a change to the virtual environment that affects other users, which is a drawback of the shared drive approach above.
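The per-user setup is only a few commands (the repository URL and directory names are placeholders):

# Clone the project and create a local virtual environment:
git clone https://github.com/example/project.git
cd project
python3 -m venv venv

# Activate it (venv\Scripts\activate.bat on Windows), then install the pinned dependencies:
source venv/bin/activate
pip install -r requirements.txt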

If they want to pull the latest requirements, pulling the latest changes with git pull and reinstalling the dependencies with pip install -r requirements.txt is good enough. You just have to ensure the virtual environment is activated first, otherwise the dependencies will get installed system-wide. This is where the pipenv package also comes in handy.
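Updating then looks something like this (again assuming the venv lives in venv/):

# Pull the latest code and pinned dependencies:
git pull

# Make sure the environment is active before installing:
source venv/bin/activate
pip install -r requirements.txt

# Roughly the same workflow with pipenv, if the project uses a Pipfile instead:
# pipenv sync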

Usually in my team projects, the README contains instructions for each team member to get this set up.

Additionally, as Daniel Farrell helpfully mentioned in the comments, pip won't be able to manage packages like libffi, openssl, python-devel, etc. inside a virtual environment. This is where Docker containers become useful, since you can install dependencies inside an isolated environment built on top of the host operating system. This ensures the dependencies don't mess with the system-wide packages, which is a good practice to follow in any case.

An example Dockerfile I have used in the past:

FROM python:3.8-slim-buster

# Set environment variables:
ENV VIRTUAL_ENV=/opt/venv
ENV PATH="$VIRTUAL_ENV/bin:$PATH"

# Create virtual environment:
RUN python3 -m venv $VIRTUAL_ENV

# Install dependencies:
COPY requirements.txt .
RUN pip install -r requirements.txt

# Run the application:
COPY app.py .
CMD ["python", "app.py"]

I adapted this from the Elegantly activating a virtualenv in a Dockerfile article.
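Building and running the image is then the usual Docker workflow (the image tag is just an example):

# Build the image from the directory containing the Dockerfile:
docker build -t myapp .

# Run the application inside the container:
docker run --rm myapp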
