What is a Python Environment?
When you install Python on your machine, you get a single global Python interpreter and a shared pool of installed packages. Every script you run draws from that same pool by default.
That works fine for small experiments — but the moment you're juggling multiple projects,
it becomes a problem. Project A might need Django 4.2, while Project B was built on Django 3.2
and breaks on anything newer. Since only one version can live in the global environment at a time,
you're stuck.
A Python environment is an isolated container that bundles:
- A Python interpreter (a specific version)
- A set of installed packages, independent of every other environment
- Its own pip and associated tooling
Changes inside one environment have zero effect on anything outside of it. You can have as many environments as you like — one per project is the standard approach.
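A quick way to see this isolation in action is to create two environments and ask each one where it installs packages. A minimal sketch (the /tmp paths are arbitrary examples):

```shell
# Two freshly created environments, each with its own interpreter
# and its own site-packages directory.
python3 -m venv /tmp/env-a
python3 -m venv /tmp/env-b

# sysconfig reports where each interpreter installs packages —
# the two paths are different directories:
/tmp/env-a/bin/python -c 'import sysconfig; print(sysconfig.get_path("purelib"))'
/tmp/env-b/bin/python -c 'import sysconfig; print(sysconfig.get_path("purelib"))'
```

Installing a package with one environment's pip writes only into that environment's site-packages; the other environment never sees it.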
Why Virtual Environments Matter
Here's a concrete scenario. You have two projects side by side:
📁 project-a/ ← needs requests==2.28.0
📁 project-b/ ← needs requests==2.31.0
Without virtual environments, installing requests 2.31.0 for Project B silently breaks Project A.
There's no warning — things just stop working.
Virtual environments solve this by giving each project its own isolated space. Beyond that, they help with:
- Reproducibility — you can snapshot exactly what's installed and recreate it on any machine
- Clean system Python — on macOS and Linux the OS itself may rely on certain Python packages; touching those can cause real damage
- Smoother collaboration — teammates clone the repo, restore the environment from a file, and are running immediately
In practice: always use a virtual environment, even for small projects. It costs nothing and saves real headaches.
venv — The Built-in Way
venv is Python's built-in module for creating virtual environments. No installation needed —
it ships with Python 3.3 and above. For most projects it's all you'll ever need.
Creating an Environment
Navigate to your project folder and run:
python3 -m venv .venv
This creates a .venv directory inside your project. The name .venv is just a convention —
you can name it anything — but it's widely recognized by editors, linters, and .gitignore
templates, so it's a good habit to stick with it.
If you need a specific Python version, point directly at that interpreter:
python3.11 -m venv .venv
Activating and Deactivating
Creating the environment doesn't switch you into it automatically. You need to activate it first. The command differs by OS:
macOS / Linux:
source .venv/bin/activate
Windows (Command Prompt):
.venv\Scripts\activate.bat
Windows (PowerShell):
.venv\Scripts\Activate.ps1
Once active, your shell prompt will show the environment name:
(.venv) your-machine:project-a $
From this point on, python and pip refer to the isolated environment — not your system Python.
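You can verify this from the shell itself. A minimal check (the /tmp/demo-venv path is just an example):

```shell
python3 -m venv /tmp/demo-venv
source /tmp/demo-venv/bin/activate

# With the environment active, `python` resolves inside it:
which python                               # a path under /tmp/demo-venv/bin/
python -c 'import sys; print(sys.prefix)'  # the environment's root directory

deactivate
```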
To leave the environment:
deactivate
Your prompt returns to normal and all commands go back to the global Python.
What's Inside
The .venv directory has a predictable structure:
📁 .venv/
├── 📁 bin/ ← python, pip, activate scripts (Linux/macOS)
├── 📁 Scripts/ ← same, but on Windows
├── 📁 lib/ ← installed packages live here
│ └── 📁 python3.x/
│ └── 📁 site-packages/
└── 📄 pyvenv.cfg ← records which Python this env was built from
The pyvenv.cfg file is worth knowing about — it stores the path to the base Python interpreter
and the environment version. You don't need to edit it, but it's useful to know it exists when
debugging environment issues.
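For reference, a typical pyvenv.cfg is only a few lines; the exact paths and version depend on your machine, and newer Python releases add a couple of extra fields:

```
home = /usr/bin
include-system-site-packages = false
version = 3.11.9
```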
Managing Dependencies
Installing Packages
With the environment active, install packages exactly as you normally would with pip:
pip install requests
pip install django==4.2
pip install "fastapi>=0.100"
Everything gets installed into .venv/lib/, completely separate from any other environment.
Freezing Dependencies
Once your project is working, capture the exact list of installed packages:
pip freeze > requirements.txt
This produces a file like:
certifi==2024.2.2
charset-normalizer==3.3.2
idna==3.6
requests==2.31.0
urllib3==2.2.1
Every package and its precise version is recorded. Commit this file to your repository — it's the contract for what your project needs to run.
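Alongside pip freeze, you can query an individual package's version from inside Python using the standard library's importlib.metadata (Python 3.8+). A small sketch:

```python
from importlib.metadata import version, PackageNotFoundError
from typing import Optional

def installed_version(name: str) -> Optional[str]:
    """Return the installed version of `name`, or None if it isn't installed."""
    try:
        return version(name)
    except PackageNotFoundError:
        return None

print(installed_version("pip"))              # e.g. "24.0" (varies by environment)
print(installed_version("no-such-package"))  # → None
```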
Restoring Dependencies
On a fresh machine (or when a teammate clones the repo), create a new environment and restore everything in one command:
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
The project is now running with exactly the same packages as the original environment.
pyenv — Managing Python Versions
venv creates isolated environments, but it uses whatever Python version is already on your system.
If you need to work with Python 3.9 on one project and Python 3.12 on another, venv alone
won't help — you'd need both versions installed.
That's exactly what pyenv does: it lets you install and switch between multiple Python versions on a single machine without touching the system Python.
Installing pyenv
macOS (via Homebrew):
brew install pyenv
Linux (via the installer script):
curl -fsSL https://pyenv.run | bash
After installation, add the following to your shell config file (~/.zshrc, ~/.bashrc, or equivalent):
export PYENV_ROOT="$HOME/.pyenv"
export PATH="$PYENV_ROOT/bin:$PATH"
eval "$(pyenv init -)"
Then restart your shell:
exec "$SHELL"
Installing a Python Version
List available versions:
pyenv install --list
Install a specific one:
pyenv install 3.11.9
pyenv install 3.12.3
See what you have installed:
pyenv versions
Which outputs something like:
system
3.11.9
* 3.12.3 (set by /home/user/.pyenv/version)
The * marks the currently active version.
Setting a Version
Set a version globally (affects your whole machine):
pyenv global 3.12.3
Set a version locally (scoped to the current directory — stored in a .python-version file):
pyenv local 3.11.9
The local version takes priority over the global one. Commit .python-version to your repo
so everyone on the team uses the same Python version automatically.
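The .python-version file itself is plain text containing nothing but the version string:

```
3.11.9
```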
Set a version for just the current shell session:
pyenv shell 3.10.14
Combining pyenv with venv
pyenv and venv complement each other perfectly. Use pyenv to get the Python version you need,
then use venv to create an isolated environment from it:
pyenv local 3.11.9
python -m venv .venv
source .venv/bin/activate
Now .venv is built on Python 3.11.9, regardless of what's installed globally on the system.
conda — Environments for Data Work
conda is a different kind of tool. Where venv and pip focus purely on Python packages,
conda is a full environment and package manager that can handle non-Python dependencies too —
C libraries, CUDA drivers, R packages. It's the standard in data science and machine learning workflows.
conda ships as part of either Anaconda (a large, batteries-included distribution) or
Miniconda (a minimal installer to which you add only what you need). For most development work, Miniconda is the better choice.
Creating a conda Environment
Create a new environment with a specific Python version:
conda create --name myenv python=3.11
Or create it in a specific directory (useful for project-local environments):
conda create --prefix ./.condaenv python=3.11
Activating and Deactivating
Activate by name:
conda activate myenv
Activate a prefix-based environment:
conda activate ./.condaenv
Your prompt will update to reflect the active environment:
(myenv) your-machine:project $
Deactivate:
conda deactivate
Managing Packages with conda
Install packages using conda install:
conda install numpy pandas matplotlib
conda resolves dependencies across the entire environment, including non-Python libraries — something pip cannot track.
You can still use pip inside a conda environment for packages not available in the conda repositories:
pip install some-package
Just be aware that mixing conda install and pip install in the same environment can sometimes
cause subtle conflicts. A good rule of thumb: install as much as possible via conda first, then
use pip only for what's missing.
Exporting and Restoring
Export the environment to a YAML file:
conda env export > environment.yml
The resulting file looks like this:
name: myenv
channels:
- defaults
dependencies:
- python=3.11.9
- numpy=1.26.4
- pandas=2.2.1
- pip:
- some-pip-only-package==1.0.0
Restore the environment from the file:
conda env create -f environment.yml
Commit environment.yml to your repository alongside (or instead of) requirements.txt for conda-based projects.
Choosing the Right Tool
With several options available, here's a quick guide to picking the right one:
| Scenario | Recommended tool |
|---|---|
| General Python development | venv |
| Need multiple Python versions | pyenv |
| Data science / ML project | conda |
| Need both Python version control and isolation | pyenv + venv |
| Team uses mixed Python/non-Python dependencies | conda |
For most web development and backend work, pyenv + venv is the sweet spot:
lightweight, well-supported, and zero overhead.
For data science and machine learning, conda wins because of its ability to manage
native library dependencies like BLAS, CUDA, and HDF5 alongside Python packages.
Best Practices
Always create a virtual environment for every project — even one-off scripts. It takes five seconds and eliminates an entire class of dependency problems.
Name your environment .venv — it's recognized automatically by VS Code, PyCharm,
most .gitignore templates, and the wider Python tooling ecosystem.
Never commit the environment directory itself — add .venv/ and .condaenv/ to .gitignore.
Commit the dependency file (requirements.txt or environment.yml) instead.
Pin your dependencies — use exact versions in requirements.txt for production projects.
This avoids surprises when a dependency releases a breaking update.
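For illustration, a pinned entry next to a ranged one in requirements.txt (requests is just an example package):

```
# Pinned: reproducible installs, no surprises
requests==2.31.0

# Ranged: accepts newer releases within bounds
requests>=2.28,<3.0
```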
Use .python-version with pyenv — committing this file means everyone on the team
gets the same Python version automatically, without any extra coordination.
Choose conda channels carefully — stick to conda-forge or defaults for stability.
Mixing channels from different sources is a frequent cause of hard-to-debug conflicts.
