
Python Virtual Environments Explained — Setup, Usage and Best Practices

In Plain English 🔥
Imagine you're a chef who cooks at two different restaurants. Restaurant A wants you to use only olive oil; Restaurant B insists on butter. You can't mix them up or both meals are ruined. A Python virtual environment is your own private kitchen for each project — it keeps that project's ingredients (libraries) completely separate from every other project's ingredients, so nothing ever gets mixed up or breaks each other.

Every Python developer hits the same wall eventually: you install a package for one project, and suddenly another project breaks. Or your colleague sends you code that works on their machine but explodes on yours with a cryptic version error. These aren't rare edge cases — they're the everyday reality of working with Python, and they happen because Python installs packages globally by default, meaning every project on your computer shares the same pool of libraries.

Virtual environments solve this by giving each project its own isolated bubble. Inside that bubble, you can install exactly the packages — and exactly the versions — that project needs, without touching anything else. Delete the bubble when the project is done and your computer is as clean as when you started.

By the end of this article you'll know how to create a virtual environment from scratch, activate and deactivate it, install packages inside it, save a record of those packages so teammates can replicate your setup exactly, and spot the traps that catch beginners off guard. No prior experience needed — we'll build everything step by step.

Why Global Package Installation Is a Ticking Time Bomb

When you first install Python and run pip install requests, that library lands in a single global location on your computer. Every Python script you ever write shares that same copy. Sounds convenient — until it isn't.

Picture this: your first project uses requests version 2.20. Six months later you start a new project that needs requests version 2.31 because it uses a feature that didn't exist before. You upgrade globally. Your old project breaks. You downgrade to fix the old project. The new project breaks. You're stuck in a loop with no clean way out.

This exact scenario has a name in software: dependency conflict. It's not hypothetical — it's the single most common source of pain for Python beginners and professionals alike.

Virtual environments break this cycle permanently. Each environment is a self-contained folder that holds its own Python interpreter and its own set of packages. Project A's requests 2.20 and Project B's requests 2.31 can coexist peacefully on the same machine because they live in completely different folders and never meet.

check_global_python_paths.py · PYTHON
import sys
import site

# This script shows you WHERE Python is currently looking for packages.
# Run this BEFORE creating a virtual environment to see the global setup.
# Then run it again AFTER activating one — the paths will be completely different.

print("=== Python Executable Location ===")
# This tells you WHICH python binary is currently running this script
print(f"Python binary: {sys.executable}")

print("\n=== Where Python Looks for Installed Packages ===")
# sys.path is the list of folders Python searches when you write 'import something'
for index, path in enumerate(sys.path):
    print(f"  [{index}] {path}")

print("\n=== Site-packages Directory (where pip installs things) ===")
# site.getsitepackages() returns the folders where pip drops installed libraries
for packages_dir in site.getsitepackages():
    print(f"  {packages_dir}")
▶ Output
=== Python Executable Location ===
Python binary: /usr/local/bin/python3

=== Where Python Looks for Installed Packages ===
[0]
[1] /usr/local/lib/python311.zip
[2] /usr/local/lib/python3.11
[3] /usr/local/lib/python3.11/lib-dynload
[4] /usr/local/lib/python3.11/site-packages

=== Site-packages Directory (where pip installs things) ===
/usr/local/lib/python3.11/site-packages
🔥
Why this output matters: Notice the Python binary points to a system-wide location like `/usr/local/bin/python3`. When you activate a virtual environment later, this path will change to something inside your project folder — that's your proof the isolation is working.
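That path change is easy to verify from inside Python itself. The sketch below (standard library only) relies on a documented property of environments created with the built-in venv module: inside one, sys.prefix points at the environment folder while sys.base_prefix still points at the original interpreter, so comparing the two tells you whether isolation is active.

```python
import sys

def running_in_virtualenv() -> bool:
    # In a venv, sys.prefix is redirected to the environment folder, while
    # sys.base_prefix keeps pointing at the interpreter the venv was created
    # from. Outside any environment, the two values are identical.
    return sys.prefix != sys.base_prefix

if running_in_virtualenv():
    print(f"Inside a virtual environment: {sys.prefix}")
else:
    print("Using the global Python installation")
```

Run it once before activation and once after — the message should flip.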

Creating and Activating Your First Virtual Environment

Python ships with a built-in module called venv (short for virtual environment). You don't need to install anything — it's already there, waiting. You create a virtual environment with a single terminal command, and from that moment on, that folder is your project's private kitchen.

The general pattern is: python -m venv &lt;folder_name&gt;. The name is just a folder name — most developers use venv or .venv (with a dot prefix so the folder is hidden by default on Mac/Linux).

Once created, you need to activate it. Activating tells your terminal 'for every command I run from now on, use the Python and pip inside this folder, not the global ones'. The activation command is slightly different depending on your operating system, but the effect is identical.

After activation your terminal prompt usually changes to show the environment name in parentheses — that's your visual confirmation that you're inside the bubble. When you're done working, deactivate drops you back to the global environment.

virtual_environment_setup.sh · BASH
# ── STEP 1: Create a project folder and move into it ──────────────────────────
mkdir my_weather_app
cd my_weather_app

# ── STEP 2: Create the virtual environment ────────────────────────────────────
# 'python -m venv' tells Python to run the built-in venv module
# '.venv' is the name of the folder that will hold the environment
# Using a dot prefix (.venv) keeps it hidden from casual 'ls' listings
python -m venv .venv

# You'll now see a .venv folder. Peek inside — it's just files:
# .venv/
#   bin/          ← Python binary and activation scripts (Mac/Linux)
#   lib/          ← Where pip will install packages FOR THIS PROJECT ONLY
#   pyvenv.cfg    ← Config file pointing back to the original Python

# ── STEP 3: Activate the environment ──────────────────────────────────────────

# On macOS / Linux:
source .venv/bin/activate

# On Windows (Command Prompt):
# .venv\Scripts\activate.bat

# On Windows (PowerShell):
# .venv\Scripts\Activate.ps1

# ── STEP 4: Confirm the environment is active ─────────────────────────────────
# Your prompt should now show (.venv) at the start, like:
# (.venv) user@machine:~/my_weather_app$

# Double-check by asking WHICH python is active:
which python          # Mac/Linux
# Should print: /path/to/my_weather_app/.venv/bin/python
# (NOT /usr/local/bin/python3 — that's the global one)

# ── STEP 5: Install a package INSIDE the environment ──────────────────────────
# This installs 'requests' ONLY for this project, not globally
pip install requests

# ── STEP 6: Deactivate when you're done working ───────────────────────────────
deactivate
# Prompt returns to normal — you're back in the global environment
▶ Output
# After 'source .venv/bin/activate':
(.venv) user@machine:~/my_weather_app$

# After 'which python':
/home/user/my_weather_app/.venv/bin/python

# After 'pip install requests':
Collecting requests
Downloading requests-2.31.0-py3-none-any.whl (62 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 62.6/62.6 kB 1.2 MB/s eta 0:00:00
Installing collected packages: requests
Successfully installed requests-2.31.0

# After 'deactivate':
user@machine:~/my_weather_app$
⚠️
Watch Out: Windows PowerShell Execution Policy
On Windows, running `.venv\Scripts\Activate.ps1` might throw 'running scripts is disabled on this system'. Fix it by running `Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser` in PowerShell once. This is a Windows security setting — it doesn't affect your code at all.
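Besides changing your prompt, the activate script also exports a VIRTUAL_ENV environment variable pointing at the environment folder, and deactivate unsets it. A script can read that variable to warn before anything gets installed globally — a minimal sketch, where the function name and warning text are our own choices:

```python
import os
import sys

def active_virtualenv_path():
    # The activate script exports VIRTUAL_ENV with the environment's path;
    # 'deactivate' removes it again. Returns the path, or None if no
    # environment is active.
    active_env = os.environ.get("VIRTUAL_ENV")
    if active_env is None:
        print("WARNING: no virtual environment is active — "
              "pip would install globally right now.", file=sys.stderr)
    return active_env

env_path = active_virtualenv_path()
if env_path:
    print(f"Active virtual environment: {env_path}")
```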

Freezing Dependencies So Your Team Gets Identical Setups

Creating a virtual environment is half the job. The other half is making sure anyone else — a teammate, your future self on a new laptop, a deployment server — can recreate that exact same environment instantly.

That's what requirements.txt is for. It's a plain text file that lists every package your project depends on, along with the exact version numbers. Think of it as a recipe card: instead of 'bring the ingredients', you hand someone the full recipe and they produce the identical dish.

You generate it with one command: pip freeze > requirements.txt. The pip freeze part lists all installed packages with their versions; the > requirements.txt part saves that list into a file. To recreate the environment from that file, anyone runs pip install -r requirements.txt.
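The freeze step can also be done from Python, which is handy in setup scripts. One detail worth copying even at the terminal: invoking pip as `python -m pip` through sys.executable guarantees you freeze the interpreter you are actually running, not whichever pip happens to be first on PATH. A minimal sketch:

```python
import subprocess
import sys

# Run 'pip freeze' through the CURRENT interpreter, so the resulting list
# reflects this environment and no other.
frozen = subprocess.run(
    [sys.executable, "-m", "pip", "freeze"],
    capture_output=True, text=True, check=True,
).stdout

# Write the pinned list to the conventional filename.
with open("requirements.txt", "w", encoding="utf-8") as req_file:
    req_file.write(frozen)

print(f"Pinned {len(frozen.splitlines())} packages to requirements.txt")
```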

This is not optional polish — it's professional practice. Every serious Python project has a requirements.txt (or its more advanced cousin pyproject.toml). Without it, your project only works on your machine, which is the definition of a broken workflow.

freeze_and_recreate_environment.sh · BASH
# ── SCENARIO: You've built your weather app. Time to share it. ────────────────

# Make sure the virtual environment is active first:
source .venv/bin/activate

# Install the packages the app actually needs:
pip install requests==2.31.0
pip install python-dotenv==1.0.0

# ── STEP 1: Freeze the environment into a requirements file ───────────────────
# 'pip freeze' lists EVERY installed package with its exact version
# '>' redirects that output into a file called requirements.txt
pip freeze > requirements.txt

# Let's see what that file looks like:
cat requirements.txt

# ── STEP 2: Simulate what a teammate would do on their machine ────────────────
# They clone your repo, then:
mkdir colleagues_machine_simulation
cd colleagues_machine_simulation
python -m venv .venv
source .venv/bin/activate

# One command installs EVERYTHING, at EXACTLY the same versions:
# '-r' means 'read from this file'
pip install -r ../requirements.txt

# ── STEP 3: Verify they got the same setup ────────────────────────────────────
pip list
▶ Output
# Output of 'cat requirements.txt':
certifi==2023.7.22
charset-normalizer==3.3.0
idna==3.4
python-dotenv==1.0.0
requests==2.31.0
urllib3==2.0.7

# Output of 'pip list' on colleague's machine:
Package Version
------------------ ---------
certifi 2023.7.22
charset-normalizer 3.3.0
idna 3.4
pip 23.3.1
python-dotenv 1.0.0
requests 2.31.0
urllib3 2.0.7
⚠️
Pro Tip: Always add .venv to .gitignore
Never commit the `.venv` folder to Git. It's large, machine-specific, and completely reproducible from `requirements.txt`. Add `.venv/` to your `.gitignore` file. Commit the `requirements.txt` file. Your repo stays lean and your teammates get everything they need.
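Once requirements.txt is committed, you can also check programmatically that the active environment matches it, using the standard-library importlib.metadata. This is a sketch under stated assumptions: the parse_requirements and check_environment helpers and the sample pins are illustrative names of our own, and the parser only handles the simple name==version lines that pip freeze emits.

```python
from importlib.metadata import PackageNotFoundError, version

def parse_requirements(text: str) -> dict:
    """Parse 'name==version' lines (the format pip freeze emits) into a dict."""
    pins = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "==" not in line:
            continue  # skip blanks, comments, and anything not pinned with ==
        name, _, pinned_version = line.partition("==")
        pins[name.strip()] = pinned_version.strip()
    return pins

def check_environment(pins: dict) -> list:
    """Return human-readable problems; an empty list means the env matches."""
    problems = []
    for name, expected in pins.items():
        try:
            installed = version(name)  # queries the active environment
        except PackageNotFoundError:
            problems.append(f"{name} is not installed")
            continue
        if installed != expected:
            problems.append(f"{name}: expected {expected}, found {installed}")
    return problems

sample = "requests==2.31.0\npython-dotenv==1.0.0\n"
for problem in check_environment(parse_requirements(sample)):
    print("PROBLEM:", problem)
```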

A Real-World Mini Project Tying It All Together

Theory is fine, but let's build something real inside a virtual environment so the whole workflow clicks. We'll write a small script that fetches the current weather for a city using the requests library — a package we install inside a virtual environment, not globally.

This example shows the complete developer workflow you'll repeat on every Python project you ever build: create environment → activate → install dependencies → write code → freeze requirements → deactivate. Once this muscle memory kicks in, you'll do it automatically.

Pay attention to the Python script itself too. It runs perfectly inside the virtual environment because requests is installed there. If you deactivate the environment and try to run the same script with the global Python, it would fail with ModuleNotFoundError: No module named 'requests' — unless you also happen to have it globally. That's virtual environment isolation working exactly as intended.

fetch_weather.py · PYTHON
# fetch_weather.py
# Run this INSIDE an active virtual environment where 'requests' is installed.
# Setup commands (run in terminal first):
#   python -m venv .venv
#   source .venv/bin/activate          (Mac/Linux)
#   pip install requests==2.31.0
#   pip freeze > requirements.txt
#   python fetch_weather.py

import requests  # Only available because we installed it in our virtual environment
import sys

# Open-Meteo is a free weather API — no API key needed, perfect for learning
WEATHER_API_BASE_URL = "https://api.open-meteo.com/v1/forecast"

# Coordinates for London, UK — change these to your city
LATITUDE = 51.5074
LONGITUDE = -0.1278
CITY_NAME = "London"

def fetch_current_temperature(latitude: float, longitude: float) -> dict:
    """
    Calls the Open-Meteo API and returns the current temperature data.
    Returns a dict with 'temperature' and 'unit' keys, or raises on failure.
    """
    # Build the query parameters the API expects
    query_params = {
        "latitude": latitude,
        "longitude": longitude,
        "current_weather": True,   # Ask for current conditions, not a forecast
        "temperature_unit": "celsius"
    }

    # requests.get() makes an HTTP GET request — this is why we installed 'requests'
    api_response = requests.get(WEATHER_API_BASE_URL, params=query_params, timeout=10)

    # Raise an exception immediately if the server returned an error status (4xx, 5xx)
    api_response.raise_for_status()

    # .json() parses the response body from a raw string into a Python dictionary
    weather_data = api_response.json()

    current_conditions = weather_data["current_weather"]
    return {
        "temperature": current_conditions["temperature"],
        "unit": "°C",
        "wind_speed_kmh": current_conditions["windspeed"]
    }


def display_weather_report(city: str, weather: dict) -> None:
    """Prints a clean, readable weather summary to the terminal."""
    print("\n" + "=" * 40)
    print(f"  Current Weather — {city}")
    print("=" * 40)
    print(f"  Temperature : {weather['temperature']}{weather['unit']}")
    print(f"  Wind Speed  : {weather['wind_speed_kmh']} km/h")
    print("=" * 40 + "\n")


if __name__ == "__main__":
    print(f"Fetching weather for {CITY_NAME}...")

    try:
        current_weather = fetch_current_temperature(LATITUDE, LONGITUDE)
        display_weather_report(CITY_NAME, current_weather)
    except requests.exceptions.ConnectionError:
        print("ERROR: No internet connection. Check your network and try again.")
        sys.exit(1)
    except requests.exceptions.HTTPError as http_err:
        print(f"ERROR: The weather API returned an error: {http_err}")
        sys.exit(1)
▶ Output
Fetching weather for London...

========================================
Current Weather — London
========================================
  Temperature : 14.2°C
  Wind Speed  : 18.5 km/h
========================================
🔥
Interview Gold: What happens if you skip the virtual environment?
If you install `requests` globally and share only your `.py` file, a colleague cloning your repo gets `ModuleNotFoundError` immediately. The professional fix is always: virtual environment + `requirements.txt` committed to the repo. Interviewers love this answer because it shows you think about reproducibility, not just making code work on your own machine.
| Aspect | No Virtual Environment (Global) | With Virtual Environment |
| --- | --- | --- |
| Package storage location | One shared system folder for all projects | Isolated folder per project — no sharing |
| Risk of version conflicts | High — upgrading for one project breaks another | Zero — each project has its own versions |
| Replicating setup on another machine | Manual, error-prone, undocumented | One command: pip install -r requirements.txt |
| Cleaning up after project ends | Hunt down and uninstall packages manually | Delete the .venv folder — done completely |
| Working on two projects simultaneously | Only one version of any package at a time | Different versions side-by-side, no conflict |
| CI/CD and deployment pipelines | Fragile — depends on server's global state | Reliable — environment is fully specified |
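The ModuleNotFoundError failure mode discussed above can at least fail politely. Below is a sketch of a defensive import helper — the require name and the hint text are our own, not anything Python prints by default:

```python
import importlib
import sys

def require(module_name: str):
    """Import a module, or exit with a hint pointing at the likely venv fix."""
    try:
        return importlib.import_module(module_name)
    except ModuleNotFoundError:
        # A bare traceback confuses beginners; explain the probable cause.
        print(f"This script needs the '{module_name}' package, which wasn't found.\n"
              "Did you forget to activate the virtual environment?\n"
              "  source .venv/bin/activate\n"
              "  pip install -r requirements.txt", file=sys.stderr)
        sys.exit(1)

json_module = require("json")  # stdlib module, always importable — demo only
print(f"Loaded: {json_module.__name__}")
```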

🎯 Key Takeaways

  • A virtual environment is an isolated folder containing its own Python binary and packages — activating it redirects ALL pip and python commands to use that folder instead of the global system.
  • python -m venv .venv creates the environment; source .venv/bin/activate (Mac/Linux) or .venv\Scripts\activate (Windows) turns it on — your prompt shows (.venv) to confirm it's active.
  • pip freeze > requirements.txt captures the exact state of your environment — anyone can recreate it identically with pip install -r requirements.txt, making your project reproducible across machines.
  • Never commit the .venv folder to Git — it's machine-specific and can be hundreds of MB; always commit requirements.txt instead and regenerate the environment from it.

⚠ Common Mistakes to Avoid

  • Mistake 1: Forgetting to activate the environment before installing packages — Symptom: you run pip install requests and it installs globally instead of into your project. You only notice when a teammate clones your repo and gets ModuleNotFoundError despite running pip install. Fix: always check your terminal prompt for the (.venv) prefix before running any pip command. If it's missing, run source .venv/bin/activate (Mac/Linux) or .venv\Scripts\activate (Windows) first.
  • Mistake 2: Committing the .venv folder to Git — Symptom: your repository balloons to hundreds of megabytes, cloning takes ages, and teammates on different operating systems get broken environments because the compiled binaries inside .venv are platform-specific. Fix: create a .gitignore file in your project root and add .venv/ to it immediately after creating the environment. Only ever commit requirements.txt, never the folder itself.
  • Mistake 3: Running pip freeze > requirements.txt with test packages installed — Symptom: your requirements.txt includes development tools like pytest or black that your production server doesn't need, causing slower deployments and potential security bloat. Fix: maintain two files — requirements.txt for production dependencies and requirements-dev.txt for development tools. Generate them separately, install dev requirements with pip install -r requirements-dev.txt on your local machine only.

Interview Questions on This Topic

  • Q: Why would you use a virtual environment instead of just installing packages globally? Can you describe a real scenario where not using one caused a problem?
  • Q: If a colleague clones your Python project and gets a ModuleNotFoundError when they run it, what's the most likely cause and what's the correct fix?
  • Q: What's the difference between `pip freeze` and `pip list`, and when would you use each one?

Frequently Asked Questions

Do I need to create a new virtual environment for every Python project?

Yes, and that's the whole point. One environment per project guarantees that each project's dependencies stay isolated. It takes about 10 seconds to create one, and it saves hours of debugging mysterious version conflicts down the road. Think of it as a professional habit, not extra work.

What's the difference between venv, virtualenv, and conda?

venv is built into Python 3.3+ and covers 90% of use cases with zero installation required — use this unless you have a specific reason not to. virtualenv is a third-party package that predates venv and offers a few extra features like support for older Python versions. conda is a completely different tool from Anaconda that manages both Python packages AND non-Python system dependencies, making it popular in data science but overkill for most web or scripting projects.

If I delete my virtual environment folder by accident, do I lose my code?

No — your code lives in your project folder, not inside .venv. The virtual environment folder only contains Python itself and installed libraries. As long as you have your requirements.txt committed to Git, you can recreate the entire environment in seconds with python -m venv .venv followed by pip install -r requirements.txt. This is exactly why committing requirements.txt to Git is non-negotiable.

TheCodeForge Editorial Team Verified Author

Written and reviewed by senior developers with real-world experience across enterprise, startup and open-source projects. Every article on TheCodeForge is written to be clear, accurate and genuinely useful — not just SEO filler.
