# Environment
When you run code locally, you manage your own Python environment. On C3, you declare your dependencies in your `.c3` config and C3 builds the environment on the GPU automatically.
## Python projects
Point `python.project` at a directory containing `pyproject.toml` and `uv.lock`:

```yaml
python:
  project: ./
```
C3 uses uv for fast, reproducible installs. When your job starts, C3 runs `uv sync` on the GPU to create the virtual environment before executing your script.
### Generate a lock file
If you don't have a `uv.lock` yet:

```shell
uv lock
```
This creates a lock file that pins exact dependency versions, ensuring your job gets the same environment every time.
### Subdirectory projects
If your Python project is in a subdirectory:

```yaml
python:
  project: ./my_project
```

The path is relative to the `.c3` file.
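With a config like the one above, the repository might be laid out as follows (file and directory names are illustrative):

```
.
├── .c3
└── my_project/
    ├── pyproject.toml
    ├── uv.lock
    └── train.py
```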
## Environment caching
C3 caches built environments based on the hash of your `uv.lock` file. If your dependencies haven't changed since the last job, the cached environment is reused instantly with no rebuild. This means:

- First job: full `uv sync` (typically 10-30 seconds, depending on dependencies)
- Subsequent jobs with the same lock file: cached environment, near-instant setup
- After changing dependencies: run `uv lock` locally, then deploy. The first job with the new lock file rebuilds; after that it is cached again.
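C3's internal cache-key mechanism isn't spelled out here, but the idea of keying a cache on the lock file's contents can be sketched in a few lines. The function name and choice of SHA-256 below are illustrative, not C3's actual implementation:

```python
import hashlib
from pathlib import Path

def env_cache_key(lock_path: str) -> str:
    """Derive a cache key from the uv.lock contents.

    Identical lock files always map to the same key, so a previously
    built environment can be reused; any dependency change produces
    a different key and triggers a rebuild.
    """
    return hashlib.sha256(Path(lock_path).read_bytes()).hexdigest()
```

Because the key depends only on file contents, re-deploying without touching dependencies hits the cache every time.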
**Use `python.project` instead of `pip install` in your script.** Always declare dependencies via `python.project` rather than running `pip install` in your bash script. C3 caches the environment across jobs based on your `uv.lock` hash, so unchanged dependencies are reused instantly. Running `pip install` in the script would reinstall everything from scratch on every job.
## What gets installed
C3 GPU VMs come with:
- Python (via uv)
- CUDA drivers and toolkit
- Standard system libraries
Your `pyproject.toml` and `uv.lock` define everything else. This keeps environments reproducible: the same lock file always produces the same environment, regardless of which GPU or data center runs your job.
## Full example
```yaml
# .c3
project: my-experiment
script: run.sh
gpu: l40
time: "02:00:00"
python:
  project: ./
datasets:
  - ref: /datasets/imagenet
    mount: /data/imagenet
output:
  - ./results
```
```bash
# run.sh
#!/bin/bash
python3 train.py
```
```toml
# pyproject.toml
[project]
name = "my-experiment"
requires-python = ">=3.11"
dependencies = [
    "jax[cuda12]",
    "flax",
    "optax",
    "numpy",
]
```
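To make the example runnable end to end, `train.py` can start as a small placeholder. The sketch below is hypothetical and uses only `numpy` from the declared dependencies; a real script would use the JAX/Flax stack listed above:

```python
# train.py — hypothetical stand-in for the real training script;
# it uses only numpy from the dependencies declared in pyproject.toml.
import numpy as np

def train_step(w, x, y, lr=0.1):
    """One gradient-descent step on mean squared error for y ≈ w * x."""
    grad = np.mean(2 * x * (w * x - y))
    return w - lr * grad

def main():
    x = np.array([1.0, 2.0, 3.0])
    y = 2.0 * x          # target slope is 2
    w = 0.0
    for _ in range(200):
        w = train_step(w, x, y)
    print(f"learned w = {w:.3f}")  # converges to 2.000

if __name__ == "__main__":
    main()
```

Once this runs locally under `uv run python train.py`, the same lock file guarantees the identical environment on the GPU.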