Whew this is a lot.
The user-facing changes are:
- `uv toolchain` to `uv python`, e.g., `uv python find`, `uv python
install`, ...
- `UV_TOOLCHAIN_DIR` to `UV_PYTHON_INSTALL_DIR`
- `<UV_STATE_DIR>/toolchains` to `<UV_STATE_DIR>/python` (with
[automatic
migration](https://github.com/astral-sh/uv/pull/4735/files#r1663029330))
- User-facing messages no longer refer to toolchains, instead using
"Python", "Python versions" or "Python installations"
The internal changes are:
- `uv-toolchain` crate to `uv-python`
- `Toolchain` no longer referenced in type names
- Dropped unused `SystemPython` type (previously replaced)
- Clarified the type names for "managed Python installations"
- (more little things)
https://github.com/astral-sh/uv/pull/2419 appears to have only applied
this retry to wheels that were already downloaded (though I would have
to look more carefully to be certain). In
https://github.com/astral-sh/uv/issues/1491, we've gotten continued
reports of spurious failures on Windows and tracing reveals that we are
not applying our retry logic during the rename. I believe we're in this
code path — switching to our backoff retry should resolve the failures.
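For illustration, a rename wrapped in a simple backoff loop might look like the sketch below; the attempt count, delays, and error filtering are invented for the example and are not uv's actual retry implementation.

```rust
use std::{io, path::Path, time::Duration};

/// Retry a rename a few times with exponential backoff, since Windows can
/// transiently lock files (e.g. antivirus scanners) and fail the first attempt.
async fn rename_with_retry(from: &Path, to: &Path) -> io::Result<()> {
    let mut delay = Duration::from_millis(10);
    for attempt in 0..5 {
        match tokio::fs::rename(from, to).await {
            Ok(()) => return Ok(()),
            // Only retry errors that look transient; give up immediately otherwise.
            Err(err) if attempt < 4 && err.kind() == io::ErrorKind::PermissionDenied => {
                tokio::time::sleep(delay).await;
                delay *= 2;
            }
            Err(err) => return Err(err),
        }
    }
    unreachable!("the loop always returns Ok or Err by the final attempt")
}
```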
## Summary
We may choose to persist these eventually, but for now, it's useful to
have them colocated with the cache, and in their own dedicated bucket
(so, at the very least, we can keep track of the use-cases).
Closes https://github.com/astral-sh/uv/issues/4219.
With the change, we remove the special casing of workspace dependencies
and resolve `tool.uv` for all git and directory distributions. This
gives us support for non-editable workspace dependencies and path
dependencies in other workspaces. It removes a lot of special casing
around workspaces. These changes are the groundwork for supporting
`tool.uv` with dynamic metadata.
The basis for this change is moving `Requirement` from
`distribution-types` to `pypi-types` and the lowering logic from
`uv-requirements` to `uv-distribution`. These changes should be split out
into separate PRs.
I've included an example workspace `albatross-root-workspace2` where
`bird-feeder` depends on `a` from another workspace `ab`. There's a
bunch of failing tests and regressed error messages that still need
fixing. It does fix the audited package count for the workspace tests.
When parsing requirements from any source, directly parse the url parts
(and reject unsupported urls) instead of parsing url parts at a later
stage. This removes a bunch of error branches and concludes the work of
parsing URL parts once and passing them around everywhere.
Many usages of the assembled `VerbatimUrl` remain, but these can be
removed incrementally.
Please review commit-by-commit.
## Summary
pip passes these as positional arguments, and at least one build backend
relies on that. My personal opinion is that it's a spec violation, and
the build backend should be updated, but I'd prefer to favor
compatibility over strictness here.
Closes https://github.com/astral-sh/uv/issues/3509.
## Test Plan
`cargo run pip install cryptacular==1.6.2`
## Summary
This PR consolidates the concurrency limits used throughout `uv` and
exposes two limits, `UV_CONCURRENT_DOWNLOADS` and
`UV_CONCURRENT_BUILDS`, as environment variables.
Currently, `uv` has a number of concurrent streams that it buffers using
relatively arbitrary limits for backpressure. However, many of these
limits are conflated. We run a relatively small number of tasks overall
and should start most things as soon as possible. What we really want to
limit are three separate operations:
- File I/O. This is managed by tokio's blocking pool and we should not
really have to worry about it.
- Network I/O.
- Python build processes.
Because the current limits span a broad range of tasks, it's possible
that a limit meant for network I/O is occupied by tasks performing
builds, reading from the file system, or even waiting on a `OnceMap`. We
also don't limit build processes that end up being required to perform a
download. While this may not pose a performance problem because our
limits are relatively high, it does mean that the limits do not do what
we want, making it tricky to expose them to users
(https://github.com/astral-sh/uv/issues/1205,
https://github.com/astral-sh/uv/issues/3311).
After this change, the limits on network I/O and build processes are
centralized and managed by semaphores. All other tasks are unbuffered
(note that these tasks are still bounded, so backpressure should not be
a problem).
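As a rough sketch of the centralized approach (the type names and default limits here are illustrative, not uv's actual values):

```rust
use std::sync::Arc;
use tokio::sync::Semaphore;

/// Central limits shared by all tasks: network I/O and builds are gated,
/// everything else runs unbuffered.
#[derive(Clone)]
struct ConcurrencyLimits {
    downloads: Arc<Semaphore>,
    builds: Arc<Semaphore>,
}

impl ConcurrencyLimits {
    fn from_env() -> Self {
        let parse = |var: &str, default: usize| {
            std::env::var(var)
                .ok()
                .and_then(|value| value.parse().ok())
                .unwrap_or(default)
        };
        Self {
            downloads: Arc::new(Semaphore::new(parse("UV_CONCURRENT_DOWNLOADS", 50))),
            builds: Arc::new(Semaphore::new(parse("UV_CONCURRENT_BUILDS", 16))),
        }
    }
}

async fn download(limits: &ConcurrencyLimits /* , ... */) {
    // Holding the permit for the duration of the request applies backpressure
    // to network I/O only, independent of builds and file I/O.
    let _permit = limits.downloads.acquire().await.expect("semaphore not closed");
    // ... perform the request ...
}
```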
## Summary
All of the resolver code is run on the main thread, so a lot of the
`Send` bounds and uses of `DashMap` and `Arc` are unnecessary. We could
also switch to using single-threaded versions of `Mutex` and `Notify` in
some places, but there isn't really a crate providing those that I would
be comfortable using.
The `Arc` in `OnceMap` can't easily be removed because of the uv-auth
code which uses the
[reqwest-middleware](https://docs.rs/reqwest-middleware/latest/reqwest_middleware/trait.Middleware.html)
crate, which seems to add unnecessary `Send` bounds because of
`async-trait`. We could duplicate the code and create a `OnceMapLocal`
variant, but I don't feel that's worth it.
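For illustration only, this is the kind of substitution that a main-thread-only resolver permits in principle, using a current-thread runtime and `spawn_local`; the types here are placeholders, not uv's.

```rust
use std::{cell::RefCell, collections::HashMap, rc::Rc};

// With everything on the main thread, shared state no longer needs `Send`:
// `Arc<DashMap<K, V>>` can become `Rc<RefCell<HashMap<K, V>>>`, and futures
// can run on a `tokio::task::LocalSet` instead of the multi-threaded pool.
type SharedIndex<K, V> = Rc<RefCell<HashMap<K, V>>>;

fn main() {
    let runtime = tokio::runtime::Builder::new_current_thread()
        .enable_all()
        .build()
        .unwrap();
    let local = tokio::task::LocalSet::new();
    let index: SharedIndex<String, u32> = Rc::new(RefCell::new(HashMap::new()));

    local.block_on(&runtime, async {
        let index = Rc::clone(&index);
        // `spawn_local` accepts non-`Send` futures, so `Rc` is fine here.
        tokio::task::spawn_local(async move {
            index.borrow_mut().insert("anyio".to_string(), 4);
        })
        .await
        .unwrap();
    });

    assert_eq!(index.borrow().get("anyio"), Some(&4));
}
```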
## Introduction
PEP 621 is limited. Specifically, it lacks
* Relative path support
* Editable support
* Workspace support
* Index pinning or any sort of index specification
The semantics of URLs are a custom extension: PEP 440 does not specify
how to use git references or subdirectories; instead, pip has a custom
stringly-typed format. We need to somehow support these while still
staying compatible with PEP 621.
## `tool.uv.sources`
Drawing inspiration from cargo, poetry and rye, we add `tool.uv.sources`
and (for now, stub-only) `tool.uv.workspace`:
```toml
[project]
name = "albatross"
version = "0.1.0"
dependencies = [
    "tqdm >=4.66.2,<5",
    "torch ==2.2.2",
    "transformers[torch] >=4.39.3,<5",
    "importlib_metadata >=7.1.0,<8; python_version < '3.10'",
    "mollymawk ==0.1.0"
]

[tool.uv.sources]
tqdm = { git = "https://github.com/tqdm/tqdm", rev = "cc372d09dcd5a5eabdc6ed4cf365bdb0be004d44" }
importlib_metadata = { url = "https://github.com/python/importlib_metadata/archive/refs/tags/v7.1.0.zip" }
torch = { index = "torch-cu118" }
mollymawk = { workspace = true }

[tool.uv.workspace]
include = [
    "packages/mollymawk"
]

[tool.uv.indexes]
torch-cu118 = "https://download.pytorch.org/whl/cu118"
```
See `docs/specifying_dependencies.md` for a detailed explanation of the
format. The basic gist is that `project.dependencies` is what ends up on
pypi, while `tool.uv.sources` are your non-published additions. We do
support the full range of PEP 508; we just hide it in the docs and
prefer the exploded table for easier readability and less confusion with
actual URL parts.
This format should eventually be able to subsume requirements.txt's
current use cases. While we will continue to support the legacy `uv pip`
interface, this is a piece of uv's own top-level interface. Together
with `uv run` and a lockfile format, you should only need to write
`pyproject.toml` and do `uv run`, which generates/uses/updates your
lockfile behind the scenes, no more pip-style requirements involved. It
also lays the groundwork for implementing index pinning.
## Changes
This PR implements:
* Reading and lowering `project.dependencies`,
`project.optional-dependencies` and `tool.uv.sources` into a new
requirements format, including:
* Git dependencies
* Url dependencies
* Path dependencies, including relative and editable
* `pip install` integration
* Error reporting for invalid `tool.uv.sources`
* Json schema integration (works in pycharm, see below)
* Draft user-level docs (see `docs/specifying_dependencies.md`)
It does not implement:
* `pip compile` testing (deprioritized in favor of our own lockfile)
* Index pinning (stub definitions only)
* Development dependencies
* Workspace support (stub definitions only)
* Overrides in pyproject.toml
* Patching/replacing dependencies
One technically breaking change is that we now require a user-provided
pyproject.toml to be valid with respect to PEP 621. Included files still
fall back to PEP 517. That means a directly provided `pyproject.toml`
must be valid, while `pip install -r requirements.txt` with `-e .` as
content falls back to PEP 517 as before.
## Implementation
The `pep508` requirement is replaced by a new `UvRequirement` (name up
for bikeshedding; I'm not particularly attached to the uv prefix). The
still-existing `pep508_rs::Requirement` type uses a URL format copied
from pip's requirements.txt and doesn't appropriately capture all the
features we want/need to support. The bulk of the diff is changing the
requirement type throughout the codebase.
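To give a sense of the shape (an illustrative sketch with hypothetical names, not the definition as merged), the new requirement type carries a parsed, decomposed source rather than a bare version-or-URL string:

```rust
use std::path::PathBuf;

/// Illustrative only: names and fields are hypothetical, not uv's actual types.
struct UvRequirement {
    name: String,
    extras: Vec<String>,
    /// The PEP 508 environment marker, if any, kept as a string for the sketch.
    marker: Option<String>,
    source: RequirementSource,
}

/// The parsed, decomposed source, replacing a bare "version or URL" field.
enum RequirementSource {
    /// A registry dependency with a PEP 440 version specifier.
    Registry { specifier: String },
    /// A direct URL to a wheel or source archive.
    Url { url: String },
    /// A git dependency with an optional reference and subdirectory.
    Git { repository: String, reference: Option<String>, subdirectory: Option<PathBuf> },
    /// A local path, possibly editable.
    Path { path: PathBuf, editable: bool },
}
```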
We still use `VerbatimUrl` in many places, where we would expect a
parsed/decomposed url type, specifically:
* Reading core metadata except top level pyproject.toml files, we fail a
step later instead if the url isn't supported.
* Allowed `Urls`.
* `PackageId` with a custom `CanonicalUrl` comparison, instead of
canonicalizing urls eagerly.
* `PubGrubPackage`: We eventually convert the `VerbatimUrl` back to a
`Dist` (`Dist::from_url`), instead of remembering the url.
* Source dist types: We use verbatim url even though we know and require
that these are supported urls we can and have parsed.
I tried to improve the situation by replacing `VerbatimUrl`, but doing
so would require massive, invasive changes (see e.g.
https://github.com/astral-sh/uv/pull/3253). A main problem is the ref
`VersionOrUrl` and applying overrides, which assume the same
requirement/URL type everywhere. In its current form, this PR increases
this tech debt.
I've tried to split off PRs and commits, but the main refactoring is
still a single monolithic commit to make it compile and the tests pass.
## Demo
Adding
d1ae3b85d5/pyproject.json
as a JSON schema (v7) to PyCharm for `pyproject.toml`, you can try the
IDE support already:

[dove.webm](c293c272-c80b-459d-8c95-8c46a8d198a1)
In *some* places in our crates, `serde` (and `rkyv`) are optional
dependencies. I believe this was done for reasons of "good sense,"
that is, it follows a Rust ecosystem pattern where serde integration
tends to be an opt-in crate feature. (And similarly for `rkyv`.)
However, ultimately, `uv` itself requires `serde` and `rkyv` to
function. Since our crates are strictly internal, there are limited
consumers for our crates without `serde` (and `rkyv`) enabled. I think
one possibility is that optional `serde` (and `rkyv`) integration means
that someone can do this:
```
cargo test -p pep440_rs
```
And this will run tests _without_ `serde` or `rkyv` enabled. That in
turn could lead to faster iteration time by reducing compile times. But,
I'm not sure this is worth supporting. The iterative compilation times
of
individual crates are probably fast enough in debug mode, even with
`serde` and `rkyv` enabled. Namely, `serde` and `rkyv` themselves
shouldn't need to be re-compiled in most cases. On `main`:
```
from-scratch: `cargo test -p pep440_rs --lib` 0.685s
incremental: `cargo test -p pep440_rs --lib` 0.278s
from-scratch: `cargo test -p pep440_rs --features serde,rkyv --lib` 3.948s
incremental: `cargo test -p pep440_rs --features serde,rkyv --lib` 0.321s
```
So while a from-scratch build does take significantly longer, an
incremental build is about the same.
The benefit of doing this change is two-fold:
1. It brings our crates into alignment with "reality." In particular,
some crates were _implicitly_ relying on `serde` being enabled
without explicitly declaring it. This technically means that our
`Cargo.toml`s were wrong in some cases, but it is hard to observe it
because of feature unification in a Cargo workspace.
2. We no longer need to deal with the cognitive burden of writing
`#[cfg_attr(feature = "serde", ...)]` everywhere.
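For point 2, the per-type difference looks roughly like this (the struct names are made up for the sketch):

```rust
// Before: every serde-aware type needs the feature-gated attribute...
#[cfg_attr(feature = "serde", derive(serde::Serialize, serde::Deserialize))]
pub struct VersionSpecifierBefore {
    raw: String,
}

// ...after making `serde` a required dependency, the derive is unconditional.
#[derive(serde::Serialize, serde::Deserialize)]
pub struct VersionSpecifierAfter {
    raw: String,
}
```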
Previously, uv-auth would fail to compile due to a missing `process`
feature. I chose to make all the tokio features we use top-level
features, so we can share the tokio cache between all test invocations.
## Summary
This is mainly a cleanup PR to leverage uv-version in uv-virtualenv
instead of passing it via `uv`.
In #1852 I introduced the ability to pass extra cfg to `gourgeist` for
the primary purpose of passing the uv version, but since the
introduction of the uv-version crate, dynamically passing extra values
to pyvenv.cfg is no longer needed.
## Test Plan
Existing `uv` tests should still verify `uv = <version>` exists in the
venv and make sure no regressions were introduced.
Since we're using anstream's strip stream, we can force color output
from child processes and strip the codes when we redirect to a file.
Before:

After:

Redirecting to a file correctly strips color codes.
## Summary
This PR enables `--require-hashes` with unnamed requirements. The key
change is that `PackageId` becomes `VersionId` (since it refers to a
package at a specific version), and the new `PackageId` consists of
_either_ a package name _or_ a URL. The hashes are keyed by `PackageId`,
so we can generate the `RequiredHashes` before we have names for all
packages, and enforce them throughout.
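A rough sketch of the shape of that change (hypothetical, simplified definitions; the real types use canonicalized names, URLs, and hash digests):

```rust
use std::collections::HashMap;

/// Identifies a package: either by name, or, for unnamed requirements, by URL.
#[derive(Clone, PartialEq, Eq, Hash)]
enum PackageId {
    Name(String),
    Url(String),
}

/// Identifies a package at a specific version (what `PackageId` used to mean).
#[derive(Clone, PartialEq, Eq, Hash)]
struct VersionId {
    package: PackageId,
    version: String,
}

/// Required hashes can now be keyed before all packages have names.
type RequiredHashes = HashMap<PackageId, Vec<String>>;
```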
Closes #2979.
To get more insights into test performance, allow instrumenting tests
with tracing-durations-export.
Usage:
```shell
# A single test
TRACING_DURATIONS_TEST_ROOT=$(pwd)/target/test-traces cargo test --features tracing-durations-export --test pip_install_scenarios no_binary -- --exact
# All tests
TRACING_DURATIONS_TEST_ROOT=$(pwd)/target/test-traces cargo nextest run --features tracing-durations-export
```
Then we can e.g. look at
`target/test-traces/pip_install_scenarios::no_binary.svg` and see the
builds it performs:

Needed to prevent circular dependencies in my toolchain work (#2931). I
think this is probably a reasonable change as we move towards persistent
configuration too?
Unfortunately `BuildIsolation` needs to be in `uv-types` to avoid
circular dependencies still. We might be able to resolve that in the
future.
## Summary
Now that we're resolving metadata more aggressively for local sources,
it's worth doing this. We now pull metadata from the `pyproject.toml`
directly if it's statically-defined.
Closes https://github.com/astral-sh/uv/issues/2629.
This is driving me a little crazy and is becoming a larger problem in
#2596 where I need to move more types (like `Upgrade` and `Reinstall`)
into this crate. Anything that's shared across our core resolver,
install, and build crates needs to be defined in this crate to avoid
cyclic dependencies. We've outgrown it being a single file with some
shared traits.
There are no behavioral changes here.
If you pass a `pyproject.toml` that uses Hatch's context formatting API,
we currently fail because the dependencies aren't valid under PEP 508.
This PR makes the static metadata parsing a little more relaxed, so that
we appropriately fall back to PEP 517 there.
## Summary
Hatch allows for highly dynamic customization of metadata via hooks. In
such cases, Hatch can't uphold the PEP 517 contract, in that the
metadata Hatch would return from `prepare_metadata_for_build_wheel`
isn't guaranteed to match that of the built wheel.
Hatch disables `prepare_metadata_for_build_wheel` entirely for pip.
We'll instead disable it on our end when metadata is defined as
"dynamic" in the pyproject.toml, which should allow us to leverage the
hook in _most_ cases while still avoiding incorrect metadata for the
remaining cases.
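Conceptually, the check is small; a hedged sketch (not the exact code) of the decision:

```rust
/// Illustrative: decide whether `prepare_metadata_for_build_wheel` is trustworthy.
/// `dynamic` is the `project.dynamic` array from the source tree's pyproject.toml.
fn use_prepare_metadata_hook(dynamic: Option<&[String]>) -> bool {
    match dynamic {
        // No `dynamic` key: metadata is static, so the hook is fine.
        None => true,
        // Any dynamic fields mean the hook may return metadata that differs
        // from the built wheel, so fall back to building the wheel for metadata.
        Some(fields) => fields.is_empty(),
    }
}
```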
Closes: https://github.com/astral-sh/uv/issues/2130.
Scott Schafer gave me the idea: we can avoid repeating the path for
workspace dependencies everywhere if we declare them in the virtual
package once and treat them as workspace dependencies from there on.
## Summary
If you have a file `typing.py` in the current working directory, `python
-m` doesn't work in some Python versions:
```sh
❯ python -m foo
Could not import runpy module
Traceback (most recent call last):
File "/Users/crmarsh/.local/share/rtx/installs/python/3.9.18/lib/python3.9/runpy.py", line 15, in <module>
import importlib.util
File "/Users/crmarsh/.local/share/rtx/installs/python/3.9.18/lib/python3.9/importlib/util.py", line 2, in <module>
from . import abc
File "/Users/crmarsh/.local/share/rtx/installs/python/3.9.18/lib/python3.9/importlib/abc.py", line 17, in <module>
from typing import Protocol, runtime_checkable
ImportError: cannot import name 'Protocol' from 'typing' (/Users/crmarsh/workspace/uv/typing.py)
```
This did _not_ cause problems for us on Python 3.11 or later, because we
set `PYTHONSAFEPATH`, which avoids adding the current working directory
to `sys.path`. However, on earlier versions, we _were_ failing with the
above. (It's important that we run interpreter discovery in the current
working directory, since doing otherwise breaks pyenv shims.)
The fix implemented here uses `-I` to run Python in isolated mode, which
is even stricter. The downside of isolated mode is that we currently
rely on setting `PYTHONPATH` to find the "fake module" that we create on
disk, and `-I` means `PYTHONPATH` is totally ignored. So, instead, we
run a script directly, and that _script_ injects the path we care about
into `PYTHONSAFEPATH`.
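A simplified sketch of the invocation side (the function and arguments are placeholders, not the actual discovery code):

```rust
use std::path::Path;
use std::process::Command;

/// Illustrative: run an interpreter-query script in isolated mode (`-I`) so that
/// neither the working directory nor `PYTHONPATH` can shadow stdlib modules.
/// The script passed here is then responsible for injecting the path it needs
/// itself before importing anything from it.
fn run_query_script(python: &Path, script: &Path) -> std::io::Result<std::process::Output> {
    Command::new(python)
        .arg("-I") // isolated mode: implies -E and -s, ignores PYTHONPATH
        .arg(script)
        .output()
}
```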
Closes https://github.com/astral-sh/uv/issues/2547.
## Summary
I tried out `cargo shear` to see if there are any unused dependencies
that `cargo udeps` isn't reporting. It turned out, there are a few. This
PR removes those dependencies.
## Test Plan
`cargo build`
## Summary
If a package uses Hatch's `root.uri` feature, we currently error:
```toml
dependencies = [
"black @ {root:uri}/../black_editable"
]
```
Even though we're using PEP 517 hooks to get the metadata, which
_should_ support this. The problem is that we load the full
`PyProjectToml`, which means we parse the requirements, which means we
reject what looks like a relative URL in dependencies.
Instead, we should only enforce a limited subset of `pyproject.toml`
(arguably none).
Closes https://github.com/astral-sh/uv/issues/2475.
The architecture of uv does not necessarily match that of the python
interpreter (#2326). In cross compiling/testing scenarios the operating
system can also mismatch. To solve this, we move arch and os detection
to python, vendoring the relevant pypa/packaging code, preventing
mismatches between what the python interpreter was compiled for and what
uv was compiled for.
To make the scripts more manageable, they are now a directory in a
tempdir, and we run them with `python -m`. I've simplified the
pypa/packaging code since we're still building the tags in Rust. A
`Platform` is now instantiated by querying the Python interpreter for
its platform. The pypa/packaging files are copied verbatim for easier
updates, except for an `lru_cache()` Python 3.7 backport.
Error handling is done by a `"result": "success|error"` field that
allows passing error details to Rust:
```console
$ uv venv --no-cache
× Can't use Python at `/home/konsti/projects/uv/.venv/bin/python3`
╰─▶ Unknown operation system `linux`
```
I've used the [maturin sysconfig
collection](855f6d2cb1/sysconfig)
as reference. I'm unsure how to test these changes across the wide
variety of platforms.
Fixes #2326
## Summary
In #1813, we were failing to install `scikit-image==0.19.3` from source
in Python 3.11. Confusingly, though, the trace showed that the build
command exited with status 0...
The issue is that we get results from the PEP 517 hooks by reading from
`stdout` -- that is, we `print` at the end of the script, and parse the
printed output on the other side.
It turns out that for `scikit-image`, in this case, there was output
_after_ the wheel filename:
```
...
no previously-included directories found matching 'doc/gh-pages'
adding license file 'LICENSE.txt'
writing manifest file 'scikit_image.egg-info/SOURCES.txt'
Copying scikit_image.egg-info to build/bdist.macosx-12.6-arm64/wheel/scikit_image-0.19.3-py3.11.egg-info
running install_scripts
scikit_image-0.19.3-cp311-cp311-macosx_14_0_arm64.whl
INFO:
########### EXT COMPILER OPTIMIZATION ###########
INFO: Platform :
Architecture: aarch64
Compiler : clang
CPU baseline :
Requested : 'min'
Enabled : NEON NEON_FP16 NEON_VFPV4 ASIMD
Flags : none
Extra checks: none
CPU dispatch :
Requested : 'max -xop -fma4'
Enabled : ASIMDHP ASIMDDP ASIMDFHM
Generated : none
INFO: CCompilerOpt.cache_flush[864] : write cache to path -> /private/var/folders/nt/6gf2v7_s3k13zq_t3944rwz40000gn/T/.tmp5ZPIbv/built-wheels-v0/pypi/scikit-image/0.19.3/hLW_f7wWeGDOPRlSazQXw/scikit-image-0.19.3.tar.gz/build/temp.macosx-12.6-arm64-3.11/ccompiler_opt_cache_ext.py
```
We need the `scikit_image-0.19.3-cp311-cp311-macosx_14_0_arm64.whl`
line, but we were failing to find it due to all the extra output at the
end (presumably, some kind of `atexit` logging).
This PR modifies the hooks to instead write their results to files that
are passed in by the parent. On the other end, we then read the results
back from disk. This makes it much more robust to "other" output in the
script.
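In outline, the parent passes a destination path to the hook script and reads the result back from disk; a hedged sketch (the actual hook protocol, arguments, and file layout differ):

```rust
use std::path::Path;

/// Illustrative: run a PEP 517 hook script that writes its result (e.g. the
/// wheel filename) to `output_file`, then read the file back. Anything the
/// build backend prints to stdout is logged but never parsed.
fn run_hook(python: &Path, hook_script: &Path, output_file: &Path) -> std::io::Result<String> {
    let output = std::process::Command::new(python)
        .arg(hook_script)
        .arg(output_file)
        .output()?;
    if !output.status.success() {
        return Err(std::io::Error::new(
            std::io::ErrorKind::Other,
            String::from_utf8_lossy(&output.stderr).into_owned(),
        ));
    }
    Ok(std::fs::read_to_string(output_file)?.trim().to_string())
}
```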
Closes https://github.com/astral-sh/uv/issues/1813.
## Test Plan
Ran `cargo run pip install scikit-image==0.19.3 --reinstall
--no-cache-dir` on Python 3.11.
## Summary
If a user provides a source distribution via a direct path, it can
either be an archive (like a `.tar.gz` or `.zip` file) or a directory.
If the former, we need to extract (e.g., unzip) the contents at some
point. Previously, this extraction was in `uv-build`; this PR lifts it
up to the distribution database.
The first benefit here is that various methods that take the
distribution are now simpler, as they can assume a directory.
The second benefit is that we no longer extract _multiple times_ when
working with a source distribution. (Previously, if we tried to get the
metadata, then fell back and built the wheel, we'd extract the source
archive _twice_.)
## Summary
This PR adds support for pip's `--no-build-isolation`. When enabled,
build requirements won't be installed during PEP 517-style builds, but
the source environment _will_ be used when executing the build steps
themselves.
Closes https://github.com/astral-sh/uv/issues/1715.
`uv pip install mysqlclient==2.1.1` on Python 3.12 on Windows, where
there are no binary wheels:

Part of #2052.
## Summary
Adds support for `--system-site-packages`. Unlike `pip`, we won't take
the system site packages into account in subsequent commands. I think
this is ok.
Closes https://github.com/astral-sh/uv/issues/1483.
## Summary
With this PR, I've added the option to pass environment variables to the
wheel building process, through the `BuildDispatch`. When integrating uv
with our project pixi (https://github.com/prefix-dev/pixi/pull/863), we
ran into this missing requirement. I've made a rough version here; it
could maybe use some refinement.
### Why do we need this?
Because pixi allows the user to use a conda-activated prefix for wheel
building, this comes with a number of environment variables, like `PATH`
but also `CONDA_PREFIX`, amongst others. This allows the user to use
system dependencies from conda-forge during an sdist build. Because we
use `uv` as a library, we need to pass in the options programmatically.
Additionally, in general there is nothing holding a Python sdist back
from actually depending on an environment variable; see, e.g., the test
package: https://pypi.org/project/env-test-package/
### What about `ConfigSettings`
I think `ConfigSettings` does not suffice because e.g. CMake could
function differently when the `CONDA_PREFIX` is set. Also, we do not
know if the user-supplied backend actually supports these settings.
### Path handling
Because the user can now also supply a `PATH` in the environment map, I
format the path so that it has the following precedence (sketched
below):
1. venv scripts dir.
2. user-supplied path.
3. system path.
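A minimal sketch of that precedence using only the standard library (names are placeholders):

```rust
use std::env;
use std::ffi::OsString;
use std::path::PathBuf;

/// Illustrative: build the child-process PATH as
/// venv scripts dir, then user-supplied entries, then the ambient system PATH.
fn build_path(venv_scripts: PathBuf, user_path: Option<OsString>) -> OsString {
    let mut entries = vec![venv_scripts];
    if let Some(user_path) = user_path {
        entries.extend(env::split_paths(&user_path));
    }
    if let Some(system_path) = env::var_os("PATH") {
        entries.extend(env::split_paths(&system_path));
    }
    env::join_paths(entries).expect("paths should not contain the separator")
}
```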
### Improvements
There is some path modification and copying happening every time we use
the `run_python_script` function. I think we could improve this, but I
would like some pointers on where best to put a split-up and cached
version; we might also want to use some types to split these things up.
### Finally
I did not add any of these options to the uv executables; I first would
like to know if this is a direction we would want to go in. I'm happy to
do this or make any changes that you feel would benefit this project.
Also tagging @wolfv to keep track of this as well.
## Test Plan
---------
Co-authored-by: konsti <konstin@mailbox.org>
## Summary
`PythonPlatform` only exists to format paths to directories within
virtual environments based on a root and an OS, so it's now
`VirtualenvLayout`.
`Virtualenv` is now used for non-virtual environment Pythons, so it's
now `PythonEnvironment`.
## Summary
This PR adds a `--python` flag that allows users to provide a specific
Python interpreter into which `uv` should install packages. This would
replace the `VIRTUAL_ENV=` workaround that folks have been using to
install into arbitrary, system environments, while _also_ actually being
correct for installing into non-virtual environments, where the bin and
site-packages paths can differ.
The approach taken here is to use `sysconfig.get_paths()` to get the
correct paths from the interpreter, and then use those for determining
the `bin` and `site-packages` directories, rather than constructing them
based on hard-coded expectations for each platform.
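A hedged sketch of that query, simplified to a one-liner script and string paths:

```rust
use std::collections::HashMap;
use std::path::Path;

/// Illustrative: ask the target interpreter for its install paths and parse the
/// JSON, rather than guessing `bin`/`site-packages` per platform.
fn query_paths(python: &Path) -> Result<HashMap<String, String>, Box<dyn std::error::Error>> {
    let output = std::process::Command::new(python)
        .arg("-c")
        .arg("import json, sysconfig; print(json.dumps(sysconfig.get_paths()))")
        .output()?;
    // A real implementation would also check `output.status` and report stderr.
    Ok(serde_json::from_slice(&output.stdout)?)
}
```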
Closes https://github.com/astral-sh/uv/issues/1396.
Closes https://github.com/astral-sh/uv/issues/1779.
Closes https://github.com/astral-sh/uv/issues/1988.
## Test Plan
- Verified that, on my Windows machine, I was able to install `requests`
into a global environment with: `cargo run pip install requests --python
'C:\\Users\\crmarsh\\AppData\\Local\\Programs\\Python\\Python3.12\\python.exe'`,
then `python` and `import requests`.
- Verified that, on macOS, I was able to install `requests` into a
global environment installed via Homebrew with: `cargo run pip install
requests --python $(which python3.8)`.
## Summary
When you invoke `python -c`, an empty string is prepended to `sys.path`,
which allows loading modules in the current directory
(https://docs.python.org/3/using/cmdline.html#cmdoption-P). However, in
PEP 517 builds, the current directory should _not_ be part of the path.
There's a flag we can use to disable this behavior (`-P`), but it's only
available in Python 3.11 and later, so instead, I'm doing something
similar to pip's `__main__.py`, which avoids this for `python -m pip`
invocations.
Closes https://github.com/astral-sh/uv/issues/1972.