<!--
Thank you for contributing to uv! To help us out with reviewing, please
consider the following:
- Does this pull request include a summary of the change? (See below.)
- Does this pull request include a descriptive title?
- Does this pull request include references to any relevant issues?
-->
## Summary
https://docs.rs/serde_json/latest/serde_json/fn.from_reader.html
suggests that

> When reading from a source against which short reads are not
> efficient, such as a
> [File](https://doc.rust-lang.org/std/fs/struct.File.html), you will want
> to apply your own buffering because serde_json will not buffer the
> input. See
> [std::io::BufReader](https://doc.rust-lang.org/std/io/struct.BufReader.html).

Without this buffering, we observe a sequence of single-byte reads, which
can be quite inefficient depending on the underlying filesystem.
This PR adds buffering with `std::io::BufReader` to resolve this.
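A minimal sketch of the pattern, using a hypothetical `read_json` helper rather than the exact uv call site:

```rust
use std::fs::File;
use std::io::BufReader;
use std::path::Path;

use serde::de::DeserializeOwned;

/// Deserialize a JSON file, buffering reads so that serde_json doesn't
/// issue one small read per token against the underlying filesystem.
fn read_json<T: DeserializeOwned>(path: &Path) -> Result<T, Box<dyn std::error::Error>> {
    let file = File::open(path)?;
    // Without this wrapper, serde_json reads directly from the `File`,
    // which shows up as a sequence of single-byte reads.
    let reader = BufReader::new(file);
    Ok(serde_json::from_reader(reader)?)
}
```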
<!-- What's the purpose of the change? What does it do, and why? -->
## Test Plan
Unit tests cover this code.
<!-- How was it tested? -->
## Summary
This PR extends #10046 to also handle architectures, which allows us to
correctly include `2.5.1` on the `cu124` index for ARM Linux.
Closes https://github.com/astral-sh/uv/issues/9655.
## Summary
This should address the comment here:
https://github.com/astral-sh/uv/pull/10179#issuecomment-2569189265. We
don't compute implied markers if the marker is already `TRUE`, and we
set it to `TRUE` as soon as we see a source distribution. So if we visit
the source distribution before the wheels, we'll avoid computing these
for any irrelevant distributions.
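Roughly, the short-circuit has the following shape; this is a hedged sketch with stand-in string markers and a hypothetical `implied_marker` helper, not the actual resolver code:

```rust
// Hedged sketch with stand-in types; uv's real code uses its marker tree type.
enum Dist {
    /// A source distribution: buildable everywhere, so it implies `TRUE`.
    SourceDist,
    /// A wheel, with a platform marker derived from its tags.
    Wheel { marker: String },
}

/// The marker implied by the set of available distributions for one version.
fn implied_marker(dists: &[Dist]) -> String {
    let mut implied = String::from("FALSE");
    for dist in dists {
        // Already `TRUE`: don't bother computing markers for the rest.
        if implied == "TRUE" {
            break;
        }
        match dist {
            // A source distribution makes the marker `TRUE` immediately, so
            // visiting it before the wheels avoids any per-wheel work.
            Dist::SourceDist => implied = String::from("TRUE"),
            Dist::Wheel { marker } => implied = format!("({implied}) or ({marker})"),
        }
    }
    implied
}
```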
## Summary
This is yet another variation on
https://github.com/astral-sh/uv/pull/9928, with a few minor changes:
1. It only applies to local versions (e.g., `2.5.1+cpu`).
2. It only _considers_ the non-local version as an alternative (e.g.,
`2.5.1`).
3. It only _considers_ the non-local alternative if it _does_ support
the unsupported platform.
4. Instead of failing, it falls back to using the local version.
So, this is far less strict, and is effectively designed to solve
PyTorch but nothing else. It's also not user-configurable, except by way
of using `environments` to exclude platforms.
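A hedged sketch of that preference order, with hypothetical types and a stand-in `supports` predicate (not uv's actual resolver code):

```rust
// Hedged sketch; `supports` stands in for "this version has a distribution
// compatible with the given platform".
struct Pin {
    version: String, // e.g. "2.5.1+cpu"
}

fn base_version(version: &str) -> &str {
    // Strip the local segment: "2.5.1+cpu" -> "2.5.1".
    version.split('+').next().unwrap_or(version)
}

fn choose_for_platform(
    pin: &Pin,
    platform: &str,
    supports: impl Fn(&str, &str) -> bool,
) -> String {
    // 1. Only local versions get this treatment.
    let is_local = pin.version.contains('+');
    if !is_local || supports(pin.version.as_str(), platform) {
        return pin.version.clone();
    }
    // 2. + 3. The only alternative considered is the non-local base version,
    // and only if it actually covers the otherwise-unsupported platform.
    let base = base_version(&pin.version);
    if supports(base, platform) {
        return base.to_string();
    }
    // 4. Otherwise, fall back to the local version instead of failing.
    pin.version.clone()
}
```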
Build failures are one of the most common user-facing failures that
aren't "obvious" errors (such as typos) or resolver errors. Currently,
they show technical details rather than focusing on this being an
error in a subprocess that is caused either by the package itself or -
more likely - by the build environment, e.g. the user needs to install a
dev package or their Python version is incompatible.
The new error message clearly delineates the part that's important (this
is a build backend problem) from the internals (we called this hook) and
is consistent about which part of the dist building stage failed. We
have to calibrate the exact wording of the error message some more. Most
of the implementation is working around the orphan rule, `thiserror`
constraints, and trait rules, so it came out as more of a refactoring
than intended.
Example:

For publishing, we want to allow all simple `[[tool.uv.index]]` entries,
whether they are explicit or not. We don't allow flat indexes here,
assuming that an index you can upload to has a simple index URL (and
generally doesn't have a flat index URL; at least, I don't know of any
index that does).
The `no_index` branch isn't used at the moment, but I left it in case
the method gains more callers.
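A hedged sketch of the filter, with hypothetical types (uv's real index model differs in the details):

```rust
// Hedged sketch; the actual index type carries more fields.
enum IndexFormat {
    /// A PEP 503 / PEP 691 "simple" index.
    Simple,
    /// A flat, "find links"-style listing.
    Flat,
}

struct Index {
    name: String,
    url: String,
    format: IndexFormat,
    /// Whether the index is only used when selected explicitly during resolution.
    explicit: bool,
}

/// Indexes that `uv publish --index <name>` can target: any simple index,
/// explicit or not. Flat indexes are skipped, on the assumption that an
/// index you can upload to exposes a simple index URL.
fn publishable(indexes: &[Index]) -> Vec<&Index> {
    indexes
        .iter()
        // `explicit` only affects resolution, so it's deliberately ignored here.
        .filter(|index| matches!(index.format, IndexFormat::Simple))
        .collect()
}
```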
Fixes #9919
## Summary
This PR addresses a significant limitation in the resolver whereby we
avoid choosing the latest versions of packages when the project's
`requires-python` covers a wider range.
For example, with NumPy, the latest versions only support Python 3.10
and later. If you lock a project with `requires-python = ">=3.8"`, we
pick the last NumPy version that supported Python 3.8, and use that for
_all_ Python versions. So you get `1.24.4` for all versions, rather than
`2.2.0`. And we'll never upgrade you unless you bump your
`requires-python`. (Even worse, those versions don't have wheels for
Python 3.12, etc., so you end up building from source.)
(As-is, this is intentional. We optimize for minimizing the number of
selected versions, and the current logic does that well!)
Instead, we now recognize when a version has an elevated
`requires-python` specifier and fork. This is a new fork point, since we
need to fork once we have the package metadata, as opposed to when we
see the dependencies.
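For the NumPy example above, the intended outcome is roughly the following pair of forks; the struct and marker strings are illustrative, not uv's internal representation:

```rust
/// Hedged illustration of the two forks for a project with
/// `requires-python = ">=3.8"` depending on NumPy.
struct Fork<'a> {
    /// The slice of the project's `requires-python` range this fork covers.
    marker: &'a str,
    /// The NumPy version selected within that fork.
    version: &'a str,
}

fn numpy_forks() -> [Fork<'static>; 2] {
    [
        // Interpreters that satisfy the newer release's requires-python get
        // the latest version...
        Fork { marker: "python_full_version >= '3.10'", version: "2.2.0" },
        // ...while the rest of the range keeps the last version that still
        // supports it, instead of pinning everyone to it.
        Fork { marker: "python_full_version < '3.10'", version: "1.24.4" },
    ]
}
```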
In this iteration, I've made this behavior the default. I'm sort of
undecided on whether I want to push on that... Previously, I'd suggested
making it opt-in via a setting
(https://github.com/astral-sh/uv/pull/8686).
Closes https://github.com/astral-sh/uv/issues/8492.
When publishing, we currently ask the user to set `--publish-url` to the
upload URL and `--check-url` to the simple index URL, or the equivalent
configuration keys. But that's redundant with the `[[tool.uv.index]]`
declaration. Instead, we extend `[[tool.uv.index]]` with a `publish-url`
entry and allow passing `uv publish --index <name>`.
`uv publish --index <name>` requires the `pyproject.toml` to be present
when publishing, unlike using `--publish-url ... --check-url ...` which
can be used e.g. in CI without a checkout step. `--index` also always
uses the check URL feature to aid upload consistency.
The documentation tries to explain both approaches together, which
overlap for the check URL feature.
Fixes #8864
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
Instead of modifying the error to replace a dummy derivation chain from
construction with the real one, build the error with the real derivation
chain directly.
This came up when trying to improve the build error reporting.
Introduces `DistErrorKind` to avoid separate error variants for cases
that differ only in one line of the message.
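A hedged sketch of the shape this takes; the variant and field names are illustrative, not the exact `DistErrorKind` definition:

```rust
use std::fmt;

// Hedged sketch; the real kinds and fields differ in detail.
#[derive(Debug)]
enum DistErrorKind {
    Download,
    Build,
    Read,
}

impl fmt::Display for DistErrorKind {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        let prefix = match self {
            DistErrorKind::Download => "Failed to download",
            DistErrorKind::Build => "Failed to build",
            DistErrorKind::Read => "Failed to read",
        };
        f.write_str(prefix)
    }
}

/// One error type parameterized by the kind, rather than one near-identical
/// variant per case that differs only in the first line of the message.
#[derive(Debug)]
struct DistError {
    kind: DistErrorKind,
    /// The affected distribution, e.g. "foo==1.0.0".
    dist: String,
}

impl fmt::Display for DistError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{} `{}`", self.kind, self.dist)
    }
}
```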
When encountering `dynamic = ["version"]` in the pyproject.toml of a
source dist, we can ignore that and treat it as a statically known
metadata distribution, since the filename tells us the version and that
version must not change on build.
This fixed locking PyGObject 3.50.0 from `pygobject-3.50.0.tar.gz`
(minimized):
```toml
[project]
name = "PyGObject"
description = "Python bindings for GObject Introspection"
requires-python = ">=3.9, <4.0"
dependencies = [
  "pycairo>=1.16"
]
dynamic = ["version"]
```
Afterwards, `uv add --no-sync toga` passes on Ubuntu 24.04 without the
pygobject build deps, whereas previously it needed `{ name = "pygobject",
version = "3.50.0", requires-dist = [], requires-python = ">=3.9" }`.
I've added a check that source distribution versions are respected after
build.
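A hedged sketch of the check, with stand-in types (not uv's actual metadata handling):

```rust
// Hedged sketch; the real metadata types are richer.
struct PyProject {
    /// `project.version`, absent when the version is dynamic.
    version: Option<String>,
    /// `project.dynamic`.
    dynamic: Vec<String>,
}

/// Whether the pyproject.toml of a source distribution can be used as static
/// metadata: dynamic fields normally disqualify it, but `version` is the
/// exception, because the source dist filename (e.g. `pygobject-3.50.0.tar.gz`)
/// already fixes the version and a build must not change it.
fn usable_as_static_metadata(pyproject: &PyProject) -> bool {
    pyproject.dynamic.iter().all(|field| field == "version")
}

/// The version to record: fall back to the filename's version when
/// `version` is dynamic.
fn static_version(filename_version: &str, pyproject: &PyProject) -> String {
    pyproject
        .version
        .clone()
        .unwrap_or_else(|| filename_version.to_string())
}
```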
Fixes #9548
## Summary
A lot of good new lints, and most importantly, error stabilizations. I
tried to find a few usages of the new stabilizations, but I'm sure there
are more.
IIUC, this _does_ require bumping our MSRV.
When building only a single crate in the workspace to run its tests, we
often recompile a lot of other, unrelated crates. Whenever cargo has a
different set of crate features, it needs to recompile. By moving some
features (non-exhaustive for now) to the workspace level, we always
activate them and avoid recompiling.
The cargo docs don't match cargo's actual behavior around
`default-features`, so I filed that upstream and left most
`default-features` mismatches in place:
https://github.com/rust-lang/cargo/issues/14841.
Reference script:
```python
import tomllib
from collections import defaultdict
from pathlib import Path

uv = Path("/home/konsti/projects/uv")
skip_list = ["uv-trampoline", "uv-dev", "uv-performance-flate2-backend", "uv-performance-memory-allocator"]

root_feature_map = defaultdict(set)
root_default_features = defaultdict(bool)
cargo_toml = tomllib.loads(uv.joinpath("Cargo.toml").read_text())
for dep, declaration in cargo_toml["workspace"]["dependencies"].items():
    root_default_features[dep] = root_default_features[dep] or declaration.get("default-features", True)
    root_feature_map[dep].update(declaration.get("features", []))

feature_map = defaultdict(set)
default_features = defaultdict(bool)
for crate in uv.joinpath("crates").iterdir():
    if crate.name in skip_list:
        continue
    if not crate.joinpath("Cargo.toml").is_file():
        continue
    cargo_toml = tomllib.loads(crate.joinpath("Cargo.toml").read_text())
    for dep, declaration in cargo_toml.get("dependencies", {}).items():
        # If any item uses default features, they are used everywhere
        default_features[dep] = default_features[dep] or declaration.get("default-features", True)
        feature_map[dep].update(declaration.get("features", []))

for dep, features in sorted(feature_map.items()):
    features = features - root_feature_map.get(dep, set())
    if not features and default_features[dep] == root_default_features[dep]:
        continue
    print(dep, default_features[dep], sorted(features))
```
## Summary
This PR enables something like the "final boss" of PyTorch setups --
explicit support for CPU vs. GPU-enabled variants via extras:
```toml
[project]
name = "project"
version = "0.1.0"
requires-python = ">=3.13.0"
dependencies = []

[project.optional-dependencies]
cpu = [
  "torch==2.5.1+cpu",
]
gpu = [
  "torch==2.5.1",
]

[tool.uv.sources]
torch = [
  { index = "torch-cpu", extra = "cpu" },
  { index = "torch-gpu", extra = "gpu" },
]

[[tool.uv.index]]
name = "torch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[[tool.uv.index]]
name = "torch-gpu"
url = "https://download.pytorch.org/whl/cu124"
explicit = true

[tool.uv]
conflicts = [
  [
    { extra = "cpu" },
    { extra = "gpu" },
  ],
]
```
It builds atop the conflicting extras work to allow sources to be marked
as specific to a dedicated extra being enabled or disabled.
As part of this work, sources now have an `extra` field. If a source has
an `extra`, it means that the source is only applied to the requirement
when defined within that optional group. For example, `{ index =
"torch-cpu", extra = "cpu" }` above only applies to
`"torch==2.5.1+cpu"`.
The `extra` field does _not_ mean that the source is "enabled" when the
extra is activated. For example, this wouldn't work:
```toml
[project]
name = "project"
version = "0.1.0"
requires-python = ">=3.13.0"
dependencies = ["torch"]

[tool.uv.sources]
torch = [
  { index = "torch-cpu", extra = "cpu" },
  { index = "torch-gpu", extra = "gpu" },
]

[[tool.uv.index]]
name = "torch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[[tool.uv.index]]
name = "torch-gpu"
url = "https://download.pytorch.org/whl/cu124"
explicit = true
```
In this case, the sources would effectively be ignored. Extras are
really confusing... but I think this is correct? We don't want enabling
or disabling extras to affect resolution information that's _outside_ of
the relevant optional group.
## Summary
This PR adds context to our error messages to explain _why_ a given
package was included, if we fail to download or build it.
It's quite a large change, but it motivated some good refactors and
improvements along the way.
Closes https://github.com/astral-sh/uv/issues/8962.
## Summary
This PR should not contain any user-visible changes, but the goal is to
refactor the `Resolution` type to retain a dependency graph. We want to
be able to explain _why_ a given package was excluded on error (see:
https://github.com/astral-sh/uv/issues/8962), which in turn requires
that at install time, we can go back and figure out the dependency
chain. At present, `Resolution` is just a map from package name to
distribution; this PR remodels it as a graph in which each node is a
package, and the edges contain markers plus extras or dependency groups.
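A hedged sketch of the new shape, using `petgraph` with illustrative node and edge types (the real ones carry more information):

```rust
// Hedged sketch; uv's real `Resolution` nodes and edges are richer.
use petgraph::graph::DiGraph;

struct Node {
    /// Package name, e.g. "torch".
    name: String,
    /// The resolved distribution's version.
    version: String,
}

struct Edge {
    /// Environment marker under which this dependency edge applies.
    marker: String,
    /// Extra that introduced the edge, if any.
    extra: Option<String>,
    /// Dependency group that introduced the edge, if any.
    group: Option<String>,
}

/// Previously a flat map from package name to distribution; now a graph, so
/// that at install time we can walk incoming edges to explain *why* a
/// package was included.
struct Resolution {
    graph: DiGraph<Node, Edge>,
}
```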
## Summary
I need this for the derivation chain work
(https://github.com/astral-sh/uv/issues/8962), but it just seems
generally useful. You can't always get a version from a `Dist` (it could
be URL-based!), but when we create a `ResolvedDist`, we _do_ know the
version (and not just the URL). This PR preserves it.
## Summary
At present, when we have a Python requirement and we see a wheel, we
verify that the Python requirement is compatible with the wheel. For
source distributions, though, we verify that both the Python requirement
_and_ the currently-installed version are compatible, because we assume
that we'll need to build the source distribution in order to get
metadata. However, we can often extract source distribution metadata
_without_ building (e.g., if there's a `pyproject.toml` with no dynamic
keys).
This PR thus modifies the source distribution handling to defer that
incompatibility ("We couldn't get metadata for this project, because it
has no static metadata and requires a higher Python version to run /
build") until we actually try to build the package. As a result, you can
now resolve source distribution-only packages using Python versions
below their `requires-python`, as long as they include static metadata.
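A hedged sketch of the resolution-time check, with stand-in types and a hypothetical `satisfies` predicate (not uv's actual compatibility code):

```rust
// Hedged sketch; `satisfies` stands in for a real specifier check.
enum SourceDistCompatibility {
    Compatible,
    /// The build step would need a different interpreter; only surfaces as an
    /// error if we actually have to build.
    Incompatible(String),
}

fn check_source_dist(
    requires_python: &str,
    installed: &str,
    has_static_metadata: bool,
    satisfies: impl Fn(&str, &str) -> bool,
) -> SourceDistCompatibility {
    // Previously: always require the installed interpreter to satisfy
    // `requires-python`, because we assumed a build was needed for metadata.
    // Now: if static metadata is available, no build is needed at resolution
    // time, so the incompatibility is deferred until we actually build.
    if !has_static_metadata && !satisfies(installed, requires_python) {
        return SourceDistCompatibility::Incompatible(format!(
            "building requires Python {requires_python}, but {installed} is installed"
        ));
    }
    SourceDistCompatibility::Compatible
}
```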
Closes https://github.com/astral-sh/uv/issues/8767.
## Summary
This is part of making
https://github.com/astral-sh/uv/issues/7299#issuecomment-2385286341
better. You can now use `tool.uv.dependency-metadata` for direct URL
requirements. Unfortunately, you _must_ include a version, since we need
one to perform resolution.
Part of https://github.com/astral-sh/uv/issues/2777
I noticed we're seeing "Python ABI" _a lot_ in error messages, which I
did not expect. This improves a common case by being a little more
specific.
## Summary
If you pass a named index via the CLI, you can now reference it as a
named source. This required some surprisingly large refactors, since we
now need to be able to track whether a given index was provided on the
CLI vs. elsewhere (since, e.g., we don't want users to be able to
reference named indexes defined in global configuration).
Closes https://github.com/astral-sh/uv/issues/7899.