## Summary
We still only respect overrides and constraints in the workspace root --
which we may want to change -- but overrides and constraints are now
correctly lowered.
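For reference, a minimal sketch of the settings in question (package names illustrative), which are only honored in the workspace-root `pyproject.toml`:
```toml
# Workspace-root pyproject.toml: overrides and constraints declared here
# apply to the whole workspace; entries in member files are ignored today.
[tool.uv]
override-dependencies = ["werkzeug==2.3.0"]
constraint-dependencies = ["grpcio<1.65"]
```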
Closes https://github.com/astral-sh/uv/issues/8148.
## Summary
This was an oversight in the initial implementation. We shouldn't
validate sources for the `build-system.requires` field, since extras and
groups can _never_ be active.
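For context, a hypothetical configuration of the kind affected (names and URL invented): the extra-gated source below can never apply to the build requirement, because extras and groups are never active when resolving `build-system.requires`.
```toml
[build-system]
requires = ["setuptools>=68"]
build-backend = "setuptools.build_meta"

# This source only applies when the `dev` extra's requirements are lowered;
# it should not be validated against (or applied to) build-system.requires.
[tool.uv.sources]
setuptools = { index = "internal", extra = "dev" }

[[tool.uv.index]]
name = "internal"
url = "https://example.com/simple"
explicit = true
```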
Closes https://github.com/astral-sh/uv/issues/9259.
## Summary
The reqwest middleware doesn't retry errors that occur "after" the
request completes -- but in some cases, these do include spurious errors
that we want to retry. See https://github.com/astral-sh/uv/issues/8144
for examples. This PR adds a second retry layer during the response
_handler_, which should help with some of the spurious failures we see
in the linked issue.
Closes https://github.com/astral-sh/uv/issues/8144.
## Summary
This PR enables something like the "final boss" of PyTorch setups --
explicit support for CPU vs. GPU-enabled variants via extras:
```toml
[project]
name = "project"
version = "0.1.0"
requires-python = ">=3.13.0"
dependencies = []

[project.optional-dependencies]
cpu = [
    "torch==2.5.1+cpu",
]
gpu = [
    "torch==2.5.1",
]

[tool.uv.sources]
torch = [
    { index = "torch-cpu", extra = "cpu" },
    { index = "torch-gpu", extra = "gpu" },
]

[[tool.uv.index]]
name = "torch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[[tool.uv.index]]
name = "torch-gpu"
url = "https://download.pytorch.org/whl/cu124"
explicit = true

[tool.uv]
conflicts = [
    [
        { extra = "cpu" },
        { extra = "gpu" },
    ],
]
```
It builds atop the conflicting-extras work, allowing a source to be
marked as applying only when a given extra is enabled or disabled.
As part of this work, sources now have an `extra` field. If a source has
an `extra`, it means that the source is only applied to the requirement
when defined within that optional group. For example, `{ index =
"torch-cpu", extra = "cpu" }` above only applies to
`"torch==2.5.1+cpu"`.
The `extra` field does _not_ mean that the source is "enabled" when the
extra is activated. For example, this wouldn't work:
```toml
[project]
name = "project"
version = "0.1.0"
requires-python = ">=3.13.0"
dependencies = ["torch"]

[tool.uv.sources]
torch = [
    { index = "torch-cpu", extra = "cpu" },
    { index = "torch-gpu", extra = "gpu" },
]

[[tool.uv.index]]
name = "torch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[[tool.uv.index]]
name = "torch-gpu"
url = "https://download.pytorch.org/whl/cu124"
explicit = true
```
In this case, the sources would effectively be ignored. Extras are
genuinely confusing here, but I think this is correct: we don't want
enabling or disabling an extra to affect resolution information that sits
_outside_ of the relevant optional group.
## Summary
The basic issue here is that `uv add` will compute and store a hash for
each package. But if you later run `uv pip install` _after_ `uv cache
prune --ci`, we need to re-download the source distribution. After
re-downloading, we compare the hashes before and after. But `uv pip
install` doesn't compute any hashes by default. So the hashes "differ"
and we error.
Instead, we need to compute a superset of the already-existing and
newly-requested hashes when performing this re-download. (In practice,
this will always be SHA-256.)
Closes https://github.com/astral-sh/uv/issues/8929.
## Test Plan
```shell
export UV_CACHE_DIR="$PWD/cache"
rm -rf "$UV_CACHE_DIR" .venv .venv-2 pyproject.toml uv.lock
echo $(uv --version)
uv init --name uv-cache-issue
cargo run add --python 3.13 "pycairo"
uv cache prune --ci
rm -rf .venv .venv-2
uv venv --python python3.11 .venv-2
. .venv-2/bin/activate
cargo run pip install "pycairo"
```
## Summary
In the example outlined in https://github.com/astral-sh/uv/issues/8884,
this removes an unnecessary `jupyter_contrib_nbextensions-0.7.0.tar.gz`
segment (replacing it with `src`), thereby saving 39 characters and
getting that build working on my Windows machine.
This should _not_ require a version bump because we already have logic
in place to "heal" partial cache entries that lack an unzipped
distribution.
Closes https://github.com/astral-sh/uv/issues/8884.
Closes https://github.com/astral-sh/uv/issues/7376.
## Summary
See: https://github.com/astral-sh/uv/issues/8884. We currently build in a
directory that's nested deep within the cache; to help with file name
length limits, we should build at the top level of the cache instead.
## Summary
At present, when we have a Python requirement and we see a wheel, we
verify that the Python requirement is compatible with the wheel. For
source distributions, though, we verify that both the Python requirement
_and_ the currently-installed version are compatible, because we assume
that we'll need to build the source distribution in order to get
metadata. However, we can often extract source distribution metadata
_without_ building (e.g., if there's a `pyproject.toml` with no dynamic
keys).
This PR thus modifies the source distribution handling to defer that
incompatibility ("We couldn't get metadata for this project, because it
has no static metadata and requires a higher Python version to run /
build") until we actually try to build the package. As a result, you can
now resolve source distribution-only packages using Python versions
below their `requires-python`, as long as they include static metadata.
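As a sketch of what "static metadata" means here (names invented): a source distribution whose `pyproject.toml` declares its metadata up front, with no `dynamic` keys, can be read without a build, and so can be resolved even on an interpreter older than its `requires-python`.
```toml
# Hypothetical pyproject.toml inside a source-distribution-only package:
# everything uv needs is declared statically, so no build is required to
# extract metadata during resolution.
[project]
name = "example-sdist-only"
version = "1.0.0"
requires-python = ">=3.12"
dependencies = ["numpy"]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```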
Closes https://github.com/astral-sh/uv/issues/8767.
When resolving workspace dependencies (from one workspace member to
another) in a workspace that's pulled in via git, we need to emit these
transitive dependencies as git dependencies, not as path dependencies the
way we do for all other workspace deps. This fixes a bug where we would
treat them as path dependencies inside the checkout directory, leading
either to clashes (between a local path and another direct git dependency)
or to invalid lockfiles (referencing the checkout directory in the
lockfile when we should be referencing the git repo).
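For illustration, a hypothetical consumer of such a workspace (names and URL invented): `foo` and its sibling member `foo-core` live in the same git-hosted workspace, so `foo`'s dependency on `foo-core` must be locked as a git dependency on the same repository, not as a path into the checkout directory.
```toml
[project]
name = "consumer"
version = "0.1.0"
dependencies = ["foo"]

[tool.uv.sources]
# `foo` is a member of a git-hosted workspace; its dependency on the sibling
# member `foo-core` should resolve to the same repository, not a local path.
foo = { git = "https://github.com/example/foo-workspace", subdirectory = "packages/foo" }
```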
Fixes #8087
Fixes #4920
Fixes #3936 since we needed that information anyway
---------
Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
## Summary
We already support `tool.uv.dev-dependencies` in the legacy
non-`[project]` projects. This adds equivalent support for
`[dependency-groups]`, e.g.:
```toml
[tool.uv.workspace]

[dependency-groups]
lint = ["ruff"]
```
Part of #8090
Adds the ability to read group inclusions (`include-group = <name>`) in
the `pyproject.toml`. Resolves groups into concrete dependencies for
resolution.
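For example, a `dev` group that pulls in the other groups via `include-group` (group names illustrative):
```toml
[dependency-groups]
lint = ["ruff"]
test = ["pytest"]
dev = [
    "ipython",
    { include-group = "lint" },
    { include-group = "test" },
]
```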
See https://github.com/astral-sh/uv/pull/8110 for a bit more commentary
on deferred work.
---------
Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
Part of #8090
Adds the ability to add and remove dependencies from arbitrary groups
using `uv add` and `uv remove`. Does not include resolving with the new
dependencies — tackling that in #8110.
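For instance, something like `uv add --group lint ruff` would append the dependency to that group in `pyproject.toml`, and `uv remove --group lint ruff` would drop it again, roughly:
```toml
[dependency-groups]
lint = [
    "ruff",  # a lower bound would be added at `uv add` time; omitted here
]
```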
Additionally, this does not yet resolve interactions with the existing
`dev` group — we'll tackle that separately as well. I probably won't
merge the stack until that design is resolved.
## Summary
This is part of improving the situation described in
https://github.com/astral-sh/uv/issues/7299#issuecomment-2385286341.
You can now use `tool.uv.dependency-metadata` for direct URL
requirements. Unfortunately, you _must_ include a version, since we need
one to perform resolution.
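A hedged sketch of what this looks like (package name, URL, and metadata all hypothetical): the direct-URL requirement takes its metadata from `tool.uv.dependency-metadata`, and the entry must carry a `version`:
```toml
[project]
name = "example"
version = "0.1.0"
dependencies = ["legacy-pkg @ https://example.com/legacy_pkg-1.2.3.tar.gz"]

[[tool.uv.dependency-metadata]]
name = "legacy-pkg"
version = "1.2.3"  # required: resolution needs a version for direct URL requirements
requires-dist = ["numpy"]
```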
## Summary
We shouldn't show these in `uv add`, especially when the thing we're
adding is about to have a lower-bound put on it. Now, we only show these
when the user runs `uv lock` or `uv sync`.
## Summary
If you pass a named index via the CLI, you can now reference it as a
named source. This required some surprisingly large refactors, since we
now need to track whether a given index was provided on the CLI or
elsewhere (for example, we don't want users to be able to reference named
indexes defined in global configuration).
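For example (index name and URL illustrative), you could pass something like `--index pytorch=https://download.pytorch.org/whl/cu121` on the command line and then pin a package to it in `pyproject.toml`:
```toml
[tool.uv.sources]
# Refers to the `pytorch` index supplied via `--index pytorch=<url>` on the CLI.
torch = { index = "pytorch" }
```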
Closes https://github.com/astral-sh/uv/issues/7899.
## Summary
This PR enables users to provide index credentials via named environment
variables.
For example, given an index named `internal` that requires a username
(`public`) and password
(`koala`), you can define the index (without credentials) in your
`pyproject.toml`:
```toml
[[tool.uv.index]]
name = "internal"
url = "https://pypi-proxy.corp.dev/simple"
```
Then set the `UV_INDEX_INTERNAL_USERNAME` and
`UV_INDEX_INTERNAL_PASSWORD`
environment variables, where `INTERNAL` is the uppercase version of the
index name:
```sh
export UV_INDEX_INTERNAL_USERNAME=public
export UV_INDEX_INTERNAL_PASSWORD=koala
```
## Summary
This PR adds a first-class API for defining registry indexes, beyond our
existing `--index-url` and `--extra-index-url` setup.
Specifically, you now define indexes like so in a `uv.toml` or
`pyproject.toml` file:
```toml
[[tool.uv.index]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cu121"
```
You can also provide indexes via `--index` and `UV_INDEX`, and override
the default index with `--default-index` and `UV_DEFAULT_INDEX`.
### Index priority
Indexes are prioritized in the order in which they're defined, such that
the first-defined index has highest priority.
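For instance (names and URLs hypothetical), with the following configuration, `first` is consulted before `second` for every package:
```toml
[[tool.uv.index]]
name = "first"
url = "https://first.example.com/simple"

[[tool.uv.index]]
name = "second"
url = "https://second.example.com/simple"
```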
Indexes are also inherited from parent configuration (e.g., the
user-level `uv.toml`), but are placed after any indexes in the current
project, matching our semantics for other array-based configuration
values.
You can mix `--index` and `--default-index` with the legacy
`--index-url` and `--extra-index-url` settings; the latter two are
merely treated as unnamed `[[tool.uv.index]]` entries.
### Index pinning
If an index includes a name (which is optional), it can then be
referenced via `tool.uv.sources`:
```toml
[[tool.uv.index]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cu121"
[tool.uv.sources]
torch = { index = "pytorch" }
```
If an index is marked as `explicit = true`, it can _only_ be used via
such references, and will never be searched implicitly:
```toml
[[tool.uv.index]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cu121"
explicit = true

[tool.uv.sources]
torch = { index = "pytorch" }
```
Indexes defined outside of the current project (e.g., in the user-level
`uv.toml`) can _not_ be explicitly selected.
(As of now, we only support using a single index for a given
`tool.uv.sources` definition.)
### Default index
By default, we include PyPI as the default index. This remains true even
if the user defines a `[[tool.uv.index]]` -- PyPI is still used as a
fallback. You can mark an index as `default = true` to (1) disable the
use of PyPI, and (2) bump it to the bottom of the prioritized list, such
that it's used only if a package does not exist on a prior index:
```toml
[[tool.uv.index]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cu121"
default = true
```
### Name reuse
If a name is reused, the higher-priority index with that name is used,
while the lower-priority indexes are ignored entirely.
For example, given:
```toml
[[tool.uv.index]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cu121"
[[tool.uv.index]]
name = "pytorch"
url = "https://test.pypi.org/simple"
```
The `https://test.pypi.org/simple` index would be ignored entirely,
since it's lower-priority than `https://download.pytorch.org/whl/cu121`
but shares the same name.
Closes #171.
## Future work
- Users should be able to provide authentication for named indexes via
environment variables.
- `uv add` should automatically write `--index` entries to the
`pyproject.toml` file.
- Users should be able to provide multiple indexes for a given package,
stratified by platform:
```toml
[tool.uv.sources]
torch = [
{ index = "cpu", markers = "sys_platform == 'darwin'" },
{ index = "gpu", markers = "sys_platform != 'darwin'" },
]
```
- Users should be able to specify a proxy URL for a given index, to
avoid writing user-specific URLs to a lockfile:
```toml
[[tool.uv.index]]
name = "test"
url = "https://private.org/simple"
proxy = "http://<omitted>/pypi/simple"
```
As per
https://matklad.github.io/2021/02/27/delete-cargo-integration-tests.html,
this consolidates the integration tests into a single test binary. Before
this change, there were 91 separate integration test binaries.
(As discussed on Discord — I've done the `uv` crate, there are still a few
more commits coming before this is mergeable, and I want to see how it
performs in CI and locally).
## Summary
This PR enables users to provide multiple source entries in
`tool.uv.sources`, e.g.:
```toml
[tool.uv.sources]
httpx = [
{ git = "https://github.com/encode/httpx", tag = "0.27.2", marker = "sys_platform == 'darwin'" },
{ git = "https://github.com/encode/httpx", tag = "0.24.1", marker = "sys_platform == 'linux'" },
]
```
The implementation is relatively straightforward: when we lower the
requirement, we now return an iterator rather than a single requirement.
In other words, the above is transformed into two requirements:
```txt
httpx @ git+https://github.com/encode/httpx@0.27.2 ; sys_platform == 'darwin'
httpx @ git+https://github.com/encode/httpx@0.24.1 ; sys_platform == 'linux'
```
We verify (at deserialization time) that the markers are
non-overlapping.
Closes https://github.com/astral-sh/uv/issues/3397.
## Summary
Historically, we've allowed the use of wheels that were downloaded from
PyPI even when the user passes `--no-binary`, if the wheel exists in the
cache. This PR modifies the cache lookup code such that we respect
`--no-build` and `--no-binary` in those paths.
Closes https://github.com/astral-sh/uv/issues/2154.
## Summary
`uv cache prune --ci` will remove the source distribution directory. If
we then need to build a _different_ wheel (e.g., you're building a
package that has Python minor version-specific wheels), we fail, because
we expect the source to be there.
Now, if the source is missing, we re-download it. It would be slightly
easier to just _ignore_ that revision, but that would mean we'd also
lose the already-built wheels -- so if you ran against many Python
versions, we'd continuously lose the cached data.
Closes https://github.com/astral-sh/uv/issues/7543.
## Test Plan
We can add tests, but they _need_ to build non-pure Python wheels, which
tends to be expensive...
For reference:
```console
$ cargo run venv --python 3.12
$ cargo run pip install mercurial==6.8.1 --verbose
$ cargo run cache prune --ci
$ cargo run venv --python 3.11
$ cargo run pip install mercurial==6.8.1 --verbose
```
I also did this with a local `.tar.gz` that I downloaded from PyPI.
## Summary
Closes https://github.com/astral-sh/uv/issues/7485.
## Test Plan
```console
$ cargo run cache clean
$ cargo run venv
$ cargo run pip install django-allauth==0.51.0
$ cargo run venv
$ cargo run pip install django-allauth==0.51.0
```