# Resolver internals

!!! tip

    This document focuses on the internal workings of uv's resolver. For using uv, see the
    [resolution concept](../../concepts/resolution.md) documentation.

## Resolver

As defined in a textbook, resolution, or finding a set of versions to install from a given set of
requirements, is equivalent to the
[SAT problem](https://en.wikipedia.org/wiki/Boolean_satisfiability_problem) and thereby NP-complete:
in the worst case you have to try all possible combinations of all versions of all packages and
there are no general, fast algorithms. In practice, this is misleading for a number of reasons:

- The slowest part of resolution in uv is loading package and version metadata, even if it's cached.
- There are many possible solutions, but some are preferable to others. For example, we generally
  prefer using the latest version of packages.
- Package dependencies are complex, e.g., there are contiguous version ranges — not arbitrary
  boolean inclusion/exclusions of versions, adjacent releases often have the same or similar
  requirements, etc.
- For most resolutions, the resolver doesn't need to backtrack; picking versions iteratively is
  sufficient. If there are version preferences from a previous resolution, barely any work needs to
  be done.
- When resolution fails, more information is needed than a message that there is no solution (as is
  seen in SAT solvers). Instead, the resolver should produce an understandable error trace that
  states which packages are involved in a way that allows a user to remove the conflict.
- The most important heuristic for performance and user experience is determining the order in
  which decisions are made through prioritization.

uv uses [pubgrub-rs](https://github.com/pubgrub-rs/pubgrub), the Rust implementation of
[PubGrub](https://nex3.medium.com/pubgrub-2fb6470504f), an incremental version solver. PubGrub in uv
works in the following steps:

- Start with a partial solution that declares which package versions have been selected and which
  are undecided. Initially, only a virtual root package is decided.
- The highest priority package is selected from the undecided packages. Roughly, packages with URLs
  (including file, git, etc.) have the highest priority, then those with more exact specifiers
  (such as `==`), then those with less strict specifiers. Inside each category, packages are
  ordered by when they were first seen (i.e., order in a file), making the resolution deterministic.
- A version is picked for the selected package (see the sketch after this list). The version must
  work with all specifiers from the requirements in the partial solution and must not be previously
  marked as incompatible. The resolver prefers versions from a lockfile (`uv.lock` or
  `-o requirements.txt`) and those installed in the current environment. Versions are checked from
  highest to lowest (unless using an alternative
  [resolution strategy](../../concepts/resolution.md#resolution-strategy)).
- All requirements of the selected package version are added to the undecided packages. uv
  prefetches their metadata in the background to improve performance.
- The process is repeated with the next package unless a conflict is detected, in which case the
  resolver backtracks. For example, the partial solution contains, among other packages, `a 2` then
  `b 2` with the requirements `a 2 -> c 1` and `b 2 -> c 2`. No compatible version of `c` can be
  found. PubGrub can determine this was caused by `a 2` and `b 2` and add the incompatibility
  `{a 2, b 2}`, meaning that when either is picked, the other cannot be selected. The partial
  solution is restored to `a 2` with the tracked incompatibility and the resolver attempts to pick
  a new version for `b`.
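
The version-picking step can be approximated outside uv. The following is a minimal sketch, not
uv's actual code: it uses the `packaging` library to pick a candidate for a single package,
preferring a pinned version from a previous resolution and otherwise the highest allowed version.
The `candidates`, `combined`, and `preference` inputs are hypothetical.

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

def pick_version(
    candidates: list[str],          # available versions, e.g. from an index
    combined: SpecifierSet,         # all specifiers from the partial solution
    preference: str | None = None,  # e.g. the version pinned in a lockfile
) -> Version | None:
    """Pick a version: prefer the preference, otherwise the highest allowed."""
    if preference is not None and combined.contains(preference):
        return Version(preference)
    allowed = sorted((Version(c) for c in candidates if combined.contains(c)), reverse=True)
    return allowed[0] if allowed else None

# Example: the partial solution requires `c>=1,<3` and `c!=2.1`.
print(pick_version(["1.0", "2.0", "2.1", "3.0"], SpecifierSet(">=1,<3,!=2.1")))
# -> 2.0
print(pick_version(["1.0", "2.0", "2.1"], SpecifierSet(">=1,<3"), preference="1.0"))
# -> 1.0 (kept from the previous resolution)
```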

Eventually, the resolver either picks compatible versions for all packages (a successful resolution)
or there is an incompatibility including the virtual "root" package which defines the versions
requested by the user. An incompatibility with the root package indicates that whatever versions of
the root dependencies and their transitive dependencies are picked, there will always be a conflict.
From the incompatibilities tracked in PubGrub, an error message is constructed to enumerate the
involved packages.

!!! tip

    For more details on the PubGrub algorithm, see [Internals of the PubGrub
    algorithm](https://pubgrub-rs-guide.pages.dev/internals/intro).

In addition to PubGrub's base algorithm, we also use a heuristic that backtracks and switches the
order of two packages if they have been conflicting too much.

## Forking

Python resolvers historically didn't support backtracking, and even with backtracking, resolution
was usually limited to a single environment, with one specific architecture, operating system,
Python version, and Python implementation. Some packages use contradictory requirements for
different environments, for example:

```
numpy>=2,<3 ; python_version >= "3.11"
numpy>=1.16,<2 ; python_version < "3.11"
```

Since Python only allows one version of each package, a naive resolver would error here. Inspired by
[Poetry](https://github.com/python-poetry/poetry), uv uses a forking resolver: whenever there are
multiple requirements for a package with different markers, the resolution is split.

In the above example, the partial solution would be split into two resolutions, one for
`python_version >= "3.11"` and one for `python_version < "3.11"`.

If markers overlap or are missing a part of the marker space, the resolver splits additional times —
there can be many forks per package. For example, given:

```
flask > 1 ; sys_platform == 'darwin'
flask > 2 ; sys_platform == 'win32'
flask
```

A fork would be created for `sys_platform == 'darwin'`, for `sys_platform == 'win32'`, and for
`sys_platform != 'darwin' and sys_platform != 'win32'`.
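
The requirements above can be checked against concrete environments with the `packaging` library.
This is only an illustration of why each fork sees a different subset of the requirements, not how
uv evaluates markers internally:

```python
from packaging.requirements import Requirement

requirements = [
    Requirement("flask > 1 ; sys_platform == 'darwin'"),
    Requirement("flask > 2 ; sys_platform == 'win32'"),
    Requirement("flask"),
]

# Evaluate which requirements apply in each (simplified) environment.
for sys_platform in ["darwin", "win32", "linux"]:
    env = {"sys_platform": sys_platform}
    active = [
        str(req.specifier) or "*"
        for req in requirements
        if req.marker is None or req.marker.evaluate(env)
    ]
    print(sys_platform, active)
# darwin sees ['>1', '*'], win32 sees ['>2', '*'], linux sees only ['*']
```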

Forks can be nested, i.e., each fork is dependent on any previous forks that occurred. Forks with
identical packages are merged to keep the number of forks low.

!!! tip

    Forking can be observed in the logs of `uv lock -v` by looking for
    `Splitting resolution on ...`, `Solving split ... (requires-python: ...)`, and
    `Split ... resolution took ...`.

One difficulty in a forking resolver is that where splits occur is dependent on the order packages
are seen, which is in turn dependent on the preferences, e.g., from `uv.lock`. So it is possible for
the resolver to solve the requirements with specific forks, write this to the lockfile, and when the
resolver is invoked again, a different solution is found because the preferences result in different
fork points. To avoid this, the `resolution-markers` of each fork and each package that diverges
between forks are written to the lockfile. When performing a new resolution, the forks from the
lockfile are used to ensure the resolution is stable. When requirements change, new forks may be
added to the saved forks.

## Wheel tags

While uv's resolution is universal with respect to environment markers, this doesn't extend to wheel
tags. Wheel tags can encode the Python version, Python implementation, operating system, and
architecture. For example, `torch-2.4.0-cp312-cp312-manylinux2014_aarch64.whl` is only compatible
with CPython 3.12 on arm64 Linux with `glibc>=2.17` (per the `manylinux2014` policy), while
`tqdm-4.66.4-py3-none-any.whl` works with all Python 3 versions and interpreters on any operating
system and architecture. Most projects have a universally compatible source distribution that can be
used when attempting to install a package that has no compatible wheel, but some packages, such as
`torch`, don't publish a source distribution. In this case an installation on, e.g., Python 3.13, an
uncommon operating system, or an uncommon architecture will fail and complain that there is no
matching wheel.
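
Wheel tag compatibility can be inspected with the `packaging` library. The sketch below checks
whether a wheel's tags intersect the tags supported by the current interpreter; it illustrates the
kind of check uv performs when selecting wheels, not uv's own implementation:

```python
from packaging.tags import sys_tags
from packaging.utils import parse_wheel_filename

def is_compatible(wheel_filename: str) -> bool:
    """Return True if any of the wheel's tags is supported by this interpreter."""
    _name, _version, _build, wheel_tags = parse_wheel_filename(wheel_filename)
    supported = set(sys_tags())
    return any(tag in supported for tag in wheel_tags)

# `py3-none-any` wheels are compatible everywhere; the torch wheel only matches
# CPython 3.12 on manylinux-compatible aarch64 Linux.
print(is_compatible("tqdm-4.66.4-py3-none-any.whl"))
print(is_compatible("torch-2.4.0-cp312-cp312-manylinux2014_aarch64.whl"))
```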

## Marker and wheel tag filtering

In every fork, we know which markers are possible. In non-universal resolution, we know their exact
values. In universal mode, we know at least a constraint for the Python requirement, e.g.,
`requires-python = ">=3.12"` means that `importlib_metadata; python_version < "3.10"` can be
discarded because it can never be installed. If additionally `tool.uv.environments` is set, we can
filter out requirements whose markers are disjoint from those environments. Inside each fork, we can
additionally filter by the fork markers.
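
As an illustration of the `requires-python` filter, the marker below can never be true within
`requires-python = ">=3.12"`. This sketch samples a few versions for demonstration; uv reasons
about the whole range symbolically rather than by sampling:

```python
from packaging.markers import Marker

marker = Marker('python_version < "3.10"')

# Sample a few Python versions allowed by `requires-python = ">=3.12"`.
for python_version in ["3.12", "3.13", "3.14"]:
    env = {"python_version": python_version, "python_full_version": f"{python_version}.0"}
    print(python_version, marker.evaluate(env))
# All False: the requirement can be discarded in a ">=3.12" resolution.
```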

There is some redundancy in the marker expressions, where the value of one marker field implies the
value of another field. Internally, we normalize `python_version` and `python_full_version` as well
as known values of `platform_system` and `sys_platform` to a shared canonical representation, so
they can match against each other.
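
For example, `platform_system == "Windows"` and `sys_platform == "win32"` describe the same
platform. A minimal sketch of such a normalization table follows; the marker fields and values are
standard, but the canonical names chosen here are made up for illustration:

```python
# Map equivalent marker values onto one canonical platform name (illustrative only).
CANONICAL_PLATFORM = {
    ("platform_system", "Windows"): "windows",
    ("sys_platform", "win32"): "windows",
    ("platform_system", "Linux"): "linux",
    ("sys_platform", "linux"): "linux",
    ("platform_system", "Darwin"): "macos",
    ("sys_platform", "darwin"): "macos",
}

def normalize(field: str, value: str) -> str:
    """Fall back to the raw value for platforms not in the table."""
    return CANONICAL_PLATFORM.get((field, value), value)

# Both spellings normalize to the same canonical platform, so the markers
# `platform_system == "Windows"` and `sys_platform == "win32"` can match each other.
assert normalize("platform_system", "Windows") == normalize("sys_platform", "win32")
```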

When we select a version with a local tag (e.g., `1.2.3+localtag`) and its wheels don't cover
Windows, Linux, and macOS, and there is a base version without the tag (e.g., `1.2.3`) that supports
a missing platform, we fork to extend the platform support by using both the version with the local
tag and the version without it, depending on the platform. This helps with packages that use the
local tag for different hardware accelerators, such as torch. While there is no 1:1 mapping between
wheel tags and markers, we can do a mapping for well-known platforms, including Windows, Linux, and
macOS.
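
A sketch of the platform split for a local version, using made-up wheel lists in a torch-like
naming scheme: the hypothetical `+cpu` build only ships Windows and Linux wheels, so macOS falls
back to the base version. The platform classification below is a crude stand-in for uv's
wheel-tag-to-marker mapping:

```python
from packaging.utils import parse_wheel_filename

def platforms(wheel_filenames: list[str]) -> set[str]:
    """Classify wheels into the well-known platforms via their platform tags."""
    found = set()
    for filename in wheel_filenames:
        *_, tags = parse_wheel_filename(filename)
        for tag in tags:
            if tag.platform.startswith("win"):
                found.add("windows")
            elif "linux" in tag.platform:
                found.add("linux")
            elif tag.platform.startswith("macosx"):
                found.add("macos")
    return found

# Hypothetical wheel lists for a local version and its base version.
local = platforms(["torch-2.4.0+cpu-cp312-cp312-win_amd64.whl",
                   "torch-2.4.0+cpu-cp312-cp312-manylinux_2_17_x86_64.whl"])
base = platforms(["torch-2.4.0-cp312-cp312-macosx_11_0_arm64.whl"])
print("use the +cpu version on:", local)          # {'windows', 'linux'}
print("fall back to the base version on:", base - local)  # {'macos'}
```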

## Metadata consistency

uv, similar to Poetry, requires that wheels of a single version of a package in a specific index
have the same dependencies (`Requires-Dist` in `METADATA`), including wheels built from a source
distribution. More generally, uv assumes that each wheel has the same `METADATA` file in its
dist-info directory.
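
What counts as "consistent" here can be expressed as a simple check over the wheels of one release.
The sketch below assumes the `Requires-Dist` entries have already been fetched (for example via an
index's metadata API); the package name and data are made up:

```python
# Hypothetical `Requires-Dist` entries per wheel of one release, already fetched.
requires_dist_per_wheel = {
    "example-1.0-py3-none-any.whl": ["anyio>=4", "click>=8"],
    "example-1.0-cp312-cp312-manylinux_2_17_x86_64.whl": ["anyio>=4", "click>=8"],
}

def is_consistent(per_wheel: dict[str, list[str]]) -> bool:
    """All wheels of a release must declare the same set of dependencies."""
    dependency_sets = {frozenset(deps) for deps in per_wheel.values()}
    return len(dependency_sets) <= 1

# With consistent metadata, reading one wheel's METADATA is enough for the
# whole release; otherwise every wheel would need its own request.
assert is_consistent(requires_dist_per_wheel)
```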

numpy 2.3.2 for example has 73 wheels. Without this assumption, uv would have to make 73 network
requests to fetch its metadata, instead of a single one. Another problem we would have without
metadata consistency is the lack of a 1:1 mapping between markers and wheel tags. Wheel tags can
include the glibc version while the PEP 508 markers cannot represent it. If wheels had different
metadata, a universal resolver would have to track two dimensions simultaneously, PEP 508 markers
and wheel tags. This would increase complexity a lot, and the correspondence between the two is not
properly specified. PEP 508 markers were introduced specifically to allow different dependencies
between different platforms, i.e., to have a single dependency declaration for all wheels, such as
`project.[optional-]dependencies`. If the markers are not sufficient, we should extend PEP 508
markers instead of using a parallel system of wheel tags.

Another aspect of metadata consistency is that a source distribution must build into a wheel with
the same metadata as the wheels, or if there are no wheels, into the same metadata each time. If
this assumption is violated, sound dependency locking becomes impossible: Consider a package A that
has a source distribution. During resolution, we build A v1 and obtain the dependencies `B>=2,<3`.
We lock `A==1` and `B==2`. When installing the lockfile on the target machine, we build again and
obtain the dependencies `B>=3,<4` and `C>=1,<2`. The lockfile fails to install: Due to the changed
constraints, the locked version of `B` is incompatible, and there's no locked candidate for `C`.
Re-resolving after this would both be a reproducibility problem (the lockfile is effectively
ignored) and a security concern (`C` has not been reviewed, and neither has `B==3`). It's possible
to fail on installation if that happens, but a late error, possibly during deployment, is a bad
user experience. There is already a case where uv fails on installation: packages with no source
distribution and only platform-specific wheels incompatible with the current platform. While uv has
[required environments](https://docs.astral.sh/uv/concepts/resolution/#required-environments) as a
mitigation, this requires a configuration option that is not well known, and questions around
(un)supported environments are one of the most common problems for uv users. A similar situation
with source distributions should be avoided.
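
The failure mode in this scenario can be made concrete with a small check (a sketch with made-up
data, not uv's installer logic): the locked versions are validated against the requirements the
source distribution produces at install time.

```python
from packaging.requirements import Requirement

# Versions recorded in the lockfile at resolution time.
locked = {"A": "1", "B": "2"}

# Requirements obtained from rebuilding A's source distribution at install time.
rebuilt_requirements = [Requirement("B>=3,<4"), Requirement("C>=1,<2")]

for req in rebuilt_requirements:
    pinned = locked.get(req.name)
    if pinned is None:
        print(f"no locked candidate for {req.name}")                    # C is missing
    elif not req.specifier.contains(pinned):
        print(f"locked {req.name}=={pinned} violates {req.specifier}")  # B==2 vs >=3,<4
```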

While older versions of torch and tensorflow had inconsistent metadata, all recent versions have
consistent metadata, and we are not aware of any major package with inconsistent metadata. There is
however no requirement in the Python packaging standards that metadata must be consistent, and
requests to enforce this in the standards have been rejected
(<https://discuss.python.org/t/enforcing-consistent-metadata-for-packages/50008>).

There are packages that have native code that links against the native code in another package, such
as torch. These packages may support building against a range of torch versions, but once built,
they are constrained to a specific torch version, and the runtime torch version must match the
build-time version. These are currently a pain point across all package managers, as all major
package managers from pip to uv cache source distribution builds. uv supports multiple builds
depending on the version of the already installed package using
[`tool.uv.extra-build-dependencies`](https://docs.astral.sh/uv/concepts/projects/config/#augmenting-build-dependencies)
with `match-runtime = true`. This is a workaround that needs to be made on the user side for each
affected package, instead of library developers declaring this requirement, which would be possible
with native standards support.

## Requires-python

To ensure that a resolution with `requires-python = ">=3.9"` can actually be installed for the
included Python versions, uv requires that all dependencies have the same minimum Python version.
Package versions that declare a higher minimum Python version, e.g., `requires-python = ">=3.10"`,
are rejected, because a resolution with that version can't be installed on Python 3.9. This ensures
that when you are on an old Python version, you can install old packages, instead of getting newer
packages that require newer Python syntax or standard library features.
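
A sketch of this check with the `packaging` library (illustrative, not uv's implementation): the
project's lower bound must be contained in the candidate's `requires-python` range for the
candidate to be accepted.

```python
from packaging.specifiers import SpecifierSet

project_lower_bound = "3.9"  # from `requires-python = ">=3.9"`

def accept(candidate_requires_python: str) -> bool:
    """Accept a candidate only if it supports the project's minimum Python version."""
    return SpecifierSet(candidate_requires_python).contains(project_lower_bound)

print(accept(">=3.9"))   # True: same minimum
print(accept(">=3.8"))   # True: a lower minimum is fine
print(accept(">=3.10"))  # False: would break installs on Python 3.9
```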

uv ignores upper bounds on `requires-python`, with special handling for packages with only
ABI-specific wheels. For example, if a package declares `requires-python = ">=3.8,<4"`, the `<4`
part is ignored. There is a detailed discussion with drawbacks and alternatives in
[#4022](https://github.com/astral-sh/uv/issues/4022) and this
[DPO thread](https://discuss.python.org/t/requires-python-upper-limits/12663); this section
summarizes the aspects most relevant to uv's design.

For most projects, it's not possible to determine whether they will be compatible with a new version
before it's released, so blocking newer versions in advance would block users from upgrading or
testing newer Python versions. The exceptions are packages which use the unstable C ABI or internals
of CPython such as its bytecode format.

Introducing a `requires-python` upper bound to a project that previously wasn't using one will not
prevent the project from being used on a too recent Python version. Instead of failing, the resolver
will pick an older version without the bound, circumventing the bound.

For the resolution to be as universally installable as possible, uv ensures that the selected
dependency versions are compatible with the `requires-python` range of the project. For example, for
a project with `requires-python = ">=3.12"`, uv will not use a dependency version with
`requires-python = ">=3.13"`, as otherwise the resolution is not installable on Python 3.12, which
the project declares to support. Applying the same logic to upper bounds means that bumping the
upper Python version bound on a project makes it compatible with fewer dependency versions,
potentially failing to resolve when no version of a dependency supports the required range. (Bumping
the lower Python version bound has the inverse effect: it only increases the set of supported
dependency versions.)

Note that this is different for Conda, as the Conda solver also determines the Python version, so it
can choose a lower Python version instead. Conda can also change metadata after a release, so it can
update compatibility for a new Python version, while metadata on PyPI cannot be changed once
published.

Ignoring an upper bound is a problem for packages such as numpy which use the version-dependent C
API of CPython. As of writing, each numpy release supports four Python minor versions, e.g., numpy
2.0.0 has wheels for CPython 3.9 through 3.12 and declares `requires-python = ">=3.9"`, while numpy
2.1.0 has wheels for CPython 3.10 through 3.13 and declares `requires-python = ">=3.10"`. This means
that when uv resolves a `numpy>=2,<3` requirement in a project with `requires-python = ">=3.9"`, it
selects numpy 2.0.0 and the lockfile doesn't install on Python 3.13 or newer. To alleviate this,
whenever uv rejects a version that requires a newer Python version, we fork by splitting the
resolution markers on that Python version. This behavior can be controlled by `--fork-strategy`. In
the example case, upon encountering numpy 2.1.0 we fork into Python versions `>=3.9,<3.10` and
`>=3.10` and resolve two different numpy versions:

```
numpy==2.0.0; python_version >= "3.9" and python_version < "3.10"
numpy==2.1.0; python_version >= "3.10"
```

There's one case where uv does consider the upper bound: when the project uses an upper bound on
`requires-python`, such as `requires-python = "==3.13.*"` for an application that only deploys to
Python 3.13. uv prunes wheels from the lockfile that are outside the range (e.g., `cp312` and
`cp314`) in a post-processing step, which does not influence the resolution itself.
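
A sketch of the pruning step (a made-up helper, not uv's code): map a CPython wheel tag like
`cp312` to its Python version and drop version-specific wheels whose version falls outside the
declared `requires-python` range.

```python
from packaging.specifiers import SpecifierSet

requires_python = SpecifierSet("==3.13.*")

def keep_wheel(python_tag: str) -> bool:
    """Keep version-specific CPython wheels only if their version is in range."""
    if not python_tag.startswith("cp"):
        return True  # e.g. `py3` wheels are not version-specific
    version = f"{python_tag[2]}.{python_tag[3:]}"  # "cp312" -> "3.12"
    return requires_python.contains(version)

for tag in ["cp312", "cp313", "cp314", "py3"]:
    print(tag, keep_wheel(tag))
# cp312 False, cp313 True, cp314 False, py3 True
```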

## URL dependencies

In uv, a dependency can either be a registry dependency, a package with a version specifier or the
plain package name, or a URL dependency. All requirements in the form `{name} @ {url}` are URL
dependencies, and also all dependencies that have a `git`, `url`, `path`, or `workspace` source.

When a URL is declared for a package, uv pins the package to this URL, and the version this URL
implies. If there are two conflicting URLs for a package, the resolver errors, as a URL can only be
declared as something akin to an exact `==` pin, and not as a list of URLs. A list of URLs is
supported through [flat indexes](../../concepts/indexes.md#flat-indexes) instead.
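
The `{name} @ {url}` form can be parsed with the `packaging` library. The conflict check below is
only a sketch of the "a URL acts like an `==` pin" rule, with made-up URLs:

```python
from packaging.requirements import Requirement

declared = [
    Requirement("bar @ https://example.com/bar-2-py3-none-any.whl"),
    Requirement("bar @ https://example.com/bar-3-py3-none-any.whl"),
]

# Each URL acts like an exact pin, so two different URLs for one name conflict.
urls_by_name: dict[str, str] = {}
for req in declared:
    previous = urls_by_name.setdefault(req.name, req.url)
    if previous != req.url:
        print(f"error: conflicting URLs for {req.name}:\n  {previous}\n  {req.url}")
```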

uv requires that URLs are either declared directly (in the project, in a
[workspace member](../../concepts/projects/workspaces.md), in a
[constraint](../../concepts/resolution.md#dependency-constraints), or in an
[override](../../concepts/resolution.md#dependency-overrides); that is, any location that is
discovered directly), or by other URL dependencies. uv discovers all URL dependencies and their
transitive URL dependencies ahead of the resolution and pins all packages to the URLs and the
versions they imply.

uv does not allow URLs in index packages. This has two reasons: One is a security and predictability
aspect, which forbids registry distributions from pointing to non-registry distributions and helps
with auditing which URLs can be accessed. For example, when only using one index URL and no URL
dependencies, uv will not install any package from outside the index.

The other is that URLs can add additional versions to the resolution. Say the root package depends
on foo, bar, and baz, all registry dependencies. foo depends on `bar >= 2`, but bar only has version
1 on the index. With the incremental approach, this is an error: foo cannot be fulfilled, there is a
resolver error. If URLs on index packages were allowed, it could be that there is a version of baz
that declares a dependency on baz-core, which in turn has a version that declares
`bar @ https://example.com/bar-2-py3-none-any.whl`, adding a version of bar that makes the
requirements resolve. If a dependency can add new versions, discarding any version in the resolver
would require looking at all possible versions of all direct and transitive dependencies. This
breaks the core assumption incremental resolvers make that the set of versions for a package is
static and would require always fetching the metadata for all possibly reachable versions.

## Prioritization

Prioritization is important both for performance and for better resolutions.

If we try many versions we have to later discard, resolution is slow, both because we have to read
metadata we didn't need and because we have to track a lot of (conflict) information for the
discarded subtree.

There are expectations about which solution uv should choose, even if the version constraints allow
multiple solutions. Generally, a desirable solution prioritizes using the highest versions for
direct dependencies over those for indirect dependencies, avoids backtracking to very old versions,
and can be installed on a target machine.

Internally, uv represents each package with a given package name as a number of virtual packages,
for example, one package for each activated extra, for dependency groups, or for having a marker.
While PubGrub needs to choose a version for each virtual package, uv's prioritization works on the
package name level.

Whenever we encounter a requirement on a package, we match it to a priority. The root package and
URL requirements have the highest priority, then singleton requirements with the `==` operator, as
their version can be directly determined, then highly conflicting packages (next paragraph), and
finally all other packages. Inside each category, packages are sorted by when they were first
encountered, creating a breadth-first search that prioritizes direct dependencies, including
workspace dependencies, over transitive dependencies.
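
A sketch of this ordering: the categories come from the paragraph above, while the enum values and
package data are made up. Packages sort by category first, then by the order in which they were
first encountered.

```python
from enum import IntEnum

class Category(IntEnum):
    # Lower values are decided first.
    ROOT = 0
    URL = 1
    SINGLETON = 2           # requirements with `==`
    HIGHLY_CONFLICTING = 3  # see the next paragraph
    OTHER = 4

# Hypothetical packages as (name, category, index of first sighting).
packages = [
    ("requests", Category.OTHER, 2),
    ("torch", Category.URL, 1),
    ("urllib3", Category.OTHER, 5),  # transitive, seen later
    ("myapp", Category.ROOT, 0),
    ("numpy", Category.SINGLETON, 3),
]

# Sorting by (category, first seen) yields a deterministic, breadth-first order.
for name, *_ in sorted(packages, key=lambda p: (p[1], p[2])):
    print(name)
# myapp, torch, numpy, requests, urllib3
```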

A common problem is that we have a package A with a higher priority than package B, and B is only
compatible with older versions of A. We decide the latest version for package A. Each time we decide
a version for B, it is immediately discarded due to the conflict with A. We have to try all possible
versions of B, until we have either exhausted the possible range (slow), picked a very old version
that doesn't depend on A but most likely isn't compatible with the project either (bad), or failed
to build a very old version (bad). Once we see such a conflict happen five times, we set A and B to
special highly-conflicting priority levels, and set them so that B is decided before A. We then
manually backtrack to a state before deciding A, in the next iteration now deciding B instead of A.
See [#8157](https://github.com/astral-sh/uv/issues/8157) and
[#9843](https://github.com/astral-sh/uv/pull/9843) for a more detailed description with real world
examples.
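
A sketch of the conflict-counting heuristic described above: the threshold of five comes from the
text, while the data structures and priority values are invented for illustration.

```python
from collections import Counter

CONFLICT_THRESHOLD = 5

conflict_counts: Counter[tuple[str, ...]] = Counter()
priorities = {"a": 0, "b": 10}  # lower value = decided earlier

def record_conflict(pkg_a: str, pkg_b: str) -> None:
    """Count conflicts between a pair; after the threshold, decide B before A."""
    pair = tuple(sorted((pkg_a, pkg_b)))
    conflict_counts[pair] += 1
    if conflict_counts[pair] == CONFLICT_THRESHOLD:
        # The real heuristic assigns special highly-conflicting priority levels and
        # backtracks to before A was decided; here we simply swap the order so that
        # B is decided before A.
        priorities[pkg_a], priorities[pkg_b] = priorities[pkg_b], priorities[pkg_a]

for _ in range(5):
    record_conflict("a", "b")
print(priorities)  # {'a': 10, 'b': 0}
```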