The resolver methods are already too large and complex, especially
`choose_version*`, so I wanted to shrink and simplify them a bit before
adding new methods to them.
I've split `MetadataResponse` into three variants: success, non-fatal
error (reported through pubgrub), fatal error (reported as error trace).
The resulting non-fatal `MetadataUnavailable` type is equivalent to the
`IncompletePackage` type, so the two are now merged. (`UnavailableVersion`
is a bit different since, besides the extra `IncompatibleDist` variant,
it has no error source attached.) This also showed that the missing-metadata
variant was unused, so I removed it.
Tagging as error messages because of the logging format changes.
This PR adds a notion of "conflict markers" to the lock file as an
attempt to address #9289. The idea is to encode a new kind of boolean
expression indicating how to choose dependencies based on which extras
are activated.
As an example of what conflict markers look like, consider one of the
cases
brought up in #9289, where `anyio` had unconditional dependencies on
two different versions of `idna`. Now, those are gated by markers, like
this:
```toml
[[package]]
name = "anyio"
version = "4.3.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "idna", version = "3.5", source = { registry = "https://pypi.org/simple" }, marker = "extra == 'extra-7-project-foo'" },
    { name = "idna", version = "3.6", source = { registry = "https://pypi.org/simple" }, marker = "extra == 'extra-7-project-bar' or extra != 'extra-7-project-foo'" },
    { name = "sniffio" },
]
```
The odd extra values like `extra-7-project-foo` are an encoding of not
just the conflicting extra (`foo`) but also the package it's declared
for (`project`). We need both bits of information because different
packages may have the same extra name, even if they are completely
unrelated. The `extra-` part is a prefix to distinguish it from groups
(which, in this case, would be encoded as `group-7-project-foo` if `foo`
were a dependency group). And the `7` part indicates the length of the
package name which makes it possible to parse out the package and extra
name from this encoding. (We don't actually utilize that property, but
it seems like good sense to do it in case we do need to extract
information from these markers.)
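For illustration, here is a minimal sketch of how such an encoding can be produced and parsed back. The function names are hypothetical and this is not uv's actual implementation; it only demonstrates why the length prefix makes the encoding unambiguous even when package names contain hyphens:
```rust
/// Hypothetical sketch of the `extra-<len>-<package>-<extra>` encoding
/// described above; uv's real implementation may differ.
fn encode_conflict_extra(package: &str, extra: &str) -> String {
    format!("extra-{}-{}-{}", package.len(), package, extra)
}

fn decode_conflict_extra(encoded: &str) -> Option<(String, String)> {
    let rest = encoded.strip_prefix("extra-")?;
    let (len, rest) = rest.split_once('-')?;
    let len: usize = len.parse().ok()?;
    let package = rest.get(..len)?;
    // The byte right after the package name must be the separating '-'.
    if rest.as_bytes().get(len) != Some(&b'-') {
        return None;
    }
    let extra = rest.get(len + 1..)?;
    Some((package.to_string(), extra.to_string()))
}

fn main() {
    let encoded = encode_conflict_extra("project", "foo");
    assert_eq!(encoded, "extra-7-project-foo");
    assert_eq!(
        decode_conflict_extra(&encoded),
        Some(("project".to_string(), "foo".to_string()))
    );
}
```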
While this preserves PEP 508 compatibility at a surface level, it does
require utilizing this encoding scheme in order
to evaluate them when they're present (which only occurs when
conflicting extras/groups are declared).
My sense is that the most complex part of this change is not just adding
conflict markers, but their simplification. I tried to address this in
the code comments and commit messages.
Reviewers should look at this commit-by-commit.
Fixes #9289, Fixes #9546, Fixes #9640, Fixes #9622, Fixes #9498, Fixes
#9701, Fixes #9734
## Summary
So the error here is:
```rust
ExtractError("cpython-3.11.11%2B20241206-aarch64-apple-darwin-install_only_stripped.tar.gz", Io(Custom { kind: UnexpectedEof, error: TarError { desc: "failed to unpack `/Users/crmarsh/.local/share/uv/python/.cache/.tmpkqFzqE/python/lib/libpython3.11.dylib`", io: Custom { kind: UnexpectedEof, error: TarError { desc: "failed to unpack `python/lib/libpython3.11.dylib` into `/Users/crmarsh/.local/share/uv/python/.cache/.tmpkqFzqE/python/lib/libpython3.11.dylib`", io: Custom { kind: UnexpectedEof, error: "unexpected end of file" } } } } }))
```
This isn't a Reqwest error, so we miss it in
`is_extended_transient_error`.
We could add `TarError` or `ExtractError` here, but... should we? This
PR just extends it to any error that has an IO source. I don't see much
of a downside.
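As a rough sketch of the idea (the helper is hypothetical, not the exact predicate in this PR), "any error that has an IO source" can be detected by walking the `std::error::Error` source chain:
```rust
use std::error::Error;

/// Sketch: true if any error in the source chain is a `std::io::Error`.
/// The real retry predicate in uv also considers reqwest errors, status
/// codes, etc.; this only illustrates the source-chain walk.
fn has_io_source(err: &(dyn Error + 'static)) -> bool {
    let mut current: Option<&(dyn Error + 'static)> = Some(err);
    while let Some(e) = current {
        if e.downcast_ref::<std::io::Error>().is_some() {
            return true;
        }
        current = e.source();
    }
    false
}

fn main() {
    let err = std::io::Error::new(std::io::ErrorKind::UnexpectedEof, "unexpected end of file");
    assert!(has_io_source(&err));
}
```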
Closes https://github.com/astral-sh/uv/issues/9747.
## Test Plan
First, ran: `uv run ./scripts/create-python-mirror.py --name cpython
--arch aarch64 --os darwin`.
Then, dropped this into `./scripts/mirror/server.py`:
```python
import os
import random
from http.server import SimpleHTTPRequestHandler, HTTPServer

class GlitchyStaticServer(SimpleHTTPRequestHandler):
    def do_GET(self):
        """Handle GET request."""
        file_path = self.translate_path(self.path)
        if not os.path.exists(file_path):
            self.send_error(404, "File not found")
            return

        try:
            with open(file_path, 'rb') as f:
                file_content = f.read()

            # Introduce an "unexpected end of file" glitch randomly
            if random.random() < 0.75:  # 75% chance of glitch
                glitch_point = random.randint(1, len(file_content) - 1)
                file_content = file_content[:glitch_point]

            self.send_response(200)
            self.send_header("Content-type", self.guess_type(file_path))
            self.send_header("Content-Length", len(file_content))
            self.end_headers()
            self.wfile.write(file_content)
        except Exception as e:
            self.send_error(500, f"Internal Server Error: {e}")


def run(server_class=HTTPServer, handler_class=GlitchyStaticServer, port=8080):
    """Run the server."""
    server_address = ('', port)
    httpd = server_class(server_address, handler_class)
    print(f"Serving on port {port} with glitchy behavior")
    httpd.serve_forever()


if __name__ == "__main__":
    run()
```
Then ran `python server.py` from that directory.
From there, ran `UV_PYTHON_INSTALL_MIRROR="http://localhost:8080" cargo
run python install 3.11 --reinstall --verbose` to reliably test retries.
## Summary
I'm not sure why this hasn't come up before... But it looks like this
method is only looking at `python.exe` and `python3.exe`? From the user
screenshots, `python3.12.exe` and `python3.13.exe` are also present,
though.
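For illustration, a sketch of the kind of filename matching this implies (hypothetical helper, not the actual discovery code):
```rust
/// Sketch: accept `python.exe`, `python3.exe`, and versioned names such as
/// `python3.12.exe` or `python3.13.exe`.
fn is_python_executable(file_name: &str) -> bool {
    let file_name = file_name.to_ascii_lowercase();
    let Some(stem) = file_name.strip_suffix(".exe") else {
        return false;
    };
    match stem.strip_prefix("python") {
        Some("") => true, // python.exe
        Some(version) => version.chars().all(|c| c.is_ascii_digit() || c == '.'),
        None => false,
    }
}

fn main() {
    assert!(is_python_executable("python.exe"));
    assert!(is_python_executable("python3.exe"));
    assert!(is_python_executable("python3.12.exe"));
    assert!(!is_python_executable("pythonw.exe"));
}
```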
Closes https://github.com/astral-sh/uv/issues/9667.
I don't see any real reason to forbid executing these in a
cross-platform way:
```
❯ echo "print('hello world')" > test.pyw
❯ uv run test.pyw
error: Failed to spawn: `test.pyw`
Caused by: No such file or directory (os error 2)
❯ cargo run -q -- run test.pyw
hello world
```
Closes https://github.com/astral-sh/uv/issues/9757
## Summary
On `main`, if you ask for a source but name a missing subdirectory, you
just get:
```
{source} does not appear to be a Python project, as neither `pyproject.toml` nor `setup.py` are present in the directory
```
But, in reality, the directory doesn't exist at all.
## Summary
We were reading an `.egg-info` file from the root directory that didn't
apply to the root member -- it was for another workspace member. I think
this is driven by some idiosyncrasies in the `setuptools` setup for
that workspace member, but it's still wrong to fail.
This PR adds a few measures to fix this:
1. We validate the `egg-info` filename against the package metadata (see
the sketch after this list).
2. We skip, rather than fail, if we see incorrect metadata in an
`egg-info` file or similar. This is an optimization anyway; worst case,
we try to build the package, then fail there.
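As a sketch of what the filename validation in point 1 amounts to (hypothetical helpers with heavily simplified name normalization, not uv's actual code):
```rust
/// Sketch: lowercase and unify separators so `my-package` and `my_package`
/// compare equal.
fn normalize(name: &str) -> String {
    name.to_ascii_lowercase().replace('-', "_").replace('.', "_")
}

/// Sketch: does an `.egg-info` filename plausibly belong to the expected package?
/// setuptools writes `{name}.egg-info` or `{name}-{version}.egg-info`, with `-`
/// in the name escaped to `_`.
fn egg_info_matches(file_name: &str, expected_package: &str) -> bool {
    let Some(stem) = file_name.strip_suffix(".egg-info") else {
        return false;
    };
    let name = stem.split('-').next().unwrap_or(stem);
    normalize(name) == normalize(expected_package)
}

fn main() {
    assert!(egg_info_matches("my_package-1.0.0.egg-info", "my-package"));
    assert!(!egg_info_matches("other_member.egg-info", "my-package"));
}
```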
Closes https://github.com/astral-sh/uv/issues/9743.
The `SysVersion` registry entry may or may not include the patch
version, so if we encounter a registry entry without a patch version, we
must not assume that the patch version is 0.
```
Name Property
---- --------
3.9 DisplayName : Python 3.9 (64-bit)
SupportUrl : https://www.python.org/
Version : 3.9.13
SysVersion : 3.9
SysArchitecture : 64bit
Hive: HKEY_CURRENT_USER\Software\Python\PythonCore\3.9
```
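A minimal sketch of the idea, with hypothetical types rather than uv's actual registry code: keep the patch component optional instead of defaulting it to 0:
```rust
/// Sketch: parse a registry `SysVersion` such as "3.9" or "3.9.13" without
/// inventing a patch version when it's absent.
#[derive(Debug, PartialEq)]
struct RegistryVersion {
    major: u32,
    minor: u32,
    patch: Option<u32>, // `None` means "unknown", not "0"
}

fn parse_sys_version(s: &str) -> Option<RegistryVersion> {
    let mut parts = s.split('.');
    let major = parts.next()?.parse().ok()?;
    let minor = parts.next()?.parse().ok()?;
    let patch = match parts.next() {
        Some(p) => Some(p.parse().ok()?),
        None => None,
    };
    Some(RegistryVersion { major, minor, patch })
}

fn main() {
    assert_eq!(
        parse_sys_version("3.9"),
        Some(RegistryVersion { major: 3, minor: 9, patch: None })
    );
    assert_eq!(
        parse_sys_version("3.9.13"),
        Some(RegistryVersion { major: 3, minor: 9, patch: Some(13) })
    );
}
```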
Confirmed the fix manually.
Fixes #9668
## Summary
This PR allows users to specify a source both in `project.dependencies`
("production") and `tool.uv.sources` ("development"). It's not intended
as a holistic fix for "production" vs. "development" dependencies, but
in some cases this is good enough with `--no-sources`, and I don't see a
great reason for enforcing it right now.
Closes: https://github.com/astral-sh/uv/issues/9682
Ref: https://github.com/astral-sh/uv/issues/7945 (but I'll leave this
open?)
## Summary
Before:
```console
$ cargo run -- --version
uv 0.5.7 (b17902da0 2024-12-09)
```
After:
```console
$ cargo run -- --version
uv 0.5.7+14 (7cd0ab77a 2024-12-09)
```
Currently `cargo run -- --version` does not include the number of
commits since the last tag, because `cargo-dist` creates non-annotated
tags, and
`git log -1 --date=short --abbrev=9 --format='%H %h %cd %(describe)'`
uses only annotated tags by default.
```console
$ git log -1 --date=short --abbrev=9 --format='%H %h %cd %(describe)'
7cd0ab77a97cd0ab77a 2024-12-09
```
To include these tags, use `git log -1 --date=short --abbrev=9
--format='%H %h %cd %(describe:tags)'`, which will display:
```console
$ git log -1 --date=short --abbrev=9 --format='%H %h %cd %(describe:tags)'
7cd0ab77a97cd0ab77a 2024-12-09 0.5.7-14-g7cd0ab77a
```
## Summary
Sort of ridiculous, but today this passes, when it should fail:
```toml
[project]
name = "foo"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.13.0"
dependencies = []
[project.optional-dependencies]
async = [
    "foo[async]==0.2.0",
]
```
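For illustration, a simplified sketch of the check this implies (hypothetical helper that only understands exact `==` pins, not a real specifier evaluator): a project that depends on itself must use a specifier that admits its own version:
```rust
/// Sketch: a self-dependency pinned to a version other than the project's own
/// version can never be satisfied.
fn self_dependency_is_satisfiable(project_version: &str, requirement: &str) -> bool {
    match requirement.split_once("==") {
        Some((_, pinned)) => pinned.trim() == project_version,
        None => true, // no exact pin; assume satisfiable in this sketch
    }
}

fn main() {
    // `foo` is version 0.1.0 but asks for `foo[async]==0.2.0`: unsatisfiable.
    assert!(!self_dependency_is_satisfiable("0.1.0", "foo[async]==0.2.0"));
    assert!(self_dependency_is_satisfiable("0.1.0", "foo[async]==0.1.0"));
}
```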
## Summary
In the end, the problem is that `relative_to` has incorrect behavior if
either path is non-normalized (e.g., `foo/bar/../project`). So I've fixed
that method, but we _also_ now normalize `project` upfront, which _also_
fixes the issue.
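A minimal sketch of the lexical normalization involved, assuming no symlinks need to be resolved (hypothetical helper, not the exact uv code):
```rust
use std::path::{Component, Path, PathBuf};

/// Sketch: lexically normalize a path like `foo/bar/../project` to
/// `foo/project` so that a `relative_to`-style computation compares
/// like with like.
fn normalize_lexically(path: &Path) -> PathBuf {
    let mut out = PathBuf::new();
    for component in path.components() {
        match component {
            Component::CurDir => {}
            Component::ParentDir => match out.components().next_back() {
                // Pop the previous normal component...
                Some(Component::Normal(_)) => {
                    out.pop();
                }
                // ...but keep a leading `..`, since we can't resolve it lexically.
                _ => out.push(".."),
            },
            other => out.push(other),
        }
    }
    out
}

fn main() {
    assert_eq!(
        normalize_lexically(Path::new("foo/bar/../project")),
        PathBuf::from("foo/project")
    );
}
```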
Closes https://github.com/astral-sh/uv/issues/9692.
## Summary
Small thing I noticed while working on another change: if we error when
extracting `requires-dist`, we go through the full metadata build. We
need to distinguish between fatal errors and "the data isn't static".
## Summary
This is an alternative to #9344. If accepted, I need to audit the
codebase and call sites to apply it everywhere, but the basic idea is:
rather than encoding mutually-incompatible pairs of markers in the
representation itself, we have an additional method on `MarkerTree` that
expands the false-y definition to take into account assumptions about
which markers can be true alongside others. We then check if the
current marker implies that at least one of them is true.
So, for example, we know that `sys_platform == 'win32'` and
`platform_system == 'Darwin'` are mutually exclusive. When given a
marker expression like `python_version >= '3.7'`, we test if
`python_version >= '3.7'` and `sys_platform != 'win32' or
platform_system != 'Darwin'` are disjoint, i.e., if the following can't
be satisfied:
```
python_version >= '3.7' and (sys_platform != 'win32' or platform_system != 'Darwin')
```
Since, if this can't be satisfied, it implies that the left-hand
expression requires `sys_platform == 'win32'` and `platform_system ==
'Darwin'` to be true at the same time.
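To make the check concrete, here is a brute-force sketch over two boolean atoms (hypothetical types, not the actual `MarkerTree` API). A marker is treated as effectively false when conjoining it with `not a or not b` is unsatisfiable, because that means the marker can only hold when both mutually exclusive atoms hold at once:
```rust
/// Sketch: atom 0 is "sys_platform == 'win32'", atom 1 is
/// "platform_system == 'Darwin'"; the two are known to be mutually exclusive.
#[derive(Clone)]
enum Expr {
    Atom(usize),
    Not(Box<Expr>),
    And(Box<Expr>, Box<Expr>),
    Or(Box<Expr>, Box<Expr>),
    True,
}

fn eval(expr: &Expr, assignment: &[bool]) -> bool {
    match expr {
        Expr::Atom(i) => assignment[*i],
        Expr::Not(e) => !eval(e, assignment),
        Expr::And(a, b) => eval(a, assignment) && eval(b, assignment),
        Expr::Or(a, b) => eval(a, assignment) || eval(b, assignment),
        Expr::True => true,
    }
}

/// Brute-force satisfiability over the two atoms.
fn satisfiable(expr: &Expr) -> bool {
    (0..4).any(|bits| eval(expr, &[(bits & 1) != 0, (bits & 2) != 0]))
}

/// `marker and (not atom0 or not atom1)` unsatisfiable => `marker` forces both
/// mutually exclusive atoms at once, so it is effectively false.
fn effectively_false(marker: &Expr) -> bool {
    let escape = Expr::Or(
        Box::new(Expr::Not(Box::new(Expr::Atom(0)))),
        Box::new(Expr::Not(Box::new(Expr::Atom(1)))),
    );
    !satisfiable(&Expr::And(Box::new(marker.clone()), Box::new(escape)))
}

fn main() {
    // sys_platform == 'win32' and platform_system == 'Darwin': effectively false.
    let contradiction = Expr::And(Box::new(Expr::Atom(0)), Box::new(Expr::Atom(1)));
    assert!(effectively_false(&contradiction));
    // An unrelated, always-true marker (standing in for python_version >= '3.7') is fine.
    assert!(!effectively_false(&Expr::True));
}
```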
I think the main downsides here are:
1. We can't _simplify_ markers based on these implications. So we'd
still write markers like `sys_platform == 'win32' and platform_system !=
'Darwin'`, even though we know the latter expression is redundant.
2. It might be expensive? I'm not sure. I don't think we test for
falseness _that_ often though.
Closes #7760.
Closes #9275.
Instead of modifying the error to replace a dummy derivation chain from
construction with the real one, build the error with the real derivation
chain directly.
This came up when trying to improve the build error reporting.
Introduces `DistErrorKind` to avoid error variants for each case that
are only different in one line of the message.
Add a preview option `uv init --build-backend uv --preview` that uses
the uv build backend when generating the project. The uv build backend
is in preview, so the option is also guarded by preview and hidden from
the help message and docs.
For https://github.com/astral-sh/uv/issues/3957#issuecomment-2518757563
In https://github.com/astral-sh/uv/issues/8155#issuecomment-2508969900,
resolution `lowest` was complaining about missing lower bounds for a
package, even though the package had a URL, too:
```
uv pip install dist/pymatgen-2024.10.3.tar.gz pymatgen[ci,optional] --resolution=lowest
```
The error was raised from `pymatgen[ci,optional]`, because we were
looking at it before looking at the "URL"
`dist/pymatgen-2024.10.3.tar.gz`.
I've also added constraints and overrides to the bounds lookup, since
they are missing from the dependency graph.
Fixes #8155 (again)
## Summary
Closes #9643.
I modified the `commit` fn so this applies to `uv compile --output-file`
too. But I can move it to the export module if we want to restrict this
to `uv export` only.
## Test Plan
`cargo test`
This is like #9556, but at the level of all other builds, including the
resolver and installer. Going through PEP 517 to build a package is
slow, so when building a package with the uv build backend, we can call
into the uv build backend directly instead: No temporary virtual env, no
temp venv sync, no python subprocess calls, no uv subprocess calls.
This fast path is gated through preview. Since the uv wheel is not
available at test time, I've manually confirmed the feature by comparing
`uv venv && cargo run pip install . -v --preview --reinstall .` and `uv
venv && cargo run pip install . -v --reinstall .`. When hacking the
preview so that the Python uv build backend works without the setting and
the direct build also works (wheel built with `maturin build --profile
profiling`), we can see the performance difference:
```
$ hyperfine --prepare "uv venv" --warmup 3 \
"UV_PREVIEW=1 target/profiling/uv pip install --no-deps --reinstall scripts/packages/built-by-uv --preview" \
"target/profiling/uv pip install --no-deps --reinstall scripts/packages/built-by-uv --find-links target/wheels/"
Benchmark 1: UV_PREVIEW=1 target/profiling/uv pip install --no-deps --reinstall scripts/packages/built-by-uv --preview
  Time (mean ± σ):      33.1 ms ±   2.5 ms    [User: 25.7 ms, System: 13.0 ms]
  Range (min … max):    29.8 ms …  47.3 ms    73 runs

Benchmark 2: target/profiling/uv pip install --no-deps --reinstall scripts/packages/built-by-uv --find-links target/wheels/
  Time (mean ± σ):     115.1 ms ±   4.3 ms    [User: 54.0 ms, System: 27.0 ms]
  Range (min … max):   109.2 ms … 123.8 ms    25 runs

Summary
  UV_PREVIEW=1 target/profiling/uv pip install --no-deps --reinstall scripts/packages/built-by-uv --preview ran
    3.48 ± 0.29 times faster than target/profiling/uv pip install --no-deps --reinstall scripts/packages/built-by-uv --find-links target/wheels/
```
Do we need a global option to disable the fast path? There is one for
`uv build` because `--force-pep517` moves `uv build` much closer to a
`pip install` from source that a user of a library would experience (see the
discussion at #9610), but uv overall doesn't really make guarantees
around the build env of dependencies, so I consider the direct build a
valid option.
Best reviewed commit-by-commit, only the last commit is the actual
implementation, while the preview mode introduction is just a
refactoring touching too many files.
When building a wheel from a source distribution or both a source
distribution and a wheel, the versions in their filenames must be the
same.
By inspecting the filenames, we also assert that the filenames from the
build are valid (we don't enforce normalization, though, just that uv can
parse them).
Note that we're not yet checking that the `pyproject.toml` version, if
declared, and the METADATA version match as well.
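As a sketch of the filename-level check (hypothetical, heavily simplified parsing compared to real distribution filename handling):
```rust
/// Sketch: extract the version from `{name}-{version}.tar.gz`.
fn sdist_version(file_name: &str) -> Option<&str> {
    let stem = file_name.strip_suffix(".tar.gz")?;
    Some(stem.rsplit_once('-')?.1)
}

/// Sketch: extract the version from `{name}-{version}-{python}-{abi}-{platform}.whl`.
fn wheel_version(file_name: &str) -> Option<&str> {
    let stem = file_name.strip_suffix(".whl")?;
    stem.split('-').nth(1)
}

fn main() {
    let sdist = sdist_version("built_by_uv-0.1.0.tar.gz");
    let wheel = wheel_version("built_by_uv-0.1.0-py3-none-any.whl");
    assert_eq!(sdist, Some("0.1.0"));
    // The sdist and wheel produced by one build must agree on the version.
    assert_eq!(sdist, wheel);
}
```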
When running `lock_requires_python_exact`, we would download CPython
3.12.0 each time. By instead downloading CPython 3.13.0 ahead of time
and passing it in, we speed the test up and avoid timeouts. Locally in
PyCharm, the test goes from 6.5s to 500ms.
When encountering `dynamic = ["version"]` in the pyproject.toml of a
source dist, we can ignore that and treat it as a statically known
metadata distribution, since the filename tells us the version and that
version must not change on build.
This fixed locking PyGObject 3.50.0 from `pygobject-3.50.0.tar.gz`
(minimized):
```toml
[project]
name = "PyGObject"
description = "Python bindings for GObject Introspection"
requires-python = ">=3.9, <4.0"
dependencies = [
    "pycairo>=1.16"
]
dynamic = ["version"]
```
Afterwards, `uv add --no-sync toga` passes on Ubuntu 24.04 without the
pygobject build deps, when previously it needed `{ name = "pygobject",
version = "3.50.0", requires-dist = [], requires-python = ">=3.9" }`.
I've added a check that source distribution versions are respected after
build.
Fixes #9548
Add `uv build --list`, a "subcommand" to list the files that would
be included when building a distribution. It does not build the
distribution, except when a source dist is required for the source dist ->
wheel step. This is an important debugging tool for the include and exclude
options: Did I actually include the files I wanted, or am I shipping a
broken distribution? Are there any temporary files I still need to
exclude?
Cargo offers this as `cargo package --list`.
`--list` is preview-exclusive, since it requires the fast path, which I
also put into preview.
Examples: (screenshots omitted)
I'll fix the error handling in a follow-up.
Tagging as enhancement because it changes the stable output slightly
(two lines instead of one).
CC @charliermarsh for uv-wide consistency in the stdout/stderr handling.
## Summary
This change introduces the `UV_NO_INSTALLER_METADATA` environment variable
as a way to opt out of the extra installer metadata files that `uv` creates.
This is important to achieve reproducible builds in distribution
packaging, allowing usage of
[installer](https://pypi.org/project/installer) to be replaced with `uv pip install`.
At the time of writing, these files are:
- `uv_cache.json`
  Contains timestamps, which are non-reproducible.
  These hashes also leak into the `RECORD` file.
- `direct_url.json`
  Contains the path to the installed wheel.
  While not non-reproducible, it's not required for distribution packaging.
- `INSTALLER`
  Again, not non-reproducible, but of no value in distribution packaging.
## Test Plan
Automated test added.
---------
Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>