Instead of constructing the error with a dummy derivation chain and
replacing it with the real one afterwards, build the error with the real
derivation chain directly.
This came up when trying to improve the build error reporting.
Introduces `DistErrorKind` to avoid having a separate error variant for
each case when they differ only in one line of the message.
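Roughly, the shape is a single kind enum selecting the message prefix; a minimal sketch (the variants below are illustrative, not uv's actual set):
```rust
use std::fmt;

// Sketch only: one `DistErrorKind` selects the message prefix, so callers
// build a single error type instead of choosing between near-identical
// variants. The concrete kinds in uv differ from these.
#[derive(Debug)]
enum DistErrorKind {
    Download,
    Build,
    Read,
}

impl fmt::Display for DistErrorKind {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        let prefix = match self {
            DistErrorKind::Download => "Failed to download",
            DistErrorKind::Build => "Failed to build",
            DistErrorKind::Read => "Failed to read",
        };
        write!(f, "{prefix}")
    }
}
```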
Add a preview option `uv init --build-backend uv --preview` that uses
the uv build backend when generating the project. The uv build backend
is in preview, so the option is also guarded by preview and hidden from
the help message and docs.
For https://github.com/astral-sh/uv/issues/3957#issuecomment-2518757563
In https://github.com/astral-sh/uv/issues/8155#issuecomment-2508969900,
the `lowest` resolution mode was complaining about missing lower bounds
for a package, even though the package had a URL, too:
```
uv pip install dist/pymatgen-2024.10.3.tar.gz pymatgen[ci,optional] --resolution=lowest
```
The error was raised from `pymatgen[ci,optional]`, because we were
looking at it before looking at the "URL"
`dist/pymatgen-2024.10.3.tar.gz`.
I've also added constraints and overrides to the bounds lookup, since
they are missing from the dependency graph.
Fixes #8155 (again)
## Summary
Closes #9643.
I modified the `commit` fn so this applies to `uv compile --output-file`
too. But I can move it to the export module if we want to restrict this
to `uv export` only.
## Test Plan
`cargo test`
This is like #9556, but at the level of all other builds, including the
resolver and installer. Going through PEP 517 to build a package is
slow, so when building a package with the uv build backend, we can call
into the uv build backend directly instead: no temporary virtual env, no
temp venv sync, no Python subprocess calls, no uv subprocess calls.
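Conceptually, the dispatch is a simple gate; a minimal sketch, assuming only the declared backend name and the preview flag are checked (the names below are illustrative, not uv's API):
```rust
// Sketch of the fast-path decision: with preview on and the uv build
// backend declared, build in-process; otherwise drive PEP 517 hooks.
#[derive(Debug, PartialEq)]
enum BuildStrategy {
    /// Call into the uv build backend directly: no venv, no subprocesses.
    DirectUvBuild,
    /// Build through PEP 517 hooks in a temporary build environment.
    Pep517,
}

fn choose_strategy(build_backend: &str, preview: bool) -> BuildStrategy {
    if preview && build_backend == "uv" {
        BuildStrategy::DirectUvBuild
    } else {
        BuildStrategy::Pep517
    }
}

fn main() {
    assert_eq!(choose_strategy("uv", true), BuildStrategy::DirectUvBuild);
    assert_eq!(choose_strategy("hatchling.build", true), BuildStrategy::Pep517);
    assert_eq!(choose_strategy("uv", false), BuildStrategy::Pep517);
}
```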
This fast path is gated through preview. Since the uv wheel is not
available at test time, I've manually confirmed the feature by comparing
`uv venv && cargo run pip install . -v --preview --reinstall .` and
`uv venv && cargo run pip install . -v --reinstall .`. When hacking the
preview gate so that both the Python uv build backend and the direct
build work without the setting (wheel built with `maturin build
--profile profiling`), we can see the performance difference:
```
$ hyperfine --prepare "uv venv" --warmup 3 \
"UV_PREVIEW=1 target/profiling/uv pip install --no-deps --reinstall scripts/packages/built-by-uv --preview" \
"target/profiling/uv pip install --no-deps --reinstall scripts/packages/built-by-uv --find-links target/wheels/"
Benchmark 1: UV_PREVIEW=1 target/profiling/uv pip install --no-deps --reinstall scripts/packages/built-by-uv --preview
  Time (mean ± σ):      33.1 ms ±   2.5 ms    [User: 25.7 ms, System: 13.0 ms]
  Range (min … max):    29.8 ms …  47.3 ms    73 runs

Benchmark 2: target/profiling/uv pip install --no-deps --reinstall scripts/packages/built-by-uv --find-links target/wheels/
  Time (mean ± σ):     115.1 ms ±   4.3 ms    [User: 54.0 ms, System: 27.0 ms]
  Range (min … max):   109.2 ms … 123.8 ms    25 runs

Summary
  UV_PREVIEW=1 target/profiling/uv pip install --no-deps --reinstall scripts/packages/built-by-uv --preview ran
    3.48 ± 0.29 times faster than target/profiling/uv pip install --no-deps --reinstall scripts/packages/built-by-uv --find-links target/wheels/
```
Do we need a global option to disable the fast path? There is one for
`uv build` because `--force-pep517` moves `uv build` much closer to a
`pip install` from source that a user of a library would experience (See
discussion at #9610), but uv overall doesn't really make guarantees
around the build env of dependencies, so I consider the direct build a
valid option.
Best reviewed commit-by-commit: only the last commit is the actual
implementation, while the preview mode introduction is just a
refactoring that touches a lot of files.
When building a wheel from a source distribution or both a source
distribution and a wheel, the versions in their filenames must be the
same.
By inspecting the filenames, we also assert that the filenames from the
build are valid (we don't enforce normalization though, just that uv can
parse them).
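As a rough illustration of the check, a naive sketch (uv uses its real filename parsers; details such as name normalization are ignored here):
```rust
// Extract the version component from sdist and wheel filenames and
// assert that a build produced matching versions.
fn sdist_version(filename: &str) -> Option<&str> {
    // "foo-1.2.3.tar.gz" -> "1.2.3"
    filename.strip_suffix(".tar.gz")?.rsplit_once('-').map(|(_, v)| v)
}

fn wheel_version(filename: &str) -> Option<&str> {
    // "foo-1.2.3-py3-none-any.whl" -> "1.2.3"
    filename.strip_suffix(".whl")?.split('-').nth(1)
}

fn main() {
    let sdist = "foo-1.2.3.tar.gz";
    let wheel = "foo-1.2.3-py3-none-any.whl";
    assert_eq!(sdist_version(sdist), wheel_version(wheel));
}
```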
Note that we're not yet checking that the `pyproject.toml` version, if
declared, and the METADATA version match as well.
When running `lock_requires_python_exact`, we would download CPython
3.12.0 each time. By instead downloading CPython 3.13.0 ahead of time
and passing it in, we speed the test up and avoid timeouts. Locally in
PyCharm, the test goes from 6.5s to 500ms.
When encountering `dynamic = ["version"]` in the pyproject.toml of a
source dist, we can ignore that and treat it as a statically known
metadata distribution, since the filename tells us the version and that
version must not change on build.
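The decision rule is small; a minimal sketch, assuming we only need to know whether any field other than the version is dynamic (the function name is hypothetical):
```rust
// `dynamic = ["version"]` alone doesn't force a build: the sdist filename
// already pins the version. Any other dynamic field still requires
// building to resolve the metadata.
fn requires_build_for_metadata(dynamic_fields: &[&str]) -> bool {
    dynamic_fields.iter().any(|field| *field != "version")
}

fn main() {
    assert!(!requires_build_for_metadata(&["version"]));
    assert!(requires_build_for_metadata(&["version", "dependencies"]));
}
```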
This fixed locking PyGObject 3.50.0 from `pygobject-3.50.0.tar.gz`
(minimized):
```toml
[project]
name = "PyGObject"
description = "Python bindings for GObject Introspection"
requires-python = ">=3.9, <4.0"
dependencies = [
"pycairo>=1.16"
]
dynamic = ["version"]
```
Afterwards, `uv add --no-sync toga` passes on Ubuntu 24.04 without the
pygobject build deps, whereas previously it needed `{ name = "pygobject",
version = "3.50.0", requires-dist = [], requires-python = ">=3.9" }`.
I've added a check that source distribution versions are respected after
build.
Fixes #9548
Add `uv build --list`, a "subcommand" to list the files that would
be included when building a distribution. It does not build the
distribution, except when a source dist is required for the source dist
-> wheel build. This is an important debugging tool for the include and
exclude options: Did I actually include the files I wanted, or am I
shipping a broken distribution? Are there any temporary files I still
need to exclude?
Cargo offers this as `cargo package --list`.
`--list` is preview-exclusive, since it requires the fast path, which I
also put into preview.
I'll fix the error handling in a follow-up.
Tagging as enhancement because it changes the stable output slightly
(two lines instead of one).
CC @charliermarsh for uv-wide consistency in the stdout/stderr handling.
## Summary
This change introduces the `UV_NO_INSTALLER_METADATA` environment
variable
as a way to opt out of the extra installer metadata files that `uv`
creates.
This is important to achieve reproducible builds in distribution
packaging, allowing usage of
[installer](https://pypi.org/project/installer) to be replaced with `uv pip install`.
At the time of writing, these files are:
- `uv_cache.json`
  Contains timestamps, which are non-reproducible.
  These hashes also leak into the `RECORD` file.
- `direct_url.json`
  Contains the path to the installed wheel.
  While reproducible, it's not required for distribution packaging.
- `INSTALLER`
  Again reproducible, but of no value in distribution packaging.
## Test Plan
Automated test added.
---------
Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
## Summary
Today, our dependency group implementation is a little awkward... For
each package `P`, we check if `P` contains dependencies for each enabled
group, then add a dependency on `P` with the group enabled. There are a
few issues here:
1. It's sort of backwards... We add a dependency from the base package
`P` to `P` with the group enabled. Then `P` with the group enabled adds
a dependency on the base package.
2. We can't, e.g., enable different groups for different packages. (We
don't have a way for users to specify this on the CLI, but there's no
reason that it should be _impossible_ in the resolver.)
3. It's inconsistent with how extras work, which leads to confusing
differences in the resolver.
Instead, our internal requirement type can now include dependency
groups, which makes dependency groups look much, much more like extras
in the resolver.
Using the directory writer trait, we can collect the files instead of
writing them to a real sink. This builds up to a `uv build --list`
similar to `cargo package --list`. It is not connected to the CLI yet.
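A minimal sketch of the idea, with illustrative method names rather than the trait's actual signature: the archive writer streams bytes into the source dist or wheel, while a listing writer only records which paths would have been written.
```rust
use std::collections::BTreeSet;
use std::io;
use std::path::Path;

// The sink abstraction: real implementations write into a tarball or zip.
trait DirectoryWriter {
    fn write_file(&mut self, target: &Path, contents: &[u8]) -> io::Result<()>;
}

// Collects the file list instead of writing, for a future `uv build --list`.
#[derive(Default)]
struct FileListWriter {
    files: BTreeSet<String>,
}

impl DirectoryWriter for FileListWriter {
    fn write_file(&mut self, target: &Path, _contents: &[u8]) -> io::Result<()> {
        self.files.insert(target.display().to_string());
        Ok(())
    }
}

fn main() -> io::Result<()> {
    let mut writer = FileListWriter::default();
    writer.write_file(Path::new("foo/__init__.py"), b"")?;
    writer.write_file(Path::new("foo-1.0.0.dist-info/METADATA"), b"")?;
    for file in &writer.files {
        println!("{file}");
    }
    Ok(())
}
```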
## Summary
Discovered while working on https://github.com/astral-sh/uv/issues/9516.
In the linked repo, the root uses a `../dependency` path for the
workspace member, which we weren't normalizing.
## Summary
If a Git repository uses a `path` dependency (rather than a
`workspace`), we need to expand the path to make it relative to the Git
root.
Closes https://github.com/astral-sh/uv/issues/9516.
## Summary
Include the `git_member` when fetching metadata from cache.
h/t to @PhilipVinc for the suggested fix
Resolves #8887
## Test Plan
Pending
---------
Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
This pull request is best viewed with [whitespace
hidden](https://github.com/astral-sh/uv/pull/8650/files?diff=unified&w=1)
Adds a `--default` flag to `uv python install` in preview. This includes
a `python` and `python{major}` executable in addition to the
`python{major}.{minor}` executable. We will replace uv-managed
executables, but externally managed executables require the `--force`
flag to overwrite.
If you run `uv python install` (without arguments), we include the
`--default` flag implicitly to populate `python` and `python3` for the
"default" install version.
In the future, we should add a warning if the installed executable isn't
at the front of the PATH.
## Summary
Fixes #9027
Minor enhancement on top of #8531 that makes the CLI parameter
`--check-url` also available as the setting `check-url` in configuration
files.
## Test Plan
Updates existing tests to take the new setting into account.
Within publish command testing I didn't see existing tests covering
settings from toml files (instead of from CLI params), so I didn't add
anything of that sort.
## Summary
On Windows, non-virtual environments put the `python.exe` in the
top-level of the installation directory, rather than in `Scripts`. This
PR adds those paths to `PATH` in `uv run` and `uv tool run`.
Closes
https://github.com/astral-sh/uv/issues/9574#issuecomment-2512217110.
This _partially_ unwinds the optimization in #9540 by adding back the
base package dependency as a sibling to the extra package dependency
in some cases. Specifically, this occurs when _any_ of the extras are
declared as conflicting.
This is believed to be necessary (until another method is found) to
handle the forking logic based on conflicts. Namely, the forking logic
depends on the base and extra packages being sibling dependencies. If
only the extra is present, then it won't be included in the fork that
excludes all conflicting extras. And that means the base package won't
either, even though it should be included in that fork in some cases. If
the base package dependency is deferred, then it will never be reached.
This also adds another test and updates the snapshots that would have
caught the regression in #9540 if the conflict tests had been enabled.
Embarrassingly, PR #9474 moved the conflicting extras/groups tests into
their own module, but never actually included the module in
`it/main.rs`.
This adds `lock_conflict` to `main.rs` and fixes the fallout.
For listing files, we first use a directory writer for source dists,
which we will later use to collect the filenames instead of writing the
archive. I've split breaking uv-build-backend's `lib.rs` into modules
out into the next PR.
No logic changes, only restructuring.
Best reviewed commit-by-commit
Going through PEP 517 to build a package is slow, so when building a
package with the uv build backend, we can call into the uv build backend
directly. This is the basis for `uv build --list`.
This does not enable the fast path for general source dependencies.
There is a possible difference in execution if the latest uv version is
newer than the one currently running: The PEP 517 path would use the
latest version, while the fast path uses the current version.
Please review commit-by-commit
### Benchmark
`built_with_uv`, using the fast path:
```
$ hyperfine "~/projects/uv/target/profiling/uv build"
Time (mean ± σ): 9.2 ms ± 1.1 ms [User: 4.6 ms, System: 4.6 ms]
Range (min … max): 6.4 ms … 12.7 ms 290 runs
```
`hatcling_editable`, with hatchling being optimized for fast startup
times:
```
$ hyperfine "~/projects/uv/target/profiling/uv build"
Time (mean ± σ): 270.5 ms ± 18.4 ms [User: 230.8 ms, System: 44.5 ms]
Range (min … max): 250.7 ms … 298.4 ms 10 runs
```
In the course of working on #9289, I've had to devise
some additions to our markers. While we are still staying
strictly compatible with the PEP 508 format, we will be
abusing the `extra` expression to carry a lot more
information.
Specifically, we want the following additional
operations:
* Simplify `extra != 'foo'`
* Remove all extra expressions
* Remove everything except extra expressions
My work on #9289 requires all of these (which will be
in a future PR).
## Summary
This proposes adding the command line option `uv pip uninstall --dry-run
...`, complementing the existing `uv pip install --dry-run ...` added
for #1244 in #1436.
This option does not exist in PyPA's `pip uninstall`; if adopted, it
would be unique to `uv pip`. The code should be considered a PoC; it is
baby's first Rust.
The initial motivation was while investigating
https://github.com/moreati/ansible-uv/issues/2 - to allow Ansible module
`moreati.uv.pip` to work with `state: absent` in "check_mode" (Ansible's
equivalent of a dry run), without requiring `packaging` or `setuptools`.
## Test Plan
One new unit test has been added. I pledge to add more if the feature is
desired/accepted.
Example usage
```console
➜ uv git:(pip-uninstall--dry-run) rm -rf .venv
➜ uv git:(pip-uninstall--dry-run) ./target/debug/uv venv
Using CPython 3.13.0
Creating virtual environment at: .venv
Activate with: source .venv/bin/activate
➜ uv git:(pip-uninstall--dry-run) ./target/debug/uv pip install httpx
Resolved 7 packages in 178ms
Prepared 5 packages in 60ms
Installed 7 packages in 15ms
+ anyio==4.6.2.post1
+ certifi==2024.8.30
+ h11==0.14.0
+ httpcore==1.0.7
+ httpx==0.28.0
+ idna==3.10
+ sniffio==1.3.1
➜ uv git:(pip-uninstall--dry-run) ./target/debug/uv pip uninstall --dry-run httpx
Would uninstall 1 package
- httpx==0.28.0
➜ uv git:(pip-uninstall--dry-run) ./target/debug/uv pip list
Package  Version
-------- -----------
anyio    4.6.2.post1
certifi  2024.8.30
h11      0.14.0
httpcore 1.0.7
httpx    0.28.0
idna     3.10
sniffio  1.3.1
```
---------
Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
Fixes #9531
## Context
While working with [uv](https://github.com/astral-sh/uv), I encountered
issues with a Python dependency, [httpx](https://www.python-httpx.org/),
which could not be installed because of an **os error 5 (permission denied)**.
The error occurs when we try to persist a **.exe file** from a temporary
folder into a persistent one.
I could only reproduce the issue on an enterprise **Windows** Jenkins
runner. In my virtual machines, I don't have any issues, so I think this
most probably comes from the system configuration. This Windows runner
**contains an AV/EDR**, and the fact that the file lock occurred only once,
for an executable, makes me think that it's most probably the cause.
After doing some research and speaking with some colleagues (hi
@vmeurisse), it seems that this issue is a very recurrent one on Windows.
In the JavaScript ecosystem, there is a package created by @isaacs, the
`npm` inventor: https://www.npmjs.com/package/graceful-fs. It is used
inside `npm` and allows its package installations to be more resilient to
filesystem errors:
> The improvements are meant to normalize behavior across different
platforms and environments, and to make filesystem access more resilient
to errors.
One of its core features is this one:
> On Windows, it retries renaming a file for up to one second if EACCESS
or EPERM error occurs, likely because antivirus software has locked the
directory.
So I tried to implement the same algorithm in `uv`, **and it fixed my
issue**! I can finally install `httpx`.
Then, [as I mentioned in this
issue](https://github.com/astral-sh/uv/issues/9531#issuecomment-2508981316),
I saw that you already implemented exactly the same algorithm in an
asynchronous function for renames 😄:
`22fd9f7ff1/crates/uv-fs/src/lib.rs` (L221)
## Summary of changes
- I added a similar function for `persist` (it was not easy to get the
blessing of the borrow checker 😄)
- I added a `sync` variant of `rename_with_retry` (sketched below)
- I edited `install_script` to use the function that includes retries on
Windows
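For illustration, a minimal sketch of what such a sync retry looks like; this is not uv's actual `rename_with_retry`, and the timings and error kinds here are assumptions:
```rust
use std::io;
use std::path::Path;
use std::thread::sleep;
use std::time::{Duration, Instant};

// On Windows, antivirus software may briefly lock a freshly written file,
// so retry the rename for up to ~1 second on permission errors.
fn rename_with_retry_sync(from: &Path, to: &Path) -> io::Result<()> {
    let deadline = Instant::now() + Duration::from_secs(1);
    loop {
        match std::fs::rename(from, to) {
            Err(err)
                if err.kind() == io::ErrorKind::PermissionDenied
                    && Instant::now() < deadline =>
            {
                sleep(Duration::from_millis(10));
            }
            result => return result,
        }
    }
}

fn main() -> io::Result<()> {
    std::fs::write("example.tmp", b"contents")?;
    rename_with_retry_sync(Path::new("example.tmp"), Path::new("example.txt"))
}
```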
Let me know if I should change anything 🙂
Thanks!!
## Test Plan
This pull request should be functionally identical, so I think it should
be covered by existing tests in case of regression.
All tests are still passing on my side.
Also, I of course validated that my Windows machines (Windows 10 and
Windows 11) with AV/EDR software are now able to install the `httpx.exe`
script.
However, if you think any additional test is needed, feel free to tell
me!
When looking at the build frontend code, I noticed that we always pass
every single field of the shared state to the build dispatch:
```rust
let build_dispatch = BuildDispatch::new(
    ...
    &state.index,
    &state.git,
    &state.capabilities,
    &state.in_flight,
    ...
);
```
We can abstract this by moving `SharedState` into the build dispatch.
The `BuildDispatch` then has only immutable fields and the
`SharedState`. Since the `SharedState` is all `Arc`s, we can clone it
freely.
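A minimal sketch of the shape after the refactoring; the field types are placeholders and `BuildDispatch` here is a simplified stand-in, not uv's actual struct:
```rust
use std::sync::{Arc, Mutex};

// Everything the resolver and installer share lives behind `Arc`s in one
// struct, so it can be cloned cheaply and stored in the dispatch instead
// of threading each field through as a separate argument.
#[derive(Clone, Default)]
struct SharedState {
    index: Arc<Mutex<Vec<String>>>,        // in-memory index data
    git: Arc<Mutex<Vec<String>>>,          // resolved Git references
    capabilities: Arc<Mutex<Vec<String>>>, // index capabilities
    in_flight: Arc<Mutex<Vec<String>>>,    // in-flight downloads and builds
}

// The dispatch owns a clone of the state rather than borrowing each field.
struct BuildDispatch {
    state: SharedState,
}

impl BuildDispatch {
    fn new(state: SharedState) -> Self {
        Self { state }
    }
}

fn main() {
    let state = SharedState::default();
    // Cloning only bumps reference counts; dispatch and caller share caches.
    let dispatch = BuildDispatch::new(state.clone());
    state.index.lock().unwrap().push("example".to_string());
    assert_eq!(dispatch.state.index.lock().unwrap().len(), 1);
}
```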
## Summary
After #9524, I noticed two other dependencies were misaligned.
Since the previous PR has been merged, I was thinking I could submit
those two misses.
Of course, open to any comments/decline!
Thanks!! 🙂
## Test Plan
All unit tests are still passing on my side. Let's see with the
pull-request CI again 😄
## Summary
Previously, when we encountered `foo[bar]`, we'd add a dependency on
`PubGrubPackage::Package` for `foo`, and then `PubGrubPackage::Extra`
for `foo[bar]`.
Later, when we ask for the dependencies of the `PubGrubPackage::Extra`,
we add `PubGrubPackage::Package` for `foo`, and
`PubGrubPackage::Package` for `foo[bar]`. This is an intentional
strategy because it ensures that PubGrub "knows" that these have to be
solved to the same version as early as possible.
It turns out that the first part here ("add a dependency on
`PubGrubPackage::Package` for `foo`") is suboptimal, because it means
PubGrub might try to solve _just_ `foo` without realizing that it also
has to accommodate all the constraints from the extra.
Instead, we now add _just_ `PubGrubPackage::Extra` for `foo[bar]`, and
defer adding the base package. It looks like this leads to a far more
efficient solve for Airflow.
When adding excludes, we usually don't want to include Python cache
files. In fact, I haven't seen any project in my ecosystem research that
would want any of `__pycache__`, `*.pyc`, or `*.pyo` to be included.
Moving them behind a `default-excludes` toggle keeps them active even
when custom excludes are defined, while still letting the user deactivate
them if they so choose.
With includes and excludes being this small again, we can roll back the
include-exclude anchored difference to always using anchored globs (i.e.
you would need to use `**/build-*.h` below).
A pyproject.toml with custom settings, with the change applied:
```toml
[project]
name = "foo"
version = "0.1.0"
readme = "README.md"
license-files = ["LICENSE*", "third-party-licenses/*"]
[tool.uv.build-backend]
# A file we need for the source dist -> wheel step, but not in the wheel itself (currently unused)
source-include = ["data/build-script.py"]
# A temporary or generated file we want to ignore
source-exclude = ["/src/foo/not-packaged.txt"]
# Headers are build-only
wheel-exclude = ["build-*.h"]
[tool.uv.build-backend.data]
scripts = "scripts"
data = "assets"
headers = "header"
[build-system]
requires = ["uv>=0.5.5,<0.6"]
build-backend = "uv"
```