Compare commits

...

117 commits
0.7.14 ... main

Author SHA1 Message Date
Zanie Blue
7e48292fac
Fix handling of pre-releases in preferences (#14498)
Closes https://github.com/astral-sh/uv/issues/14485

I tested this using the reproduction in the issue. It'd be nice to add
test coverage though.
2025-07-07 20:10:35 -05:00
github-actions[bot]
e31f556205
Sync latest Python releases (#14452)
Automated update for Python releases.

Co-authored-by: zanieb <2586601+zanieb@users.noreply.github.com>
2025-07-08 00:53:38 +00:00
Zanie Blue
dedced3265
Remove cache-dependency-glob examples for setup-uv (#14493)
See https://github.com/astral-sh/uv/pull/13163#discussion_r2063244551
2025-07-07 15:06:23 -05:00
theirix
5c6d76ca8b
Update documentation for GHA to use v6 (#14490)
## Summary

`astral-sh/setup-uv@v6` is the latest version of the GitHub Action.

## Test Plan

Documentation update
2025-07-07 14:04:45 -05:00
Nils Koch
1d20530f2d
trim content of INSTALLER file (#14488)

## Summary

We are using uv as a library and `installer()` returned `"pip\n"`. The
packages were installed by the pip package manager, not by uv. pip
appears to append a newline to the `INSTALLER` file, while uv does not.
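
Roughly, the trimming could look like this (an illustrative sketch; the function name and signature are not uv's actual API):

```rust
use std::fs;
use std::io;
use std::path::Path;

/// Read a dist-info `INSTALLER` file and trim surrounding whitespace, so that
/// pip's trailing-newline variant (`"pip\n"`) compares equal to `"pip"`.
fn read_installer(dist_info: &Path) -> io::Result<String> {
    let raw = fs::read_to_string(dist_info.join("INSTALLER"))?;
    Ok(raw.trim().to_string())
}
```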

2025-07-07 18:16:50 +02:00
renovate[bot]
ddb1577a93
Update Rust crate reqwest to v0.12.22 (#14475)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [reqwest](https://redirect.github.com/seanmonstar/reqwest) | workspace.dependencies | patch | `=0.12.15` -> `=0.12.22` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>seanmonstar/reqwest (reqwest)</summary>

###
[`v0.12.22`](https://redirect.github.com/seanmonstar/reqwest/blob/HEAD/CHANGELOG.md#v01222)

[Compare
Source](https://redirect.github.com/seanmonstar/reqwest/compare/v0.12.21...v0.12.22)

- Fix socks proxies when resolving IPv6 destinations.

###
[`v0.12.21`](https://redirect.github.com/seanmonstar/reqwest/blob/HEAD/CHANGELOG.md#v01221)

[Compare
Source](https://redirect.github.com/seanmonstar/reqwest/compare/v0.12.20...v0.12.21)

- Fix socks proxy to use `socks4a://` instead of `socks4h://`.
- Fix `Error::is_timeout()` to check for hyper and IO timeouts too.
- Fix request `Error` to again include URLs when possible.
- Fix socks connect error to include more context.
- (wasm) implement `Default` for `Body`.

###
[`v0.12.20`](https://redirect.github.com/seanmonstar/reqwest/blob/HEAD/CHANGELOG.md#v01220)

[Compare
Source](https://redirect.github.com/seanmonstar/reqwest/compare/v0.12.19...v0.12.20)

- Add `ClientBuilder::tcp_user_timeout(Duration)` option to set
`TCP_USER_TIMEOUT`.
- Fix proxy headers only using the first matched proxy.
- (wasm) Fix re-adding `Error::is_status()`.

###
[`v0.12.19`](https://redirect.github.com/seanmonstar/reqwest/blob/HEAD/CHANGELOG.md#v01219)

[Compare
Source](https://redirect.github.com/seanmonstar/reqwest/compare/v0.12.18...v0.12.19)

- Fix redirect that changes the method to GET should remove payload
headers.
- Fix redirect to only check the next scheme if the policy action is to
follow.
- (wasm) Fix compilation error if `cookies` feature is enabled (by the
way, it's a noop feature in wasm).

###
[`v0.12.18`](https://redirect.github.com/seanmonstar/reqwest/blob/HEAD/CHANGELOG.md#v01218)

[Compare
Source](https://redirect.github.com/seanmonstar/reqwest/compare/v0.12.17...v0.12.18)

- Fix compilation when `socks` enabled without TLS.

###
[`v0.12.17`](https://redirect.github.com/seanmonstar/reqwest/blob/HEAD/CHANGELOG.md#v01217)

[Compare
Source](https://redirect.github.com/seanmonstar/reqwest/compare/v0.12.16...v0.12.17)

- Fix compilation on macOS.

###
[`v0.12.16`](https://redirect.github.com/seanmonstar/reqwest/blob/HEAD/CHANGELOG.md#v01216)

[Compare
Source](https://redirect.github.com/seanmonstar/reqwest/compare/v0.12.15...v0.12.16)

- Add `ClientBuilder::http3_congestion_bbr()` to enable BBR congestion
control.
- Add `ClientBuilder::http3_send_grease()` to configure whether to send
QUIC grease.
- Add `ClientBuilder::http3_max_field_section_size()` to configure the
maximum response headers.
- Add `ClientBuilder::tcp_keepalive_interval()` to configure TCP probe
interval.
- Add `ClientBuilder::tcp_keepalive_retries()` to configure TCP probe
count.
- Add `Proxy::headers()` to add extra headers that should be sent to a
proxy.
- Fix `redirect::Policy::limit()` which had an off-by-1 error, allowing
1 more redirect than specified.
- Fix HTTP/3 to support streaming request bodies.
- (wasm) Fix null bodies when calling `Response::bytes_stream()`.

</details>

---

Closes #14243

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: konstin <konstin@mailbox.org>
2025-07-07 13:39:57 +02:00
John Mumm
d31e6ad7c7
Move fragment preservation test to directly test our redirect handling logic (#14480)
When [updating](https://github.com/astral-sh/uv/pull/14475) to the
latest `reqwest` version, our fragment propagation test broke. That test
was partially testing the `reqwest` behavior, so this PR moves the
fragment test to directly test our logic for constructing redirect
requests.
2025-07-07 12:51:21 +02:00
renovate[bot]
3a77b9cdd9
Update aws-actions/configure-aws-credentials digest to f503a18 (#14473)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| aws-actions/configure-aws-credentials | action | digest | `3d8cba3` -> `f503a18` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 11:31:31 +02:00
renovate[bot]
1d027bd92a
Update pre-commit dependencies (#14474)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [astral-sh/ruff-pre-commit](https://redirect.github.com/astral-sh/ruff-pre-commit) | repository | patch | `v0.12.1` -> `v0.12.2` |
| [crate-ci/typos](https://redirect.github.com/crate-ci/typos) | repository | minor | `v1.33.1` -> `v1.34.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

Note: The `pre-commit` manager in Renovate is not supported by the
`pre-commit` maintainers or community. Please do not report any problems
there, instead [create a Discussion in the Renovate
repository](https://redirect.github.com/renovatebot/renovate/discussions/new)
if you have any questions.

---

### Release Notes

<details>
<summary>astral-sh/ruff-pre-commit (astral-sh/ruff-pre-commit)</summary>

###
[`v0.12.2`](https://redirect.github.com/astral-sh/ruff-pre-commit/releases/tag/v0.12.2)

[Compare
Source](https://redirect.github.com/astral-sh/ruff-pre-commit/compare/v0.12.1...v0.12.2)

See: https://github.com/astral-sh/ruff/releases/tag/0.12.2

</details>

<details>
<summary>crate-ci/typos (crate-ci/typos)</summary>

###
[`v1.34.0`](https://redirect.github.com/crate-ci/typos/releases/tag/v1.34.0)

[Compare
Source](https://redirect.github.com/crate-ci/typos/compare/v1.33.1...v1.34.0)

#### [1.34.0] - 2025-06-30

##### Features

- Updated the dictionary with the [June
2025](https://redirect.github.com/crate-ci/typos/issues/1309) changes

</details>

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 10:36:26 +02:00
renovate[bot]
bb738aeb44
Update Rust crate test-log to v0.2.18 (#14477)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [test-log](https://redirect.github.com/d-e-s-o/test-log) | dev-dependencies | patch | `0.2.17` -> `0.2.18` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>d-e-s-o/test-log (test-log)</summary>

###
[`v0.2.18`](https://redirect.github.com/d-e-s-o/test-log/blob/HEAD/CHANGELOG.md#0218)

[Compare
Source](https://redirect.github.com/d-e-s-o/test-log/compare/v0.2.17...v0.2.18)

- Improved cooperation with other similar procedural macros to enable
  attribute stacking

</details>

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 10:04:50 +02:00
renovate[bot]
fc758bb755
Update Rust crate schemars to v1.0.4 (#14476)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [schemars](https://graham.cool/schemars/) ([source](https://redirect.github.com/GREsau/schemars)) | workspace.dependencies | patch | `1.0.3` -> `1.0.4` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>GREsau/schemars (schemars)</summary>

###
[`v1.0.4`](https://redirect.github.com/GREsau/schemars/blob/HEAD/CHANGELOG.md#104---2025-07-06)

[Compare
Source](https://redirect.github.com/GREsau/schemars/compare/v1.0.3...v1.0.4)

##### Fixed

- Fix `JsonSchema` impl on
[atomic](https://doc.rust-lang.org/std/sync/atomic/) types being ignored
on non-nightly compilers due to a buggy `cfg` check
([https://github.com/GREsau/schemars/issues/453](https://redirect.github.com/GREsau/schemars/issues/453))
- Fix compatibility with minimal dependency versions, e.g. old(-ish)
versions of `syn`
([https://github.com/GREsau/schemars/issues/450](https://redirect.github.com/GREsau/schemars/issues/450))
- Fix derive for empty tuple variants
([https://github.com/GREsau/schemars/issues/455](https://redirect.github.com/GREsau/schemars/issues/455))

</details>

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 10:00:52 +02:00
renovate[bot]
1308c85efe
Update Rust crate async-channel to v2.5.0 (#14478)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [async-channel](https://redirect.github.com/smol-rs/async-channel) | workspace.dependencies | minor | `2.3.1` -> `2.5.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>smol-rs/async-channel (async-channel)</summary>

###
[`v2.5.0`](https://redirect.github.com/smol-rs/async-channel/blob/HEAD/CHANGELOG.md#Version-250)

[Compare
Source](https://redirect.github.com/smol-rs/async-channel/compare/v2.4.0...v2.5.0)

- Add `Sender::closed()`
([#&#8203;102](https://redirect.github.com/smol-rs/async-channel/issues/102))

###
[`v2.4.0`](https://redirect.github.com/smol-rs/async-channel/blob/HEAD/CHANGELOG.md#Version-240)

[Compare
Source](https://redirect.github.com/smol-rs/async-channel/compare/v2.3.1...v2.4.0)

- Add `Sender::same_channel()` and `Receiver::same_channel()`.
([#&#8203;98](https://redirect.github.com/smol-rs/async-channel/issues/98))
- Add `portable-atomic` feature to support platforms without atomics.
([#&#8203;106](https://redirect.github.com/smol-rs/async-channel/issues/106))

</details>

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 09:58:05 +02:00
John Mumm
f609e1ddaf
Document that VerbatimUrl does not preserve original string after serialization (#14456)
This came up in
[discussion](https://github.com/astral-sh/uv/pull/14387#issuecomment-3032223670)
on #14387.
2025-07-04 22:42:56 +02:00
Tim de Jager
eaf517efd8
Add method to get packages involved in a NoSolutionError (#14457)

## Summary

In pixi we overlay the PyPI packages over the conda packages and we
sometimes need to figure out what PyPI packages are involved in the
no-solution error. We could parse the error message, but this is pretty
error-prone, so it would be good to get access to more information. A
lot of information in this module is private and should probably stay
this way, but package names are easy enough to expose. This would help
us a lot!

I collect into a `HashSet` to remove duplicates, and did not want to
expose a `rustc_hash` data structure directly; that's why I've chosen to
expose it as an iterator :)
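
A minimal sketch of that pattern (the type and method names here are illustrative stand-ins, not uv's actual API):

```rust
use std::collections::HashSet;

// Stand-in type: the real error wraps a resolver derivation tree.
struct NoSolutionError {
    involved: Vec<String>,
}

impl NoSolutionError {
    /// Deduplicate via a HashSet internally, but hand callers a plain
    /// iterator so no rustc_hash type leaks into the public API.
    fn packages(&self) -> impl Iterator<Item = &str> + '_ {
        let unique: HashSet<&str> = self.involved.iter().map(String::as_str).collect();
        unique.into_iter()
    }
}
```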

Let me know if any changes need to be done, and thanks!

---------

Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-07-04 18:08:23 +00:00
Charlie Marsh
e8bc3950ef
Remove transparent variants in uv-extract to enable retries (#14450)
## Summary

We think this is the culprit for the lack of retries in some settings
(e.g., Python downloads).

See: https://github.com/astral-sh/uv/issues/14425.
2025-07-03 23:32:07 +00:00
konsti
06af93fce7
Fix optional cfg gates (#14448)
Running `cargo clippy` in individual crates could raise warnings due to
unused imports, as `Cow` is only used with `#[cfg(feature = "schemars")]`.
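
A minimal sketch of the kind of gate this refers to (the item behind the gate is illustrative):

```rust
// Only import `Cow` when the feature that actually uses it is enabled, so
// `cargo clippy` on this crate alone no longer flags an unused import.
#[cfg(feature = "schemars")]
use std::borrow::Cow;

#[cfg(feature = "schemars")]
fn schema_name() -> Cow<'static, str> {
    Cow::Borrowed("Example")
}
```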
2025-07-03 15:29:03 -05:00
Simon Sure
8afbd86f03
make ErrorTree for NoSolutionError externally accessible (#14444)
Hey, are you okay with exposing the `ErrorTree` for library consumers?

We have a use case that needs more information on conflicts. In
particular, we need the tree structure of the conflict and the ability to
traverse it.
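
A rough sketch of what such a traversal could look like on the consumer side; `ErrorTree` here is a simplified stand-in, not the actual uv/pubgrub type:

```rust
// Simplified stand-in for the derivation tree of a failed resolution.
enum ErrorTree {
    // e.g. "foo==1.2 depends on bar>=2" or "bar>=2 is unavailable"
    External(String),
    // A conflict derived from two sub-conflicts.
    Derived(Box<ErrorTree>, Box<ErrorTree>),
}

// Collect the leaf incompatibilities instead of parsing the rendered message.
fn leaves<'a>(tree: &'a ErrorTree, out: &mut Vec<&'a str>) {
    match tree {
        ErrorTree::External(msg) => out.push(msg),
        ErrorTree::Derived(left, right) => {
            leaves(left, out);
            leaves(right, out);
        }
    }
}
```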

Signed-off-by: Simon Sure <ssure@palantir.com>
2025-07-03 11:43:59 -05:00
konsti
a1cda6213c
Make "exit code" -> "exit status" a default filter (#14441)
Remove some test boilerplate.

Revival of https://github.com/astral-sh/uv/pull/14439 with main as base.
2025-07-03 13:50:40 +00:00
konsti
39cdfe9981
Add a test for --force-pep517 (#14310)
There was previously a gap in the test coverage in ensuring that
`--force-pep517` was respected.
2025-07-03 13:34:44 +00:00
Zanie Blue
85c0fc963b
Fix forced resolution with all extras in uv version (#14434)
Closes https://github.com/astral-sh/uv/issues/14433

Same as https://github.com/astral-sh/uv/pull/13380
2025-07-03 07:29:59 -05:00
Zanie Blue
c3f13d2505
Finish incomplete sentence in pip migration guide (#14432)
Fixes https://github.com/astral-sh/uv/pull/12382#discussion_r2181237729
2025-07-03 01:02:17 +00:00
Zanie Blue
38ee6ec800
Bump version to 0.7.19 (#14431) 2025-07-02 21:19:52 +00:00
konsti
71b5ba13d7
Stabilize the uv build backend (#14311)
The uv build backend has gone through some feedback cycles, we expect no
more major configuration changes, and we're ready to take the next step:
stabilizing the uv build backend.

This PR stabilizes:

* Using `uv_build` as build backend
* The documentation of the uv build backend
* The direct build fast path, where uv doesn't use PEP 517 if you're
using `uv_build` in a compatible version.
* `uv build --list`, which is limited to `uv_build`.

It does not:
* Make `uv_build` the default on `uv init`
* Make `--package` the default on `uv init`
2025-07-02 15:37:43 -05:00
Zanie Blue
5f2857a1c7
Add linux aarch64 smoke tests (#14427)
Testing https://github.com/astral-sh/uv/pull/14426
2025-07-02 15:18:46 -05:00
Zanie Blue
a58969feef
Fix workspace_unsatisfiable_member_dependencies (#14429) 2025-07-02 15:11:50 -05:00
github-actions[bot]
3bb8ac610c
Sync latest Python releases (#14426)
Automated update for Python releases.

Co-authored-by: zanieb <2586601+zanieb@users.noreply.github.com>
2025-07-02 14:51:17 -05:00
Jack O'Connor
ec54dce919
Includes sys.prefix in cached environment keys to avoid --with collisions across projects (#14403)
Fixes https://github.com/astral-sh/uv/issues/12889.
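
A hedged sketch of the keying idea (names and hashing are illustrative, not uv's cache implementation):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Fold the base interpreter's `sys.prefix` into the key for cached `--with`
// environments, so two projects with different base environments but the same
// `--with` requirements no longer collide on one cache entry.
fn cached_environment_key(with_requirements: &[String], sys_prefix: &str) -> u64 {
    let mut hasher = DefaultHasher::new();
    with_requirements.hash(&mut hasher);
    sys_prefix.hash(&mut hasher);
    hasher.finish()
}
```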

---------

Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-07-02 14:40:18 -05:00
Zanie Blue
a6bb65c78d
Clarify behavior and hint on tool install when no executables are available (#14423)
Closes https://github.com/astral-sh/uv/issues/14416
2025-07-02 13:11:17 -05:00
Charlie Marsh
743260b1f5
Make project and interpreter lock acquisition non-fatal (#14404)
## Summary

If we fail to acquire a lock on an environment, uv shouldn't fail; we
should just warn. In some cases, users run uv with read-only permissions
for their projects, etc.

For now, I kept any locks acquired _in the cache_ as hard failures,
since we always need write-access to the cache.
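
A minimal sketch of the warn-and-continue behavior described above (the locking helper here is illustrative, not uv's actual API):

```rust
use std::path::Path;

// Try to create/open the lock file; on failure (e.g. a read-only project
// directory), warn and proceed without a lock rather than aborting.
fn try_acquire_lock(lock_path: &Path) -> Option<std::fs::File> {
    match std::fs::OpenOptions::new().create(true).write(true).open(lock_path) {
        Ok(file) => Some(file),
        Err(err) => {
            eprintln!(
                "warning: failed to acquire lock at `{}`: {err}; continuing without a lock",
                lock_path.display()
            );
            None
        }
    }
}
```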

Closes https://github.com/astral-sh/uv/issues/14411.
2025-07-02 14:03:43 -04:00
Zanie Blue
2f53ea5c5c
Add a migration guide from pip to uv projects (#12382)
[Rendered](https://github.com/astral-sh/uv/blob/zb/pip-wip/docs/guides/migration/pip-to-project.md)

---------

Co-authored-by: samypr100 <3933065+samypr100@users.noreply.github.com>
Co-authored-by: Mathieu Kniewallner <mathieu.kniewallner@gmail.com>
Co-authored-by: Aria Desires <aria.desires@gmail.com>
2025-07-02 12:25:19 -05:00
Charlie Marsh
a9ea756d14
Ignore Python patch version for --universal pip compile (#14405)
## Summary

The idea here is that if a user runs `uv pip compile --universal`, we
should ignore the patch version on the current interpreter. I think this
makes sense... `--universal` tries to resolve for all future versions,
so it seems a bit odd that we'd start at the _current_ patch version.
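
Roughly, the change amounts to clamping the starting version like this (a sketch, not the actual resolver code):

```rust
// For a universal resolution, pretend the current interpreter is `X.Y` rather
// than `X.Y.Z`, so the resolution is not anchored to the current patch release.
fn effective_python_version(universal: bool, version: (u64, u64, u64)) -> (u64, u64, u64) {
    let (major, minor, patch) = version;
    if universal { (major, minor, 0) } else { (major, minor, patch) }
}
```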

Closes https://github.com/astral-sh/uv/issues/14397.
2025-07-02 11:11:51 -04:00
Zanie Blue
43f67a4a4c
Update the tilde version specifier warning to include more context (#14335)
Follows https://github.com/astral-sh/uv/pull/14008
2025-07-02 09:08:45 -05:00
konsti
a7aa46acc5
Add a "choosing a build backend" section to the docs (#14295)
I think the build backend docs as a whole are now ready for review. I
only made a small change here.

---------

Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-07-02 09:02:03 -05:00
Zanie Blue
b0db548c80
Bump the test timeout from 90s -> 120s (#14170)
In hopes of resolving https://github.com/astral-sh/uv/issues/14158

We should also see why the tests are so slow though.
2025-07-02 13:39:58 +00:00
konsti
bf5dcf9929
Reduce index credential stashing code duplication (#14419)
Reduces some duplicate code around index credentials.
2025-07-02 15:25:56 +02:00
John Mumm
e40d3d5dff
Re-enable Artifactory in the registries integration test (#14408)
Having worked out the account issue, I've re-enabled Artifactory in the
registries test.
2025-07-02 13:50:35 +02:00
Zanie Blue
87e9ccfb92
Bump version to 0.7.18 (#14402)
2025-07-01 15:30:44 -05:00
konsti
06df95adbf
Workaround for panic due to missing global validation in clap (#14368)
Clap does not perform global validation, so flags that are declared as
overriding can be set at the same time:
https://github.com/clap-rs/clap/issues/6049. This would previously cause
a panic. We work around this by always choosing the yes-value and
emitting a warning.

An alternative would be erroring when both are set, but it's unclear to
me if this may break things we want to support. (`UV_OFFLINE=1 cargo run
-q pip --no-offline install tqdm --no-cache` is already banned).

Fixes https://github.com/astral-sh/uv/pull/14299

**Test Plan**

```
$ cargo run -q pip --offline install --no-offline tqdm --no-cache
  warning: Boolean flags on different levels are not correctly supported (https://github.com/clap-rs/clap/issues/6049)
    × No solution found when resolving dependencies:
    ╰─▶ Because tqdm was not found in the cache and you require tqdm, we can conclude that your requirements are unsatisfiable.

        hint: Packages were unavailable because the network was disabled. When the network is disabled, registry packages may only be read from the cache.
```
2025-07-01 13:39:46 -05:00
John Mumm
29fcd6faee
Fix test cases to match Cow variants (#14390)
Updates `without_trailing_slash` and `without_fragment` to separately
match values against `Cow` variants.

Closes #14350
2025-07-01 13:39:17 -05:00
Charlie Marsh
d9f9ed4aec
Reuse build (virtual) environments across resolution and installation (#14338)
## Summary

The basic idea here is that we can (should) reuse a build environment
across resolution (`prepare_metadata_for_build_wheel`) and installation.
This also happens to solve the build-PyTorch-from-source problem, since
we use a consistent build environment between the invocations.

Since `SourceDistributionBuilder` is stateless, we instead store the
builds on `BuildContext`, and we key them by various properties: the
underlying interpreter, the configuration settings, etc. This just
ensures that if we build the same package twice within a process, we
don't accidentally reuse an incompatible build (virtual) environment.
(Note that we still drop build environments at the end of the command, and
don't attempt to reuse them across processes.)
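
A hypothetical sketch of keying build environments within one process; the key fields and storage here are illustrative, not the actual `BuildContext` layout:

```rust
use std::collections::HashMap;
use std::path::{Path, PathBuf};

// Properties that make a build environment reusable for another build of the
// same package within this process.
#[derive(Hash, PartialEq, Eq)]
struct BuildKey {
    interpreter: PathBuf,
    config_settings: Vec<(String, String)>, // normalized and sorted
}

#[derive(Default)]
struct BuildEnvironments {
    envs: HashMap<BuildKey, PathBuf>, // key -> location of the build venv
}

impl BuildEnvironments {
    // Reuse an existing environment for the key, or create one; everything is
    // dropped at the end of the command and never shared across processes.
    fn get_or_create(&mut self, key: BuildKey, create: impl FnOnce() -> PathBuf) -> &Path {
        self.envs.entry(key).or_insert_with(create).as_path()
    }
}
```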

Closes #14269.
2025-07-01 13:15:47 -04:00
Jack O'Connor
85358fe9c6 Keep track of retries in ManagedPythonDownload::fetch_with_retry
If/when we see https://github.com/astral-sh/uv/issues/14171 again, this
should clarify whether our retry logic was skipped (i.e. a transient
error wasn't correctly identified as transient), or whether we exhausted
our retries. Previously, if you ran a local example fileserver as in
https://github.com/astral-sh/uv/issues/14171#issuecomment-3014580701 and
then you tried to install Python from it, you'd get:

```
$ export UV_TEST_NO_CLI_PROGRESS=1
$ uv python install 3.8.20 --mirror http://localhost:8000 2>&1 | cat
error: Failed to install cpython-3.8.20-linux-x86_64-gnu
  Caused by: Failed to extract archive: cpython-3.8.20-20241002-x86_64-unknown-linux-gnu-install_only_stripped.tar.gz
  Caused by: failed to unpack `/home/jacko/.local/share/uv/python/.temp/.tmpS4sHHZ/python/lib/libpython3.8.so.1.0`
  Caused by: failed to unpack `python/lib/libpython3.8.so.1.0` into `/home/jacko/.local/share/uv/python/.temp/.tmpS4sHHZ/python/lib/libpython3.8.so.1.0`
  Caused by: error decoding response body
  Caused by: request or response body error
  Caused by: error reading a body from connection
  Caused by: Connection reset by peer (os error 104)
```

With this change you get:

```
error: Failed to install cpython-3.8.20-linux-x86_64-gnu
  Caused by: Request failed after 3 retries
  Caused by: Failed to extract archive: cpython-3.8.20-20241002-x86_64-unknown-linux-gnu-install_only_stripped.tar.gz
  Caused by: failed to unpack `/home/jacko/.local/share/uv/python/.temp/.tmp4Ia24w/python/lib/libpython3.8.so.1.0`
  Caused by: failed to unpack `python/lib/libpython3.8.so.1.0` into `/home/jacko/.local/share/uv/python/.temp/.tmp4Ia24w/python/lib/libpython3.8.so.1.0`
  Caused by: error decoding response body
  Caused by: request or response body error
  Caused by: error reading a body from connection
  Caused by: Connection reset by peer (os error 104)
```

At the same time, I'm updating the way we handle the retry count to
avoid nested retry loops exceeding the intended number of attempts, as I
mentioned at
https://github.com/astral-sh/uv/issues/14069#issuecomment-3020634281.
It's not clear to me whether we actually want this part of the change,
and I need feedback here.
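
The attempt-threading idea could be sketched like this (purely illustrative; uv's actual retry machinery lives in its HTTP/download layers):

```rust
// Carry the number of attempts already spent by an outer retry layer into the
// inner one, so nested loops can never exceed the overall budget, and report
// "failed after N retries" when the budget is exhausted.
fn retry_with_budget<T, E: std::fmt::Display>(
    budget: u32,
    already_attempted: u32,
    mut op: impl FnMut() -> Result<T, E>,
) -> Result<T, String> {
    let mut attempts = already_attempted;
    loop {
        match op() {
            Ok(value) => return Ok(value),
            Err(err) if attempts < budget => {
                attempts += 1;
                eprintln!("transient error (attempt {attempts} of {budget}): {err}");
            }
            Err(err) => return Err(format!("Request failed after {budget} retries: {err}")),
        }
    }
}
```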
2025-07-01 09:52:19 -07:00
Charlie Marsh
c078683217
Only drop build directories on program exit (#14304)
## Summary

This PR ensures that we avoid cleaning up build directories until the
end of a resolve-and-install cycle. It's not bulletproof (since we could
still run into issues with `uv lock` followed by `uv sync` whereby a
build directory gets cleaned up that's still referenced in the `build`
artifacts), but it at least gets PyTorch building without error with `uv
pip install .`, which is a case that's been reported several times.

Closes https://github.com/astral-sh/uv/issues/14269.
2025-07-01 12:50:19 -04:00
Zanie Blue
c777491bf4
Use the insiders requirements when building docs in CI (#14379) 2025-07-01 11:29:10 -05:00
konsti
9af3e9b6ec
Remove unnecessary codspeed deps (#14396)
See https://github.com/CodSpeedHQ/codspeed-rust/pull/108
2025-07-01 11:00:30 -05:00
konsti
43745d2ecf
Fix equals-star and tilde-equals with python_version and python_full_version (#14271)
The marker display code assumes that all versions are normalized, in
that all trailing zeroes are stripped. This is not the case for
tilde-equals and equals-star versions, where the trailing zeroes (before
the `.*`) are semantically relevant. This would cause path-dependent
behavior where we would get a different marker string
depending on whether a version with or without a trailing zero was added
to the cache first.

To handle both equals-star and tilde-equals when converting
`python_version` to `python_full_version` markers, we have to merge the
version normalization (i.e. trimming the trailing zeroes) and the
conversion both to `python_full_version` and to `Ranges`, while special
casing equals-star and tilde-equals.

To avoid churn in lockfiles, we only trim in the conversion to `Ranges`
for markers, but keep using untrimmed versions for requires-python.
(Note that this behavior is technically also path-dependent, as versions
with and without trailing zeroes have the same Hash and Eq. E.g.,
`requires-python == ">= 3.10.0"` and `requires-python == ">= 3.10"` in
the same workspace could lead to either value in `uv.lock`, and which
one it is could change if we make unrelated (performance) changes.
Always trimming however definitely changes lockfiles, a churn I wouldn't
do outside another breaking or lockfile-changing change.) Nevertheless,
there is a change for users who have `requires-python = "~= 3.12.0"` in
their `pyproject.toml`, as this now hits the correct normalization path.
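
A simplified, string-level sketch of the trimming rule described above (uv's real implementation works on parsed versions and `Ranges`, not strings): trailing zeroes are dropped, except for equals-star and tilde-equals specifiers where they are semantically relevant.

```rust
/// Trailing zeroes are trimmed, *except* for equals-star and tilde-equals,
/// where the number of release components changes the meaning.
fn normalize_specifier(op: &str, version: &str) -> String {
    if op == "~=" || version.ends_with(".*") {
        // `~= 3.12.0` and `== 3.12.*` must keep their trailing components.
        return format!("{op} {version}");
    }
    let mut trimmed = version;
    while let Some(rest) = trimmed.strip_suffix(".0") {
        trimmed = rest;
    }
    format!("{op} {trimmed}")
}

fn main() {
    assert_eq!(normalize_specifier(">=", "3.10.0"), ">= 3.10");
    assert_eq!(normalize_specifier("==", "3.10.*"), "== 3.10.*");
    assert_eq!(normalize_specifier("~=", "3.12.0"), "~= 3.12.0");
}
```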

Fixes #14231
Fixes #14270
2025-07-01 17:48:48 +02:00
Charlie Marsh
3774a656d7
Use parsed URLs for conflicting URL error message (#14380)
## Summary

There's a good example of the downside of using verbatim URLs here:
https://github.com/astral-sh/uv/pull/14197#discussion_r2163599625 (we
show two relative paths that point to the same directory, but it's not
clear from the error message).

The diff:

```
    2     2 │ ----- stdout -----
    3     3 │
    4     4 │ ----- stderr -----
    5     5 │ error: Requirements contain conflicting URLs for package `library` in all marker environments:
    6       │-- ../../library
    7       │-- ./library
          6 │+- file://[TEMP_DIR]/library
          7 │+- file://[TEMP_DIR]/library (editable)
```
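
A hedged illustration of the idea (not uv's actual display code): rendering the canonicalized form of a path-based requirement instead of the verbatim input makes two spellings of the same directory display identically.

```rust
use std::path::Path;

/// Hypothetical display helper: render the canonicalized form of a path-based
/// requirement rather than the verbatim user input (uv renders a `file://`
/// URL; the formatting here is deliberately naive).
fn display_path_requirement(raw: &Path) -> std::io::Result<String> {
    let canonical = std::fs::canonicalize(raw)?;
    Ok(format!("file://{}", canonical.display()))
}

fn main() -> std::io::Result<()> {
    // Two different spellings of the same directory now display identically.
    let dir = std::env::temp_dir();
    assert_eq!(
        display_path_requirement(&dir)?,
        display_path_requirement(&dir.join("."))?
    );
    Ok(())
}
```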
2025-07-01 08:18:01 -04:00
Zanie Blue
b1812d111a
Edits to the build backend documentation (#14376)
Co-authored-by: konstin <konstin@mailbox.org>
2025-07-01 08:44:23 +00:00
github-actions[bot]
a3db9a9ae4
Sync latest Python releases (#14381)
Automated update for Python releases.

Co-authored-by: zanieb <2586601+zanieb@users.noreply.github.com>
2025-07-01 03:44:18 +00:00
renovate[bot]
c5ca240fb7
Update PyO3/maturin-action action to v1.49.3 (#14363)
2025-06-30 19:16:53 -04:00
renovate[bot]
7bbdc08dae
Update depot/build-push-action action to v1.15.0 (#14361) 2025-06-30 19:16:46 -04:00
renovate[bot]
5f8d4bbf02
Update Rust crate indexmap to v2.10.0 (#14365) 2025-06-30 19:16:31 -04:00
Adrien Cacciaguerra
9e9505df50
Bump CodSpeed to v3 (#14371)
## Summary

As explained in the [`codspeed-rust` v3 release
notes](https://github.com/CodSpeedHQ/codspeed-rust/releases/tag/v3.0.0),
`v3` of the compatibility layers is now required to work with the
latest version (`v3`) of `cargo-codspeed`.
2025-06-30 17:58:29 -05:00
Aria Desires
2f9061dcd0
Update python, add support for installing arm windows pythons (#14374) 2025-06-30 22:02:19 +00:00
Aria Desires
317ce6e245
disfavor aarch64 windows in its own house (#13724)
and prefer emulated x64 windows in its stead.

This is preparatory work for shipping support for uv downloading and
installing aarch64 (arm64) windows Pythons. We've [had builds for this
platform ready for a
while](https://github.com/astral-sh/python-build-standalone/pull/387),
but have held back on shipping them due to a fundamental problem:

**The Python packaging ecosystem does not have strong support for
aarch64 windows**, e.g., not many projects build aarch64 wheels yet. The
net effect of this is that, if we handed you an aarch64 python
interpreter on windows, you would have to build a lot more sdists, and
there's a high chance you will simply fail to build that sdist and be
sad.

Yes, unfortunately, in this case a non-native Python interpreter simply
*works better* than the native one... in terms of working at all, today.
Of course, if the native interpreter works for your project, it should
presumably have better performance and platform compatibility.

We do not want to stand in the way of progress, as ideally this
situation is a temporary state of affairs as the ecosystem grows to
support aarch64 windows. To enable progress, on aarch64 Windows builds
of uv:

* We will still use a native python interpreter, e.g., if it's at the
front of your `PATH` or the only installed version.
* If we are choosing between equally good interpreters that differ in
architecture, x64 will be preferred.
* If the aarch64 version is newer, we will prefer the aarch64 one.
* We will emit a diagnostic on installation, and show the python request
to pass to uv to force aarch64 windows to be used.
* Will be shipping [aarch64 Windows Python
downloads](https://github.com/astral-sh/python-build-standalone/pull/387)
* Will probably add some kind of global override setting/env-var to
disable this behaviour.
* Will be shipping this behaviour in
[astral-sh/setup-uv](https://github.com/astral-sh/setup-uv)

We're coordinating with Microsoft, GitHub (for the `setup-python`
action), and the CPython team (for the `python.org` installers), to
ensure we're aligned on this default and the timing of toggling to
prefer native distributions in the future.

See discussion in 

- https://github.com/astral-sh/uv/issues/12906

---

This is an alternative to 

* #13719 

which uses sorting rather than filtering, as discussed in 

* #13721
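
One way to express the preference rules above as an ordering; the types and the sort-key approach are illustrative only (the linked alternatives discuss sorting versus filtering), not uv's internals.

```rust
/// Candidates are compared by version first; only on a version tie does the
/// emulated-x64-over-aarch64 preference kick in.
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
enum Arch {
    X86_64,
    Aarch64,
}

#[derive(Clone, Copy, Debug)]
struct Candidate {
    version: (u32, u32, u32),
    arch: Arch,
}

fn preference_key(c: &Candidate) -> ((u32, u32, u32), u8) {
    // Higher is better: newer versions always win; on a tie, x86-64 is
    // preferred over aarch64 on Windows aarch64 builds of uv (for now).
    let arch_rank = match c.arch {
        Arch::X86_64 => 1,
        Arch::Aarch64 => 0,
    };
    (c.version, arch_rank)
}

fn main() {
    let mut candidates = vec![
        Candidate { version: (3, 13, 1), arch: Arch::Aarch64 },
        Candidate { version: (3, 13, 1), arch: Arch::X86_64 },
        Candidate { version: (3, 13, 2), arch: Arch::Aarch64 },
    ];
    candidates.sort_by_key(preference_key);
    // The newer aarch64 build wins; among equal versions, x86-64 is preferred.
    let best = candidates.last().unwrap();
    assert_eq!(best.version, (3, 13, 2));
    assert_eq!(best.arch, Arch::Aarch64);
}
```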
2025-06-30 17:42:00 -04:00
Zanie Blue
1c7c174bc8
Include the canonical path in the interpreter query cache key (#14331)
This fixes an obscure cache collision in Python interpreter queries,
which we believe to be the root cause of CI flakes we've been seeing
where a project environment is invalidated and recreated.

This work follows from the logs in [this CI
run](4495059999)
which captured one of the flakes with tracing enabled. There, we can see
that the project environment is invalidated because the Python
interpreter in the environment has a different version than expected:

```
DEBUG Checking for Python environment at `.venv`
TRACE Cached interpreter info for Python 3.12.9, skipping probing: .venv/bin/python3
DEBUG The interpreter in the project environment has different version (3.12.9) than it was created with (3.9.21)
```

(this message is updated to reflect #14329)

The flow is roughly:

- We create an environment with 3.12.9
- We query the environment, and cache the interpreter version for
`.venv/bin/python`
- We create an environment for 3.9.21, replacing the existing one
- We query the environment, and read the cached information

The Python cache entries are keyed by the absolute path to the
interpreter, and rely on the modification time (ctime, nsec resolution)
of the canonicalized path to determine if the cache entry should be
invalidated. The key is a hex representation of a u64 sea hasher output
— which is very unlikely to collide.

After an audit of the Python query caching logic, we determined that the
most likely cause of a collision in cache entries is that the
modification times of underlying interpreters are identical. This seems
pretty feasible, especially if the file system does not support
nanosecond precision — though it appears that the GitHub runners do
support it.

The fix here is to include the canonicalized path in the cache key,
which ensures we're looking at the modification time of the _same_
underlying interpreter.
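
A minimal sketch of that fix (hypothetical helper using the `seahash` crate, not uv's actual cache code): hashing the canonicalized path alongside the queried path means two different interpreters that happen to share a modification time can no longer land on the same entry.

```rust
/// Derive a cache key from both the queried path and its canonicalized form,
/// hex-encoding a 64-bit SeaHash digest as described above.
fn interpreter_cache_key(executable: &std::path::Path) -> std::io::Result<String> {
    let canonical = std::fs::canonicalize(executable)?;
    let mut buf = Vec::new();
    buf.extend_from_slice(executable.to_string_lossy().as_bytes());
    buf.push(0);
    buf.extend_from_slice(canonical.to_string_lossy().as_bytes());
    Ok(format!("{:016x}", seahash::hash(&buf)))
}

fn main() -> std::io::Result<()> {
    let exe = std::env::current_exe()?;
    println!("{}", interpreter_cache_key(&exe)?);
    Ok(())
}
```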

This will "invalidate" all existing interpreter cache entries but that's
not a big deal.

This should also have the effect of reducing cache churn for
interpreters in virtual environments. Now, when you change Python
versions, we won't invalidate the previous cache entry so if you change
_back_ to the old version we can re-use our cached information.

It's a bit speculative, since we don't have a deterministic reproduction
in CI, but this is the strongest candidate given the logs and should
increase correctness regardless.

Closes https://github.com/astral-sh/uv/issues/14160
Closes https://github.com/astral-sh/uv/issues/13744
Closes https://github.com/astral-sh/uv/issues/13745

Once it's confirmed the flakes are resolved, we should revert

- https://github.com/astral-sh/uv/pull/14275
- #13817
2025-06-30 15:39:47 +00:00
konsti
0372a5b05d
Ignore invalid build backend settings when not building (#14372)
Fixes #14323
2025-06-30 16:32:28 +02:00
Ondrej Profant
ae500c95d2
Docs: add instructions for publishing to JFrog's Artifactory (#14253)
## Summary

Add instructions for publishing to JFrog's Artifactory into
[documentation](https://docs.astral.sh/uv/guides/integration/alternative-indexes/).

Related issues:
https://github.com/astral-sh/uv/issues/9845
https://github.com/astral-sh/uv/issues/10193

## Test Plan

I ran the documentation locally and used `npx prettier`.

---------

Co-authored-by: Ondrej Profant <ondrej.profant@datamole.ai>
Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-06-30 13:58:55 +00:00
renovate[bot]
5cfabd7085
Update Rust crate schemars to v1.0.3 (#14358)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [schemars](https://graham.cool/schemars/)
([source](https://redirect.github.com/GREsau/schemars)) |
workspace.dependencies | patch | `1.0.0` -> `1.0.3` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>GREsau/schemars (schemars)</summary>

###
[`v1.0.3`](https://redirect.github.com/GREsau/schemars/blob/HEAD/CHANGELOG.md#103---2025-06-28)

[Compare
Source](https://redirect.github.com/GREsau/schemars/compare/v1.0.2...v1.0.3)

##### Fixed

- Fix compile error when a doc comment is set on both a `transparent`
(or newtype) struct and its field
([https://github.com/GREsau/schemars/issues/446](https://redirect.github.com/GREsau/schemars/issues/446))
- Fix `json_schema!()` macro compatibility when used from pre-2021 rust
editions
([https://github.com/GREsau/schemars/pull/447](https://redirect.github.com/GREsau/schemars/pull/447))

###
[`v1.0.2`](https://redirect.github.com/GREsau/schemars/blob/HEAD/CHANGELOG.md#102---2025-06-26)

[Compare
Source](https://redirect.github.com/GREsau/schemars/compare/v1.0.1...v1.0.2)

##### Fixed

- Fix schema properties being incorrectly reordered during serialization
([https://github.com/GREsau/schemars/issues/444](https://redirect.github.com/GREsau/schemars/issues/444))

###
[`v1.0.1`](https://redirect.github.com/GREsau/schemars/blob/HEAD/CHANGELOG.md#101---2025-06-24)

[Compare
Source](https://redirect.github.com/GREsau/schemars/compare/v1.0.0...v1.0.1)

##### Fixed

- Deriving `JsonSchema` with `no_std` broken due to
`std::borrow::ToOwned` trait not being in scope
([https://github.com/GREsau/schemars/issues/441](https://redirect.github.com/GREsau/schemars/issues/441))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/uv).


---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: konstin <konstin@mailbox.org>
2025-06-30 13:39:20 +00:00
renovate[bot]
15551a0201
Update Swatinem/rust-cache action to v2.8.0 (#14366)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [Swatinem/rust-cache](https://redirect.github.com/Swatinem/rust-cache)
| action | minor | `v2.7.8` -> `v2.8.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>Swatinem/rust-cache (Swatinem/rust-cache)</summary>

###
[`v2.8.0`](https://redirect.github.com/Swatinem/rust-cache/releases/tag/v2.8.0)

[Compare
Source](https://redirect.github.com/Swatinem/rust-cache/compare/v2.7.8...v2.8.0)

##### What's Changed

- Add cache-workspace-crates feature by
[@&#8203;jbransen](https://redirect.github.com/jbransen) in
[https://github.com/Swatinem/rust-cache/pull/246](https://redirect.github.com/Swatinem/rust-cache/pull/246)
- Feat: support warpbuild cache provider by
[@&#8203;stegaBOB](https://redirect.github.com/stegaBOB) in
[https://github.com/Swatinem/rust-cache/pull/247](https://redirect.github.com/Swatinem/rust-cache/pull/247)

##### New Contributors

- [@&#8203;jbransen](https://redirect.github.com/jbransen) made their
first contribution in
[https://github.com/Swatinem/rust-cache/pull/246](https://redirect.github.com/Swatinem/rust-cache/pull/246)
- [@&#8203;stegaBOB](https://redirect.github.com/stegaBOB) made their
first contribution in
[https://github.com/Swatinem/rust-cache/pull/247](https://redirect.github.com/Swatinem/rust-cache/pull/247)

**Full Changelog**:
https://github.com/Swatinem/rust-cache/compare/v2.7.8...v2.8.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/uv).


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 12:24:54 +02:00
renovate[bot]
61482da319
Update pre-commit hook astral-sh/ruff-pre-commit to v0.12.1 (#14362)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[astral-sh/ruff-pre-commit](https://redirect.github.com/astral-sh/ruff-pre-commit)
| repository | minor | `v0.11.13` -> `v0.12.1` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

Note: The `pre-commit` manager in Renovate is not supported by the
`pre-commit` maintainers or community. Please do not report any problems
there, instead [create a Discussion in the Renovate
repository](https://redirect.github.com/renovatebot/renovate/discussions/new)
if you have any questions.

---

### Release Notes

<details>
<summary>astral-sh/ruff-pre-commit (astral-sh/ruff-pre-commit)</summary>

###
[`v0.12.1`](https://redirect.github.com/astral-sh/ruff-pre-commit/releases/tag/v0.12.1)

[Compare
Source](https://redirect.github.com/astral-sh/ruff-pre-commit/compare/v0.12.0...v0.12.1)

See: https://github.com/astral-sh/ruff/releases/tag/0.12.1

###
[`v0.12.0`](https://redirect.github.com/astral-sh/ruff-pre-commit/releases/tag/v0.12.0)

[Compare
Source](https://redirect.github.com/astral-sh/ruff-pre-commit/compare/v0.11.13...v0.12.0)

See: https://github.com/astral-sh/ruff/releases/tag/0.12.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/uv).


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 12:13:33 +02:00
renovate[bot]
b2979d25a8
Update acj/freebsd-firecracker-action action to v0.5.1 (#14355)
2025-06-29 22:19:38 -04:00
renovate[bot]
e44a64ee13
Update Rust crate windows-registry to v0.5.3 (#14359) 2025-06-29 22:18:26 -04:00
renovate[bot]
e9533a0e29
Update aws-actions/configure-aws-credentials digest to 3d8cba3 (#14354) 2025-06-29 22:18:12 -04:00
renovate[bot]
40386e438f
Update Rust crate owo-colors to v4.2.2 (#14357) 2025-06-29 22:17:59 -04:00
renovate[bot]
a8b838dee9
Update astral-sh/setup-uv action to v6.3.1 (#14360) 2025-06-29 22:17:48 -04:00
renovate[bot]
d7e1fced43
Update Rust crate cargo-util to v0.2.21 (#14356) 2025-06-29 22:17:41 -04:00
Charlie Marsh
7603153f5b
Allow alpha, beta, and rc prefixes in tests (#14352)
## Summary

A bunch of tests currently fail if you try to use a pre-release version.
This PR makes the regular expressions more lenient.
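
A hedged example of what "more lenient" means here, using the `regex` crate; the pattern is illustrative, not the exact one from the test suite.

```rust
use regex::Regex;

fn main() {
    // Also accept an optional alpha/beta/rc suffix such as `3.14.0b3`.
    let version = Regex::new(r"^\d+\.\d+\.\d+(?:(?:a|b|rc)\d+)?$").unwrap();
    assert!(version.is_match("3.13.1"));
    assert!(version.is_match("3.14.0b3"));
    assert!(version.is_match("3.14.0rc1"));
}
```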
2025-06-29 19:30:52 +00:00
Charlie Marsh
d15efb7d91
Add an IntoIterator for FormMetadata (#14351)
## Summary

Clippy would lint for this if the symbol were public as a matter of API
hygiene, so adding it.
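
A sketch of the pattern, assuming (hypothetically) that `FormMetadata` wraps a list of key/value pairs: a borrowing `IntoIterator` impl lets callers iterate with `for (key, value) in &metadata`.

```rust
struct FormMetadata(Vec<(String, String)>);

impl<'a> IntoIterator for &'a FormMetadata {
    type Item = &'a (String, String);
    type IntoIter = std::slice::Iter<'a, (String, String)>;

    fn into_iter(self) -> Self::IntoIter {
        self.0.iter()
    }
}

fn main() {
    let metadata = FormMetadata(vec![(":action".to_string(), "file_upload".to_string())]);
    for (key, value) in &metadata {
        println!("{key}={value}");
    }
}
```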
2025-06-29 15:07:07 -04:00
Zanie Blue
17b7eec287
Consistently normalize trailing slashes on URLs with no path segments (#14349)
Alternative to https://github.com/astral-sh/uv/pull/14348
2025-06-29 12:25:10 -05:00
Zanie Blue
c0ebe6871d
Improve trace message for cached Python interpreter query (#14328) 2025-06-29 09:40:29 -05:00
Charlie Marsh
41c218a89b
Bump version to 0.7.17 (#14347) 2025-06-29 09:58:33 -04:00
Charlie Marsh
734b228edf
Drop trailing slashes when converting index URL from URL (#14346)
## Summary

In #14245, we started normalizing index URLs by dropping the trailing
slash in the lockfile. We added tests to ensure that this didn't cause
existing lockfiles to be invalidated, but we missed one of the
constructors (specifically, the path that's used with
`tool.uv.sources`).
2025-06-29 09:36:13 -04:00
Zanie Blue
f9d3f8ea3b
Fix error message ordering for pyvenv.cfg version conflict (#14329)
These were reversed, and we're missing an "a".
2025-06-29 09:19:05 -04:00
Zanie Blue
ec18f4813a
Fix typo (#14341)
2025-06-28 11:32:03 -05:00
Zanie Blue
0cfbdcec09
Ignore UV_PYTHON_CACHE_DIR when empty (#14336)
To match our semantics elsewhere
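
A small sketch of the "empty means unset" convention (hypothetical helper name):

```rust
/// Treat an empty environment variable the same as an unset one.
fn env_dir(name: &str) -> Option<std::path::PathBuf> {
    std::env::var_os(name)
        .filter(|value| !value.is_empty())
        .map(std::path::PathBuf::from)
}

fn main() {
    // With UV_PYTHON_CACHE_DIR="" this now behaves as if the variable were unset.
    println!("{:?}", env_dir("UV_PYTHON_CACHE_DIR"));
}
```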
2025-06-28 09:42:27 -05:00
Zanie Blue
608a1020c6
Update the Python query cache comment (#14330) 2025-06-28 09:42:23 -05:00
Zanie Blue
692667cbb0
Use the canonical ImplementationName -> &str implementation (#14337)
Motivated by some code duplication highlighted in
https://github.com/astral-sh/uv/pull/14201, I noticed we weren't taking
advantage of the existing implementation for casting to a str here.
Unfortunately, we do need a special case for CPython still.
2025-06-28 09:42:18 -05:00
github-actions[bot]
db14cc3005
Sync latest Python releases (#14339)
Automated update for Python releases.

Co-authored-by: zanieb <2586601+zanieb@users.noreply.github.com>
2025-06-28 03:08:53 +00:00
Charlie Marsh
731689e503
Apply build constraints when resolving --with dependencies (#14340)
## Summary

We were applying these at install time, but not resolve time.
2025-06-28 01:39:35 +00:00
Zanie Blue
b6b7409d13
Bump version to 0.7.16 (#14334)
2025-06-27 16:46:36 -05:00
Aaron Ang
eab938b7b4
Warn users on ~= python version specifier (#14008)
Close #7426

## Summary

Picking up on #8284, I noticed that the `requires_python` object already
has its specifiers canonicalized in the `intersection` method, meaning
`~=3.12` is converted to `>=3.12, <4`. To fix this, we check and warn in
`intersection`.
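
A simplified, string-level sketch of the kind of warning added; the real check operates on parsed specifiers inside `intersection`, where `~=3.12` has already become `>=3.12, <4`.

```rust
/// String-level stand-in for the check on `requires-python` specifiers.
fn warn_on_tilde_equals(requires_python: &str) {
    for specifier in requires_python.split(',') {
        if let Some(version) = specifier.trim().strip_prefix("~=") {
            eprintln!(
                "warning: `requires-python = \"~={}\"` uses the compatible release operator; \
                 note that `~=3.12` means `>=3.12, <4`, not `>=3.12, <3.13`",
                version.trim()
            );
        }
    }
}

fn main() {
    warn_on_tilde_equals("~=3.12");
}
```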

## Test Plan

Used the same tests from #8284.
2025-06-27 15:48:41 -05:00
Charlie Marsh
6a5d2f1ec4
Share workspace cache between lock and sync operations (#14321)
## Summary

Closes #14316.
2025-06-27 14:48:40 -04:00
Charlie Marsh
4eef79e5e8
Avoid rendering desugared prefix matches in error messages (#14195)
## Summary

When the user provides a requirement like `==2.4.*`, we desugar that to
`>=2.4.dev0,<2.5.dev0`. These bounds then appear in error messages, and
worse, they also trick the error message reporter into thinking that the
user asked for a pre-release.

This PR adds logic to convert to the more-concise `==2.4.*`
representation when possible. We could probably do a similar thing for
the compatible release operator (`~=`).
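
A hedged sketch of the round-trip with simplified types (uv operates on its own version and range representation): recognize the desugared `>= X.Y.dev0, < X.(Y+1).dev0` pair and print it as `== X.Y.*` again.

```rust
/// An `X.Y.dev0` bound in the desugared range.
#[derive(PartialEq, Eq, Clone, Copy)]
struct Dev0(u32, u32);

fn render_range(lower_inclusive: Dev0, upper_exclusive: Dev0) -> String {
    let Dev0(major, minor) = lower_inclusive;
    if upper_exclusive == Dev0(major, minor + 1) {
        // This is exactly what `== X.Y.*` desugars to, so show the sugar.
        format!("=={major}.{minor}.*")
    } else {
        format!(
            ">={major}.{minor}.dev0,<{}.{}.dev0",
            upper_exclusive.0, upper_exclusive.1
        )
    }
}

fn main() {
    assert_eq!(render_range(Dev0(2, 4), Dev0(2, 5)), "==2.4.*");
    assert_eq!(render_range(Dev0(2, 4), Dev0(3, 0)), ">=2.4.dev0,<3.0.dev0");
}
```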

Closes https://github.com/astral-sh/uv/issues/14177.

Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-06-27 18:06:19 +00:00
John Mumm
f892b8564f
Return Cow from UrlString::with_ methods (#14319)
A minor performance improvement as a follow-up to #14245 (and an
accompanying test).
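
The pattern in miniature, with a hypothetical helper standing in for the `UrlString::with_*` methods: return `Cow::Borrowed` when no change is needed, so the common case allocates nothing.

```rust
use std::borrow::Cow;

fn with_trailing_slash(url: &str) -> Cow<'_, str> {
    if url.ends_with('/') {
        Cow::Borrowed(url)
    } else {
        Cow::Owned(format!("{url}/"))
    }
}

fn main() {
    assert!(matches!(with_trailing_slash("https://example.com/simple/"), Cow::Borrowed(_)));
    assert!(matches!(with_trailing_slash("https://example.com/simple"), Cow::Owned(_)));
}
```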
2025-06-27 13:54:52 -04:00
Zanie Blue
74468dac15
Bump python-build-standalone releases to include 3.14.0b3 (#14301)
See the `20250626` python-build-standalone release.
2025-06-27 12:36:07 -05:00
John Mumm
880c5e4949
Ensure preview default Python installs are upgradeable (#14261)
Python `bin` installations created with `uv python install --default
--preview` (no version specified) were not being installed as
upgradeable. Instead, each link pointed at the highest patch version
for its minor version. This change ensures that these preview default
installations are also treated as upgradeable.

The PR includes some updates to the related tests. First, it checks the
default install without specified version case. Second, since it's
adding more read link checks, it creates a new `read_link` helper method
to consolidate repeated logic and replace instances of
`#[cfg(unix/windows)]` with `if cfg!(unix/windows)`.
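
A tiny illustration of that swap (hypothetical helper, not the PR's `read_link`): `cfg!` expands to an ordinary boolean, so both branches still compile on every platform, whereas `#[cfg]` removes the other branch entirely.

```rust
fn read_link_target(path: &std::path::Path) -> std::io::Result<std::path::PathBuf> {
    if cfg!(unix) {
        // On Unix the installed executable is a symlink; read its target.
        std::fs::read_link(path)
    } else {
        // Placeholder for the Windows side, where a real helper would inspect
        // the launcher instead of a symlink.
        Ok(path.to_path_buf())
    }
}

fn main() {
    let _ = read_link_target(std::path::Path::new(".local/bin/python3"));
}
```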

Fixes #14247
2025-06-27 19:26:28 +02:00
John Mumm
5754f2f2db
Normalize index URLs to remove trailing slash (#14245)
This PR updates `IndexUrl` parsing to normalize non-file URLs by
removing trailing slashes. It also normalizes registry source URLs when
using them to validate the lockfile.

Prior to this change, when writing an index URL to the lockfile, uv
would use a trailing slash if present in the provided URL and no
trailing slash otherwise. This can cause surprising behavior. For
example, `uv lock --locked` will fail when a package is added with an
`--index` value without a trailing slash and then `uv lock --locked` is
run with a `pyproject.toml` version of the index URL that contains a
trailing slash. This PR fixes this and adds a test for the scenario.

It might be safe to normalize file URLs in the same way, but since
slashes have a well-defined meaning in the context of files and
directories, I chose not to normalize them here.
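
A hedged sketch of the normalization using the `url` crate (uv's `IndexUrl` parsing is the real home for this logic): strip the trailing slash from non-`file` index URLs so both spellings compare, and lock, identically.

```rust
use url::Url;

fn normalize_index_url(raw: &str) -> Result<Url, url::ParseError> {
    let mut url = Url::parse(raw)?;
    if url.scheme() != "file" {
        // Trailing slashes carry no meaning for registry indexes, so drop them.
        let trimmed = url.path().trim_end_matches('/').to_string();
        url.set_path(&trimmed);
    }
    Ok(url)
}

fn main() -> Result<(), url::ParseError> {
    assert_eq!(
        normalize_index_url("https://example.com/simple/")?.as_str(),
        normalize_index_url("https://example.com/simple")?.as_str(),
    );
    Ok(())
}
```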

Closes #13707.
2025-06-27 17:11:21 +02:00
John Mumm
a824468c8b
Respect URL-encoded credentials in redirect location (#14315)
uv currently ignores URL-encoded credentials in a redirect location.
This PR adds a check for these credentials to the redirect handling
logic. If found, they are moved to the Authorization header in the
redirect request.
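
An illustrative helper (not uv's actual redirect code) showing the shape of the change with the `url` crate: strip credentials out of the `Location` URL and hand them back so the caller can attach an `Authorization` header to the follow-up request.

```rust
use url::Url;

fn split_redirect_credentials(
    location: &str,
) -> Result<(Url, Option<(String, Option<String>)>), url::ParseError> {
    let mut url = Url::parse(location)?;
    if url.username().is_empty() && url.password().is_none() {
        return Ok((url, None));
    }
    // Real code would percent-decode these before building the header value.
    let credentials = (url.username().to_string(), url.password().map(str::to_string));
    let _ = url.set_username("");
    let _ = url.set_password(None);
    Ok((url, Some(credentials)))
}

fn main() -> Result<(), url::ParseError> {
    let (url, creds) = split_redirect_credentials("https://user:s3cret@example.com/simple/pkg/")?;
    assert_eq!(url.as_str(), "https://example.com/simple/pkg/");
    assert_eq!(creds, Some(("user".to_string(), Some("s3cret".to_string()))));
    Ok(())
}
```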

Closes #11097
2025-06-27 16:41:14 +02:00
Charlie Marsh
56266447e2
Bump MSRV and rust-toolchain version (#14303)
## Summary

Per our versioning policy, we stay two versions back (and 1.88 was
released today).
2025-06-27 10:27:45 -04:00
Jack O'Connor
efc361223c move the test buckets dir into the canonicalized temp dir
Previously we were using the XDG data dir to avoid symlinks, but there's no
particular guarantee that that's not going to be a symlink too. Using the
canonicalized temp dir by default is also slightly nicer for a couple reasons:
It's sometimes faster (an in-memory tempfs on e.g. Arch), and it makes
overriding `$TMPDIR` or `%TMP%` sufficient to control where tests put temp
files, without needing to override `UV_INTERNAL__TEST_DIR` too.
2025-06-26 14:56:20 -07:00
Zanie Blue
9ee34dc69b
Fix Indexes::new doc (#14293) 2025-06-26 20:40:40 +00:00
Charlie Marsh
326e4497da
Allow local indexes to reference remote files (#14294)
## Summary

Previously, we assumed that local indexes only referenced local files.
However, it's fine for a local index (like, a `file://`-based Simple
API) to reference a remote file, and in fact Pyodide operates this way.

Closes https://github.com/astral-sh/uv/issues/14227.

## Test Plan

Ran `UV_INDEX=$(pyodide config get package_index) cargo run add anyio`,
which produced this lockfile:

```toml
version = 1
revision = 2
requires-python = ">=3.13.2"

[[package]]
name = "anyio"
version = "4.9.0"
source = { registry = "../../../Library/Caches/.pyodide-xbuildenv-0.30.5/0.27.7/xbuildenv/pyodide-root/package_index" }
dependencies = [
    { name = "idna" },
    { name = "sniffio" },
]
wheels = [
    { url = "https://cdn.jsdelivr.net/pyodide/v0.27.7/full/anyio-4.9.0-py3-none-any.whl", hash = "sha256:e1d9180d4361fd71d1bc4a7007fea6cae1d18792dba9d07eaad89f2a8562f71c" },
]

[[package]]
name = "foo"
version = "0.1.0"
source = { virtual = "." }
dependencies = [
    { name = "anyio" },
]

[package.metadata]
requires-dist = [{ name = "anyio", specifier = ">=4.9.0" }]

[[package]]
name = "idna"
version = "3.7"
source = { registry = "../../../Library/Caches/.pyodide-xbuildenv-0.30.5/0.27.7/xbuildenv/pyodide-root/package_index" }
wheels = [
    { url = "https://cdn.jsdelivr.net/pyodide/v0.27.7/full/idna-3.7-py3-none-any.whl", hash = "sha256:9d4685891e3e37434e09b1becda7e96a284e660c7aea9222564d88b6c3527c09" },
]

[[package]]
name = "sniffio"
version = "1.3.1"
source = { registry = "../../../Library/Caches/.pyodide-xbuildenv-0.30.5/0.27.7/xbuildenv/pyodide-root/package_index" }
wheels = [
    { url = "https://cdn.jsdelivr.net/pyodide/v0.27.7/full/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:9215f9917b34fc73152b134a3fc0a2eb0e4a49b0b956100cad75e84943412bb9" },
]
```
2025-06-26 20:17:42 +00:00
Charlie Marsh
05ab266200
Avoid using path URL for workspace Git dependencies in requirements.txt (#14288)
## Summary

Closes https://github.com/astral-sh/uv/issues/13020.
2025-06-26 19:48:12 +00:00
Charlie Marsh
c291d4329a
Include path or URL when failing to convert in lockfile (#14292)
## Summary

E.g., in #14227, we now get:

```
error: Failed to convert URL to path: https://cdn.jsdelivr.net/pyodide/v0.27.7/full/sniffio-1.3.1-py3-none-any.whl
```
2025-06-26 19:42:04 +00:00
Jack O'Connor
d4d6ede23b Lock the source tree when running setuptools, to protect concurrent builds
Fixes https://github.com/astral-sh/uv/issues/13703
2025-06-26 12:28:15 -07:00
Jack O'Connor
60528e3e25 Annotate LockedFile with #[must_use]
Standard lock guards have the same annotation, because creating them
without binding them to a local variable is almost always a mistake.
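
A minimal illustration of the annotation on a simplified stand-in type: dropping the guard releases the lock, so constructing one without binding it should produce a compiler warning.

```rust
#[must_use = "the lock is released as soon as this guard is dropped"]
struct LockedFile {
    _file: std::fs::File,
}

fn acquire(path: &std::path::Path) -> std::io::Result<LockedFile> {
    // Placeholder: a real implementation would take an advisory file lock here.
    Ok(LockedFile { _file: std::fs::File::open(path)? })
}

fn main() -> std::io::Result<()> {
    // Writing `acquire(...)?;` without a binding would trigger the
    // `unused_must_use` warning; binding keeps the lock for the whole scope.
    let _guard = acquire(std::path::Path::new("Cargo.toml"))?;
    Ok(())
}
```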
2025-06-26 12:28:15 -07:00
Zanie Blue
1ff8fc0947
Use Flit instead of Poetry for uninstall tests (#14285)
Investigating https://github.com/astral-sh/uv/issues/14158
2025-06-26 18:09:04 +00:00
Zanie Blue
8c27c2b494
Add verbose output on flake for run_groups_requires_python (#14275)
See https://github.com/astral-sh/uv/issues/14160

Same as https://github.com/astral-sh/uv/pull/13817
2025-06-26 12:11:34 -05:00
Zanie Blue
d27cec78b4
Restore snapshot for sync_dry_run (#14274)
In addition to our flake catch, keep a snapshot.

Extends https://github.com/astral-sh/uv/pull/13817
2025-06-26 16:23:37 +00:00
Aria Desires
1e02008d8b
add more proper docker login if (#14278) 2025-06-26 12:05:45 -04:00
Zanie Blue
469246d177
Fix emit_index_annotation_multiple_indexes test case (#14277)
uv is taken on Test PyPI now, so the existing test fails
2025-06-26 14:58:24 +00:00
John Mumm
a27e60a22f
Temporarily disable Artifactory registry test (#14276)
I'm waiting on a response to get our subscription back up. Then I can
re-enable this. But for now, this would cause failing CI tests.
2025-06-26 09:47:18 -05:00
Daniel Vianna
4b348512c2
GCP Artifact Registry download URLs must have /simple path (#14251)
2025-06-25 17:35:41 +02:00
John Mumm
4ed9c5791b
Bump version to 0.7.15 (#14254)
2025-06-25 12:06:41 +02:00
John Mumm
177df19f30
Add check for using minor version link when creating a venv on Windows (#14252)
There was a regression introduced in #13954 on Windows where creating a
venv behaved as if there was a minor version link even if none existed.
This PR adds a check to fix this.

Closes #14249.
2025-06-25 10:12:32 +02:00
John Mumm
5b2c3595a7
Require disambiguated relative paths for --index (#14152)
We do not currently support passing index names to `--index` for
installing packages. However, we do accept relative paths that can look
like index names. This PR adds the requirement that `--index` values
must be disambiguated with a prefix (`./` or `../` on Unix and Windows,
or `.\\` or `..\\` on Windows). For now, if an ambiguous value is
provided, uv will warn that it will not be supported in the future.

Currently, if you provide an index name like `--index test` when there
is no `test` directory, uv will error with a `Directory not found...`
error. That's not very informative if you thought index names were
supported. The new warning makes the context clearer.
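
As a rough illustration (the package and directory names here are hypothetical, not from this change), the difference looks like:

```
# Ambiguous: "my-index" reads like an index name, but uv treats it as a
# relative path today and will warn that this form won't be supported.
uv pip install flask --index my-index

# Disambiguated: the ./ prefix makes it explicit that this is a relative
# path to a local index directory.
uv pip install flask --index ./my-index
```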

Closes #13921
2025-06-25 10:02:06 +02:00
konsti
283323a78a
Allow symlinks in the build backend (#14212)
In workspaces with multiple packages, you usually don't want to include
shared files such as the license repeatedly. Instead, we now read from
symlinked files. This would already be supported if we used std's `is_file`
and read methods, but walkdir's `is_file` does not consider symlinked
files as files.

See https://github.com/astral-sh/uv/issues/3957#issuecomment-2994675003
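
For context, a sketch of the kind of layout this enables (paths are illustrative, not taken from this repository):

```
# Share the workspace-level LICENSE into a member package via a symlink
# instead of copying it into every package.
cd packages/member-a
ln -s ../../LICENSE LICENSE

# With this change, the uv build backend follows the symlink and includes
# the file's contents when building the distribution.
uv build
```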
2025-06-25 07:44:22 +00:00
ya7010
ac788d7cde
Update schemars 1.0.0 (#13693)

## Summary
Update to [schemars
0.9.0](https://github.com/GREsau/schemars/releases/tag/v0.9.0).

There are differences in the generated JSON Schema, and I will [contact
the author](https://github.com/GREsau/schemars/issues/407).

Co-authored-by: konstin <konstin@mailbox.org>
2025-06-24 21:43:31 +02:00
Christopher Tee
9fba7a4768
Consistently use Ordering::Relaxed for standalone atomic use cases (#14190) 2025-06-24 12:30:26 -07:00
Christopher Tee
fe11ceedfa
Skip GitHub fast path when rate-limited (#13033) 2025-06-24 12:11:41 -07:00
dmitry-bychkov
61265b0c14
Add a link to PyPI FAQ to clarify what per-project token is. (#14242)

## Summary

This change adds a link to the PyPI FAQ about API tokens on the package
publishing guide page. To me it wasn't clear what was meant in this
section of the docs, and it required a bit of research. Adding an
explicit link might help beginners.

Co-authored-by: Dmitry Bychkov <dbychkov@alarislabs.com>
2025-06-24 11:56:36 -04:00
Charlie Marsh
606633d35f
Remove wheel filename benchmark (#14240)
## Summary

This benchmark flakes often, and we don't really need to monitor it
continuously. We can always revive it from Git history later.

Closes https://github.com/astral-sh/uv/issues/13952.
2025-06-24 11:54:12 -04:00
konsti
f20659e1ce
Don't log GitHub fast path usage if it's cached (#14235)
Don't log that we resolved a reference through the GitHub fast path when
we didn't hit GitHub at all and instead used the cached revision. This
avoids claiming that the fast path works when it's actually blocked for
unrelated reasons (e.g., rate limits).
2025-06-24 11:53:10 -04:00
Charlie Marsh
093e9d6ff0
Add "python-eol" feature to Sphinx tests (#14241)
## Summary

Closes https://github.com/astral-sh/uv/issues/14228.
2025-06-24 11:15:18 -04:00
Ben Beasley
19c58c7fbb
Update wiremock to 0.6.4 (#14238)

## Summary

In e10881d49c, `uv` started using a fork
of the `wiremock` crate, https://github.com/astral-sh/wiremock-rs, with the
companion PR
https://github.com/LukeMathWalker/wiremock-rs/pull/159. That PR was
included in `wiremock` 0.6.4, so this PR switches back to the crates.io
version of `wiremock`, with a minimum version of 0.6.4.

## Test Plan

```
$ cargo run python install
$ cargo test
```
2025-06-24 13:04:55 +00:00
Charlie Marsh
aa2448ef83
Strip query parameters when parsing source URL (#14224)
## Summary

Closes https://github.com/astral-sh/uv/issues/14217.
2025-06-23 14:52:07 -04:00
Charlie Marsh
d9351d52fc
Remove wheel filename-from URL conversion (#14223)
## Summary

This appears to be unused.
2025-06-23 14:26:14 -04:00
168 changed files with 7074 additions and 3057 deletions

View file

@ -1,4 +1,4 @@
[profile.default] [profile.default]
# Mark tests that take longer than 10s as slow. # Mark tests that take longer than 10s as slow.
# Terminate after 90s as a stop-gap measure to terminate on deadlock. # Terminate after 120s as a stop-gap measure to terminate on deadlock.
slow-timeout = { period = "10s", terminate-after = 9 } slow-timeout = { period = "10s", terminate-after = 12 }

View file

@ -54,7 +54,7 @@ jobs:
- name: "Prep README.md" - name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi run: python scripts/transform_readme.py --target pypi
- name: "Build sdist" - name: "Build sdist"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
command: sdist command: sdist
args: --out dist args: --out dist
@ -74,7 +74,7 @@ jobs:
# uv-build # uv-build
- name: "Build sdist uv-build" - name: "Build sdist uv-build"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
command: sdist command: sdist
args: --out crates/uv-build/dist -m crates/uv-build/Cargo.toml args: --out crates/uv-build/dist -m crates/uv-build/Cargo.toml
@ -103,7 +103,7 @@ jobs:
# uv # uv
- name: "Build wheels - x86_64" - name: "Build wheels - x86_64"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: x86_64 target: x86_64
args: --release --locked --out dist --features self-update args: --release --locked --out dist --features self-update
@ -133,7 +133,7 @@ jobs:
# uv-build # uv-build
- name: "Build wheels uv-build - x86_64" - name: "Build wheels uv-build - x86_64"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: x86_64 target: x86_64
args: --profile minimal-size --locked --out crates/uv-build/dist -m crates/uv-build/Cargo.toml args: --profile minimal-size --locked --out crates/uv-build/dist -m crates/uv-build/Cargo.toml
@ -157,7 +157,7 @@ jobs:
# uv # uv
- name: "Build wheels - aarch64" - name: "Build wheels - aarch64"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: aarch64 target: aarch64
args: --release --locked --out dist --features self-update args: --release --locked --out dist --features self-update
@ -193,7 +193,7 @@ jobs:
# uv-build # uv-build
- name: "Build wheels uv-build - aarch64" - name: "Build wheels uv-build - aarch64"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: aarch64 target: aarch64
args: --profile minimal-size --locked --out crates/uv-build/dist -m crates/uv-build/Cargo.toml args: --profile minimal-size --locked --out crates/uv-build/dist -m crates/uv-build/Cargo.toml
@ -231,7 +231,7 @@ jobs:
# uv # uv
- name: "Build wheels" - name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.platform.target }} target: ${{ matrix.platform.target }}
args: --release --locked --out dist --features self-update,windows-gui-bin args: --release --locked --out dist --features self-update,windows-gui-bin
@ -267,7 +267,7 @@ jobs:
# uv-build # uv-build
- name: "Build wheels uv-build" - name: "Build wheels uv-build"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.platform.target }} target: ${{ matrix.platform.target }}
args: --profile minimal-size --locked --out crates/uv-build/dist -m crates/uv-build/Cargo.toml args: --profile minimal-size --locked --out crates/uv-build/dist -m crates/uv-build/Cargo.toml
@ -303,7 +303,7 @@ jobs:
# uv # uv
- name: "Build wheels" - name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.target }} target: ${{ matrix.target }}
# Generally, we try to build in a target docker container. In this case however, a # Generally, we try to build in a target docker container. In this case however, a
@ -368,7 +368,7 @@ jobs:
# uv-build # uv-build
- name: "Build wheels uv-build" - name: "Build wheels uv-build"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.target }} target: ${{ matrix.target }}
manylinux: auto manylinux: auto
@ -412,7 +412,7 @@ jobs:
# uv # uv
- name: "Build wheels" - name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.platform.target }} target: ${{ matrix.platform.target }}
# On `aarch64`, use `manylinux: 2_28`; otherwise, use `manylinux: auto`. # On `aarch64`, use `manylinux: 2_28`; otherwise, use `manylinux: auto`.
@ -461,7 +461,7 @@ jobs:
# uv-build # uv-build
- name: "Build wheels uv-build" - name: "Build wheels uv-build"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.platform.target }} target: ${{ matrix.platform.target }}
# On `aarch64`, use `manylinux: 2_28`; otherwise, use `manylinux: auto`. # On `aarch64`, use `manylinux: 2_28`; otherwise, use `manylinux: auto`.
@ -509,7 +509,7 @@ jobs:
# uv # uv
- name: "Build wheels" - name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.platform.target }} target: ${{ matrix.platform.target }}
manylinux: auto manylinux: auto
@ -561,7 +561,7 @@ jobs:
# uv-build # uv-build
- name: "Build wheels uv-build" - name: "Build wheels uv-build"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.platform.target }} target: ${{ matrix.platform.target }}
manylinux: auto manylinux: auto
@ -614,7 +614,7 @@ jobs:
# uv # uv
- name: "Build wheels" - name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.platform.target }} target: ${{ matrix.platform.target }}
manylinux: auto manylinux: auto
@ -671,7 +671,7 @@ jobs:
# uv-build # uv-build
- name: "Build wheels uv-build" - name: "Build wheels uv-build"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.platform.target }} target: ${{ matrix.platform.target }}
manylinux: auto manylinux: auto
@ -712,7 +712,7 @@ jobs:
# uv # uv
- name: "Build wheels" - name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.platform.target }} target: ${{ matrix.platform.target }}
manylinux: auto manylinux: auto
@ -761,7 +761,7 @@ jobs:
# uv-build # uv-build
- name: "Build wheels uv-build" - name: "Build wheels uv-build"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.platform.target }} target: ${{ matrix.platform.target }}
manylinux: auto manylinux: auto
@ -807,7 +807,7 @@ jobs:
# uv # uv
- name: "Build wheels" - name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.target }} target: ${{ matrix.target }}
manylinux: musllinux_1_1 manylinux: musllinux_1_1
@ -854,7 +854,7 @@ jobs:
# uv-build # uv-build
- name: "Build wheels uv-build" - name: "Build wheels uv-build"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.target }} target: ${{ matrix.target }}
manylinux: musllinux_1_1 manylinux: musllinux_1_1
@ -901,7 +901,7 @@ jobs:
# uv # uv
- name: "Build wheels" - name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.platform.target }} target: ${{ matrix.platform.target }}
manylinux: musllinux_1_1 manylinux: musllinux_1_1
@ -966,7 +966,7 @@ jobs:
# uv-build # uv-build
- name: "Build wheels" - name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1 uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with: with:
target: ${{ matrix.platform.target }} target: ${{ matrix.platform.target }}
manylinux: musllinux_1_1 manylinux: musllinux_1_1

View file

@ -45,6 +45,7 @@ jobs:
name: plan name: plan
runs-on: ubuntu-latest runs-on: ubuntu-latest
outputs: outputs:
login: ${{ steps.plan.outputs.login }}
push: ${{ steps.plan.outputs.push }} push: ${{ steps.plan.outputs.push }}
tag: ${{ steps.plan.outputs.tag }} tag: ${{ steps.plan.outputs.tag }}
action: ${{ steps.plan.outputs.action }} action: ${{ steps.plan.outputs.action }}
@ -53,13 +54,16 @@ jobs:
env: env:
DRY_RUN: ${{ inputs.plan == '' || fromJson(inputs.plan).announcement_tag_is_implicit }} DRY_RUN: ${{ inputs.plan == '' || fromJson(inputs.plan).announcement_tag_is_implicit }}
TAG: ${{ inputs.plan != '' && fromJson(inputs.plan).announcement_tag }} TAG: ${{ inputs.plan != '' && fromJson(inputs.plan).announcement_tag }}
IS_LOCAL_PR: ${{ github.event.pull_request.head.repo.full_name == 'astral-sh/uv' }}
id: plan id: plan
run: | run: |
if [ "${{ env.DRY_RUN }}" == "false" ]; then if [ "${{ env.DRY_RUN }}" == "false" ]; then
echo "login=true" >> "$GITHUB_OUTPUT"
echo "push=true" >> "$GITHUB_OUTPUT" echo "push=true" >> "$GITHUB_OUTPUT"
echo "tag=${{ env.TAG }}" >> "$GITHUB_OUTPUT" echo "tag=${{ env.TAG }}" >> "$GITHUB_OUTPUT"
echo "action=build and publish" >> "$GITHUB_OUTPUT" echo "action=build and publish" >> "$GITHUB_OUTPUT"
else else
echo "login=${{ env.IS_LOCAL_PR }}" >> "$GITHUB_OUTPUT"
echo "push=false" >> "$GITHUB_OUTPUT" echo "push=false" >> "$GITHUB_OUTPUT"
echo "tag=dry-run" >> "$GITHUB_OUTPUT" echo "tag=dry-run" >> "$GITHUB_OUTPUT"
echo "action=build" >> "$GITHUB_OUTPUT" echo "action=build" >> "$GITHUB_OUTPUT"
@ -90,6 +94,7 @@ jobs:
# Login to DockerHub (when not pushing, it's to avoid rate-limiting) # Login to DockerHub (when not pushing, it's to avoid rate-limiting)
- uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0 - uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
if: ${{ needs.docker-plan.outputs.login == 'true' }}
with: with:
username: ${{ needs.docker-plan.outputs.push == 'true' && 'astral' || 'astralshbot' }} username: ${{ needs.docker-plan.outputs.push == 'true' && 'astral' || 'astralshbot' }}
password: ${{ needs.docker-plan.outputs.push == 'true' && secrets.DOCKERHUB_TOKEN_RW || secrets.DOCKERHUB_TOKEN_RO }} password: ${{ needs.docker-plan.outputs.push == 'true' && secrets.DOCKERHUB_TOKEN_RW || secrets.DOCKERHUB_TOKEN_RO }}
@ -132,7 +137,7 @@ jobs:
- name: Build and push by digest - name: Build and push by digest
id: build id: build
uses: depot/build-push-action@636daae76684e38c301daa0c5eca1c095b24e780 # v1.14.0 uses: depot/build-push-action@2583627a84956d07561420dcc1d0eb1f2af3fac0 # v1.15.0
with: with:
project: 7hd4vdzmw5 # astral-sh/uv project: 7hd4vdzmw5 # astral-sh/uv
context: . context: .
@ -195,6 +200,7 @@ jobs:
steps: steps:
# Login to DockerHub (when not pushing, it's to avoid rate-limiting) # Login to DockerHub (when not pushing, it's to avoid rate-limiting)
- uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0 - uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
if: ${{ needs.docker-plan.outputs.login == 'true' }}
with: with:
username: ${{ needs.docker-plan.outputs.push == 'true' && 'astral' || 'astralshbot' }} username: ${{ needs.docker-plan.outputs.push == 'true' && 'astral' || 'astralshbot' }}
password: ${{ needs.docker-plan.outputs.push == 'true' && secrets.DOCKERHUB_TOKEN_RW || secrets.DOCKERHUB_TOKEN_RO }} password: ${{ needs.docker-plan.outputs.push == 'true' && secrets.DOCKERHUB_TOKEN_RW || secrets.DOCKERHUB_TOKEN_RO }}
@ -261,7 +267,7 @@ jobs:
- name: Build and push - name: Build and push
id: build-and-push id: build-and-push
uses: depot/build-push-action@636daae76684e38c301daa0c5eca1c095b24e780 # v1.14.0 uses: depot/build-push-action@2583627a84956d07561420dcc1d0eb1f2af3fac0 # v1.15.0
with: with:
context: . context: .
project: 7hd4vdzmw5 # astral-sh/uv project: 7hd4vdzmw5 # astral-sh/uv

View file

@ -82,7 +82,7 @@ jobs:
run: rustup component add rustfmt run: rustup component add rustfmt
- name: "Install uv" - name: "Install uv"
uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0 uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "rustfmt" - name: "rustfmt"
run: cargo fmt --all --check run: cargo fmt --all --check
@ -126,7 +126,7 @@ jobs:
name: "cargo clippy | ubuntu" name: "cargo clippy | ubuntu"
steps: steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with: with:
save-if: ${{ github.ref == 'refs/heads/main' }} save-if: ${{ github.ref == 'refs/heads/main' }}
- name: "Check uv_build dependencies" - name: "Check uv_build dependencies"
@ -156,7 +156,7 @@ jobs:
run: | run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with: with:
workspaces: ${{ env.UV_WORKSPACE }} workspaces: ${{ env.UV_WORKSPACE }}
@ -175,7 +175,7 @@ jobs:
name: "cargo dev generate-all" name: "cargo dev generate-all"
steps: steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with: with:
save-if: ${{ github.ref == 'refs/heads/main' }} save-if: ${{ github.ref == 'refs/heads/main' }}
- name: "Generate all" - name: "Generate all"
@ -208,12 +208,12 @@ jobs:
- uses: rui314/setup-mold@v1 - uses: rui314/setup-mold@v1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain" - name: "Install Rust toolchain"
run: rustup show run: rustup show
- uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0 - uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "Install required Python versions" - name: "Install required Python versions"
run: uv python install run: uv python install
@ -240,12 +240,12 @@ jobs:
- uses: rui314/setup-mold@v1 - uses: rui314/setup-mold@v1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain" - name: "Install Rust toolchain"
run: rustup show run: rustup show
- uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0 - uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "Install required Python versions" - name: "Install required Python versions"
run: uv python install run: uv python install
@ -279,11 +279,11 @@ jobs:
run: | run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse
- uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0 - uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "Install required Python versions" - name: "Install required Python versions"
run: uv python install run: uv python install
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with: with:
workspaces: ${{ env.UV_WORKSPACE }} workspaces: ${{ env.UV_WORKSPACE }}
@ -332,7 +332,7 @@ jobs:
run: | run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with: with:
workspaces: ${{ env.UV_WORKSPACE }}/crates/uv-trampoline workspaces: ${{ env.UV_WORKSPACE }}/crates/uv-trampoline
@ -388,7 +388,7 @@ jobs:
- name: Copy Git Repo to Dev Drive - name: Copy Git Repo to Dev Drive
run: | run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with: with:
workspaces: ${{ env.UV_WORKSPACE }}/crates/uv-trampoline workspaces: ${{ env.UV_WORKSPACE }}/crates/uv-trampoline
- name: "Install Rust toolchain" - name: "Install Rust toolchain"
@ -430,7 +430,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with: with:
fetch-depth: 0 fetch-depth: 0
- uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0 - uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0 - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- name: "Add SSH key" - name: "Add SSH key"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }} if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
@ -443,7 +443,7 @@ jobs:
- name: "Build docs (insiders)" - name: "Build docs (insiders)"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }} if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
run: uvx --with-requirements docs/requirements.txt mkdocs build --strict -f mkdocs.insiders.yml run: uvx --with-requirements docs/requirements-insiders.txt mkdocs build --strict -f mkdocs.insiders.yml
build-binary-linux-libc: build-binary-linux-libc:
timeout-minutes: 10 timeout-minutes: 10
@ -456,7 +456,7 @@ jobs:
- uses: rui314/setup-mold@v1 - uses: rui314/setup-mold@v1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Build" - name: "Build"
run: cargo build run: cargo build
@ -470,6 +470,31 @@ jobs:
./target/debug/uvx ./target/debug/uvx
retention-days: 1 retention-days: 1
build-binary-linux-aarch64:
timeout-minutes: 10
needs: determine_changes
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
runs-on: github-ubuntu-24.04-aarch64-4
name: "build binary | linux aarch64"
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: rui314/setup-mold@v1
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Build"
run: cargo build
- name: "Upload binary"
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: uv-linux-aarch64-${{ github.sha }}
path: |
./target/debug/uv
./target/debug/uvx
retention-days: 1
build-binary-linux-musl: build-binary-linux-musl:
timeout-minutes: 10 timeout-minutes: 10
needs: determine_changes needs: determine_changes
@ -486,7 +511,7 @@ jobs:
sudo apt-get install musl-tools sudo apt-get install musl-tools
rustup target add x86_64-unknown-linux-musl rustup target add x86_64-unknown-linux-musl
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Build" - name: "Build"
run: cargo build --target x86_64-unknown-linux-musl --bin uv --bin uvx run: cargo build --target x86_64-unknown-linux-musl --bin uv --bin uvx
@ -511,7 +536,7 @@ jobs:
- uses: rui314/setup-mold@v1 - uses: rui314/setup-mold@v1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Build" - name: "Build"
run: cargo build --bin uv --bin uvx run: cargo build --bin uv --bin uvx
@ -535,7 +560,7 @@ jobs:
- uses: rui314/setup-mold@v1 - uses: rui314/setup-mold@v1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Build" - name: "Build"
run: cargo build --bin uv --bin uvx run: cargo build --bin uv --bin uvx
@ -565,7 +590,7 @@ jobs:
run: | run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with: with:
workspaces: ${{ env.UV_WORKSPACE }} workspaces: ${{ env.UV_WORKSPACE }}
@ -600,7 +625,7 @@ jobs:
run: | run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with: with:
workspaces: ${{ env.UV_WORKSPACE }} workspaces: ${{ env.UV_WORKSPACE }}
@ -637,7 +662,7 @@ jobs:
run: rustup default ${{ steps.msrv.outputs.value }} run: rustup default ${{ steps.msrv.outputs.value }}
- name: "Install mold" - name: "Install mold"
uses: rui314/setup-mold@v1 uses: rui314/setup-mold@v1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- run: cargo +${{ steps.msrv.outputs.value }} build - run: cargo +${{ steps.msrv.outputs.value }} build
- run: ./target/debug/uv --version - run: ./target/debug/uv --version
@ -650,7 +675,7 @@ jobs:
steps: steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Cross build" - name: "Cross build"
run: | run: |
# Install cross from `freebsd-firecracker` # Install cross from `freebsd-firecracker`
@ -661,7 +686,7 @@ jobs:
cross build --target x86_64-unknown-freebsd cross build --target x86_64-unknown-freebsd
- name: Test in Firecracker VM - name: Test in Firecracker VM
uses: acj/freebsd-firecracker-action@6c57bda7113c2f137ef00d54512d61ae9d64365b # v0.5.0 uses: acj/freebsd-firecracker-action@136ca0bce2adade21e526ceb07db643ad23dd2dd # v0.5.1
with: with:
verbose: false verbose: false
checkout: false checkout: false
@ -770,6 +795,33 @@ jobs:
eval "$(./uv generate-shell-completion bash)" eval "$(./uv generate-shell-completion bash)"
eval "$(./uvx --generate-shell-completion bash)" eval "$(./uvx --generate-shell-completion bash)"
smoke-test-linux-aarch64:
timeout-minutes: 10
needs: build-binary-linux-aarch64
name: "smoke test | linux aarch64"
runs-on: github-ubuntu-24.04-aarch64-2
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: "Download binary"
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
name: uv-linux-aarch64-${{ github.sha }}
- name: "Prepare binary"
run: |
chmod +x ./uv
chmod +x ./uvx
- name: "Smoke test"
run: |
./uv run scripts/smoke-test
- name: "Test shell completions"
run: |
eval "$(./uv generate-shell-completion bash)"
eval "$(./uvx --generate-shell-completion bash)"
smoke-test-linux-musl: smoke-test-linux-musl:
timeout-minutes: 10 timeout-minutes: 10
needs: build-binary-linux-musl needs: build-binary-linux-musl
@ -852,7 +904,7 @@ jobs:
timeout-minutes: 10 timeout-minutes: 10
needs: build-binary-windows-aarch64 needs: build-binary-windows-aarch64
name: "smoke test | windows aarch64" name: "smoke test | windows aarch64"
runs-on: github-windows-11-aarch64-4 runs-on: windows-11-arm
steps: steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
@ -1000,6 +1052,96 @@ jobs:
./uv run python -c "" ./uv run python -c ""
./uv run -p 3.13t python -c "" ./uv run -p 3.13t python -c ""
integration-test-windows-aarch64-implicit:
timeout-minutes: 10
needs: build-binary-windows-aarch64
name: "integration test | aarch64 windows implicit"
runs-on: windows-11-arm
steps:
- name: "Download binary"
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
name: uv-windows-aarch64-${{ github.sha }}
- name: "Install Python via uv (implicitly select x64)"
run: |
./uv python install -v 3.13
- name: "Create a virtual environment (stdlib)"
run: |
& (./uv python find 3.13) -m venv .venv
- name: "Check version (stdlib)"
run: |
.venv/Scripts/python --version
- name: "Create a virtual environment (uv)"
run: |
./uv venv -p 3.13 --managed-python
- name: "Check version (uv)"
run: |
.venv/Scripts/python --version
- name: "Check is x64"
run: |
.venv/Scripts/python -c "import sys; exit(1) if 'AMD64' not in sys.version else exit(0)"
- name: "Check install"
run: |
./uv pip install -v anyio
- name: "Check uv run"
run: |
./uv run python -c ""
./uv run -p 3.13 python -c ""
integration-test-windows-aarch64-explicit:
timeout-minutes: 10
needs: build-binary-windows-aarch64
name: "integration test | aarch64 windows explicit"
runs-on: windows-11-arm
steps:
- name: "Download binary"
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
name: uv-windows-aarch64-${{ github.sha }}
- name: "Install Python via uv (explicitly select aarch64)"
run: |
./uv python install -v cpython-3.13-windows-aarch64-none
- name: "Create a virtual environment (stdlib)"
run: |
& (./uv python find 3.13) -m venv .venv
- name: "Check version (stdlib)"
run: |
.venv/Scripts/python --version
- name: "Create a virtual environment (uv)"
run: |
./uv venv -p 3.13 --managed-python
- name: "Check version (uv)"
run: |
.venv/Scripts/python --version
- name: "Check is NOT x64"
run: |
.venv/Scripts/python -c "import sys; exit(1) if 'AMD64' in sys.version else exit(0)"
- name: "Check install"
run: |
./uv pip install -v anyio
- name: "Check uv run"
run: |
./uv run python -c ""
./uv run -p 3.13 python -c ""
integration-test-pypy-linux: integration-test-pypy-linux:
timeout-minutes: 10 timeout-minutes: 10
needs: build-binary-linux-libc needs: build-binary-linux-libc
@ -1443,7 +1585,7 @@ jobs:
run: chmod +x ./uv run: chmod +x ./uv
- name: "Configure AWS credentials" - name: "Configure AWS credentials"
uses: aws-actions/configure-aws-credentials@3bb878b6ab43ba8717918141cd07a0ea68cfe7ea uses: aws-actions/configure-aws-credentials@f503a1870408dcf2c35d5c2b8a68e69211042c7d
with: with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }} aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }} aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
@ -2072,7 +2214,7 @@ jobs:
timeout-minutes: 10 timeout-minutes: 10
needs: build-binary-windows-aarch64 needs: build-binary-windows-aarch64
name: "check system | x86-64 python3.13 on windows aarch64" name: "check system | x86-64 python3.13 on windows aarch64"
runs-on: github-windows-11-aarch64-4 runs-on: windows-11-arm
steps: steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
@ -2090,6 +2232,28 @@ jobs:
- name: "Validate global Python install" - name: "Validate global Python install"
run: py -3.13 ./scripts/check_system_python.py --uv ./uv.exe run: py -3.13 ./scripts/check_system_python.py --uv ./uv.exe
system-test-windows-aarch64-aarch64-python-313:
timeout-minutes: 10
needs: build-binary-windows-aarch64
name: "check system | aarch64 python3.13 on windows aarch64"
runs-on: windows-11-arm
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: "3.13"
architecture: "arm64"
allow-prereleases: true
- name: "Download binary"
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
name: uv-windows-aarch64-${{ github.sha }}
- name: "Validate global Python install"
run: py -3.13 ./scripts/check_system_python.py --uv ./uv.exe
# Test our PEP 514 integration that installs Python into the Windows registry. # Test our PEP 514 integration that installs Python into the Windows registry.
system-test-windows-registry: system-test-windows-registry:
timeout-minutes: 10 timeout-minutes: 10
@ -2337,7 +2501,7 @@ jobs:
- name: "Checkout Branch" - name: "Checkout Branch"
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain" - name: "Install Rust toolchain"
run: rustup show run: rustup show
@ -2374,7 +2538,7 @@ jobs:
- name: "Checkout Branch" - name: "Checkout Branch"
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8 - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain" - name: "Install Rust toolchain"
run: rustup show run: rustup show

View file

@ -22,7 +22,7 @@ jobs:
id-token: write id-token: write
steps: steps:
- name: "Install uv" - name: "Install uv"
uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0 uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0 - uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with: with:
pattern: wheels_uv-* pattern: wheels_uv-*
@ -43,7 +43,7 @@ jobs:
id-token: write id-token: write
steps: steps:
- name: "Install uv" - name: "Install uv"
uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0 uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0 - uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with: with:
pattern: wheels_uv_build-* pattern: wheels_uv_build-*

View file

@ -85,7 +85,6 @@ Write-Output `
"DEV_DRIVE=$($Drive)" ` "DEV_DRIVE=$($Drive)" `
"TMP=$($Tmp)" ` "TMP=$($Tmp)" `
"TEMP=$($Tmp)" ` "TEMP=$($Tmp)" `
"UV_INTERNAL__TEST_DIR=$($Tmp)" `
"RUSTUP_HOME=$($Drive)/.rustup" ` "RUSTUP_HOME=$($Drive)/.rustup" `
"CARGO_HOME=$($Drive)/.cargo" ` "CARGO_HOME=$($Drive)/.cargo" `
"UV_WORKSPACE=$($Drive)/uv" ` "UV_WORKSPACE=$($Drive)/uv" `

View file

@ -17,7 +17,7 @@ jobs:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0 - uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
with: with:
version: "latest" version: "latest"
enable-cache: true enable-cache: true

View file

@ -12,7 +12,7 @@ repos:
- id: validate-pyproject - id: validate-pyproject
- repo: https://github.com/crate-ci/typos - repo: https://github.com/crate-ci/typos
rev: v1.33.1 rev: v1.34.0
hooks: hooks:
- id: typos - id: typos
@ -42,7 +42,7 @@ repos:
types_or: [yaml, json5] types_or: [yaml, json5]
- repo: https://github.com/astral-sh/ruff-pre-commit - repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.11.13 rev: v0.12.2
hooks: hooks:
- id: ruff-format - id: ruff-format
- id: ruff - id: ruff

View file

@ -3,6 +3,145 @@
<!-- prettier-ignore-start --> <!-- prettier-ignore-start -->
## 0.7.19
The **[uv build backend](https://docs.astral.sh/uv/concepts/build-backend/) is now stable**, and considered ready for production use.
The uv build backend is a great choice for pure Python projects. It has reasonable defaults, with the goal of requiring zero configuration for most users, but provides flexible configuration to accommodate most Python project structures. It integrates tightly with uv, to improve messaging and user experience. It validates project metadata and structures, preventing common mistakes. And, finally, it's very fast — `uv sync` on a new project (from `uv init`) is 10-30x faster than with other build backends.
To use uv as a build backend in an existing project, add `uv_build` to the `[build-system]` section in your `pyproject.toml`:
```toml
[build-system]
requires = ["uv_build>=0.7.19,<0.8.0"]
build-backend = "uv_build"
```
In a future release, it will replace `hatchling` as the default in `uv init`. As before, uv will remain compatible with all standards-compliant build backends.
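
As a minimal sketch of the workflow, assuming the `[build-system]` table above has already been added to a project's `pyproject.toml`:

```
# Build a source distribution and wheel into dist/ using the uv build backend.
uv build

# Sync the project into its environment; the project itself is built with uv_build.
uv sync
```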
### Python
- Add PGO distributions of Python for aarch64 Linux, which are optimized for better performance
See the [python-build-standalone release](https://github.com/astral-sh/python-build-standalone/releases/tag/20250702) for more details.
### Enhancements
- Ignore Python patch version for `--universal` pip compile ([#14405](https://github.com/astral-sh/uv/pull/14405))
- Update the tilde version specifier warning to include more context ([#14335](https://github.com/astral-sh/uv/pull/14335))
- Clarify behavior and hint on tool install when no executables are available ([#14423](https://github.com/astral-sh/uv/pull/14423))
### Bug fixes
- Make project and interpreter lock acquisition non-fatal ([#14404](https://github.com/astral-sh/uv/pull/14404))
- Include `sys.prefix` in cached environment keys to avoid `--with` collisions across projects ([#14403](https://github.com/astral-sh/uv/pull/14403))
### Documentation
- Add a migration guide from pip to uv projects ([#12382](https://github.com/astral-sh/uv/pull/12382))
## 0.7.18
### Python
- Added arm64 Windows Python 3.11, 3.12, 3.13, and 3.14
These are not downloaded by default, since x86-64 Python has broader ecosystem support on Windows.
However, they can be requested with `cpython-<version>-windows-aarch64`.
See the [python-build-standalone release](https://github.com/astral-sh/python-build-standalone/releases/tag/20250630) for more details.
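
For example, mirroring the commands used in the CI workflow changes in this diff:

```
# Default on Windows aarch64: implicitly selects an x86-64 build of Python.
uv python install 3.13

# Explicitly request the native arm64 build instead:
uv python install cpython-3.13-windows-aarch64-none
```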
### Enhancements
- Keep track of retries in `ManagedPythonDownload::fetch_with_retry` ([#14378](https://github.com/astral-sh/uv/pull/14378))
- Reuse build (virtual) environments across resolution and installation ([#14338](https://github.com/astral-sh/uv/pull/14338))
- Improve trace message for cached Python interpreter query ([#14328](https://github.com/astral-sh/uv/pull/14328))
- Use parsed URLs for conflicting URL error message ([#14380](https://github.com/astral-sh/uv/pull/14380))
### Preview features
- Ignore invalid build backend settings when not building ([#14372](https://github.com/astral-sh/uv/pull/14372))
### Bug fixes
- Fix equals-star and tilde-equals with `python_version` and `python_full_version` ([#14271](https://github.com/astral-sh/uv/pull/14271))
- Include the canonical path in the interpreter query cache key ([#14331](https://github.com/astral-sh/uv/pull/14331))
- Only drop build directories on program exit ([#14304](https://github.com/astral-sh/uv/pull/14304))
- Error instead of panic on conflict between global and subcommand flags ([#14368](https://github.com/astral-sh/uv/pull/14368))
- Consistently normalize trailing slashes on URLs with no path segments ([#14349](https://github.com/astral-sh/uv/pull/14349))
### Documentation
- Add instructions for publishing to JFrog's Artifactory ([#14253](https://github.com/astral-sh/uv/pull/14253))
- Edits to the build backend documentation ([#14376](https://github.com/astral-sh/uv/pull/14376))
## 0.7.17
### Bug fixes
- Apply build constraints when resolving `--with` dependencies ([#14340](https://github.com/astral-sh/uv/pull/14340))
- Drop trailing slashes when converting index URL from URL ([#14346](https://github.com/astral-sh/uv/pull/14346))
- Ignore `UV_PYTHON_CACHE_DIR` when empty ([#14336](https://github.com/astral-sh/uv/pull/14336))
- Fix error message ordering for `pyvenv.cfg` version conflict ([#14329](https://github.com/astral-sh/uv/pull/14329))
## 0.7.16
### Python
- Add Python 3.14.0b3
See the
[`python-build-standalone` release notes](https://github.com/astral-sh/python-build-standalone/releases/tag/20250626)
for more details.
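
As a hedged example, the new pre-release can be requested explicitly by its full version string as listed above:

```
# Install the new 3.14 pre-release explicitly by its full version.
uv python install 3.14.0b3
```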
### Enhancements
- Include path or URL when failing to convert in lockfile ([#14292](https://github.com/astral-sh/uv/pull/14292))
- Warn when `~=` is used as a Python version specifier without a patch version ([#14008](https://github.com/astral-sh/uv/pull/14008))
### Preview features
- Ensure preview default Python installs are upgradeable ([#14261](https://github.com/astral-sh/uv/pull/14261))
### Performance
- Share workspace cache between lock and sync operations ([#14321](https://github.com/astral-sh/uv/pull/14321))
### Bug fixes
- Allow local indexes to reference remote files ([#14294](https://github.com/astral-sh/uv/pull/14294))
- Avoid rendering desugared prefix matches in error messages ([#14195](https://github.com/astral-sh/uv/pull/14195))
- Avoid using path URL for workspace Git dependencies in `requirements.txt` ([#14288](https://github.com/astral-sh/uv/pull/14288))
- Normalize index URLs to remove trailing slash ([#14245](https://github.com/astral-sh/uv/pull/14245))
- Respect URL-encoded credentials in redirect location ([#14315](https://github.com/astral-sh/uv/pull/14315))
- Lock the source tree when running setuptools, to protect concurrent builds ([#14174](https://github.com/astral-sh/uv/pull/14174))
### Documentation
- Note that GCP Artifact Registry download URLs must have `/simple` component ([#14251](https://github.com/astral-sh/uv/pull/14251))
## 0.7.15
### Enhancements
- Consistently use `Ordering::Relaxed` for standalone atomic use cases ([#14190](https://github.com/astral-sh/uv/pull/14190))
- Warn on ambiguous relative paths for `--index` ([#14152](https://github.com/astral-sh/uv/pull/14152))
- Skip GitHub fast path when rate-limited ([#13033](https://github.com/astral-sh/uv/pull/13033))
- Preserve newlines in `schema.json` descriptions ([#13693](https://github.com/astral-sh/uv/pull/13693))
### Bug fixes
- Add check for using minor version link when creating a venv on Windows ([#14252](https://github.com/astral-sh/uv/pull/14252))
- Strip query parameters when parsing source URL ([#14224](https://github.com/astral-sh/uv/pull/14224))
### Documentation
- Add a link to PyPI FAQ to clarify what per-project token is ([#14242](https://github.com/astral-sh/uv/pull/14242))
### Preview features
- Allow symlinks in the build backend ([#14212](https://github.com/astral-sh/uv/pull/14212))
## 0.7.14
### Enhancements

Cargo.lock

@ -94,6 +94,15 @@ version = "1.0.98"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e16d2d3311acee920a9eb8d33b8cbc1787ce4a264e85f964c2404b969bdcd487" checksum = "e16d2d3311acee920a9eb8d33b8cbc1787ce4a264e85f964c2404b969bdcd487"
[[package]]
name = "approx"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cab112f0a86d568ea0e627cc1d6be74a1e9cd55214684db5561995f6dad897c6"
dependencies = [
"num-traits",
]
[[package]] [[package]]
name = "arbitrary" name = "arbitrary"
version = "1.4.1" version = "1.4.1"
@ -180,9 +189,9 @@ dependencies = [
[[package]] [[package]]
name = "async-channel" name = "async-channel"
version = "2.3.1" version = "2.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "89b47800b0be77592da0afd425cc03468052844aff33b84e33cc696f64e77b6a" checksum = "924ed96dd52d1b75e9c1a3e6275715fd320f5f9439fb5a4a11fa51f4221158d2"
dependencies = [ dependencies = [
"concurrent-queue", "concurrent-queue",
"event-listener-strategy", "event-listener-strategy",
@ -364,6 +373,15 @@ version = "0.22.1"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "72b3254f16251a8381aa12e40e3c4d2f0199f8c6508fbecb9d91f575e0fbb8c6" checksum = "72b3254f16251a8381aa12e40e3c4d2f0199f8c6508fbecb9d91f575e0fbb8c6"
[[package]]
name = "bincode"
version = "1.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b1f45e9417d87227c7a56d22e471c6206462cba514c7590c09aff4cf6d1ddcad"
dependencies = [
"serde",
]
[[package]] [[package]]
name = "bisection" name = "bisection"
version = "0.1.0" version = "0.1.0"
@ -512,9 +530,9 @@ dependencies = [
[[package]] [[package]]
name = "cargo-util" name = "cargo-util"
version = "0.2.20" version = "0.2.21"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d767bc85f367f6483a6072430b56f5c0d6ee7636751a21a800526d0711753d76" checksum = "c95ec8b2485b20aed818bd7460f8eecc6c87c35c84191b353a3aba9aa1736c36"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"core-foundation", "core-foundation",
@ -672,22 +690,27 @@ checksum = "f46ad14479a25103f283c0f10005961cf086d8dc42205bb44c46ac563475dca6"
[[package]] [[package]]
name = "codspeed" name = "codspeed"
version = "2.10.1" version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "93f4cce9c27c49c4f101fffeebb1826f41a9df2e7498b7cd4d95c0658b796c6c" checksum = "922018102595f6668cdd09c03f4bff2d951ce2318c6dca4fe11bdcb24b65b2bf"
dependencies = [ dependencies = [
"anyhow",
"bincode",
"colored", "colored",
"glob",
"libc", "libc",
"nix 0.29.0",
"serde", "serde",
"serde_json", "serde_json",
"statrs",
"uuid", "uuid",
] ]
[[package]] [[package]]
name = "codspeed-criterion-compat" name = "codspeed-criterion-compat"
version = "2.10.1" version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c3c23d880a28a2aab52d38ca8481dd7a3187157d0a952196b6db1db3c8499725" checksum = "24d8ad82d2383cb74995f58993cbdd2914aed57b2f91f46580310dd81dc3d05a"
dependencies = [ dependencies = [
"codspeed", "codspeed",
"codspeed-criterion-compat-walltime", "codspeed-criterion-compat-walltime",
@ -696,9 +719,9 @@ dependencies = [
[[package]] [[package]]
name = "codspeed-criterion-compat-walltime" name = "codspeed-criterion-compat-walltime"
version = "2.10.1" version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7b0a2f7365e347f4f22a67e9ea689bf7bc89900a354e22e26cf8a531a42c8fbb" checksum = "61badaa6c452d192a29f8387147888f0ab358553597c3fe9bf8a162ef7c2fa64"
dependencies = [ dependencies = [
"anes", "anes",
"cast", "cast",
@ -1142,9 +1165,9 @@ dependencies = [
[[package]] [[package]]
name = "event-listener-strategy" name = "event-listener-strategy"
version = "0.5.3" version = "0.5.4"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3c3e4e0dd3673c1139bf041f3008816d9cf2946bbfac2945c09e523b8d7b05b2" checksum = "8be9f3dfaaffdae2972880079a491a1a8bb7cbed0b8dd7a347f668b4150a3b93"
dependencies = [ dependencies = [
"event-listener", "event-listener",
"pin-project-lite", "pin-project-lite",
@ -1675,7 +1698,7 @@ dependencies = [
"tokio", "tokio",
"tokio-rustls", "tokio-rustls",
"tower-service", "tower-service",
"webpki-roots", "webpki-roots 0.26.8",
] ]
[[package]] [[package]]
@ -1684,6 +1707,7 @@ version = "0.1.14"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dc2fdfdbff08affe55bb779f33b053aa1fe5dd5b54c257343c17edfa55711bdb" checksum = "dc2fdfdbff08affe55bb779f33b053aa1fe5dd5b54c257343c17edfa55711bdb"
dependencies = [ dependencies = [
"base64 0.22.1",
"bytes", "bytes",
"futures-channel", "futures-channel",
"futures-core", "futures-core",
@ -1691,7 +1715,9 @@ dependencies = [
"http", "http",
"http-body", "http-body",
"hyper", "hyper",
"ipnet",
"libc", "libc",
"percent-encoding",
"pin-project-lite", "pin-project-lite",
"socket2", "socket2",
"tokio", "tokio",
@ -1873,9 +1899,9 @@ checksum = "b72ad49b554c1728b1e83254a1b1565aea4161e28dabbfa171fc15fe62299caf"
[[package]] [[package]]
name = "indexmap" name = "indexmap"
version = "2.9.0" version = "2.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cea70ddb795996207ad57735b50c5982d8844f38ba9ee5f1aedcfb708a2aa11e" checksum = "fe4cd85333e22411419a0bcae1297d25e58c9443848b11dc6a86fefe8c78a661"
dependencies = [ dependencies = [
"equivalent", "equivalent",
"hashbrown 0.15.4", "hashbrown 0.15.4",
@ -1922,6 +1948,16 @@ version = "2.11.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "469fb0b9cefa57e3ef31275ee7cacb78f2fdca44e4765491884a2b119d4eb130" checksum = "469fb0b9cefa57e3ef31275ee7cacb78f2fdca44e4765491884a2b119d4eb130"
[[package]]
name = "iri-string"
version = "0.7.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dbc5ebe9c3a1a7a5127f920a418f7585e9e758e911d0466ed004f393b0e380b2"
dependencies = [
"memchr",
"serde",
]
[[package]] [[package]]
name = "is-terminal" name = "is-terminal"
version = "0.4.15" version = "0.4.15"
@ -2474,9 +2510,9 @@ checksum = "b15813163c1d831bf4a13c3610c05c0d03b39feb07f7e09fa234dac9b15aaf39"
[[package]] [[package]]
name = "owo-colors" name = "owo-colors"
version = "4.2.1" version = "4.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "26995317201fa17f3656c36716aed4a7c81743a9634ac4c99c0eeda495db0cec" checksum = "48dd4f4a2c8405440fd0462561f0e5806bd0f77e86f51c761481bdd4018b545e"
[[package]] [[package]]
name = "parking" name = "parking"
@ -3039,9 +3075,9 @@ dependencies = [
[[package]] [[package]]
name = "reqwest" name = "reqwest"
version = "0.12.15" version = "0.12.22"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d19c46a6fdd48bc4dab94b6103fccc55d34c67cc0ad04653aad4ea2a07cd7bbb" checksum = "cbc931937e6ca3a06e3b6c0aa7841849b160a90351d6ab467a8b9b9959767531"
dependencies = [ dependencies = [
"async-compression", "async-compression",
"base64 0.22.1", "base64 0.22.1",
@ -3056,18 +3092,14 @@ dependencies = [
"hyper", "hyper",
"hyper-rustls", "hyper-rustls",
"hyper-util", "hyper-util",
"ipnet",
"js-sys", "js-sys",
"log", "log",
"mime",
"mime_guess", "mime_guess",
"once_cell",
"percent-encoding", "percent-encoding",
"pin-project-lite", "pin-project-lite",
"quinn", "quinn",
"rustls", "rustls",
"rustls-native-certs", "rustls-native-certs",
"rustls-pemfile",
"rustls-pki-types", "rustls-pki-types",
"serde", "serde",
"serde_json", "serde_json",
@ -3075,17 +3107,16 @@ dependencies = [
"sync_wrapper", "sync_wrapper",
"tokio", "tokio",
"tokio-rustls", "tokio-rustls",
"tokio-socks",
"tokio-util", "tokio-util",
"tower", "tower",
"tower-http",
"tower-service", "tower-service",
"url", "url",
"wasm-bindgen", "wasm-bindgen",
"wasm-bindgen-futures", "wasm-bindgen-futures",
"wasm-streams", "wasm-streams",
"web-sys", "web-sys",
"webpki-roots", "webpki-roots 1.0.1",
"windows-registry 0.4.0",
] ]
[[package]] [[package]]
@ -3328,15 +3359,6 @@ dependencies = [
"security-framework", "security-framework",
] ]
[[package]]
name = "rustls-pemfile"
version = "2.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dce314e5fee3f39953d46bb63bb8a46d40c2f8fb7cc5a3b6cab2bde9721d6e50"
dependencies = [
"rustls-pki-types",
]
[[package]] [[package]]
name = "rustls-pki-types" name = "rustls-pki-types"
version = "1.11.0" version = "1.11.0"
@ -3405,11 +3427,12 @@ dependencies = [
[[package]] [[package]]
name = "schemars" name = "schemars"
version = "0.8.22" version = "1.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3fbf2ae1b8bc8e02df939598064d22402220cd5bbcca1c76f7d6a310974d5615" checksum = "82d20c4491bc164fa2f6c5d44565947a52ad80b9505d8e36f8d54c27c739fcd0"
dependencies = [ dependencies = [
"dyn-clone", "dyn-clone",
"ref-cast",
"schemars_derive", "schemars_derive",
"serde", "serde",
"serde_json", "serde_json",
@ -3418,9 +3441,9 @@ dependencies = [
[[package]] [[package]]
name = "schemars_derive" name = "schemars_derive"
version = "0.8.22" version = "1.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "32e265784ad618884abaea0600a9adf15393368d840e0222d101a072f3f7534d" checksum = "33d020396d1d138dc19f1165df7545479dcd58d93810dc5d646a16e55abefa80"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
@ -3719,6 +3742,16 @@ version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a8f112729512f8e442d81f95a8a7ddf2b7c6b8a1a6f509a95864142b30cab2d3" checksum = "a8f112729512f8e442d81f95a8a7ddf2b7c6b8a1a6f509a95864142b30cab2d3"
[[package]]
name = "statrs"
version = "0.18.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2a3fe7c28c6512e766b0874335db33c94ad7b8f9054228ae1c2abd47ce7d335e"
dependencies = [
"approx",
"num-traits",
]
[[package]] [[package]]
name = "strict-num" name = "strict-num"
version = "0.1.1" version = "0.1.1"
@ -3934,9 +3967,9 @@ dependencies = [
[[package]] [[package]]
name = "test-log" name = "test-log"
version = "0.2.17" version = "0.2.18"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e7f46083d221181166e5b6f6b1e5f1d499f3a76888826e6cb1d057554157cd0f" checksum = "1e33b98a582ea0be1168eba097538ee8dd4bbe0f2b01b22ac92ea30054e5be7b"
dependencies = [ dependencies = [
"test-log-macros", "test-log-macros",
"tracing-subscriber", "tracing-subscriber",
@ -3944,9 +3977,9 @@ dependencies = [
[[package]] [[package]]
name = "test-log-macros" name = "test-log-macros"
version = "0.2.17" version = "0.2.18"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "888d0c3c6db53c0fdab160d2ed5e12ba745383d3e85813f2ea0f2b1475ab553f" checksum = "451b374529930d7601b1eef8d32bc79ae870b6079b069401709c2a8bf9e75f36"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
@ -4138,18 +4171,6 @@ dependencies = [
"tokio", "tokio",
] ]
[[package]]
name = "tokio-socks"
version = "0.5.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0d4770b8024672c1101b3f6733eab95b18007dbe0847a8afe341fcf79e06043f"
dependencies = [
"either",
"futures-util",
"thiserror 1.0.69",
"tokio",
]
[[package]] [[package]]
name = "tokio-stream" name = "tokio-stream"
version = "0.1.17" version = "0.1.17"
@ -4232,6 +4253,24 @@ dependencies = [
"tower-service", "tower-service",
] ]
[[package]]
name = "tower-http"
version = "0.6.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "adc82fd73de2a9722ac5da747f12383d2bfdb93591ee6c58486e0097890f05f2"
dependencies = [
"bitflags 2.9.1",
"bytes",
"futures-util",
"http",
"http-body",
"iri-string",
"pin-project-lite",
"tower",
"tower-layer",
"tower-service",
]
[[package]] [[package]]
name = "tower-layer" name = "tower-layer"
version = "0.3.3" version = "0.3.3"
@ -4569,7 +4608,7 @@ dependencies = [
[[package]] [[package]]
name = "uv" name = "uv"
version = "0.7.14" version = "0.7.19"
dependencies = [ dependencies = [
"anstream", "anstream",
"anyhow", "anyhow",
@ -4583,7 +4622,6 @@ dependencies = [
"ctrlc", "ctrlc",
"dotenvy", "dotenvy",
"dunce", "dunce",
"etcetera",
"filetime", "filetime",
"flate2", "flate2",
"fs-err 3.1.1", "fs-err 3.1.1",
@ -4719,7 +4757,6 @@ dependencies = [
"uv-configuration", "uv-configuration",
"uv-dispatch", "uv-dispatch",
"uv-distribution", "uv-distribution",
"uv-distribution-filename",
"uv-distribution-types", "uv-distribution-types",
"uv-extract", "uv-extract",
"uv-install-wheel", "uv-install-wheel",
@ -4735,7 +4772,7 @@ dependencies = [
[[package]] [[package]]
name = "uv-build" name = "uv-build"
version = "0.7.14" version = "0.7.19"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"uv-build-backend", "uv-build-backend",
@ -4798,6 +4835,7 @@ dependencies = [
"tokio", "tokio",
"toml_edit", "toml_edit",
"tracing", "tracing",
"uv-cache-key",
"uv-configuration", "uv-configuration",
"uv-distribution", "uv-distribution",
"uv-distribution-types", "uv-distribution-types",
@ -5135,7 +5173,6 @@ dependencies = [
"serde", "serde",
"smallvec", "smallvec",
"thiserror 2.0.12", "thiserror 2.0.12",
"url",
"uv-cache-key", "uv-cache-key",
"uv-normalize", "uv-normalize",
"uv-pep440", "uv-pep440",
@ -5178,6 +5215,7 @@ dependencies = [
"uv-pypi-types", "uv-pypi-types",
"uv-redacted", "uv-redacted",
"uv-small-str", "uv-small-str",
"uv-warnings",
"version-ranges", "version-ranges",
] ]
@ -5602,7 +5640,7 @@ dependencies = [
"uv-trampoline-builder", "uv-trampoline-builder",
"uv-warnings", "uv-warnings",
"which", "which",
"windows-registry 0.5.2", "windows-registry",
"windows-result 0.3.4", "windows-result 0.3.4",
"windows-sys 0.59.0", "windows-sys 0.59.0",
] ]
@ -5806,7 +5844,7 @@ dependencies = [
"tracing", "tracing",
"uv-fs", "uv-fs",
"uv-static", "uv-static",
"windows-registry 0.5.2", "windows-registry",
"windows-result 0.3.4", "windows-result 0.3.4",
"windows-sys 0.59.0", "windows-sys 0.59.0",
] ]
@ -5904,6 +5942,7 @@ name = "uv-types"
version = "0.0.1" version = "0.0.1"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"dashmap",
"rustc-hash", "rustc-hash",
"thiserror 2.0.12", "thiserror 2.0.12",
"uv-cache", "uv-cache",
@ -5923,7 +5962,7 @@ dependencies = [
[[package]] [[package]]
name = "uv-version" name = "uv-version"
version = "0.7.14" version = "0.7.19"
[[package]] [[package]]
name = "uv-virtualenv" name = "uv-virtualenv"
@ -6187,6 +6226,15 @@ dependencies = [
"rustls-pki-types", "rustls-pki-types",
] ]
[[package]]
name = "webpki-roots"
version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8782dd5a41a24eed3a4f40b606249b3e236ca61adf1f25ea4d45c73de122b502"
dependencies = [
"rustls-pki-types",
]
[[package]] [[package]]
name = "weezl" name = "weezl"
version = "0.1.8" version = "0.1.8"
@ -6330,7 +6378,7 @@ dependencies = [
"windows-interface 0.59.1", "windows-interface 0.59.1",
"windows-link", "windows-link",
"windows-result 0.3.4", "windows-result 0.3.4",
"windows-strings 0.4.1", "windows-strings 0.4.2",
] ]
[[package]] [[package]]
@ -6400,9 +6448,9 @@ dependencies = [
[[package]] [[package]]
name = "windows-link" name = "windows-link"
version = "0.1.1" version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "76840935b766e1b0a05c0066835fb9ec80071d4c09a16f6bd5f7e655e3c14c38" checksum = "5e6ad25900d524eaabdbbb96d20b4311e1e7ae1699af4fb28c17ae66c80d798a"
[[package]] [[package]]
name = "windows-numerics" name = "windows-numerics"
@ -6416,24 +6464,13 @@ dependencies = [
[[package]] [[package]]
name = "windows-registry" name = "windows-registry"
version = "0.4.0" version = "0.5.3"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4286ad90ddb45071efd1a66dfa43eb02dd0dfbae1545ad6cc3c51cf34d7e8ba3" checksum = "5b8a9ed28765efc97bbc954883f4e6796c33a06546ebafacbabee9696967499e"
dependencies = [
"windows-result 0.3.4",
"windows-strings 0.3.1",
"windows-targets 0.53.0",
]
[[package]]
name = "windows-registry"
version = "0.5.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b3bab093bdd303a1240bb99b8aba8ea8a69ee19d34c9e2ef9594e708a4878820"
dependencies = [ dependencies = [
"windows-link", "windows-link",
"windows-result 0.3.4", "windows-result 0.3.4",
"windows-strings 0.4.1", "windows-strings 0.4.2",
] ]
[[package]] [[package]]
@ -6465,9 +6502,9 @@ dependencies = [
[[package]] [[package]]
name = "windows-strings" name = "windows-strings"
version = "0.4.1" version = "0.4.2"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2a7ab927b2637c19b3dbe0965e75d8f2d30bdd697a1516191cad2ec4df8fb28a" checksum = "56e6c93f3a0c3b36176cb1327a4958a0353d5d166c2a35cb268ace15e91d3b57"
dependencies = [ dependencies = [
"windows-link", "windows-link",
] ]
@ -6701,8 +6738,9 @@ checksum = "d135d17ab770252ad95e9a872d365cf3090e3be864a34ab46f48555993efc904"
[[package]] [[package]]
name = "wiremock" name = "wiremock"
version = "0.6.3" version = "0.6.4"
source = "git+https://github.com/astral-sh/wiremock-rs?rev=b79b69f62521df9f83a54e866432397562eae789#b79b69f62521df9f83a54e866432397562eae789" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a2b8b99d4cdbf36b239a9532e31fe4fb8acc38d1897c1761e161550a7dc78e6a"
dependencies = [ dependencies = [
"assert-json-diff", "assert-json-diff",
"async-trait", "async-trait",


@ -12,7 +12,7 @@ resolver = "2"
[workspace.package] [workspace.package]
edition = "2024" edition = "2024"
rust-version = "1.85" rust-version = "1.86"
homepage = "https://pypi.org/project/uv/" homepage = "https://pypi.org/project/uv/"
documentation = "https://pypi.org/project/uv/" documentation = "https://pypi.org/project/uv/"
repository = "https://github.com/astral-sh/uv" repository = "https://github.com/astral-sh/uv"
@ -142,7 +142,7 @@ ref-cast = { version = "1.0.24" }
reflink-copy = { version = "0.1.19" } reflink-copy = { version = "0.1.19" }
regex = { version = "1.10.6" } regex = { version = "1.10.6" }
regex-automata = { version = "0.4.8", default-features = false, features = ["dfa-build", "dfa-search", "perf", "std", "syntax"] } regex-automata = { version = "0.4.8", default-features = false, features = ["dfa-build", "dfa-search", "perf", "std", "syntax"] }
reqwest = { version = "=0.12.15", default-features = false, features = ["json", "gzip", "deflate", "zstd", "stream", "rustls-tls", "rustls-tls-native-roots", "socks", "multipart", "http2", "blocking"] } reqwest = { version = "0.12.22", default-features = false, features = ["json", "gzip", "deflate", "zstd", "stream", "rustls-tls", "rustls-tls-native-roots", "socks", "multipart", "http2", "blocking"] }
reqwest-middleware = { git = "https://github.com/astral-sh/reqwest-middleware", rev = "ad8b9d332d1773fde8b4cd008486de5973e0a3f8", features = ["multipart"] } reqwest-middleware = { git = "https://github.com/astral-sh/reqwest-middleware", rev = "ad8b9d332d1773fde8b4cd008486de5973e0a3f8", features = ["multipart"] }
reqwest-retry = { git = "https://github.com/astral-sh/reqwest-middleware", rev = "ad8b9d332d1773fde8b4cd008486de5973e0a3f8" } reqwest-retry = { git = "https://github.com/astral-sh/reqwest-middleware", rev = "ad8b9d332d1773fde8b4cd008486de5973e0a3f8" }
rkyv = { version = "0.8.8", features = ["bytecheck"] } rkyv = { version = "0.8.8", features = ["bytecheck"] }
@ -151,7 +151,7 @@ rust-netrc = { version = "0.1.2" }
rustc-hash = { version = "2.0.0" } rustc-hash = { version = "2.0.0" }
rustix = { version = "1.0.0", default-features = false, features = ["fs", "std"] } rustix = { version = "1.0.0", default-features = false, features = ["fs", "std"] }
same-file = { version = "1.0.6" } same-file = { version = "1.0.6" }
schemars = { version = "0.8.21", features = ["url"] } schemars = { version = "1.0.0", features = ["url2"] }
seahash = { version = "4.1.0" } seahash = { version = "4.1.0" }
self-replace = { version = "1.5.0" } self-replace = { version = "1.5.0" }
serde = { version = "1.0.210", features = ["derive", "rc"] } serde = { version = "1.0.210", features = ["derive", "rc"] }
@ -189,7 +189,7 @@ windows-core = { version = "0.59.0" }
windows-registry = { version = "0.5.0" } windows-registry = { version = "0.5.0" }
windows-result = { version = "0.3.0" } windows-result = { version = "0.3.0" }
windows-sys = { version = "0.59.0", features = ["Win32_Foundation", "Win32_Security", "Win32_Storage_FileSystem", "Win32_System_Ioctl", "Win32_System_IO", "Win32_System_Registry"] } windows-sys = { version = "0.59.0", features = ["Win32_Foundation", "Win32_Security", "Win32_Storage_FileSystem", "Win32_System_Ioctl", "Win32_System_IO", "Win32_System_Registry"] }
wiremock = { git = "https://github.com/astral-sh/wiremock-rs", rev = "b79b69f62521df9f83a54e866432397562eae789" } wiremock = { version = "0.6.4" }
xz2 = { version = "0.1.7" } xz2 = { version = "0.1.7" }
zip = { version = "2.2.3", default-features = false, features = ["deflate", "zstd", "bzip2", "lzma", "xz"] } zip = { version = "2.2.3", default-features = false, features = ["deflate", "zstd", "bzip2", "lzma", "xz"] }


@ -37,7 +37,7 @@ disallowed-methods = [
"std::fs::soft_link", "std::fs::soft_link",
"std::fs::symlink_metadata", "std::fs::symlink_metadata",
"std::fs::write", "std::fs::write",
"std::os::unix::fs::symlink", { path = "std::os::unix::fs::symlink", allow-invalid = true },
"std::os::windows::fs::symlink_dir", { path = "std::os::windows::fs::symlink_dir", allow-invalid = true },
"std::os::windows::fs::symlink_file", { path = "std::os::windows::fs::symlink_file", allow-invalid = true },
] ]


@ -86,7 +86,7 @@ impl Indexes {
Self(FxHashSet::default()) Self(FxHashSet::default())
} }
/// Create a new [`AuthIndexUrls`] from an iterator of [`AuthIndexUrl`]s. /// Create a new [`Indexes`] instance from an iterator of [`Index`]s.
pub fn from_indexes(urls: impl IntoIterator<Item = Index>) -> Self { pub fn from_indexes(urls: impl IntoIterator<Item = Index>) -> Self {
let mut index_urls = Self::new(); let mut index_urls = Self::new();
for url in urls { for url in urls {


@ -18,11 +18,6 @@ workspace = true
doctest = false doctest = false
bench = false bench = false
[[bench]]
name = "distribution-filename"
path = "benches/distribution_filename.rs"
harness = false
[[bench]] [[bench]]
name = "uv" name = "uv"
path = "benches/uv.rs" path = "benches/uv.rs"
@ -34,7 +29,6 @@ uv-client = { workspace = true }
uv-configuration = { workspace = true } uv-configuration = { workspace = true }
uv-dispatch = { workspace = true } uv-dispatch = { workspace = true }
uv-distribution = { workspace = true } uv-distribution = { workspace = true }
uv-distribution-filename = { workspace = true }
uv-distribution-types = { workspace = true } uv-distribution-types = { workspace = true }
uv-extract = { workspace = true, optional = true } uv-extract = { workspace = true, optional = true }
uv-install-wheel = { workspace = true } uv-install-wheel = { workspace = true }
@ -48,8 +42,10 @@ uv-types = { workspace = true }
uv-workspace = { workspace = true } uv-workspace = { workspace = true }
anyhow = { workspace = true } anyhow = { workspace = true }
codspeed-criterion-compat = { version = "2.7.2", default-features = false, optional = true } codspeed-criterion-compat = { version = "3.0.2", default-features = false, optional = true }
criterion = { version = "0.6.0", default-features = false, features = ["async_tokio"] } criterion = { version = "0.6.0", default-features = false, features = [
"async_tokio",
] }
jiff = { workspace = true } jiff = { workspace = true }
tokio = { workspace = true } tokio = { workspace = true }


@ -1,168 +0,0 @@
use std::str::FromStr;
use uv_bench::criterion::{
BenchmarkId, Criterion, Throughput, criterion_group, criterion_main, measurement::WallTime,
};
use uv_distribution_filename::WheelFilename;
use uv_platform_tags::{AbiTag, LanguageTag, PlatformTag, Tags};
/// A set of platform tags extracted from burntsushi's Archlinux workstation.
/// We could just re-create these via `Tags::from_env`, but those might differ
/// depending on the platform. This way, we always use the same data. It also
/// lets us assert tag compatibility regardless of where the benchmarks run.
const PLATFORM_TAGS: &[(&str, &str, &str)] = include!("../inputs/platform_tags.rs");
/// A set of wheel names used in the benchmarks below. We pick short and long
/// names, as well as compatible and not-compatibles (with `PLATFORM_TAGS`)
/// names.
///
/// The tuple is (name, filename, compatible) where `name` is a descriptive
/// name for humans used in the benchmark definition. And `filename` is the
/// actual wheel filename we want to benchmark operation on. And `compatible`
/// indicates whether the tags in the wheel filename are expected to be
/// compatible with the tags in `PLATFORM_TAGS`.
const WHEEL_NAMES: &[(&str, &str, bool)] = &[
// This tests a case with a very short name that *is* compatible with
// PLATFORM_TAGS. It only uses one tag for each component (one Python
// version, one ABI and one platform).
(
"flyte-short-compatible",
"ipython-2.1.0-py3-none-any.whl",
true,
),
// This tests a case with a long name that is *not* compatible. That
// is, all platform tags need to be checked against the tags in the
// wheel filename. This is essentially the worst possible practical
// case.
(
"flyte-long-incompatible",
"protobuf-3.5.2.post1-cp36-cp36m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl",
false,
),
// This tests a case with a long name that *is* compatible. We
// expect this to be (on average) quicker because the compatibility
// check stops as soon as a positive match is found. (Where as the
// incompatible case needs to check all tags.)
(
"flyte-long-compatible",
"coverage-6.6.0b1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl",
true,
),
];
/// A list of names that are candidates for wheel filenames but will ultimately
/// fail to parse.
const INVALID_WHEEL_NAMES: &[(&str, &str)] = &[
("flyte-short-extension", "mock-5.1.0.tar.gz"),
(
"flyte-long-extension",
"Pillow-5.4.0.dev0-py3.7-macosx-10.13-x86_64.egg",
),
];
/// Benchmarks the construction of platform tags.
///
/// This only happens ~once per program startup. Originally, construction was
/// trivial. But to speed up `WheelFilename::is_compatible`, we added some
/// extra processing. We thus expect construction to become slower, but we
/// write a benchmark to ensure it is still "reasonable."
fn benchmark_build_platform_tags(c: &mut Criterion<WallTime>) {
let tags: Vec<(LanguageTag, AbiTag, PlatformTag)> = PLATFORM_TAGS
.iter()
.map(|&(py, abi, plat)| {
(
LanguageTag::from_str(py).unwrap(),
AbiTag::from_str(abi).unwrap(),
PlatformTag::from_str(plat).unwrap(),
)
})
.collect();
let mut group = c.benchmark_group("build_platform_tags");
group.bench_function(BenchmarkId::from_parameter("burntsushi-archlinux"), |b| {
b.iter(|| std::hint::black_box(Tags::new(tags.clone())));
});
group.finish();
}
/// Benchmarks `WheelFilename::from_str`. This has been observed to take some
/// non-trivial time in profiling (although, at time of writing, not as much
/// as tag compatibility). In the process of optimizing tag compatibility,
/// we tweaked wheel filename parsing. This benchmark was therefore added to
/// ensure we didn't regress here.
fn benchmark_wheelname_parsing(c: &mut Criterion<WallTime>) {
let mut group = c.benchmark_group("wheelname_parsing");
for (name, filename, _) in WHEEL_NAMES.iter().copied() {
let len = u64::try_from(filename.len()).expect("length fits in u64");
group.throughput(Throughput::Bytes(len));
group.bench_function(BenchmarkId::from_parameter(name), |b| {
b.iter(|| {
filename
.parse::<WheelFilename>()
.expect("valid wheel filename");
});
});
}
group.finish();
}
/// Benchmarks `WheelFilename::from_str` when it fails. This routine is called
/// on every filename in a package's metadata. A non-trivial portion of which
/// are not wheel filenames. Ensuring that the error path is fast is thus
/// probably a good idea.
fn benchmark_wheelname_parsing_failure(c: &mut Criterion<WallTime>) {
let mut group = c.benchmark_group("wheelname_parsing_failure");
for (name, filename) in INVALID_WHEEL_NAMES.iter().copied() {
let len = u64::try_from(filename.len()).expect("length fits in u64");
group.throughput(Throughput::Bytes(len));
group.bench_function(BenchmarkId::from_parameter(name), |b| {
b.iter(|| {
filename
.parse::<WheelFilename>()
.expect_err("invalid wheel filename");
});
});
}
group.finish();
}
/// Benchmarks the `WheelFilename::is_compatible` routine. This was revealed
/// to be the #1 bottleneck in the resolver. The main issue was that the
/// set of platform tags (generated once) is quite large, and the original
/// implementation did an exhaustive search over each of them for each tag in
/// the wheel filename.
fn benchmark_wheelname_tag_compatibility(c: &mut Criterion<WallTime>) {
let tags: Vec<(LanguageTag, AbiTag, PlatformTag)> = PLATFORM_TAGS
.iter()
.map(|&(py, abi, plat)| {
(
LanguageTag::from_str(py).unwrap(),
AbiTag::from_str(abi).unwrap(),
PlatformTag::from_str(plat).unwrap(),
)
})
.collect();
let tags = Tags::new(tags);
let mut group = c.benchmark_group("wheelname_tag_compatibility");
for (name, filename, expected) in WHEEL_NAMES.iter().copied() {
let wheelname: WheelFilename = filename.parse().expect("valid wheel filename");
let len = u64::try_from(filename.len()).expect("length fits in u64");
group.throughput(Throughput::Bytes(len));
group.bench_function(BenchmarkId::from_parameter(name), |b| {
b.iter(|| {
assert_eq!(expected, wheelname.is_compatible(&tags));
});
});
}
group.finish();
}
criterion_group!(
uv_distribution_filename,
benchmark_build_platform_tags,
benchmark_wheelname_parsing,
benchmark_wheelname_parsing_failure,
benchmark_wheelname_tag_compatibility,
);
criterion_main!(uv_distribution_filename);


@ -9,12 +9,12 @@ pub use settings::{BuildBackendSettings, WheelDataIncludes};
pub use source_dist::{build_source_dist, list_source_dist}; pub use source_dist::{build_source_dist, list_source_dist};
pub use wheel::{build_editable, build_wheel, list_wheel, metadata}; pub use wheel::{build_editable, build_wheel, list_wheel, metadata};
use std::fs::FileType;
use std::io; use std::io;
use std::path::{Path, PathBuf}; use std::path::{Path, PathBuf};
use std::str::FromStr; use std::str::FromStr;
use thiserror::Error; use thiserror::Error;
use tracing::debug; use tracing::debug;
use walkdir::DirEntry;
use uv_fs::Simplified; use uv_fs::Simplified;
use uv_globfilter::PortableGlobError; use uv_globfilter::PortableGlobError;
@ -54,8 +54,6 @@ pub enum Error {
#[source] #[source]
err: walkdir::Error, err: walkdir::Error,
}, },
#[error("Unsupported file type {:?}: `{}`", _1, _0.user_display())]
UnsupportedFileType(PathBuf, FileType),
#[error("Failed to write wheel zip archive")] #[error("Failed to write wheel zip archive")]
Zip(#[from] zip::result::ZipError), Zip(#[from] zip::result::ZipError),
#[error("Failed to write RECORD file")] #[error("Failed to write RECORD file")]
@ -86,6 +84,16 @@ trait DirectoryWriter {
/// Files added through the method are considered generated when listing included files. /// Files added through the method are considered generated when listing included files.
fn write_bytes(&mut self, path: &str, bytes: &[u8]) -> Result<(), Error>; fn write_bytes(&mut self, path: &str, bytes: &[u8]) -> Result<(), Error>;
/// Add the file or directory to the path.
fn write_dir_entry(&mut self, entry: &DirEntry, target_path: &str) -> Result<(), Error> {
if entry.file_type().is_dir() {
self.write_directory(target_path)?;
} else {
self.write_file(target_path, entry.path())?;
}
Ok(())
}
/// Add a local file. /// Add a local file.
fn write_file(&mut self, path: &str, file: &Path) -> Result<(), Error>; fn write_file(&mut self, path: &str, file: &Path) -> Result<(), Error>;


@ -4,10 +4,6 @@ use uv_macros::OptionsMetadata;
/// Settings for the uv build backend (`uv_build`). /// Settings for the uv build backend (`uv_build`).
/// ///
/// !!! note
///
/// The uv build backend is currently in preview and may change in any future release.
///
/// Note that those settings only apply when using the `uv_build` backend, other build backends /// Note that those settings only apply when using the `uv_build` backend, other build backends
/// (such as hatchling) have their own configuration. /// (such as hatchling) have their own configuration.
/// ///


@ -250,32 +250,16 @@ fn write_source_dist(
.expect("walkdir starts with root"); .expect("walkdir starts with root");
if !include_matcher.match_path(relative) || exclude_matcher.is_match(relative) { if !include_matcher.match_path(relative) || exclude_matcher.is_match(relative) {
trace!("Excluding: `{}`", relative.user_display()); trace!("Excluding from sdist: `{}`", relative.user_display());
continue; continue;
} }
debug!("Including {}", relative.user_display()); let entry_path = Path::new(&top_level)
if entry.file_type().is_dir() {
writer.write_directory(
&Path::new(&top_level)
.join(relative) .join(relative)
.portable_display() .portable_display()
.to_string(), .to_string();
)?; debug!("Adding to sdist: {}", relative.user_display());
} else if entry.file_type().is_file() { writer.write_dir_entry(&entry, &entry_path)?;
writer.write_file(
&Path::new(&top_level)
.join(relative)
.portable_display()
.to_string(),
entry.path(),
)?;
} else {
return Err(Error::UnsupportedFileType(
relative.to_path_buf(),
entry.file_type(),
));
}
} }
debug!("Visited {files_visited} files for source dist build"); debug!("Visited {files_visited} files for source dist build");


@ -164,7 +164,7 @@ fn write_wheel(
.path() .path()
.strip_prefix(source_tree) .strip_prefix(source_tree)
.expect("walkdir starts with root"); .expect("walkdir starts with root");
let wheel_path = entry let entry_path = entry
.path() .path()
.strip_prefix(&src_root) .strip_prefix(&src_root)
.expect("walkdir starts with root"); .expect("walkdir starts with root");
@ -172,21 +172,10 @@ fn write_wheel(
trace!("Excluding from module: `{}`", match_path.user_display()); trace!("Excluding from module: `{}`", match_path.user_display());
continue; continue;
} }
let wheel_path = wheel_path.portable_display().to_string();
debug!("Adding to wheel: `{wheel_path}`"); let entry_path = entry_path.portable_display().to_string();
debug!("Adding to wheel: {entry_path}");
if entry.file_type().is_dir() { wheel_writer.write_dir_entry(&entry, &entry_path)?;
wheel_writer.write_directory(&wheel_path)?;
} else if entry.file_type().is_file() {
wheel_writer.write_file(&wheel_path, entry.path())?;
} else {
// TODO(konsti): We may want to support symlinks, there is support for installing them.
return Err(Error::UnsupportedFileType(
entry.path().to_path_buf(),
entry.file_type(),
));
}
} }
debug!("Visited {files_visited} files for wheel build"); debug!("Visited {files_visited} files for wheel build");
@ -519,23 +508,12 @@ fn wheel_subdir_from_globs(
continue; continue;
} }
let relative_licenses = Path::new(target) let license_path = Path::new(target)
.join(relative) .join(relative)
.portable_display() .portable_display()
.to_string(); .to_string();
debug!("Adding for {}: `{}`", globs_field, relative.user_display());
if entry.file_type().is_dir() { wheel_writer.write_dir_entry(&entry, &license_path)?;
wheel_writer.write_directory(&relative_licenses)?;
} else if entry.file_type().is_file() {
debug!("Adding {} file: `{}`", globs_field, relative.user_display());
wheel_writer.write_file(&relative_licenses, entry.path())?;
} else {
// TODO(konsti): We may want to support symlinks, there is support for installing them.
return Err(Error::UnsupportedFileType(
entry.path().to_path_buf(),
entry.file_type(),
));
}
} }
Ok(()) Ok(())
} }


@ -17,6 +17,7 @@ doctest = false
workspace = true workspace = true
[dependencies] [dependencies]
uv-cache-key = { workspace = true }
uv-configuration = { workspace = true } uv-configuration = { workspace = true }
uv-distribution = { workspace = true } uv-distribution = { workspace = true }
uv-distribution-types = { workspace = true } uv-distribution-types = { workspace = true }


@ -25,12 +25,14 @@ use tempfile::TempDir;
use tokio::io::AsyncBufReadExt; use tokio::io::AsyncBufReadExt;
use tokio::process::Command; use tokio::process::Command;
use tokio::sync::{Mutex, Semaphore}; use tokio::sync::{Mutex, Semaphore};
use tracing::{Instrument, debug, info_span, instrument}; use tracing::{Instrument, debug, info_span, instrument, warn};
use uv_cache_key::cache_digest;
use uv_configuration::PreviewMode; use uv_configuration::PreviewMode;
use uv_configuration::{BuildKind, BuildOutput, ConfigSettings, SourceStrategy}; use uv_configuration::{BuildKind, BuildOutput, ConfigSettings, SourceStrategy};
use uv_distribution::BuildRequires; use uv_distribution::BuildRequires;
use uv_distribution_types::{IndexLocations, Requirement, Resolution}; use uv_distribution_types::{IndexLocations, Requirement, Resolution};
use uv_fs::LockedFile;
use uv_fs::{PythonExt, Simplified}; use uv_fs::{PythonExt, Simplified};
use uv_pep440::Version; use uv_pep440::Version;
use uv_pep508::PackageName; use uv_pep508::PackageName;
@ -201,6 +203,11 @@ impl Pep517Backend {
{import} {import}
"#, backend_path = backend_path_encoded} "#, backend_path = backend_path_encoded}
} }
fn is_setuptools(&self) -> bool {
// either `setuptools.build_meta` or `setuptools.build_meta:__legacy__`
self.backend.split(':').next() == Some("setuptools.build_meta")
}
} }
/// Uses an [`Rc`] internally, clone freely. /// Uses an [`Rc`] internally, clone freely.
@ -434,6 +441,31 @@ impl SourceBuild {
}) })
} }
/// Acquire a lock on the source tree, if necessary.
async fn acquire_lock(&self) -> Result<Option<LockedFile>, Error> {
// Depending on the command, setuptools puts `*.egg-info`, `build/`, and `dist/` in the
// source tree, and concurrent invocations of setuptools using the same source dir can
// stomp on each other. We need to lock something to fix that, but we don't want to dump a
// `.lock` file into the source tree that the user will need to .gitignore. Take a global
// proxy lock instead.
let mut source_tree_lock = None;
if self.pep517_backend.is_setuptools() {
debug!("Locking the source tree for setuptools");
let canonical_source_path = self.source_tree.canonicalize()?;
let lock_path = env::temp_dir().join(format!(
"uv-setuptools-{}.lock",
cache_digest(&canonical_source_path)
));
source_tree_lock = LockedFile::acquire(lock_path, self.source_tree.to_string_lossy())
.await
.inspect_err(|err| {
warn!("Failed to acquire build lock: {err}");
})
.ok();
}
Ok(source_tree_lock)
}
async fn get_resolved_requirements( async fn get_resolved_requirements(
build_context: &impl BuildContext, build_context: &impl BuildContext,
source_build_context: SourceBuildContext, source_build_context: SourceBuildContext,
@ -604,6 +636,9 @@ impl SourceBuild {
return Ok(Some(metadata_dir.clone())); return Ok(Some(metadata_dir.clone()));
} }
// Lock the source tree, if necessary.
let _lock = self.acquire_lock().await?;
// Hatch allows for highly dynamic customization of metadata via hooks. In such cases, Hatch // Hatch allows for highly dynamic customization of metadata via hooks. In such cases, Hatch
// can't uphold the PEP 517 contract, in that the metadata Hatch would return by // can't uphold the PEP 517 contract, in that the metadata Hatch would return by
// `prepare_metadata_for_build_wheel` isn't guaranteed to match that of the built wheel. // `prepare_metadata_for_build_wheel` isn't guaranteed to match that of the built wheel.
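One detail worth calling out in the locking change above: `acquire_lock` returns an optional `LockedFile` guard, and the call sites bind it to `_lock` rather than discarding it with `_`, because the lock is only held while the guard stays alive. A self-contained sketch of that RAII behavior (the `Guard`/`acquire` names are made up for illustration and are not uv APIs):

```rust
// Demonstrates why the build code binds the guard to `_lock` instead of `_`.
struct Guard(&'static str);

impl Drop for Guard {
    fn drop(&mut self) {
        println!("released {}", self.0);
    }
}

fn acquire(name: &'static str) -> Guard {
    println!("acquired {name}");
    Guard(name)
}

fn main() {
    let _lock = acquire("named binding"); // stays locked until the end of `main`
    let _ = acquire("underscore");        // dropped, i.e. unlocked, immediately
    println!("build step runs here; only the named lock is still held");
}
```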
@ -716,16 +751,15 @@ impl SourceBuild {
pub async fn build(&self, wheel_dir: &Path) -> Result<String, Error> { pub async fn build(&self, wheel_dir: &Path) -> Result<String, Error> {
// The build scripts run with the extracted root as cwd, so they need the absolute path. // The build scripts run with the extracted root as cwd, so they need the absolute path.
let wheel_dir = std::path::absolute(wheel_dir)?; let wheel_dir = std::path::absolute(wheel_dir)?;
let filename = self.pep517_build(&wheel_dir, &self.pep517_backend).await?; let filename = self.pep517_build(&wheel_dir).await?;
Ok(filename) Ok(filename)
} }
/// Perform a PEP 517 build for a wheel or source distribution (sdist). /// Perform a PEP 517 build for a wheel or source distribution (sdist).
async fn pep517_build( async fn pep517_build(&self, output_dir: &Path) -> Result<String, Error> {
&self, // Lock the source tree, if necessary.
output_dir: &Path, let _lock = self.acquire_lock().await?;
pep517_backend: &Pep517Backend,
) -> Result<String, Error> {
// Write the hook output to a file so that we can read it back reliably. // Write the hook output to a file so that we can read it back reliably.
let outfile = self let outfile = self
.temp_dir .temp_dir
@ -737,7 +771,7 @@ impl SourceBuild {
BuildKind::Sdist => { BuildKind::Sdist => {
debug!( debug!(
r#"Calling `{}.build_{}("{}", {})`"#, r#"Calling `{}.build_{}("{}", {})`"#,
pep517_backend.backend, self.pep517_backend.backend,
self.build_kind, self.build_kind,
output_dir.escape_for_python(), output_dir.escape_for_python(),
self.config_settings.escape_for_python(), self.config_settings.escape_for_python(),
@ -750,7 +784,7 @@ impl SourceBuild {
with open("{}", "w") as fp: with open("{}", "w") as fp:
fp.write(sdist_filename) fp.write(sdist_filename)
"#, "#,
pep517_backend.backend_import(), self.pep517_backend.backend_import(),
self.build_kind, self.build_kind,
output_dir.escape_for_python(), output_dir.escape_for_python(),
self.config_settings.escape_for_python(), self.config_settings.escape_for_python(),
@ -766,7 +800,7 @@ impl SourceBuild {
}); });
debug!( debug!(
r#"Calling `{}.build_{}("{}", {}, {})`"#, r#"Calling `{}.build_{}("{}", {}, {})`"#,
pep517_backend.backend, self.pep517_backend.backend,
self.build_kind, self.build_kind,
output_dir.escape_for_python(), output_dir.escape_for_python(),
self.config_settings.escape_for_python(), self.config_settings.escape_for_python(),
@ -780,7 +814,7 @@ impl SourceBuild {
with open("{}", "w") as fp: with open("{}", "w") as fp:
fp.write(wheel_filename) fp.write(wheel_filename)
"#, "#,
pep517_backend.backend_import(), self.pep517_backend.backend_import(),
self.build_kind, self.build_kind,
output_dir.escape_for_python(), output_dir.escape_for_python(),
self.config_settings.escape_for_python(), self.config_settings.escape_for_python(),
@ -810,7 +844,7 @@ impl SourceBuild {
return Err(Error::from_command_output( return Err(Error::from_command_output(
format!( format!(
"Call to `{}.build_{}` failed", "Call to `{}.build_{}` failed",
pep517_backend.backend, self.build_kind self.pep517_backend.backend, self.build_kind
), ),
&output, &output,
self.level, self.level,
@ -825,7 +859,7 @@ impl SourceBuild {
return Err(Error::from_command_output( return Err(Error::from_command_output(
format!( format!(
"Call to `{}.build_{}` failed", "Call to `{}.build_{}` failed",
pep517_backend.backend, self.build_kind self.pep517_backend.backend, self.build_kind
), ),
&output, &output,
self.level, self.level,


@ -1,6 +1,6 @@
[package] [package]
name = "uv-build" name = "uv-build"
version = "0.7.14" version = "0.7.19"
edition.workspace = true edition.workspace = true
rust-version.workspace = true rust-version.workspace = true
homepage.workspace = true homepage.workspace = true


@ -1,6 +1,6 @@
[project] [project]
name = "uv-build" name = "uv-build"
version = "0.7.14" version = "0.7.19"
description = "The uv build backend" description = "The uv build backend"
authors = [{ name = "Astral Software Inc.", email = "hey@astral.sh" }] authors = [{ name = "Astral Software Inc.", email = "hey@astral.sh" }]
requires-python = ">=3.8" requires-python = ">=3.8"


@ -5130,6 +5130,9 @@ pub struct IndexArgs {
/// All indexes provided via this flag take priority over the index specified by /// All indexes provided via this flag take priority over the index specified by
/// `--default-index` (which defaults to PyPI). When multiple `--index` flags are provided, /// `--default-index` (which defaults to PyPI). When multiple `--index` flags are provided,
/// earlier values take priority. /// earlier values take priority.
///
/// Index names are not supported as values. Relative paths must be disambiguated from index
/// names with `./` or `../` on Unix or `.\\`, `..\\`, `./` or `../` on Windows.
// //
// The nested Vec structure (`Vec<Vec<Maybe<Index>>>`) is required for clap's // The nested Vec structure (`Vec<Vec<Maybe<Index>>>`) is required for clap's
// value parsing mechanism, which processes one value at a time, in order to handle // value parsing mechanism, which processes one value at a time, in order to handle


@ -1,7 +1,10 @@
use anstream::eprintln;
use uv_cache::Refresh; use uv_cache::Refresh;
use uv_configuration::ConfigSettings; use uv_configuration::ConfigSettings;
use uv_resolver::PrereleaseMode; use uv_resolver::PrereleaseMode;
use uv_settings::{Combine, PipOptions, ResolverInstallerOptions, ResolverOptions}; use uv_settings::{Combine, PipOptions, ResolverInstallerOptions, ResolverOptions};
use uv_warnings::owo_colors::OwoColorize;
use crate::{ use crate::{
BuildOptionsArgs, FetchArgs, IndexArgs, InstallerArgs, Maybe, RefreshArgs, ResolverArgs, BuildOptionsArgs, FetchArgs, IndexArgs, InstallerArgs, Maybe, RefreshArgs, ResolverArgs,
@ -9,12 +12,27 @@ use crate::{
}; };
/// Given a boolean flag pair (like `--upgrade` and `--no-upgrade`), resolve the value of the flag. /// Given a boolean flag pair (like `--upgrade` and `--no-upgrade`), resolve the value of the flag.
pub fn flag(yes: bool, no: bool) -> Option<bool> { pub fn flag(yes: bool, no: bool, name: &str) -> Option<bool> {
match (yes, no) { match (yes, no) {
(true, false) => Some(true), (true, false) => Some(true),
(false, true) => Some(false), (false, true) => Some(false),
(false, false) => None, (false, false) => None,
(..) => unreachable!("Clap should make this impossible"), (..) => {
eprintln!(
"{}{} `{}` and `{}` cannot be used together. \
Boolean flags on different levels are currently not supported \
(https://github.com/clap-rs/clap/issues/6049)",
"error".bold().red(),
":".bold(),
format!("--{name}").green(),
format!("--no-{name}").green(),
);
// No error forwarding since should eventually be solved on the clap side.
#[allow(clippy::exit)]
{
std::process::exit(2);
}
}
} }
} }
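For reference, the behavior of the reworked `flag` helper above, shown as hypothetical call sites (not a standalone program; it assumes the `flag` function defined above is in scope):

```rust
// `fn flag(yes: bool, no: bool, name: &str) -> Option<bool>` as defined above.
assert_eq!(flag(true, false, "upgrade"), Some(true));  // --upgrade given
assert_eq!(flag(false, true, "upgrade"), Some(false)); // --no-upgrade given
assert_eq!(flag(false, false, "upgrade"), None);       // neither flag given
// flag(true, true, "upgrade") now prints a formatted error and exits with
// status 2, instead of reaching the old `unreachable!` panic.
```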
@ -26,7 +44,7 @@ impl From<RefreshArgs> for Refresh {
refresh_package, refresh_package,
} = value; } = value;
Self::from_args(flag(refresh, no_refresh), refresh_package) Self::from_args(flag(refresh, no_refresh, "no-refresh"), refresh_package)
} }
} }
@ -53,7 +71,7 @@ impl From<ResolverArgs> for PipOptions {
} = args; } = args;
Self { Self {
upgrade: flag(upgrade, no_upgrade), upgrade: flag(upgrade, no_upgrade, "no-upgrade"),
upgrade_package: Some(upgrade_package), upgrade_package: Some(upgrade_package),
index_strategy, index_strategy,
keyring_provider, keyring_provider,
@ -66,7 +84,7 @@ impl From<ResolverArgs> for PipOptions {
}, },
config_settings: config_setting config_settings: config_setting
.map(|config_settings| config_settings.into_iter().collect::<ConfigSettings>()), .map(|config_settings| config_settings.into_iter().collect::<ConfigSettings>()),
no_build_isolation: flag(no_build_isolation, build_isolation), no_build_isolation: flag(no_build_isolation, build_isolation, "build-isolation"),
no_build_isolation_package: Some(no_build_isolation_package), no_build_isolation_package: Some(no_build_isolation_package),
exclude_newer, exclude_newer,
link_mode, link_mode,
@ -96,16 +114,16 @@ impl From<InstallerArgs> for PipOptions {
} = args; } = args;
Self { Self {
reinstall: flag(reinstall, no_reinstall), reinstall: flag(reinstall, no_reinstall, "reinstall"),
reinstall_package: Some(reinstall_package), reinstall_package: Some(reinstall_package),
index_strategy, index_strategy,
keyring_provider, keyring_provider,
config_settings: config_setting config_settings: config_setting
.map(|config_settings| config_settings.into_iter().collect::<ConfigSettings>()), .map(|config_settings| config_settings.into_iter().collect::<ConfigSettings>()),
no_build_isolation: flag(no_build_isolation, build_isolation), no_build_isolation: flag(no_build_isolation, build_isolation, "build-isolation"),
exclude_newer, exclude_newer,
link_mode, link_mode,
compile_bytecode: flag(compile_bytecode, no_compile_bytecode), compile_bytecode: flag(compile_bytecode, no_compile_bytecode, "compile-bytecode"),
no_sources: if no_sources { Some(true) } else { None }, no_sources: if no_sources { Some(true) } else { None },
..PipOptions::from(index_args) ..PipOptions::from(index_args)
} }
@ -140,9 +158,9 @@ impl From<ResolverInstallerArgs> for PipOptions {
} = args; } = args;
Self { Self {
upgrade: flag(upgrade, no_upgrade), upgrade: flag(upgrade, no_upgrade, "upgrade"),
upgrade_package: Some(upgrade_package), upgrade_package: Some(upgrade_package),
reinstall: flag(reinstall, no_reinstall), reinstall: flag(reinstall, no_reinstall, "reinstall"),
reinstall_package: Some(reinstall_package), reinstall_package: Some(reinstall_package),
index_strategy, index_strategy,
keyring_provider, keyring_provider,
@ -155,11 +173,11 @@ impl From<ResolverInstallerArgs> for PipOptions {
fork_strategy, fork_strategy,
config_settings: config_setting config_settings: config_setting
.map(|config_settings| config_settings.into_iter().collect::<ConfigSettings>()), .map(|config_settings| config_settings.into_iter().collect::<ConfigSettings>()),
no_build_isolation: flag(no_build_isolation, build_isolation), no_build_isolation: flag(no_build_isolation, build_isolation, "build-isolation"),
no_build_isolation_package: Some(no_build_isolation_package), no_build_isolation_package: Some(no_build_isolation_package),
exclude_newer, exclude_newer,
link_mode, link_mode,
compile_bytecode: flag(compile_bytecode, no_compile_bytecode), compile_bytecode: flag(compile_bytecode, no_compile_bytecode, "compile-bytecode"),
no_sources: if no_sources { Some(true) } else { None }, no_sources: if no_sources { Some(true) } else { None },
..PipOptions::from(index_args) ..PipOptions::from(index_args)
} }
@ -289,7 +307,7 @@ pub fn resolver_options(
.filter_map(Maybe::into_option) .filter_map(Maybe::into_option)
.collect() .collect()
}), }),
upgrade: flag(upgrade, no_upgrade), upgrade: flag(upgrade, no_upgrade, "no-upgrade"),
upgrade_package: Some(upgrade_package), upgrade_package: Some(upgrade_package),
index_strategy, index_strategy,
keyring_provider, keyring_provider,
@ -303,13 +321,13 @@ pub fn resolver_options(
dependency_metadata: None, dependency_metadata: None,
config_settings: config_setting config_settings: config_setting
.map(|config_settings| config_settings.into_iter().collect::<ConfigSettings>()), .map(|config_settings| config_settings.into_iter().collect::<ConfigSettings>()),
no_build_isolation: flag(no_build_isolation, build_isolation), no_build_isolation: flag(no_build_isolation, build_isolation, "build-isolation"),
no_build_isolation_package: Some(no_build_isolation_package), no_build_isolation_package: Some(no_build_isolation_package),
exclude_newer, exclude_newer,
link_mode, link_mode,
no_build: flag(no_build, build), no_build: flag(no_build, build, "build"),
no_build_package: Some(no_build_package), no_build_package: Some(no_build_package),
no_binary: flag(no_binary, binary), no_binary: flag(no_binary, binary, "binary"),
no_binary_package: Some(no_binary_package), no_binary_package: Some(no_binary_package),
no_sources: if no_sources { Some(true) } else { None }, no_sources: if no_sources { Some(true) } else { None },
} }
@ -386,13 +404,13 @@ pub fn resolver_installer_options(
.filter_map(Maybe::into_option) .filter_map(Maybe::into_option)
.collect() .collect()
}), }),
upgrade: flag(upgrade, no_upgrade), upgrade: flag(upgrade, no_upgrade, "upgrade"),
upgrade_package: if upgrade_package.is_empty() { upgrade_package: if upgrade_package.is_empty() {
None None
} else { } else {
Some(upgrade_package) Some(upgrade_package)
}, },
reinstall: flag(reinstall, no_reinstall), reinstall: flag(reinstall, no_reinstall, "reinstall"),
reinstall_package: if reinstall_package.is_empty() { reinstall_package: if reinstall_package.is_empty() {
None None
} else { } else {
@ -410,7 +428,7 @@ pub fn resolver_installer_options(
dependency_metadata: None, dependency_metadata: None,
config_settings: config_setting config_settings: config_setting
.map(|config_settings| config_settings.into_iter().collect::<ConfigSettings>()), .map(|config_settings| config_settings.into_iter().collect::<ConfigSettings>()),
no_build_isolation: flag(no_build_isolation, build_isolation), no_build_isolation: flag(no_build_isolation, build_isolation, "build-isolation"),
no_build_isolation_package: if no_build_isolation_package.is_empty() { no_build_isolation_package: if no_build_isolation_package.is_empty() {
None None
} else { } else {
@ -418,14 +436,14 @@ pub fn resolver_installer_options(
}, },
exclude_newer, exclude_newer,
link_mode, link_mode,
compile_bytecode: flag(compile_bytecode, no_compile_bytecode), compile_bytecode: flag(compile_bytecode, no_compile_bytecode, "compile-bytecode"),
no_build: flag(no_build, build), no_build: flag(no_build, build, "build"),
no_build_package: if no_build_package.is_empty() { no_build_package: if no_build_package.is_empty() {
None None
} else { } else {
Some(no_build_package) Some(no_build_package)
}, },
no_binary: flag(no_binary, binary), no_binary: flag(no_binary, binary, "binary"),
no_binary_package: if no_binary_package.is_empty() { no_binary_package: if no_binary_package.is_empty() {
None None
} else { } else {


@ -25,6 +25,7 @@ use tracing::{debug, trace};
use url::ParseError; use url::ParseError;
use url::Url; use url::Url;
use uv_auth::Credentials;
use uv_auth::{AuthMiddleware, Indexes}; use uv_auth::{AuthMiddleware, Indexes};
use uv_configuration::{KeyringProviderType, TrustedHost}; use uv_configuration::{KeyringProviderType, TrustedHost};
use uv_fs::Simplified; use uv_fs::Simplified;
@ -725,6 +726,16 @@ fn request_into_redirect(
} }
} }
// Check if there are credentials on the redirect location itself.
// If so, move them to Authorization header.
if !redirect_url.username().is_empty() {
if let Some(credentials) = Credentials::from_url(&redirect_url) {
let _ = redirect_url.set_username("");
let _ = redirect_url.set_password(None);
headers.insert(AUTHORIZATION, credentials.to_header_value());
}
}
std::mem::swap(req.headers_mut(), &mut headers); std::mem::swap(req.headers_mut(), &mut headers);
*req.url_mut() = Url::from(redirect_url); *req.url_mut() = Url::from(redirect_url);
debug!( debug!(
@ -971,6 +982,45 @@ mod tests {
Ok(()) Ok(())
} }
#[tokio::test]
async fn test_redirect_preserves_fragment() -> Result<()> {
for status in &[301, 302, 303, 307, 308] {
let server = MockServer::start().await;
Mock::given(method("GET"))
.respond_with(
ResponseTemplate::new(*status)
.insert_header("location", format!("{}/redirect", server.uri())),
)
.mount(&server)
.await;
let request = Client::new()
.get(format!("{}#fragment", server.uri()))
.build()
.unwrap();
let response = Client::builder()
.redirect(reqwest::redirect::Policy::none())
.build()
.unwrap()
.execute(request.try_clone().unwrap())
.await
.unwrap();
let redirect_request =
request_into_redirect(request, &response, CrossOriginCredentialsPolicy::Secure)?
.unwrap();
assert!(
redirect_request
.url()
.fragment()
.is_some_and(|fragment| fragment == "fragment")
);
}
Ok(())
}
#[tokio::test] #[tokio::test]
async fn test_redirect_removes_authorization_header_on_cross_origin() -> Result<()> { async fn test_redirect_removes_authorization_header_on_cross_origin() -> Result<()> {
for status in &[301, 302, 303, 307, 308] { for status in &[301, 302, 303, 307, 308] {
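The redirect handling above strips credentials embedded in the `Location` URL and replays them as an `Authorization` header. A minimal standalone sketch of that idea, using only the `url` and `base64` crates rather than uv_auth's `Credentials` type:

```rust
// Standalone sketch of "move URL userinfo into a Basic Authorization header";
// uv uses its own Credentials type rather than this helper.
use base64::Engine as _;
use url::Url;

fn credentials_to_header(mut redirect: Url) -> (Url, Option<String>) {
    if redirect.username().is_empty() {
        return (redirect, None);
    }
    let user = redirect.username().to_string();
    let pass = redirect.password().unwrap_or("").to_string();
    // Clear the userinfo so it is not propagated on the follow-up request URL.
    let _ = redirect.set_username("");
    let _ = redirect.set_password(None);
    let token = base64::engine::general_purpose::STANDARD.encode(format!("{user}:{pass}"));
    (redirect, Some(format!("Basic {token}")))
}

fn main() {
    let url = Url::parse("https://user:secret@example.com/simple/").unwrap();
    let (clean, header) = credentials_to_header(url);
    assert_eq!(clean.as_str(), "https://example.com/simple/");
    assert!(header.unwrap().starts_with("Basic "));
}
```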

View file

@ -1416,44 +1416,6 @@ mod tests {
Ok(()) Ok(())
} }
#[tokio::test]
async fn test_redirect_preserve_fragment() -> Result<(), Error> {
let redirect_server = MockServer::start().await;
// Configure the redirect server to respond with a 307 with a relative URL.
Mock::given(method("GET"))
.respond_with(ResponseTemplate::new(307).insert_header("Location", "/foo".to_string()))
.mount(&redirect_server)
.await;
Mock::given(method("GET"))
.and(path_regex("/foo"))
.respond_with(ResponseTemplate::new(200))
.mount(&redirect_server)
.await;
let cache = Cache::temp()?;
let registry_client = RegistryClientBuilder::new(cache).build();
let client = registry_client.cached_client().uncached();
let mut url = DisplaySafeUrl::parse(&redirect_server.uri())?;
url.set_fragment(Some("fragment"));
assert_eq!(
client
.for_host(&url)
.get(Url::from(url.clone()))
.send()
.await?
.url()
.to_string(),
format!("{}/foo#fragment", redirect_server.uri()),
"Requests should preserve fragment"
);
Ok(())
}
#[test] #[test]
fn ignore_failing_files() { fn ignore_failing_files() {
// 1.7.7 has an invalid requires-python field (double comma), 1.7.8 is valid // 1.7.7 has an invalid requires-python field (double comma), 1.7.8 is valid

View file

@ -4,7 +4,7 @@ use uv_pep508::PackageName;
use crate::{PackageNameSpecifier, PackageNameSpecifiers}; use crate::{PackageNameSpecifier, PackageNameSpecifiers};
#[derive(Copy, Clone, Debug, Default, PartialEq, Eq)] #[derive(Copy, Clone, Debug, Default, PartialEq, Eq, Hash)]
pub enum BuildKind { pub enum BuildKind {
/// A PEP 517 wheel build. /// A PEP 517 wheel build.
#[default] #[default]

View file

@ -1,3 +1,5 @@
#[cfg(feature = "schemars")]
use std::borrow::Cow;
use std::str::FromStr; use std::str::FromStr;
use uv_pep508::PackageName; use uv_pep508::PackageName;
@ -63,28 +65,16 @@ impl<'de> serde::Deserialize<'de> for PackageNameSpecifier {
#[cfg(feature = "schemars")] #[cfg(feature = "schemars")]
impl schemars::JsonSchema for PackageNameSpecifier { impl schemars::JsonSchema for PackageNameSpecifier {
fn schema_name() -> String { fn schema_name() -> Cow<'static, str> {
"PackageNameSpecifier".to_string() Cow::Borrowed("PackageNameSpecifier")
} }
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema { fn json_schema(_gen: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::schema::SchemaObject { schemars::json_schema!({
instance_type: Some(schemars::schema::InstanceType::String.into()), "type": "string",
string: Some(Box::new(schemars::schema::StringValidation { "pattern": r"^(:none:|:all:|([a-zA-Z0-9]|[a-zA-Z0-9][a-zA-Z0-9._-]*[a-zA-Z0-9]))$",
// See: https://packaging.python.org/en/latest/specifications/name-normalization/#name-format "description": "The name of a package, or `:all:` or `:none:` to select or omit all packages, respectively.",
pattern: Some( })
r"^(:none:|:all:|([a-zA-Z0-9]|[a-zA-Z0-9][a-zA-Z0-9._-]*[a-zA-Z0-9]))$"
.to_string(),
),
..schemars::schema::StringValidation::default()
})),
metadata: Some(Box::new(schemars::schema::Metadata {
description: Some("The name of a package, or `:all:` or `:none:` to select or omit all packages, respectively.".to_string()),
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
} }
} }
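This hunk (and several like it below) migrates hand-written `JsonSchema` impls from the schemars 0.8 builder types to the schemars 1.x API: `Cow`-based `schema_name`, the `json_schema!` macro, and `schemars::Schema`. A consolidated sketch of the new-style impl for a hypothetical string newtype:

```rust
// Sketch of the schemars 1.x-style impl used throughout this diff; `ExampleName`
// is an illustrative newtype, not a uv type.
use std::borrow::Cow;

struct ExampleName(String);

impl schemars::JsonSchema for ExampleName {
    fn schema_name() -> Cow<'static, str> {
        Cow::Borrowed("ExampleName")
    }

    fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
        schemars::json_schema!({
            "type": "string",
            "description": "An example string-valued setting."
        })
    }
}
```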

View file

@ -1,5 +1,6 @@
use std::fmt::Formatter; #[cfg(feature = "schemars")]
use std::str::FromStr; use std::borrow::Cow;
use std::{fmt::Formatter, str::FromStr};
use uv_pep440::{Version, VersionSpecifier, VersionSpecifiers, VersionSpecifiersParseError}; use uv_pep440::{Version, VersionSpecifier, VersionSpecifiers, VersionSpecifiersParseError};
@ -36,20 +37,15 @@ impl FromStr for RequiredVersion {
#[cfg(feature = "schemars")] #[cfg(feature = "schemars")]
impl schemars::JsonSchema for RequiredVersion { impl schemars::JsonSchema for RequiredVersion {
fn schema_name() -> String { fn schema_name() -> Cow<'static, str> {
String::from("RequiredVersion") Cow::Borrowed("RequiredVersion")
} }
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema { fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::schema::SchemaObject { schemars::json_schema!({
instance_type: Some(schemars::schema::InstanceType::String.into()), "type": "string",
metadata: Some(Box::new(schemars::schema::Metadata { "description": "A version specifier, e.g. `>=0.5.0` or `==0.5.0`."
description: Some("A version specifier, e.g. `>=0.5.0` or `==0.5.0`.".to_string()), })
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
} }
} }

View file

@ -1,4 +1,6 @@
#[derive(Debug, Default, Clone, Copy, PartialEq, Eq, serde::Serialize, serde::Deserialize)] #[derive(
Debug, Default, Clone, Copy, PartialEq, Eq, Hash, serde::Serialize, serde::Deserialize,
)]
#[serde(rename_all = "kebab-case", deny_unknown_fields)] #[serde(rename_all = "kebab-case", deny_unknown_fields)]
pub enum SourceStrategy { pub enum SourceStrategy {
/// Use `tool.uv.sources` when resolving dependencies. /// Use `tool.uv.sources` when resolving dependencies.

View file

@ -62,7 +62,7 @@ pub static RAYON_PARALLELISM: AtomicUsize = AtomicUsize::new(0);
/// `LazyLock::force(&RAYON_INITIALIZE)`. /// `LazyLock::force(&RAYON_INITIALIZE)`.
pub static RAYON_INITIALIZE: LazyLock<()> = LazyLock::new(|| { pub static RAYON_INITIALIZE: LazyLock<()> = LazyLock::new(|| {
rayon::ThreadPoolBuilder::new() rayon::ThreadPoolBuilder::new()
.num_threads(RAYON_PARALLELISM.load(Ordering::SeqCst)) .num_threads(RAYON_PARALLELISM.load(Ordering::Relaxed))
.stack_size(min_stack_size()) .stack_size(min_stack_size())
.build_global() .build_global()
.expect("failed to initialize global rayon pool"); .expect("failed to initialize global rayon pool");

View file

@ -1,4 +1,6 @@
use serde::{Deserialize, Deserializer}; use serde::{Deserialize, Deserializer};
#[cfg(feature = "schemars")]
use std::borrow::Cow;
use std::str::FromStr; use std::str::FromStr;
use url::Url; use url::Url;
@ -143,20 +145,15 @@ impl std::fmt::Display for TrustedHost {
#[cfg(feature = "schemars")] #[cfg(feature = "schemars")]
impl schemars::JsonSchema for TrustedHost { impl schemars::JsonSchema for TrustedHost {
fn schema_name() -> String { fn schema_name() -> Cow<'static, str> {
"TrustedHost".to_string() Cow::Borrowed("TrustedHost")
} }
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema { fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::schema::SchemaObject { schemars::json_schema!({
instance_type: Some(schemars::schema::InstanceType::String.into()), "type": "string",
metadata: Some(Box::new(schemars::schema::Metadata { "description": "A host or host-port pair."
description: Some("A host or host-port pair.".to_string()), })
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
} }
} }

View file

@ -3,7 +3,7 @@ use std::path::PathBuf;
use anstream::println; use anstream::println;
use anyhow::{Result, bail}; use anyhow::{Result, bail};
use pretty_assertions::StrComparison; use pretty_assertions::StrComparison;
use schemars::{JsonSchema, schema_for}; use schemars::JsonSchema;
use serde::Deserialize; use serde::Deserialize;
use uv_settings::Options as SettingsOptions; use uv_settings::Options as SettingsOptions;
@ -91,7 +91,10 @@ const REPLACEMENTS: &[(&str, &str)] = &[
/// Generate the JSON schema for the combined options as a string. /// Generate the JSON schema for the combined options as a string.
fn generate() -> String { fn generate() -> String {
let schema = schema_for!(CombinedOptions); let settings = schemars::generate::SchemaSettings::draft07();
let generator = schemars::SchemaGenerator::new(settings);
let schema = generator.into_root_schema_for::<CombinedOptions>();
let mut output = serde_json::to_string_pretty(&schema).unwrap(); let mut output = serde_json::to_string_pretty(&schema).unwrap();
for (value, replacement) in REPLACEMENTS { for (value, replacement) in REPLACEMENTS {
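Schema generation likewise switches from the `schema_for!` macro to an explicit generator configured for Draft 7, matching the schemars 1.x calls shown in the hunk. A small sketch for a hypothetical derived options type:

```rust
// Sketch: generating a Draft 7 root schema with schemars 1.x for a hypothetical
// `ExampleOptions` type (illustrative only).
use schemars::JsonSchema;

#[derive(JsonSchema)]
struct ExampleOptions {
    /// Whether to enable the example feature.
    enabled: bool,
}

fn main() {
    let settings = schemars::generate::SchemaSettings::draft07();
    let generator = schemars::SchemaGenerator::new(settings);
    let schema = generator.into_root_schema_for::<ExampleOptions>();
    println!("{}", serde_json::to_string_pretty(&schema).unwrap());
}
```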

View file

@ -11,7 +11,7 @@ use crate::ROOT_DIR;
use crate::generate_all::Mode; use crate::generate_all::Mode;
/// Contains current supported targets /// Contains current supported targets
const TARGETS_YML_URL: &str = "https://raw.githubusercontent.com/astral-sh/python-build-standalone/refs/tags/20250612/cpython-unix/targets.yml"; const TARGETS_YML_URL: &str = "https://raw.githubusercontent.com/astral-sh/python-build-standalone/refs/tags/20250702/cpython-unix/targets.yml";
#[derive(clap::Args)] #[derive(clap::Args)]
pub(crate) struct Args { pub(crate) struct Args {
@ -130,7 +130,7 @@ async fn generate() -> Result<String> {
output.push_str("//! DO NOT EDIT\n"); output.push_str("//! DO NOT EDIT\n");
output.push_str("//!\n"); output.push_str("//!\n");
output.push_str("//! Generated with `cargo run dev generate-sysconfig-metadata`\n"); output.push_str("//! Generated with `cargo run dev generate-sysconfig-metadata`\n");
output.push_str("//! Targets from <https://github.com/astral-sh/python-build-standalone/blob/20250612/cpython-unix/targets.yml>\n"); output.push_str("//! Targets from <https://github.com/astral-sh/python-build-standalone/blob/20250702/cpython-unix/targets.yml>\n");
output.push_str("//!\n"); output.push_str("//!\n");
// Disable clippy/fmt // Disable clippy/fmt

View file

@ -11,6 +11,7 @@ use itertools::Itertools;
use rustc_hash::FxHashMap; use rustc_hash::FxHashMap;
use thiserror::Error; use thiserror::Error;
use tracing::{debug, instrument, trace}; use tracing::{debug, instrument, trace};
use uv_build_backend::check_direct_build; use uv_build_backend::check_direct_build;
use uv_build_frontend::{SourceBuild, SourceBuildContext}; use uv_build_frontend::{SourceBuild, SourceBuildContext};
use uv_cache::Cache; use uv_cache::Cache;
@ -35,8 +36,8 @@ use uv_resolver::{
PythonRequirement, Resolver, ResolverEnvironment, PythonRequirement, Resolver, ResolverEnvironment,
}; };
use uv_types::{ use uv_types::{
AnyErrorBuild, BuildContext, BuildIsolation, BuildStack, EmptyInstalledPackages, HashStrategy, AnyErrorBuild, BuildArena, BuildContext, BuildIsolation, BuildStack, EmptyInstalledPackages,
InFlight, HashStrategy, InFlight,
}; };
use uv_workspace::WorkspaceCache; use uv_workspace::WorkspaceCache;
@ -179,6 +180,10 @@ impl BuildContext for BuildDispatch<'_> {
&self.shared_state.git &self.shared_state.git
} }
fn build_arena(&self) -> &BuildArena<SourceBuild> {
&self.shared_state.build_arena
}
fn capabilities(&self) -> &IndexCapabilities { fn capabilities(&self) -> &IndexCapabilities {
&self.shared_state.capabilities &self.shared_state.capabilities
} }
@ -448,12 +453,6 @@ impl BuildContext for BuildDispatch<'_> {
build_kind: BuildKind, build_kind: BuildKind,
version_id: Option<&'data str>, version_id: Option<&'data str>,
) -> Result<Option<DistFilename>, BuildDispatchError> { ) -> Result<Option<DistFilename>, BuildDispatchError> {
// Direct builds are a preview feature with the uv build backend.
if self.preview.is_disabled() {
trace!("Preview is disabled, not checking for direct build");
return Ok(None);
}
let source_tree = if let Some(subdir) = subdirectory { let source_tree = if let Some(subdir) = subdirectory {
source.join(subdir) source.join(subdir)
} else { } else {
@ -521,6 +520,8 @@ pub struct SharedState {
index: InMemoryIndex, index: InMemoryIndex,
/// The downloaded distributions. /// The downloaded distributions.
in_flight: InFlight, in_flight: InFlight,
/// Build directories for any PEP 517 builds executed during resolution or installation.
build_arena: BuildArena<SourceBuild>,
} }
impl SharedState { impl SharedState {
@ -533,6 +534,7 @@ impl SharedState {
Self { Self {
git: self.git.clone(), git: self.git.clone(),
capabilities: self.capabilities.clone(), capabilities: self.capabilities.clone(),
build_arena: self.build_arena.clone(),
..Default::default() ..Default::default()
} }
} }
@ -556,4 +558,9 @@ impl SharedState {
pub fn capabilities(&self) -> &IndexCapabilities { pub fn capabilities(&self) -> &IndexCapabilities {
&self.capabilities &self.capabilities
} }
/// Return the [`BuildArena`] used by the [`SharedState`].
pub fn build_arena(&self) -> &BuildArena<SourceBuild> {
&self.build_arena
}
} }
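`SharedState` now carries a `BuildArena<SourceBuild>` so PEP 517 build environments can be handed back and reused between metadata resolution and wheel builds. The arena type itself is not shown in this diff; as a rough assumption, a shared map supporting the `remove`/`insert` pattern used here might look like the following sketch.

```rust
// Hypothetical sketch only: uv_types::BuildArena is not shown in this diff.
// It models the remove-then-insert reuse pattern and the cheap Clone used by
// SharedState::fork (clones share the same underlying map).
use std::collections::HashMap;
use std::hash::Hash;
use std::sync::{Arc, Mutex};

#[derive(Clone)]
struct Arena<K, V>(Arc<Mutex<HashMap<K, V>>>);

impl<K: Eq + Hash, V> Arena<K, V> {
    fn new() -> Self {
        Self(Arc::new(Mutex::new(HashMap::new())))
    }

    /// Take a cached value out of the arena, if present.
    fn remove(&self, key: &K) -> Option<V> {
        self.0.lock().unwrap().remove(key)
    }

    /// Put a (possibly new) value back so later steps can reuse it.
    fn insert(&self, key: K, value: V) {
        self.0.lock().unwrap().insert(key, value);
    }
}

fn main() {
    let arena: Arena<&str, u32> = Arena::new();
    arena.insert("pkg", 1);
    assert_eq!(arena.remove(&"pkg"), Some(1));
    assert_eq!(arena.remove(&"pkg"), None);
}
```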

View file

@ -27,7 +27,6 @@ rkyv = { workspace = true, features = ["smallvec-1"] }
serde = { workspace = true } serde = { workspace = true }
smallvec = { workspace = true } smallvec = { workspace = true }
thiserror = { workspace = true } thiserror = { workspace = true }
url = { workspace = true }
[dev-dependencies] [dev-dependencies]
insta = { version = "1.40.0" } insta = { version = "1.40.0" }

View file

@ -5,7 +5,6 @@ use std::str::FromStr;
use memchr::memchr; use memchr::memchr;
use serde::{Deserialize, Deserializer, Serialize, Serializer, de}; use serde::{Deserialize, Deserializer, Serialize, Serializer, de};
use thiserror::Error; use thiserror::Error;
use url::Url;
use uv_cache_key::cache_digest; use uv_cache_key::cache_digest;
use uv_normalize::{InvalidNameError, PackageName}; use uv_normalize::{InvalidNameError, PackageName};
@ -300,29 +299,6 @@ impl WheelFilename {
} }
} }
impl TryFrom<&Url> for WheelFilename {
type Error = WheelFilenameError;
fn try_from(url: &Url) -> Result<Self, Self::Error> {
let filename = url
.path_segments()
.ok_or_else(|| {
WheelFilenameError::InvalidWheelFileName(
url.to_string(),
"URL must have a path".to_string(),
)
})?
.next_back()
.ok_or_else(|| {
WheelFilenameError::InvalidWheelFileName(
url.to_string(),
"URL must contain a filename".to_string(),
)
})?;
Self::from_str(filename)
}
}
impl<'de> Deserialize<'de> for WheelFilename { impl<'de> Deserialize<'de> for WheelFilename {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error> fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where where

View file

@ -29,6 +29,7 @@ uv-platform-tags = { workspace = true }
uv-pypi-types = { workspace = true } uv-pypi-types = { workspace = true }
uv-redacted = { workspace = true } uv-redacted = { workspace = true }
uv-small-str = { workspace = true } uv-small-str = { workspace = true }
uv-warnings = { workspace = true }
arcstr = { workspace = true } arcstr = { workspace = true }
bitflags = { workspace = true } bitflags = { workspace = true }

View file

@ -1,3 +1,4 @@
use std::borrow::Cow;
use std::fmt::{self, Display, Formatter}; use std::fmt::{self, Display, Formatter};
use std::str::FromStr; use std::str::FromStr;
@ -160,16 +161,33 @@ impl UrlString {
.unwrap_or(self.as_ref()) .unwrap_or(self.as_ref())
} }
/// Return the [`UrlString`] with any fragments removed. /// Return the [`UrlString`] (as a [`Cow`]) with any fragments removed.
#[must_use] #[must_use]
pub fn without_fragment(&self) -> Self { pub fn without_fragment(&self) -> Cow<'_, Self> {
Self(
self.as_ref() self.as_ref()
.split_once('#') .split_once('#')
.map(|(path, _)| path) .map(|(path, _)| Cow::Owned(UrlString(SmallString::from(path))))
.map(SmallString::from) .unwrap_or(Cow::Borrowed(self))
.unwrap_or_else(|| self.0.clone()), }
)
/// Return the [`UrlString`] (as a [`Cow`]) with trailing slash removed.
///
/// This matches the semantics of [`Url::pop_if_empty`], which will not trim a trailing slash if
/// it's the only path segment, e.g., `https://example.com/` would be unchanged.
#[must_use]
pub fn without_trailing_slash(&self) -> Cow<'_, Self> {
self.as_ref()
.strip_suffix('/')
.filter(|path| {
// Only strip the trailing slash if there's _another_ trailing slash that isn't a
// part of the scheme.
path.split_once("://")
.map(|(_scheme, rest)| rest)
.unwrap_or(path)
.contains('/')
})
.map(|path| Cow::Owned(UrlString(SmallString::from(path))))
.unwrap_or(Cow::Borrowed(self))
} }
} }
@ -252,16 +270,51 @@ mod tests {
#[test] #[test]
fn without_fragment() { fn without_fragment() {
// Borrows a URL without a fragment
let url = UrlString("https://example.com/path".into());
assert_eq!(&*url.without_fragment(), &url);
assert!(matches!(url.without_fragment(), Cow::Borrowed(_)));
// Removes the fragment if present on the URL
let url = UrlString("https://example.com/path?query#fragment".into()); let url = UrlString("https://example.com/path?query#fragment".into());
assert_eq!( assert_eq!(
url.without_fragment(), &*url.without_fragment(),
UrlString("https://example.com/path?query".into()) &UrlString("https://example.com/path?query".into())
); );
assert!(matches!(url.without_fragment(), Cow::Owned(_)));
}
let url = UrlString("https://example.com/path#fragment".into()); #[test]
assert_eq!(url.base_str(), "https://example.com/path"); fn without_trailing_slash() {
// Borrows a URL without a slash
let url = UrlString("https://example.com/path".into()); let url = UrlString("https://example.com/path".into());
assert_eq!(url.base_str(), "https://example.com/path"); assert_eq!(&*url.without_trailing_slash(), &url);
assert!(matches!(url.without_trailing_slash(), Cow::Borrowed(_)));
// Removes the trailing slash if present on the URL
let url = UrlString("https://example.com/path/".into());
assert_eq!(
&*url.without_trailing_slash(),
&UrlString("https://example.com/path".into())
);
assert!(matches!(url.without_trailing_slash(), Cow::Owned(_)));
// Does not remove a trailing slash if it's the only path segment
let url = UrlString("https://example.com/".into());
assert_eq!(&*url.without_trailing_slash(), &url);
assert!(matches!(url.without_trailing_slash(), Cow::Borrowed(_)));
// Does not remove a trailing slash if it's the only path segment with a missing scheme
let url = UrlString("example.com/".into());
assert_eq!(&*url.without_trailing_slash(), &url);
assert!(matches!(url.without_trailing_slash(), Cow::Borrowed(_)));
// Removes the trailing slash when the scheme is missing
let url = UrlString("example.com/path/".into());
assert_eq!(
&*url.without_trailing_slash(),
&UrlString("example.com/path".into())
);
assert!(matches!(url.without_trailing_slash(), Cow::Owned(_)));
} }
} }
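As its doc comment notes, the new `without_trailing_slash` mirrors `Url::pop_if_empty`, which drops a trailing empty path segment unless it is the only segment. A quick sketch of that behavior with the `url` crate directly:

```rust
// Sketch: the url-crate behavior that UrlString::without_trailing_slash mirrors.
use url::Url;

fn normalize(mut url: Url) -> Url {
    if let Ok(mut segments) = url.path_segments_mut() {
        // Drops a trailing empty segment (i.e., a trailing slash), except when
        // it is the only path segment.
        segments.pop_if_empty();
    }
    url
}

fn main() {
    let url = Url::parse("https://example.com/simple/").unwrap();
    assert_eq!(normalize(url).as_str(), "https://example.com/simple");

    // The root path is left unchanged.
    let url = Url::parse("https://example.com/").unwrap();
    assert_eq!(normalize(url).as_str(), "https://example.com/");
}
```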

View file

@ -12,6 +12,7 @@ use url::{ParseError, Url};
use uv_pep508::{Scheme, VerbatimUrl, VerbatimUrlError, split_scheme}; use uv_pep508::{Scheme, VerbatimUrl, VerbatimUrlError, split_scheme};
use uv_redacted::DisplaySafeUrl; use uv_redacted::DisplaySafeUrl;
use uv_warnings::warn_user;
use crate::{Index, IndexStatusCodeStrategy, Verbatim}; use crate::{Index, IndexStatusCodeStrategy, Verbatim};
@ -37,6 +38,8 @@ impl IndexUrl {
/// ///
/// If no root directory is provided, relative paths are resolved against the current working /// If no root directory is provided, relative paths are resolved against the current working
/// directory. /// directory.
///
/// Normalizes non-file URLs by removing trailing slashes for consistency.
pub fn parse(path: &str, root_dir: Option<&Path>) -> Result<Self, IndexUrlError> { pub fn parse(path: &str, root_dir: Option<&Path>) -> Result<Self, IndexUrlError> {
let url = match split_scheme(path) { let url = match split_scheme(path) {
Some((scheme, ..)) => { Some((scheme, ..)) => {
@ -92,20 +95,15 @@ impl IndexUrl {
#[cfg(feature = "schemars")] #[cfg(feature = "schemars")]
impl schemars::JsonSchema for IndexUrl { impl schemars::JsonSchema for IndexUrl {
fn schema_name() -> String { fn schema_name() -> Cow<'static, str> {
"IndexUrl".to_string() Cow::Borrowed("IndexUrl")
} }
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema { fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::schema::SchemaObject { schemars::json_schema!({
instance_type: Some(schemars::schema::InstanceType::String.into()), "type": "string",
metadata: Some(Box::new(schemars::schema::Metadata { "description": "The URL of an index to use for fetching packages (e.g., `https://pypi.org/simple`), or a local path."
description: Some("The URL of an index to use for fetching packages (e.g., `https://pypi.org/simple`), or a local path.".to_string()), })
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
} }
} }
@ -140,6 +138,30 @@ impl IndexUrl {
Cow::Owned(url) Cow::Owned(url)
} }
} }
/// Warn user if the given URL was provided as an ambiguous relative path.
///
/// This is a temporary warning. Ambiguous values will not be
/// accepted in the future.
pub fn warn_on_disambiguated_relative_path(&self) {
let Self::Path(verbatim_url) = &self else {
return;
};
if let Some(path) = verbatim_url.given() {
if !is_disambiguated_path(path) {
if cfg!(windows) {
warn_user!(
"Relative paths passed to `--index` or `--default-index` should be disambiguated from index names (use `.\\{path}` or `./{path}`). Support for ambiguous values will be removed in the future"
);
} else {
warn_user!(
"Relative paths passed to `--index` or `--default-index` should be disambiguated from index names (use `./{path}`). Support for ambiguous values will be removed in the future"
);
}
}
}
}
} }
impl Display for IndexUrl { impl Display for IndexUrl {
@ -162,6 +184,28 @@ impl Verbatim for IndexUrl {
} }
} }
/// Checks if a path is disambiguated.
///
/// Disambiguated paths are absolute paths, paths with valid schemes,
/// and paths starting with "./" or "../" on Unix or ".\\", "..\\",
/// "./", or "../" on Windows.
fn is_disambiguated_path(path: &str) -> bool {
if cfg!(windows) {
if path.starts_with(".\\") || path.starts_with("..\\") || path.starts_with('/') {
return true;
}
}
if path.starts_with("./") || path.starts_with("../") || Path::new(path).is_absolute() {
return true;
}
// Check if the path has a scheme (like `file://`)
if let Some((scheme, _)) = split_scheme(path) {
return Scheme::parse(scheme).is_some();
}
// This is an ambiguous relative path
false
}
/// An error that can occur when parsing an [`IndexUrl`]. /// An error that can occur when parsing an [`IndexUrl`].
#[derive(Error, Debug)] #[derive(Error, Debug)]
pub enum IndexUrlError { pub enum IndexUrlError {
@ -214,16 +258,23 @@ impl<'de> serde::de::Deserialize<'de> for IndexUrl {
} }
impl From<VerbatimUrl> for IndexUrl { impl From<VerbatimUrl> for IndexUrl {
fn from(url: VerbatimUrl) -> Self { fn from(mut url: VerbatimUrl) -> Self {
if url.scheme() == "file" { if url.scheme() == "file" {
Self::Path(Arc::new(url)) Self::Path(Arc::new(url))
} else if *url.raw() == *PYPI_URL { } else {
// Remove trailing slashes for consistency. They'll be re-added if necessary when
// querying the Simple API.
if let Ok(mut path_segments) = url.raw_mut().path_segments_mut() {
path_segments.pop_if_empty();
}
if *url.raw() == *PYPI_URL {
Self::Pypi(Arc::new(url)) Self::Pypi(Arc::new(url))
} else { } else {
Self::Url(Arc::new(url)) Self::Url(Arc::new(url))
} }
} }
} }
}
impl From<IndexUrl> for DisplaySafeUrl { impl From<IndexUrl> for DisplaySafeUrl {
fn from(index: IndexUrl) -> Self { fn from(index: IndexUrl) -> Self {
@ -411,6 +462,19 @@ impl<'a> IndexLocations {
indexes indexes
} }
} }
/// Add all authenticated sources to the cache.
pub fn cache_index_credentials(&self) {
for index in self.allowed_indexes() {
if let Some(credentials) = index.credentials() {
let credentials = Arc::new(credentials);
uv_auth::store_credentials(index.raw_url(), credentials.clone());
if let Some(root_url) = index.root_url() {
uv_auth::store_credentials(&root_url, credentials.clone());
}
}
}
}
} }
impl From<&IndexLocations> for uv_auth::Indexes { impl From<&IndexLocations> for uv_auth::Indexes {
@ -625,3 +689,41 @@ impl IndexCapabilities {
.insert(Flags::FORBIDDEN); .insert(Flags::FORBIDDEN);
} }
} }
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_index_url_parse_valid_paths() {
// Absolute path
assert!(is_disambiguated_path("/absolute/path"));
// Relative path
assert!(is_disambiguated_path("./relative/path"));
assert!(is_disambiguated_path("../../relative/path"));
if cfg!(windows) {
// Windows absolute path
assert!(is_disambiguated_path("C:/absolute/path"));
// Windows relative path
assert!(is_disambiguated_path(".\\relative\\path"));
assert!(is_disambiguated_path("..\\..\\relative\\path"));
}
}
#[test]
fn test_index_url_parse_ambiguous_paths() {
// Test single-segment ambiguous path
assert!(!is_disambiguated_path("index"));
// Test multi-segment ambiguous path
assert!(!is_disambiguated_path("relative/path"));
}
#[test]
fn test_index_url_parse_with_schemes() {
assert!(is_disambiguated_path("file:///absolute/path"));
assert!(is_disambiguated_path("https://registry.com/simple/"));
assert!(is_disambiguated_path(
"git+https://github.com/example/repo.git"
));
}
}

View file

@ -365,7 +365,7 @@ impl InstalledDist {
pub fn installer(&self) -> Result<Option<String>, InstalledDistError> { pub fn installer(&self) -> Result<Option<String>, InstalledDistError> {
let path = self.install_path().join("INSTALLER"); let path = self.install_path().join("INSTALLER");
match fs::read_to_string(path) { match fs::read_to_string(path) {
Ok(installer) => Ok(Some(installer)), Ok(installer) => Ok(Some(installer.trim().to_owned())),
Err(err) if err.kind() == std::io::ErrorKind::NotFound => Ok(None), Err(err) if err.kind() == std::io::ErrorKind::NotFound => Ok(None),
Err(err) => Err(err.into()), Err(err) => Err(err.into()),
} }
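The added `trim()` matters because `INSTALLER` files conventionally end with a newline, so the raw file contents never compare equal to the installer name:

```rust
// Sketch: why trimming the INSTALLER contents matters.
fn main() {
    let raw = "uv\n"; // typical on-disk contents, trailing newline included
    assert_ne!(raw, "uv");
    assert_eq!(raw.trim(), "uv");
}
```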

View file

@ -3,6 +3,8 @@
//! flags set. //! flags set.
use serde::{Deserialize, Deserializer, Serialize}; use serde::{Deserialize, Deserializer, Serialize};
#[cfg(feature = "schemars")]
use std::borrow::Cow;
use std::path::Path; use std::path::Path;
use crate::{Index, IndexUrl}; use crate::{Index, IndexUrl};
@ -50,14 +52,14 @@ macro_rules! impl_index {
#[cfg(feature = "schemars")] #[cfg(feature = "schemars")]
impl schemars::JsonSchema for $name { impl schemars::JsonSchema for $name {
fn schema_name() -> String { fn schema_name() -> Cow<'static, str> {
IndexUrl::schema_name() IndexUrl::schema_name()
} }
fn json_schema( fn json_schema(
r#gen: &mut schemars::r#gen::SchemaGenerator, generator: &mut schemars::generate::SchemaGenerator,
) -> schemars::schema::Schema { ) -> schemars::Schema {
IndexUrl::json_schema(r#gen) IndexUrl::json_schema(generator)
} }
} }
}; };

View file

@ -66,15 +66,8 @@ impl RequiresPython {
) -> Option<Self> { ) -> Option<Self> {
// Convert to PubGrub range and perform an intersection. // Convert to PubGrub range and perform an intersection.
let range = specifiers let range = specifiers
.into_iter() .map(|specs| release_specifiers_to_ranges(specs.clone()))
.map(|specifier| release_specifiers_to_ranges(specifier.clone())) .reduce(|acc, r| acc.intersection(&r))?;
.fold(None, |range: Option<Ranges<Version>>, requires_python| {
if let Some(range) = range {
Some(range.intersection(&requires_python))
} else {
Some(requires_python)
}
})?;
// If the intersection is empty, return `None`. // If the intersection is empty, return `None`.
if range.is_empty() { if range.is_empty() {

View file

@ -1,3 +1,5 @@
#[cfg(feature = "schemars")]
use std::borrow::Cow;
use std::ops::Deref; use std::ops::Deref;
use http::StatusCode; use http::StatusCode;
@ -136,17 +138,17 @@ impl<'de> Deserialize<'de> for SerializableStatusCode {
#[cfg(feature = "schemars")] #[cfg(feature = "schemars")]
impl schemars::JsonSchema for SerializableStatusCode { impl schemars::JsonSchema for SerializableStatusCode {
fn schema_name() -> String { fn schema_name() -> Cow<'static, str> {
"StatusCode".to_string() Cow::Borrowed("StatusCode")
} }
fn json_schema(r#gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema { fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
let mut schema = r#gen.subschema_for::<u16>().into_object(); schemars::json_schema!({
schema.metadata().description = Some("HTTP status code (100-599)".to_string()); "type": "number",
schema.number().minimum = Some(100.0); "minimum": 100,
schema.number().maximum = Some(599.0); "maximum": 599,
"description": "HTTP status code (100-599)"
schema.into() })
} }
} }

View file

@ -108,6 +108,8 @@ pub enum Error {
CacheHeal(String, HashAlgorithm), CacheHeal(String, HashAlgorithm),
#[error("The source distribution requires Python {0}, but {1} is installed")] #[error("The source distribution requires Python {0}, but {1} is installed")]
RequiresPython(VersionSpecifiers, Version), RequiresPython(VersionSpecifiers, Version),
#[error("Failed to identify base Python interpreter")]
BaseInterpreter(#[source] std::io::Error),
/// A generic request middleware error happened while making a request. /// A generic request middleware error happened while making a request.
/// Refer to the error message for more details. /// Refer to the error message for more details.

View file

@ -13,7 +13,7 @@ use uv_git_types::{GitReference, GitUrl, GitUrlParseError};
use uv_normalize::{ExtraName, GroupName, PackageName}; use uv_normalize::{ExtraName, GroupName, PackageName};
use uv_pep440::VersionSpecifiers; use uv_pep440::VersionSpecifiers;
use uv_pep508::{MarkerTree, VerbatimUrl, VersionOrUrl, looks_like_git_repository}; use uv_pep508::{MarkerTree, VerbatimUrl, VersionOrUrl, looks_like_git_repository};
use uv_pypi_types::{ConflictItem, ParsedUrlError, VerbatimParsedUrl}; use uv_pypi_types::{ConflictItem, ParsedGitUrl, ParsedUrlError, VerbatimParsedUrl};
use uv_redacted::DisplaySafeUrl; use uv_redacted::DisplaySafeUrl;
use uv_workspace::Workspace; use uv_workspace::Workspace;
use uv_workspace::pyproject::{PyProjectToml, Source, Sources}; use uv_workspace::pyproject::{PyProjectToml, Source, Sources};
@ -700,17 +700,23 @@ fn path_source(
}; };
if is_dir { if is_dir {
if let Some(git_member) = git_member { if let Some(git_member) = git_member {
let git = git_member.git_source.git.clone();
let subdirectory = uv_fs::relative_to(install_path, git_member.fetch_root) let subdirectory = uv_fs::relative_to(install_path, git_member.fetch_root)
.expect("Workspace member must be relative"); .expect("Workspace member must be relative");
let subdirectory = uv_fs::normalize_path_buf(subdirectory); let subdirectory = uv_fs::normalize_path_buf(subdirectory);
return Ok(RequirementSource::Git { let subdirectory = if subdirectory == PathBuf::new() {
git: git_member.git_source.git.clone(),
subdirectory: if subdirectory == PathBuf::new() {
None None
} else { } else {
Some(subdirectory.into_boxed_path()) Some(subdirectory.into_boxed_path())
}, };
url, let url = DisplaySafeUrl::from(ParsedGitUrl {
url: git.clone(),
subdirectory: subdirectory.clone(),
});
return Ok(RequirementSource::Git {
git,
subdirectory,
url: VerbatimUrl::from_url(url),
}); });
} }

View file

@ -43,7 +43,7 @@ use uv_normalize::PackageName;
use uv_pep440::{Version, release_specifiers_to_ranges}; use uv_pep440::{Version, release_specifiers_to_ranges};
use uv_platform_tags::Tags; use uv_platform_tags::Tags;
use uv_pypi_types::{HashAlgorithm, HashDigest, HashDigests, PyProjectToml, ResolutionMetadata}; use uv_pypi_types::{HashAlgorithm, HashDigest, HashDigests, PyProjectToml, ResolutionMetadata};
use uv_types::{BuildContext, BuildStack, SourceBuildTrait}; use uv_types::{BuildContext, BuildKey, BuildStack, SourceBuildTrait};
use uv_workspace::pyproject::ToolUvSources; use uv_workspace::pyproject::ToolUvSources;
use crate::distribution_database::ManagedClient; use crate::distribution_database::ManagedClient;
@ -1860,6 +1860,12 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
} }
}; };
// If the URL is already precise, return it.
if self.build_context.git().get_precise(git).is_some() {
debug!("Precise commit already known: {source}");
return Ok(());
}
// If this is GitHub URL, attempt to resolve to a precise commit using the GitHub API. // If this is GitHub URL, attempt to resolve to a precise commit using the GitHub API.
if self if self
.build_context .build_context
@ -2270,6 +2276,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
fs::create_dir_all(&cache_shard) fs::create_dir_all(&cache_shard)
.await .await
.map_err(Error::CacheWrite)?; .map_err(Error::CacheWrite)?;
// Try a direct build if that isn't disabled and the uv build backend is used. // Try a direct build if that isn't disabled and the uv build backend is used.
let disk_filename = if let Some(name) = self let disk_filename = if let Some(name) = self
.build_context .build_context
@ -2290,7 +2297,47 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
// In the uv build backend, the normalized filename and the disk filename are the same. // In the uv build backend, the normalized filename and the disk filename are the same.
name.to_string() name.to_string()
} else { } else {
// Identify the base Python interpreter to use in the cache key.
let base_python = if cfg!(unix) {
self.build_context self.build_context
.interpreter()
.find_base_python()
.map_err(Error::BaseInterpreter)?
} else {
self.build_context
.interpreter()
.to_base_python()
.map_err(Error::BaseInterpreter)?
};
let build_kind = if source.is_editable() {
BuildKind::Editable
} else {
BuildKind::Wheel
};
let build_key = BuildKey {
base_python: base_python.into_boxed_path(),
source_root: source_root.to_path_buf().into_boxed_path(),
subdirectory: subdirectory
.map(|subdirectory| subdirectory.to_path_buf().into_boxed_path()),
source_strategy,
build_kind,
};
if let Some(builder) = self.build_context.build_arena().remove(&build_key) {
debug!("Creating build environment for: {source}");
let wheel = builder.wheel(temp_dir.path()).await.map_err(Error::Build)?;
// Store the build context.
self.build_context.build_arena().insert(build_key, builder);
wheel
} else {
debug!("Reusing existing build environment for: {source}");
let builder = self
.build_context
.setup_build( .setup_build(
source_root, source_root,
subdirectory, subdirectory,
@ -2307,10 +2354,16 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
self.build_stack.cloned().unwrap_or_default(), self.build_stack.cloned().unwrap_or_default(),
) )
.await .await
.map_err(|err| Error::Build(err.into()))? .map_err(|err| Error::Build(err.into()))?;
.wheel(temp_dir.path())
.await // Build the wheel.
.map_err(Error::Build)? let wheel = builder.wheel(temp_dir.path()).await.map_err(Error::Build)?;
// Store the build context.
self.build_context.build_arena().insert(build_key, builder);
wheel
}
}; };
// Read the metadata from the wheel. // Read the metadata from the wheel.
@ -2365,6 +2418,26 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
} }
} }
// Identify the base Python interpreter to use in the cache key.
let base_python = if cfg!(unix) {
self.build_context
.interpreter()
.find_base_python()
.map_err(Error::BaseInterpreter)?
} else {
self.build_context
.interpreter()
.to_base_python()
.map_err(Error::BaseInterpreter)?
};
// Determine whether this is an editable or non-editable build.
let build_kind = if source.is_editable() {
BuildKind::Editable
} else {
BuildKind::Wheel
};
// Set up the builder. // Set up the builder.
let mut builder = self let mut builder = self
.build_context .build_context
@ -2375,11 +2448,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
Some(&source.to_string()), Some(&source.to_string()),
source.as_dist(), source.as_dist(),
source_strategy, source_strategy,
if source.is_editable() { build_kind,
BuildKind::Editable
} else {
BuildKind::Wheel
},
BuildOutput::Debug, BuildOutput::Debug,
self.build_stack.cloned().unwrap_or_default(), self.build_stack.cloned().unwrap_or_default(),
) )
@ -2388,6 +2457,21 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
// Build the metadata. // Build the metadata.
let dist_info = builder.metadata().await.map_err(Error::Build)?; let dist_info = builder.metadata().await.map_err(Error::Build)?;
// Store the build context.
self.build_context.build_arena().insert(
BuildKey {
base_python: base_python.into_boxed_path(),
source_root: source_root.to_path_buf().into_boxed_path(),
subdirectory: subdirectory
.map(|subdirectory| subdirectory.to_path_buf().into_boxed_path()),
source_strategy,
build_kind,
},
builder,
);
// Return the `.dist-info` directory, if it exists.
let Some(dist_info) = dist_info else { let Some(dist_info) = dist_info else {
return Ok(None); return Ok(None);
}; };

View file

@ -2,11 +2,11 @@ use std::{ffi::OsString, path::PathBuf};
#[derive(Debug, thiserror::Error)] #[derive(Debug, thiserror::Error)]
pub enum Error { pub enum Error {
#[error(transparent)] #[error("Failed to read from zip file")]
Zip(#[from] zip::result::ZipError), Zip(#[from] zip::result::ZipError),
#[error(transparent)] #[error("Failed to read from zip file")]
AsyncZip(#[from] async_zip::error::ZipError), AsyncZip(#[from] async_zip::error::ZipError),
#[error(transparent)] #[error("I/O operation failed during extraction")]
Io(#[from] std::io::Error), Io(#[from] std::io::Error),
#[error( #[error(
"The top-level of the archive must only contain a list directory, but it contains: {0:?}" "The top-level of the archive must only contain a list directory, but it contains: {0:?}"

View file

@ -601,6 +601,7 @@ pub fn is_virtualenv_base(path: impl AsRef<Path>) -> bool {
/// A file lock that is automatically released when dropped. /// A file lock that is automatically released when dropped.
#[derive(Debug)] #[derive(Debug)]
#[must_use]
pub struct LockedFile(fs_err::File); pub struct LockedFile(fs_err::File);
impl LockedFile { impl LockedFile {

View file

@ -330,11 +330,11 @@ pub struct PortablePathBuf(Box<Path>);
#[cfg(feature = "schemars")] #[cfg(feature = "schemars")]
impl schemars::JsonSchema for PortablePathBuf { impl schemars::JsonSchema for PortablePathBuf {
fn schema_name() -> String { fn schema_name() -> Cow<'static, str> {
PathBuf::schema_name() Cow::Borrowed("PortablePathBuf")
} }
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema { fn json_schema(_gen: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
PathBuf::json_schema(_gen) PathBuf::json_schema(_gen)
} }
} }

View file

@ -17,7 +17,7 @@ fn get_binary_type(path: &Path) -> windows::core::Result<u32> {
.chain(Some(0)) .chain(Some(0))
.collect::<Vec<u16>>(); .collect::<Vec<u16>>();
// SAFETY: winapi call // SAFETY: winapi call
unsafe { GetBinaryTypeW(PCWSTR(name.as_ptr()), &mut binary_type)? }; unsafe { GetBinaryTypeW(PCWSTR(name.as_ptr()), &raw mut binary_type)? };
Ok(binary_type) Ok(binary_type)
} }

View file

@ -20,6 +20,8 @@ use uv_redacted::DisplaySafeUrl;
use uv_static::EnvVars; use uv_static::EnvVars;
use uv_version::version; use uv_version::version;
use crate::rate_limit::{GITHUB_RATE_LIMIT_STATUS, is_github_rate_limited};
/// A file indicates that if present, `git reset` has been done and a repo /// A file indicates that if present, `git reset` has been done and a repo
/// checkout is ready to go. See [`GitCheckout::reset`] for why we need this. /// checkout is ready to go. See [`GitCheckout::reset`] for why we need this.
const CHECKOUT_READY_LOCK: &str = ".ok"; const CHECKOUT_READY_LOCK: &str = ".ok";
@ -787,7 +789,15 @@ fn github_fast_path(
} }
}; };
let url = format!("https://api.github.com/repos/{owner}/{repo}/commits/{github_branch_name}"); // Check if we're rate-limited by GitHub before determining the FastPathRev
if GITHUB_RATE_LIMIT_STATUS.is_active() {
debug!("Skipping GitHub fast path attempt for: {url} (rate-limited)");
return Ok(FastPathRev::Indeterminate);
}
let base_url = std::env::var(EnvVars::UV_GITHUB_FAST_PATH_URL)
.unwrap_or("https://api.github.com/repos".to_owned());
let url = format!("{base_url}/{owner}/{repo}/commits/{github_branch_name}");
let runtime = tokio::runtime::Builder::new_current_thread() let runtime = tokio::runtime::Builder::new_current_thread()
.enable_all() .enable_all()
@ -807,6 +817,11 @@ fn github_fast_path(
let response = request.send().await?; let response = request.send().await?;
if is_github_rate_limited(&response) {
// Mark that we are being rate-limited by GitHub
GITHUB_RATE_LIMIT_STATUS.activate();
}
// GitHub returns a 404 if the repository does not exist, and a 422 if it exists but GitHub // GitHub returns a 404 if the repository does not exist, and a 422 if it exists but GitHub
// is unable to resolve the requested revision. // is unable to resolve the requested revision.
response.error_for_status_ref()?; response.error_for_status_ref()?;

View file

@ -7,5 +7,6 @@ pub use crate::source::{Fetch, GitSource, Reporter};
mod credentials; mod credentials;
mod git; mod git;
mod rate_limit;
mod resolver; mod resolver;
mod source; mod source;

View file

@ -0,0 +1,37 @@
use reqwest::{Response, StatusCode};
use std::sync::atomic::{AtomicBool, Ordering};
/// A global state on whether we are being rate-limited by GitHub's REST API.
/// If we are, avoid "fast-path" attempts.
pub(crate) static GITHUB_RATE_LIMIT_STATUS: GitHubRateLimitStatus = GitHubRateLimitStatus::new();
/// GitHub REST API rate limit status tracker.
///
/// ## Assumptions
///
/// The rate limit timeout duration is much longer than the runtime of a `uv` command.
/// And so we do not need to invalidate this state based on `x-ratelimit-reset`.
#[derive(Debug)]
pub(crate) struct GitHubRateLimitStatus(AtomicBool);
impl GitHubRateLimitStatus {
const fn new() -> Self {
Self(AtomicBool::new(false))
}
pub(crate) fn activate(&self) {
self.0.store(true, Ordering::Relaxed);
}
pub(crate) fn is_active(&self) -> bool {
self.0.load(Ordering::Relaxed)
}
}
/// Determine if GitHub is applying rate-limiting based on the response
pub(crate) fn is_github_rate_limited(response: &Response) -> bool {
// HTTP 403 and 429 are possible status codes in the event of a primary or secondary rate limit.
// Source: https://docs.github.com/en/rest/using-the-rest-api/troubleshooting-the-rest-api?apiVersion=2022-11-28#rate-limit-errors
let status_code = response.status();
status_code == StatusCode::FORBIDDEN || status_code == StatusCode::TOO_MANY_REQUESTS
}

View file

@ -15,7 +15,10 @@ use uv_git_types::{GitHubRepository, GitOid, GitReference, GitUrl};
use uv_static::EnvVars; use uv_static::EnvVars;
use uv_version::version; use uv_version::version;
use crate::{Fetch, GitSource, Reporter}; use crate::{
Fetch, GitSource, Reporter,
rate_limit::{GITHUB_RATE_LIMIT_STATUS, is_github_rate_limited},
};
#[derive(Debug, thiserror::Error)] #[derive(Debug, thiserror::Error)]
pub enum GitResolverError { pub enum GitResolverError {
@ -46,6 +49,21 @@ impl GitResolver {
self.0.get(reference) self.0.get(reference)
} }
pub fn get_precise(&self, url: &GitUrl) -> Option<GitOid> {
// If the URL is already precise, return it.
if let Some(precise) = url.precise() {
return Some(precise);
}
// If we know the precise commit already, return it.
let reference = RepositoryReference::from(url);
if let Some(precise) = self.get(&reference) {
return Some(*precise);
}
None
}
/// Resolve a Git URL to a specific commit without performing any Git operations. /// Resolve a Git URL to a specific commit without performing any Git operations.
/// ///
/// Returns a [`GitOid`] if the URL has already been resolved (i.e., is available in the cache), /// Returns a [`GitOid`] if the URL has already been resolved (i.e., is available in the cache),
@ -59,31 +77,32 @@ impl GitResolver {
return Ok(None); return Ok(None);
} }
let reference = RepositoryReference::from(url); // If the URL is already precise or we know the precise commit, return it.
if let Some(precise) = self.get_precise(url) {
// If the URL is already precise, return it.
if let Some(precise) = url.precise() {
return Ok(Some(precise)); return Ok(Some(precise));
} }
// If we know the precise commit already, return it.
if let Some(precise) = self.get(&reference) {
return Ok(Some(*precise));
}
// If the URL is a GitHub URL, attempt to resolve it via the GitHub API. // If the URL is a GitHub URL, attempt to resolve it via the GitHub API.
let Some(GitHubRepository { owner, repo }) = GitHubRepository::parse(url.repository()) let Some(GitHubRepository { owner, repo }) = GitHubRepository::parse(url.repository())
else { else {
return Ok(None); return Ok(None);
}; };
// Check if we're rate-limited by GitHub, before determining the Git reference
if GITHUB_RATE_LIMIT_STATUS.is_active() {
debug!("Rate-limited by GitHub. Skipping GitHub fast path attempt for: {url}");
return Ok(None);
}
// Determine the Git reference. // Determine the Git reference.
let rev = url.reference().as_rev(); let rev = url.reference().as_rev();
let url = format!("https://api.github.com/repos/{owner}/{repo}/commits/{rev}"); let github_api_base_url = std::env::var(EnvVars::UV_GITHUB_FAST_PATH_URL)
.unwrap_or("https://api.github.com/repos".to_owned());
let github_api_url = format!("{github_api_base_url}/{owner}/{repo}/commits/{rev}");
debug!("Querying GitHub for commit at: {url}"); debug!("Querying GitHub for commit at: {github_api_url}");
let mut request = client.get(&url); let mut request = client.get(&github_api_url);
request = request.header("Accept", "application/vnd.github.3.sha"); request = request.header("Accept", "application/vnd.github.3.sha");
request = request.header( request = request.header(
"User-Agent", "User-Agent",
@ -91,13 +110,20 @@ impl GitResolver {
); );
let response = request.send().await?; let response = request.send().await?;
if !response.status().is_success() { let status = response.status();
if !status.is_success() {
// Returns a 404 if the repository does not exist, and a 422 if GitHub is unable to // Returns a 404 if the repository does not exist, and a 422 if GitHub is unable to
// resolve the requested rev. // resolve the requested rev.
debug!( debug!(
"GitHub API request failed for: {url} ({})", "GitHub API request failed for: {github_api_url} ({})",
response.status() response.status()
); );
if is_github_rate_limited(&response) {
// Mark that we are being rate-limited by GitHub
GITHUB_RATE_LIMIT_STATUS.activate();
}
return Ok(None); return Ok(None);
} }
@ -108,7 +134,7 @@ impl GitResolver {
// Insert the resolved URL into the in-memory cache. This ensures that subsequent fetches // Insert the resolved URL into the in-memory cache. This ensures that subsequent fetches
// resolve to the same precise commit. // resolve to the same precise commit.
self.insert(reference, precise); self.insert(RepositoryReference::from(url), precise);
Ok(Some(precise)) Ok(Some(precise))
} }

View file

@ -34,7 +34,7 @@ pub use {
VersionPatternParseError, VersionPatternParseError,
}, },
version_specifier::{ version_specifier::{
VersionSpecifier, VersionSpecifierBuildError, VersionSpecifiers, TildeVersionSpecifier, VersionSpecifier, VersionSpecifierBuildError, VersionSpecifiers,
VersionSpecifiersParseError, VersionSpecifiersParseError,
}, },
}; };

View file

@ -610,6 +610,24 @@ impl Version {
Self::new(self.release().iter().copied()) Self::new(self.release().iter().copied())
} }
/// Return the version with any segments apart from the release removed, with trailing zeroes
/// trimmed.
#[inline]
#[must_use]
pub fn only_release_trimmed(&self) -> Self {
if let Some(last_non_zero) = self.release().iter().rposition(|segment| *segment != 0) {
if last_non_zero == self.release().len() {
// Already trimmed.
self.clone()
} else {
Self::new(self.release().iter().take(last_non_zero + 1).copied())
}
} else {
// `0` is a valid version.
Self::new([0])
}
}
/// Return the version with trailing `.0` release segments removed. /// Return the version with trailing `.0` release segments removed.
/// ///
/// # Panics /// # Panics
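`only_release_trimmed` strips trailing zero release segments, keeping a single `0` for the all-zero version, so that `3.9` and `3.9.0` yield identical bounds. The rule itself, sketched on plain release segments rather than `uv_pep440::Version`:

```rust
// Standalone sketch of the trailing-zero trimming rule, on plain segments
// rather than uv_pep440::Version.
fn trim_release(release: &[u64]) -> Vec<u64> {
    match release.iter().rposition(|segment| *segment != 0) {
        Some(last_non_zero) => release[..=last_non_zero].to_vec(),
        // `0` is a valid version, so keep a single zero segment.
        None => vec![0],
    }
}

fn main() {
    assert_eq!(trim_release(&[3, 9, 0]), vec![3, 9]);
    assert_eq!(trim_release(&[3, 0, 1]), vec![3, 0, 1]);
    assert_eq!(trim_release(&[0, 0, 0]), vec![0]);
}
```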

View file

@ -132,7 +132,7 @@ impl From<VersionSpecifier> for Ranges<Version> {
pub fn release_specifiers_to_ranges(specifiers: VersionSpecifiers) -> Ranges<Version> { pub fn release_specifiers_to_ranges(specifiers: VersionSpecifiers) -> Ranges<Version> {
let mut range = Ranges::full(); let mut range = Ranges::full();
for specifier in specifiers { for specifier in specifiers {
range = range.intersection(&release_specifier_to_range(specifier)); range = range.intersection(&release_specifier_to_range(specifier, false));
} }
range range
} }
@ -148,67 +148,57 @@ pub fn release_specifiers_to_ranges(specifiers: VersionSpecifiers) -> Ranges<Ver
/// is allowed for projects that declare `requires-python = ">3.13"`. /// is allowed for projects that declare `requires-python = ">3.13"`.
/// ///
/// See: <https://github.com/pypa/pip/blob/a432c7f4170b9ef798a15f035f5dfdb4cc939f35/src/pip/_internal/resolution/resolvelib/candidates.py#L540> /// See: <https://github.com/pypa/pip/blob/a432c7f4170b9ef798a15f035f5dfdb4cc939f35/src/pip/_internal/resolution/resolvelib/candidates.py#L540>
pub fn release_specifier_to_range(specifier: VersionSpecifier) -> Ranges<Version> { pub fn release_specifier_to_range(specifier: VersionSpecifier, trim: bool) -> Ranges<Version> {
let VersionSpecifier { operator, version } = specifier; let VersionSpecifier { operator, version } = specifier;
// Note(konsti): We switched strategies to trimmed for the markers, but we don't want to cause
// churn in lockfile requires-python, so we only trim for markers.
let version_trimmed = if trim {
version.only_release_trimmed()
} else {
version.only_release()
};
match operator { match operator {
Operator::Equal => { // Trailing zeroes are not semantically relevant.
let version = version.only_release(); Operator::Equal => Ranges::singleton(version_trimmed),
Ranges::singleton(version) Operator::ExactEqual => Ranges::singleton(version_trimmed),
} Operator::NotEqual => Ranges::singleton(version_trimmed).complement(),
Operator::ExactEqual => { Operator::LessThan => Ranges::strictly_lower_than(version_trimmed),
let version = version.only_release(); Operator::LessThanEqual => Ranges::lower_than(version_trimmed),
Ranges::singleton(version) Operator::GreaterThan => Ranges::strictly_higher_than(version_trimmed),
} Operator::GreaterThanEqual => Ranges::higher_than(version_trimmed),
Operator::NotEqual => {
let version = version.only_release(); // Trailing zeroes are semantically relevant.
Ranges::singleton(version).complement()
}
Operator::TildeEqual => { Operator::TildeEqual => {
let release = version.release(); let release = version.release();
let [rest @ .., last, _] = &*release else { let [rest @ .., last, _] = &*release else {
unreachable!("~= must have at least two segments"); unreachable!("~= must have at least two segments");
}; };
let upper = Version::new(rest.iter().chain([&(last + 1)])); let upper = Version::new(rest.iter().chain([&(last + 1)]));
let version = version.only_release(); Ranges::from_range_bounds(version_trimmed..upper)
Ranges::from_range_bounds(version..upper)
}
Operator::LessThan => {
let version = version.only_release();
Ranges::strictly_lower_than(version)
}
Operator::LessThanEqual => {
let version = version.only_release();
Ranges::lower_than(version)
}
Operator::GreaterThan => {
let version = version.only_release();
Ranges::strictly_higher_than(version)
}
Operator::GreaterThanEqual => {
let version = version.only_release();
Ranges::higher_than(version)
} }
Operator::EqualStar => { Operator::EqualStar => {
let low = version.only_release(); // For (not-)equal-star, trailing zeroes are still before the star.
let low_full = version.only_release();
let high = { let high = {
let mut high = low.clone(); let mut high = low_full.clone();
let mut release = high.release().to_vec(); let mut release = high.release().to_vec();
*release.last_mut().unwrap() += 1; *release.last_mut().unwrap() += 1;
high = high.with_release(release); high = high.with_release(release);
high high
}; };
Ranges::from_range_bounds(low..high) Ranges::from_range_bounds(version..high)
} }
Operator::NotEqualStar => { Operator::NotEqualStar => {
let low = version.only_release(); // For (not-)equal-star, trailing zeroes are still before the star.
let low_full = version.only_release();
let high = { let high = {
let mut high = low.clone(); let mut high = low_full.clone();
let mut release = high.release().to_vec(); let mut release = high.release().to_vec();
*release.last_mut().unwrap() += 1; *release.last_mut().unwrap() += 1;
high = high.with_release(release); high = high.with_release(release);
high high
}; };
Ranges::from_range_bounds(low..high).complement() Ranges::from_range_bounds(version..high).complement()
} }
} }
} }
@ -223,8 +213,8 @@ impl LowerBound {
/// These bounds use release-only semantics when comparing versions. /// These bounds use release-only semantics when comparing versions.
pub fn new(bound: Bound<Version>) -> Self { pub fn new(bound: Bound<Version>) -> Self {
Self(match bound { Self(match bound {
Bound::Included(version) => Bound::Included(version.only_release()), Bound::Included(version) => Bound::Included(version.only_release_trimmed()),
Bound::Excluded(version) => Bound::Excluded(version.only_release()), Bound::Excluded(version) => Bound::Excluded(version.only_release_trimmed()),
Bound::Unbounded => Bound::Unbounded, Bound::Unbounded => Bound::Unbounded,
}) })
} }
@ -358,8 +348,8 @@ impl UpperBound {
/// These bounds use release-only semantics when comparing versions. /// These bounds use release-only semantics when comparing versions.
pub fn new(bound: Bound<Version>) -> Self { pub fn new(bound: Bound<Version>) -> Self {
Self(match bound { Self(match bound {
Bound::Included(version) => Bound::Included(version.only_release()), Bound::Included(version) => Bound::Included(version.only_release_trimmed()),
Bound::Excluded(version) => Bound::Excluded(version.only_release()), Bound::Excluded(version) => Bound::Excluded(version.only_release_trimmed()),
Bound::Unbounded => Bound::Unbounded, Bound::Unbounded => Bound::Unbounded,
}) })
} }

View file

@ -80,25 +80,39 @@ impl VersionSpecifiers {
// Add specifiers for the holes between the bounds. // Add specifiers for the holes between the bounds.
for (lower, upper) in bounds { for (lower, upper) in bounds {
match (next, lower) { let specifier = match (next, lower) {
// Ex) [3.7, 3.8.5), (3.8.5, 3.9] -> >=3.7,!=3.8.5,<=3.9 // Ex) [3.7, 3.8.5), (3.8.5, 3.9] -> >=3.7,!=3.8.5,<=3.9
(Bound::Excluded(prev), Bound::Excluded(lower)) if prev == lower => { (Bound::Excluded(prev), Bound::Excluded(lower)) if prev == lower => {
specifiers.push(VersionSpecifier::not_equals_version(prev.clone())); Some(VersionSpecifier::not_equals_version(prev.clone()))
} }
// Ex) [3.7, 3.8), (3.8, 3.9] -> >=3.7,!=3.8.*,<=3.9 // Ex) [3.7, 3.8), (3.8, 3.9] -> >=3.7,!=3.8.*,<=3.9
(Bound::Excluded(prev), Bound::Included(lower)) (Bound::Excluded(prev), Bound::Included(lower)) => {
if prev.release().len() == 2 match *prev.only_release_trimmed().release() {
&& *lower.release() == [prev.release()[0], prev.release()[1] + 1] => [major] if *lower.only_release_trimmed().release() == [major, 1] => {
{ Some(VersionSpecifier::not_equals_star_version(Version::new([
specifiers.push(VersionSpecifier::not_equals_star_version(prev.clone())); major, 0,
])))
} }
_ => { [major, minor]
if *lower.only_release_trimmed().release() == [major, minor + 1] =>
{
Some(VersionSpecifier::not_equals_star_version(Version::new([
major, minor,
])))
}
_ => None,
}
}
_ => None,
};
if let Some(specifier) = specifier {
specifiers.push(specifier);
} else {
#[cfg(feature = "tracing")] #[cfg(feature = "tracing")]
warn!( warn!(
"Ignoring unsupported gap in `requires-python` version: {next:?} -> {lower:?}" "Ignoring unsupported gap in `requires-python` version: {next:?} -> {lower:?}"
); );
} }
}
next = upper; next = upper;
} }
let end = next; let end = next;
@ -348,6 +362,33 @@ impl VersionSpecifier {
Ok(Self { operator, version }) Ok(Self { operator, version })
} }
/// Remove all non-release parts of the version.
///
/// The marker decision diagram relies on the assumption that the negation of a marker tree is
/// the complement of the marker space. However, pre-release versions violate this assumption.
///
/// For example, the marker `python_full_version > '3.9' or python_full_version <= '3.9'`
/// does not match `python_full_version == 3.9.0a0` and so cannot simplify to `true`. However,
/// its negation, `python_full_version > '3.9' and python_full_version <= '3.9'`, also does not
/// match `3.9.0a0` and simplifies to `false`, which violates the algebra decision diagrams
/// rely on. For this reason we ignore pre-release versions entirely when evaluating markers.
///
/// Note that `python_version` cannot take on pre-release values as it is truncated to just the
/// major and minor version segments. Thus using release-only specifiers is definitely necessary
/// for `python_version` to fully simplify any ranges, such as
/// `python_version > '3.9' or python_version <= '3.9'`, which is always `true` for
/// `python_version`. For `python_full_version` however, this decision is a semantic change.
///
/// For Python versions, the major.minor is considered the API version, so unlike the rules
/// for package versions in PEP 440, we consider Python `3.9.0a0` acceptable for `>= "3.9"`.
#[must_use]
pub fn only_release(self) -> Self {
Self {
operator: self.operator,
version: self.version.only_release(),
}
}
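The comment above is easiest to see with a concrete version. A minimal sketch, assuming `uv_pep440::Version` parses PEP 440 strings via `FromStr` and that `only_release` strips the pre-release segment as described:

```rust
use std::str::FromStr;
use uv_pep440::Version;

fn main() {
    let pre = Version::from_str("3.9.0a0").expect("a valid PEP 440 version");

    // Pre-releases sort strictly below the bare release: 3.9.0a0 < 3.9.
    assert!(pre < Version::new([3, 9]));

    // Stripping the pre-release leaves the bare release 3.9.0, which compares
    // equal to 3.9, so a specifier and its negation partition the space again.
    assert_eq!(pre.clone().only_release(), Version::new([3, 9, 0]));
    assert!(pre.clone().only_release() >= Version::new([3, 9]));
}
```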
/// `==<version>` /// `==<version>`
pub fn equals_version(version: Version) -> Self { pub fn equals_version(version: Version) -> Self {
Self { Self {
@ -416,7 +457,7 @@ impl VersionSpecifier {
&self.operator &self.operator
} }
/// Get the version, e.g. `<=` in `<= 2.0.0` /// Get the version, e.g. `2.0.0` in `<= 2.0.0`
pub fn version(&self) -> &Version { pub fn version(&self) -> &Version {
&self.version &self.version
} }
@ -442,14 +483,23 @@ impl VersionSpecifier {
(Some(VersionSpecifier::equals_version(v1.clone())), None) (Some(VersionSpecifier::equals_version(v1.clone())), None)
} }
// `v >= 3.7 && v < 3.8` is equivalent to `v == 3.7.*` // `v >= 3.7 && v < 3.8` is equivalent to `v == 3.7.*`
(Bound::Included(v1), Bound::Excluded(v2)) (Bound::Included(v1), Bound::Excluded(v2)) => {
if v1.release().len() == 2 match *v1.only_release_trimmed().release() {
&& *v2.release() == [v1.release()[0], v1.release()[1] + 1] => [major] if *v2.only_release_trimmed().release() == [major, 1] => {
let version = Version::new([major, 0]);
(Some(VersionSpecifier::equals_star_version(version)), None)
}
[major, minor]
if *v2.only_release_trimmed().release() == [major, minor + 1] =>
{ {
( let version = Version::new([major, minor]);
Some(VersionSpecifier::equals_star_version(v1.clone())), (Some(VersionSpecifier::equals_star_version(version)), None)
None, }
) _ => (
VersionSpecifier::from_lower_bound(&Bound::Included(v1.clone())),
VersionSpecifier::from_upper_bound(&Bound::Excluded(v2.clone())),
),
}
} }
(lower, upper) => ( (lower, upper) => (
VersionSpecifier::from_lower_bound(lower), VersionSpecifier::from_lower_bound(lower),
@ -838,6 +888,90 @@ pub(crate) fn parse_version_specifiers(
Ok(version_ranges) Ok(version_ranges)
} }
/// A simple `~=` version specifier with a major, minor and (optional) patch version, e.g., `~=3.13`
/// or `~=3.13.0`.
#[derive(Clone, Debug)]
pub struct TildeVersionSpecifier<'a> {
inner: Cow<'a, VersionSpecifier>,
}
impl<'a> TildeVersionSpecifier<'a> {
/// Create a new [`TildeVersionSpecifier`] from a [`VersionSpecifier`] value.
///
/// If the operator is not [`Operator::TildeEqual`], or the version does not have exactly two or
/// three release segments (major, minor, and an optional patch), this will return [`None`].
pub fn from_specifier(specifier: VersionSpecifier) -> Option<TildeVersionSpecifier<'a>> {
TildeVersionSpecifier::new(Cow::Owned(specifier))
}
/// Create a new [`TildeVersionSpecifier`] from a [`VersionSpecifier`] reference.
///
/// See [`TildeVersionSpecifier::from_specifier`].
pub fn from_specifier_ref(
specifier: &'a VersionSpecifier,
) -> Option<TildeVersionSpecifier<'a>> {
TildeVersionSpecifier::new(Cow::Borrowed(specifier))
}
fn new(specifier: Cow<'a, VersionSpecifier>) -> Option<Self> {
if specifier.operator != Operator::TildeEqual {
return None;
}
if specifier.version().release().len() < 2 || specifier.version().release().len() > 3 {
return None;
}
if specifier.version().any_prerelease()
|| specifier.version().is_local()
|| specifier.version().is_post()
{
return None;
}
Some(Self { inner: specifier })
}
/// Whether a patch version is present in this tilde version specifier.
pub fn has_patch(&self) -> bool {
self.inner.version.release().len() == 3
}
/// Construct the lower and upper bounding version specifiers for this tilde version specifier,
/// e.g., for `~=3.13` this would return `>=3.13` and `<4` and for `~=3.13.0` it would
/// return `>=3.13.0` and `<3.14`.
pub fn bounding_specifiers(&self) -> (VersionSpecifier, VersionSpecifier) {
let release = self.inner.version().release();
let lower = self.inner.version.clone();
let upper = if self.has_patch() {
Version::new([release[0], release[1] + 1])
} else {
Version::new([release[0] + 1])
};
(
VersionSpecifier::greater_than_equal_version(lower),
VersionSpecifier::less_than_version(upper),
)
}
/// Construct a new tilde `VersionSpecifier` with the given patch version appended.
pub fn with_patch_version(&self, patch: u64) -> TildeVersionSpecifier {
let mut release = self.inner.version.release().to_vec();
if self.has_patch() {
release.pop();
}
release.push(patch);
TildeVersionSpecifier::from_specifier(
VersionSpecifier::from_version(Operator::TildeEqual, Version::new(release))
.expect("We should always derive a valid new version specifier"),
)
.expect("We should always derive a new tilde version specifier")
}
}
impl std::fmt::Display for TildeVersionSpecifier<'_> {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", self.inner)
}
}
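A short usage sketch of the new helper, assuming `TildeVersionSpecifier` and `Operator` are re-exported from `uv_pep440` alongside the other specifier types; the printed formatting is only illustrative:

```rust
use uv_pep440::{Operator, TildeVersionSpecifier, Version, VersionSpecifier};

fn main() {
    // `~=3.13`: no patch segment present.
    let spec = VersionSpecifier::from_version(Operator::TildeEqual, Version::new([3, 13]))
        .expect("a valid `~=` specifier");
    let tilde = TildeVersionSpecifier::from_specifier(spec)
        .expect("`~=` with major.minor is accepted");
    assert!(!tilde.has_patch());

    // `~=3.13` widens to `>=3.13, <4`; `~=3.13.0` would instead widen to `>=3.13.0, <3.14`.
    let (lower, upper) = tilde.bounding_specifiers();
    println!("{lower}, {upper}");

    // Appending a patch produces a new tilde specifier, e.g. `~=3.13.2`.
    println!("{}", tilde.with_patch_version(2));
}
```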
#[cfg(test)] #[cfg(test)]
mod tests { mod tests {
use std::{cmp::Ordering, str::FromStr}; use std::{cmp::Ordering, str::FromStr};

View file

@ -41,7 +41,7 @@ version-ranges = { workspace = true }
[dev-dependencies] [dev-dependencies]
insta = { version = "1.40.0" } insta = { version = "1.40.0" }
serde_json = { version = "1.0.128" } serde_json = { workspace = true }
tracing-test = { version = "0.2.5" } tracing-test = { version = "0.2.5" }
[features] [features]

View file

@ -16,6 +16,8 @@
#![warn(missing_docs)] #![warn(missing_docs)]
#[cfg(feature = "schemars")]
use std::borrow::Cow;
use std::error::Error; use std::error::Error;
use std::fmt::{Debug, Display, Formatter}; use std::fmt::{Debug, Display, Formatter};
use std::path::Path; use std::path::Path;
@ -334,22 +336,15 @@ impl Reporter for TracingReporter {
#[cfg(feature = "schemars")] #[cfg(feature = "schemars")]
impl<T: Pep508Url> schemars::JsonSchema for Requirement<T> { impl<T: Pep508Url> schemars::JsonSchema for Requirement<T> {
fn schema_name() -> String { fn schema_name() -> Cow<'static, str> {
"Requirement".to_string() Cow::Borrowed("Requirement")
} }
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema { fn json_schema(_gen: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::schema::SchemaObject { schemars::json_schema!({
instance_type: Some(schemars::schema::InstanceType::String.into()), "type": "string",
metadata: Some(Box::new(schemars::schema::Metadata { "description": "A PEP 508 dependency specifier, e.g., `ruff >= 0.6.0`"
description: Some( })
"A PEP 508 dependency specifier, e.g., `ruff >= 0.6.0`".to_string(),
),
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
} }
} }

View file

@ -172,7 +172,7 @@ impl InternerGuard<'_> {
), ),
// Normalize `python_version` markers to `python_full_version` nodes. // Normalize `python_version` markers to `python_full_version` nodes.
MarkerValueVersion::PythonVersion => { MarkerValueVersion::PythonVersion => {
match python_version_to_full_version(normalize_specifier(specifier)) { match python_version_to_full_version(specifier.only_release()) {
Ok(specifier) => ( Ok(specifier) => (
Variable::Version(CanonicalMarkerValueVersion::PythonFullVersion), Variable::Version(CanonicalMarkerValueVersion::PythonFullVersion),
Edges::from_specifier(specifier), Edges::from_specifier(specifier),
@ -1214,7 +1214,7 @@ impl Edges {
/// Returns the [`Edges`] for a version specifier. /// Returns the [`Edges`] for a version specifier.
fn from_specifier(specifier: VersionSpecifier) -> Edges { fn from_specifier(specifier: VersionSpecifier) -> Edges {
let specifier = release_specifier_to_range(normalize_specifier(specifier)); let specifier = release_specifier_to_range(specifier.only_release(), true);
Edges::Version { Edges::Version {
edges: Edges::from_range(&specifier), edges: Edges::from_range(&specifier),
} }
@ -1227,9 +1227,9 @@ impl Edges {
let mut range: Ranges<Version> = versions let mut range: Ranges<Version> = versions
.into_iter() .into_iter()
.map(|version| { .map(|version| {
let specifier = VersionSpecifier::equals_version(version.clone()); let specifier = VersionSpecifier::equals_version(version.only_release());
let specifier = python_version_to_full_version(specifier)?; let specifier = python_version_to_full_version(specifier)?;
Ok(release_specifier_to_range(normalize_specifier(specifier))) Ok(release_specifier_to_range(specifier, true))
}) })
.flatten_ok() .flatten_ok()
.collect::<Result<Ranges<_>, NodeId>>()?; .collect::<Result<Ranges<_>, NodeId>>()?;
@ -1526,57 +1526,62 @@ impl Edges {
} }
} }
// Normalize a [`VersionSpecifier`] before adding it to the tree.
fn normalize_specifier(specifier: VersionSpecifier) -> VersionSpecifier {
let (operator, version) = specifier.into_parts();
// The decision diagram relies on the assumption that the negation of a marker tree is
// the complement of the marker space. However, pre-release versions violate this assumption.
//
// For example, the marker `python_full_version > '3.9' or python_full_version <= '3.9'`
// does not match `python_full_version == 3.9.0a0` and so cannot simplify to `true`. However,
// its negation, `python_full_version > '3.9' and python_full_version <= '3.9'`, also does not
// match `3.9.0a0` and simplifies to `false`, which violates the algebra decision diagrams
// rely on. For this reason we ignore pre-release versions entirely when evaluating markers.
//
// Note that `python_version` cannot take on pre-release values as it is truncated to just the
// major and minor version segments. Thus using release-only specifiers is definitely necessary
// for `python_version` to fully simplify any ranges, such as `python_version > '3.9' or python_version <= '3.9'`,
// which is always `true` for `python_version`. For `python_full_version` however, this decision
// is a semantic change.
let mut release = &*version.release();
// Strip any trailing `0`s.
//
// The [`Version`] type ignores trailing `0`s for equality, but still preserves them in its
// [`Display`] output. We must normalize all versions by stripping trailing `0`s to remove the
// distinction between versions like `3.9` and `3.9.0`. Otherwise, their output would depend on
// which form was added to the global marker interner first.
//
// Note that we cannot strip trailing `0`s for star equality, as `==3.0.*` is different from `==3.*`.
if !operator.is_star() {
if let Some(end) = release.iter().rposition(|segment| *segment != 0) {
if end > 0 {
release = &release[..=end];
}
}
}
VersionSpecifier::from_version(operator, Version::new(release)).unwrap()
}
/// Returns the equivalent `python_full_version` specifier for a `python_version` specifier. /// Returns the equivalent `python_full_version` specifier for a `python_version` specifier.
/// ///
/// Returns `Err` with a constant node if the equivalent comparison is always `true` or `false`. /// Returns `Err` with a constant node if the equivalent comparison is always `true` or `false`.
fn python_version_to_full_version(specifier: VersionSpecifier) -> Result<VersionSpecifier, NodeId> { fn python_version_to_full_version(specifier: VersionSpecifier) -> Result<VersionSpecifier, NodeId> {
// Trailing zeroes matter only for (not-)equals-star and tilde-equals. This means that below
// the next two blocks, we can use the trimmed release as the release.
if specifier.operator().is_star() {
// Input python_version python_full_version
// ==3.* 3.* 3.*
// ==3.0.* 3.0 3.0.*
// ==3.0.0.* 3.0 3.0.*
// ==3.9.* 3.9 3.9.*
// ==3.9.0.* 3.9 3.9.*
// ==3.9.0.0.* 3.9 3.9.*
// ==3.9.1.* FALSE FALSE
// ==3.9.1.0.* FALSE FALSE
// ==3.9.1.0.0.* FALSE FALSE
return match &*specifier.version().release() {
// `3.*`
[_major] => Ok(specifier),
// Ex) `3.9.*`, `3.9.0.*`, or `3.9.0.0.*`
[major, minor, rest @ ..] if rest.iter().all(|x| *x == 0) => {
let python_version = Version::new([major, minor]);
// Unwrap safety: A star operator with two version segments is always valid.
Ok(VersionSpecifier::from_version(*specifier.operator(), python_version).unwrap())
}
// Ex) `3.9.1.*` or `3.9.0.1.*`
_ => Err(NodeId::FALSE),
};
}
if *specifier.operator() == Operator::TildeEqual {
// python_version python_full_version
// ~=3 (not possible)
// ~= 3.0 >= 3.0, < 4.0
// ~= 3.9 >= 3.9, < 4.0
// ~= 3.9.0 == 3.9.*
// ~= 3.9.1 FALSE
// ~= 3.9.0.0 == 3.9.*
// ~= 3.9.0.1 FALSE
return match &*specifier.version().release() {
// Ex) `3.0`, `3.7`
[_major, _minor] => Ok(specifier),
// Ex) `3.9`, `3.9.0`, or `3.9.0.0`
[major, minor, rest @ ..] if rest.iter().all(|x| *x == 0) => {
let python_version = Version::new([major, minor]);
Ok(VersionSpecifier::equals_star_version(python_version))
}
// Ex) `3.9.1` or `3.9.0.1`
_ => Err(NodeId::FALSE),
};
}
// Extract the major and minor version segments if the specifier contains exactly // Extract the major and minor version segments if the specifier contains exactly
// those segments, or if it contains a major segment with an implied minor segment of `0`. // those segments, or if it contains a major segment with an implied minor segment of `0`.
let major_minor = match *specifier.version().release() { let major_minor = match *specifier.version().only_release_trimmed().release() {
// For star operators, we cannot add a trailing `0`.
//
// `python_version == 3.*` is equivalent to `python_full_version == 3.*`. Adding a
// trailing `0` would result in `python_version == 3.0.*`, which is incorrect.
[_major] if specifier.operator().is_star() => return Ok(specifier),
// Add a trailing `0` for the minor version, which is implied. // Add a trailing `0` for the minor version, which is implied.
// For example, `python_version == 3` matches `3.0.1`, `3.0.2`, etc. // For example, `python_version == 3` matches `3.0.1`, `3.0.2`, etc.
[major] => Some((major, 0)), [major] => Some((major, 0)),
@ -1614,9 +1619,10 @@ fn python_version_to_full_version(specifier: VersionSpecifier) -> Result<Version
VersionSpecifier::less_than_version(Version::new([major, minor + 1])) VersionSpecifier::less_than_version(Version::new([major, minor + 1]))
} }
// `==3.7.*`, `!=3.7.*`, `~=3.7` already represent the equivalent `python_full_version` Operator::EqualStar | Operator::NotEqualStar | Operator::TildeEqual => {
// comparison. // Handled above.
Operator::EqualStar | Operator::NotEqualStar | Operator::TildeEqual => specifier, unreachable!()
}
}) })
} else { } else {
let [major, minor, ..] = *specifier.version().release() else { let [major, minor, ..] = *specifier.version().release() else {
@ -1624,13 +1630,14 @@ fn python_version_to_full_version(specifier: VersionSpecifier) -> Result<Version
}; };
Ok(match specifier.operator() { Ok(match specifier.operator() {
// `python_version` cannot have more than two release segments, so equality is impossible. // `python_version` cannot have more than two release segments, and we know
Operator::Equal | Operator::ExactEqual | Operator::EqualStar | Operator::TildeEqual => { // that the following release segments aren't purely zeroes so equality is impossible.
Operator::Equal | Operator::ExactEqual => {
return Err(NodeId::FALSE); return Err(NodeId::FALSE);
} }
// Similarly, inequalities are always `true`. // Similarly, inequalities are always `true`.
Operator::NotEqual | Operator::NotEqualStar => return Err(NodeId::TRUE), Operator::NotEqual => return Err(NodeId::TRUE),
// `python_version {<,<=} 3.7.8` is equivalent to `python_full_version < 3.8`. // `python_version {<,<=} 3.7.8` is equivalent to `python_full_version < 3.8`.
Operator::LessThan | Operator::LessThanEqual => { Operator::LessThan | Operator::LessThanEqual => {
@ -1641,6 +1648,11 @@ fn python_version_to_full_version(specifier: VersionSpecifier) -> Result<Version
Operator::GreaterThan | Operator::GreaterThanEqual => { Operator::GreaterThan | Operator::GreaterThanEqual => {
VersionSpecifier::greater_than_equal_version(Version::new([major, minor + 1])) VersionSpecifier::greater_than_equal_version(Version::new([major, minor + 1]))
} }
Operator::EqualStar | Operator::NotEqualStar | Operator::TildeEqual => {
// Handled above.
unreachable!()
}
}) })
} }
} }

View file

@ -64,8 +64,8 @@ fn collect_dnf(
continue; continue;
} }
// Detect whether the range for this edge can be simplified as a star inequality. // Detect whether the range for this edge can be simplified as a star specifier.
if let Some(specifier) = star_range_inequality(&range) { if let Some(specifier) = star_range_specifier(&range) {
path.push(MarkerExpression::Version { path.push(MarkerExpression::Version {
key: marker.key().into(), key: marker.key().into(),
specifier, specifier,
@ -343,23 +343,35 @@ where
Some(excluded) Some(excluded)
} }
/// Returns `Some` if the version expression can be simplified as a star inequality with the given /// Returns `Some` if the version range can be simplified as a star specifier.
/// specifier.
/// ///
/// For example, `python_full_version < '3.8' or python_full_version >= '3.9'` can be simplified to /// Only for the two bounds case not covered by [`VersionSpecifier::from_release_only_bounds`].
/// `python_full_version != '3.8.*'`. ///
fn star_range_inequality(range: &Ranges<Version>) -> Option<VersionSpecifier> { /// For negative ranges like `python_full_version < '3.8' or python_full_version >= '3.9'`,
/// returns `!= '3.8.*'`.
fn star_range_specifier(range: &Ranges<Version>) -> Option<VersionSpecifier> {
if range.iter().count() != 2 {
return None;
}
// Check for negative star range: two segments [(Unbounded, Excluded(v1)), (Included(v2), Unbounded)]
let (b1, b2) = range.iter().collect_tuple()?; let (b1, b2) = range.iter().collect_tuple()?;
if let ((Bound::Unbounded, Bound::Excluded(v1)), (Bound::Included(v2), Bound::Unbounded)) =
match (b1, b2) { (b1, b2)
((Bound::Unbounded, Bound::Excluded(v1)), (Bound::Included(v2), Bound::Unbounded))
if v1.release().len() == 2
&& *v2.release() == [v1.release()[0], v1.release()[1] + 1] =>
{ {
match *v1.only_release_trimmed().release() {
[major] if *v2.release() == [major, 1] => {
Some(VersionSpecifier::not_equals_star_version(Version::new([
major, 0,
])))
}
[major, minor] if *v2.release() == [major, minor + 1] => {
Some(VersionSpecifier::not_equals_star_version(v1.clone())) Some(VersionSpecifier::not_equals_star_version(v1.clone()))
} }
_ => None, _ => None,
} }
} else {
None
}
} }
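The user-visible effect of `star_range_specifier` is simpler than its bound matching suggests: a hole spanning exactly one minor series collapses back into a not-equals-star marker. A hedged sketch, assuming `MarkerTree` exposes the `FromStr` and `try_to_string` APIs used by the tests later in this diff:

```rust
use std::str::FromStr;
use uv_pep508::MarkerTree;

fn main() {
    let tree = MarkerTree::from_str(
        "python_full_version < '3.8' or python_full_version >= '3.9'",
    )
    .expect("a valid marker expression");

    // Per the doc comment above, this range simplifies to the star inequality,
    // rendered along the lines of `python_full_version != '3.8.*'`.
    println!("{}", tree.try_to_string().expect("a non-trivial marker"));
}
```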
/// Returns `true` if the LHS is the negation of the RHS, or vice versa. /// Returns `true` if the LHS is the negation of the RHS, or vice versa.

View file

@ -1707,23 +1707,15 @@ impl Display for MarkerTreeContents {
#[cfg(feature = "schemars")] #[cfg(feature = "schemars")]
impl schemars::JsonSchema for MarkerTree { impl schemars::JsonSchema for MarkerTree {
fn schema_name() -> String { fn schema_name() -> Cow<'static, str> {
"MarkerTree".to_string() Cow::Borrowed("MarkerTree")
} }
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema { fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::schema::SchemaObject { schemars::json_schema!({
instance_type: Some(schemars::schema::InstanceType::String.into()), "type": "string",
metadata: Some(Box::new(schemars::schema::Metadata { "description": "A PEP 508-compliant marker expression, e.g., `sys_platform == 'Darwin'`"
description: Some( })
"A PEP 508-compliant marker expression, e.g., `sys_platform == 'Darwin'`"
.to_string(),
),
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
} }
} }
@ -2279,13 +2271,13 @@ mod test {
#[test] #[test]
fn test_marker_simplification() { fn test_marker_simplification() {
assert_false("python_version == '3.9.1'"); assert_false("python_version == '3.9.1'");
assert_false("python_version == '3.9.0.*'");
assert_true("python_version != '3.9.1'"); assert_true("python_version != '3.9.1'");
// Technically these are valid substring comparisons, but we do not allow them. // This is an edge case that happens to be supported, but is not critical to support.
// e.g., using a version with patch components with `python_version` is considered assert_simplifies(
// impossible to satisfy since the value it is truncated at the minor version "python_version in '3.9.0'",
assert_false("python_version in '3.9.0'"); "python_full_version == '3.9.*'",
);
// e.g., using a version that is not PEP 440 compliant is considered arbitrary // e.g., using a version that is not PEP 440 compliant is considered arbitrary
assert_true("python_version in 'foo'"); assert_true("python_version in 'foo'");
// e.g., including `*` versions, which would require tracking a version specifier // e.g., including `*` versions, which would require tracking a version specifier
@ -2295,16 +2287,25 @@ mod test {
assert_true("python_version in '3.9,3.10'"); assert_true("python_version in '3.9,3.10'");
assert_true("python_version in '3.9 or 3.10'"); assert_true("python_version in '3.9 or 3.10'");
// e.g, when one of the values cannot be true // This is an edge case that happens to be supported, but is not critical to support.
// TODO(zanieb): This seems like a quirk of the `python_full_version` normalization, this assert_simplifies(
// should just act as though the patch version isn't present "python_version in '3.9 3.10.0 3.11'",
assert_false("python_version in '3.9 3.10.0 3.11'"); "python_full_version >= '3.9' and python_full_version < '3.12'",
);
assert_simplifies("python_version == '3.9'", "python_full_version == '3.9.*'"); assert_simplifies("python_version == '3.9'", "python_full_version == '3.9.*'");
assert_simplifies( assert_simplifies(
"python_version == '3.9.0'", "python_version == '3.9.0'",
"python_full_version == '3.9.*'", "python_full_version == '3.9.*'",
); );
assert_simplifies(
"python_version == '3.9.0.*'",
"python_full_version == '3.9.*'",
);
assert_simplifies(
"python_version == '3.*'",
"python_full_version >= '3' and python_full_version < '4'",
);
// `<version> in` // `<version> in`
// e.g., when the range is not contiguous // e.g., when the range is not contiguous
@ -2515,7 +2516,7 @@ mod test {
#[test] #[test]
fn test_simplification_extra_versus_other() { fn test_simplification_extra_versus_other() {
// Here, the `extra != 'foo'` cannot be simplified out, because // Here, the `extra != 'foo'` cannot be simplified out, because
// `extra == 'foo'` can be true even when `extra == 'bar`' is true. // `extra == 'foo'` can be true even when `extra == 'bar'` is true.
assert_simplifies( assert_simplifies(
r#"extra != "foo" and (extra == "bar" or extra == "baz")"#, r#"extra != "foo" and (extra == "bar" or extra == "baz")"#,
"(extra == 'bar' and extra != 'foo') or (extra == 'baz' and extra != 'foo')", "(extra == 'bar' and extra != 'foo') or (extra == 'baz' and extra != 'foo')",
@ -2536,6 +2537,68 @@ mod test {
); );
} }
#[test]
fn test_python_version_equal_star() {
// Input, equivalent with python_version, equivalent with python_full_version
let cases = [
("3.*", "3.*", "3.*"),
("3.0.*", "3.0", "3.0.*"),
("3.0.0.*", "3.0", "3.0.*"),
("3.9.*", "3.9", "3.9.*"),
("3.9.0.*", "3.9", "3.9.*"),
("3.9.0.0.*", "3.9", "3.9.*"),
];
for (input, equal_python_version, equal_python_full_version) in cases {
assert_eq!(
m(&format!("python_version == '{input}'")),
m(&format!("python_version == '{equal_python_version}'")),
"{input} {equal_python_version}"
);
assert_eq!(
m(&format!("python_version == '{input}'")),
m(&format!(
"python_full_version == '{equal_python_full_version}'"
)),
"{input} {equal_python_full_version}"
);
}
let cases_false = ["3.9.1.*", "3.9.1.0.*", "3.9.1.0.0.*"];
for input in cases_false {
assert!(
m(&format!("python_version == '{input}'")).is_false(),
"{input}"
);
}
}
#[test]
fn test_tilde_equal_normalization() {
assert_eq!(
m("python_version ~= '3.10.0'"),
m("python_version >= '3.10.0' and python_version < '3.11.0'")
);
// `python_version` only has two release segments, with the patch implicitly padded to zero,
// so a tilde specifier with a non-zero patch such as `3.10.1` can never match
assert_eq!(m("python_version ~= '3.10.1'"), MarkerTree::FALSE);
assert_eq!(
m("python_version ~= '3.10'"),
m("python_version >= '3.10' and python_version < '4.0'")
);
assert_eq!(
m("python_full_version ~= '3.10.0'"),
m("python_full_version >= '3.10.0' and python_full_version < '3.11.0'")
);
assert_eq!(
m("python_full_version ~= '3.10'"),
m("python_full_version >= '3.10' and python_full_version < '4.0'")
);
}
/// This tests marker implication. /// This tests marker implication.
/// ///
/// Specifically, these test cases come from a [bug] where `foo` and `bar` /// Specifically, these test cases come from a [bug] where `foo` and `bar`
@ -3332,4 +3395,32 @@ mod test {
] ]
); );
} }
/// Case a: There is no version `3` (no trailing zero) in the interner yet.
#[test]
fn marker_normalization_a() {
let left_tree = m("python_version == '3.0.*'");
let left = left_tree.try_to_string().unwrap();
let right = "python_full_version == '3.0.*'";
assert_eq!(left, right, "{left} != {right}");
}
/// Case b: There is already a version `3` (no trailing zero) in the interner.
#[test]
fn marker_normalization_b() {
m("python_version >= '3' and python_version <= '3.0'");
let left_tree = m("python_version == '3.0.*'");
let left = left_tree.try_to_string().unwrap();
let right = "python_full_version == '3.0.*'";
assert_eq!(left, right, "{left} != {right}");
}
#[test]
fn marker_normalization_c() {
let left_tree = MarkerTree::from_str("python_version == '3.10.0.*'").unwrap();
let left = left_tree.try_to_string().unwrap();
let right = "python_full_version == '3.10.*'";
assert_eq!(left, right, "{left} != {right}");
}
} }

View file

@ -18,11 +18,16 @@ use uv_redacted::DisplaySafeUrl;
use crate::Pep508Url; use crate::Pep508Url;
/// A wrapper around [`Url`] that preserves the original string. /// A wrapper around [`Url`] that preserves the original string.
///
/// The original string is not preserved after serialization/deserialization.
#[derive(Debug, Clone, Eq)] #[derive(Debug, Clone, Eq)]
pub struct VerbatimUrl { pub struct VerbatimUrl {
/// The parsed URL. /// The parsed URL.
url: DisplaySafeUrl, url: DisplaySafeUrl,
/// The URL as it was provided by the user. /// The URL as it was provided by the user.
///
/// Even if originally set, this will be [`None`] after
/// serialization/deserialization.
given: Option<ArcStr>, given: Option<ArcStr>,
} }
@ -166,6 +171,11 @@ impl VerbatimUrl {
&self.url &self.url
} }
/// Return a mutable reference to the underlying [`DisplaySafeUrl`].
pub fn raw_mut(&mut self) -> &mut DisplaySafeUrl {
&mut self.url
}
/// Convert a [`VerbatimUrl`] into a [`DisplaySafeUrl`]. /// Convert a [`VerbatimUrl`] into a [`DisplaySafeUrl`].
pub fn to_url(&self) -> DisplaySafeUrl { pub fn to_url(&self) -> DisplaySafeUrl {
self.url.clone() self.url.clone()

View file

@ -758,6 +758,14 @@ impl FormMetadata {
} }
} }
impl<'a> IntoIterator for &'a FormMetadata {
type Item = &'a (&'a str, String);
type IntoIter = std::slice::Iter<'a, (&'a str, String)>;
fn into_iter(self) -> Self::IntoIter {
self.iter()
}
}
/// Build the upload request. /// Build the upload request.
/// ///
/// Returns the request and the reporter progress bar id. /// Returns the request and the reporter progress bar id.

View file

@ -3,6 +3,8 @@ use petgraph::{
graph::{DiGraph, NodeIndex}, graph::{DiGraph, NodeIndex},
}; };
use rustc_hash::{FxHashMap, FxHashSet}; use rustc_hash::{FxHashMap, FxHashSet};
#[cfg(feature = "schemars")]
use std::borrow::Cow;
use std::{collections::BTreeSet, hash::Hash, rc::Rc}; use std::{collections::BTreeSet, hash::Hash, rc::Rc};
use uv_normalize::{ExtraName, GroupName, PackageName}; use uv_normalize::{ExtraName, GroupName, PackageName};
@ -638,12 +640,12 @@ pub struct SchemaConflictItem {
#[cfg(feature = "schemars")] #[cfg(feature = "schemars")]
impl schemars::JsonSchema for SchemaConflictItem { impl schemars::JsonSchema for SchemaConflictItem {
fn schema_name() -> String { fn schema_name() -> Cow<'static, str> {
"SchemaConflictItem".to_string() Cow::Borrowed("SchemaConflictItem")
} }
fn json_schema(r#gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema { fn json_schema(generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
<ConflictItemWire as schemars::JsonSchema>::json_schema(r#gen) <ConflictItemWire as schemars::JsonSchema>::json_schema(generator)
} }
} }

View file

@ -1,4 +1,6 @@
use serde::{Serialize, Serializer}; use serde::{Serialize, Serializer};
#[cfg(feature = "schemars")]
use std::borrow::Cow;
use std::fmt::Display; use std::fmt::Display;
use std::str::FromStr; use std::str::FromStr;
use thiserror::Error; use thiserror::Error;
@ -99,25 +101,16 @@ impl Serialize for Identifier {
#[cfg(feature = "schemars")] #[cfg(feature = "schemars")]
impl schemars::JsonSchema for Identifier { impl schemars::JsonSchema for Identifier {
fn schema_name() -> String { fn schema_name() -> Cow<'static, str> {
"Identifier".to_string() Cow::Borrowed("Identifier")
} }
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema { fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::schema::SchemaObject { schemars::json_schema!({
instance_type: Some(schemars::schema::InstanceType::String.into()), "type": "string",
string: Some(Box::new(schemars::schema::StringValidation { "pattern": r"^[_\p{Alphabetic}][_0-9\p{Alphabetic}]*$",
// Best-effort Unicode support (https://stackoverflow.com/a/68844380/3549270) "description": "An identifier in Python"
pattern: Some(r"^[_\p{Alphabetic}][_0-9\p{Alphabetic}]*$".to_string()), })
..schemars::schema::StringValidation::default()
})),
metadata: Some(Box::new(schemars::schema::Metadata {
description: Some("An identifier in Python".to_string()),
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
} }
} }

File diff suppressed because it is too large

View file

@ -1433,7 +1433,7 @@ pub(crate) fn is_windows_store_shim(path: &Path) -> bool {
0, 0,
buf.as_mut_ptr().cast(), buf.as_mut_ptr().cast(),
buf.len() as u32 * 2, buf.len() as u32 * 2,
&mut bytes_returned, &raw mut bytes_returned,
std::ptr::null_mut(), std::ptr::null_mut(),
) != 0 ) != 0
}; };

View file

@ -12,7 +12,7 @@ use futures::TryStreamExt;
use itertools::Itertools; use itertools::Itertools;
use once_cell::sync::OnceCell; use once_cell::sync::OnceCell;
use owo_colors::OwoColorize; use owo_colors::OwoColorize;
use reqwest_retry::RetryPolicy; use reqwest_retry::{RetryError, RetryPolicy};
use serde::Deserialize; use serde::Deserialize;
use thiserror::Error; use thiserror::Error;
use tokio::io::{AsyncRead, AsyncWriteExt, BufWriter, ReadBuf}; use tokio::io::{AsyncRead, AsyncWriteExt, BufWriter, ReadBuf};
@ -111,6 +111,33 @@ pub enum Error {
}, },
} }
impl Error {
// Return the number of attempts that were made to complete this request before this error was
// returned. Note that e.g. 3 retries equates to 4 attempts.
//
// It's easier to do arithmetic with "attempts" instead of "retries", because if you have
// nested retry loops you can just add up all the attempts directly, while adding up the
// retries requires +1/-1 adjustments.
fn attempts(&self) -> u32 {
// Unfortunately different variants of `Error` track retry counts in different ways. We
// could consider unifying the variants we handle here in `Error::from_reqwest_middleware`
// instead, but both approaches will be fragile as new variants get added over time.
if let Error::NetworkErrorWithRetries { retries, .. } = self {
return retries + 1;
}
// TODO(jack): let-chains are stable as of Rust 1.88. We should use them here as soon as
// our rust-version is high enough.
if let Error::NetworkMiddlewareError(_, anyhow_error) = self {
if let Some(RetryError::WithRetries { retries, .. }) =
anyhow_error.downcast_ref::<RetryError>()
{
return retries + 1;
}
}
1
}
}
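The bookkeeping described in `attempts` is easiest to check with concrete numbers. A worked example with plain integers rather than the real `Error` type:

```rust
fn main() {
    let mut total_attempts: u32 = 0;

    // First pass: the `reqwest-retry` middleware retried 3 times before failing,
    // which `attempts()` reports as 4 attempts.
    total_attempts += 3 + 1;

    // The outer download loop retries once more and fails on its first try.
    total_attempts += 1;
    assert_eq!(total_attempts, 5);

    // The retry policy is consulted with "past retries", i.e. attempts - 1.
    let n_past_retries = total_attempts - 1;
    assert_eq!(n_past_retries, 4);
}
```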
#[derive(Debug, PartialEq, Eq, Clone, Hash)] #[derive(Debug, PartialEq, Eq, Clone, Hash)]
pub struct ManagedPythonDownload { pub struct ManagedPythonDownload {
key: PythonInstallationKey, key: PythonInstallationKey,
@ -695,7 +722,8 @@ impl ManagedPythonDownload {
pypy_install_mirror: Option<&str>, pypy_install_mirror: Option<&str>,
reporter: Option<&dyn Reporter>, reporter: Option<&dyn Reporter>,
) -> Result<DownloadResult, Error> { ) -> Result<DownloadResult, Error> {
let mut n_past_retries = 0; let mut total_attempts = 0;
let mut retried_here = false;
let start_time = SystemTime::now(); let start_time = SystemTime::now();
let retry_policy = client.retry_policy(); let retry_policy = client.retry_policy();
loop { loop {
@ -710,13 +738,19 @@ impl ManagedPythonDownload {
reporter, reporter,
) )
.await; .await;
if result let result = match result {
.as_ref() Ok(download_result) => Ok(download_result),
.err() Err(err) => {
.is_some_and(|err| is_extended_transient_error(err)) // Inner retry loops (e.g. `reqwest-retry` middleware) might make more than one
{ // attempt per error we see here.
total_attempts += err.attempts();
// We currently interpret e.g. "3 retries" to mean we should make 4 attempts.
let n_past_retries = total_attempts - 1;
if is_extended_transient_error(&err) {
let retry_decision = retry_policy.should_retry(start_time, n_past_retries); let retry_decision = retry_policy.should_retry(start_time, n_past_retries);
if let reqwest_retry::RetryDecision::Retry { execute_after } = retry_decision { if let reqwest_retry::RetryDecision::Retry { execute_after } =
retry_decision
{
debug!( debug!(
"Transient failure while handling response for {}; retrying...", "Transient failure while handling response for {}; retrying...",
self.key() self.key()
@ -725,10 +759,20 @@ impl ManagedPythonDownload {
.duration_since(SystemTime::now()) .duration_since(SystemTime::now())
.unwrap_or_else(|_| Duration::default()); .unwrap_or_else(|_| Duration::default());
tokio::time::sleep(duration).await; tokio::time::sleep(duration).await;
n_past_retries += 1; retried_here = true;
continue; continue; // Retry.
} }
} }
if retried_here {
Err(Error::NetworkErrorWithRetries {
err: Box::new(err),
retries: n_past_retries,
})
} else {
Err(err)
}
}
};
return result; return result;
} }
} }
@ -772,7 +816,9 @@ impl ManagedPythonDownload {
let temp_dir = tempfile::tempdir_in(scratch_dir).map_err(Error::DownloadDirError)?; let temp_dir = tempfile::tempdir_in(scratch_dir).map_err(Error::DownloadDirError)?;
if let Some(python_builds_dir) = env::var_os(EnvVars::UV_PYTHON_CACHE_DIR) { if let Some(python_builds_dir) =
env::var_os(EnvVars::UV_PYTHON_CACHE_DIR).filter(|s| !s.is_empty())
{
let python_builds_dir = PathBuf::from(python_builds_dir); let python_builds_dir = PathBuf::from(python_builds_dir);
fs_err::create_dir_all(&python_builds_dir)?; fs_err::create_dir_all(&python_builds_dir)?;
let hash_prefix = match self.sha256 { let hash_prefix = match self.sha256 {

View file

@ -44,6 +44,13 @@ impl ImplementationName {
Self::GraalPy => "GraalPy", Self::GraalPy => "GraalPy",
} }
} }
pub fn executable_name(self) -> &'static str {
match self {
Self::CPython => "python",
Self::PyPy | Self::GraalPy => self.into(),
}
}
} }
impl LenientImplementationName { impl LenientImplementationName {
@ -53,6 +60,13 @@ impl LenientImplementationName {
Self::Unknown(name) => name, Self::Unknown(name) => name,
} }
} }
pub fn executable_name(&self) -> &str {
match self {
Self::Known(implementation) => implementation.executable_name(),
Self::Unknown(name) => name,
}
}
} }
impl From<&ImplementationName> for &'static str { impl From<&ImplementationName> for &'static str {

View file

@ -967,25 +967,10 @@ impl InterpreterInfo {
pub(crate) fn query_cached(executable: &Path, cache: &Cache) -> Result<Self, Error> { pub(crate) fn query_cached(executable: &Path, cache: &Cache) -> Result<Self, Error> {
let absolute = std::path::absolute(executable)?; let absolute = std::path::absolute(executable)?;
let cache_entry = cache.entry( // Provide a better error message if the link is broken or the file does not exist. Since
CacheBucket::Interpreter, // `canonicalize_executable` does not resolve the file on Windows, we must re-use this logic
// Shard interpreter metadata by host architecture, operating system, and version, to // for the subsequent metadata read as we may not have actually resolved the path.
// invalidate the cache (e.g.) on OS upgrades. let handle_io_error = |err: io::Error| -> Error {
cache_digest(&(
ARCH,
sys_info::os_type().unwrap_or_default(),
sys_info::os_release().unwrap_or_default(),
)),
// We use the absolute path for the cache entry to avoid cache collisions for relative
// paths. But we don't to query the executable with symbolic links resolved.
format!("{}.msgpack", cache_digest(&absolute)),
);
// We check the timestamp of the canonicalized executable to check if an underlying
// interpreter has been modified.
let modified = canonicalize_executable(&absolute)
.and_then(Timestamp::from_path)
.map_err(|err| {
if err.kind() == io::ErrorKind::NotFound { if err.kind() == io::ErrorKind::NotFound {
// Check if it looks like a venv interpreter where the underlying Python // Check if it looks like a venv interpreter where the underlying Python
// installation was removed. // installation was removed.
@ -1003,7 +988,32 @@ impl InterpreterInfo {
} else { } else {
err.into() err.into()
} }
})?; };
let canonical = canonicalize_executable(&absolute).map_err(handle_io_error)?;
let cache_entry = cache.entry(
CacheBucket::Interpreter,
// Shard interpreter metadata by host architecture, operating system, and version, to
// invalidate the cache (e.g.) on OS upgrades.
cache_digest(&(
ARCH,
sys_info::os_type().unwrap_or_default(),
sys_info::os_release().unwrap_or_default(),
)),
// We use the absolute path for the cache entry to avoid cache collisions for relative
// paths. But we don't want to query the executable with symbolic links resolved because
// that can change reported values, e.g., `sys.executable`. We include the canonical
// path in the cache entry as well, otherwise we can have cache collisions if an
// absolute path refers to different interpreters with matching ctimes, e.g., if you
// have a `.venv/bin/python` pointing to both Python 3.12 and Python 3.13 that were
// modified at the same time.
format!("{}.msgpack", cache_digest(&(&absolute, &canonical))),
);
// We check the timestamp of the canonicalized executable to check if an underlying
// interpreter has been modified.
let modified = Timestamp::from_path(canonical).map_err(handle_io_error)?;
// Read from the cache. // Read from the cache.
if cache if cache
@ -1015,7 +1025,7 @@ impl InterpreterInfo {
Ok(cached) => { Ok(cached) => {
if cached.timestamp == modified { if cached.timestamp == modified {
trace!( trace!(
"Cached interpreter info for Python {}, skipping probing: {}", "Found cached interpreter info for Python {}, skipping query of: {}",
cached.data.markers.python_full_version(), cached.data.markers.python_full_version(),
executable.user_display() executable.user_display()
); );

View file

@ -362,11 +362,7 @@ impl ManagedPythonInstallation {
/// If windowed is true, `pythonw.exe` is selected over `python.exe` on windows, with no changes /// If windowed is true, `pythonw.exe` is selected over `python.exe` on windows, with no changes
/// on non-windows. /// on non-windows.
pub fn executable(&self, windowed: bool) -> PathBuf { pub fn executable(&self, windowed: bool) -> PathBuf {
let implementation = match self.implementation() { let implementation = self.implementation().executable_name();
ImplementationName::CPython => "python",
ImplementationName::PyPy => "pypy",
ImplementationName::GraalPy => "graalpy",
};
let version = match self.implementation() { let version = match self.implementation() {
ImplementationName::CPython => { ImplementationName::CPython => {

View file

@ -43,15 +43,36 @@ impl Ord for Arch {
return self.variant.cmp(&other.variant); return self.variant.cmp(&other.variant);
} }
let native = Arch::from_env(); // For the time being, manually make aarch64 windows disfavored
// on its own host platform, because most packages don't have wheels for
// aarch64 windows, making emulation more useful than native execution!
//
// The reason we do this in "sorting" and not "supports" is so that we don't
// *refuse* to use aarch64 windows pythons if they happen to be installed
// and nothing else is available.
//
// Similarly if someone manually requests an aarch64 windows install, we
// should respect that request (this is the way users should "override"
// this behaviour).
let preferred = if cfg!(all(windows, target_arch = "aarch64")) {
Arch {
family: target_lexicon::Architecture::X86_64,
variant: None,
}
} else {
// Prefer native architectures // Prefer native architectures
match (self.family == native.family, other.family == native.family) { Arch::from_env()
};
match (
self.family == preferred.family,
other.family == preferred.family,
) {
(true, true) => unreachable!(), (true, true) => unreachable!(),
(true, false) => std::cmp::Ordering::Less, (true, false) => std::cmp::Ordering::Less,
(false, true) => std::cmp::Ordering::Greater, (false, true) => std::cmp::Ordering::Greater,
(false, false) => { (false, false) => {
// Both non-native, fallback to lexicographic order // Both non-preferred, fallback to lexicographic order
self.family.to_string().cmp(&other.family.to_string()) self.family.to_string().cmp(&other.family.to_string())
} }
} }
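A minimal sketch of the preference rule above, using `target_lexicon` directly rather than uv's `Arch` wrapper: on aarch64 Windows the sort treats x86_64 as the preferred family because most packages still lack aarch64 Windows wheels, while every other host keeps its native architecture preferred.

```rust
// Assumes only the `target_lexicon` crate; `HOST` and `Architecture` are public there.
fn preferred_family() -> target_lexicon::Architecture {
    if cfg!(all(windows, target_arch = "aarch64")) {
        target_lexicon::Architecture::X86_64
    } else {
        target_lexicon::HOST.architecture
    }
}

fn main() {
    println!("preferred architecture family: {}", preferred_family());
}
```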

View file

@ -1,3 +1,5 @@
#[cfg(feature = "schemars")]
use std::borrow::Cow;
use std::fmt::{Display, Formatter}; use std::fmt::{Display, Formatter};
use std::ops::Deref; use std::ops::Deref;
use std::str::FromStr; use std::str::FromStr;
@ -65,26 +67,16 @@ impl FromStr for PythonVersion {
#[cfg(feature = "schemars")] #[cfg(feature = "schemars")]
impl schemars::JsonSchema for PythonVersion { impl schemars::JsonSchema for PythonVersion {
fn schema_name() -> String { fn schema_name() -> Cow<'static, str> {
String::from("PythonVersion") Cow::Borrowed("PythonVersion")
} }
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema { fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::schema::SchemaObject { schemars::json_schema!({
instance_type: Some(schemars::schema::InstanceType::String.into()), "type": "string",
string: Some(Box::new(schemars::schema::StringValidation { "pattern": r"^3\.\d+(\.\d+)?$",
pattern: Some(r"^3\.\d+(\.\d+)?$".to_string()), "description": "A Python version specifier, e.g. `3.11` or `3.12.4`."
..schemars::schema::StringValidation::default() })
})),
metadata: Some(Box::new(schemars::schema::Metadata {
description: Some(
"A Python version specifier, e.g. `3.11` or `3.12.4`.".to_string(),
),
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
} }
} }

View file

@ -1,7 +1,7 @@
//! DO NOT EDIT //! DO NOT EDIT
//! //!
//! Generated with `cargo run dev generate-sysconfig-metadata` //! Generated with `cargo run dev generate-sysconfig-metadata`
//! Targets from <https://github.com/astral-sh/python-build-standalone/blob/20250612/cpython-unix/targets.yml> //! Targets from <https://github.com/astral-sh/python-build-standalone/blob/20250702/cpython-unix/targets.yml>
//! //!
#![allow(clippy::all)] #![allow(clippy::all)]
#![cfg_attr(any(), rustfmt::skip)] #![cfg_attr(any(), rustfmt::skip)]
@ -15,7 +15,6 @@ use crate::sysconfig::replacements::{ReplacementEntry, ReplacementMode};
pub(crate) static DEFAULT_VARIABLE_UPDATES: LazyLock<BTreeMap<String, Vec<ReplacementEntry>>> = LazyLock::new(|| { pub(crate) static DEFAULT_VARIABLE_UPDATES: LazyLock<BTreeMap<String, Vec<ReplacementEntry>>> = LazyLock::new(|| {
BTreeMap::from_iter([ BTreeMap::from_iter([
("BLDSHARED".to_string(), vec![ ("BLDSHARED".to_string(), vec![
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/aarch64-linux-gnu-gcc".to_string() }, to: "cc".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabi-gcc".to_string() }, to: "cc".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabi-gcc".to_string() }, to: "cc".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabihf-gcc".to_string() }, to: "cc".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabihf-gcc".to_string() }, to: "cc".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/mips-linux-gnu-gcc".to_string() }, to: "cc".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/mips-linux-gnu-gcc".to_string() }, to: "cc".to_string() },
@ -28,7 +27,6 @@ pub(crate) static DEFAULT_VARIABLE_UPDATES: LazyLock<BTreeMap<String, Vec<Replac
ReplacementEntry { mode: ReplacementMode::Partial { from: "musl-clang".to_string() }, to: "cc".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "musl-clang".to_string() }, to: "cc".to_string() },
]), ]),
("CC".to_string(), vec![ ("CC".to_string(), vec![
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/aarch64-linux-gnu-gcc".to_string() }, to: "cc".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabi-gcc".to_string() }, to: "cc".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabi-gcc".to_string() }, to: "cc".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabihf-gcc".to_string() }, to: "cc".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabihf-gcc".to_string() }, to: "cc".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/mips-linux-gnu-gcc".to_string() }, to: "cc".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/mips-linux-gnu-gcc".to_string() }, to: "cc".to_string() },
@ -41,7 +39,6 @@ pub(crate) static DEFAULT_VARIABLE_UPDATES: LazyLock<BTreeMap<String, Vec<Replac
ReplacementEntry { mode: ReplacementMode::Partial { from: "musl-clang".to_string() }, to: "cc".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "musl-clang".to_string() }, to: "cc".to_string() },
]), ]),
("CXX".to_string(), vec![ ("CXX".to_string(), vec![
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/aarch64-linux-gnu-g++".to_string() }, to: "c++".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabi-g++".to_string() }, to: "c++".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabi-g++".to_string() }, to: "c++".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabihf-g++".to_string() }, to: "c++".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabihf-g++".to_string() }, to: "c++".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/mips-linux-gnu-g++".to_string() }, to: "c++".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/mips-linux-gnu-g++".to_string() }, to: "c++".to_string() },
@ -53,7 +50,6 @@ pub(crate) static DEFAULT_VARIABLE_UPDATES: LazyLock<BTreeMap<String, Vec<Replac
ReplacementEntry { mode: ReplacementMode::Partial { from: "clang++".to_string() }, to: "c++".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "clang++".to_string() }, to: "c++".to_string() },
]), ]),
("LDCXXSHARED".to_string(), vec![ ("LDCXXSHARED".to_string(), vec![
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/aarch64-linux-gnu-g++".to_string() }, to: "c++".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabi-g++".to_string() }, to: "c++".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabi-g++".to_string() }, to: "c++".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabihf-g++".to_string() }, to: "c++".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabihf-g++".to_string() }, to: "c++".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/mips-linux-gnu-g++".to_string() }, to: "c++".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/mips-linux-gnu-g++".to_string() }, to: "c++".to_string() },
@ -65,7 +61,6 @@ pub(crate) static DEFAULT_VARIABLE_UPDATES: LazyLock<BTreeMap<String, Vec<Replac
ReplacementEntry { mode: ReplacementMode::Partial { from: "clang++".to_string() }, to: "c++".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "clang++".to_string() }, to: "c++".to_string() },
]), ]),
("LDSHARED".to_string(), vec![ ("LDSHARED".to_string(), vec![
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/aarch64-linux-gnu-gcc".to_string() }, to: "cc".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabi-gcc".to_string() }, to: "cc".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabi-gcc".to_string() }, to: "cc".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabihf-gcc".to_string() }, to: "cc".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabihf-gcc".to_string() }, to: "cc".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/mips-linux-gnu-gcc".to_string() }, to: "cc".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/mips-linux-gnu-gcc".to_string() }, to: "cc".to_string() },
@ -78,7 +73,6 @@ pub(crate) static DEFAULT_VARIABLE_UPDATES: LazyLock<BTreeMap<String, Vec<Replac
ReplacementEntry { mode: ReplacementMode::Partial { from: "musl-clang".to_string() }, to: "cc".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "musl-clang".to_string() }, to: "cc".to_string() },
]), ]),
("LINKCC".to_string(), vec![ ("LINKCC".to_string(), vec![
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/aarch64-linux-gnu-gcc".to_string() }, to: "cc".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabi-gcc".to_string() }, to: "cc".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabi-gcc".to_string() }, to: "cc".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabihf-gcc".to_string() }, to: "cc".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/arm-linux-gnueabihf-gcc".to_string() }, to: "cc".to_string() },
ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/mips-linux-gnu-gcc".to_string() }, to: "cc".to_string() }, ReplacementEntry { mode: ReplacementMode::Partial { from: "/usr/bin/mips-linux-gnu-gcc".to_string() }, to: "cc".to_string() },

View file

@ -349,7 +349,7 @@ mod tests {
// Cross-compiles use GNU // Cross-compiles use GNU
let sysconfigdata = [ let sysconfigdata = [
("CC", "/usr/bin/aarch64-linux-gnu-gcc"), ("CC", "/usr/bin/riscv64-linux-gnu-gcc"),
("CXX", "/usr/bin/x86_64-linux-gnu-g++"), ("CXX", "/usr/bin/x86_64-linux-gnu-g++"),
] ]
.into_iter() .into_iter()

View file

@ -177,7 +177,9 @@ impl FromStr for DisplaySafeUrl {
} }
fn is_ssh_git_username(url: &Url) -> bool { fn is_ssh_git_username(url: &Url) -> bool {
matches!(url.scheme(), "ssh" | "git+ssh") && url.username() == "git" && url.password().is_none() matches!(url.scheme(), "ssh" | "git+ssh" | "git+https")
&& url.username() == "git"
&& url.password().is_none()
} }
fn display_with_redacted_credentials( fn display_with_redacted_credentials(

View file

@ -1,6 +1,7 @@
--- ---
source: crates/uv-requirements-txt/src/lib.rs source: crates/uv-requirements-txt/src/lib.rs
expression: actual expression: actual
snapshot_kind: text
--- ---
RequirementsTxt { RequirementsTxt {
requirements: [ requirements: [
@ -23,7 +24,7 @@ RequirementsTxt {
), ),
), ),
), ),
marker: python_full_version >= '3.8' and python_full_version < '4.0', marker: python_full_version >= '3.8' and python_full_version < '4',
origin: Some( origin: Some(
File( File(
"<REQUIREMENTS_DIR>/poetry-with-hashes.txt", "<REQUIREMENTS_DIR>/poetry-with-hashes.txt",
@ -54,7 +55,7 @@ RequirementsTxt {
), ),
), ),
), ),
marker: python_full_version >= '3.8' and python_full_version < '4.0', marker: python_full_version >= '3.8' and python_full_version < '4',
origin: Some( origin: Some(
File( File(
"<REQUIREMENTS_DIR>/poetry-with-hashes.txt", "<REQUIREMENTS_DIR>/poetry-with-hashes.txt",
@ -85,7 +86,7 @@ RequirementsTxt {
), ),
), ),
), ),
marker: python_full_version >= '3.8' and python_full_version < '4.0' and sys_platform == 'win32', marker: python_full_version >= '3.8' and python_full_version < '4' and sys_platform == 'win32',
origin: Some( origin: Some(
File( File(
"<REQUIREMENTS_DIR>/poetry-with-hashes.txt", "<REQUIREMENTS_DIR>/poetry-with-hashes.txt",
@ -116,7 +117,7 @@ RequirementsTxt {
), ),
), ),
), ),
marker: python_full_version >= '3.8' and python_full_version < '4.0', marker: python_full_version >= '3.8' and python_full_version < '4',
origin: Some( origin: Some(
File( File(
"<REQUIREMENTS_DIR>/poetry-with-hashes.txt", "<REQUIREMENTS_DIR>/poetry-with-hashes.txt",
@ -148,7 +149,7 @@ RequirementsTxt {
), ),
), ),
), ),
marker: python_full_version >= '3.8' and python_full_version < '4.0', marker: python_full_version >= '3.8' and python_full_version < '4',
origin: Some( origin: Some(
File( File(
"<REQUIREMENTS_DIR>/poetry-with-hashes.txt", "<REQUIREMENTS_DIR>/poetry-with-hashes.txt",

View file

@ -1,6 +1,7 @@
--- ---
source: crates/uv-requirements-txt/src/lib.rs source: crates/uv-requirements-txt/src/lib.rs
expression: actual expression: actual
snapshot_kind: text
--- ---
RequirementsTxt { RequirementsTxt {
requirements: [ requirements: [
@ -23,7 +24,7 @@ RequirementsTxt {
), ),
), ),
), ),
marker: python_full_version >= '3.8' and python_full_version < '4.0', marker: python_full_version >= '3.8' and python_full_version < '4',
origin: Some( origin: Some(
File( File(
"<REQUIREMENTS_DIR>/poetry-with-hashes.txt", "<REQUIREMENTS_DIR>/poetry-with-hashes.txt",
@ -54,7 +55,7 @@ RequirementsTxt {
), ),
), ),
), ),
marker: python_full_version >= '3.8' and python_full_version < '4.0', marker: python_full_version >= '3.8' and python_full_version < '4',
origin: Some( origin: Some(
File( File(
"<REQUIREMENTS_DIR>/poetry-with-hashes.txt", "<REQUIREMENTS_DIR>/poetry-with-hashes.txt",
@ -85,7 +86,7 @@ RequirementsTxt {
), ),
), ),
), ),
marker: python_full_version >= '3.8' and python_full_version < '4.0' and sys_platform == 'win32', marker: python_full_version >= '3.8' and python_full_version < '4' and sys_platform == 'win32',
origin: Some( origin: Some(
File( File(
"<REQUIREMENTS_DIR>/poetry-with-hashes.txt", "<REQUIREMENTS_DIR>/poetry-with-hashes.txt",
@ -116,7 +117,7 @@ RequirementsTxt {
), ),
), ),
), ),
marker: python_full_version >= '3.8' and python_full_version < '4.0', marker: python_full_version >= '3.8' and python_full_version < '4',
origin: Some( origin: Some(
File( File(
"<REQUIREMENTS_DIR>/poetry-with-hashes.txt", "<REQUIREMENTS_DIR>/poetry-with-hashes.txt",
@ -148,7 +149,7 @@ RequirementsTxt {
), ),
), ),
), ),
marker: python_full_version >= '3.8' and python_full_version < '4.0', marker: python_full_version >= '3.8' and python_full_version < '4',
origin: Some( origin: Some(
File( File(
"<REQUIREMENTS_DIR>/poetry-with-hashes.txt", "<REQUIREMENTS_DIR>/poetry-with-hashes.txt",

View file

@ -13,10 +13,9 @@ use uv_normalize::PackageName;
use uv_pep440::Version; use uv_pep440::Version;
use uv_types::InstalledPackagesProvider; use uv_types::InstalledPackagesProvider;
use crate::preferences::{Entry, Preferences}; use crate::preferences::{Entry, PreferenceSource, Preferences};
use crate::prerelease::{AllowPrerelease, PrereleaseStrategy}; use crate::prerelease::{AllowPrerelease, PrereleaseStrategy};
use crate::resolution_mode::ResolutionStrategy; use crate::resolution_mode::ResolutionStrategy;
use crate::universal_marker::UniversalMarker;
use crate::version_map::{VersionMap, VersionMapDistHandle}; use crate::version_map::{VersionMap, VersionMapDistHandle};
use crate::{Exclusions, Manifest, Options, ResolverEnvironment}; use crate::{Exclusions, Manifest, Options, ResolverEnvironment};
@ -188,7 +187,7 @@ impl CandidateSelector {
if index.is_some_and(|index| !entry.index().matches(index)) { if index.is_some_and(|index| !entry.index().matches(index)) {
return None; return None;
} }
Either::Left(std::iter::once((entry.marker(), entry.pin().version()))) Either::Left(std::iter::once((entry.pin().version(), entry.source())))
} }
[..] => { [..] => {
type Entries<'a> = SmallVec<[&'a Entry; 3]>; type Entries<'a> = SmallVec<[&'a Entry; 3]>;
@ -219,7 +218,7 @@ impl CandidateSelector {
Either::Right( Either::Right(
preferences preferences
.into_iter() .into_iter()
.map(|entry| (entry.marker(), entry.pin().version())), .map(|entry| (entry.pin().version(), entry.source())),
) )
} }
}; };
@ -238,7 +237,7 @@ impl CandidateSelector {
/// Return the first preference that satisfies the current range and is allowed. /// Return the first preference that satisfies the current range and is allowed.
fn get_preferred_from_iter<'a, InstalledPackages: InstalledPackagesProvider>( fn get_preferred_from_iter<'a, InstalledPackages: InstalledPackagesProvider>(
&'a self, &'a self,
preferences: impl Iterator<Item = (&'a UniversalMarker, &'a Version)>, preferences: impl Iterator<Item = (&'a Version, PreferenceSource)>,
package_name: &'a PackageName, package_name: &'a PackageName,
range: &Range<Version>, range: &Range<Version>,
version_maps: &'a [VersionMap], version_maps: &'a [VersionMap],
@ -246,7 +245,7 @@ impl CandidateSelector {
reinstall: bool, reinstall: bool,
env: &ResolverEnvironment, env: &ResolverEnvironment,
) -> Option<Candidate<'a>> { ) -> Option<Candidate<'a>> {
for (marker, version) in preferences { for (version, source) in preferences {
// Respect the version range for this requirement. // Respect the version range for this requirement.
if !range.contains(version) { if !range.contains(version) {
continue; continue;
@ -290,9 +289,14 @@ impl CandidateSelector {
let allow = match self.prerelease_strategy.allows(package_name, env) { let allow = match self.prerelease_strategy.allows(package_name, env) {
AllowPrerelease::Yes => true, AllowPrerelease::Yes => true,
AllowPrerelease::No => false, AllowPrerelease::No => false,
// If the pre-release is "global" (i.e., provided via a lockfile, rather than // If the pre-release was provided via an existing file, rather than from the
// a fork), accept it unless pre-releases are completely banned. // current solve, accept it unless pre-releases are completely banned.
AllowPrerelease::IfNecessary => marker.is_true(), AllowPrerelease::IfNecessary => match source {
PreferenceSource::Resolver => false,
PreferenceSource::Lock
| PreferenceSource::Environment
| PreferenceSource::RequirementsTxt => true,
},
}; };
if !allow { if !allow {
continue; continue;
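The comment above captures the new rule: when the pre-release strategy resolves to `AllowPrerelease::IfNecessary`, a preferred pre-release is honored only if it came from an existing input (lockfile, environment, or `requirements.txt`) rather than the in-progress solve. A minimal, self-contained sketch of that gate, not part of the diff; the enums are local stand-ins mirroring `AllowPrerelease` and the new `PreferenceSource`:

// Local stand-ins for illustration only; they mirror the variants in the diff.
#[allow(dead_code)]
#[derive(Clone, Copy)]
enum AllowPrerelease {
    Yes,
    No,
    IfNecessary,
}

#[allow(dead_code)]
#[derive(Clone, Copy)]
enum PreferenceSource {
    Resolver,
    Lock,
    Environment,
    RequirementsTxt,
}

fn allows_preferred_prerelease(strategy: AllowPrerelease, source: PreferenceSource) -> bool {
    match strategy {
        AllowPrerelease::Yes => true,
        AllowPrerelease::No => false,
        // Pre-releases carried over from an existing file or environment are
        // kept; ones that would be introduced by the current solve are not.
        AllowPrerelease::IfNecessary => !matches!(source, PreferenceSource::Resolver),
    }
}

fn main() {
    assert!(allows_preferred_prerelease(
        AllowPrerelease::IfNecessary,
        PreferenceSource::Lock
    ));
    assert!(!allows_preferred_prerelease(
        AllowPrerelease::IfNecessary,
        PreferenceSource::Resolver
    ));
}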


@ -3,6 +3,7 @@ use std::fmt::Formatter;
use std::sync::Arc; use std::sync::Arc;
use indexmap::IndexSet; use indexmap::IndexSet;
use itertools::Itertools;
use owo_colors::OwoColorize; use owo_colors::OwoColorize;
use pubgrub::{ use pubgrub::{
DefaultStringReporter, DerivationTree, Derived, External, Range, Ranges, Reporter, Term, DefaultStringReporter, DerivationTree, Derived, External, Range, Ranges, Reporter, Term,
@ -17,6 +18,8 @@ use uv_normalize::{ExtraName, InvalidNameError, PackageName};
use uv_pep440::{LocalVersionSlice, LowerBound, Version, VersionSpecifier}; use uv_pep440::{LocalVersionSlice, LowerBound, Version, VersionSpecifier};
use uv_pep508::{MarkerEnvironment, MarkerExpression, MarkerTree, MarkerValueVersion}; use uv_pep508::{MarkerEnvironment, MarkerExpression, MarkerTree, MarkerValueVersion};
use uv_platform_tags::Tags; use uv_platform_tags::Tags;
use uv_pypi_types::ParsedUrl;
use uv_redacted::DisplaySafeUrl;
use uv_static::EnvVars; use uv_static::EnvVars;
use crate::candidate_selector::CandidateSelector; use crate::candidate_selector::CandidateSelector;
@ -56,11 +59,14 @@ pub enum ResolveError {
} else { } else {
format!(" in {env}") format!(" in {env}")
}, },
urls.join("\n- "), urls.iter()
.map(|url| format!("{}{}", DisplaySafeUrl::from(url.clone()), if url.is_editable() { " (editable)" } else { "" }))
.collect::<Vec<_>>()
.join("\n- ")
)] )]
ConflictingUrls { ConflictingUrls {
package_name: PackageName, package_name: PackageName,
urls: Vec<String>, urls: Vec<ParsedUrl>,
env: ResolverEnvironment, env: ResolverEnvironment,
}, },
@ -71,11 +77,14 @@ pub enum ResolveError {
} else { } else {
format!(" in {env}") format!(" in {env}")
}, },
indexes.join("\n- "), indexes.iter()
.map(std::string::ToString::to_string)
.collect::<Vec<_>>()
.join("\n- ")
)] )]
ConflictingIndexesForEnvironment { ConflictingIndexesForEnvironment {
package_name: PackageName, package_name: PackageName,
indexes: Vec<String>, indexes: Vec<IndexUrl>,
env: ResolverEnvironment, env: ResolverEnvironment,
}, },
@ -148,7 +157,7 @@ impl<T> From<tokio::sync::mpsc::error::SendError<T>> for ResolveError {
} }
} }
pub(crate) type ErrorTree = DerivationTree<PubGrubPackage, Range<Version>, UnavailableReason>; pub type ErrorTree = DerivationTree<PubGrubPackage, Range<Version>, UnavailableReason>;
/// A wrapper around [`pubgrub::error::NoSolutionError`] that displays a resolution failure report. /// A wrapper around [`pubgrub::error::NoSolutionError`] that displays a resolution failure report.
pub struct NoSolutionError { pub struct NoSolutionError {
@ -359,6 +368,11 @@ impl NoSolutionError {
NoSolutionHeader::new(self.env.clone()) NoSolutionHeader::new(self.env.clone())
} }
/// Get the conflict derivation tree for external analysis
pub fn derivation_tree(&self) -> &ErrorTree {
&self.error
}
/// Hint at limiting the resolver environment if universal resolution failed for a target /// Hint at limiting the resolver environment if universal resolution failed for a target
/// that is not the current platform or not the current Python version. /// that is not the current platform or not the current Python version.
fn hint_disjoint_targets(&self, f: &mut Formatter) -> std::fmt::Result { fn hint_disjoint_targets(&self, f: &mut Formatter) -> std::fmt::Result {
@ -396,6 +410,15 @@ impl NoSolutionError {
} }
Ok(()) Ok(())
} }
/// Get the packages that are involved in this error.
pub fn packages(&self) -> impl Iterator<Item = &PackageName> {
self.error
.packages()
.into_iter()
.filter_map(|p| p.name())
.unique()
}
} }
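The two accessors added above, together with the now-public `ErrorTree` alias, allow external analysis of a failed resolution without parsing the rendered report. A hedged usage sketch, not part of the diff; it assumes the crate is consumed as `uv_resolver`, matching the re-exports in the `lib.rs` change further down:

use uv_resolver::{ErrorTree, NoSolutionError};

// Summarize a resolution failure programmatically.
fn summarize_conflict(err: &NoSolutionError) {
    // The raw PubGrub derivation tree, exposed for external analysis.
    let _tree: &ErrorTree = err.derivation_tree();
    // The distinct package names involved in the conflict.
    for package in err.packages() {
        eprintln!("conflict involves: {package}");
    }
}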
impl std::fmt::Debug for NoSolutionError { impl std::fmt::Debug for NoSolutionError {
@ -1213,6 +1236,69 @@ impl SentinelRange<'_> {
} }
} }
/// A prefix match, e.g., `==2.4.*`, which is desugared to a range like `>=2.4.dev0,<2.5.dev0`.
#[derive(Debug, Clone, PartialEq, Eq)]
pub(crate) struct PrefixMatch<'a> {
version: &'a Version,
}
impl<'a> PrefixMatch<'a> {
/// Determine whether a given range is equivalent to a prefix match (e.g., `==2.4.*`).
///
/// Prefix matches are desugared to (e.g.) `>=2.4.dev0,<2.5.dev0`, but we want to render them
/// as `==2.4.*` in error messages.
pub(crate) fn from_range(lower: &'a Bound<Version>, upper: &'a Bound<Version>) -> Option<Self> {
let Bound::Included(lower) = lower else {
return None;
};
let Bound::Excluded(upper) = upper else {
return None;
};
if lower.is_pre() || lower.is_post() || lower.is_local() {
return None;
}
if upper.is_pre() || upper.is_post() || upper.is_local() {
return None;
}
if lower.dev() != Some(0) {
return None;
}
if upper.dev() != Some(0) {
return None;
}
if lower.release().len() != upper.release().len() {
return None;
}
// All segments should be the same, except the last one, which should be incremented.
let num_segments = lower.release().len();
for (i, (lower, upper)) in lower
.release()
.iter()
.zip(upper.release().iter())
.enumerate()
{
if i == num_segments - 1 {
if lower + 1 != *upper {
return None;
}
} else {
if lower != upper {
return None;
}
}
}
Some(PrefixMatch { version: lower })
}
}
impl std::fmt::Display for PrefixMatch<'_> {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
write!(f, "=={}.*", self.version.only_release())
}
}
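For context, `PrefixMatch` exists purely for error rendering: a specifier like `foo==2.4.*` reaches the resolver as the desugared range `>=2.4.dev0,<2.5.dev0`, and this helper recognizes that shape so the report formatter (see the `report.rs` changes further down) can print it back in its prefix form. An illustrative sketch of the shape being detected, assuming `uv_pep440::Version` parses via `FromStr` and that `release()` yields a slice of segments:

use std::str::FromStr;
use uv_pep440::Version;

fn main() {
    // `foo==2.4.*` desugars to the half-open range `>=2.4.dev0,<2.5.dev0`.
    let lower = Version::from_str("2.4.dev0").unwrap();
    let upper = Version::from_str("2.5.dev0").unwrap();
    // The conditions `PrefixMatch::from_range` checks: both bounds sit at
    // `.dev0`, the release lengths match, and only the last segment is bumped.
    assert_eq!(lower.dev(), Some(0));
    assert_eq!(upper.dev(), Some(0));
    assert_eq!(lower.release().len(), upper.release().len());
    assert_eq!(lower.release().last().unwrap() + 1, *upper.release().last().unwrap());
}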
#[derive(Debug)] #[derive(Debug)]
pub struct NoSolutionHeader { pub struct NoSolutionHeader {
/// The [`ResolverEnvironment`] that caused the failure. /// The [`ResolverEnvironment`] that caused the failure.


@ -1,3 +1,5 @@
#[cfg(feature = "schemars")]
use std::borrow::Cow;
use std::str::FromStr; use std::str::FromStr;
use jiff::{Timestamp, ToSpan, tz::TimeZone}; use jiff::{Timestamp, ToSpan, tz::TimeZone};
@ -67,25 +69,15 @@ impl std::fmt::Display for ExcludeNewer {
#[cfg(feature = "schemars")] #[cfg(feature = "schemars")]
impl schemars::JsonSchema for ExcludeNewer { impl schemars::JsonSchema for ExcludeNewer {
fn schema_name() -> String { fn schema_name() -> Cow<'static, str> {
"ExcludeNewer".to_string() Cow::Borrowed("ExcludeNewer")
} }
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema { fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::schema::SchemaObject { schemars::json_schema!({
instance_type: Some(schemars::schema::InstanceType::String.into()), "type": "string",
string: Some(Box::new(schemars::schema::StringValidation { "pattern": r"^\d{4}-\d{2}-\d{2}(T\d{2}:\d{2}:\d{2}(Z|[+-]\d{2}:\d{2}))?$",
pattern: Some( "description": "Exclude distributions uploaded after the given timestamp.\n\nAccepts both RFC 3339 timestamps (e.g., `2006-12-02T02:07:43Z`) and local dates in the same format (e.g., `2006-12-02`).",
r"^\d{4}-\d{2}-\d{2}(T\d{2}:\d{2}:\d{2}(Z|[+-]\d{2}:\d{2}))?$".to_string(), })
),
..schemars::schema::StringValidation::default()
})),
metadata: Some(Box::new(schemars::schema::Metadata {
description: Some("Exclude distributions uploaded after the given timestamp.\n\nAccepts both RFC 3339 timestamps (e.g., `2006-12-02T02:07:43Z`) and local dates in the same format (e.g., `2006-12-02`).".to_string()),
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
} }
} }
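This `ExcludeNewer` change (and the `SmallString` change later in this diff) follows the schemars 0.8 → 1.0 migration: `schema_name` now returns `Cow<'static, str>`, `json_schema` takes a `schemars::generate::SchemaGenerator` and returns a `schemars::Schema` built with the `json_schema!` macro, and `is_referenceable` gives way to `inline_schema`. A hedged sketch of the same pattern applied to a hypothetical newtype, assuming `schemars = "1"`:

use std::borrow::Cow;

// Hypothetical newtype, used only to illustrate the schemars 1.0 idiom.
#[allow(dead_code)]
struct Token(String);

impl schemars::JsonSchema for Token {
    fn inline_schema() -> bool {
        // Roughly the inverse of the 0.8-era `is_referenceable`: `true` keeps
        // the schema inline instead of referencing it from `$defs`.
        true
    }

    fn schema_name() -> Cow<'static, str> {
        Cow::Borrowed("Token")
    }

    fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
        schemars::json_schema!({
            "type": "string",
            "description": "An opaque token."
        })
    }
}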


@ -24,7 +24,7 @@ impl ForkIndexes {
) -> Result<(), ResolveError> { ) -> Result<(), ResolveError> {
if let Some(previous) = self.0.insert(package_name.clone(), index.clone()) { if let Some(previous) = self.0.insert(package_name.clone(), index.clone()) {
if &previous != index { if &previous != index {
let mut conflicts = vec![previous.url.to_string(), index.url.to_string()]; let mut conflicts = vec![previous.url, index.url.clone()];
conflicts.sort(); conflicts.sort();
return Err(ResolveError::ConflictingIndexesForEnvironment { return Err(ResolveError::ConflictingIndexesForEnvironment {
package_name: package_name.clone(), package_name: package_name.clone(),


@ -2,7 +2,6 @@ use std::collections::hash_map::Entry;
use rustc_hash::FxHashMap; use rustc_hash::FxHashMap;
use uv_distribution_types::Verbatim;
use uv_normalize::PackageName; use uv_normalize::PackageName;
use uv_pypi_types::VerbatimParsedUrl; use uv_pypi_types::VerbatimParsedUrl;
@ -34,10 +33,8 @@ impl ForkUrls {
match self.0.entry(package_name.clone()) { match self.0.entry(package_name.clone()) {
Entry::Occupied(previous) => { Entry::Occupied(previous) => {
if previous.get() != url { if previous.get() != url {
let mut conflicting_url = vec![ let mut conflicting_url =
previous.get().verbatim.verbatim().to_string(), vec![previous.get().parsed_url.clone(), url.parsed_url.clone()];
url.verbatim.verbatim().to_string(),
];
conflicting_url.sort(); conflicting_url.sort();
return Err(ResolveError::ConflictingUrls { return Err(ResolveError::ConflictingUrls {
package_name: package_name.clone(), package_name: package_name.clone(),


@ -1,5 +1,5 @@
pub use dependency_mode::DependencyMode; pub use dependency_mode::DependencyMode;
pub use error::{NoSolutionError, NoSolutionHeader, ResolveError, SentinelRange}; pub use error::{ErrorTree, NoSolutionError, NoSolutionHeader, ResolveError, SentinelRange};
pub use exclude_newer::ExcludeNewer; pub use exclude_newer::ExcludeNewer;
pub use exclusions::Exclusions; pub use exclusions::Exclusions;
pub use flat_index::{FlatDistributions, FlatIndex}; pub use flat_index::{FlatDistributions, FlatIndex};
@ -54,7 +54,7 @@ mod options;
mod pins; mod pins;
mod preferences; mod preferences;
mod prerelease; mod prerelease;
mod pubgrub; pub mod pubgrub;
mod python_requirement; mod python_requirement;
mod redirect; mod redirect;
mod resolution; mod resolution;


@ -1478,9 +1478,11 @@ impl Lock {
if let Source::Registry(index) = &package.id.source { if let Source::Registry(index) = &package.id.source {
match index { match index {
RegistrySource::Url(url) => { RegistrySource::Url(url) => {
// Normalize URL before validating.
let url = url.without_trailing_slash();
if remotes if remotes
.as_ref() .as_ref()
.is_some_and(|remotes| !remotes.contains(url)) .is_some_and(|remotes| !remotes.contains(&url))
{ {
let name = &package.id.name; let name = &package.id.name;
let version = &package let version = &package
@ -1488,7 +1490,11 @@ impl Lock {
.version .version
.as_ref() .as_ref()
.expect("version for registry source"); .expect("version for registry source");
return Ok(SatisfiesResult::MissingRemoteIndex(name, version, url)); return Ok(SatisfiesResult::MissingRemoteIndex(
name,
version,
url.into_owned(),
));
} }
} }
RegistrySource::Path(path) => { RegistrySource::Path(path) => {
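The `without_trailing_slash` call above makes the check that a locked registry URL appears among the provided remote indexes insensitive to a trailing slash, so `https://pypi.org/simple` and `https://pypi.org/simple/` count as the same index. A self-contained illustration of the intent, with plain strings standing in for `UrlString`:

// Plain strings stand in for `UrlString`; the real code normalizes via
// `UrlString::without_trailing_slash` before the `remotes.contains` check.
fn normalized(url: &str) -> &str {
    url.strip_suffix('/').unwrap_or(url)
}

fn main() {
    let remotes = ["https://pypi.org/simple"];
    let locked_index = "https://pypi.org/simple/";
    assert!(remotes.iter().any(|remote| normalized(remote) == normalized(locked_index)));
}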
@ -1793,7 +1799,7 @@ pub enum SatisfiesResult<'lock> {
/// The lockfile is missing a workspace member. /// The lockfile is missing a workspace member.
MissingRoot(PackageName), MissingRoot(PackageName),
/// The lockfile referenced a remote index that was not provided /// The lockfile referenced a remote index that was not provided
MissingRemoteIndex(&'lock PackageName, &'lock Version, &'lock UrlString), MissingRemoteIndex(&'lock PackageName, &'lock Version, UrlString),
/// The lockfile referenced a local index that was not provided /// The lockfile referenced a local index that was not provided
MissingLocalIndex(&'lock PackageName, &'lock Version, &'lock Path), MissingLocalIndex(&'lock PackageName, &'lock Version, &'lock Path),
/// A package in the lockfile contains different `requires-dist` metadata than expected. /// A package in the lockfile contains different `requires-dist` metadata than expected.
@ -2371,7 +2377,13 @@ impl Package {
let sdist = match &self.id.source { let sdist = match &self.id.source {
Source::Path(path) => { Source::Path(path) => {
// A direct path source can also be a wheel, so validate the extension. // A direct path source can also be a wheel, so validate the extension.
let DistExtension::Source(ext) = DistExtension::from_path(path)? else { let DistExtension::Source(ext) = DistExtension::from_path(path).map_err(|err| {
LockErrorKind::MissingExtension {
id: self.id.clone(),
err,
}
})?
else {
return Ok(None); return Ok(None);
}; };
let install_path = absolute_path(workspace_root, path)?; let install_path = absolute_path(workspace_root, path)?;
@ -2444,7 +2456,14 @@ impl Package {
} }
Source::Direct(url, direct) => { Source::Direct(url, direct) => {
// A direct URL source can also be a wheel, so validate the extension. // A direct URL source can also be a wheel, so validate the extension.
let DistExtension::Source(ext) = DistExtension::from_path(url.as_ref())? else { let DistExtension::Source(ext) =
DistExtension::from_path(url.base_str()).map_err(|err| {
LockErrorKind::MissingExtension {
id: self.id.clone(),
err,
}
})?
else {
return Ok(None); return Ok(None);
}; };
let location = url.to_url().map_err(LockErrorKind::InvalidUrl)?; let location = url.to_url().map_err(LockErrorKind::InvalidUrl)?;
@ -2483,7 +2502,12 @@ impl Package {
.ok_or_else(|| LockErrorKind::MissingFilename { .ok_or_else(|| LockErrorKind::MissingFilename {
id: self.id.clone(), id: self.id.clone(),
})?; })?;
let ext = SourceDistExtension::from_path(filename.as_ref())?; let ext = SourceDistExtension::from_path(filename.as_ref()).map_err(|err| {
LockErrorKind::MissingExtension {
id: self.id.clone(),
err,
}
})?;
let file = Box::new(uv_distribution_types::File { let file = Box::new(uv_distribution_types::File {
dist_info_metadata: false, dist_info_metadata: false,
filename: SmallString::from(filename), filename: SmallString::from(filename),
@ -2523,19 +2547,41 @@ impl Package {
.as_ref() .as_ref()
.expect("version for registry source"); .expect("version for registry source");
let file_path = sdist.path().ok_or_else(|| LockErrorKind::MissingPath { let file_url = match sdist {
SourceDist::Url { url: file_url, .. } => {
FileLocation::AbsoluteUrl(file_url.clone())
}
SourceDist::Path {
path: file_path, ..
} => {
let file_path = workspace_root.join(path).join(file_path);
let file_url =
DisplaySafeUrl::from_file_path(&file_path).map_err(|()| {
LockErrorKind::PathToUrl {
path: file_path.into_boxed_path(),
}
})?;
FileLocation::AbsoluteUrl(UrlString::from(file_url))
}
SourceDist::Metadata { .. } => {
return Err(LockErrorKind::MissingPath {
name: name.clone(), name: name.clone(),
version: version.clone(), version: version.clone(),
})?; }
let file_url = .into());
DisplaySafeUrl::from_file_path(workspace_root.join(path).join(file_path)) }
.map_err(|()| LockErrorKind::PathToUrl)?; };
let filename = sdist let filename = sdist
.filename() .filename()
.ok_or_else(|| LockErrorKind::MissingFilename { .ok_or_else(|| LockErrorKind::MissingFilename {
id: self.id.clone(), id: self.id.clone(),
})?; })?;
let ext = SourceDistExtension::from_path(filename.as_ref())?; let ext = SourceDistExtension::from_path(filename.as_ref()).map_err(|err| {
LockErrorKind::MissingExtension {
id: self.id.clone(),
err,
}
})?;
let file = Box::new(uv_distribution_types::File { let file = Box::new(uv_distribution_types::File {
dist_info_metadata: false, dist_info_metadata: false,
filename: SmallString::from(filename), filename: SmallString::from(filename),
@ -2545,9 +2591,10 @@ impl Package {
requires_python: None, requires_python: None,
size: sdist.size(), size: sdist.size(),
upload_time_utc_ms: sdist.upload_time().map(Timestamp::as_millisecond), upload_time_utc_ms: sdist.upload_time().map(Timestamp::as_millisecond),
url: FileLocation::AbsoluteUrl(UrlString::from(file_url)), url: file_url,
yanked: None, yanked: None,
}); });
let index = IndexUrl::from( let index = IndexUrl::from(
VerbatimUrl::from_absolute_path(workspace_root.join(path)) VerbatimUrl::from_absolute_path(workspace_root.join(path))
.map_err(LockErrorKind::RegistryVerbatimUrl)?, .map_err(LockErrorKind::RegistryVerbatimUrl)?,
@ -3227,7 +3274,9 @@ impl Source {
Ok(Source::Registry(source)) Ok(Source::Registry(source))
} }
IndexUrl::Path(url) => { IndexUrl::Path(url) => {
let path = url.to_file_path().map_err(|()| LockErrorKind::UrlToPath)?; let path = url
.to_file_path()
.map_err(|()| LockErrorKind::UrlToPath { url: url.to_url() })?;
let path = relative_to(&path, root) let path = relative_to(&path, root)
.or_else(|_| std::path::absolute(&path)) .or_else(|_| std::path::absolute(&path))
.map_err(LockErrorKind::IndexRelativePath)?; .map_err(LockErrorKind::IndexRelativePath)?;
@ -3660,14 +3709,6 @@ impl SourceDist {
} }
} }
fn path(&self) -> Option<&Path> {
match &self {
SourceDist::Metadata { .. } => None,
SourceDist::Url { .. } => None,
SourceDist::Path { path, .. } => Some(path),
}
}
pub(crate) fn hash(&self) -> Option<&Hash> { pub(crate) fn hash(&self) -> Option<&Hash> {
match &self { match &self {
SourceDist::Metadata { metadata } => metadata.hash.as_ref(), SourceDist::Metadata { metadata } => metadata.hash.as_ref(),
@ -3787,14 +3828,19 @@ impl SourceDist {
})) }))
} }
IndexUrl::Path(path) => { IndexUrl::Path(path) => {
let index_path = path.to_file_path().map_err(|()| LockErrorKind::UrlToPath)?; let index_path = path
let reg_dist_path = reg_dist .to_file_path()
.map_err(|()| LockErrorKind::UrlToPath { url: path.to_url() })?;
let url = reg_dist
.file .file
.url .url
.to_url() .to_url()
.map_err(LockErrorKind::InvalidUrl)? .map_err(LockErrorKind::InvalidUrl)?;
if url.scheme() == "file" {
let reg_dist_path = url
.to_file_path() .to_file_path()
.map_err(|()| LockErrorKind::UrlToPath)?; .map_err(|()| LockErrorKind::UrlToPath { url })?;
let path = relative_to(&reg_dist_path, index_path) let path = relative_to(&reg_dist_path, index_path)
.or_else(|_| std::path::absolute(&reg_dist_path)) .or_else(|_| std::path::absolute(&reg_dist_path))
.map_err(LockErrorKind::DistributionRelativePath)? .map_err(LockErrorKind::DistributionRelativePath)?
@ -3815,6 +3861,27 @@ impl SourceDist {
upload_time, upload_time,
}, },
})) }))
} else {
let url = normalize_file_location(&reg_dist.file.url)
.map_err(LockErrorKind::InvalidUrl)
.map_err(LockError::from)?;
let hash = reg_dist.file.hashes.iter().max().cloned().map(Hash::from);
let size = reg_dist.file.size;
let upload_time = reg_dist
.file
.upload_time_utc_ms
.map(Timestamp::from_millisecond)
.transpose()
.map_err(LockErrorKind::InvalidTimestamp)?;
Ok(Some(SourceDist::Url {
url,
metadata: SourceDistMetadata {
hash,
size,
upload_time,
},
}))
}
} }
} }
} }
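Both the source-distribution branch above and the wheel branch further down now dispatch on the scheme of the URL recorded for a path-based registry: `file://` URLs are re-expressed relative to the local index, while any other scheme is kept as an absolute URL (the new `SourceDist::Url` and `WheelWireSource::Url` arms). A small sketch of that dispatch, written against the standard `url` crate for illustration:

use url::Url;

// Illustrative dispatch: `file://` distributions become paths relative to the
// index; anything else stays an absolute URL.
fn classify(recorded: &str) -> &'static str {
    let url = Url::parse(recorded).expect("valid URL");
    if url.scheme() == "file" {
        "store as a path relative to the local index"
    } else {
        "store as an absolute URL"
    }
}

fn main() {
    assert_eq!(
        classify("file:///srv/index/pkg-1.0.tar.gz"),
        "store as a path relative to the local index"
    );
    assert_eq!(
        classify("https://example.org/packages/pkg-1.0.tar.gz"),
        "store as an absolute URL"
    );
}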
@ -4117,14 +4184,15 @@ impl Wheel {
}) })
} }
IndexUrl::Path(path) => { IndexUrl::Path(path) => {
let index_path = path.to_file_path().map_err(|()| LockErrorKind::UrlToPath)?; let index_path = path
let wheel_path = wheel
.file
.url
.to_url()
.map_err(LockErrorKind::InvalidUrl)?
.to_file_path() .to_file_path()
.map_err(|()| LockErrorKind::UrlToPath)?; .map_err(|()| LockErrorKind::UrlToPath { url: path.to_url() })?;
let wheel_url = wheel.file.url.to_url().map_err(LockErrorKind::InvalidUrl)?;
if wheel_url.scheme() == "file" {
let wheel_path = wheel_url
.to_file_path()
.map_err(|()| LockErrorKind::UrlToPath { url: wheel_url })?;
let path = relative_to(&wheel_path, index_path) let path = relative_to(&wheel_path, index_path)
.or_else(|_| std::path::absolute(&wheel_path)) .or_else(|_| std::path::absolute(&wheel_path))
.map_err(LockErrorKind::DistributionRelativePath)? .map_err(LockErrorKind::DistributionRelativePath)?
@ -4136,6 +4204,26 @@ impl Wheel {
upload_time: None, upload_time: None,
filename, filename,
}) })
} else {
let url = normalize_file_location(&wheel.file.url)
.map_err(LockErrorKind::InvalidUrl)
.map_err(LockError::from)?;
let hash = wheel.file.hashes.iter().max().cloned().map(Hash::from);
let size = wheel.file.size;
let upload_time = wheel
.file
.upload_time_utc_ms
.map(Timestamp::from_millisecond)
.transpose()
.map_err(LockErrorKind::InvalidTimestamp)?;
Ok(Wheel {
url: WheelWireSource::Url { url },
hash,
size,
filename,
upload_time,
})
}
} }
} }
} }
@ -4173,8 +4261,10 @@ impl Wheel {
match source { match source {
RegistrySource::Url(url) => { RegistrySource::Url(url) => {
let file_url = match &self.url { let file_location = match &self.url {
WheelWireSource::Url { url } => url, WheelWireSource::Url { url: file_url } => {
FileLocation::AbsoluteUrl(file_url.clone())
}
WheelWireSource::Path { .. } | WheelWireSource::Filename { .. } => { WheelWireSource::Path { .. } | WheelWireSource::Filename { .. } => {
return Err(LockErrorKind::MissingUrl { return Err(LockErrorKind::MissingUrl {
name: filename.name, name: filename.name,
@ -4190,7 +4280,7 @@ impl Wheel {
requires_python: None, requires_python: None,
size: self.size, size: self.size,
upload_time_utc_ms: self.upload_time.map(Timestamp::as_millisecond), upload_time_utc_ms: self.upload_time.map(Timestamp::as_millisecond),
url: FileLocation::AbsoluteUrl(file_url.clone()), url: file_location,
yanked: None, yanked: None,
}); });
let index = IndexUrl::from(VerbatimUrl::from_url( let index = IndexUrl::from(VerbatimUrl::from_url(
@ -4203,9 +4293,21 @@ impl Wheel {
}) })
} }
RegistrySource::Path(index_path) => { RegistrySource::Path(index_path) => {
let file_path = match &self.url { let file_location = match &self.url {
WheelWireSource::Path { path } => path, WheelWireSource::Url { url: file_url } => {
WheelWireSource::Url { .. } | WheelWireSource::Filename { .. } => { FileLocation::AbsoluteUrl(file_url.clone())
}
WheelWireSource::Path { path: file_path } => {
let file_path = root.join(index_path).join(file_path);
let file_url =
DisplaySafeUrl::from_file_path(&file_path).map_err(|()| {
LockErrorKind::PathToUrl {
path: file_path.into_boxed_path(),
}
})?;
FileLocation::AbsoluteUrl(UrlString::from(file_url))
}
WheelWireSource::Filename { .. } => {
return Err(LockErrorKind::MissingPath { return Err(LockErrorKind::MissingPath {
name: filename.name, name: filename.name,
version: filename.version, version: filename.version,
@ -4213,9 +4315,6 @@ impl Wheel {
.into()); .into());
} }
}; };
let file_url =
DisplaySafeUrl::from_file_path(root.join(index_path).join(file_path))
.map_err(|()| LockErrorKind::PathToUrl)?;
let file = Box::new(uv_distribution_types::File { let file = Box::new(uv_distribution_types::File {
dist_info_metadata: false, dist_info_metadata: false,
filename: SmallString::from(filename.to_string()), filename: SmallString::from(filename.to_string()),
@ -4223,7 +4322,7 @@ impl Wheel {
requires_python: None, requires_python: None,
size: self.size, size: self.size,
upload_time_utc_ms: self.upload_time.map(Timestamp::as_millisecond), upload_time_utc_ms: self.upload_time.map(Timestamp::as_millisecond),
url: FileLocation::AbsoluteUrl(UrlString::from(file_url)), url: file_location,
yanked: None, yanked: None,
}); });
let index = IndexUrl::from( let index = IndexUrl::from(
@ -4597,7 +4696,7 @@ impl From<Hash> for Hashes {
/// Convert a [`FileLocation`] into a normalized [`UrlString`]. /// Convert a [`FileLocation`] into a normalized [`UrlString`].
fn normalize_file_location(location: &FileLocation) -> Result<UrlString, ToUrlError> { fn normalize_file_location(location: &FileLocation) -> Result<UrlString, ToUrlError> {
match location { match location {
FileLocation::AbsoluteUrl(absolute) => Ok(absolute.without_fragment()), FileLocation::AbsoluteUrl(absolute) => Ok(absolute.without_fragment().into_owned()),
FileLocation::RelativeUrl(_, _) => Ok(normalize_url(location.to_url()?)), FileLocation::RelativeUrl(_, _) => Ok(normalize_url(location.to_url()?)),
} }
} }
@ -5222,8 +5321,13 @@ enum LockErrorKind {
), ),
/// An error that occurs when the extension can't be determined /// An error that occurs when the extension can't be determined
/// for a given wheel or source distribution. /// for a given wheel or source distribution.
#[error("Failed to parse file extension; expected one of: {0}")] #[error("Failed to parse file extension for `{id}`; expected one of: {err}", id = id.cyan())]
MissingExtension(#[from] ExtensionError), MissingExtension {
/// The filename that was expected to have an extension.
id: PackageId,
/// The list of valid extensions that were expected.
err: ExtensionError,
},
/// Failed to parse a Git source URL. /// Failed to parse a Git source URL.
#[error("Failed to parse Git URL")] #[error("Failed to parse Git URL")]
InvalidGitSourceUrl( InvalidGitSourceUrl(
@ -5421,11 +5525,11 @@ enum LockErrorKind {
VerbatimUrlError, VerbatimUrlError,
), ),
/// An error that occurs when converting a path to a URL. /// An error that occurs when converting a path to a URL.
#[error("Failed to convert path to URL")] #[error("Failed to convert path to URL: {path}", path = path.display().cyan())]
PathToUrl, PathToUrl { path: Box<Path> },
/// An error that occurs when converting a URL to a path /// An error that occurs when converting a URL to a path
#[error("Failed to convert URL to path")] #[error("Failed to convert URL to path: {url}", url = url.cyan())]
UrlToPath, UrlToPath { url: DisplaySafeUrl },
/// An error that occurs when multiple packages with the same /// An error that occurs when multiple packages with the same
/// name were found when identifying the root packages. /// name were found when identifying the root packages.
#[error("Found multiple packages matching `{name}`", name = name.cyan())] #[error("Found multiple packages matching `{name}`", name = name.cyan())]
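The reworked `LockErrorKind` variants above follow one pattern: the offending value (package id, path, or URL) is carried on the variant and interpolated, colorized, into the message rather than being dropped. A hedged sketch of that pattern on an illustrative error type; `thiserror` and `owo_colors` are used exactly as in the diff:

use std::path::Path;

use owo_colors::OwoColorize;
use thiserror::Error;

// Illustrative only: the failing value lives on the variant and appears in the
// rendered message, mirroring the new `PathToUrl { path }` shape.
#[derive(Debug, Error)]
enum ExampleLockError {
    #[error("Failed to convert path to URL: {path}", path = path.display().cyan())]
    PathToUrl { path: Box<Path> },
}

fn main() {
    let err = ExampleLockError::PathToUrl {
        path: Path::new("relative/pkg-1.0.tar.gz").into(),
    };
    eprintln!("{err}");
}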


@ -34,6 +34,8 @@ pub struct Preference {
/// is part of, otherwise `None`. /// is part of, otherwise `None`.
fork_markers: Vec<UniversalMarker>, fork_markers: Vec<UniversalMarker>,
hashes: HashDigests, hashes: HashDigests,
/// The source of the preference.
source: PreferenceSource,
} }
impl Preference { impl Preference {
@ -73,6 +75,7 @@ impl Preference {
.map(String::as_str) .map(String::as_str)
.map(HashDigest::from_str) .map(HashDigest::from_str)
.collect::<Result<_, _>>()?, .collect::<Result<_, _>>()?,
source: PreferenceSource::RequirementsTxt,
})) }))
} }
@ -91,6 +94,7 @@ impl Preference {
index: PreferenceIndex::from(package.index(install_path)?), index: PreferenceIndex::from(package.index(install_path)?),
fork_markers: package.fork_markers().to_vec(), fork_markers: package.fork_markers().to_vec(),
hashes: HashDigests::empty(), hashes: HashDigests::empty(),
source: PreferenceSource::Lock,
})) }))
} }
@ -112,6 +116,7 @@ impl Preference {
// `pylock.toml` doesn't have fork annotations. // `pylock.toml` doesn't have fork annotations.
fork_markers: vec![], fork_markers: vec![],
hashes: HashDigests::empty(), hashes: HashDigests::empty(),
source: PreferenceSource::Lock,
})) }))
} }
@ -127,6 +132,7 @@ impl Preference {
index: PreferenceIndex::Any, index: PreferenceIndex::Any,
fork_markers: vec![], fork_markers: vec![],
hashes: HashDigests::empty(), hashes: HashDigests::empty(),
source: PreferenceSource::Environment,
}) })
} }
@ -171,11 +177,24 @@ impl From<Option<IndexUrl>> for PreferenceIndex {
} }
} }
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub(crate) enum PreferenceSource {
/// The preference is from an installed package in the environment.
Environment,
/// The preference is from a `uv.lock` file.
Lock,
/// The preference is from a `requirements.txt` file.
RequirementsTxt,
/// The preference is from the current solve.
Resolver,
}
#[derive(Debug, Clone)] #[derive(Debug, Clone)]
pub(crate) struct Entry { pub(crate) struct Entry {
marker: UniversalMarker, marker: UniversalMarker,
index: PreferenceIndex, index: PreferenceIndex,
pin: Pin, pin: Pin,
source: PreferenceSource,
} }
impl Entry { impl Entry {
@ -193,6 +212,11 @@ impl Entry {
pub(crate) fn pin(&self) -> &Pin { pub(crate) fn pin(&self) -> &Pin {
&self.pin &self.pin
} }
/// Return the source of the entry.
pub(crate) fn source(&self) -> PreferenceSource {
self.source
}
} }
/// A set of pinned packages that should be preserved during resolution, if possible. /// A set of pinned packages that should be preserved during resolution, if possible.
@ -245,6 +269,7 @@ impl Preferences {
version: preference.version, version: preference.version,
hashes: preference.hashes, hashes: preference.hashes,
}, },
source: preference.source,
}); });
} else { } else {
for fork_marker in preference.fork_markers { for fork_marker in preference.fork_markers {
@ -255,6 +280,7 @@ impl Preferences {
version: preference.version.clone(), version: preference.version.clone(),
hashes: preference.hashes.clone(), hashes: preference.hashes.clone(),
}, },
source: preference.source,
}); });
} }
} }
@ -270,11 +296,13 @@ impl Preferences {
index: Option<IndexUrl>, index: Option<IndexUrl>,
markers: UniversalMarker, markers: UniversalMarker,
pin: impl Into<Pin>, pin: impl Into<Pin>,
source: PreferenceSource,
) { ) {
self.0.entry(package_name).or_default().push(Entry { self.0.entry(package_name).or_default().push(Entry {
marker: markers, marker: markers,
index: PreferenceIndex::from(index), index: PreferenceIndex::from(index),
pin: pin.into(), pin: pin.into(),
source,
}); });
} }


@ -1,6 +1,6 @@
pub(crate) use crate::pubgrub::dependencies::PubGrubDependency; pub(crate) use crate::pubgrub::dependencies::PubGrubDependency;
pub(crate) use crate::pubgrub::distribution::PubGrubDistribution; pub(crate) use crate::pubgrub::distribution::PubGrubDistribution;
pub(crate) use crate::pubgrub::package::{PubGrubPackage, PubGrubPackageInner, PubGrubPython}; pub use crate::pubgrub::package::{PubGrubPackage, PubGrubPackageInner, PubGrubPython};
pub(crate) use crate::pubgrub::priority::{PubGrubPriorities, PubGrubPriority, PubGrubTiebreaker}; pub(crate) use crate::pubgrub::priority::{PubGrubPriorities, PubGrubPriority, PubGrubTiebreaker};
pub(crate) use crate::pubgrub::report::PubGrubReportFormatter; pub(crate) use crate::pubgrub::report::PubGrubReportFormatter;


@ -9,7 +9,7 @@ use crate::python_requirement::PythonRequirement;
/// [`Arc`] wrapper around [`PubGrubPackageInner`] to make cloning (inside PubGrub) cheap. /// [`Arc`] wrapper around [`PubGrubPackageInner`] to make cloning (inside PubGrub) cheap.
#[derive(Debug, Clone, Eq, Hash, PartialEq, PartialOrd, Ord)] #[derive(Debug, Clone, Eq, Hash, PartialEq, PartialOrd, Ord)]
pub(crate) struct PubGrubPackage(Arc<PubGrubPackageInner>); pub struct PubGrubPackage(Arc<PubGrubPackageInner>);
impl Deref for PubGrubPackage { impl Deref for PubGrubPackage {
type Target = PubGrubPackageInner; type Target = PubGrubPackageInner;
@ -39,7 +39,7 @@ impl From<PubGrubPackageInner> for PubGrubPackage {
/// package (e.g., `black[colorama]`), and mark it as a dependency of the real package (e.g., /// package (e.g., `black[colorama]`), and mark it as a dependency of the real package (e.g.,
/// `black`). We then discard the virtual packages at the end of the resolution process. /// `black`). We then discard the virtual packages at the end of the resolution process.
#[derive(Debug, Clone, Eq, Hash, PartialEq, PartialOrd, Ord)] #[derive(Debug, Clone, Eq, Hash, PartialEq, PartialOrd, Ord)]
pub(crate) enum PubGrubPackageInner { pub enum PubGrubPackageInner {
/// The root package, which is used to start the resolution process. /// The root package, which is used to start the resolution process.
Root(Option<PackageName>), Root(Option<PackageName>),
/// A Python version. /// A Python version.
@ -295,7 +295,7 @@ impl PubGrubPackage {
} }
#[derive(Debug, Copy, Clone, Eq, PartialEq, PartialOrd, Hash, Ord)] #[derive(Debug, Copy, Clone, Eq, PartialEq, PartialOrd, Hash, Ord)]
pub(crate) enum PubGrubPython { pub enum PubGrubPython {
/// The Python version installed in the current environment. /// The Python version installed in the current environment.
Installed, Installed,
/// The Python version for which dependencies are being resolved. /// The Python version for which dependencies are being resolved.


@ -18,7 +18,7 @@ use uv_pep440::{Version, VersionSpecifiers};
use uv_platform_tags::{AbiTag, IncompatibleTag, LanguageTag, PlatformTag, Tags}; use uv_platform_tags::{AbiTag, IncompatibleTag, LanguageTag, PlatformTag, Tags};
use crate::candidate_selector::CandidateSelector; use crate::candidate_selector::CandidateSelector;
use crate::error::ErrorTree; use crate::error::{ErrorTree, PrefixMatch};
use crate::fork_indexes::ForkIndexes; use crate::fork_indexes::ForkIndexes;
use crate::fork_urls::ForkUrls; use crate::fork_urls::ForkUrls;
use crate::prerelease::AllowPrerelease; use crate::prerelease::AllowPrerelease;
@ -944,17 +944,30 @@ impl PubGrubReportFormatter<'_> {
hints: &mut IndexSet<PubGrubHint>, hints: &mut IndexSet<PubGrubHint>,
) { ) {
let any_prerelease = set.iter().any(|(start, end)| { let any_prerelease = set.iter().any(|(start, end)| {
// Ignore, e.g., `>=2.4.dev0,<2.5.dev0`, which is the desugared form of `==2.4.*`.
if PrefixMatch::from_range(start, end).is_some() {
return false;
}
let is_pre1 = match start { let is_pre1 = match start {
Bound::Included(version) => version.any_prerelease(), Bound::Included(version) => version.any_prerelease(),
Bound::Excluded(version) => version.any_prerelease(), Bound::Excluded(version) => version.any_prerelease(),
Bound::Unbounded => false, Bound::Unbounded => false,
}; };
if is_pre1 {
return true;
}
let is_pre2 = match end { let is_pre2 = match end {
Bound::Included(version) => version.any_prerelease(), Bound::Included(version) => version.any_prerelease(),
Bound::Excluded(version) => version.any_prerelease(), Bound::Excluded(version) => version.any_prerelease(),
Bound::Unbounded => false, Bound::Unbounded => false,
}; };
is_pre1 || is_pre2 if is_pre2 {
return true;
}
false
}); });
if any_prerelease { if any_prerelease {
@ -1928,11 +1941,11 @@ impl std::fmt::Display for PackageRange<'_> {
PackageRangeKind::Available => write!(f, "are available:")?, PackageRangeKind::Available => write!(f, "are available:")?,
} }
} }
for segment in &segments { for (lower, upper) in &segments {
if segments.len() > 1 { if segments.len() > 1 {
write!(f, "\n ")?; write!(f, "\n ")?;
} }
match segment { match (lower, upper) {
(Bound::Unbounded, Bound::Unbounded) => match self.kind { (Bound::Unbounded, Bound::Unbounded) => match self.kind {
PackageRangeKind::Dependency => write!(f, "{package}")?, PackageRangeKind::Dependency => write!(f, "{package}")?,
PackageRangeKind::Compatibility => write!(f, "all versions of {package}")?, PackageRangeKind::Compatibility => write!(f, "all versions of {package}")?,
@ -1948,7 +1961,13 @@ impl std::fmt::Display for PackageRange<'_> {
write!(f, "{package}>={v},<={b}")?; write!(f, "{package}>={v},<={b}")?;
} }
} }
(Bound::Included(v), Bound::Excluded(b)) => write!(f, "{package}>={v},<{b}")?, (Bound::Included(v), Bound::Excluded(b)) => {
if let Some(prefix) = PrefixMatch::from_range(lower, upper) {
write!(f, "{package}{prefix}")?;
} else {
write!(f, "{package}>={v},<{b}")?;
}
}
(Bound::Excluded(v), Bound::Unbounded) => write!(f, "{package}>{v}")?, (Bound::Excluded(v), Bound::Unbounded) => write!(f, "{package}>{v}")?,
(Bound::Excluded(v), Bound::Included(b)) => write!(f, "{package}>{v},<={b}")?, (Bound::Excluded(v), Bound::Included(b)) => write!(f, "{package}>{v},<={b}")?,
(Bound::Excluded(v), Bound::Excluded(b)) => write!(f, "{package}>{v},<{b}")?, (Bound::Excluded(v), Bound::Excluded(b)) => write!(f, "{package}>{v},<{b}")?,


@ -7,7 +7,7 @@ use uv_platform_tags::{AbiTag, Tags};
/// The reason why a package or a version cannot be used. /// The reason why a package or a version cannot be used.
#[derive(Debug, Clone, Eq, PartialEq)] #[derive(Debug, Clone, Eq, PartialEq)]
pub(crate) enum UnavailableReason { pub enum UnavailableReason {
/// The entire package cannot be used. /// The entire package cannot be used.
Package(UnavailablePackage), Package(UnavailablePackage),
/// A single version cannot be used. /// A single version cannot be used.
@ -29,7 +29,7 @@ impl Display for UnavailableReason {
/// Most variant are from [`MetadataResponse`] without the error source, since we don't format /// Most variant are from [`MetadataResponse`] without the error source, since we don't format
/// the source and we want to merge unavailable messages across versions. /// the source and we want to merge unavailable messages across versions.
#[derive(Debug, Clone, Eq, PartialEq)] #[derive(Debug, Clone, Eq, PartialEq)]
pub(crate) enum UnavailableVersion { pub enum UnavailableVersion {
/// Version is incompatible because it has no usable distributions /// Version is incompatible because it has no usable distributions
IncompatibleDist(IncompatibleDist), IncompatibleDist(IncompatibleDist),
/// The wheel metadata was found, but could not be parsed. /// The wheel metadata was found, but could not be parsed.
@ -123,7 +123,7 @@ impl From<&MetadataUnavailable> for UnavailableVersion {
/// The package is unavailable and cannot be used. /// The package is unavailable and cannot be used.
#[derive(Debug, Clone, Eq, PartialEq)] #[derive(Debug, Clone, Eq, PartialEq)]
pub(crate) enum UnavailablePackage { pub enum UnavailablePackage {
/// Index lookups were disabled (i.e., `--no-index`) and the package was not found in a flat index (i.e. from `--find-links`). /// Index lookups were disabled (i.e., `--no-index`) and the package was not found in a flat index (i.e. from `--find-links`).
NoIndex, NoIndex,
/// Network requests were disabled (i.e., `--offline`), and the package was not found in the cache. /// Network requests were disabled (i.e., `--offline`), and the package was not found in the cache.


@ -47,7 +47,7 @@ use crate::fork_strategy::ForkStrategy;
use crate::fork_urls::ForkUrls; use crate::fork_urls::ForkUrls;
use crate::manifest::Manifest; use crate::manifest::Manifest;
use crate::pins::FilePins; use crate::pins::FilePins;
use crate::preferences::Preferences; use crate::preferences::{PreferenceSource, Preferences};
use crate::pubgrub::{ use crate::pubgrub::{
PubGrubDependency, PubGrubDistribution, PubGrubPackage, PubGrubPackageInner, PubGrubPriorities, PubGrubDependency, PubGrubDistribution, PubGrubPackage, PubGrubPackageInner, PubGrubPriorities,
PubGrubPython, PubGrubPython,
@ -447,6 +447,7 @@ impl<InstalledPackages: InstalledPackagesProvider> ResolverState<InstalledPackag
.try_universal_markers() .try_universal_markers()
.unwrap_or(UniversalMarker::TRUE), .unwrap_or(UniversalMarker::TRUE),
version.clone(), version.clone(),
PreferenceSource::Resolver,
); );
} }
} }


@ -4,7 +4,6 @@ use same_file::is_same_file;
use tracing::debug; use tracing::debug;
use uv_cache_key::CanonicalUrl; use uv_cache_key::CanonicalUrl;
use uv_distribution_types::Verbatim;
use uv_git::GitResolver; use uv_git::GitResolver;
use uv_normalize::PackageName; use uv_normalize::PackageName;
use uv_pep508::{MarkerTree, VerbatimUrl}; use uv_pep508::{MarkerTree, VerbatimUrl};
@ -170,8 +169,8 @@ impl Urls {
let [allowed_url] = matching_urls.as_slice() else { let [allowed_url] = matching_urls.as_slice() else {
let mut conflicting_urls: Vec<_> = matching_urls let mut conflicting_urls: Vec<_> = matching_urls
.into_iter() .into_iter()
.map(|parsed_url| parsed_url.verbatim.verbatim().to_string()) .map(|parsed_url| parsed_url.parsed_url.clone())
.chain(std::iter::once(verbatim_url.verbatim().to_string())) .chain(std::iter::once(parsed_url.clone()))
.collect(); .collect();
conflicting_urls.sort(); conflicting_urls.sort();
return Err(ResolveError::ConflictingUrls { return Err(ResolveError::ConflictingUrls {


@ -41,6 +41,7 @@ pub(crate) struct Tools {
#[derive(Debug, Clone, Default, Deserialize, CombineOptions, OptionsMetadata)] #[derive(Debug, Clone, Default, Deserialize, CombineOptions, OptionsMetadata)]
#[serde(from = "OptionsWire", rename_all = "kebab-case")] #[serde(from = "OptionsWire", rename_all = "kebab-case")]
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))] #[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
#[cfg_attr(feature = "schemars", schemars(!from))]
pub struct Options { pub struct Options {
#[serde(flatten)] #[serde(flatten)]
pub globals: GlobalOptions, pub globals: GlobalOptions,


@ -147,15 +147,15 @@ impl PartialOrd<SmallString> for rkyv::string::ArchivedString {
/// An [`schemars::JsonSchema`] implementation for [`SmallString`]. /// An [`schemars::JsonSchema`] implementation for [`SmallString`].
#[cfg(feature = "schemars")] #[cfg(feature = "schemars")]
impl schemars::JsonSchema for SmallString { impl schemars::JsonSchema for SmallString {
fn is_referenceable() -> bool { fn inline_schema() -> bool {
String::is_referenceable() true
} }
fn schema_name() -> String { fn schema_name() -> Cow<'static, str> {
String::schema_name() String::schema_name()
} }
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema { fn json_schema(generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
String::json_schema(_gen) String::json_schema(generator)
} }
} }


@ -359,10 +359,6 @@ impl EnvVars {
#[attr_hidden] #[attr_hidden]
pub const UV_INTERNAL__SHOW_DERIVATION_TREE: &'static str = "UV_INTERNAL__SHOW_DERIVATION_TREE"; pub const UV_INTERNAL__SHOW_DERIVATION_TREE: &'static str = "UV_INTERNAL__SHOW_DERIVATION_TREE";
/// Used to set a temporary directory for some tests.
#[attr_hidden]
pub const UV_INTERNAL__TEST_DIR: &'static str = "UV_INTERNAL__TEST_DIR";
/// Path to system-level configuration directory on Unix systems. /// Path to system-level configuration directory on Unix systems.
pub const XDG_CONFIG_DIRS: &'static str = "XDG_CONFIG_DIRS"; pub const XDG_CONFIG_DIRS: &'static str = "XDG_CONFIG_DIRS";
@ -667,6 +663,10 @@ impl EnvVars {
#[attr_hidden] #[attr_hidden]
pub const UV_TEST_INDEX_URL: &'static str = "UV_TEST_INDEX_URL"; pub const UV_TEST_INDEX_URL: &'static str = "UV_TEST_INDEX_URL";
/// Used to set the GitHub fast-path url for tests.
#[attr_hidden]
pub const UV_GITHUB_FAST_PATH_URL: &'static str = "UV_GITHUB_FAST_PATH_URL";
/// Hide progress messages with non-deterministic order in tests. /// Hide progress messages with non-deterministic order in tests.
#[attr_hidden] #[attr_hidden]
pub const UV_TEST_NO_CLI_PROGRESS: &'static str = "UV_TEST_NO_CLI_PROGRESS"; pub const UV_TEST_NO_CLI_PROGRESS: &'static str = "UV_TEST_NO_CLI_PROGRESS";


@ -521,7 +521,7 @@ if __name__ == "__main__":
} }
#[test] #[test]
#[ignore] #[ignore = "This test will spawn a GUI and wait until you close the window."]
fn gui_launcher() -> Result<()> { fn gui_launcher() -> Result<()> {
// Create Temp Dirs // Create Temp Dirs
let temp_dir = assert_fs::TempDir::new()?; let temp_dir = assert_fs::TempDir::new()?;

Some files were not shown because too many files have changed in this diff.