Compare commits

...

250 commits
0.7.10 ... main

Author SHA1 Message Date
renovate[bot]
ddb1577a93
Update Rust crate reqwest to v0.12.22 (#14475)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [reqwest](https://redirect.github.com/seanmonstar/reqwest) | workspace.dependencies | patch | `=0.12.15` -> `=0.12.22` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>seanmonstar/reqwest (reqwest)</summary>

### [`v0.12.22`](https://redirect.github.com/seanmonstar/reqwest/blob/HEAD/CHANGELOG.md#v01222)

[Compare
Source](https://redirect.github.com/seanmonstar/reqwest/compare/v0.12.21...v0.12.22)

- Fix socks proxies when resolving IPv6 destinations.

### [`v0.12.21`](https://redirect.github.com/seanmonstar/reqwest/blob/HEAD/CHANGELOG.md#v01221)

[Compare
Source](https://redirect.github.com/seanmonstar/reqwest/compare/v0.12.20...v0.12.21)

- Fix socks proxy to use `socks4a://` instead of `socks4h://`.
- Fix `Error::is_timeout()` to check for hyper and IO timeouts too.
- Fix request `Error` to again include URLs when possible.
- Fix socks connect error to include more context.
- (wasm) implement `Default` for `Body`.

### [`v0.12.20`](https://redirect.github.com/seanmonstar/reqwest/blob/HEAD/CHANGELOG.md#v01220)

[Compare
Source](https://redirect.github.com/seanmonstar/reqwest/compare/v0.12.19...v0.12.20)

- Add `ClientBuilder::tcp_user_timeout(Duration)` option to set
`TCP_USER_TIMEOUT`.
- Fix proxy headers only using the first matched proxy.
- (wasm) Fix re-adding `Error::is_status()`.

### [`v0.12.19`](https://redirect.github.com/seanmonstar/reqwest/blob/HEAD/CHANGELOG.md#v01219)

[Compare
Source](https://redirect.github.com/seanmonstar/reqwest/compare/v0.12.18...v0.12.19)

- Fix redirects that change the method to GET so that payload headers are
removed.
- Fix redirect to only check the next scheme if the policy action is to
follow.
- (wasm) Fix compilation error if `cookies` feature is enabled (by the
way, it's a noop feature in wasm).

### [`v0.12.18`](https://redirect.github.com/seanmonstar/reqwest/blob/HEAD/CHANGELOG.md#v01218)

[Compare
Source](https://redirect.github.com/seanmonstar/reqwest/compare/v0.12.17...v0.12.18)

- Fix compilation when `socks` enabled without TLS.

### [`v0.12.17`](https://redirect.github.com/seanmonstar/reqwest/blob/HEAD/CHANGELOG.md#v01217)

[Compare
Source](https://redirect.github.com/seanmonstar/reqwest/compare/v0.12.16...v0.12.17)

- Fix compilation on macOS.

### [`v0.12.16`](https://redirect.github.com/seanmonstar/reqwest/blob/HEAD/CHANGELOG.md#v01216)

[Compare
Source](https://redirect.github.com/seanmonstar/reqwest/compare/v0.12.15...v0.12.16)

- Add `ClientBuilder::http3_congestion_bbr()` to enable BBR congestion
control.
- Add `ClientBuilder::http3_send_grease()` to configure whether to send
QUIC grease.
- Add `ClientBuilder::http3_max_field_section_size()` to configure the
maximum response headers.
- Add `ClientBuilder::tcp_keepalive_interval()` to configure TCP probe
interval.
- Add `ClientBuilder::tcp_keepalive_retries()` to configure TCP probe
count.
- Add `Proxy::headers()` to add extra headers that should be sent to a
proxy.
- Fix `redirect::Policy::limit()` which had an off-by-1 error, allowing
1 more redirect than specified.
- Fix HTTP/3 to support streaming request bodies.
- (wasm) Fix null bodies when calling `Response::bytes_stream()`.

</details>
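As a rough illustration of the new `ClientBuilder` keep-alive knobs listed in the v0.12.16 notes above (the durations and retry count below are illustrative, not taken from this repository):

```rust
use std::time::Duration;

// Minimal sketch using the ClientBuilder options added in reqwest 0.12.16;
// the concrete values here are illustrative only.
fn build_client() -> reqwest::Result<reqwest::Client> {
    reqwest::Client::builder()
        // Send TCP keep-alive probes every 15 seconds...
        .tcp_keepalive_interval(Duration::from_secs(15))
        // ...and give up on the connection after 3 unanswered probes.
        .tcp_keepalive_retries(3u32)
        .build()
}
```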

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box


---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/uv).


---

Closes #14243

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: konstin <konstin@mailbox.org>
2025-07-07 13:39:57 +02:00
John Mumm
d31e6ad7c7
Move fragment preservation test to directly test our redirect handling logic (#14480)
When [updating](https://github.com/astral-sh/uv/pull/14475) to the
latest `reqwest` version, our fragment propagation test broke. That test
was partially testing the `reqwest` behavior, so this PR moves the
fragment test to directly test our logic for constructing redirect
requests.
2025-07-07 12:51:21 +02:00
renovate[bot]
3a77b9cdd9
Update aws-actions/configure-aws-credentials digest to f503a18 (#14473)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| aws-actions/configure-aws-credentials | action | digest | `3d8cba3` -> `f503a18` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/uv).


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 11:31:31 +02:00
renovate[bot]
1d027bd92a
Update pre-commit dependencies (#14474)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [astral-sh/ruff-pre-commit](https://redirect.github.com/astral-sh/ruff-pre-commit) | repository | patch | `v0.12.1` -> `v0.12.2` |
| [crate-ci/typos](https://redirect.github.com/crate-ci/typos) | repository | minor | `v1.33.1` -> `v1.34.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

Note: The `pre-commit` manager in Renovate is not supported by the
`pre-commit` maintainers or community. Please do not report any problems
there, instead [create a Discussion in the Renovate
repository](https://redirect.github.com/renovatebot/renovate/discussions/new)
if you have any questions.

---

### Release Notes

<details>
<summary>astral-sh/ruff-pre-commit (astral-sh/ruff-pre-commit)</summary>

### [`v0.12.2`](https://redirect.github.com/astral-sh/ruff-pre-commit/releases/tag/v0.12.2)

[Compare
Source](https://redirect.github.com/astral-sh/ruff-pre-commit/compare/v0.12.1...v0.12.2)

See: https://github.com/astral-sh/ruff/releases/tag/0.12.2

</details>

<details>
<summary>crate-ci/typos (crate-ci/typos)</summary>

### [`v1.34.0`](https://redirect.github.com/crate-ci/typos/releases/tag/v1.34.0)

[Compare
Source](https://redirect.github.com/crate-ci/typos/compare/v1.33.1...v1.34.0)

#### \[1.34.0] - 2025-06-30

##### Features

- Updated the dictionary with the [June
2025](https://redirect.github.com/crate-ci/typos/issues/1309) changes

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

👻 **Immortal**: This PR will be recreated if closed unmerged. Get
[config
help](https://redirect.github.com/renovatebot/renovate/discussions) if
that's undesired.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/uv).


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 10:36:26 +02:00
renovate[bot]
bb738aeb44
Update Rust crate test-log to v0.2.18 (#14477)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [test-log](https://redirect.github.com/d-e-s-o/test-log) | dev-dependencies | patch | `0.2.17` -> `0.2.18` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>d-e-s-o/test-log (test-log)</summary>

### [`v0.2.18`](https://redirect.github.com/d-e-s-o/test-log/blob/HEAD/CHANGELOG.md#0218)

[Compare
Source](https://redirect.github.com/d-e-s-o/test-log/compare/v0.2.17...v0.2.18)

- Improved cooperation with other similar procedural macros to enable
  attribute stacking

</details>
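For reference, a small hedged sketch of what attribute stacking looks like in practice, assuming the common `test-log` + `tokio` setup (neither dependency is implied by this PR):

```rust
// Sketch: #[test_log::test] wrapping another test attribute such as
// #[tokio::test]. Log output emitted inside the test is captured and
// shown for failing tests. Assumes `tokio` and `test-log` dev-dependencies.
#[test_log::test(tokio::test)]
async fn fetches_data() {
    assert_eq!(1 + 1, 2);
}
```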

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/uv).


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 10:04:50 +02:00
renovate[bot]
fc758bb755
Update Rust crate schemars to v1.0.4 (#14476)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [schemars](https://graham.cool/schemars/) ([source](https://redirect.github.com/GREsau/schemars)) | workspace.dependencies | patch | `1.0.3` -> `1.0.4` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>GREsau/schemars (schemars)</summary>

### [`v1.0.4`](https://redirect.github.com/GREsau/schemars/blob/HEAD/CHANGELOG.md#104---2025-07-06)

[Compare
Source](https://redirect.github.com/GREsau/schemars/compare/v1.0.3...v1.0.4)

##### Fixed

- Fix `JsonSchema` impl on
[atomic](https://doc.rust-lang.org/std/sync/atomic/) types being ignored
on non-nightly compilers due to a buggy `cfg` check
([https://github.com/GREsau/schemars/issues/453](https://redirect.github.com/GREsau/schemars/issues/453))
- Fix compatibility with minimal dependency versions, e.g. old(-ish)
versions of `syn`
([https://github.com/GREsau/schemars/issues/450](https://redirect.github.com/GREsau/schemars/issues/450))
- Fix derive for empty tuple variants
([https://github.com/GREsau/schemars/issues/455](https://redirect.github.com/GREsau/schemars/issues/455))

</details>
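As a hedged illustration of the "empty tuple variant" case fixed here (the type is made up, and `serde_json` is assumed only for printing the schema):

```rust
use schemars::{schema_for, JsonSchema};

// Hypothetical type illustrating the fixed case: an enum containing an
// empty tuple variant alongside a struct variant.
#[derive(JsonSchema)]
enum Event {
    Ping(), // empty tuple variant (the case addressed in 1.0.4)
    Message { body: String },
}

fn main() {
    // Generate the JSON Schema for the enum and print it.
    let schema = schema_for!(Event);
    println!("{}", serde_json::to_string_pretty(&schema).unwrap());
}
```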

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/uv).


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 10:00:52 +02:00
renovate[bot]
1308c85efe
Update Rust crate async-channel to v2.5.0 (#14478)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [async-channel](https://redirect.github.com/smol-rs/async-channel) | workspace.dependencies | minor | `2.3.1` -> `2.5.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>smol-rs/async-channel (async-channel)</summary>

### [`v2.5.0`](https://redirect.github.com/smol-rs/async-channel/blob/HEAD/CHANGELOG.md#Version-250)

[Compare
Source](https://redirect.github.com/smol-rs/async-channel/compare/v2.4.0...v2.5.0)

- Add `Sender::closed()`
([#&#8203;102](https://redirect.github.com/smol-rs/async-channel/issues/102))

### [`v2.4.0`](https://redirect.github.com/smol-rs/async-channel/blob/HEAD/CHANGELOG.md#Version-240)

[Compare
Source](https://redirect.github.com/smol-rs/async-channel/compare/v2.3.1...v2.4.0)

- Add `Sender::same_channel()` and `Receiver::same_channel()`.
([#&#8203;98](https://redirect.github.com/smol-rs/async-channel/issues/98))
- Add `portable-atomic` feature to support platforms without atomics.
([#&#8203;106](https://redirect.github.com/smol-rs/async-channel/issues/106))

</details>
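A brief hedged sketch of the two additions mentioned above (the channel setup is illustrative):

```rust
// Sketch of the APIs mentioned in the notes above; the setup is illustrative.
async fn demo() {
    let (tx, rx) = async_channel::unbounded::<u32>();
    let (other_tx, _other_rx) = async_channel::unbounded::<u32>();

    // same_channel() (2.4.0): do two handles refer to the same channel?
    assert!(tx.same_channel(&tx));
    assert!(!tx.same_channel(&other_tx));

    // closed() (2.5.0): a future that resolves once the channel is closed,
    // e.g. after the last receiver is dropped.
    drop(rx);
    tx.closed().await;
}
```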

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/uv).


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-07 09:58:05 +02:00
John Mumm
f609e1ddaf
Document that VerbatimUrl does not preserve original string after serialization (#14456)
This came up in
[discussion](https://github.com/astral-sh/uv/pull/14387#issuecomment-3032223670)
on #14387.
2025-07-04 22:42:56 +02:00
Tim de Jager
eaf517efd8
Add method to get packages involved in a NoSolutionError (#14457)
<!--
Thank you for contributing to uv! To help us out with reviewing, please
consider the following:

- Does this pull request include a summary of the change? (See below.)
- Does this pull request include a descriptive title?
- Does this pull request include references to any relevant issues?
-->

## Summary

<!-- What's the purpose of the change? What does it do, and why? -->

In pixi we overlay the PyPI packages over the conda packages and we
sometimes need to figure out what PyPI packages are involved in the
no-solution error. We could parse the error message, but this is pretty
error-prone, so it would be good to get access to more information. A
lot of information in this module is private and should probably stay
this way, but package names are easy enough to expose. This would help
us a lot!

I collect into a `HashSet` to remove duplication, and did not want to
expose a `rustc_hash` data structure directly; that's why I've chosen to
expose it as an iterator :)

Let me know if any changes need to be done, and thanks!
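Roughly, the pattern described above, sketched with made-up types rather than uv's actual `NoSolutionError` internals:

```rust
use std::collections::HashSet;

// Generic sketch of the pattern: dedupe package names internally, but expose
// only an iterator so the concrete set type stays private. The types and
// fields here are made up for illustration.
struct Conflict {
    edges: Vec<(String, String)>, // hypothetical internal representation
}

impl Conflict {
    fn packages<'a>(&'a self) -> impl Iterator<Item = &'a str> + 'a {
        let mut seen: HashSet<&'a str> = HashSet::new();
        for (a, b) in &self.edges {
            seen.insert(a.as_str());
            seen.insert(b.as_str());
        }
        seen.into_iter()
    }
}
```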

---------

Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-07-04 18:08:23 +00:00
Charlie Marsh
e8bc3950ef
Remove transparent variants in uv-extract to enable retries (#14450)
## Summary

We think this is the culprit for the lack of retries in some settings
(e.g., Python downloads).

See: https://github.com/astral-sh/uv/issues/14425.
2025-07-03 23:32:07 +00:00
konsti
06af93fce7
Fix optional cfg gates (#14448)
Running `cargo clippy` in individual crates could raise warnings due to
unused imports as `Cow` is only used with `#[cfg(feature = "schemars")]`
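A minimal sketch of the pattern (names are illustrative):

```rust
// Gate the import on the same feature that gates its only use, so that
// `cargo clippy` in this crate does not flag an unused import when the
// feature is disabled.
#[cfg(feature = "schemars")]
use std::borrow::Cow;

#[cfg(feature = "schemars")]
fn schema_id() -> Cow<'static, str> {
    Cow::Borrowed("ExampleOptions")
}
```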
2025-07-03 15:29:03 -05:00
Simon Sure
8afbd86f03
make ErrorTree for NoSolutionError externally accessible (#14444)
Hey, are you okay with exposing the `ErrorTree` for library consumers?

We have a use case that needs more information about conflicts. In
particular, we need the tree structure of the conflict and the ability to
traverse it.

Signed-off-by: Simon Sure <ssure@palantir.com>
2025-07-03 11:43:59 -05:00
konsti
a1cda6213c
Make "exit code" -> "exit status" a default filter (#14441)
Remove some test boilerplate.

Revival of https://github.com/astral-sh/uv/pull/14439 with main as base.
2025-07-03 13:50:40 +00:00
konsti
39cdfe9981
Add a test for --force-pep517 (#14310)
There was previously a gap in the test coverage: nothing ensured that
`--force-pep517` was respected.
2025-07-03 13:34:44 +00:00
Zanie Blue
85c0fc963b
Fix forced resolution with all extras in uv version (#14434)
Closes https://github.com/astral-sh/uv/issues/14433

Same as https://github.com/astral-sh/uv/pull/13380
2025-07-03 07:29:59 -05:00
Zanie Blue
c3f13d2505
Finish incomplete sentence in pip migration guide (#14432)
Fixes https://github.com/astral-sh/uv/pull/12382#discussion_r2181237729
2025-07-03 01:02:17 +00:00
Zanie Blue
38ee6ec800
Bump version to 0.7.19 (#14431) 2025-07-02 21:19:52 +00:00
konsti
71b5ba13d7
Stabilize the uv build backend (#14311)
The uv build backend has gone through some feedback cycles, we expect no
more major configuration changes, and we're ready to take the next step:
making the uv build backend stable.

This PR stabilizes:

* Using `uv_build` as build backend
* The documentation of the uv build backend
* The direct build fast path, where uv doesn't use PEP 517 if you're
using `uv_build` in a compatible version.
* `uv build --list`, which is limited to `uv_build`.

It does not:
* Make `uv_build` the default on `uv init`
* Make `--package` the default on `uv init`
2025-07-02 15:37:43 -05:00
Zanie Blue
5f2857a1c7
Add linux aarch64 smoke tests (#14427)
Testing https://github.com/astral-sh/uv/pull/14426
2025-07-02 15:18:46 -05:00
Zanie Blue
a58969feef
Fix workspace_unsatisfiable_member_dependencies (#14429) 2025-07-02 15:11:50 -05:00
github-actions[bot]
3bb8ac610c
Sync latest Python releases (#14426)
Automated update for Python releases.

Co-authored-by: zanieb <2586601+zanieb@users.noreply.github.com>
2025-07-02 14:51:17 -05:00
Jack O'Connor
ec54dce919
Includes sys.prefix in cached environment keys to avoid --with collisions across projects (#14403)
Fixes https://github.com/astral-sh/uv/issues/12889.

---------

Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-07-02 14:40:18 -05:00
Zanie Blue
a6bb65c78d
Clarify behavior and hint on tool install when no executables are available (#14423)
Closes https://github.com/astral-sh/uv/issues/14416
2025-07-02 13:11:17 -05:00
Charlie Marsh
743260b1f5
Make project and interpreter lock acquisition non-fatal (#14404)
## Summary

If we fail to acquire a lock on an environment, uv shouldn't fail; we
should just warn. In some cases, users run uv with read-only permissions
for their projects, etc.

For now, I kept any locks acquired _in the cache_ as hard failures,
since we always need write-access to the cache.
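A rough sketch of the warn-and-continue behavior described here; the error handling and message are illustrative, not uv's actual code:

```rust
// Sketch only: if creating/opening the lock file fails (e.g. a read-only
// project directory), warn and continue without a lock instead of erroring.
fn acquire_lock_if_possible(path: &std::path::Path) -> Option<std::fs::File> {
    match std::fs::OpenOptions::new().create(true).write(true).open(path) {
        Ok(file) => Some(file),
        Err(err) => {
            eprintln!(
                "warning: Failed to acquire lock at `{}`: {err}; continuing without a lock",
                path.display()
            );
            None
        }
    }
}
```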

Closes https://github.com/astral-sh/uv/issues/14411.
2025-07-02 14:03:43 -04:00
Zanie Blue
2f53ea5c5c
Add a migration guide from pip to uv projects (#12382)
[Rendered](https://github.com/astral-sh/uv/blob/zb/pip-wip/docs/guides/migration/pip-to-project.md)

---------

Co-authored-by: samypr100 <3933065+samypr100@users.noreply.github.com>
Co-authored-by: Mathieu Kniewallner <mathieu.kniewallner@gmail.com>
Co-authored-by: Aria Desires <aria.desires@gmail.com>
2025-07-02 12:25:19 -05:00
Charlie Marsh
a9ea756d14
Ignore Python patch version for --universal pip compile (#14405)
## Summary

The idea here is that if a user runs `uv pip compile --universal`, we
should ignore the patch version on the current interpreter. I think this
makes sense... `--universal` tries to resolve for all future versions,
so it seems a bit odd that we'd start at the _current_ patch version.

Closes https://github.com/astral-sh/uv/issues/14397.
2025-07-02 11:11:51 -04:00
Zanie Blue
43f67a4a4c
Update the tilde version specifier warning to include more context (#14335)
Follows https://github.com/astral-sh/uv/pull/14008
2025-07-02 09:08:45 -05:00
konsti
a7aa46acc5
Add a "choosing a build backend" section to the docs (#14295)
I think the build backend docs as a whole are now ready for review. I
only made a small change here.

---------

Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-07-02 09:02:03 -05:00
Zanie Blue
b0db548c80
Bump the test timeout from 90s -> 120s (#14170)
In hopes of resolving https://github.com/astral-sh/uv/issues/14158

We should also see why the tests are so slow though.
2025-07-02 13:39:58 +00:00
konsti
bf5dcf9929
Reduce index credential stashing code duplication (#14419)
Reduces some duplicate code around index credentials.
2025-07-02 15:25:56 +02:00
John Mumm
e40d3d5dff
Re-enable Artifactory in the registries integration test (#14408)
Having worked out the account issue, I've re-enabled Artifactory in the
registries test.
2025-07-02 13:50:35 +02:00
Zanie Blue
87e9ccfb92
Bump version to 0.7.18 (#14402)
2025-07-01 15:30:44 -05:00
konsti
06df95adbf
Workaround for panic due to missing global validation in clap (#14368)
Clap does not perform global validation, so flags that are declared as
overriding each other can be set at the same time:
https://github.com/clap-rs/clap/issues/6049. This would previously cause
a panic. We work around this by always choosing the yes-value and
emitting a warning.

An alternative would be erroring when both are set, but it's unclear to
me if this may break things we want to support. (`UV_OFFLINE=1 cargo run
-q pip --no-offline install tqdm --no-cache` is already banned).
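In the abstract, the workaround looks something like this (a sketch, not uv's actual flag-resolution code):

```rust
// Sketch of the workaround: when clap lets both sides of a mutually
// overriding flag pair through, prefer the "yes" value and warn.
fn resolve_offline(offline: bool, no_offline: bool) -> bool {
    if offline && no_offline {
        eprintln!(
            "warning: Boolean flags on different levels are not correctly supported \
             (https://github.com/clap-rs/clap/issues/6049)"
        );
    }
    // Prefer the affirmative flag when both were passed.
    offline
}
```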

Fixes https://github.com/astral-sh/uv/pull/14299

**Test Plan**

```
$ cargo run -q pip --offline install --no-offline tqdm --no-cache
  warning: Boolean flags on different levels are not correctly supported (https://github.com/clap-rs/clap/issues/6049)
    × No solution found when resolving dependencies:
    ╰─▶ Because tqdm was not found in the cache and you require tqdm, we can conclude that your requirements are unsatisfiable.

        hint: Packages were unavailable because the network was disabled. When the network is disabled, registry packages may only be read from the cache.
```
2025-07-01 13:39:46 -05:00
John Mumm
29fcd6faee
Fix test cases to match Cow variants (#14390)
Updates `without_trailing_slash` and `without_fragment` to separately
match values against `Cow` variants.
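A hedged sketch of the test pattern, with a stand-in helper rather than the real `without_trailing_slash`:

```rust
use std::borrow::Cow;

// Stand-in helper: return Borrowed when no change is needed, Owned otherwise.
fn strip_trailing_slash(url: &str) -> Cow<'_, str> {
    match url.strip_suffix('/') {
        Some(stripped) => Cow::Owned(stripped.to_string()),
        None => Cow::Borrowed(url),
    }
}

fn main() {
    // Assert on the exact Cow variant, not just the dereferenced value.
    assert!(matches!(strip_trailing_slash("https://example.com"), Cow::Borrowed(_)));
    assert!(matches!(strip_trailing_slash("https://example.com/"), Cow::Owned(_)));
}
```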

Closes #14350
2025-07-01 13:39:17 -05:00
Charlie Marsh
d9f9ed4aec
Reuse build (virtual) environments across resolution and installation (#14338)
## Summary

The basic idea here is that we can (should) reuse a build environment
across resolution (`prepare_metadata_for_build_wheel`) and installation.
This also happens to solve the build-PyTorch-from-source problem, since
we use a consistent build environment between the invocations.

Since `SourceDistributionBuilder` is stateless, we instead store the
builds on `BuildContext`, and we key them by various properties: the
underlying interpreter, the configuration settings, etc. This just
ensures that if we build the same package twice within a process, we
don't accidentally reuse an incompatible build (virtual) environment.
(Note that we still drop build environments at the end of the command, and
don't attempt to reuse them across processes.)
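Conceptually, the reuse looks something like the sketch below, with made-up types standing in for the actual `BuildContext` state:

```rust
use std::collections::HashMap;
use std::path::PathBuf;
use std::sync::{Arc, Mutex};

// Made-up types sketching the idea: key reusable build environments by the
// properties that make them incompatible when they differ.
#[derive(Clone, PartialEq, Eq, Hash)]
struct BuildKey {
    interpreter: PathBuf,
    config_settings: Vec<(String, String)>,
}

struct BuildEnvironment {
    venv_dir: PathBuf,
}

#[derive(Default)]
struct BuildEnvironments {
    environments: Mutex<HashMap<BuildKey, Arc<BuildEnvironment>>>,
}

impl BuildEnvironments {
    // Reuse an existing environment for this key, or create and store one.
    fn get_or_create(
        &self,
        key: BuildKey,
        create: impl FnOnce() -> BuildEnvironment,
    ) -> Arc<BuildEnvironment> {
        let mut map = self.environments.lock().unwrap();
        map.entry(key).or_insert_with(|| Arc::new(create())).clone()
    }
}
```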

Closes #14269.
2025-07-01 13:15:47 -04:00
Jack O'Connor
85358fe9c6 Keep track of retries in ManagedPythonDownload::fetch_with_retry
If/when we see https://github.com/astral-sh/uv/issues/14171 again, this
should clarify whether our retry logic was skipped (i.e. a transient
error wasn't correctly identified as transient), or whether we exhausted
our retries. Previously, if you ran a local example fileserver as in
https://github.com/astral-sh/uv/issues/14171#issuecomment-3014580701 and
then you tried to install Python from it, you'd get:

```
$ export UV_TEST_NO_CLI_PROGRESS=1
$ uv python install 3.8.20 --mirror http://localhost:8000 2>&1 | cat
error: Failed to install cpython-3.8.20-linux-x86_64-gnu
  Caused by: Failed to extract archive: cpython-3.8.20-20241002-x86_64-unknown-linux-gnu-install_only_stripped.tar.gz
  Caused by: failed to unpack `/home/jacko/.local/share/uv/python/.temp/.tmpS4sHHZ/python/lib/libpython3.8.so.1.0`
  Caused by: failed to unpack `python/lib/libpython3.8.so.1.0` into `/home/jacko/.local/share/uv/python/.temp/.tmpS4sHHZ/python/lib/libpython3.8.so.1.0`
  Caused by: error decoding response body
  Caused by: request or response body error
  Caused by: error reading a body from connection
  Caused by: Connection reset by peer (os error 104)
```

With this change you get:

```
error: Failed to install cpython-3.8.20-linux-x86_64-gnu
  Caused by: Request failed after 3 retries
  Caused by: Failed to extract archive: cpython-3.8.20-20241002-x86_64-unknown-linux-gnu-install_only_stripped.tar.gz
  Caused by: failed to unpack `/home/jacko/.local/share/uv/python/.temp/.tmp4Ia24w/python/lib/libpython3.8.so.1.0`
  Caused by: failed to unpack `python/lib/libpython3.8.so.1.0` into `/home/jacko/.local/share/uv/python/.temp/.tmp4Ia24w/python/lib/libpython3.8.so.1.0`
  Caused by: error decoding response body
  Caused by: request or response body error
  Caused by: error reading a body from connection
  Caused by: Connection reset by peer (os error 104)
```

At the same time, I'm updating the way we handle the retry count to
avoid nested retry loops exceeding the intended number of attempts, as I
mentioned at
https://github.com/astral-sh/uv/issues/14069#issuecomment-3020634281.
It's not clear to me whether we actually want this part of the change,
and I need feedback here.
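A hedged sketch of the error-wrapping idea (not the actual uv types; the real logic only retries errors identified as transient, whereas this sketch retries everything):

```rust
// Sketch: track attempts across the whole operation and, once retries are
// exhausted, wrap the final error so the retry count is visible to the user.
fn fetch_with_retry<T, E: std::fmt::Display>(
    max_retries: u32,
    mut attempt: impl FnMut() -> Result<T, E>,
) -> Result<T, String> {
    let mut retries = 0;
    loop {
        match attempt() {
            Ok(value) => return Ok(value),
            Err(err) if retries < max_retries => {
                retries += 1;
                eprintln!("attempt failed (retry {retries}/{max_retries}): {err}");
            }
            Err(err) => {
                return Err(format!("Request failed after {retries} retries: {err}"));
            }
        }
    }
}
```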
2025-07-01 09:52:19 -07:00
Charlie Marsh
c078683217
Only drop build directories on program exit (#14304)
## Summary

This PR ensures that we avoid cleaning up build directories until the
end of a resolve-and-install cycle. It's not bulletproof (since we could
still run into issues with `uv lock` followed by `uv sync` whereby a
build directory gets cleaned up that's still referenced in the `build`
artifacts), but it at least gets PyTorch building without error with `uv
pip install .`, which is a case that's been reported several times.

Closes https://github.com/astral-sh/uv/issues/14269.
2025-07-01 12:50:19 -04:00
Zanie Blue
c777491bf4
Use the insiders requirements when building docs in CI (#14379) 2025-07-01 11:29:10 -05:00
konsti
9af3e9b6ec
Remove unnecessary codspeed deps (#14396)
See https://github.com/CodSpeedHQ/codspeed-rust/pull/108
2025-07-01 11:00:30 -05:00
konsti
43745d2ecf
Fix equals-star and tilde-equals with python_version and python_full_version (#14271)
The marker display code assumes that all versions are normalized, in
that all trailing zeroes are stripped. This is not the case for
tilde-equals and equals-star versions, where the trailing zeroes (before
the `.*`) are semantically relevant. This would cause path
dependent-behavior where we would get a different marker string
depending on whether a version with or without a trailing zero was added
to the cache first.

To handle both equals-star and tilde-equals when converting
`python_version` to `python_full_version` markers, we have to merge the
version normalization (i.e. trimming the trailing zeroes) and the
conversion both to `python_full_version` and to `Ranges`, while special
casing equals-star and tilde-equals.

To avoid churn in lockfiles, we only trim in the conversion to `Ranges`
for markers, but keep using untrimmed versions for requires-python.
(Note that this behavior is technically also path-dependent, as versions
with and without trailing zeroes have the same `Hash` and `Eq`. E.g.,
`requires-python = ">= 3.10.0"` and `requires-python = ">= 3.10"` in
the same workspace could lead to either value in `uv.lock`, and which
one it is could change if we make unrelated (performance) changes.
Always trimming, however, definitely changes lockfiles, a churn I wouldn't
introduce outside another breaking or lockfile-changing change.) Nevertheless,
there is a change for users who have `requires-python = "~= 3.12.0"` in
their `pyproject.toml`, as this now hits the correct normalization path.

Fixes #14231
Fixes #14270
2025-07-01 17:48:48 +02:00
Charlie Marsh
3774a656d7
Use parsed URLs for conflicting URL error message (#14380)
## Summary

There's a good example of the downside of using verbatim URLs here:
https://github.com/astral-sh/uv/pull/14197#discussion_r2163599625 (we
show two relative paths that point to the same directory, but it's not
clear from the error message).

The diff:

```
    2     2 │ ----- stdout -----
    3     3 │
    4     4 │ ----- stderr -----
    5     5 │ error: Requirements contain conflicting URLs for package `library` in all marker environments:
    6       │-- ../../library
    7       │-- ./library
          6 │+- file://[TEMP_DIR]/library
          7 │+- file://[TEMP_DIR]/library (editable)
```
2025-07-01 08:18:01 -04:00
Zanie Blue
b1812d111a
Edits to the build backend documentation (#14376)
Co-authored-by: konstin <konstin@mailbox.org>
2025-07-01 08:44:23 +00:00
github-actions[bot]
a3db9a9ae4
Sync latest Python releases (#14381)
Automated update for Python releases.

Co-authored-by: zanieb <2586601+zanieb@users.noreply.github.com>
2025-07-01 03:44:18 +00:00
renovate[bot]
c5ca240fb7
Update PyO3/maturin-action action to v1.49.3 (#14363)
2025-06-30 19:16:53 -04:00
renovate[bot]
7bbdc08dae
Update depot/build-push-action action to v1.15.0 (#14361) 2025-06-30 19:16:46 -04:00
renovate[bot]
5f8d4bbf02
Update Rust crate indexmap to v2.10.0 (#14365) 2025-06-30 19:16:31 -04:00
Adrien Cacciaguerra
9e9505df50
Bump CodSpeed to v3 (#14371)
## Summary

As explained in the [`codspeed-rust` v3 release
notes](https://github.com/CodSpeedHQ/codspeed-rust/releases/tag/v3.0.0),
v3 of the compatibility layers is now required to work with the latest
version (`v3`) of `cargo-codspeed`.
2025-06-30 17:58:29 -05:00
Aria Desires
2f9061dcd0
Update python, add support for installing arm windows pythons (#14374) 2025-06-30 22:02:19 +00:00
Aria Desires
317ce6e245
disfavor aarch64 windows in its own house (#13724)
and prefer emulated x64 windows in its stead.

This is preparatory work for shipping support for uv downloading and
installing aarch64 (arm64) windows Pythons. We've [had builds for this
platform ready for a
while](https://github.com/astral-sh/python-build-standalone/pull/387),
but have held back on shipping them due to a fundamental problem:

**The Python packaging ecosystem does not have strong support for
aarch64 windows**, e.g., not many projects build aarch64 wheels yet. The
net effect of this is that, if we handed you an aarch64 python
interpreter on windows, you would have to build a lot more sdists, and
there's a high chance you will simply fail to build that sdist and be
sad.

Unfortunately, in this case a non-native Python interpreter simply
*works better* than the native one, at least in terms of working at all, today.
Of course, if the native interpreter works for your project, it should
presumably have better performance and platform compatibility.

We do not want to stand in the way of progress, as ideally this
situation is a temporary state of affairs as the ecosystem grows to
support aarch64 windows. To enable progress, on aarch64 Windows builds
of uv:

* We will still use a native python interpreter, e.g., if it's at the
front of your `PATH` or the only installed version.
* If we are choosing between equally good interpreters that differ in
architecture, x64 will be preferred.
* If the aarch64 version is newer, we will prefer the aarch64 one.
* We will emit a diagnostic on installation, and show the python request
to pass to uv to force aarch64 windows to be used.
* We will ship [aarch64 Windows Python
downloads](https://github.com/astral-sh/python-build-standalone/pull/387).
* We will probably add some kind of global override setting/env-var to
disable this behaviour.
* We will ship this behaviour in
[astral-sh/setup-uv](https://github.com/astral-sh/setup-uv).

We're coordinating with Microsoft, GitHub (for the `setup-python`
action), and the CPython team (for the `python.org` installers), to
ensure we're aligned on this default and the timing of toggling to
prefer native distributions in the future.

See discussion in 

- https://github.com/astral-sh/uv/issues/12906

---

This is an alternative to 

* #13719 

which uses sorting rather than filtering, as discussed in 

* #13721
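
The preference rules above, as a rough sketch; `Candidate`, `Arch`, and the
comparator below are hypothetical simplifications, not uv's actual Python
discovery types:

```rust
use std::cmp::Ordering;

#[derive(Debug, PartialEq, Eq)]
enum Arch {
    X86_64,
    Aarch64,
}

#[derive(Debug)]
struct Candidate {
    version: (u64, u64, u64),
    arch: Arch,
}

/// Higher-ranked candidates sort as `Greater`. A newer interpreter always
/// wins; among equally good candidates on aarch64 Windows, x86-64 is
/// preferred until the packaging ecosystem catches up.
fn rank(a: &Candidate, b: &Candidate) -> Ordering {
    match a.version.cmp(&b.version) {
        Ordering::Equal => match (&a.arch, &b.arch) {
            (Arch::X86_64, Arch::Aarch64) => Ordering::Greater,
            (Arch::Aarch64, Arch::X86_64) => Ordering::Less,
            _ => Ordering::Equal,
        },
        other => other,
    }
}

fn main() {
    let native = Candidate { version: (3, 13, 5), arch: Arch::Aarch64 };
    let emulated = Candidate { version: (3, 13, 5), arch: Arch::X86_64 };
    let newer_native = Candidate { version: (3, 14, 0), arch: Arch::Aarch64 };

    // Equally good: the emulated x86-64 interpreter is preferred.
    assert_eq!(rank(&emulated, &native), Ordering::Greater);
    // A newer aarch64 interpreter still beats an older x86-64 one.
    assert_eq!(rank(&newer_native, &emulated), Ordering::Greater);
}
```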
2025-06-30 17:42:00 -04:00
Zanie Blue
1c7c174bc8
Include the canonical path in the interpreter query cache key (#14331)
This fixes an obscure cache collision in Python interpreter queries,
which we believe to be the root cause of CI flakes we've been seeing
where a project environment is invalidated and recreated.

This work follows from the logs in [this CI
run](4495059999)
which captured one of the flakes with tracing enabled. There, we can see
that the project environment is invalidated because the Python
interpreter in the environment has a different version than expected:

```
DEBUG Checking for Python environment at `.venv`
TRACE Cached interpreter info for Python 3.12.9, skipping probing: .venv/bin/python3
DEBUG The interpreter in the project environment has different version (3.12.9) than it was created with (3.9.21)
```

(this message is updated to reflect #14329)

The flow is roughly:

- We create an environment with 3.12.9
- We query the environment, and cache the interpreter version for
`.venv/bin/python`
- We create an environment for 3.9.21, replacing the existing one
- We query the environment, and read the cached information

The Python cache entries are keyed by the absolute path to the
interpreter, and rely on the modification time (ctime, nsec resolution)
of the canonicalized path to determine if the cache entry should be
invalidated. The key is a hex representation of a u64 sea hasher output
— which is very unlikely to collide.

After an audit of the Python query caching logic, we determined that the
most likely cause of a collision in cache entries is that the
modification times of underlying interpreters are identical. This seems
pretty feasible, especially if the file system does not support
nanosecond precision — though it appears that the GitHub runners do
support it.

The fix here is to include the canonicalized path in the cache key,
which ensures we're looking at the modification time of the _same_
underlying interpreter.
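
A minimal sketch of the change, assuming a hypothetical
`interpreter_cache_key` helper; uv derives the real key with SeaHash, so
std's `DefaultHasher` stands in here only to keep the example dependency-free:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};
use std::io;
use std::path::Path;

fn interpreter_cache_key(executable: &Path) -> io::Result<String> {
    // Resolve symlinks (e.g. `.venv/bin/python3` -> the real interpreter) so
    // the key identifies the underlying binary, not just the venv path.
    let canonical = executable.canonicalize()?;

    let mut hasher = DefaultHasher::new();
    // Previously only the absolute executable path fed the key, so two
    // different underlying interpreters with identical mtimes could collide.
    executable.hash(&mut hasher);
    // The fix: the canonicalized path is part of the key as well.
    canonical.hash(&mut hasher);
    Ok(format!("{:016x}", hasher.finish()))
}

fn main() -> io::Result<()> {
    println!("{}", interpreter_cache_key(Path::new(".venv/bin/python3"))?);
    Ok(())
}
```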

This will "invalidate" all existing interpreter cache entries but that's
not a big deal.

This should also have the effect of reducing cache churn for
interpreters in virtual environments. Now, when you change Python
versions, we won't invalidate the previous cache entry, so if you change
_back_ to the old version we can re-use our cached information.

It's a bit speculative, since we don't have a deterministic reproduction
in CI, but this is the strongest candidate given the logs and should
increase correctness regardless.

Closes https://github.com/astral-sh/uv/issues/14160
Closes https://github.com/astral-sh/uv/issues/13744
Closes https://github.com/astral-sh/uv/issues/13745

Once it's confirmed the flakes are resolved, we should revert

- https://github.com/astral-sh/uv/pull/14275
- #13817
2025-06-30 15:39:47 +00:00
konsti
0372a5b05d
Ignore invalid build backend settings when not building (#14372)
Fixes #14323
2025-06-30 16:32:28 +02:00
Ondrej Profant
ae500c95d2
Docs: add instructions for publishing to JFrog's Artifactory (#14253)
## Summary

Add instructions for publishing to JFrog's Artifactory into
[documentation](https://docs.astral.sh/uv/guides/integration/alternative-indexes/).

Related issues:
https://github.com/astral-sh/uv/issues/9845
https://github.com/astral-sh/uv/issues/10193

## Test Plan

I ran the documentation locally and used `npx prettier`.

---------

Co-authored-by: Ondrej Profant <ondrej.profant@datamole.ai>
Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-06-30 13:58:55 +00:00
renovate[bot]
5cfabd7085
Update Rust crate schemars to v1.0.3 (#14358)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [schemars](https://graham.cool/schemars/)
([source](https://redirect.github.com/GREsau/schemars)) |
workspace.dependencies | patch | `1.0.0` -> `1.0.3` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>GREsau/schemars (schemars)</summary>

###
[`v1.0.3`](https://redirect.github.com/GREsau/schemars/blob/HEAD/CHANGELOG.md#103---2025-06-28)

[Compare
Source](https://redirect.github.com/GREsau/schemars/compare/v1.0.2...v1.0.3)

##### Fixed

- Fix compile error when a doc comment is set on both a `transparent`
(or newtype) struct and its field
([https://github.com/GREsau/schemars/issues/446](https://redirect.github.com/GREsau/schemars/issues/446))
- Fix `json_schema!()` macro compatibility when used from pre-2021 rust
editions
([https://github.com/GREsau/schemars/pull/447](https://redirect.github.com/GREsau/schemars/pull/447))

###
[`v1.0.2`](https://redirect.github.com/GREsau/schemars/blob/HEAD/CHANGELOG.md#102---2025-06-26)

[Compare
Source](https://redirect.github.com/GREsau/schemars/compare/v1.0.1...v1.0.2)

##### Fixed

- Fix schema properties being incorrectly reordered during serialization
([https://github.com/GREsau/schemars/issues/444](https://redirect.github.com/GREsau/schemars/issues/444))

###
[`v1.0.1`](https://redirect.github.com/GREsau/schemars/blob/HEAD/CHANGELOG.md#101---2025-06-24)

[Compare
Source](https://redirect.github.com/GREsau/schemars/compare/v1.0.0...v1.0.1)

##### Fixed

- Deriving `JsonSchema` with `no_std` broken due to
`std::borrow::ToOwned` trait not being in scope
([https://github.com/GREsau/schemars/issues/441](https://redirect.github.com/GREsau/schemars/issues/441))

</details>


---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: konstin <konstin@mailbox.org>
2025-06-30 13:39:20 +00:00
renovate[bot]
15551a0201
Update Swatinem/rust-cache action to v2.8.0 (#14366)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [Swatinem/rust-cache](https://redirect.github.com/Swatinem/rust-cache)
| action | minor | `v2.7.8` -> `v2.8.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>Swatinem/rust-cache (Swatinem/rust-cache)</summary>

###
[`v2.8.0`](https://redirect.github.com/Swatinem/rust-cache/releases/tag/v2.8.0)

[Compare
Source](https://redirect.github.com/Swatinem/rust-cache/compare/v2.7.8...v2.8.0)

##### What's Changed

- Add cache-workspace-crates feature by
[@&#8203;jbransen](https://redirect.github.com/jbransen) in
[https://github.com/Swatinem/rust-cache/pull/246](https://redirect.github.com/Swatinem/rust-cache/pull/246)
- Feat: support warpbuild cache provider by
[@&#8203;stegaBOB](https://redirect.github.com/stegaBOB) in
[https://github.com/Swatinem/rust-cache/pull/247](https://redirect.github.com/Swatinem/rust-cache/pull/247)

##### New Contributors

- [@&#8203;jbransen](https://redirect.github.com/jbransen) made their
first contribution in
[https://github.com/Swatinem/rust-cache/pull/246](https://redirect.github.com/Swatinem/rust-cache/pull/246)
- [@&#8203;stegaBOB](https://redirect.github.com/stegaBOB) made their
first contribution in
[https://github.com/Swatinem/rust-cache/pull/247](https://redirect.github.com/Swatinem/rust-cache/pull/247)

**Full Changelog**:
https://github.com/Swatinem/rust-cache/compare/v2.7.8...v2.8.0

</details>


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 12:24:54 +02:00
renovate[bot]
61482da319
Update pre-commit hook astral-sh/ruff-pre-commit to v0.12.1 (#14362)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[astral-sh/ruff-pre-commit](https://redirect.github.com/astral-sh/ruff-pre-commit)
| repository | minor | `v0.11.13` -> `v0.12.1` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

Note: The `pre-commit` manager in Renovate is not supported by the
`pre-commit` maintainers or community. Please do not report any problems
there, instead [create a Discussion in the Renovate
repository](https://redirect.github.com/renovatebot/renovate/discussions/new)
if you have any questions.

---

### Release Notes

<details>
<summary>astral-sh/ruff-pre-commit (astral-sh/ruff-pre-commit)</summary>

###
[`v0.12.1`](https://redirect.github.com/astral-sh/ruff-pre-commit/releases/tag/v0.12.1)

[Compare
Source](https://redirect.github.com/astral-sh/ruff-pre-commit/compare/v0.12.0...v0.12.1)

See: https://github.com/astral-sh/ruff/releases/tag/0.12.1

###
[`v0.12.0`](https://redirect.github.com/astral-sh/ruff-pre-commit/releases/tag/v0.12.0)

[Compare
Source](https://redirect.github.com/astral-sh/ruff-pre-commit/compare/v0.11.13...v0.12.0)

See: https://github.com/astral-sh/ruff/releases/tag/0.12.0

</details>


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-30 12:13:33 +02:00
renovate[bot]
b2979d25a8
Update acj/freebsd-firecracker-action action to v0.5.1 (#14355)
2025-06-29 22:19:38 -04:00
renovate[bot]
e44a64ee13
Update Rust crate windows-registry to v0.5.3 (#14359) 2025-06-29 22:18:26 -04:00
renovate[bot]
e9533a0e29
Update aws-actions/configure-aws-credentials digest to 3d8cba3 (#14354) 2025-06-29 22:18:12 -04:00
renovate[bot]
40386e438f
Update Rust crate owo-colors to v4.2.2 (#14357) 2025-06-29 22:17:59 -04:00
renovate[bot]
a8b838dee9
Update astral-sh/setup-uv action to v6.3.1 (#14360) 2025-06-29 22:17:48 -04:00
renovate[bot]
d7e1fced43
Update Rust crate cargo-util to v0.2.21 (#14356) 2025-06-29 22:17:41 -04:00
Charlie Marsh
7603153f5b
Allow alpha, beta, and rc prefixes in tests (#14352)
## Summary

A bunch of tests currently fail if you try to use a pre-release version.
This PR makes the regular expressions more lenient.
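
As a hedged illustration (the concrete patterns in uv's test filters differ
in detail), the idea is to let the version regexes accept pre-release
suffixes, using the `regex` crate:

```rust
use regex::Regex;

fn main() {
    // Before: only plain `X.Y.Z` versions were matched by the filters.
    let strict = Regex::new(r"\d+\.\d+\.\d+").unwrap();
    // After: optional alpha/beta/rc suffixes (`a1`, `b2`, `rc3`, ...) match
    // too, so snapshot tests keep passing on pre-release builds.
    let lenient = Regex::new(r"\d+\.\d+\.\d+(?:(?:a|b|rc)\d+)?").unwrap();

    assert_eq!(strict.find("0.8.0rc1").unwrap().as_str(), "0.8.0");
    assert_eq!(lenient.find("0.8.0rc1").unwrap().as_str(), "0.8.0rc1");
}
```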
2025-06-29 19:30:52 +00:00
Charlie Marsh
d15efb7d91
Add an IntoIterator for FormMetadata (#14351)
## Summary

Clippy would lint for this if the symbol were public as a matter of API
hygiene, so adding it.
2025-06-29 15:07:07 -04:00
Zanie Blue
17b7eec287
Consistently normalize trailing slashes on URLs with no path segments (#14349)
Alternative to https://github.com/astral-sh/uv/pull/14348
2025-06-29 12:25:10 -05:00
Zanie Blue
c0ebe6871d
Improve trace message for cached Python interpreter query (#14328) 2025-06-29 09:40:29 -05:00
Charlie Marsh
41c218a89b
Bump version to 0.7.17 (#14347) 2025-06-29 09:58:33 -04:00
Charlie Marsh
734b228edf
Drop trailing slashes when converting index URL from URL (#14346)
## Summary

In #14245, we started normalizing index URLs by dropping the trailing
slash in the lockfile. We added tests to ensure that this didn't cause
existing lockfiles to be invalidated, but we missed one of the
constructors (specifically, the path that's used with
`tool.uv.sources`).
2025-06-29 09:36:13 -04:00
Zanie Blue
f9d3f8ea3b
Fix error message ordering for pyvenv.cfg version conflict (#14329)
These were reversed, and we're missing an "a".
2025-06-29 09:19:05 -04:00
Zanie Blue
ec18f4813a
Fix typo (#14341)
2025-06-28 11:32:03 -05:00
Zanie Blue
0cfbdcec09
Ignore UV_PYTHON_CACHE_DIR when empty (#14336)
To match our semantics elsewhere
2025-06-28 09:42:27 -05:00
Zanie Blue
608a1020c6
Update the Python query cache comment (#14330) 2025-06-28 09:42:23 -05:00
Zanie Blue
692667cbb0
Use the canonical ImplementationName -> &str implementation (#14337)
Motivated by some code duplication highlighted in
https://github.com/astral-sh/uv/pull/14201, I noticed we weren't taking
advantage of the existing implementation for casting to a str here.
Unfortunately, we do need a special case for CPython still.
2025-06-28 09:42:18 -05:00
github-actions[bot]
db14cc3005
Sync latest Python releases (#14339)
Automated update for Python releases.

Co-authored-by: zanieb <2586601+zanieb@users.noreply.github.com>
2025-06-28 03:08:53 +00:00
Charlie Marsh
731689e503
Apply build constraints when resolving --with dependencies (#14340)
## Summary

We were applying these at install time, but not resolve time.
2025-06-28 01:39:35 +00:00
Zanie Blue
b6b7409d13
Bump version to 0.7.16 (#14334)
2025-06-27 16:46:36 -05:00
Aaron Ang
eab938b7b4
Warn users on ~= python version specifier (#14008)
Close #7426

## Summary

Picking up on #8284, I noticed that the `requires_python` object already
has its specifiers canonicalized in the `intersection` method, meaning
`~=3.12` is converted to `>=3.12, <4`. To fix this, we check and warn in
`intersection`.
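
A rough sketch of the check, assuming a hypothetical
`warn_on_compatible_release` helper that operates on the raw specifier
string (the real check runs on uv's parsed specifiers inside `intersection`),
with hypothetical warning wording:

```rust
fn warn_on_compatible_release(requires_python: &str) {
    for spec in requires_python.split(',').map(str::trim) {
        if let Some(version) = spec.strip_prefix("~=") {
            let version = version.trim();
            // `~=3.12` canonicalizes to `>=3.12, <4`, so it allows any later
            // 3.x minor release, which is rarely what users intend here.
            if version.matches('.').count() == 1 {
                eprintln!(
                    "warning: the `requires-python` specifier `{spec}` is a compatible \
                     release match without a patch version; it allows any later minor release"
                );
            }
        }
    }
}

fn main() {
    warn_on_compatible_release("~=3.12");
}
```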

## Test Plan

Used the same tests from #8284.
2025-06-27 15:48:41 -05:00
Charlie Marsh
6a5d2f1ec4
Share workspace cache between lock and sync operations (#14321)
## Summary

Closes #14316.
2025-06-27 14:48:40 -04:00
Charlie Marsh
4eef79e5e8
Avoid rendering desugared prefix matches in error messages (#14195)
## Summary

When the user provides a requirement like `==2.4.*`, we desugar that to
`>=2.4.dev0,<2.5.dev0`. These bounds then appear in error messages, and
worse, they also trick the error message reporter into thinking that the
user asked for a pre-release.

This PR adds logic to convert to the more-concise `==2.4.*`
representation when possible. We could probably do a similar thing for
the compatible release operator (`~=`).
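
A rough sketch of the re-sugaring, using plain `(major, minor)` pairs instead
of uv's real version-specifier types (hypothetical simplification):

```rust
/// If the bounds have the shape `>=X.Y.dev0, <X.(Y+1).dev0`, render the
/// concise `==X.Y.*` form instead of the desugared range.
fn undesugar(lower: (u64, u64), upper: (u64, u64)) -> Option<String> {
    if upper == (lower.0, lower.1 + 1) {
        Some(format!("=={}.{}.*", lower.0, lower.1))
    } else {
        None
    }
}

fn main() {
    // `>=2.4.dev0, <2.5.dev0` reads back as `==2.4.*` in error messages.
    assert_eq!(undesugar((2, 4), (2, 5)).as_deref(), Some("==2.4.*"));
    // Ranges that don't match the pattern are rendered as-is.
    assert_eq!(undesugar((2, 4), (3, 0)), None);
}
```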

Closes https://github.com/astral-sh/uv/issues/14177.

Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-06-27 18:06:19 +00:00
John Mumm
f892b8564f
Return Cow from UrlString::with_ methods (#14319)
A minor performance improvement as a follow-up to #14245 (and an
accompanying test).
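
An illustrative sketch of the pattern (not the actual `UrlString` API):
returning `Cow` lets the common no-op case borrow rather than allocate:

```rust
use std::borrow::Cow;

fn without_trailing_slash(url: &str) -> Cow<'_, str> {
    match url.strip_suffix('/') {
        // Only allocate when something actually changes.
        Some(stripped) => Cow::Owned(stripped.to_string()),
        None => Cow::Borrowed(url),
    }
}

fn main() {
    assert!(matches!(
        without_trailing_slash("https://example.org/simple"),
        Cow::Borrowed(_)
    ));
    assert!(matches!(
        without_trailing_slash("https://example.org/simple/"),
        Cow::Owned(_)
    ));
}
```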
2025-06-27 13:54:52 -04:00
Zanie Blue
74468dac15
Bump python-build-standalone releases to include 3.14.0b3 (#14301)
See
20250626
2025-06-27 12:36:07 -05:00
John Mumm
880c5e4949
Ensure preview default Python installs are upgradeable (#14261)
Python `bin` installations installed with `uv python install --default
--preview` (no version specified) were not being installed as
upgradeable. Instead each link was pointed at the highest patch version
for a minor version. This change ensures that these preview default
installations are also treated as upgradeable.

The PR includes some updates to the related tests. First, it checks the
default install without specified version case. Second, since it's
adding more read link checks, it creates a new `read_link` helper method
to consolidate repeated logic and replace instances of
`#[cfg(unix/windows)]` with `if cfg!(unix/windows)`.

Fixes #14247
2025-06-27 19:26:28 +02:00
John Mumm
5754f2f2db
Normalize index URLs to remove trailing slash (#14245)
This PR updates `IndexUrl` parsing to normalize non-file URLs by
removing trailing slashes. It also normalizes registry source URLs when
using them to validate the lockfile.

Prior to this change, when writing an index URL to the lockfile, uv
would use a trailing slash if present in the provided URL and no
trailing slash otherwise. This can cause surprising behavior. For
example, `uv lock --locked` will fail when a package is added with an
`--index` value without a trailing slash and then `uv lock --locked` is
run with a `pyproject.toml` version of the index URL that contains a
trailing slash. This PR fixes this and adds a test for the scenario.

It might be safe to normalize file URLs in the same way, but since
slashes have a well-defined meaning in the context of files and
directories, I chose not to normalize them here.
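
A minimal sketch of the normalization, operating on plain strings rather than
uv's `IndexUrl` type (hypothetical simplification):

```rust
fn normalize_index_url(raw: &str) -> &str {
    // Trailing slashes are meaningful for `file://` URLs (directories), so
    // only remote registry URLs are normalized.
    if raw.starts_with("file://") {
        raw
    } else {
        raw.trim_end_matches('/')
    }
}

fn main() {
    // Both spellings of the same index now validate against the same
    // lockfile entry.
    assert_eq!(
        normalize_index_url("https://test.pypi.org/simple/"),
        normalize_index_url("https://test.pypi.org/simple"),
    );
    assert_eq!(normalize_index_url("file:///srv/index/"), "file:///srv/index/");
}
```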

Closes #13707.
2025-06-27 17:11:21 +02:00
John Mumm
a824468c8b
Respect URL-encoded credentials in redirect location (#14315)
uv currently ignores URL-encoded credentials in a redirect location.
This PR adds a check for these credentials to the redirect handling
logic. If found, they are moved to the Authorization header in the
redirect request.
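
A hedged sketch of the idea using the `url` crate; uv's actual redirect
middleware differs, and the base64 `Basic` encoding step is elided:

```rust
use url::Url;

/// Credentials lifted out of a redirect `Location`, to be sent as an
/// `Authorization: Basic ...` header rather than left in the URL.
struct RedirectAuth {
    username: String,
    password: Option<String>,
    location: Url,
}

fn split_redirect_credentials(location: &str) -> Option<RedirectAuth> {
    let mut url = Url::parse(location).ok()?;
    if url.username().is_empty() {
        return None;
    }
    let username = url.username().to_string();
    let password = url.password().map(str::to_string);
    // Strip the userinfo so it isn't carried along in the request URL.
    url.set_username("").ok()?;
    url.set_password(None).ok()?;
    Some(RedirectAuth { username, password, location: url })
}

fn main() {
    let auth =
        split_redirect_credentials("https://user:secret@files.example.com/simple/").unwrap();
    assert_eq!(auth.username, "user");
    assert_eq!(auth.password.as_deref(), Some("secret"));
    assert_eq!(auth.location.as_str(), "https://files.example.com/simple/");
}
```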

Closes #11097
2025-06-27 16:41:14 +02:00
Charlie Marsh
56266447e2
Bump MSRV and rust-toolchain version (#14303)
## Summary

Per our versioning policy, we stay two versions back (and 1.88 was
released today).
2025-06-27 10:27:45 -04:00
Jack O'Connor
efc361223c move the test buckets dir into the canonicalized temp dir
Previously we were using the XDG data dir to avoid symlinks, but there's no
particular guarantee that that's not going to be a symlink too. Using the
canonicalized temp dir by default is also slightly nicer for a couple reasons:
It's sometimes faster (an in-memory tmpfs on e.g. Arch), and it makes
overriding `$TMPDIR` or `%TMP%` sufficient to control where tests put temp
files, without needing to override `UV_INTERNAL__TEST_DIR` too.
2025-06-26 14:56:20 -07:00
Zanie Blue
9ee34dc69b
Fix Indexes::new doc (#14293) 2025-06-26 20:40:40 +00:00
Charlie Marsh
326e4497da
Allow local indexes to reference remote files (#14294)
## Summary

Previously, we assumed that local indexes only referenced local files.
However, it's fine for a local index (like, a `file://`-based Simple
API) to reference a remote file, and in fact Pyodide operates this way.

Closes https://github.com/astral-sh/uv/issues/14227.

## Test Plan

Ran `UV_INDEX=$(pyodide config get package_index) cargo run add anyio`,
which produced this lockfile:

```toml
version = 1
revision = 2
requires-python = ">=3.13.2"

[[package]]
name = "anyio"
version = "4.9.0"
source = { registry = "../../../Library/Caches/.pyodide-xbuildenv-0.30.5/0.27.7/xbuildenv/pyodide-root/package_index" }
dependencies = [
    { name = "idna" },
    { name = "sniffio" },
]
wheels = [
    { url = "https://cdn.jsdelivr.net/pyodide/v0.27.7/full/anyio-4.9.0-py3-none-any.whl", hash = "sha256:e1d9180d4361fd71d1bc4a7007fea6cae1d18792dba9d07eaad89f2a8562f71c" },
]

[[package]]
name = "foo"
version = "0.1.0"
source = { virtual = "." }
dependencies = [
    { name = "anyio" },
]

[package.metadata]
requires-dist = [{ name = "anyio", specifier = ">=4.9.0" }]

[[package]]
name = "idna"
version = "3.7"
source = { registry = "../../../Library/Caches/.pyodide-xbuildenv-0.30.5/0.27.7/xbuildenv/pyodide-root/package_index" }
wheels = [
    { url = "https://cdn.jsdelivr.net/pyodide/v0.27.7/full/idna-3.7-py3-none-any.whl", hash = "sha256:9d4685891e3e37434e09b1becda7e96a284e660c7aea9222564d88b6c3527c09" },
]

[[package]]
name = "sniffio"
version = "1.3.1"
source = { registry = "../../../Library/Caches/.pyodide-xbuildenv-0.30.5/0.27.7/xbuildenv/pyodide-root/package_index" }
wheels = [
    { url = "https://cdn.jsdelivr.net/pyodide/v0.27.7/full/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:9215f9917b34fc73152b134a3fc0a2eb0e4a49b0b956100cad75e84943412bb9" },
]
```
2025-06-26 20:17:42 +00:00
Charlie Marsh
05ab266200
Avoid using path URL for workspace Git dependencies in requirements.txt (#14288)
## Summary

Closes https://github.com/astral-sh/uv/issues/13020.
2025-06-26 19:48:12 +00:00
Charlie Marsh
c291d4329a
Include path or URL when failing to convert in lockfile (#14292)
## Summary

E.g., in #14227, we now get:

```
error: Failed to convert URL to path: https://cdn.jsdelivr.net/pyodide/v0.27.7/full/sniffio-1.3.1-py3-none-any.whl
```
2025-06-26 19:42:04 +00:00
Jack O'Connor
d4d6ede23b Lock the source tree when running setuptools, to protect concurrent builds
Fixes https://github.com/astral-sh/uv/issues/13703
2025-06-26 12:28:15 -07:00
Jack O'Connor
60528e3e25 Annotate LockedFile with #[must_use]
Standard lock guards have the same annotation, because creating them
without binding them to a local variable is almost always a mistake.
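
A minimal illustration of why the annotation helps (not uv's actual
`LockedFile` type): without a binding, the guard is dropped, and the lock
released, on the same line it was acquired.

```rust
#[must_use = "the lock is released as soon as the guard is dropped"]
struct LockGuard;

impl LockGuard {
    fn acquire() -> LockGuard {
        LockGuard
    }
}

fn main() {
    // Triggers the `unused_must_use` lint: the guard is dropped immediately,
    // so nothing is actually protected.
    LockGuard::acquire();

    // Correct: bind the guard so it lives for the whole critical section.
    let _guard = LockGuard::acquire();
}
```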
2025-06-26 12:28:15 -07:00
Zanie Blue
1ff8fc0947
Use Flit instead of Poetry for uninstall tests (#14285)
Investigating https://github.com/astral-sh/uv/issues/14158
2025-06-26 18:09:04 +00:00
Zanie Blue
8c27c2b494
Add verbose output on flake for run_groups_requires_python (#14275)
See https://github.com/astral-sh/uv/issues/14160

Same as https://github.com/astral-sh/uv/pull/13817
2025-06-26 12:11:34 -05:00
Zanie Blue
d27cec78b4
Restore snapshot for sync_dry_run (#14274)
In addition to our flake catch, keep a snapshot.

Extends https://github.com/astral-sh/uv/pull/13817
2025-06-26 16:23:37 +00:00
Aria Desires
1e02008d8b
add more proper docker login if (#14278) 2025-06-26 12:05:45 -04:00
Zanie Blue
469246d177
Fix emit_index_annotation_multiple_indexes test case (#14277)
uv is taken on Test PyPI now, so the existing test fails
2025-06-26 14:58:24 +00:00
John Mumm
a27e60a22f
Temporarily disable Artifactory registry test (#14276)
I'm waiting on a response to get our subscription back up. Then I can
re-enable this. But for now, this would cause failing CI tests.
2025-06-26 09:47:18 -05:00
Daniel Vianna
4b348512c2
GCP Artifact Registry download URLs must have /simple path (#14251)
2025-06-25 17:35:41 +02:00
John Mumm
4ed9c5791b
Bump version to 0.7.15 (#14254)
2025-06-25 12:06:41 +02:00
John Mumm
177df19f30
Add check for using minor version link when creating a venv on Windows (#14252)
There was a regression introduced in #13954 on Windows where creating a
venv behaved as if there was a minor version link even if none existed.
This PR adds a check to fix this.

Closes #14249.
2025-06-25 10:12:32 +02:00
John Mumm
5b2c3595a7
Require disambiguated relative paths for --index (#14152)
We do not currently support passing index names to `--index` for
installing packages. However, we do accept relative paths that can look
like index names. This PR adds the requirement that `--index` values
must be disambiguated with a prefix (`./` or `../` on Unix and Windows
or `.\\` or `..\\` on Windows). For now, if an ambiguous value is
provided, uv will warn that this will not be supported in the future.

Currently, if you provide an index name like `--index test` when there
is no `test` directory, uv will error with a `Directory not found...`
error. That's not very informative if you thought index names were
supported. The new warning makes the context clearer.
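
A rough sketch of the ambiguity check on plain strings (the real `--index`
parsing also handles URLs and named indexes):

```rust
fn is_disambiguated_path(value: &str) -> bool {
    value.starts_with("./")
        || value.starts_with("../")
        || (cfg!(windows) && (value.starts_with(".\\") || value.starts_with("..\\")))
}

fn main() {
    assert!(is_disambiguated_path("./packages"));
    // A bare `test` is ambiguous: it could be a directory or a (future) index
    // name, so uv warns instead of silently treating it as a path.
    assert!(!is_disambiguated_path("test"));
}
```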

Closes #13921
2025-06-25 10:02:06 +02:00
konsti
283323a78a
Allow symlinks in the build backend (#14212)
In workspaces with multiple packages, you usually don't want to include
shared files such as the license repeatedly. Instead, we read from
symlinked files. This would have been supported if we had used std's `is_file`
and read methods, but walkdir's `is_file` does not consider symlinked
files as files.
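
A hedged sketch of the difference that caused this, assuming the `walkdir`
crate with default settings (no `follow_links`):

```rust
use std::path::Path;
use walkdir::WalkDir;

fn main() {
    for entry in WalkDir::new(".").into_iter().filter_map(Result::ok) {
        let path: &Path = entry.path();
        // For `LICENSE -> ../LICENSE`, walkdir's `file_type().is_file()` is
        // false (it's a symlink), while std's `Path::is_file()` follows the
        // link; the build backend now uses the latter semantics.
        if !entry.file_type().is_file() && path.is_file() {
            println!("symlinked file: {}", path.display());
        }
    }
}
```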

See https://github.com/astral-sh/uv/issues/3957#issuecomment-2994675003
2025-06-25 07:44:22 +00:00
ya7010
ac788d7cde
Update schemars 1.0.0 (#13693)

## Summary
Update [schemars
0.9.0](https://github.com/GREsau/schemars/releases/tag/v0.9.0)

There are differences in the generated JSON Schema and I will [contact
the author](https://github.com/GREsau/schemars/issues/407).

## Test Plan

---------

Co-authored-by: konstin <konstin@mailbox.org>
2025-06-24 21:43:31 +02:00
Christopher Tee
9fba7a4768
Consistently use Ordering::Relaxed for standalone atomic use cases (#14190) 2025-06-24 12:30:26 -07:00
Christopher Tee
fe11ceedfa
Skip GitHub fast path when rate-limited (#13033) 2025-06-24 12:11:41 -07:00
dmitry-bychkov
61265b0c14
Add a link to PyPI FAQ to clarify what per-project token is. (#14242)

## Summary

This change adds a link to the PyPI FAQ about API tokens on the package
publishing guide page. To me it wasn't clear what was meant in this
section of the docs, and it required a little bit of research. Adding an
explicit link might help beginners.


Co-authored-by: Dmitry Bychkov <dbychkov@alarislabs.com>
2025-06-24 11:56:36 -04:00
Charlie Marsh
606633d35f
Remove wheel filename benchmark (#14240)
## Summary

This flakes often and we don't really need it to be monitored
continuously. We can always revive it from Git later.

Closes https://github.com/astral-sh/uv/issues/13952.
2025-06-24 11:54:12 -04:00
konsti
f20659e1ce
Don't log GitHub fast path usage if it's cached (#14235)
Don't log that we resolved a reference through the GitHub fast path if
we didn't use GitHub at all but used the cached revision. This avoids
stating that the fast path works when it's blocked due to unrelated
reasons (e.g. rate limits).
2025-06-24 11:53:10 -04:00
Charlie Marsh
093e9d6ff0
Add "python-eol" feature to Sphinx tests (#14241)
## Summary

Closes https://github.com/astral-sh/uv/issues/14228.
2025-06-24 11:15:18 -04:00
Ben Beasley
19c58c7fbb
Update wiremock to 0.6.4 (#14238)

## Summary

In e10881d49c, `uv` started using a fork
of the `wiremock` crate, https://github.com/astral-sh/wiremock-rs,
linking companion PR
https://github.com/LukeMathWalker/wiremock-rs/pull/159. That PR was
merged in `wiremock` 0.6.4, so this PR switches back to the crates.io
version of `wiremock`, with a minimum version of 0.6.4.

## Test Plan

```
$ cargo run python install
$ cargo test
```
2025-06-24 13:04:55 +00:00
Charlie Marsh
aa2448ef83
Strip query parameters when parsing source URL (#14224)
## Summary

Closes https://github.com/astral-sh/uv/issues/14217.
2025-06-23 14:52:07 -04:00
Charlie Marsh
d9351d52fc
Remove wheel filename-from URL conversion (#14223)
## Summary

This appears to be unused.
2025-06-23 14:26:14 -04:00
Aria Desires
e7f5967111
Don't block docker logins on it being a pull-request (#14222) 2025-06-23 13:30:49 -04:00
Aria Desires
92de53f4eb
Bump version to 0.7.14 (#14218) 2025-06-23 12:48:51 -04:00
John Mumm
2d2dd0c1a3
Update upgrade tests to use 3.10.17 instead of 3.10.8 (#14219)
@oconnor663 discovered that executing `3.10.8` on Arch Linux ran into an
error loading `libcrypt.so.1`. This caused uv to install the latest
patch version on `uv venv` operations during upgrade tests, which
undermined their purpose (since they are checking that if you first
install `3.10.8` and then upgrade, virtual environments are
transparently upgraded). This PR updates the test to use `3.10.17`
instead to avoid this issue.
2025-06-23 18:19:36 +02:00
John Mumm
b06dec8398
Improve Python uninstall perf by removing unnecessary call to installations.find_all() (#14180)
#13954 introduced an unnecessary slow-down to Python uninstall by
calling `installations.find_all()` to discover remaining installations
after an uninstall. Instead, we can filter all initial installations
against those in `uninstalled`.

As part of this change, I've updated `uninstalled` from a `Vec` to an
`IndexSet` in order to do efficient lookups in the filter. This required
a change I call out below to how we were retrieving them for messaging.
2025-06-23 15:51:44 +02:00
John Mumm
6481aa3e64
Consolidate logic for checking for a virtual environment (#14214)
We were checking whether a path was an executable in a virtual
environment or the base directory of a virtual environment in multiple
places in the codebase. This PR consolidates this logic into one place.

Closes #13947.
2025-06-23 15:12:43 +02:00
renovate[bot]
a9a9e71481
Update google-github-actions/setup-gcloud digest to a8b5801 (#14205)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| google-github-actions/setup-gcloud | action | digest | `77e7a55` ->
`a8b5801` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-23 10:20:59 +00:00
renovate[bot]
ac1405e06c
Update Rust crate syn to v2.0.104 (#14208)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [syn](https://redirect.github.com/dtolnay/syn) |
workspace.dependencies | patch | `2.0.103` -> `2.0.104` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>dtolnay/syn (syn)</summary>

###
[`v2.0.104`](https://redirect.github.com/dtolnay/syn/releases/tag/2.0.104)

[Compare
Source](https://redirect.github.com/dtolnay/syn/compare/2.0.103...2.0.104)

- Disallow attributes on range expression
([#&#8203;1872](https://redirect.github.com/dtolnay/syn/issues/1872))

</details>

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-23 12:07:36 +02:00
renovate[bot]
3e9dbe8b7d
Update Rust crate mimalloc to v0.1.47 (#14207)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [mimalloc](https://redirect.github.com/purpleprotocol/mimalloc_rust) |
dependencies | patch | `0.1.46` -> `0.1.47` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>purpleprotocol/mimalloc_rust (mimalloc)</summary>

###
[`v0.1.47`](https://redirect.github.com/purpleprotocol/mimalloc_rust/releases/tag/v0.1.47):
Version 0.1.47

[Compare
Source](https://redirect.github.com/purpleprotocol/mimalloc_rust/compare/v0.1.46...v0.1.47)

##### Changes

- Mimalloc `v2.2.4`

</details>

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-23 12:07:09 +02:00
renovate[bot]
46221b40c3
Update EmbarkStudios/cargo-deny-action action to v2.0.12 (#14206)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[EmbarkStudios/cargo-deny-action](https://redirect.github.com/EmbarkStudios/cargo-deny-action)
| action | patch | `v2.0.11` -> `v2.0.12` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>EmbarkStudios/cargo-deny-action
(EmbarkStudios/cargo-deny-action)</summary>

###
[`v2.0.12`](https://redirect.github.com/EmbarkStudios/cargo-deny-action/releases/tag/v2.0.12):
Release 2.0.12 - cargo-deny 0.18.3

[Compare
Source](https://redirect.github.com/EmbarkStudios/cargo-deny-action/compare/v2.0.11...v2.0.12)

##### Changed

-
[PR#773](https://redirect.github.com/EmbarkStudios/cargo-deny/pull/773)
changed cargo-deny's duplicate detection to automatically ignore
versions whose only dependent is another version of the same crate.

</details>

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-23 12:06:20 +02:00
renovate[bot]
a52595b61a
Update google-github-actions/auth digest to 0920706 (#14204)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| google-github-actions/auth | action | digest | `ba79af0` -> `0920706`
|

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-23 12:04:50 +02:00
renovate[bot]
7fce3a88b8
Update aws-actions/configure-aws-credentials digest to 3bb878b (#14203)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| aws-actions/configure-aws-credentials | action | digest | `b475783` ->
`3bb878b` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-23 12:04:10 +02:00
Charlie Marsh
a82c210cab
Add auto-detection for AMD GPUs (#14176)
## Summary

Allows `--torch-backend=auto` to detect AMD GPUs. The approach is fairly
well-documented inline, but I opted for `rocm_agent_enumerator` over
(e.g.) `rocminfo` since it seems to be the recommended approach for
scripting:
https://rocm.docs.amd.com/projects/rocminfo/en/latest/how-to/use-rocm-agent-enumerator.html.
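
For illustration, the detection roughly amounts to checking that tool's
output; the exact output varies by machine, and the target shown below is
just an example:

```
# Lists the GPU architecture targets present on the machine
$ rocm_agent_enumerator
gfx942
```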

Closes https://github.com/astral-sh/uv/issues/14086.

## Test Plan

```
root@rocm-jupyter-gpu-mi300x1-192gb-devcloud-atl1:~# ./uv-linux-libc-11fb582c5c046bae09766ceddd276dcc5bb41218/uv pip install torch --torch-backend=auto
Resolved 11 packages in 251ms
Prepared 2 packages in 6ms
Installed 11 packages in 257ms
 + filelock==3.18.0
 + fsspec==2025.5.1
 + jinja2==3.1.6
 + markupsafe==3.0.2
 + mpmath==1.3.0
 + networkx==3.5
 + pytorch-triton-rocm==3.3.1
 + setuptools==80.9.0
 + sympy==1.14.0
 + torch==2.7.1+rocm6.3
 + typing-extensions==4.14.0
```

---------

Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-06-21 15:21:06 +00:00
Zanie Blue
f0407e4b6f
Only use the release environment in Docker builds on push (#14189)
This drastically reduces the `release` deployment noise, e.g., as seen
in https://github.com/astral-sh/uv/pull/14165
2025-06-21 14:36:07 +00:00
Zanie Blue
b18f45db14
Restore Docker image annotations and fix attestations (#14165)
More follow-up to #13459 

- Depot doesn't support annotations, so we push those manually
- Docker push for the re-tag was breaking the manifest; since we need to
annotate manually, we just do that instead
- We attest after the annotation

A bit of an aside

- We test building the extra images; it's very fast and I don't see why
it's better to gate it

I tested this on my fork then cleaned it up a bit for a commit here. You
can see the images at

- https://github.com/zanieb/uv/pkgs/container/uv 
- https://hub.docker.com/r/astral/uv/tags

---------

Co-authored-by: samypr100 <3933065+samypr100@users.noreply.github.com>
2025-06-21 09:01:20 -05:00
John Mumm
8352560b98
Only update existing symlink directories on preview uninstall (#14179)
On preview uninstall, we should not create a new minor version symlink
directory if one doesn't exist. We should only update existing ones.
2025-06-21 09:50:41 +02:00
Charlie Marsh
0fef253c4b
Use a dedicated type for form metadata (#14175)
2025-06-20 20:33:29 -04:00
Charlie Marsh
e59835d50c
Add XPU to --torch-backend (#14172)
## Summary

Like ROCm, no auto-detection for now.
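
For illustration, explicit selection might look like this (assuming the new
backend name):

```
$ uv pip install torch --torch-backend=xpu
```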
2025-06-20 20:33:20 -04:00
Jack O'Connor
0133bcc8ca
(f)lock during uv run (#14153)
This is very similar to the locking added for `uv sync`, `uv add`, and
`uv remove` in https://github.com/astral-sh/uv/pull/13869. Improving our
(f)locking in general is tracked in
https://github.com/astral-sh/uv/issues/13883.
2025-06-20 20:34:45 +00:00
Zanie Blue
1dbe750452
Refactor PythonVersionFile global loading (#14107)
I was looking into `uv tool` not supporting version files, and noticed
this implementation was confusing and skipped handling, such as a tracing
log when `--no-config` excludes selecting a file. I've refactored it in
preparation for the next change.
2025-06-20 15:31:32 -05:00
Lucas Vittor
563e9495ba
Replace cuda124 with cuda128 (#14168)
## Summary


Replace the wrong `cuda124` version with the correct `cuda128` version in
the torch docs

## Test Plan

2025-06-20 16:05:17 -04:00
Aria Desires
c710246d76
Update cargo-dist (#14156)
Also took the time to migrate to the external config format to normalize
our projects for team comfort (`ty` *has* to use this format for its
workspace structure).
2025-06-20 14:20:01 -04:00
John Mumm
e9d5780369
Support transparent Python patch version upgrades (#13954)
> NOTE: The PRs that were merged into this feature branch have all been
independently reviewed. But it's also useful to see all of the changes
in their final form. I've added comments to significant changes
throughout the PR to aid discussion.

This PR introduces transparent Python version upgrades to uv, allowing
for a smoother experience when upgrading to new patch versions.
Previously, upgrading Python patch versions required manual updates to
each virtual environment. Now, virtual environments can transparently
upgrade to newer patch versions.

Due to significant changes in how uv installs and executes managed
Python executables, this functionality is initially available behind a
`--preview` flag. Once an installation has been made upgradeable through
`--preview`, subsequent operations (like `uv venv -p 3.10` or patch
upgrades) will work without requiring the flag again. This is
accomplished by checking for the existence of a minor version symlink
directory (or junction on Windows).

### Features

* New `uv python upgrade` command to upgrade installed Python versions
to the latest available patch release:
``` 
# Upgrade specific minor version 
uv python upgrade 3.12 --preview
# Upgrade all installed minor versions
uv python upgrade --preview
```
* Transparent upgrades also occur when installing newer patch versions: 
```
uv python install 3.10.8 --preview
# Automatically upgrades existing 3.10 environments
uv python install 3.10.18
```
* Support for transparently upgradeable Python `bin` installations via
`--preview` flag
```
uv python install 3.13 --preview
# Automatically upgrades the `bin` installation if there is a newer patch version available
uv python upgrade 3.13 --preview
```
* Virtual environments can still be tied to a patch version if desired
(ignoring patch upgrades):
```
uv venv -p 3.10.8
```

### Implementation

Transparent upgrades are implemented using:
* Minor version symlink directories (Unix) or junctions (Windows)
* On Windows, trampolines simulate paths with junctions
* Symlink directory naming follows Python build standalone format: e.g.,
`cpython-3.10-macos-aarch64-none`
* Upgrades are scoped to the minor version key (as represented in the
naming format: implementation-minor version+variant-os-arch-libc)
* If the context does not provide a patch version request and the
interpreter is from a managed CPython installation, the `Interpreter`
used by `uv python run` will use the full symlink directory executable
path when available, enabling transparently upgradeable environments
created with the `venv` module (`uv run python -m venv`)

New types:
* `PythonMinorVersionLink`: in a sense, the core type for this PR, this
is a representation of a minor version symlink directory (or junction on
Windows) that points to the highest installed managed CPython patch
version for a minor version key.
* `PythonInstallationMinorVersionKey`: provides a view into a
`PythonInstallationKey` that excludes the patch and prerelease. This is
used for grouping installations by minor version key (e.g., to find the
highest available patch installation for that minor version key) and for
minor version directory naming.

### Compatibility

* Supports virtual environments created with:
  * `uv venv`
  * `uv run python -m venv` (using managed Python that was installed or
    upgraded with `--preview`)
  * Virtual environments created within these environments
* Existing virtual environments from before these changes continue to
work but aren't transparently upgradeable without being recreated
* Supports both standard Python (`python3.10`) and freethreaded Python
(`python3.10t`)
* Support for transparent upgrades is currently only available for
managed CPython installations

Closes #7287
Closes #7325
Closes #7892
Closes #9031
Closes #12977

---------

Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-06-20 16:17:13 +02:00
John Mumm
62365d4ec8
Support netrc and same-origin credential propagation on index redirects (#14126)
This PR is a combination of #12920 and #13754. Prior to these changes,
following a redirect when searching indexes would bypass our
authentication middleware. This PR updates uv to support propagating
credentials through our middleware on same-origin redirects and to
support netrc credentials for both same- and cross-origin redirects. It
does not handle the case described in #11097 where the redirect location
itself includes credentials (e.g.,
`https://user:pass@redirect-location.com`). That will be addressed in
follow-up work.
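
For illustration, a `~/.netrc` entry of the usual form is what uv can now
also apply when an index redirects (hostname and credentials below are
placeholders):

```
$ cat ~/.netrc
machine private-index.example.com
login my-user
password my-token
```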

This includes unit tests for the new redirect logic and integration
tests for credential propagation. The automated external registries test
is also passing for AWS CodeArtifact, Azure Artifacts, GCP Artifact
Registry, JFrog Artifactory, GitLab, Cloudsmith, and Gemfury.
2025-06-20 09:21:32 +02:00
Jack O'Connor
cc8d5a9215
handle an existing shebang in uv init --script (#14141)
Closes https://github.com/astral-sh/uv/issues/14085.
2025-06-19 14:47:22 -07:00
Jack O'Connor
c3e4b63806 document the way member sources shadow workspace sources
Closes https://github.com/astral-sh/uv/issues/14093.
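
For illustration, a sketch of the behavior being documented (names and paths
are made up; the member's own entry takes precedence):

```
# Workspace root pyproject.toml (abbreviated)
$ cat pyproject.toml
[tool.uv.sources]
shared-lib = { path = "libs/shared-lib" }

# Member pyproject.toml: this entry shadows the workspace-level source
$ cat packages/member-a/pyproject.toml
[tool.uv.sources]
shared-lib = { git = "https://github.com/example/shared-lib" }
```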
2025-06-18 15:31:23 -07:00
Zanie Blue
e1046242e7
Fix Docker attestations (#14133)
These regressed in #14088 and were found during my test publish from a
fork.
2025-06-18 18:46:30 +00:00
Zanie Blue
1fc65a1d9d
Publish to DockerHub (#14088)
The primary motivation here is to avoid confusion with non-official
repositories, e.g., https://github.com/astral-sh/uv/issues/13958 which
could lead to attacks against our users.

Resolves

- https://github.com/astral-sh/uv/issues/12679
- #8699
2025-06-18 11:30:37 -05:00
Zanie Blue
75d4cd30d6
Use Depot for Windows cargo test (#14122)
Replaces https://github.com/astral-sh/uv/pull/12320

Switches to Depot for the large Windows runner we use for `cargo test`.

The runtime goes from 8m 20s -> 6m 44s (total) and 7m 18s -> 4m 41s
(test run) which are 20% and 35% speedups respectively.

A few things got marginally slower, like Python installs went from 11s
-> 38s, the Rust cache went from 15s -> 30s, and drive setup went from
7s -> 20s.
2025-06-18 09:55:09 -05:00
John Mumm
611a13c841
Fix benchmark compilation failure: cannot find attribute clap in this scope (#14128)
[Two benchmark
jobs](4433771099)
were failing with `error: cannot find attribute clap in this scope`
based on #14120. This updates the recently added `#[clap(name = rocm...`
lines to use `cfg_attr(feature = "clap",`.
2025-06-18 16:30:12 +02:00
konsti
ee0ba65eb2
Unify test venv python command creation (#14117)
Refactoring in preparation for
https://github.com/astral-sh/uv/pull/14080
2025-06-18 15:06:09 +02:00
Charlie Marsh
4d9c9a1e76
Add ROCm backends to --torch-backend (#14120)
We don't yet support automatic detection, but this at least allows
explicit selection (e.g., `uv pip install --torch-backend rocm5.3`).

Closes #14087.
2025-06-18 07:35:05 -04:00
Aaron Ang
0cac73dc1f
Warn on empty index directory (#13940)
Close #13922

## Summary

Add a warning if the directory given by the `--index` argument is empty.

## Test Plan

Added test case `add_index_empty_directory` in `edit.rs`
2025-06-18 10:48:21 +02:00
konsti
499c8aa808
Fix PyPI publish test script (#14116)
The script stumbled over a newline introduced in
https://github.com/pypi/warehouse/pull/18266 (which is valid).

Also fixed: Don't read versions for the same package from other indexes.
We were using `project_name` here instead of `target`; using the
latter and only reading from a single index also simplifies the code.
2025-06-18 09:51:53 +02:00
John Mumm
2fc922144a
Add script for testing uv against different registries (#13615)
This PR provides a script that uses environment variables to determine
which registries to test. This script is being used to run automated
registry tests in CI for AWS, Azure, GCP, Artifactory, GitLab,
Cloudsmith, and Gemfury.

You must configure the following required env vars for each registry:
```
    UV_TEST_<registry_name>_URL            URL for the registry
    UV_TEST_<registry_name>_TOKEN       authentication token
    UV_TEST_<registry_name>_PKG          private package to install
```

The username defaults to "\_\_token\_\_" but can be optionally set with:
```
    UV_TEST_<registry_name>_USERNAME
```

For each configured registry, the test will attempt to install the
specified package. Some registries can fall back to PyPI internally, so
it's important to choose a package that only exists in the registry you
are testing.

Currently, a successful test means that it finds the line “ +
<package_name>” in the output. This is because in its current form we
don’t know ahead of time what package it is, and hence what the exact
expected output would be. The advantage is that anyone can run this
locally, though they would need access to the registries they want to
test.

You can also use the `--use-op` command line argument to derive these
test env vars from a 1Password vault (default is "RegistryTests" but can
be configured with `--op-vault`). It will look at all items in the vault
with names following the pattern `UV_TEST_<registry_name>` and will
derive the env vars as follows:

```
    `UV_TEST_<registry_name>_USERNAME` from the `username` field
    `UV_TEST_<registry_name>_TOKEN` from the `password` field
    `UV_TEST_<registry_name>_URL` from a field with the label `url`
    `UV_TEST_<registry_name>_PKG` from a field with the label `pkg`
```
2025-06-18 09:43:45 +02:00
Zanie Blue
47c522f9be
Serialize Python requests for tools as canonicalized strings (#14109)
When working on support for reading global Python pins in tool
operations, I noticed that we weren't using the canonicalized Python
request in receipts — we were using the raw string provided by the user.
Since we'll need to compare these values, we should be using the
canonicalized string.

The `Tool` and `ToolReceipt` types have been updated to hold a
`PythonRequest` instead of a `String`, and `Serialize` was implemented
for `PythonRequest` so canonicalization can happen at the edge instead
of being the caller's responsibility.
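
A minimal sketch of the approach, using a hypothetical canonicalization helper in place of uv's real one (the actual `PythonRequest` type is richer than this):

```rust
use serde::{Serialize, Serializer};

// Hypothetical stand-in; uv's real `PythonRequest` is an enum with many variants.
struct PythonRequest(String);

impl PythonRequest {
    // Assumed normalization helper; uv's actual canonicalization rules differ.
    fn to_canonical_string(&self) -> String {
        self.0.trim().to_lowercase()
    }
}

// Serializing the canonical form "at the edge" means every tool receipt stores
// the same spelling, so later comparisons don't depend on how the user typed it.
impl Serialize for PythonRequest {
    fn serialize<S: Serializer>(&self, serializer: S) -> Result<S::Ok, S::Error> {
        serializer.serialize_str(&self.to_canonical_string())
    }
}
```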
2025-06-17 17:45:11 -05:00
Charlie Marsh
6c096246d8
Remove preview label from --torch-backend (#14119)
This is now used in enough places that I'm comfortable committing to
maintaining it under our versioning policy.

Closes #14091.
2025-06-17 16:49:00 -04:00
Zanie Blue
8808e67cff
Add a docker-plan step to consolidate push and tag logic (#14083)
The dist plan parsing is pretty hard to understand, and I want to add
more images, e.g., for DockerHub in #14088. As a simplifying
precursor... move the dist plan processing into a dedicated step.
2025-06-17 15:04:39 -05:00
Zanie Blue
c25c800367
Fix Ruff linting (#14111)
2025-06-17 17:28:23 +00:00
konsti
10e1d17cfc
Don't use walrus operator in interpreter query script (#14108)
Fix `uv run -p 3.7` by not using a walrus operator. Python 3.7 isn't
really supported anymore, but there's no reason to break interpreter
discovery for it.
2025-06-17 12:18:08 -05:00
Andrew Gallant
3d4f0c934e
Fix handling of changes to requires-python (#14076)
When using `uv lock --upgrade-package=python` after changing
`requires-python`, it was possible to get into a state where the fork
markers produced corresponded to the empty set. This in turn resulted in
an empty lock file.

There was already some infrastructure in place that I think was perhaps
intended to handle this. In particular, `Lock::check_marker_coverage`
checks whether the fork markers have some overlap with the supported
environments (including the `requires-python`). But there were two
problems with this.

First is that in lock validation, this marker coverage check came
_after_ a path that returned `Preferable` (meaning that the fork markers
should be kept) when `--upgrade-package` was used. Second is that the
marker coverage check used the `requires-python` in the lock file and
_not_ the `requires-python` in the now updated `pyproject.toml`.

We attempt to solve this conundrum by slightly re-arranging lock file
validation and by explicitly checking whether the *new*
`requires-python` is disjoint from the fork markers in the lock file. If
it is, then we return `Versions` from lock file validation (indicating
that the fork markers should be dropped).
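
The gist of the re-arrangement, as a rough sketch with hypothetical stand-in types (uv's real lock validation is considerably more involved):

```rust
struct MarkerExpr;
struct Lock {
    fork_markers: Vec<MarkerExpr>,
}

enum SatisfiesResult {
    Versions,   // drop the stale fork markers and re-fork from scratch
    Preferable, // keep the fork markers, e.g. for `--upgrade-package`
    Satisfied,  // lock file is still valid as-is
}

// Placeholder for the real marker algebra in uv's PEP 508 handling.
fn is_disjoint(_fork_marker: &MarkerExpr, _requires_python: &MarkerExpr) -> bool {
    false
}

fn validate(lock: &Lock, new_requires_python: &MarkerExpr, upgrade_package: bool) -> SatisfiesResult {
    // Check the *new* requires-python against the fork markers before the
    // `--upgrade-package` path can return `Preferable` and keep markers that
    // now describe an empty set of environments.
    if !lock.fork_markers.is_empty()
        && lock
            .fork_markers
            .iter()
            .all(|marker| is_disjoint(marker, new_requires_python))
    {
        return SatisfiesResult::Versions;
    }
    if upgrade_package {
        return SatisfiesResult::Preferable;
    }
    // ... the rest of lock validation (versions, hashes, etc.) runs here.
    SatisfiesResult::Satisfied
}
```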

Fixes #13951
2025-06-17 11:50:05 -04:00
FishAlchemist
d653fbb133
doc: Sync PyTorch integration index for CUDA and ROCm versions from PyTorch website. (#14100)
## Summary
Just to sync the documentation with PyTorch's officially recommended
installation method.

![image](https://github.com/user-attachments/assets/7be0aea6-b51b-4083-acae-fbe8aa79c363)
**Change:**
* CUDA 12.1 to CUDA 12.6
* CUDA 12.4 to CUDA 12.8
* ROCm 6.2 to ROCm 6.3
## Test Plan
Run the doc server locally.
Result:
<details><summary>CUDA 12.6</summary>
<p>


![image](https://github.com/user-attachments/assets/962a1058-3922-4709-a57f-200939d0a397)

![image](https://github.com/user-attachments/assets/0dae693f-2dc1-4d3e-9186-d26aa84c7d73)

</p>
</details> 
<details><summary>CUDA 12.8</summary>
<p>


![image](https://github.com/user-attachments/assets/dc12d09b-f8f2-41d3-b185-cb1e4e8b062b)

![image](https://github.com/user-attachments/assets/1b5490e3-6265-4bad-8af2-eab0a207adab)

</p>
</details> 
<details><summary>ROCm6</summary>
<p>


![image](https://github.com/user-attachments/assets/3b17a24f-3210-4a99-a81b-f8f2f5578b73)

![image](https://github.com/user-attachments/assets/c7baae55-14e5-45d0-94a0-164ab31bbffc)

</p>
</details>
2025-06-17 09:55:03 -04:00
John Mumm
e02cd74e64
Turn off clippy::struct_excessive_bools rule (#14102)
We always ignore the `clippy::struct_excessive_bools` rule and formerly
annotated this at the function level. This PR specifies the allow in
`workspace.lints.clippy` in `Cargo.toml`.
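
For reference, the workspace-level allow looks roughly like this, assuming member crates already opt into workspace lints via `[lints] workspace = true`:

```toml
# Cargo.toml at the workspace root
[workspace.lints.clippy]
struct_excessive_bools = "allow"
```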
2025-06-17 12:18:54 +02:00
Zanie Blue
cf67d9c633
Clear XDG_DATA_HOME during test runs (#14075)
2025-06-16 15:01:17 -05:00
Kyle Galbraith
d73d3e8b53
Refactor Docker image builds to use Depot (#13459)
## Summary

Simplify the Docker image build process to leverage Depot container
builders for faster layer caching and native multi-platform image
builds. The combo of the two removes the need to save cache to
registries and do complex merge operations across GHA runners to get a
multi-platform image.

## Test Plan

UV team will need to add a trust relationship in Depot so that the
container builds can authenticate and run. This can be done following
these docs:
https://depot.dev/docs/cli/authentication#adding-a-trust-relationship-for-github-actions.

Once that is done, this should just work as before, but without all of
the extra work around manifests. We should double-check that all of the
tagging still makes sense for you all, as some bits of that were
unclear.

Additional context in this draft PR:
https://github.com/astral-sh/uv/pull/9156

---------

Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-06-16 13:32:35 -05:00
konsti
423cfaabf5
Show backtraces for CI crashes (#14081)
I only just realized that we can get backtraces for crashes with
`RUST_BACKTRACE: 1`, even non-panics
(https://github.com/astral-sh/uv/pull/14079). This is much better than
trying to analyze crash dumps.
2025-06-16 17:10:21 +00:00
Andrew Gallant
5c1ebf902b
Add rustfmt.toml (#14072)
We've gotten away without this file for a while. In particular, we
explicitly use its default settings.

However, this is occasionally problematic in certain contexts where
`rustfmt` is invoked directly. Or in contexts where the Rust Edition is
otherwise not specified. At least, this happens when using the Rust vim
plugin. When an edition isn't explicitly specified, it defaults back to
the 2015 edition.

I think that there aren't a lot of rustfmt changes, and so we've been
able to get away with this for a while. But it looks like something in
the 2024 edition changes how imports are ordered. So to make it explicit
that we want to use the 2024 edition of rustfmt, we opt into it.
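
The file itself is tiny; a sketch of the idea, though the exact key used in the PR (e.g. `edition` vs. `style_edition`) is an assumption here:

```toml
# rustfmt.toml
edition = "2024"
```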

This is analogous to a change made to the Ruff repository somewhat
recently: https://github.com/astral-sh/ruff/pull/18197
2025-06-16 10:05:56 -04:00
konsti
cd71ad1672
Show retries for HTTP status code errors (#13897)
Using a companion change in the middleware
(https://github.com/TrueLayer/reqwest-middleware/pull/235, forked and tagged
pending review), we can check and show retries for HTTP status code
errors, to consistently report retries again.

We fix two cases:
* Show retries for status code errors for cache client requests
* Show retries for status code errors for Python download requests

Not handled:
* Show previous retries when a distribution download fails mid-streaming
* Perform retries when a distribution download fails mid-streaming
* Show previous retries when a Python download fails mid-streaming
* Perform retries when a Python download fails mid-streaming
2025-06-16 10:14:00 +00:00
renovate[bot]
7c90c5be02
Update conda-incubator/setup-miniconda action to v3.2.0 (#14061)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[conda-incubator/setup-miniconda](https://redirect.github.com/conda-incubator/setup-miniconda)
| action | minor | `v3.1.1` -> `v3.2.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>conda-incubator/setup-miniconda
(conda-incubator/setup-miniconda)</summary>

###
[`v3.2.0`](https://redirect.github.com/conda-incubator/setup-miniconda/blob/HEAD/CHANGELOG.md#v320-2025-06-04)

[Compare
Source](https://redirect.github.com/conda-incubator/setup-miniconda/compare/v3.1.1...v3.2.0)

##### Fixes

- Check all `.condarc` files when removing `defaults` by
[@&#8203;marcoesters](https://redirect.github.com/marcoesters) in

[https://github.com/conda-incubator/setup-miniconda/pull/398](https://redirect.github.com/conda-incubator/setup-miniconda/pull/398)
- Add version normalization for minicondaVersion in input validation by
[@&#8203;jezdez](https://redirect.github.com/jezdez)

[https://github.com/conda-incubator/setup-miniconda/pull/397](https://redirect.github.com/conda-incubator/setup-miniconda/pull/397)
- Workaround for auto_activate_base deprecation by
[@&#8203;jaimergp](https://redirect.github.com/jaimergp) in

[https://github.com/conda-incubator/setup-miniconda/pull/402](https://redirect.github.com/conda-incubator/setup-miniconda/pull/402)

##### Tasks and Maintenance

- Bump conda-incubator/setup-miniconda from 3.1.0 to 3.1.1 by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in

[https://github.com/conda-incubator/setup-miniconda/pull/391](https://redirect.github.com/conda-incubator/setup-miniconda/pull/391)
- Bump undici from 5.28.4 to 5.28.5 by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in

[https://github.com/conda-incubator/setup-miniconda/pull/390](https://redirect.github.com/conda-incubator/setup-miniconda/pull/390)
- Bump semver and
[@&#8203;types/semver](https://redirect.github.com/types/semver) by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in

[https://github.com/conda-incubator/setup-miniconda/pull/399](https://redirect.github.com/conda-incubator/setup-miniconda/pull/399)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/uv).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-16 11:53:56 +02:00
renovate[bot]
77ec5f9b17
Update actions/setup-python action to v5.6.0 (#14060)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[actions/setup-python](https://redirect.github.com/actions/setup-python)
| action | minor | `v5` -> `v5.6.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>actions/setup-python (actions/setup-python)</summary>

###
[`v5.6.0`](https://redirect.github.com/actions/setup-python/releases/tag/v5.6.0)

[Compare
Source](https://redirect.github.com/actions/setup-python/compare/v5.5.0...v5.6.0)

##### What's Changed

- Workflow updates related to Ubuntu 20.04 by
[@&#8203;aparnajyothi-y](https://redirect.github.com/aparnajyothi-y) in
[https://github.com/actions/setup-python/pull/1065](https://redirect.github.com/actions/setup-python/pull/1065)
- Fix for Candidate Not Iterable Error by
[@&#8203;aparnajyothi-y](https://redirect.github.com/aparnajyothi-y) in
[https://github.com/actions/setup-python/pull/1082](https://redirect.github.com/actions/setup-python/pull/1082)
- Upgrade semver and
[@&#8203;types/semver](https://redirect.github.com/types/semver) by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[https://github.com/actions/setup-python/pull/1091](https://redirect.github.com/actions/setup-python/pull/1091)
- Upgrade prettier from 2.8.8 to 3.5.3 by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[https://github.com/actions/setup-python/pull/1046](https://redirect.github.com/actions/setup-python/pull/1046)
- Upgrade ts-jest from 29.1.2 to 29.3.2 by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[https://github.com/actions/setup-python/pull/1081](https://redirect.github.com/actions/setup-python/pull/1081)

**Full Changelog**:
https://github.com/actions/setup-python/compare/v5...v5.6.0

###
[`v5.5.0`](https://redirect.github.com/actions/setup-python/releases/tag/v5.5.0)

[Compare
Source](https://redirect.github.com/actions/setup-python/compare/v5.4.0...v5.5.0)

##### What's Changed

##### Enhancements:

- Support free threaded Python versions like '3.13t' by
[@&#8203;colesbury](https://redirect.github.com/colesbury) in
[https://github.com/actions/setup-python/pull/973](https://redirect.github.com/actions/setup-python/pull/973)
- Enhance Workflows: Include ubuntu-arm runners, Add e2e Testing for
free threaded and Upgrade
[@&#8203;action/cache](https://redirect.github.com/action/cache) from
4.0.0 to 4.0.3 by
[@&#8203;priya-kinthali](https://redirect.github.com/priya-kinthali) in
[https://github.com/actions/setup-python/pull/1056](https://redirect.github.com/actions/setup-python/pull/1056)
- Add support for .tool-versions file in setup-python by
[@&#8203;mahabaleshwars](https://redirect.github.com/mahabaleshwars) in
[https://github.com/actions/setup-python/pull/1043](https://redirect.github.com/actions/setup-python/pull/1043)

##### Bug fixes:

- Fix architecture for pypy on Linux ARM64 by
[@&#8203;mayeut](https://redirect.github.com/mayeut) in
[https://github.com/actions/setup-python/pull/1011](https://redirect.github.com/actions/setup-python/pull/1011)
This update maps arm64 to aarch64 for Linux ARM64 PyPy installations.

##### Dependency updates:

- Upgrade [@&#8203;vercel/ncc](https://redirect.github.com/vercel/ncc)
from 0.38.1 to 0.38.3 by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[https://github.com/actions/setup-python/pull/1016](https://redirect.github.com/actions/setup-python/pull/1016)
- Upgrade
[@&#8203;actions/glob](https://redirect.github.com/actions/glob) from
0.4.0 to 0.5.0 by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[https://github.com/actions/setup-python/pull/1015](https://redirect.github.com/actions/setup-python/pull/1015)

##### New Contributors

- [@&#8203;colesbury](https://redirect.github.com/colesbury) made their
first contribution in
[https://github.com/actions/setup-python/pull/973](https://redirect.github.com/actions/setup-python/pull/973)
- [@&#8203;mahabaleshwars](https://redirect.github.com/mahabaleshwars)
made their first contribution in
[https://github.com/actions/setup-python/pull/1043](https://redirect.github.com/actions/setup-python/pull/1043)

**Full Changelog**:
https://github.com/actions/setup-python/compare/v5...v5.5.0

###
[`v5.4.0`](https://redirect.github.com/actions/setup-python/releases/tag/v5.4.0)

[Compare
Source](https://redirect.github.com/actions/setup-python/compare/v5.3.0...v5.4.0)

##### What's Changed

##### Enhancements:

- Update cache error message by
[@&#8203;aparnajyothi-y](https://redirect.github.com/aparnajyothi-y) in
[https://github.com/actions/setup-python/pull/968](https://redirect.github.com/actions/setup-python/pull/968)
- Enhance Workflows: Add Ubuntu-24, Remove Python 3.8 by
[@&#8203;priya-kinthali](https://redirect.github.com/priya-kinthali) in
[https://github.com/actions/setup-python/pull/985](https://redirect.github.com/actions/setup-python/pull/985)
- Configure Dependabot settings by
[@&#8203;HarithaVattikuti](https://redirect.github.com/HarithaVattikuti)
in
[https://github.com/actions/setup-python/pull/1008](https://redirect.github.com/actions/setup-python/pull/1008)

##### Documentation changes:

- Readme update - recommended permissions by
[@&#8203;benwells](https://redirect.github.com/benwells) in
[https://github.com/actions/setup-python/pull/1009](https://redirect.github.com/actions/setup-python/pull/1009)
- Improve Advanced Usage examples by
[@&#8203;lrq3000](https://redirect.github.com/lrq3000) in
[https://github.com/actions/setup-python/pull/645](https://redirect.github.com/actions/setup-python/pull/645)

##### Dependency updates:

- Upgrade `undici` from 5.28.4 to 5.28.5 by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[https://github.com/actions/setup-python/pull/1012](https://redirect.github.com/actions/setup-python/pull/1012)
- Upgrade `urllib3` from 1.25.9 to 1.26.19 in /**tests**/data by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[https://github.com/actions/setup-python/pull/895](https://redirect.github.com/actions/setup-python/pull/895)
- Upgrade `actions/publish-immutable-action` from 0.0.3 to 0.0.4 by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[https://github.com/actions/setup-python/pull/1014](https://redirect.github.com/actions/setup-python/pull/1014)
- Upgrade `@actions/http-client` from 2.2.1 to 2.2.3 by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[https://github.com/actions/setup-python/pull/1020](https://redirect.github.com/actions/setup-python/pull/1020)
- Upgrade `requests` from 2.24.0 to 2.32.2 in /**tests**/data by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[https://github.com/actions/setup-python/pull/1019](https://redirect.github.com/actions/setup-python/pull/1019)
- Upgrade `@actions/cache` to `^4.0.0` by
[@&#8203;priyagupta108](https://redirect.github.com/priyagupta108) in
[https://github.com/actions/setup-python/pull/1007](https://redirect.github.com/actions/setup-python/pull/1007)

##### New Contributors

- [@&#8203;benwells](https://redirect.github.com/benwells) made their
first contribution in
[https://github.com/actions/setup-python/pull/1009](https://redirect.github.com/actions/setup-python/pull/1009)
-
[@&#8203;HarithaVattikuti](https://redirect.github.com/HarithaVattikuti)
made their first contribution in
[https://github.com/actions/setup-python/pull/1008](https://redirect.github.com/actions/setup-python/pull/1008)
- [@&#8203;lrq3000](https://redirect.github.com/lrq3000) made their
first contribution in
[https://github.com/actions/setup-python/pull/645](https://redirect.github.com/actions/setup-python/pull/645)

**Full Changelog**:
https://github.com/actions/setup-python/compare/v5...v5.4.0

###
[`v5.3.0`](https://redirect.github.com/actions/setup-python/releases/tag/v5.3.0)

[Compare
Source](https://redirect.github.com/actions/setup-python/compare/v5.2.0...v5.3.0)

#### What's Changed

- Add workflow file for publishing releases to immutable action package
by [@&#8203;Jcambass](https://redirect.github.com/Jcambass) in
[https://github.com/actions/setup-python/pull/941](https://redirect.github.com/actions/setup-python/pull/941)
- Upgrade IA publish by
[@&#8203;Jcambass](https://redirect.github.com/Jcambass) in
[https://github.com/actions/setup-python/pull/943](https://redirect.github.com/actions/setup-python/pull/943)

##### Bug Fixes:

- Normalise Line Endings to Ensure Cross-Platform Consistency by
[@&#8203;priya-kinthali](https://redirect.github.com/priya-kinthali) in
[https://github.com/actions/setup-python/pull/938](https://redirect.github.com/actions/setup-python/pull/938)
- Revise `isGhes` logic by
[@&#8203;jww3](https://redirect.github.com/jww3) in
[https://github.com/actions/setup-python/pull/963](https://redirect.github.com/actions/setup-python/pull/963)
- Bump pillow from 7.2 to 10.2.0 by
[@&#8203;aparnajyothi-y](https://redirect.github.com/aparnajyothi-y) in
[https://github.com/actions/setup-python/pull/956](https://redirect.github.com/actions/setup-python/pull/956)

##### Enhancements:

- Enhance workflows and documentation updates by
[@&#8203;priya-kinthali](https://redirect.github.com/priya-kinthali) in
[https://github.com/actions/setup-python/pull/965](https://redirect.github.com/actions/setup-python/pull/965)
- Bump default versions to latest by
[@&#8203;jeffwidman](https://redirect.github.com/jeffwidman) in
[https://github.com/actions/setup-python/pull/905](https://redirect.github.com/actions/setup-python/pull/905)

#### New Contributors

- [@&#8203;Jcambass](https://redirect.github.com/Jcambass) made their
first contribution in
[https://github.com/actions/setup-python/pull/941](https://redirect.github.com/actions/setup-python/pull/941)
- [@&#8203;jww3](https://redirect.github.com/jww3) made their first
contribution in
[https://github.com/actions/setup-python/pull/963](https://redirect.github.com/actions/setup-python/pull/963)

**Full Changelog**:
https://github.com/actions/setup-python/compare/v5...v5.3.0

###
[`v5.2.0`](https://redirect.github.com/actions/setup-python/releases/tag/v5.2.0)

[Compare
Source](https://redirect.github.com/actions/setup-python/compare/v5.1.1...v5.2.0)

#### What's Changed

##### Bug fixes:

- Add `.zip` extension to Windows package downloads for `Expand-Archive`
Compatibility by
[@&#8203;priyagupta108](https://redirect.github.com/priyagupta108) in
[https://github.com/actions/setup-python/pull/916](https://redirect.github.com/actions/setup-python/pull/916)
This addresses compatibility issues on Windows self-hosted runners by
ensuring that the filenames for Python and PyPy package downloads
explicitly include the .zip extension, allowing the Expand-Archive
command to function correctly.
- Add arch to cache key by
[@&#8203;Zxilly](https://redirect.github.com/Zxilly) in
[https://github.com/actions/setup-python/pull/896](https://redirect.github.com/actions/setup-python/pull/896)
This addresses issues with caching by adding the architecture (arch) to
the cache key, ensuring that cache keys are accurate to prevent
conflicts.
Note: This change may break previous cache keys as they will no longer
be compatible with the new format.

##### Documentation changes:

- Fix display of emojis in contributors doc by
[@&#8203;sciencewhiz](https://redirect.github.com/sciencewhiz) in
[https://github.com/actions/setup-python/pull/899](https://redirect.github.com/actions/setup-python/pull/899)
- Documentation update for caching poetry dependencies by
[@&#8203;gowridurgad](https://redirect.github.com/gowridurgad) in
[https://github.com/actions/setup-python/pull/908](https://redirect.github.com/actions/setup-python/pull/908)

##### Dependency updates:

- Bump [@&#8203;iarna/toml](https://redirect.github.com/iarna/toml)
version from 2.2.5 to 3.0.0 by
[@&#8203;priya-kinthali](https://redirect.github.com/priya-kinthali) in
[https://github.com/actions/setup-python/pull/912](https://redirect.github.com/actions/setup-python/pull/912)
- Bump pyinstaller from 3.6 to 5.13.1 by
[@&#8203;aparnajyothi-y](https://redirect.github.com/aparnajyothi-y) in
[https://github.com/actions/setup-python/pull/923](https://redirect.github.com/actions/setup-python/pull/923)

#### New Contributors

- [@&#8203;sciencewhiz](https://redirect.github.com/sciencewhiz) made
their first contribution in
[https://github.com/actions/setup-python/pull/899](https://redirect.github.com/actions/setup-python/pull/899)
- [@&#8203;priyagupta108](https://redirect.github.com/priyagupta108)
made their first contribution in
[https://github.com/actions/setup-python/pull/916](https://redirect.github.com/actions/setup-python/pull/916)
- [@&#8203;Zxilly](https://redirect.github.com/Zxilly) made their first
contribution in
[https://github.com/actions/setup-python/pull/896](https://redirect.github.com/actions/setup-python/pull/896)
- [@&#8203;aparnajyothi-y](https://redirect.github.com/aparnajyothi-y)
made their first contribution in
[https://github.com/actions/setup-python/pull/923](https://redirect.github.com/actions/setup-python/pull/923)

**Full Changelog**:
https://github.com/actions/setup-python/compare/v5...v5.2.0

###
[`v5.1.1`](https://redirect.github.com/actions/setup-python/releases/tag/v5.1.1)

[Compare
Source](https://redirect.github.com/actions/setup-python/compare/v5.1.0...v5.1.1)

#### What's Changed

##### Bug fixes:

- fix(ci): update all failing workflows by
[@&#8203;mayeut](https://redirect.github.com/mayeut) in
[https://github.com/actions/setup-python/pull/863](https://redirect.github.com/actions/setup-python/pull/863)
This update ensures compatibility and optimal performance of workflows
on the latest macOS version.

##### Documentation changes:

- Documentation update for cache by
[@&#8203;gowridurgad](https://redirect.github.com/gowridurgad) in
[https://github.com/actions/setup-python/pull/873](https://redirect.github.com/actions/setup-python/pull/873)

##### Dependency updates:

- Bump braces from 3.0.2 to 3.0.3 and undici from 5.28.3 to 5.28.4 by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[https://github.com/actions/setup-python/pull/893](https://redirect.github.com/actions/setup-python/pull/893)

#### New Contributors

- [@&#8203;gowridurgad](https://redirect.github.com/gowridurgad) made
their first contribution in
[https://github.com/actions/setup-python/pull/873](https://redirect.github.com/actions/setup-python/pull/873)

**Full Changelog**:
https://github.com/actions/setup-python/compare/v5...v5.1.1

###
[`v5.1.0`](https://redirect.github.com/actions/setup-python/releases/tag/v5.1.0)

[Compare
Source](https://redirect.github.com/actions/setup-python/compare/v5...v5.1.0)

#### What's Changed

- Leveraging the raw API to retrieve the version-manifest, as it does
not impose a rate limit and hence facilitates unrestricted consumption
without the need for a token for Github Enterprise Servers by
[@&#8203;Shegox](https://redirect.github.com/Shegox) in
[https://github.com/actions/setup-python/pull/766](https://redirect.github.com/actions/setup-python/pull/766).
- Dependency updates by
[@&#8203;dependabot](https://redirect.github.com/dependabot) and
[@&#8203;HarithaVattikuti](https://redirect.github.com/HarithaVattikuti)
in
[https://github.com/actions/setup-python/pull/817](https://redirect.github.com/actions/setup-python/pull/817)
- Documentation changes for version in README by
[@&#8203;basnijholt](https://redirect.github.com/basnijholt) in
[https://github.com/actions/setup-python/pull/776](https://redirect.github.com/actions/setup-python/pull/776)
- Documentation changes for link in README by
[@&#8203;ukd1](https://redirect.github.com/ukd1) in
[https://github.com/actions/setup-python/pull/793](https://redirect.github.com/actions/setup-python/pull/793)
- Documentation changes for link in Advanced Usage by
[@&#8203;Jamim](https://redirect.github.com/Jamim) in
[https://github.com/actions/setup-python/pull/782](https://redirect.github.com/actions/setup-python/pull/782)
- Documentation changes for avoiding rate limit issues on GHES by
[@&#8203;priya-kinthali](https://redirect.github.com/priya-kinthali) in
[https://github.com/actions/setup-python/pull/835](https://redirect.github.com/actions/setup-python/pull/835)

#### New Contributors

- [@&#8203;basnijholt](https://redirect.github.com/basnijholt) made
their first contribution in
[https://github.com/actions/setup-python/pull/776](https://redirect.github.com/actions/setup-python/pull/776)
- [@&#8203;ukd1](https://redirect.github.com/ukd1) made their first
contribution in
[https://github.com/actions/setup-python/pull/793](https://redirect.github.com/actions/setup-python/pull/793)
- [@&#8203;Jamim](https://redirect.github.com/Jamim) made their first
contribution in
[https://github.com/actions/setup-python/pull/782](https://redirect.github.com/actions/setup-python/pull/782)
- [@&#8203;Shegox](https://redirect.github.com/Shegox) made their first
contribution in
[https://github.com/actions/setup-python/pull/766](https://redirect.github.com/actions/setup-python/pull/766)
- [@&#8203;priya-kinthali](https://redirect.github.com/priya-kinthali)
made their first contribution in
[https://github.com/actions/setup-python/pull/835](https://redirect.github.com/actions/setup-python/pull/835)

**Full Changelog**:
https://github.com/actions/setup-python/compare/v5.0.0...v5.1.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/uv).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-16 09:48:19 +00:00
renovate[bot]
5beeda7cdc
Update acj/freebsd-firecracker-action action to v0.5.0 (#14057)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[acj/freebsd-firecracker-action](https://redirect.github.com/acj/freebsd-firecracker-action)
| action | minor | `v0.4.2` -> `v0.5.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>acj/freebsd-firecracker-action
(acj/freebsd-firecracker-action)</summary>

###
[`v0.5.0`](https://redirect.github.com/acj/freebsd-firecracker-action/releases/tag/v0.5.0)

[Compare
Source](https://redirect.github.com/acj/freebsd-firecracker-action/compare/v0.4.2...v0.5.0)

Changes:

- Add `disk-size` option to control the size of the VM's disk and root
filesystem
- Retired obsolete workaround that disabled TCP segmentation offload
(TSO)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/uv).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-16 07:42:01 +00:00
renovate[bot]
9d0d612131
Update Rust crate which to v8 (#14066)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [which](https://redirect.github.com/harryfei/which-rs) |
workspace.dependencies | major | `7.0.0` -> `8.0.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>harryfei/which-rs (which)</summary>

###
[`v8.0.0`](https://redirect.github.com/harryfei/which-rs/blob/HEAD/CHANGELOG.md#800)

[Compare
Source](https://redirect.github.com/harryfei/which-rs/compare/7.0.3...8.0.0)

- Add new `Sys` trait to allow abstracting over the underlying
filesystem. Particularly useful for `wasm32-unknown-unknown` targets.
Thanks [@&#8203;dsherret](https://redirect.github.com/dsherret) for this
contribution to which!
- Add more debug level tracing for otherwise silent I/O errors.
- Call the `NonFatalHandler` in more places to catch previously ignored
I/O errors.
- Remove use of the `either` dependency.

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/uv).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-16 09:37:38 +02:00
Frazer McLean
87827b6d82
Fix implied platform_machine marker for win_amd64 platform tag (#14041)
## Summary

Fixes #14040

## Test Plan

Added a test using required-environments
2025-06-15 22:37:40 -04:00
renovate[bot]
67c0f93e37
Update actions/checkout action to v4.2.2 (#14059) 2025-06-16 02:27:03 +00:00
renovate[bot]
ba6413f81f
Update actions/attest-build-provenance action to v2.4.0 (#14058) 2025-06-16 02:17:30 +00:00
renovate[bot]
a0ea520fe3
Update taiki-e/install-action action to v2.52.8 (#14056) 2025-06-16 02:05:57 +00:00
renovate[bot]
4d104dd004
Update Rust crate windows to v0.61.3 (#14055) 2025-06-16 02:05:34 +00:00
renovate[bot]
4b5da24fe0
Update Rust crate unicode-width to v0.2.1 (#14054) 2025-06-16 02:04:46 +00:00
renovate[bot]
f38e96bddd
Update actions/setup-python digest to a26af69 (#14045) 2025-06-15 21:47:33 -04:00
renovate[bot]
59f1b4bee4
Update Rust crate syn to v2.0.103 (#14051) 2025-06-15 21:41:54 -04:00
renovate[bot]
96cb90a1fd
Update Rust crate toml to v0.8.23 (#14052) 2025-06-15 21:41:31 -04:00
renovate[bot]
f0e0ad4d09
Update Rust crate smallvec to v1.15.1 (#14050) 2025-06-15 21:37:31 -04:00
renovate[bot]
3c00d6d7df
Update Rust crate memchr to v2.7.5 (#14048) 2025-06-15 21:37:22 -04:00
renovate[bot]
ce16a0fc9b
Update Rust crate boxcar to v0.2.13 (#14046) 2025-06-15 21:37:16 -04:00
renovate[bot]
15256722d8
Update Rust crate clap to v4.5.40 (#14047) 2025-06-15 21:36:58 -04:00
Aria Desires
ff9c2c35d7
Support reading dependency-groups from pyproject.tomls with no project (#13742)
(or legacy tool.uv.workspace).

This cleaves out a dedicated SourcedDependencyGroups type based on
RequiresDist but with only the DependencyGroup handling implemented.
This allows `uv pip` to read `dependency-groups` from pyproject.tomls
that only have that table defined, per PEP 735, and as implemented by
`pip`.
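
For example, a `pyproject.toml` like the following (group contents are illustrative) can now be consumed by `uv pip install --group dev` without any `[project]` metadata:

```toml
# No [project] table here; only PEP 735 dependency groups.
[dependency-groups]
dev = ["pytest>=8", "ruff"]
docs = ["mkdocs"]
```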

However we want our implementation to respect various uv features when
they're available:

* `tool.uv.sources`
* `tool.uv.index`
* `tool.uv.dependency-groups.mygroup.requires-python` (#13735)

As such we want to opportunistically detect "as much as possible" while
doing as little as possible when things are missing. The issue with the
old RequiresDist path was that it fundamentally wanted to build the
package, and if `[project]` was missing it would try to desperately run
setuptools on the pyproject.toml to try to find metadata and make a hash
of things.

At the same time, the old code also put in a lot of effort to try to
pretend that `uv pip` dependency-groups worked like `uv`
dependency-groups with defaults and non-only semantics, only to separate
them back out again. By explicitly separating them out, we confidently
get the expected behaviour.

Note that dependency-group support is still included in RequiresDist, as
some `uv` paths still use it. It's unclear to me if those paths want
this same treatment -- for now I conclude no.

Fixes #13138
2025-06-13 22:16:48 +00:00
Aria Desires
5021840919
Add [tool.uv.dependency-groups].mygroup.requires-python (#13735)
This allows you to specify requires-python on individual dependency-groups,
with the intended use case being "oh, my dev-dependencies have a higher
requires-python than my actual project".
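
Based on the PR title, the shape is roughly the following (group name and bound are illustrative):

```toml
[dependency-groups]
dev = ["pytest>=8"]

# The dev group may require a newer Python than the project itself.
[tool.uv.dependency-groups]
dev = { requires-python = ">=3.12" }
```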

This includes a large drive-by move of the RequiresPython type to
uv-distribution-types to allow us to generate the appropriate markers at
this point in the code. It also migrates RequiresPython from
pubgrub::Range to version_ranges::Ranges, and makes several pub(crate)
items pub, as it's no longer defined in uv_resolver.

Fixes #11606
2025-06-13 22:04:13 +00:00
Andrew Gallant
26db29caac
Update to jiff 0.2.15 (#14035)
This brings in https://github.com/BurntSushi/jiff/pull/385, which makes
cold resolves about 10% faster. Or, stated differently, as fast as they
were a few weeks ago before the perf regression introduced by
`jiff 0.2.14`.
2025-06-13 14:43:46 -04:00
konsti
d9b76b97c2
Remove unreachable pub function (#14032)
Not sure how this landed on main, but rustc complained.
2025-06-13 18:04:13 +00:00
Aria Desires
49b450109b
filter out riscv64 wheels before publishing to pypi (#14009)
An alternative to #14006
2025-06-13 09:57:04 -04:00
konsti
da2eca4e2c
Make wiremock filter a default filter (#14024)
Make `127.0.0.1:<PORT>` -> `[LOCALHOST]` a general test filter; it simplifies using wiremock, and I'm not aware of any false positives.
2025-06-13 08:51:17 -05:00
Zanie Blue
881e17600f
Filter managed Python distributions by platform before querying when included in request (#13936)
In the case where we have platform information in a Python request, we
should filter managed Python distributions by it prior to querying them.

Closes https://github.com/astral-sh/uv/issues/13935

---------

Co-authored-by: Aria Desires <aria.desires@gmail.com>
2025-06-13 08:50:44 -05:00
renovate[bot]
b95e66019d
Update Rust crate hyper-util to v0.1.14 (#13912)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [hyper-util](https://hyper.rs) ([source](https://redirect.github.com/hyperium/hyper-util)) | dev-dependencies | patch | `0.1.12` -> `0.1.14` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>hyperium/hyper-util (hyper-util)</summary>

###
[`v0.1.14`](https://redirect.github.com/hyperium/hyper-util/blob/HEAD/CHANGELOG.md#0114-2025-06-04)

[Compare
Source](https://redirect.github.com/hyperium/hyper-util/compare/v0.1.13...v0.1.14)

- Fix `HttpConnector` to defer address family order to resolver sort order.
- Fix `proxy::Matcher` to find HTTPS system proxies on Windows.

###
[`v0.1.13`](https://redirect.github.com/hyperium/hyper-util/blob/HEAD/CHANGELOG.md#0113-2025-05-27)

[Compare
Source](https://redirect.github.com/hyperium/hyper-util/compare/v0.1.12...v0.1.13)

- Fix `HttpConnector` to always prefer IPv6 addresses first, if happy
eyeballs is enabled.
- Fix `legacy::Client` to return better errors if available on the
connection.

</details>

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-13 13:37:11 +02:00
konsti
e10881d49c
Add tests for IO Error retries (#13627)
Often, HTTP requests don't fail due to server errors but due to spurious
network errors such as connection resets. reqwest surfaces these as
`io::Error`, and we have to handle their retrying separately.

Companion PR: https://github.com/LukeMathWalker/wiremock-rs/pull/159
2025-06-13 09:57:45 +00:00
Zanie Blue
62ed17b230
Bump version to 0.7.13 (#14002)
2025-06-12 14:33:31 -05:00
konsti
7316bd01a3
Build backend: Support namespace packages (#13833)
Unlike regular packages, specifying all `__init__.py` directories for a
namespace package would be very verbose. There is, e.g.,
https://github.com/python-poetry/poetry/tree/main/src/poetry, which has
18 modules, or https://github.com/googleapis/api-common-protos, which is
inconsistently nested. In the Google Cloud SDK, there are both packages
with a single module and packages with complex structures, many having
multiple modules due to `<module>_v1`-style versioning.
The Azure SDK seems to use one module per package (it's not
explicitly documented but seems to follow from the process in
https://azure.github.io/azure-sdk/python_design.html#azure-sdk-distribution-packages
and
ccb0e03a3d/doc/dev/packaging.md).

For simplicity with complex projects, we add a `namespace = true` switch
which disables checking for an `__init__.py`. We only check that there's
no `<module_root>/<module_name>/__init__.py` and otherwise add the whole
`<module_root>/<module_name>` folder. This comes at the cost of
`namespace = true` effectively creating an opt-out from our usual checks
that allows creating an almost entirely arbitrary package.
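
A short sketch of the switch, assuming it sits alongside `module-name` under `[tool.uv.build-backend]` (module name made up for illustration):

```toml
[tool.uv.build-backend]
# Hypothetical namespace package: no __init__.py is expected at the module root.
module-name = "cloud_sdk"
namespace = true
```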

For simple projects with only a single module, the module name can be
dotted to point to the target module, so the build still gets checked:

```toml
[tool.uv.build-backend]
module-name = "poetry.core"
```

## Alternatives

### Declare all packages

We could make `module-name` a list and allow or require declaring all
packages:

```toml
[tool.uv.build-backend]
module-name = ["cloud_sdk.service.storage", "cloud_sdk.service.storage_v1", "cloud_sdk.billing.storage"]
```

Or for Poetry:

```toml
[tool.uv.build-backend]
module-name = [
    "poetry.config",
    "poetry.console",
    "poetry.inspection",
    "poetry.installation",
    "poetry.json",
    "poetry.layouts",
    "poetry.masonry",
    "poetry.mixology",
    "poetry.packages",
    "poetry.plugins",
    "poetry.publishing",
    "poetry.puzzle",
    "poetry.pyproject",
    "poetry.repositories",
    "poetry.toml",
    "poetry.utils",
    "poetry.vcs",
    "poetry.version"
]
```

### Support multiple namespaces

We could also allow namespace packages with multiple root-level modules:

```toml
[tool.uv.build-backend]
module-name = ["cloud_sdk.my_ext", "local_sdk.my_ext"]
```

For lack of use cases, we delegate this to creating a workspace with one
package per module.

## Implementation

Due to the more complex options for the module name, I'm moving verification from deserialization to a later stage, dropping the source span we'd get from serde. We also don't show similarly named directories anymore.

---------

Co-authored-by: Andrew Gallant <andrew@astral.sh>
2025-06-12 17:23:58 +00:00
Zanie Blue
95ad8e5e82
Add 3.14 to the supported platform reference (#13990)
2025-06-12 15:18:48 +00:00
Zanie Blue
e067118700
Add supported macOS version to the platform reference (#13993)
Closes https://github.com/astral-sh/uv/issues/13807
2025-06-12 15:16:24 +00:00
Zanie Blue
f82909ad68
Update platform support reference to include Python implementation list (#13991) 2025-06-12 10:14:42 -05:00
github-actions[bot]
d9c34259b3
Sync latest Python releases (#13996)
Automated update for Python releases.

---------

Co-authored-by: zanieb <2586601+zanieb@users.noreply.github.com>
Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-06-12 09:47:28 -05:00
Ahmed Ilyas
a1f9f28762
Do not allow uv add --group ... --script (#13997)
## Summary

Closes #13988 

## Test Plan

`cargo test`
2025-06-12 10:45:12 -04:00
Zanie Blue
806cc5cad9
Debug sync_dry_run flake by panicking with verbose output on failure (#13817)
Investigating #13744 

I tried reproducing here by running the test in a loop, but could not. I
presume it's an interaction with other tests.

This drops the snapshot, but I think it's worth it to try to examine the
flake?
2025-06-12 08:05:25 -05:00
Zanie Blue
87ab57e902
Update the CLI help and reference to include references to the Python bin directory (#13978)
Closes https://github.com/astral-sh/uv/issues/13977
2025-06-12 10:02:51 +00:00
konsti
b3d7f79770
Universal sync_required_environment_hint test (#13975)
Use the packse case `no-sdist-no-wheels-with-matching-platform` for a
platform independent `sync_required_environment_hint` test.

Fixes #13890
2025-06-11 22:47:40 -04:00
Charlie Marsh
81aebf921d
Add zstd and deflate to Accept-Encoding (#13982)
## Summary

We already pull in these dependencies, so it costs us nothing.
2025-06-11 22:42:47 -04:00
Zanie Blue
f530565323
Use TTY detection to determine if SIGINT forwarding is enabled (#13925)
Use TTY detection to determine when we should forward SIGINT instead of
counting signals, which can lead to various problems where multiple
SIGINTs are sent to a child after the first signal. Counting does not
make sense in interactive situations that do not exit on interrupt,
e.g., the Python REPL.

Closes https://github.com/astral-sh/uv/issues/13919
Closes https://github.com/astral-sh/uv/issues/12108
2025-06-11 13:28:34 +00:00
github-actions[bot]
357fc91c3c
Sync latest Python releases (#13956)
Automated update for Python releases.

Co-authored-by: zanieb <2586601+zanieb@users.noreply.github.com>
2025-06-10 21:13:32 -05:00
Zanie Blue
4c877b7dc6
Fix generated file change (#13957)
Regressed in https://github.com/astral-sh/uv/pull/13620

ref failure at
4384658790
2025-06-10 22:03:01 +00:00
Zanie Blue
21986c3fc9
Bump timeout on Python download release fetch (#13955)
Failing with a 504 at
1556959077

Successful at
4384626025
2025-06-10 16:54:42 -05:00
Lan, Jian
90a7208a73
Fix and improve docs (#13620)

## Summary

I followed the advice from the IDE spell checker and grammar checker, fixed
some typos, and improved the docs.
2025-06-10 13:15:38 -05:00
Zanie Blue
f20a25f91f
Download versions in uv python pin if not found (#13946)
As another follow-up in the vein of
https://github.com/astral-sh/uv/pull/13944, I noticed `uv python pin`
doesn't download Python versions, which is a bit weird because we'll
warn that it's not found.
2025-06-10 10:56:22 -05:00
Xeonacid
210b579188
build-binaries for riscv64 (#12688)

## Summary

Build riscv64 binary so it can get released in the GitHub Releases,
which is used by many high-level apps.

A copy-paste from linux-s390x, with only target and arch changed.

maturin-action added riscv64 support in v1.48.0; this PR also bumps it
to the latest version, v1.48.1.

## Test Plan

Let CI test itself :P

Already tested in [my
fork](4004817230)
2025-06-10 11:18:28 -04:00
Zanie Blue
1b82a3ac99
Ignore Python discovery errors during uv python pin (#13944)
See https://github.com/astral-sh/uv/issues/13935#issuecomment-2957300516
where we fail to write a pin file because we encounter an unusable
interpreter. This is actually a special case where `MissingPython` is not raised because we want to show why we failed to find a usable interpreter; that's useful in commands where you _need_ an interpreter, but here we don't actually need one, so we just log the failure and move on.

Related https://github.com/astral-sh/uv/pull/13936
2025-06-10 09:52:23 -05:00
Charlie Marsh
28685633c0
Add an llms.txt to uv (#13929)
## Summary

The generated file looks like:

```
# uv

> uv is an extremely fast Python package and project manager, written in Rust.

You can use uv to install Python dependencies, run scripts, manage virtual environments,
build and publish packages, and even install Python itself. uv is capable of replacing
`pip`, `pip-tools`, `pipx`, `poetry`, `pyenv`, `twine`, `virtualenv`, and more.

uv includes both a pip-compatible CLI (prepend `uv` to a pip command, e.g., `uv pip install ruff`)
and a first-class project interface (e.g., `uv add ruff`) complete with lockfiles and
workspace support.


## Getting started

- [Features](https://docs.astral.sh/uv/getting-started/features/index.md)
- [First steps](https://docs.astral.sh/uv/getting-started/first-steps/index.md)
- [Installation](https://docs.astral.sh/uv/getting-started/installation/index.md)

## Guides

- [Installing Python](https://docs.astral.sh/uv/guides/install-python/index.md)
- [Publishing packages](https://docs.astral.sh/uv/guides/package/index.md)
- [Working on projects](https://docs.astral.sh/uv/guides/projects/index.md)
- [Running scripts](https://docs.astral.sh/uv/guides/scripts/index.md)
- [Using tools](https://docs.astral.sh/uv/guides/tools/index.md)

## Integrations

- [Alternative indexes](https://docs.astral.sh/uv/guides/integration/alternative-indexes/index.md)
- [AWS Lambda](https://docs.astral.sh/uv/guides/integration/aws-lambda/index.md)
- [Dependency bots](https://docs.astral.sh/uv/guides/integration/dependency-bots/index.md)
- [Docker](https://docs.astral.sh/uv/guides/integration/docker/index.md)
- [FastAPI](https://docs.astral.sh/uv/guides/integration/fastapi/index.md)
- [GitHub Actions](https://docs.astral.sh/uv/guides/integration/github/index.md)
- [GitLab CI/CD](https://docs.astral.sh/uv/guides/integration/gitlab/index.md)
- [Jupyter](https://docs.astral.sh/uv/guides/integration/jupyter/index.md)
- [marimo](https://docs.astral.sh/uv/guides/integration/marimo/index.md)
- [Pre-commit](https://docs.astral.sh/uv/guides/integration/pre-commit/index.md)
- [PyTorch](https://docs.astral.sh/uv/guides/integration/pytorch/index.md)

## Projects

- [Building distributions](https://docs.astral.sh/uv/concepts/projects/build/index.md)
- [Configuring projects](https://docs.astral.sh/uv/concepts/projects/config/index.md)
- [Managing dependencies](https://docs.astral.sh/uv/concepts/projects/dependencies/index.md)
- [Creating projects](https://docs.astral.sh/uv/concepts/projects/init/index.md)
- [Structure and files](https://docs.astral.sh/uv/concepts/projects/layout/index.md)
- [Running commands](https://docs.astral.sh/uv/concepts/projects/run/index.md)
- [Locking and syncing](https://docs.astral.sh/uv/concepts/projects/sync/index.md)
- [Using workspaces](https://docs.astral.sh/uv/concepts/projects/workspaces/index.md)

## Features

- [Authentication](https://docs.astral.sh/uv/concepts/authentication/index.md)
- [Build backend](https://docs.astral.sh/uv/concepts/build-backend/index.md)
- [Caching](https://docs.astral.sh/uv/concepts/cache/index.md)
- [Configuration files](https://docs.astral.sh/uv/concepts/configuration-files/index.md)
- [Package indexes](https://docs.astral.sh/uv/concepts/indexes/index.md)
- [Python versions](https://docs.astral.sh/uv/concepts/python-versions/index.md)
- [Resolution](https://docs.astral.sh/uv/concepts/resolution/index.md)
- [Tools](https://docs.astral.sh/uv/concepts/tools/index.md)

## The pip interface

- [Compatibility with pip](https://docs.astral.sh/uv/pip/compatibility/index.md)
- [Locking environments](https://docs.astral.sh/uv/pip/compile/index.md)
- [Declaring dependencies](https://docs.astral.sh/uv/pip/dependencies/index.md)
- [Using environments](https://docs.astral.sh/uv/pip/environments/index.md)
- [Inspecting environments](https://docs.astral.sh/uv/pip/inspection/index.md)
- [Managing packages](https://docs.astral.sh/uv/pip/packages/index.md)

## Reference

- [Commands](https://docs.astral.sh/uv/reference/cli/index.md)
- [Environment variables](https://docs.astral.sh/uv/reference/environment/index.md)
- [Installer](https://docs.astral.sh/uv/reference/installer/index.md)
- [Settings](https://docs.astral.sh/uv/reference/settings/index.md)
```

Closes #13901.
2025-06-10 07:46:49 -04:00
konsti
c54f131500
Add basic network error tests (#13585)
Add basic tests for error messages on retryable network errors.

This test mod is intended to grow to ensure that we handle retryable
errors correctly and that we show the appropriate error message if we
failed after retrying.

The starter tests show some common cases we've seen download errors in:
simple and find links indexes, file downloads and Python installs.

For `io::Error` fault injection to test the reqwest `Err` path besides
the HTTP status code `Ok` path, see
https://github.com/LukeMathWalker/wiremock-rs/issues/149.
2025-06-10 12:00:04 +02:00
Jack O'Connor
9129d2a9a3 avoid fetching an exact, cached commit, even if it isn't locked
Fixes #13513 and #12746.
2025-06-09 23:50:36 +00:00
Jack O'Connor
619132bd8a make GitOid strict about parsing exactly 40 hex characters 2025-06-09 23:50:36 +00:00
Jack O'Connor
9ff07c113f add a snapshot test for uv pip install with an exact Git commit 2025-06-09 23:50:36 +00:00
Jack O'Connor
dc455bfc26 add UV_NO_GITHUB_FAST_PATH 2025-06-09 23:50:36 +00:00
renovate[bot]
f5382c010b
Update acj/freebsd-firecracker-action action to v0.4.2 (#13906)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [acj/freebsd-firecracker-action](https://redirect.github.com/acj/freebsd-firecracker-action) | action | patch | `v0.4.1` -> `v0.4.2` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>acj/freebsd-firecracker-action
(acj/freebsd-firecracker-action)</summary>

###
[`v0.4.2`](https://redirect.github.com/acj/freebsd-firecracker-action/releases/tag/v0.4.2)

[Compare
Source](https://redirect.github.com/acj/freebsd-firecracker-action/compare/v0.4.1...v0.4.2)

[Firecracker
1.12.0](https://redirect.github.com/firecracker-microvm/firecracker/releases/tag/v1.12.0)
[FreeBSD 14.3-RELEASE](https://www.freebsd.org/releases/14.3R/relnotes/)

</details>

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-06-09 18:06:31 -05:00
renovate[bot]
e67dff85cc
Update Rust crate fs-err to v3.1.1 (#13910) 2025-06-09 14:29:12 -04:00
renovate[bot]
dd46985e27
Update Rust crate petgraph to v0.8.2 (#13913) 2025-06-09 14:28:51 -04:00
renovate[bot]
2940122e69
Update Rust crate hashbrown to v0.15.4 (#13911) 2025-06-09 14:28:39 -04:00
renovate[bot]
1f2bba9b3d
Update Rust crate flate2 to v1.1.2 (#13909) 2025-06-09 14:28:26 -04:00
renovate[bot]
64dacab913
Update pre-commit dependencies (#13907) 2025-06-09 14:28:16 -04:00
renovate[bot]
00b517dbd1
Update Rust crate anstream to v0.6.19 (#13908) 2025-06-09 14:28:09 -04:00
John Mumm
2a66349e96
Check if relative URL is valid directory before treating as index (#13917)
As per #13874, passing a relative URL like `test` to `--index` for `uv
add` causes unexpected behavior if the directory does not exist. The
non-existent index is effectively ignored and uv falls back to PyPI. If
a package is found there, the spurious index is then written to
`pyproject.toml`. This doesn't happen for `--default-index` since
resolution will fail without fallback to PyPI.
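
For illustration only, this is roughly the kind of spurious entry that could end up in `pyproject.toml` (the exact fields uv writes may differ; the name and URL here are hypothetical):

```toml
# Hypothetical: "test" is a relative path that does not exist, yet it was recorded as an index.
[[tool.uv.index]]
name = "test"
url = "test"
```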

This PR adds a validation step for indexes provided on the command line.
If a directory does not exist, uv will fail with an error.

Closes #13874
2025-06-09 19:28:39 +02:00
Zanie Blue
619a0eafa1
Fix use of deprecated black_box function (#13926) 2025-06-09 16:57:27 +00:00
Zanie Blue
ea45acaf4f
Fix extra newline in changelog (#13918) 2025-06-09 11:48:59 -05:00
Karim Abou Zeid
5dae9ac5f2
Update referenced CUDA version in pytorch.md (#13899)
## Summary

torch 2.7 by default targets CUDA 12.6
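
For context, a hedged sketch of the kind of index configuration that docs page pairs with a CUDA build (the index name and URL follow the common cu126 convention and are not taken from this diff):

```toml
# Hypothetical index for CUDA 12.6 wheels.
[[tool.uv.index]]
name = "pytorch-cu126"
url = "https://download.pytorch.org/whl/cu126"
explicit = true
```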
2025-06-08 20:21:07 -04:00
Zanie Blue
dc3fd46472
Bump version to 0.7.12 (#13892)
2025-06-06 19:42:06 +00:00
konsti
5b0133c0ec
Hint at tool.uv.required-environments (#13575)
For the case where there was no matching wheel on sync, we previously
added a note about which wheels are available vs. which platform you are on. We extend this error message to link directly to `tool.uv.required-environments`, which otherwise has a discoverability problem.

On Linux (Setting `tool.uv.required-environments` doesn't help here
either, but it's a clear example):

```
[project]
name = "debug"
version = "0.1.0"
requires-python = "==3.10.*"
dependencies = ["tensorflow-macos>=2.13.1"]
```

```
Resolved 41 packages in 24ms
error: Distribution `tensorflow-macos==2.16.2 @ registry+https://pypi.org/simple` can't be installed because it doesn't have a source distribution or wheel for the current platform

hint: You're on Linux (`manylinux_2_39_x86_64`), but there are no wheels for the current platform, consider configuring `tool.uv.required-environments`.
hint: `tensorflow-macos` (v2.16.2) only has wheels for the following platform: `macosx_12_0_arm64`.
```
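
For reference, configuring the hinted setting looks roughly like this (the marker is chosen for illustration):

```toml
[tool.uv]
# Hypothetical: require that resolved packages are installable on macOS arm64.
required-environments = [
    "sys_platform == 'darwin' and platform_machine == 'arm64'",
]
```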


![image](https://github.com/user-attachments/assets/b6b49461-10d6-4e1d-bc0a-5d35d98e33d0)

---------

Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-06-06 19:15:52 +00:00
konsti
7aefbe8dc5
Don't hint at versions removed by excluded-newer (#13884)
A small hint false positive that was showing up as a CI failure in our snapshots.

Fixes #13867
2025-06-06 18:35:18 +00:00
konsti
0109af1aa5
Hint at tool.uv.environments on resolution error (#13455)
Users are not (yet) properly familiar with the concept of universal
resolution and its implication that we need to resolve for all possible
platforms and Python versions. Some projects only target a specific
platform or Python version, and users experience resolution errors due
to failures on other platforms. As indicated by the number of questions we get about it, `tool.uv.environments` for restricting environments is not well discoverable.

We add a special hint when resolution fails on a fork disjoint from the current environment, suggesting that the user constrain `requires-python` and `tool.uv.environments`, respectively.

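The setting the hint points at, sketched for a project that only targets Linux (marker chosen for illustration):

```toml
[tool.uv]
# Hypothetical: restrict resolution to Linux environments.
environments = ["sys_platform == 'linux'"]
```
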
The hint has false positives in cases where the resolution failed on a different platform but would equally fail on the current platform, when the non-current fork was tried earlier. Given that conflicts can be based on `requires-python`, as far as I know we can't determine from the derivation tree whether the current platform would also be affected.

Two cases not covered by this are build errors as well as install errors
that need `tool.uv.required-environments`.
2025-06-06 14:17:52 +00:00
Yury Fedotov
8a88ab2c70
Minor internal README enhancement for Markdown list in PEP440 (#13880)
- Define all list elements using `-`: it used to be a mix of `*` and `-`; `-` is what the Prettier linter formats them to by default.
- Remove an unnecessary blank line between two list elements. The other elements were stitched together without blank lines in between.
- Only the first list element started in sentence case (capital letter first); I made them all start that way.
2025-06-06 08:45:37 -05:00
konsti
bf96c60e3e
Lock during uv sync, uv add and uv remove to avoid race conditions (#13869)
Surprisingly, we weren't locking during `uv sync` so far, so running `uv sync` in parallel could cause errors from filesystem races.

I've also added locks to `uv add` and `uv remove`, which concurrently modify `pyproject.toml`. These locks only apply after we've determined the interpreter, so they don't actually prevent computing the same thing twice when running `uv add` in parallel.

All other subcommands that I checked were already locking (with no claim to exhaustiveness).

Fixes #12751

# Test Plan

I don't have a CI-sized reproducer for this.

```toml
[project]
name = "debug"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
  "boto3>=1.38.30",
  "fastapi>=0.115.12",
  "numba>=0.61.2",
  "polars>=1.30.0",
  "protobuf>=6.31.1",
  "pyarrow>=20.0.0",
  "pydantic>=2.11.5",
  "requests>=2.32.3",
  "urllib3>=2.4.0",
  "scikit-learn>=1.6.1",
  "jupyter>=1.1.1",
]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```

```
rm -rf .venv && parallel -n0 "uv sync -q" ::: {1..10}
```
2025-06-06 14:16:40 +02:00
Micha Reiser
b865f76b78
Change GlobDirFilter fallback to true (#13882)
## Summary

I think `GlobDirFilter::match_directory` should return `true` if it failed to construct a DFA, forcing a full directory traversal. Returning `false` means that the build backend excludes all files, which is unlikely to be what we want in that situation.
2025-06-06 14:07:34 +02:00
samypr100
a68fb53c4b
PEP 751 uv export supports --no-editable (#13852)
## Summary

When trying out `uv export --no-editable --format pylock.toml`, the
exported contents would still retain `editable = true` regardless.

## Test Plan

Added an additional test. Tested locally on a few projects where I was
previously using `uv export --no-editable --format requirements.txt` to
ensure the output aligns.
2025-06-06 07:52:14 -04:00
Zanie Blue
7dd564d3d0
Improve python pin error messages (#13862)
Addresses https://github.com/astral-sh/uv/issues/13854
2025-06-05 21:48:34 +00:00
Jack O'Connor
94ed6f880f
update Git and GitHub Actions docs to mention gh auth login (#13850) 2025-06-05 20:44:59 +00:00
Zanie Blue
b1a50c33cf
Fix lock_requires_python test case (#13866)
Tracking https://github.com/astral-sh/uv/issues/13867
2025-06-05 17:27:52 +00:00
Zanie Blue
262ca73965
Remove the configuration section in favor of concepts / reference (#13842)
Extends https://github.com/astral-sh/uv/pull/13841 — I'll drop that
commit later after that pull request merges but it's small.

I find the split into a "Configuration" section awkward and don't think
it's helping us. Everything moved into the "Concepts" section, except
the "Environment variables" page, which definitely belongs in the
reference, and the "Installer" page, which is fairly niche and seems
better in the reference.

Before / After


<img
src="https://github.com/user-attachments/assets/80d8304b-17da-4900-a5f4-c3ccac96fcc5"
width="400">
2025-06-05 17:09:49 +00:00
Zanie Blue
062b6ab743
Add uv python pin --rm to remove .python-version pins (#13860)
I realized it is non-trivial to delete the global Python version pin,
and that we were missing this simple functionality
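A quick usage sketch of the new flag (assuming a directory with an existing `.python-version` pin):

```
# Remove the pin recorded in .python-version:
uv python pin --rm
```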
2025-06-05 12:05:42 -05:00
Zanie Blue
789a246cf3
Move the pip interface documentation into the concepts section (#13841)
The motivation here is to reduce the length of the navigation. I don't
think we did this in the first place because we didn't have the
capability to do another nested level, but now we're doing that for
Projects.
2025-06-05 11:53:27 -05:00
github-actions[bot]
cd89a592ea
Update tag for Python sysconfig metadata (#13851)
Automated update for Python releases.

Co-authored-by: zanieb <2586601+zanieb@users.noreply.github.com>
2025-06-05 11:21:35 -05:00
Abhishek Upadhye
efd4652faa
List .gitignore in project init files (#13855)
Fixes https://github.com/astral-sh/uv/issues/13639

Just added a missing `.gitignore` file in the docs, so that there would
be no confusion for readers referring to it.

Thank you!
2025-06-05 09:21:15 -05:00
Michał Górny
3a29bb0986
Switch sync_dry_run test to Python 3.9 (#13858)
## Summary

Switch the `sync_dry_run` test to use Python 3.9 instead of Python 3.8.
I didn't previously catch it since it was marked as `python_managed`.

## Test Plan

`cargo test --no-default-features --features git,pypi,python` without
Python 3.8 installed.
2025-06-05 08:15:00 -05:00
Michał Górny
042e8a448b
Make requirements_txt_ssh_git_username not write into homedir (#13857)
## Summary

Modify the `requirements_txt_ssh_git_username` test to pass `-o
UserKnownHostsFile=/dev/null` to OpenSSH, in order to prevent it from
trying to write into the known-hosts file in the user's home directory.
To add insult to injury, OpenSSH ignores `HOME` and writes into the
actual home directory even when `HOME` is explicitly overridden for the
purpose of testing.
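For illustration, this is roughly the shape of the OpenSSH invocation the option produces (the exact wiring through the test harness and `GIT_SSH_COMMAND` may differ):

```
# Illustrative only: send known-hosts writes to /dev/null so the test never
# touches the real home directory.
GIT_SSH_COMMAND='ssh -o UserKnownHostsFile=/dev/null -i fake_deploy_key' \
  git clone ssh://git@github.com/example/private-repo.git
```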

## Test Plan

`cargo test --no-default-features --features=git,pypi,python` in an
environment where the user can't write to `pw_home` (but can write to
`${HOME}`). Previously the test would fail:

```
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ Snapshot Summary ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Snapshot: requirements_txt_ssh_git_username
Source: crates/uv/tests/it/export.rs:1252
────────────────────────────────────────────────────────────────────────────────
Expression: snapshot
────────────────────────────────────────────────────────────────────────────────
-old snapshot
+new results
────────────┬───────────────────────────────────────────────────────────────────
    7     7 │   ├─▶ failed to clone into: [PATH]
    8     8 │   ├─▶ failed to fetch branch, tag, or commit `d780faf0ac91257d4d5a4f0c5a0e4509608c0071`
    9     9 │   ╰─▶ process didn't exit successfully: [GIT_COMMAND_ERROR]
   10    10 │       --- stderr
         11 │+      Could not create directory '/var/lib/portage/home/.ssh' (Permission denied).
         12 │+      Failed to add the host to the list of known hosts (/var/lib/portage/home/.ssh/known_hosts).
   11    13 │       Load key "[TEMP_DIR]/fake_deploy_key": [ERROR]
   12    14 │       git@github.com: Permission denied (publickey).
   13    15 │       fatal: Could not read from remote repository.
   14    16 │ 
────────────┴───────────────────────────────────────────────────────────────────
```

---------

Co-authored-by: konstin <konstin@mailbox.org>
2025-06-05 10:35:21 +00:00
Jack O'Connor
90a4416ab8
Bump version to 0.7.11 (#13844)
2025-06-04 12:35:56 -07:00
github-actions[bot]
8e8b7449dc
Sync latest Python releases (#13848)
20250604

Co-authored-by: zanieb <2586601+zanieb@users.noreply.github.com>
2025-06-04 19:07:04 +00:00
Zanie Blue
f168802ba4
Bump cargo-test-macos timeout to 15m (#13847)
Closes https://github.com/astral-sh/uv/issues/13846

15m is fine, though we should definitely take action if it regularly
runs that long.
2025-06-04 18:24:40 +00:00
konsti
794b3ec750
Downgrade reqwest and hyper-util (#13835)
Workaround for https://github.com/astral-sh/uv/issues/13831, see also
https://github.com/hyperium/hyper/issues/3900
2025-06-04 18:48:53 +02:00
Aria Desires
d8636a4088
Prefer the binary's version when checking if it's up to date (#13840)
Fixes #13741
2025-06-04 11:32:59 -04:00
konsti
a748336a09
Follow-up #13814, git+ssh test case (#13839)
Follow-ups to review on #13814
2025-06-04 15:02:26 +00:00
konsti
e7c066cf16
Test case: Don't redact username git for SSH (#13814)
2025-06-04 09:57:24 +02:00
Zanie Blue
3ca8d074a4
Use "terminal driver" instead of "shell" in SIGINT docs (#13787)
Addressing the comment at
https://github.com/astral-sh/uv/issues/12108#issuecomment-2925703719
2025-06-03 22:07:03 +00:00
Jacob Woliver
f5c3601445
Better error message for version specifier with missing operator (#13803)
## Summary

When a version specifier was missing an operator, parsing would give an
error that was hard to know how to fix if you were not familiar with
version specifiers / PEP 440:

```
Unexpected end of version specifier, expected operator
```

Now, it will attempt to provide a more useful hint if it can parse a
version from the remaining input:

```
Unexpected end of version specifier, expected operator (did you mean "==3.12"?)
```
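A hypothetical reproduction matching the `add_invalid_requires_python` case in the test plan below (the project contents are illustrative):

```
# pyproject.toml (illustrative) contains a specifier missing its operator:
#   requires-python = "3.12"
uv add anyio
# Previously: Unexpected end of version specifier, expected operator
# Now:        Unexpected end of version specifier, expected operator (did you mean "==3.12"?)
```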
## Test Plan

Unit tests in `version_specifier.rs` were added/updated for the
following cases:
- `test_parse_specifier_missing_operator_error`
- `test_parse_specifier_missing_operator_invalid_version_error`
- `test_invalid_word`
- `test_invalid_specifier`
- `error_message_version_specifiers_parse_error`

A test in `edit.rs` for failing to parse the `pyproject.toml` when using
`add` was also included to match the request in the original issue:
- `add_invalid_requires_python`

I didn't add cases where no version specifier is provided at all,
because parsing doesn't appear to be reached in that case, so the error
should not occur there.


## Reference

Fixes #13126

---------

Co-authored-by: Jacob Woliver <jacob@jmw.sh>
Co-authored-by: konstin <konstin@mailbox.org>
2025-06-03 19:18:19 +00:00
Hood Chatham
f9d3f24728
Add Pyodide support (#12731)
This includes some initial work on adding Pyodide support (issue
#12729). It is enough to get
```
uv pip compile -p /path/to/pyodide --extra-index-url file:/path/to/simple-index
```
to work, which should already be quite useful.

## Test Plan

* added a unit test for `pyodide_platform`
* integration tested manually with:
```
cargo run pip install \
-p /home/rchatham/Documents/programming/tmp/pyodide-venv-test/.pyodide-xbuildenv-0.29.3/0.27.4/xbuildenv/pyodide-root/dist/python \
--extra-index-url file:/home/rchatham/Documents/programming/tmp/pyodide-venv-test/.pyodide-xbuildenv-0.29.3/0.27.4/xbuildenv/pyodide-root/package_index \
--index-strategy unsafe-best-match --target blah --no-build \
numpy pydantic
```

---------

Co-authored-by: konsti <konstin@mailbox.org>
Co-authored-by: Zanie Blue <contact@zanie.dev>
2025-06-03 12:01:26 -05:00
Zanie Blue
37e22e4da6
Remove python-managed marker from sync_dry_run test (#13816)
Not sure why this is present; it does not seem to use a managed
interpreter.
2025-06-03 11:08:09 -05:00
315 changed files with 24276 additions and 6019 deletions

View file

@ -1,4 +1,4 @@
[profile.default]
# Mark tests that take longer than 10s as slow.
# Terminate after 90s as a stop-gap measure to terminate on deadlock.
slow-timeout = { period = "10s", terminate-after = 9 }
# Terminate after 120s as a stop-gap measure to terminate on deadlock.
slow-timeout = { period = "10s", terminate-after = 12 }

View file

@ -54,7 +54,7 @@ jobs:
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
- name: "Build sdist"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
command: sdist
args: --out dist
@ -74,7 +74,7 @@ jobs:
# uv-build
- name: "Build sdist uv-build"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
command: sdist
args: --out crates/uv-build/dist -m crates/uv-build/Cargo.toml
@ -103,7 +103,7 @@ jobs:
# uv
- name: "Build wheels - x86_64"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: x86_64
args: --release --locked --out dist --features self-update
@ -133,7 +133,7 @@ jobs:
# uv-build
- name: "Build wheels uv-build - x86_64"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: x86_64
args: --profile minimal-size --locked --out crates/uv-build/dist -m crates/uv-build/Cargo.toml
@ -157,7 +157,7 @@ jobs:
# uv
- name: "Build wheels - aarch64"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: aarch64
args: --release --locked --out dist --features self-update
@ -193,7 +193,7 @@ jobs:
# uv-build
- name: "Build wheels uv-build - aarch64"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: aarch64
args: --profile minimal-size --locked --out crates/uv-build/dist -m crates/uv-build/Cargo.toml
@ -231,7 +231,7 @@ jobs:
# uv
- name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
args: --release --locked --out dist --features self-update,windows-gui-bin
@ -267,7 +267,7 @@ jobs:
# uv-build
- name: "Build wheels uv-build"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
args: --profile minimal-size --locked --out crates/uv-build/dist -m crates/uv-build/Cargo.toml
@ -303,7 +303,7 @@ jobs:
# uv
- name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.target }}
# Generally, we try to build in a target docker container. In this case however, a
@ -368,7 +368,7 @@ jobs:
# uv-build
- name: "Build wheels uv-build"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.target }}
manylinux: auto
@ -412,7 +412,7 @@ jobs:
# uv
- name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
# On `aarch64`, use `manylinux: 2_28`; otherwise, use `manylinux: auto`.
@ -461,7 +461,7 @@ jobs:
# uv-build
- name: "Build wheels uv-build"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
# On `aarch64`, use `manylinux: 2_28`; otherwise, use `manylinux: auto`.
@ -509,7 +509,7 @@ jobs:
# uv
- name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
manylinux: auto
@ -561,7 +561,7 @@ jobs:
# uv-build
- name: "Build wheels uv-build"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
manylinux: auto
@ -614,7 +614,7 @@ jobs:
# uv
- name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
manylinux: auto
@ -671,7 +671,7 @@ jobs:
# uv-build
- name: "Build wheels uv-build"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
manylinux: auto
@ -691,6 +691,103 @@ jobs:
name: wheels_uv_build-${{ matrix.platform.target }}
path: crates/uv-build/dist
# Like `linux-arm`.
linux-riscv64:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-build') }}
timeout-minutes: 30
runs-on: depot-ubuntu-latest-4
strategy:
matrix:
platform:
- target: riscv64gc-unknown-linux-gnu
arch: riscv64
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: ${{ env.PYTHON_VERSION }}
- name: "Prep README.md"
run: python scripts/transform_readme.py --target pypi
# uv
- name: "Build wheels"
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
manylinux: auto
docker-options: ${{ matrix.platform.maturin_docker_options }}
args: --release --locked --out dist --features self-update
- uses: uraimo/run-on-arch-action@ac33288c3728ca72563c97b8b88dda5a65a84448 # v2
name: "Test wheel"
with:
arch: ${{ matrix.platform.arch }}
distro: ubuntu20.04
githubToken: ${{ github.token }}
install: |
apt-get update
apt-get install -y --no-install-recommends python3 python3-pip python-is-python3
pip3 install -U pip
run: |
pip install ${{ env.PACKAGE_NAME }} --no-index --find-links dist/ --force-reinstall
${{ env.MODULE_NAME }} --help
# TODO(konsti): Enable this test on all platforms, currently `find_uv_bin` is failing to discover uv here.
# python -m ${{ env.MODULE_NAME }} --help
uvx --help
- name: "Upload wheels"
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: wheels_uv-${{ matrix.platform.target }}
path: dist
- name: "Archive binary"
shell: bash
run: |
TARGET=${{ matrix.platform.target }}
ARCHIVE_NAME=uv-$TARGET
ARCHIVE_FILE=$ARCHIVE_NAME.tar.gz
mkdir -p $ARCHIVE_NAME
cp target/$TARGET/release/uv $ARCHIVE_NAME/uv
cp target/$TARGET/release/uvx $ARCHIVE_NAME/uvx
tar czvf $ARCHIVE_FILE $ARCHIVE_NAME
shasum -a 256 $ARCHIVE_FILE > $ARCHIVE_FILE.sha256
- name: "Upload binary"
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: artifacts-${{ matrix.platform.target }}
path: |
*.tar.gz
*.sha256
# uv-build
- name: "Build wheels uv-build"
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
manylinux: auto
docker-options: ${{ matrix.platform.maturin_docker_options }}
args: --profile minimal-size --locked --out crates/uv-build/dist -m crates/uv-build/Cargo.toml
- uses: uraimo/run-on-arch-action@ac33288c3728ca72563c97b8b88dda5a65a84448 # v2
name: "Test wheel uv-build"
with:
arch: ${{ matrix.platform.arch }}
distro: ubuntu20.04
githubToken: ${{ github.token }}
install: |
apt-get update
apt-get install -y --no-install-recommends python3 python3-pip python-is-python3
pip3 install -U pip
run: |
pip install ${{ env.PACKAGE_NAME }}-build --no-index --find-links crates/uv-build/dist --force-reinstall
${{ env.MODULE_NAME }}-build --help
# TODO(konsti): Enable this test on all platforms, currently `find_uv_bin` is failing to discover uv here.
# python -m ${{ env.MODULE_NAME }}-build --help
- name: "Upload wheels uv-build"
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: wheels_uv_build-${{ matrix.platform.target }}
path: crates/uv-build/dist
musllinux:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-build') }}
runs-on: ubuntu-latest
@ -710,7 +807,7 @@ jobs:
# uv
- name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.target }}
manylinux: musllinux_1_1
@ -757,7 +854,7 @@ jobs:
# uv-build
- name: "Build wheels uv-build"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.target }}
manylinux: musllinux_1_1
@ -804,7 +901,7 @@ jobs:
# uv
- name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
manylinux: musllinux_1_1
@ -869,7 +966,7 @@ jobs:
# uv-build
- name: "Build wheels"
uses: PyO3/maturin-action@44479ae1b6b1a57f561e03add8832e62c185eb17 # v1.48.1
uses: PyO3/maturin-action@e10f6c464b90acceb5f640d31beda6d586ba7b4a # v1.49.3
with:
target: ${{ matrix.platform.target }}
manylinux: musllinux_1_1

View file

@ -1,11 +1,19 @@
# Build and publish a Docker image.
# Build and publish Docker images.
#
# Assumed to run as a subworkflow of .github/workflows/release.yml; specifically, as a local
# artifacts job within `cargo-dist`.
# Uses Depot for multi-platform builds. Includes both a `uv` base image, which
# is just the binary in a scratch image, and a set of extra, common images with
# the uv binary installed.
#
# TODO(charlie): Ideally, the publish step would happen as a publish job within `cargo-dist`, but
# sharing the built image as an artifact between jobs is challenging.
name: "Build Docker image"
# Images are built on all runs.
#
# On release, assumed to run as a subworkflow of .github/workflows/release.yml;
# specifically, as a local artifacts job within `cargo-dist`. In this case,
# images are published based on the `plan`.
#
# TODO(charlie): Ideally, the publish step would happen as a publish job within
# `cargo-dist`, but sharing the built image as an artifact between jobs is
# challenging.
name: "Docker images"
on:
workflow_call:
@ -29,35 +37,67 @@ on:
- .github/workflows/build-docker.yml
env:
UV_BASE_IMG: ghcr.io/${{ github.repository_owner }}/uv
UV_GHCR_IMAGE: ghcr.io/${{ github.repository_owner }}/uv
UV_DOCKERHUB_IMAGE: docker.io/astral/uv
jobs:
docker-build:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-build') }}
name: Build Docker image (ghcr.io/astral-sh/uv) for ${{ matrix.platform }}
docker-plan:
name: plan
runs-on: ubuntu-latest
outputs:
login: ${{ steps.plan.outputs.login }}
push: ${{ steps.plan.outputs.push }}
tag: ${{ steps.plan.outputs.tag }}
action: ${{ steps.plan.outputs.action }}
steps:
- name: Set push variable
env:
DRY_RUN: ${{ inputs.plan == '' || fromJson(inputs.plan).announcement_tag_is_implicit }}
TAG: ${{ inputs.plan != '' && fromJson(inputs.plan).announcement_tag }}
IS_LOCAL_PR: ${{ github.event.pull_request.head.repo.full_name == 'astral-sh/uv' }}
id: plan
run: |
if [ "${{ env.DRY_RUN }}" == "false" ]; then
echo "login=true" >> "$GITHUB_OUTPUT"
echo "push=true" >> "$GITHUB_OUTPUT"
echo "tag=${{ env.TAG }}" >> "$GITHUB_OUTPUT"
echo "action=build and publish" >> "$GITHUB_OUTPUT"
else
echo "login=${{ env.IS_LOCAL_PR }}" >> "$GITHUB_OUTPUT"
echo "push=false" >> "$GITHUB_OUTPUT"
echo "tag=dry-run" >> "$GITHUB_OUTPUT"
echo "action=build" >> "$GITHUB_OUTPUT"
fi
docker-publish-base:
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-build') }}
name: ${{ needs.docker-plan.outputs.action }} uv
needs:
- docker-plan
runs-on: ubuntu-latest
permissions:
contents: read
id-token: write # for Depot OIDC and GHCR signing
packages: write # for GHCR image pushes
attestations: write # for GHCR attestations
environment:
name: release
strategy:
fail-fast: false
matrix:
platform:
- linux/amd64
- linux/arm64
name: ${{ needs.docker-plan.outputs.push == 'true' && 'release' || '' }}
outputs:
image-tags: ${{ steps.meta.outputs.tags }}
image-annotations: ${{ steps.meta.outputs.annotations }}
image-digest: ${{ steps.build.outputs.digest }}
image-version: ${{ steps.meta.outputs.version }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
submodules: recursive
# Login to DockerHub first, to avoid rate-limiting
# Login to DockerHub (when not pushing, it's to avoid rate-limiting)
- uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
# PRs from forks don't have access to secrets, disable this step in that case.
if: ${{ github.event.pull_request.head.repo.full_name == 'astral-sh/uv' }}
if: ${{ needs.docker-plan.outputs.login == 'true' }}
with:
username: astralshbot
password: ${{ secrets.DOCKERHUB_TOKEN_RO }}
- uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
username: ${{ needs.docker-plan.outputs.push == 'true' && 'astral' || 'astralshbot' }}
password: ${{ needs.docker-plan.outputs.push == 'true' && secrets.DOCKERHUB_TOKEN_RW || secrets.DOCKERHUB_TOKEN_RO }}
- uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
with:
@ -65,13 +105,15 @@ jobs:
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- uses: depot/setup-action@b0b1ea4f69e92ebf5dea3f8713a1b0c37b2126a5
- name: Check tag consistency
if: ${{ inputs.plan != '' && !fromJson(inputs.plan).announcement_tag_is_implicit }}
if: ${{ needs.docker-plan.outputs.push == 'true' }}
run: |
version=$(grep "version = " pyproject.toml | sed -e 's/version = "\(.*\)"/\1/g')
if [ "${{ fromJson(inputs.plan).announcement_tag }}" != "${version}" ]; then
if [ "${{ needs.docker-plan.outputs.tag }}" != "${version}" ]; then
echo "The input tag does not match the version from pyproject.toml:" >&2
echo "${{ fromJson(inputs.plan).announcement_tag }}" >&2
echo "${{ needs.docker-plan.outputs.tag }}" >&2
echo "${version}" >&2
exit 1
else
@ -81,107 +123,50 @@ jobs:
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@902fa8ec7d6ecbf8d84d538b9b233a880e428804 # v5.7.0
env:
DOCKER_METADATA_ANNOTATIONS_LEVELS: index
with:
images: ${{ env.UV_BASE_IMG }}
images: |
${{ env.UV_GHCR_IMAGE }}
${{ env.UV_DOCKERHUB_IMAGE }}
# Defining this makes sure the org.opencontainers.image.version OCI label becomes the actual release version and not the branch name
tags: |
type=raw,value=dry-run,enable=${{ inputs.plan == '' || fromJson(inputs.plan).announcement_tag_is_implicit }}
type=pep440,pattern={{ version }},value=${{ inputs.plan != '' && fromJson(inputs.plan).announcement_tag || 'dry-run' }},enable=${{ inputs.plan != '' && !fromJson(inputs.plan).announcement_tag_is_implicit }}
type=raw,value=dry-run,enable=${{ needs.docker-plan.outputs.push == 'false' }}
type=pep440,pattern={{ version }},value=${{ needs.docker-plan.outputs.tag }},enable=${{ needs.docker-plan.outputs.push }}
type=pep440,pattern={{ major }}.{{ minor }},value=${{ needs.docker-plan.outputs.tag }},enable=${{ needs.docker-plan.outputs.push }}
- name: Normalize Platform Pair (replace / with -)
run: |
platform=${{ matrix.platform }}
echo "PLATFORM_TUPLE=${platform//\//-}" >> $GITHUB_ENV
# Adapted from https://docs.docker.com/build/ci/github-actions/multi-platform/
- name: Build and push by digest
id: build
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: depot/build-push-action@2583627a84956d07561420dcc1d0eb1f2af3fac0 # v1.15.0
with:
project: 7hd4vdzmw5 # astral-sh/uv
context: .
platforms: ${{ matrix.platform }}
cache-from: type=gha,scope=uv-${{ env.PLATFORM_TUPLE }}
cache-to: type=gha,mode=min,scope=uv-${{ env.PLATFORM_TUPLE }}
platforms: linux/amd64,linux/arm64
push: ${{ needs.docker-plan.outputs.push }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
outputs: type=image,name=${{ env.UV_BASE_IMG }},push-by-digest=true,name-canonical=true,push=${{ inputs.plan != '' && !fromJson(inputs.plan).announcement_tag_is_implicit }}
# TODO(zanieb): Annotations are not supported by Depot yet and are ignored
annotations: ${{ steps.meta.outputs.annotations }}
- name: Export digests
run: |
mkdir -p /tmp/digests
digest="${{ steps.build.outputs.digest }}"
touch "/tmp/digests/${digest#sha256:}"
- name: Upload digests
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
- name: Generate artifact attestation for base image
if: ${{ needs.docker-plan.outputs.push == 'true' }}
uses: actions/attest-build-provenance@e8998f949152b193b063cb0ec769d69d929409be # v2.4.0
with:
name: digests-${{ env.PLATFORM_TUPLE }}
path: /tmp/digests/*
if-no-files-found: error
retention-days: 1
docker-publish:
name: Publish Docker image (ghcr.io/astral-sh/uv)
runs-on: ubuntu-latest
environment:
name: release
needs:
- docker-build
if: ${{ inputs.plan != '' && !fromJson(inputs.plan).announcement_tag_is_implicit }}
steps:
# Login to DockerHub first, to avoid rate-limiting
- uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
with:
username: astralshbot
password: ${{ secrets.DOCKERHUB_TOKEN_RO }}
- name: Download digests
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
path: /tmp/digests
pattern: digests-*
merge-multiple: true
- uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@902fa8ec7d6ecbf8d84d538b9b233a880e428804 # v5.7.0
with:
images: ${{ env.UV_BASE_IMG }}
# Order is on purpose such that the label org.opencontainers.image.version has the first pattern with the full version
tags: |
type=pep440,pattern={{ version }},value=${{ fromJson(inputs.plan).announcement_tag }}
type=pep440,pattern={{ major }}.{{ minor }},value=${{ fromJson(inputs.plan).announcement_tag }}
- uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
# Adapted from https://docs.docker.com/build/ci/github-actions/multi-platform/
- name: Create manifest list and push
working-directory: /tmp/digests
# The jq command expands the docker/metadata json "tags" array entry to `-t tag1 -t tag2 ...` for each tag in the array
# The printf will expand the base image with the `<UV_BASE_IMG>@sha256:<sha256> ...` for each sha256 in the directory
# The final command becomes `docker buildx imagetools create -t tag1 -t tag2 ... <UV_BASE_IMG>@sha256:<sha256_1> <UV_BASE_IMG>@sha256:<sha256_2> ...`
run: |
docker buildx imagetools create \
$(jq -cr '.tags | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \
$(printf '${{ env.UV_BASE_IMG }}@sha256:%s ' *)
subject-name: ${{ env.UV_GHCR_IMAGE }}
subject-digest: ${{ steps.build.outputs.digest }}
docker-publish-extra:
name: Publish additional Docker image based on ${{ matrix.image-mapping }}
name: ${{ needs.docker-plan.outputs.action }} ${{ matrix.image-mapping }}
runs-on: ubuntu-latest
environment:
name: release
name: ${{ needs.docker-plan.outputs.push == 'true' && 'release' || '' }}
needs:
- docker-publish
if: ${{ inputs.plan != '' && !fromJson(inputs.plan).announcement_tag_is_implicit }}
- docker-plan
- docker-publish-base
permissions:
packages: write
attestations: write # needed to push image attestations to the Github attestation store
id-token: write # needed for signing the images with GitHub OIDC Token
id-token: write # for Depot OIDC and GHCR signing
packages: write # for GHCR image pushes
attestations: write # for GHCR attestations
strategy:
fail-fast: false
matrix:
@ -213,13 +198,12 @@ jobs:
- python:3.9-slim-bookworm,python3.9-bookworm-slim
- python:3.8-slim-bookworm,python3.8-bookworm-slim
steps:
# Login to DockerHub first, to avoid rate-limiting
# Login to DockerHub (when not pushing, it's to avoid rate-limiting)
- uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
if: ${{ needs.docker-plan.outputs.login == 'true' }}
with:
username: astralshbot
password: ${{ secrets.DOCKERHUB_TOKEN_RO }}
- uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
username: ${{ needs.docker-plan.outputs.push == 'true' && 'astral' || 'astralshbot' }}
password: ${{ needs.docker-plan.outputs.push == 'true' && secrets.DOCKERHUB_TOKEN_RW || secrets.DOCKERHUB_TOKEN_RO }}
- uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
with:
@ -227,6 +211,8 @@ jobs:
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- uses: depot/setup-action@b0b1ea4f69e92ebf5dea3f8713a1b0c37b2126a5
- name: Generate Dynamic Dockerfile Tags
shell: bash
run: |
@ -238,7 +224,7 @@ jobs:
# Generate Dockerfile content
cat <<EOF > Dockerfile
FROM ${BASE_IMAGE}
COPY --from=${{ env.UV_BASE_IMG }}:latest /uv /uvx /usr/local/bin/
COPY --from=${{ env.UV_GHCR_IMAGE }}:latest /uv /uvx /usr/local/bin/
ENTRYPOINT []
CMD ["/usr/local/bin/uv"]
EOF
@ -249,17 +235,14 @@ jobs:
# Loop through all base tags and append its docker metadata pattern to the list
# Order is on purpose such that the label org.opencontainers.image.version has the first pattern with the full version
IFS=','; for TAG in ${BASE_TAGS}; do
TAG_PATTERNS="${TAG_PATTERNS}type=pep440,pattern={{ version }},suffix=-${TAG},value=${{ fromJson(inputs.plan).announcement_tag }}\n"
TAG_PATTERNS="${TAG_PATTERNS}type=pep440,pattern={{ major }}.{{ minor }},suffix=-${TAG},value=${{ fromJson(inputs.plan).announcement_tag }}\n"
TAG_PATTERNS="${TAG_PATTERNS}type=pep440,pattern={{ version }},suffix=-${TAG},value=${{ needs.docker-plan.outputs.tag }}\n"
TAG_PATTERNS="${TAG_PATTERNS}type=pep440,pattern={{ major }}.{{ minor }},suffix=-${TAG},value=${{ needs.docker-plan.outputs.tag }}\n"
TAG_PATTERNS="${TAG_PATTERNS}type=raw,value=${TAG}\n"
done
# Remove the trailing newline from the pattern list
TAG_PATTERNS="${TAG_PATTERNS%\\n}"
# Export image cache name
echo "IMAGE_REF=${BASE_IMAGE//:/-}" >> $GITHUB_ENV
# Export tag patterns using the multiline env var syntax
{
echo "TAG_PATTERNS<<EOF"
@ -274,7 +257,9 @@ jobs:
env:
DOCKER_METADATA_ANNOTATIONS_LEVELS: index
with:
images: ${{ env.UV_BASE_IMG }}
images: |
${{ env.UV_GHCR_IMAGE }}
${{ env.UV_DOCKERHUB_IMAGE }}
flavor: |
latest=false
tags: |
@ -282,67 +267,84 @@ jobs:
- name: Build and push
id: build-and-push
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: depot/build-push-action@2583627a84956d07561420dcc1d0eb1f2af3fac0 # v1.15.0
with:
context: .
project: 7hd4vdzmw5 # astral-sh/uv
platforms: linux/amd64,linux/arm64
# We do not really need to cache here as the Dockerfile is tiny
#cache-from: type=gha,scope=uv-${{ env.IMAGE_REF }}
#cache-to: type=gha,mode=min,scope=uv-${{ env.IMAGE_REF }}
push: true
push: ${{ needs.docker-plan.outputs.push }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
# TODO(zanieb): Annotations are not supported by Depot yet and are ignored
annotations: ${{ steps.meta.outputs.annotations }}
- name: Generate artifact attestation
uses: actions/attest-build-provenance@db473fddc028af60658334401dc6fa3ffd8669fd # v2.3.0
if: ${{ needs.docker-plan.outputs.push == 'true' }}
uses: actions/attest-build-provenance@e8998f949152b193b063cb0ec769d69d929409be # v2.4.0
with:
subject-name: ${{ env.UV_BASE_IMG }}
subject-name: ${{ env.UV_GHCR_IMAGE }}
subject-digest: ${{ steps.build-and-push.outputs.digest }}
# push-to-registry is explicitly not enabled to maintain full control over the top image
# This is effectively a duplicate of `docker-publish` to make https://github.com/astral-sh/uv/pkgs/container/uv
# show the uv base image first since GitHub always shows the last updated image digests
# This works by annotating the original digests (previously non-annotated) which triggers an update to ghcr.io
docker-republish:
name: Annotate Docker image (ghcr.io/astral-sh/uv)
# Push annotations manually.
# See `docker-annotate-base` for details.
- name: Add annotations to images
if: ${{ needs.docker-plan.outputs.push == 'true' }}
env:
IMAGES: "${{ env.UV_GHCR_IMAGE }} ${{ env.UV_DOCKERHUB_IMAGE }}"
DIGEST: ${{ steps.build-and-push.outputs.digest }}
TAGS: ${{ steps.meta.outputs.tags }}
ANNOTATIONS: ${{ steps.meta.outputs.annotations }}
run: |
set -x
readarray -t lines <<< "$ANNOTATIONS"; annotations=(); for line in "${lines[@]}"; do annotations+=(--annotation "$line"); done
for image in $IMAGES; do
readarray -t lines < <(grep "^${image}:" <<< "$TAGS"); tags=(); for line in "${lines[@]}"; do tags+=(-t "$line"); done
docker buildx imagetools create \
"${annotations[@]}" \
"${tags[@]}" \
"${image}@${DIGEST}"
done
# See `docker-annotate-base` for details.
- name: Export manifest digest
id: manifest-digest
if: ${{ needs.docker-plan.outputs.push == 'true' }}
env:
IMAGE: ${{ env.UV_GHCR_IMAGE }}
VERSION: ${{ steps.meta.outputs.version }}
run: |
digest="$(
docker buildx imagetools inspect \
"${IMAGE}:${VERSION}" \
--format '{{json .Manifest}}' \
| jq -r '.digest'
)"
echo "digest=${digest}" >> "$GITHUB_OUTPUT"
# See `docker-annotate-base` for details.
- name: Generate artifact attestation
if: ${{ needs.docker-plan.outputs.push == 'true' }}
uses: actions/attest-build-provenance@e8998f949152b193b063cb0ec769d69d929409be # v2.4.0
with:
subject-name: ${{ env.UV_GHCR_IMAGE }}
subject-digest: ${{ steps.manifest-digest.outputs.digest }}
# Annotate the base image
docker-annotate-base:
name: annotate uv
runs-on: ubuntu-latest
environment:
name: release
name: ${{ needs.docker-plan.outputs.push == 'true' && 'release' || '' }}
needs:
- docker-plan
- docker-publish-base
- docker-publish-extra
if: ${{ inputs.plan != '' && !fromJson(inputs.plan).announcement_tag_is_implicit }}
permissions:
packages: write
attestations: write # needed to push image attestations to the Github attestation store
id-token: write # needed for signing the images with GitHub OIDC Token
if: ${{ needs.docker-plan.outputs.push == 'true' }}
steps:
# Login to DockerHub first, to avoid rate-limiting
- uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
with:
username: astralshbot
password: ${{ secrets.DOCKERHUB_TOKEN_RO }}
- name: Download digests
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
path: /tmp/digests
pattern: digests-*
merge-multiple: true
- uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@902fa8ec7d6ecbf8d84d538b9b233a880e428804 # v5.7.0
env:
DOCKER_METADATA_ANNOTATIONS_LEVELS: index
with:
images: ${{ env.UV_BASE_IMG }}
# Order is on purpose such that the label org.opencontainers.image.version has the first pattern with the full version
tags: |
type=pep440,pattern={{ version }},value=${{ fromJson(inputs.plan).announcement_tag }}
type=pep440,pattern={{ major }}.{{ minor }},value=${{ fromJson(inputs.plan).announcement_tag }}
username: astral
password: ${{ secrets.DOCKERHUB_TOKEN_RW }}
- uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
with:
@ -350,22 +352,37 @@ jobs:
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
# Adapted from https://docs.docker.com/build/ci/github-actions/multi-platform/
- name: Create manifest list and push
working-directory: /tmp/digests
# Depot doesn't support annotating images, so we need to do so manually
# afterwards. Mutating the manifest is desirable regardless, because we
# want to bump the base image to appear at the top of the list on GHCR.
# However, once annotation support is added to Depot, this step can be
# minimized to just touch the GHCR manifest.
- name: Add annotations to images
env:
IMAGES: "${{ env.UV_GHCR_IMAGE }} ${{ env.UV_DOCKERHUB_IMAGE }}"
DIGEST: ${{ needs.docker-publish-base.outputs.image-digest }}
TAGS: ${{ needs.docker-publish-base.outputs.image-tags }}
ANNOTATIONS: ${{ needs.docker-publish-base.outputs.image-annotations }}
# The readarray part is used to make sure the quoting and special characters are preserved on expansion (e.g. spaces)
# The jq command expands the docker/metadata json "tags" array entry to `-t tag1 -t tag2 ...` for each tag in the array
# The printf will expand the base image with the `<UV_BASE_IMG>@sha256:<sha256> ...` for each sha256 in the directory
# The final command becomes `docker buildx imagetools create -t tag1 -t tag2 ... <UV_BASE_IMG>@sha256:<sha256_1> <UV_BASE_IMG>@sha256:<sha256_2> ...`
# The final command becomes `docker buildx imagetools create --annotation 'index:foo=1' --annotation 'index:bar=2' ... -t tag1 -t tag2 ... <IMG>@sha256:<sha256>`
run: |
readarray -t lines <<< "$DOCKER_METADATA_OUTPUT_ANNOTATIONS"; annotations=(); for line in "${lines[@]}"; do annotations+=(--annotation "$line"); done
docker buildx imagetools create \
"${annotations[@]}" \
$(jq -cr '.tags | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \
$(printf '${{ env.UV_BASE_IMG }}@sha256:%s ' *)
set -x
readarray -t lines <<< "$ANNOTATIONS"; annotations=(); for line in "${lines[@]}"; do annotations+=(--annotation "$line"); done
for image in $IMAGES; do
readarray -t lines < <(grep "^${image}:" <<< "$TAGS"); tags=(); for line in "${lines[@]}"; do tags+=(-t "$line"); done
docker buildx imagetools create \
"${annotations[@]}" \
"${tags[@]}" \
"${image}@${DIGEST}"
done
- name: Share manifest digest
# Now that we've modified the manifest, we need to attest it again.
# Note we only generate an attestation for GHCR.
- name: Export manifest digest
id: manifest-digest
env:
IMAGE: ${{ env.UV_GHCR_IMAGE }}
VERSION: ${{ needs.docker-publish-base.outputs.image-version }}
# To sign the manifest, we need its digest. Unfortunately "docker
# buildx imagetools create" does not (yet) have a clean way of sharing
# the digest of the manifest it creates (see docker/buildx#2407), so
@ -377,15 +394,14 @@ jobs:
run: |
digest="$(
docker buildx imagetools inspect \
"${UV_BASE_IMG}:${DOCKER_METADATA_OUTPUT_VERSION}" \
"${IMAGE}:${VERSION}" \
--format '{{json .Manifest}}' \
| jq -r '.digest'
)"
echo "digest=${digest}" >> "$GITHUB_OUTPUT"
- name: Generate artifact attestation
uses: actions/attest-build-provenance@db473fddc028af60658334401dc6fa3ffd8669fd # v2.3.0
uses: actions/attest-build-provenance@e8998f949152b193b063cb0ec769d69d929409be # v2.4.0
with:
subject-name: ${{ env.UV_BASE_IMG }}
subject-name: ${{ env.UV_GHCR_IMAGE }}
subject-digest: ${{ steps.manifest-digest.outputs.digest }}
# push-to-registry is explicitly not enabled to maintain full control over the top image

View file

@ -14,8 +14,9 @@ env:
CARGO_INCREMENTAL: 0
CARGO_NET_RETRY: 10
CARGO_TERM_COLOR: always
RUSTUP_MAX_RETRIES: 10
PYTHON_VERSION: "3.12"
RUSTUP_MAX_RETRIES: 10
RUST_BACKTRACE: 1
jobs:
determine_changes:
@ -39,7 +40,7 @@ jobs:
while IFS= read -r file; do
# Generated markdown and JSON files are checked during test runs.
if [[ "${file}" =~ ^docs/ && ! "${file}" =~ ^docs/reference/(cli|settings).md && ! "${file}" =~ ^docs/configuration/environment.md ]]; then
if [[ "${file}" =~ ^docs/ && ! "${file}" =~ ^docs/reference/(cli|settings).md && ! "${file}" =~ ^docs/reference/environment.md ]]; then
echo "Skipping ${file} (matches docs/ pattern)"
continue
fi
@ -81,7 +82,7 @@ jobs:
run: rustup component add rustfmt
- name: "Install uv"
uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0
uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "rustfmt"
run: cargo fmt --all --check
@ -125,11 +126,11 @@ jobs:
name: "cargo clippy | ubuntu"
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
- name: "Check uv_build dependencies"
uses: EmbarkStudios/cargo-deny-action@34899fc7ba81ca6268d5947a7a16b4649013fea1 # v2.0.11
uses: EmbarkStudios/cargo-deny-action@30f817c6f72275c6d54dc744fbca09ebc958599f # v2.0.12
with:
command: check bans
manifest-path: crates/uv-build/Cargo.toml
@ -155,7 +156,7 @@ jobs:
run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:
workspaces: ${{ env.UV_WORKSPACE }}
@ -174,7 +175,7 @@ jobs:
name: "cargo dev generate-all"
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
- name: "Generate all"
@ -187,7 +188,7 @@ jobs:
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: "Install cargo shear"
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
uses: taiki-e/install-action@7b20dfd705618832f20d29066e34aa2f2f6194c2 # v2.52.8
with:
tool: cargo-shear
- run: cargo shear
@ -207,17 +208,17 @@ jobs:
- uses: rui314/setup-mold@v1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain"
run: rustup show
- uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "Install required Python versions"
run: uv python install
- name: "Install cargo nextest"
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
uses: taiki-e/install-action@7b20dfd705618832f20d29066e34aa2f2f6194c2 # v2.52.8
with:
tool: cargo-nextest
@ -229,7 +230,7 @@ jobs:
--status-level skip --failure-output immediate-final --no-fail-fast -j 20 --final-status-level slow
cargo-test-macos:
timeout-minutes: 10
timeout-minutes: 15
needs: determine_changes
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
runs-on: macos-latest-xlarge # github-macos-14-aarch64-6
@ -239,17 +240,17 @@ jobs:
- uses: rui314/setup-mold@v1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain"
run: rustup show
- uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "Install required Python versions"
run: uv python install
- name: "Install cargo nextest"
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
uses: taiki-e/install-action@7b20dfd705618832f20d29066e34aa2f2f6194c2 # v2.52.8
with:
tool: cargo-nextest
@ -265,7 +266,7 @@ jobs:
timeout-minutes: 15
needs: determine_changes
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
runs-on: github-windows-2025-x86_64-16
runs-on: depot-windows-2022-16
name: "cargo test | windows"
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
@ -278,11 +279,11 @@ jobs:
run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse
- uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- name: "Install required Python versions"
run: uv python install
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:
workspaces: ${{ env.UV_WORKSPACE }}
@ -291,27 +292,11 @@ jobs:
run: rustup show
- name: "Install cargo nextest"
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
uses: taiki-e/install-action@7b20dfd705618832f20d29066e34aa2f2f6194c2 # v2.52.8
with:
tool: cargo-nextest
# Get crash dumps to debug the `exit_code: -1073741819` failures
- name: Configure crash dumps
if: runner.os == 'Windows'
shell: powershell
run: |
$dumps = "$env:GITHUB_WORKSPACE\dumps"
New-Item -Path $dumps -ItemType Directory -Force
# https://github.com/microsoft/terminal/wiki/Troubleshooting-Tips#capture-automatically
$reg = "HKLM:\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps"
New-Item -Path $reg -Force | Out-Null
Set-ItemProperty -Path $reg -Name "DumpFolder" -Value $dumps
Set-ItemProperty -Path $reg -Name "DumpType" -Value 2
- name: "Cargo test"
id: test
continue-on-error: true
working-directory: ${{ env.UV_WORKSPACE }}
env:
# Avoid permission errors during concurrent tests
@ -325,42 +310,6 @@ jobs:
--workspace \
--status-level skip --failure-output immediate-final --no-fail-fast -j 20 --final-status-level slow
# Get crash dumps to debug the `exit_code: -1073741819` failures (contd.)
- name: Analyze crashes
if: steps.test.outcome == 'failure'
shell: powershell
run: |
$dumps = Get-ChildItem "$env:GITHUB_WORKSPACE\dumps\*.dmp" -ErrorAction SilentlyContinue
if (!$dumps) { exit 0 }
Write-Host "Found $($dumps.Count) crash dump(s)"
# Download cdb if needed
$cdb = "C:\Program Files (x86)\Windows Kits\10\Debuggers\x64\cdb.exe"
if (!(Test-Path $cdb)) {
# https://github.com/microsoft/react-native-windows/blob/f1570a5ef1c4fc1e78d0a0ad5af848ab91a4061c/vnext/Scripts/Analyze-Crash.ps1#L44-L56
Invoke-WebRequest "https://go.microsoft.com/fwlink/?linkid=2173743" -OutFile "$env:TEMP\sdk.exe"
Start-Process "$env:TEMP\sdk.exe" -ArgumentList "/features OptionId.WindowsDesktopDebuggers /quiet" -Wait
}
# Analyze each dump
foreach ($dump in $dumps) {
Write-Host "`n=== $($dump.Name) ==="
& $cdb -z $dump -c "!analyze -v; .ecxr; k; q" 2>&1 | Select-String -Pattern "(ExceptionCode:|SYMBOL_NAME:|IMAGE_NAME:|STACK_TEXT:)" -Context 0,2
}
- name: Upload crash dumps
if: steps.test.outcome == 'failure'
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: crash-dumps-${{ github.run_number }}
path: dumps/*.dmp
if-no-files-found: ignore
- name: Fail if tests failed
if: steps.test.outcome == 'failure'
run: exit 1
# Separate jobs for the nightly crate
windows-trampoline-check:
timeout-minutes: 15
@ -383,7 +332,7 @@ jobs:
run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:
workspaces: ${{ env.UV_WORKSPACE }}/crates/uv-trampoline
@ -394,7 +343,7 @@ jobs:
rustup component add rust-src --target ${{ matrix.target-arch }}-pc-windows-msvc
- name: "Install cargo-bloat"
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
uses: taiki-e/install-action@7b20dfd705618832f20d29066e34aa2f2f6194c2 # v2.52.8
with:
tool: cargo-bloat
@ -439,7 +388,7 @@ jobs:
- name: Copy Git Repo to Dev Drive
run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:
workspaces: ${{ env.UV_WORKSPACE }}/crates/uv-trampoline
- name: "Install Rust toolchain"
@ -481,7 +430,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
fetch-depth: 0
- uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- name: "Add SSH key"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
@ -494,7 +443,7 @@ jobs:
- name: "Build docs (insiders)"
if: ${{ env.MKDOCS_INSIDERS_SSH_KEY_EXISTS == 'true' }}
run: uvx --with-requirements docs/requirements.txt mkdocs build --strict -f mkdocs.insiders.yml
run: uvx --with-requirements docs/requirements-insiders.txt mkdocs build --strict -f mkdocs.insiders.yml
build-binary-linux-libc:
timeout-minutes: 10
@ -507,7 +456,7 @@ jobs:
- uses: rui314/setup-mold@v1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Build"
run: cargo build
@ -521,6 +470,31 @@ jobs:
./target/debug/uvx
retention-days: 1
build-binary-linux-aarch64:
timeout-minutes: 10
needs: determine_changes
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && (needs.determine_changes.outputs.code == 'true' || github.ref == 'refs/heads/main') }}
runs-on: github-ubuntu-24.04-aarch64-4
name: "build binary | linux aarch64"
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: rui314/setup-mold@v1
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Build"
run: cargo build
- name: "Upload binary"
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: uv-linux-aarch64-${{ github.sha }}
path: |
./target/debug/uv
./target/debug/uvx
retention-days: 1
build-binary-linux-musl:
timeout-minutes: 10
needs: determine_changes
@ -537,7 +511,7 @@ jobs:
sudo apt-get install musl-tools
rustup target add x86_64-unknown-linux-musl
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Build"
run: cargo build --target x86_64-unknown-linux-musl --bin uv --bin uvx
@ -562,7 +536,7 @@ jobs:
- uses: rui314/setup-mold@v1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Build"
run: cargo build --bin uv --bin uvx
@ -586,7 +560,7 @@ jobs:
- uses: rui314/setup-mold@v1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Build"
run: cargo build --bin uv --bin uvx
@ -616,7 +590,7 @@ jobs:
run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:
workspaces: ${{ env.UV_WORKSPACE }}
@ -651,7 +625,7 @@ jobs:
run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.UV_WORKSPACE }}" -Recurse
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
with:
workspaces: ${{ env.UV_WORKSPACE }}
@ -688,7 +662,7 @@ jobs:
run: rustup default ${{ steps.msrv.outputs.value }}
- name: "Install mold"
uses: rui314/setup-mold@v1
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- run: cargo +${{ steps.msrv.outputs.value }} build
- run: ./target/debug/uv --version
@ -701,7 +675,7 @@ jobs:
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Cross build"
run: |
# Install cross from `freebsd-firecracker`
@ -712,7 +686,7 @@ jobs:
cross build --target x86_64-unknown-freebsd
- name: Test in Firecracker VM
uses: acj/freebsd-firecracker-action@5b4c9938e8b5ff1041c58e21515909a0e1500d59 # v0.4.1
uses: acj/freebsd-firecracker-action@136ca0bce2adade21e526ceb07db643ad23dd2dd # v0.5.1
with:
verbose: false
checkout: false
@ -821,6 +795,33 @@ jobs:
eval "$(./uv generate-shell-completion bash)"
eval "$(./uvx --generate-shell-completion bash)"
smoke-test-linux-aarch64:
timeout-minutes: 10
needs: build-binary-linux-aarch64
name: "smoke test | linux aarch64"
runs-on: github-ubuntu-24.04-aarch64-2
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: "Download binary"
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
name: uv-linux-aarch64-${{ github.sha }}
- name: "Prepare binary"
run: |
chmod +x ./uv
chmod +x ./uvx
- name: "Smoke test"
run: |
./uv run scripts/smoke-test
- name: "Test shell completions"
run: |
eval "$(./uv generate-shell-completion bash)"
eval "$(./uvx --generate-shell-completion bash)"
smoke-test-linux-musl:
timeout-minutes: 10
needs: build-binary-linux-musl
@ -903,7 +904,7 @@ jobs:
timeout-minutes: 10
needs: build-binary-windows-aarch64
name: "smoke test | windows aarch64"
runs-on: github-windows-11-aarch64-4
runs-on: windows-11-arm
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
@ -934,7 +935,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: conda-incubator/setup-miniconda@505e6394dae86d6a5c7fbb6e3fb8938e3e863830 # v3.1.1
- uses: conda-incubator/setup-miniconda@835234971496cad1653abb28a638a281cf32541f # v3.2.0
with:
miniconda-version: latest
activate-environment: uv
@ -1051,6 +1052,96 @@ jobs:
./uv run python -c ""
./uv run -p 3.13t python -c ""
integration-test-windows-aarch64-implicit:
timeout-minutes: 10
needs: build-binary-windows-aarch64
name: "integration test | aarch64 windows implicit"
runs-on: windows-11-arm
steps:
- name: "Download binary"
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
name: uv-windows-aarch64-${{ github.sha }}
- name: "Install Python via uv (implicitly select x64)"
run: |
./uv python install -v 3.13
- name: "Create a virtual environment (stdlib)"
run: |
& (./uv python find 3.13) -m venv .venv
- name: "Check version (stdlib)"
run: |
.venv/Scripts/python --version
- name: "Create a virtual environment (uv)"
run: |
./uv venv -p 3.13 --managed-python
- name: "Check version (uv)"
run: |
.venv/Scripts/python --version
- name: "Check is x64"
run: |
.venv/Scripts/python -c "import sys; exit(1) if 'AMD64' not in sys.version else exit(0)"
- name: "Check install"
run: |
./uv pip install -v anyio
- name: "Check uv run"
run: |
./uv run python -c ""
./uv run -p 3.13 python -c ""
integration-test-windows-aarch64-explicit:
timeout-minutes: 10
needs: build-binary-windows-aarch64
name: "integration test | aarch64 windows explicit"
runs-on: windows-11-arm
steps:
- name: "Download binary"
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
name: uv-windows-aarch64-${{ github.sha }}
- name: "Install Python via uv (explicitly select aarch64)"
run: |
./uv python install -v cpython-3.13-windows-aarch64-none
- name: "Create a virtual environment (stdlib)"
run: |
& (./uv python find 3.13) -m venv .venv
- name: "Check version (stdlib)"
run: |
.venv/Scripts/python --version
- name: "Create a virtual environment (uv)"
run: |
./uv venv -p 3.13 --managed-python
- name: "Check version (uv)"
run: |
.venv/Scripts/python --version
- name: "Check is NOT x64"
run: |
.venv/Scripts/python -c "import sys; exit(1) if 'AMD64' in sys.version else exit(0)"
- name: "Check install"
run: |
./uv pip install -v anyio
- name: "Check uv run"
run: |
./uv run python -c ""
./uv run -p 3.13 python -c ""
integration-test-pypy-linux:
timeout-minutes: 10
needs: build-binary-linux-libc
@ -1294,6 +1385,45 @@ jobs:
run: |
.\uv.exe pip install anyio
integration-test-pyodide-linux:
timeout-minutes: 10
needs: build-binary-linux-libc
name: "integration test | pyodide on ubuntu"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: "Download binary"
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
name: uv-linux-libc-${{ github.sha }}
- name: "Prepare binary"
run: chmod +x ./uv
- name: "Create a native virtual environment"
run: |
./uv venv venv-native -p 3.12
# We use features added in 0.30.3 but there is no known breakage in
# newer versions.
./uv pip install -p venv-native/bin/python pyodide-build==0.30.3 pip
- name: "Install pyodide interpreter"
run: |
source ./venv-native/bin/activate
pyodide xbuildenv install 0.27.5
PYODIDE_PYTHON=$(pyodide config get interpreter)
PYODIDE_INDEX=$(pyodide config get package_index)
echo "PYODIDE_PYTHON=$PYODIDE_PYTHON" >> $GITHUB_ENV
echo "PYODIDE_INDEX=$PYODIDE_INDEX" >> $GITHUB_ENV
- name: "Create pyodide virtual environment"
run: |
./uv venv -p $PYODIDE_PYTHON venv-pyodide
source ./venv-pyodide/bin/activate
./uv pip install --extra-index-url=$PYODIDE_INDEX --no-build numpy
python -c 'import numpy'
integration-test-github-actions:
timeout-minutes: 10
needs: build-binary-linux-libc
@ -1428,6 +1558,90 @@ jobs:
done <<< "${CHANGED_FILES}"
echo "code_any_changed=${CODE_CHANGED}" >> "${GITHUB_OUTPUT}"
integration-test-registries:
timeout-minutes: 10
needs: build-binary-linux-libc
name: "integration test | registries"
runs-on: ubuntu-latest
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-test') && github.event.pull_request.head.repo.fork != true }}
environment: uv-test-registries
env:
PYTHON_VERSION: 3.12
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
fetch-depth: 0
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: "${{ env.PYTHON_VERSION }}"
- name: "Download binary"
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
name: uv-linux-libc-${{ github.sha }}
- name: "Prepare binary"
run: chmod +x ./uv
- name: "Configure AWS credentials"
uses: aws-actions/configure-aws-credentials@f503a1870408dcf2c35d5c2b8a68e69211042c7d
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: "Get AWS CodeArtifact token"
run: |
UV_TEST_AWS_TOKEN=$(aws codeartifact get-authorization-token \
--domain tests \
--domain-owner ${{ secrets.AWS_ACCOUNT_ID }} \
--region us-east-1 \
--query authorizationToken \
--output text)
echo "::add-mask::$UV_TEST_AWS_TOKEN"
echo "UV_TEST_AWS_TOKEN=$UV_TEST_AWS_TOKEN" >> $GITHUB_ENV
- name: "Authenticate with GCP"
id: "auth"
uses: "google-github-actions/auth@0920706a19e9d22c3d0da43d1db5939c6ad837a8"
with:
credentials_json: "${{ secrets.GCP_SERVICE_ACCOUNT_KEY }}"
- name: "Set up GCP SDK"
uses: "google-github-actions/setup-gcloud@a8b58010a5b2a061afd605f50e88629c9ec7536b"
- name: "Get GCP Artifact Registry token"
id: get_token
run: |
UV_TEST_GCP_TOKEN=$(gcloud auth print-access-token)
echo "::add-mask::$UV_TEST_GCP_TOKEN"
echo "UV_TEST_GCP_TOKEN=$UV_TEST_GCP_TOKEN" >> $GITHUB_ENV
- name: "Run registry tests"
run: ./uv run -p ${{ env.PYTHON_VERSION }} scripts/registries-test.py --uv ./uv --color always --all
env:
RUST_LOG: uv=debug
UV_TEST_ARTIFACTORY_TOKEN: ${{ secrets.UV_TEST_ARTIFACTORY_TOKEN }}
UV_TEST_ARTIFACTORY_URL: ${{ secrets.UV_TEST_ARTIFACTORY_URL }}
UV_TEST_ARTIFACTORY_USERNAME: ${{ secrets.UV_TEST_ARTIFACTORY_USERNAME }}
UV_TEST_AWS_URL: ${{ secrets.UV_TEST_AWS_URL }}
UV_TEST_AWS_USERNAME: aws
UV_TEST_AZURE_TOKEN: ${{ secrets.UV_TEST_AZURE_TOKEN }}
UV_TEST_AZURE_URL: ${{ secrets.UV_TEST_AZURE_URL }}
UV_TEST_AZURE_USERNAME: dummy
UV_TEST_CLOUDSMITH_TOKEN: ${{ secrets.UV_TEST_CLOUDSMITH_TOKEN }}
UV_TEST_CLOUDSMITH_URL: ${{ secrets.UV_TEST_CLOUDSMITH_URL }}
UV_TEST_CLOUDSMITH_USERNAME: ${{ secrets.UV_TEST_CLOUDSMITH_USERNAME }}
UV_TEST_GCP_URL: ${{ secrets.UV_TEST_GCP_URL }}
UV_TEST_GCP_USERNAME: oauth2accesstoken
UV_TEST_GEMFURY_TOKEN: ${{ secrets.UV_TEST_GEMFURY_TOKEN }}
UV_TEST_GEMFURY_URL: ${{ secrets.UV_TEST_GEMFURY_URL }}
UV_TEST_GEMFURY_USERNAME: ${{ secrets.UV_TEST_GEMFURY_USERNAME }}
UV_TEST_GITLAB_TOKEN: ${{ secrets.UV_TEST_GITLAB_TOKEN }}
UV_TEST_GITLAB_URL: ${{ secrets.UV_TEST_GITLAB_URL }}
UV_TEST_GITLAB_USERNAME: token
integration-test-publish:
timeout-minutes: 20
needs: integration-test-publish-changed
@ -2000,7 +2214,7 @@ jobs:
timeout-minutes: 10
needs: build-binary-windows-aarch64
name: "check system | x86-64 python3.13 on windows aarch64"
runs-on: github-windows-11-aarch64-4
runs-on: windows-11-arm
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
@ -2018,6 +2232,28 @@ jobs:
- name: "Validate global Python install"
run: py -3.13 ./scripts/check_system_python.py --uv ./uv.exe
system-test-windows-aarch64-aarch64-python-313:
timeout-minutes: 10
needs: build-binary-windows-aarch64
name: "check system | aarch64 python3.13 on windows aarch64"
runs-on: windows-11-arm
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: "3.13"
architecture: "arm64"
allow-prereleases: true
- name: "Download binary"
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
name: uv-windows-aarch64-${{ github.sha }}
- name: "Validate global Python install"
run: py -3.13 ./scripts/check_system_python.py --uv ./uv.exe
# Test our PEP 514 integration that installs Python into the Windows registry.
system-test-windows-registry:
timeout-minutes: 10
@ -2163,7 +2399,7 @@ jobs:
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: conda-incubator/setup-miniconda@505e6394dae86d6a5c7fbb6e3fb8938e3e863830 # v3.1.1
- uses: conda-incubator/setup-miniconda@835234971496cad1653abb28a638a281cf32541f # v3.2.0
with:
miniconda-version: "latest"
activate-environment: uv
@ -2265,13 +2501,13 @@ jobs:
- name: "Checkout Branch"
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Install codspeed"
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
uses: taiki-e/install-action@7b20dfd705618832f20d29066e34aa2f2f6194c2 # v2.52.8
with:
tool: cargo-codspeed
@ -2302,13 +2538,13 @@ jobs:
- name: "Checkout Branch"
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: "Install Rust toolchain"
run: rustup show
- name: "Install codspeed"
uses: taiki-e/install-action@735e5933943122c5ac182670a935f54a949265c1 # v2.52.4
uses: taiki-e/install-action@7b20dfd705618832f20d29066e34aa2f2f6194c2 # v2.52.8
with:
tool: cargo-codspeed

View file

@ -22,12 +22,14 @@ jobs:
id-token: write
steps:
- name: "Install uv"
uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0
uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
pattern: wheels_uv-*
path: wheels_uv
merge-multiple: true
- name: Remove wheels unsupported by PyPI
run: rm wheels_uv/*riscv*
- name: Publish to PyPI
run: uv publish -v wheels_uv/*
@ -41,11 +43,13 @@ jobs:
id-token: write
steps:
- name: "Install uv"
uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0
uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
- uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
pattern: wheels_uv_build-*
path: wheels_uv_build
merge-multiple: true
- name: Remove wheels unsupported by PyPI
run: rm wheels_uv_build/*riscv*
- name: Publish to PyPI
run: uv publish -v wheels_uv_build/*

View file

@ -69,7 +69,7 @@ jobs:
# we specify bash to get pipefail; it guards against the `curl` command
# failing. otherwise `sh` won't catch that `curl` returned non-0
shell: bash
run: "curl --proto '=https' --tlsv1.2 -LsSf https://github.com/astral-sh/cargo-dist/releases/download/v0.28.4/cargo-dist-installer.sh | sh"
run: "curl --proto '=https' --tlsv1.2 -LsSf https://github.com/astral-sh/cargo-dist/releases/download/v0.28.7-prerelease.1/cargo-dist-installer.sh | sh"
- name: Cache dist
uses: actions/upload-artifact@6027e3dd177782cd8ab9af838c04fd81a07f1d47
with:

View file

@ -1,13 +1,43 @@
# Configures a drive for testing in CI.
#
# When using standard GitHub Actions runners, a `D:` drive is present and has
# similar or better performance characteristics than a ReFS dev drive. Sometimes
# using a larger runner is still more performant (e.g., when running the test
# suite) and we need to create a dev drive. This script automatically configures
# the appropriate drive.
#
# When using GitHub Actions' "larger runners", the `D:` drive is not present and
# we create a DevDrive mount on `C:`. This is purported to be more performant
# than a ReFS drive, though we did not see a change when we switched over.
#
# When using Depot runners, the underlying infrastructure is EC2, which does not
# support Hyper-V. The `New-VHD` commandlet only works with Hyper-V, but we can
# create a ReFS drive using `diskpart` and `format` directly. We cannot use a
# DevDrive, as that also requires Hyper-V. The Depot runners use `D:` already,
# so we must check if it's a Depot runner first, and we use `V:` as the target
# instead.
# When not using a GitHub Actions "larger runner", the `D:` drive is present and
# has similar or better performance characteristics than a ReFS dev drive.
# Sometimes using a larger runner is still more performant (e.g., when running
# the test suite) and we need to create a dev drive. This script automatically
# configures the appropriate drive.
# Note that using `Get-PSDrive` is not sufficient because the drive letter is assigned.
if (Test-Path "D:\") {
if ($env:DEPOT_RUNNER -eq "1") {
Write-Output "DEPOT_RUNNER detected, setting up custom dev drive..."
# Create VHD and configure drive using diskpart
$vhdPath = "C:\uv_dev_drive.vhdx"
@"
create vdisk file="$vhdPath" maximum=20480 type=expandable
attach vdisk
create partition primary
active
assign letter=V
"@ | diskpart
# Format the drive as ReFS
format V: /fs:ReFS /q /y
$Drive = "V:"
Write-Output "Custom dev drive created at $Drive"
} elseif (Test-Path "D:\") {
# Note `Get-PSDrive` is not sufficient because the drive letter is assigned.
Write-Output "Using existing drive at D:"
$Drive = "D:"
} else {
@ -55,10 +85,8 @@ Write-Output `
"DEV_DRIVE=$($Drive)" `
"TMP=$($Tmp)" `
"TEMP=$($Tmp)" `
"UV_INTERNAL__TEST_DIR=$($Tmp)" `
"RUSTUP_HOME=$($Drive)/.rustup" `
"CARGO_HOME=$($Drive)/.cargo" `
"UV_WORKSPACE=$($Drive)/uv" `
"PATH=$($Drive)/.cargo/bin;$env:PATH" `
>> $env:GITHUB_ENV

View file

@ -17,7 +17,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: astral-sh/setup-uv@f0ec1fc3b38f5e7cd731bb6ce540c5af426746bb # v6.1.0
- uses: astral-sh/setup-uv@bd01e18f51369d5a26f1651c3cb451d3417e3bba # v6.3.1
with:
version: "latest"
enable-cache: true

3
.gitignore vendored
View file

@ -3,9 +3,10 @@
# Generated by Cargo
# will have compiled files and executables
/vendor/
debug/
target/
target-alpine/
target/
# Bootstrapped Python versions
/bin/

View file

@ -12,7 +12,7 @@ repos:
- id: validate-pyproject
- repo: https://github.com/crate-ci/typos
rev: v1.32.0
rev: v1.34.0
hooks:
- id: typos
@ -42,7 +42,7 @@ repos:
types_or: [yaml, json5]
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.11.12
rev: v0.12.2
hooks:
- id: ruff-format
- id: ruff

View file

@ -3,6 +3,6 @@ CHANGELOG.md
PREVIEW-CHANGELOG.md
docs/reference/cli.md
docs/reference/settings.md
docs/configuration/environment.md
docs/reference/environment.md
ecosystem/home-assistant-core/LICENSE.md
docs/guides/integration/gitlab.md

View file

@ -3,6 +3,272 @@
<!-- prettier-ignore-start -->
## 0.7.19
The **[uv build backend](https://docs.astral.sh/uv/concepts/build-backend/) is now stable**, and considered ready for production use.
The uv build backend is a great choice for pure Python projects. It has reasonable defaults, with the goal of requiring zero configuration for most users, but provides flexible configuration to accommodate most Python project structures. It integrates tightly with uv, to improve messaging and user experience. It validates project metadata and structures, preventing common mistakes. And, finally, it's very fast — `uv sync` on a new project (from `uv init`) is 10-30x faster than with other build backends.
To use uv as a build backend in an existing project, add `uv_build` to the `[build-system]` section in your `pyproject.toml`:
```toml
[build-system]
requires = ["uv_build>=0.7.19,<0.8.0"]
build-backend = "uv_build"
```
In a future release, it will replace `hatchling` as the default in `uv init`. As before, uv will remain compatible with all standards-compliant build backends.
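The backend also accepts optional configuration for projects whose module layout differs from the defaults. A minimal sketch with illustrative values (`module-name` is referenced elsewhere in this changeset; the `module-root` key is an assumed example of the remaining layout options):

```toml
[tool.uv.build-backend]
# Hypothetical values for a package named `pil-util` that ships a `PIL_util` module.
# Override the module name when it differs from the dist-info-normalized package name.
module-name = "PIL_util"
# Assumed setting: directory that contains the module (a `src/` layout is the default).
module-root = "src"
```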
### Python
- Add PGO distributions of Python for aarch64 Linux, which are more optimized for better performance
See the [python-build-standalone release](https://github.com/astral-sh/python-build-standalone/releases/tag/20250702) for more details.
### Enhancements
- Ignore Python patch version for `--universal` pip compile ([#14405](https://github.com/astral-sh/uv/pull/14405))
- Update the tilde version specifier warning to include more context ([#14335](https://github.com/astral-sh/uv/pull/14335))
- Clarify behavior and hint on tool install when no executables are available ([#14423](https://github.com/astral-sh/uv/pull/14423))
### Bug fixes
- Make project and interpreter lock acquisition non-fatal ([#14404](https://github.com/astral-sh/uv/pull/14404))
- Includes `sys.prefix` in cached environment keys to avoid `--with` collisions across projects ([#14403](https://github.com/astral-sh/uv/pull/14403))
### Documentation
- Add a migration guide from pip to uv projects ([#12382](https://github.com/astral-sh/uv/pull/12382))
## 0.7.18
### Python
- Added arm64 Windows Python 3.11, 3.12, 3.13, and 3.14
These are not downloaded by default, since x86-64 Python has broader ecosystem support on Windows.
However, they can be requested with `cpython-<version>-windows-aarch64`.
See the [python-build-standalone release](https://github.com/astral-sh/python-build-standalone/releases/tag/20250630) for more details.
### Enhancements
- Keep track of retries in `ManagedPythonDownload::fetch_with_retry` ([#14378](https://github.com/astral-sh/uv/pull/14378))
- Reuse build (virtual) environments across resolution and installation ([#14338](https://github.com/astral-sh/uv/pull/14338))
- Improve trace message for cached Python interpreter query ([#14328](https://github.com/astral-sh/uv/pull/14328))
- Use parsed URLs for conflicting URL error message ([#14380](https://github.com/astral-sh/uv/pull/14380))
### Preview features
- Ignore invalid build backend settings when not building ([#14372](https://github.com/astral-sh/uv/pull/14372))
### Bug fixes
- Fix equals-star and tilde-equals with `python_version` and `python_full_version` ([#14271](https://github.com/astral-sh/uv/pull/14271))
- Include the canonical path in the interpreter query cache key ([#14331](https://github.com/astral-sh/uv/pull/14331))
- Only drop build directories on program exit ([#14304](https://github.com/astral-sh/uv/pull/14304))
- Error instead of panic on conflict between global and subcommand flags ([#14368](https://github.com/astral-sh/uv/pull/14368))
- Consistently normalize trailing slashes on URLs with no path segments ([#14349](https://github.com/astral-sh/uv/pull/14349))
### Documentation
- Add instructions for publishing to JFrog's Artifactory ([#14253](https://github.com/astral-sh/uv/pull/14253))
- Edits to the build backend documentation ([#14376](https://github.com/astral-sh/uv/pull/14376))
## 0.7.17
### Bug fixes
- Apply build constraints when resolving `--with` dependencies ([#14340](https://github.com/astral-sh/uv/pull/14340))
- Drop trailing slashes when converting index URL from URL ([#14346](https://github.com/astral-sh/uv/pull/14346))
- Ignore `UV_PYTHON_CACHE_DIR` when empty ([#14336](https://github.com/astral-sh/uv/pull/14336))
- Fix error message ordering for `pyvenv.cfg` version conflict ([#14329](https://github.com/astral-sh/uv/pull/14329))
## 0.7.16
### Python
- Add Python 3.14.0b3
See the
[`python-build-standalone` release notes](https://github.com/astral-sh/python-build-standalone/releases/tag/20250626)
for more details.
### Enhancements
- Include path or URL when failing to convert in lockfile ([#14292](https://github.com/astral-sh/uv/pull/14292))
- Warn when `~=` is used as a Python version specifier without a patch version ([#14008](https://github.com/astral-sh/uv/pull/14008))
### Preview features
- Ensure preview default Python installs are upgradeable ([#14261](https://github.com/astral-sh/uv/pull/14261))
### Performance
- Share workspace cache between lock and sync operations ([#14321](https://github.com/astral-sh/uv/pull/14321))
### Bug fixes
- Allow local indexes to reference remote files ([#14294](https://github.com/astral-sh/uv/pull/14294))
- Avoid rendering desugared prefix matches in error messages ([#14195](https://github.com/astral-sh/uv/pull/14195))
- Avoid using path URL for workspace Git dependencies in `requirements.txt` ([#14288](https://github.com/astral-sh/uv/pull/14288))
- Normalize index URLs to remove trailing slash ([#14245](https://github.com/astral-sh/uv/pull/14245))
- Respect URL-encoded credentials in redirect location ([#14315](https://github.com/astral-sh/uv/pull/14315))
- Lock the source tree when running setuptools, to protect concurrent builds ([#14174](https://github.com/astral-sh/uv/pull/14174))
### Documentation
- Note that GCP Artifact Registry download URLs must have `/simple` component ([#14251](https://github.com/astral-sh/uv/pull/14251))
## 0.7.15
### Enhancements
- Consistently use `Ordering::Relaxed` for standalone atomic use cases ([#14190](https://github.com/astral-sh/uv/pull/14190))
- Warn on ambiguous relative paths for `--index` ([#14152](https://github.com/astral-sh/uv/pull/14152))
- Skip GitHub fast path when rate-limited ([#13033](https://github.com/astral-sh/uv/pull/13033))
- Preserve newlines in `schema.json` descriptions ([#13693](https://github.com/astral-sh/uv/pull/13693))
### Bug fixes
- Add check for using minor version link when creating a venv on Windows ([#14252](https://github.com/astral-sh/uv/pull/14252))
- Strip query parameters when parsing source URL ([#14224](https://github.com/astral-sh/uv/pull/14224))
### Documentation
- Add a link to PyPI FAQ to clarify what per-project token is ([#14242](https://github.com/astral-sh/uv/pull/14242))
### Preview features
- Allow symlinks in the build backend ([#14212](https://github.com/astral-sh/uv/pull/14212))
## 0.7.14
### Enhancements
- Add XPU to `--torch-backend` ([#14172](https://github.com/astral-sh/uv/pull/14172))
- Add ROCm backends to `--torch-backend` ([#14120](https://github.com/astral-sh/uv/pull/14120))
- Remove preview label from `--torch-backend` ([#14119](https://github.com/astral-sh/uv/pull/14119))
- Add `[tool.uv.dependency-groups].mygroup.requires-python` ([#13735](https://github.com/astral-sh/uv/pull/13735)); see the sketch after this list
- Add auto-detection for AMD GPUs ([#14176](https://github.com/astral-sh/uv/pull/14176))
- Show retries for HTTP status code errors ([#13897](https://github.com/astral-sh/uv/pull/13897))
- Support transparent Python patch version upgrades ([#13954](https://github.com/astral-sh/uv/pull/13954))
- Warn on empty index directory ([#13940](https://github.com/astral-sh/uv/pull/13940))
- Publish to DockerHub ([#14088](https://github.com/astral-sh/uv/pull/14088))
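The per-group `requires-python` entry above can be sketched roughly as follows, with an assumed `dev` group and an illustrative version bound:

```toml
# Illustrative sketch: declare the group, then bound the Python versions it needs.
[dependency-groups]
dev = ["pytest"]

[tool.uv.dependency-groups]
dev = { requires-python = ">=3.13" }
```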
### Performance
- Make cold resolves about 10% faster ([#14035](https://github.com/astral-sh/uv/pull/14035))
### Bug fixes
- Don't use walrus operator in interpreter query script ([#14108](https://github.com/astral-sh/uv/pull/14108))
- Fix handling of changes to `requires-python` ([#14076](https://github.com/astral-sh/uv/pull/14076))
- Fix implied `platform_machine` marker for `win_amd64` platform tag ([#14041](https://github.com/astral-sh/uv/pull/14041))
- Only update existing symlink directories on preview uninstall ([#14179](https://github.com/astral-sh/uv/pull/14179))
- Serialize Python requests for tools as canonicalized strings ([#14109](https://github.com/astral-sh/uv/pull/14109))
- Support netrc and same-origin credential propagation on index redirects ([#14126](https://github.com/astral-sh/uv/pull/14126))
- Support reading `dependency-groups` from pyproject.tomls with no `[project]` ([#13742](https://github.com/astral-sh/uv/pull/13742))
- Handle an existing shebang in `uv init --script` ([#14141](https://github.com/astral-sh/uv/pull/14141))
- Prevent concurrent updates of the environment in `uv run` ([#14153](https://github.com/astral-sh/uv/pull/14153))
- Filter managed Python distributions by platform before querying when included in request ([#13936](https://github.com/astral-sh/uv/pull/13936))
### Documentation
- Replace cuda124 with cuda128 ([#14168](https://github.com/astral-sh/uv/pull/14168))
- Document the way member sources shadow workspace sources ([#14136](https://github.com/astral-sh/uv/pull/14136))
- Sync documented PyTorch integration index for CUDA and ROCm versions from PyTorch website ([#14100](https://github.com/astral-sh/uv/pull/14100))
## 0.7.13
### Python
- Add Python 3.14.0b2
- Add Python 3.13.5
- Fix stability of `uuid.getnode` on 3.13
See the
[`python-build-standalone` release notes](https://github.com/astral-sh/python-build-standalone/releases/tag/20250612)
for more details.
### Enhancements
- Download versions in `uv python pin` if not found ([#13946](https://github.com/astral-sh/uv/pull/13946))
- Use TTY detection to determine if SIGINT forwarding is enabled ([#13925](https://github.com/astral-sh/uv/pull/13925))
- Avoid fetching an exact, cached Git commit, even if it isn't locked ([#13748](https://github.com/astral-sh/uv/pull/13748))
- Add `zstd` and `deflate` to `Accept-Encoding` ([#13982](https://github.com/astral-sh/uv/pull/13982))
- Build binaries for riscv64 ([#12688](https://github.com/astral-sh/uv/pull/12688))
### Bug fixes
- Check if relative URL is valid directory before treating as index ([#13917](https://github.com/astral-sh/uv/pull/13917))
- Ignore Python discovery errors during `uv python pin` ([#13944](https://github.com/astral-sh/uv/pull/13944))
- Do not allow `uv add --group ... --script` ([#13997](https://github.com/astral-sh/uv/pull/13997))
### Preview changes
- Build backend: Support namespace packages ([#13833](https://github.com/astral-sh/uv/pull/13833))
### Documentation
- Add 3.14 to the supported platform reference ([#13990](https://github.com/astral-sh/uv/pull/13990))
- Add an `llms.txt` to uv ([#13929](https://github.com/astral-sh/uv/pull/13929))
- Add supported macOS version to the platform reference ([#13993](https://github.com/astral-sh/uv/pull/13993))
- Update platform support reference to include Python implementation list ([#13991](https://github.com/astral-sh/uv/pull/13991))
- Update pytorch.md ([#13899](https://github.com/astral-sh/uv/pull/13899))
- Update the CLI help and reference to include references to the Python bin directory ([#13978](https://github.com/astral-sh/uv/pull/13978))
## 0.7.12
### Enhancements
- Add `uv python pin --rm` to remove `.python-version` pins ([#13860](https://github.com/astral-sh/uv/pull/13860))
- Don't hint at versions removed by `excluded-newer` ([#13884](https://github.com/astral-sh/uv/pull/13884))
- Add hint to use `tool.uv.environments` on resolution error ([#13455](https://github.com/astral-sh/uv/pull/13455)); see the sketch after this list
- Add hint to use `tool.uv.required-environments` on resolution error ([#13575](https://github.com/astral-sh/uv/pull/13575))
- Improve `python pin` error messages ([#13862](https://github.com/astral-sh/uv/pull/13862))
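A rough sketch of the `tool.uv.environments` and `tool.uv.required-environments` settings these hints point to, with illustrative marker values:

```toml
[tool.uv]
# Limit resolution to the environments the project actually supports.
environments = [
    "sys_platform == 'linux'",
    "sys_platform == 'darwin'",
]
# Additionally require that resolved distributions cover this environment.
required-environments = [
    "sys_platform == 'linux' and platform_machine == 'x86_64'",
]
```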
### Bug fixes
- Lock environments during `uv sync`, `uv add` and `uv remove` to prevent race conditions ([#13869](https://github.com/astral-sh/uv/pull/13869))
- Add `--no-editable` to `uv export` for `pylock.toml` ([#13852](https://github.com/astral-sh/uv/pull/13852))
### Documentation
- List `.gitignore` in project init files ([#13855](https://github.com/astral-sh/uv/pull/13855))
- Move the pip interface documentation into the concepts section ([#13841](https://github.com/astral-sh/uv/pull/13841))
- Remove the configuration section in favor of concepts / reference ([#13842](https://github.com/astral-sh/uv/pull/13842))
- Update Git and GitHub Actions docs to mention `gh auth login` ([#13850](https://github.com/astral-sh/uv/pull/13850))
### Preview
- Fix directory glob traversal fallback preventing exclusion of all files ([#13882](https://github.com/astral-sh/uv/pull/13882))
## 0.7.11
### Python
- Add Python 3.14.0b1
- Add Python 3.13.4
- Add Python 3.12.11
- Add Python 3.11.13
- Add Python 3.10.18
- Add Python 3.9.23
### Enhancements
- Add Pyodide support ([#12731](https://github.com/astral-sh/uv/pull/12731))
- Better error message for version specifier with missing operator ([#13803](https://github.com/astral-sh/uv/pull/13803))
### Bug fixes
- Downgrade `reqwest` and `hyper-util` to resolve connection reset errors over IPv6 ([#13835](https://github.com/astral-sh/uv/pull/13835))
- Prefer `uv`'s binary's version when checking if it's up to date ([#13840](https://github.com/astral-sh/uv/pull/13840))
### Documentation
- Use "terminal driver" instead of "shell" in `SIGINT` docs ([#13787](https://github.com/astral-sh/uv/pull/13787))
## 0.7.10
### Enhancements
@ -15,7 +281,7 @@
- Avoid redaction of placeholder `git` username when using SSH authentication ([#13799](https://github.com/astral-sh/uv/pull/13799))
- Propagate credentials to files on devpi indexes ending in `/+simple` ([#13743](https://github.com/astral-sh/uv/pull/13743))
- Restore retention of credentials for direct URLs in `uv export` ([#13809](https://github.com/astral-sh/uv/pull/13809))
- Restore retention of credentials for direct URLs in `uv export` ([#13809](https://github.com/astral-sh/uv/pull/13809))
## 0.7.9

437
Cargo.lock generated

File diff suppressed because it is too large

View file

@ -12,7 +12,7 @@ resolver = "2"
[workspace.package]
edition = "2024"
rust-version = "1.85"
rust-version = "1.86"
homepage = "https://pypi.org/project/uv/"
documentation = "https://pypi.org/project/uv/"
repository = "https://github.com/astral-sh/uv"
@ -142,16 +142,16 @@ ref-cast = { version = "1.0.24" }
reflink-copy = { version = "0.1.19" }
regex = { version = "1.10.6" }
regex-automata = { version = "0.4.8", default-features = false, features = ["dfa-build", "dfa-search", "perf", "std", "syntax"] }
reqwest = { version = "0.12.7", default-features = false, features = ["json", "gzip", "stream", "rustls-tls", "rustls-tls-native-roots", "socks", "multipart", "http2", "blocking"] }
reqwest-middleware = { version = "0.4.0", features = ["multipart"] }
reqwest-retry = { version = "0.7.0" }
reqwest = { version = "0.12.22", default-features = false, features = ["json", "gzip", "deflate", "zstd", "stream", "rustls-tls", "rustls-tls-native-roots", "socks", "multipart", "http2", "blocking"] }
reqwest-middleware = { git = "https://github.com/astral-sh/reqwest-middleware", rev = "ad8b9d332d1773fde8b4cd008486de5973e0a3f8", features = ["multipart"] }
reqwest-retry = { git = "https://github.com/astral-sh/reqwest-middleware", rev = "ad8b9d332d1773fde8b4cd008486de5973e0a3f8" }
rkyv = { version = "0.8.8", features = ["bytecheck"] }
rmp-serde = { version = "1.3.0" }
rust-netrc = { version = "0.1.2" }
rustc-hash = { version = "2.0.0" }
rustix = { version = "1.0.0", default-features = false, features = ["fs", "std"] }
same-file = { version = "1.0.6" }
schemars = { version = "0.8.21", features = ["url"] }
schemars = { version = "1.0.0", features = ["url2"] }
seahash = { version = "4.1.0" }
self-replace = { version = "1.5.0" }
serde = { version = "1.0.210", features = ["derive", "rc"] }
@ -183,13 +183,13 @@ unscanny = { version = "0.1.0" }
url = { version = "2.5.2", features = ["serde"] }
version-ranges = { git = "https://github.com/astral-sh/pubgrub", rev = "06ec5a5f59ffaeb6cf5079c6cb184467da06c9db" }
walkdir = { version = "2.5.0" }
which = { version = "7.0.0", features = ["regex"] }
which = { version = "8.0.0", features = ["regex"] }
windows = { version = "0.59.0", features = ["Win32_Storage_FileSystem"] }
windows-core = { version = "0.59.0" }
windows-registry = { version = "0.5.0" }
windows-result = { version = "0.3.0" }
windows-sys = { version = "0.59.0", features = ["Win32_Foundation", "Win32_Security", "Win32_Storage_FileSystem", "Win32_System_Ioctl", "Win32_System_IO", "Win32_System_Registry"] }
wiremock = { version = "0.6.2" }
wiremock = { version = "0.6.4" }
xz2 = { version = "0.1.7" }
zip = { version = "2.2.3", default-features = false, features = ["deflate", "zstd", "bzip2", "lzma", "xz"] }
@ -214,6 +214,7 @@ missing_panics_doc = "allow"
module_name_repetitions = "allow"
must_use_candidate = "allow"
similar_names = "allow"
struct_excessive_bools = "allow"
too_many_arguments = "allow"
too_many_lines = "allow"
used_underscore_binding = "allow"
@ -296,83 +297,6 @@ codegen-units = 1
[profile.dist]
inherits = "release"
# Config for 'dist'
[workspace.metadata.dist]
# The preferred dist version to use in CI (Cargo.toml SemVer syntax)
cargo-dist-version = "0.28.4"
# make a package being included in our releases opt-in instead of opt-out
dist = false
# CI backends to support
ci = "github"
# The installers to generate for each app
installers = ["shell", "powershell"]
# The archive format to use for windows builds (defaults .zip)
windows-archive = ".zip"
# The archive format to use for non-windows builds (defaults .tar.xz)
unix-archive = ".tar.gz"
# Target platforms to build apps for (Rust target-triple syntax)
targets = [
"aarch64-apple-darwin",
"aarch64-pc-windows-msvc",
"aarch64-unknown-linux-gnu",
"aarch64-unknown-linux-musl",
"arm-unknown-linux-musleabihf",
"armv7-unknown-linux-gnueabihf",
"armv7-unknown-linux-musleabihf",
"i686-pc-windows-msvc",
"i686-unknown-linux-gnu",
"i686-unknown-linux-musl",
"powerpc64-unknown-linux-gnu",
"powerpc64le-unknown-linux-gnu",
"s390x-unknown-linux-gnu",
"x86_64-apple-darwin",
"x86_64-pc-windows-msvc",
"x86_64-unknown-linux-gnu",
"x86_64-unknown-linux-musl",
]
# Whether to auto-include files like READMEs, LICENSEs, and CHANGELOGs (default true)
auto-includes = false
# Whether dist should create a Github Release or use an existing draft
create-release = true
# Which actions to run on pull requests
pr-run-mode = "plan"
# Whether CI should trigger releases with dispatches instead of tag pushes
dispatch-releases = true
# Which phase dist should use to create the GitHub release
github-release = "announce"
# Whether CI should include auto-generated code to build local artifacts
build-local-artifacts = false
# Local artifacts jobs to run in CI
local-artifacts-jobs = ["./build-binaries", "./build-docker"]
# Publish jobs to run in CI
publish-jobs = ["./publish-pypi"]
# Post-announce jobs to run in CI
post-announce-jobs = ["./publish-docs"]
# Custom permissions for GitHub Jobs
github-custom-job-permissions = { "build-docker" = { packages = "write", contents = "read", id-token = "write", attestations = "write" } }
# Whether to install an updater program
install-updater = false
# Path that installers should place binaries in
install-path = ["$XDG_BIN_HOME/", "$XDG_DATA_HOME/../bin", "~/.local/bin"]
[workspace.metadata.dist.github-custom-runners]
global = "depot-ubuntu-latest-4"
[workspace.metadata.dist.min-glibc-version]
# Override glibc version for specific target triplets.
aarch64-unknown-linux-gnu = "2.28"
# Override all remaining glibc versions.
"*" = "2.17"
[workspace.metadata.dist.github-action-commits]
"actions/checkout" = "11bd71901bbe5b1630ceea73d27597364c9af683" # v4
"actions/upload-artifact" = "6027e3dd177782cd8ab9af838c04fd81a07f1d47" # v4.6.2
"actions/download-artifact" = "d3f86a106a0bac45b974a628896c90dbdf5c8093" # v4.3.0
"actions/attest-build-provenance" = "c074443f1aee8d4aeeae555aebba3282517141b2" #v2.2.3
[workspace.metadata.dist.binaries]
"*" = ["uv", "uvx"]
# Add "uvw" binary for Windows targets
aarch64-pc-windows-msvc = ["uv", "uvx", "uvw"]
i686-pc-windows-msvc = ["uv", "uvx", "uvw"]
x86_64-pc-windows-msvc = ["uv", "uvx", "uvw"]
[patch.crates-io]
reqwest-middleware = { git = "https://github.com/astral-sh/reqwest-middleware", rev = "ad8b9d332d1773fde8b4cd008486de5973e0a3f8" }
reqwest-retry = { git = "https://github.com/astral-sh/reqwest-middleware", rev = "ad8b9d332d1773fde8b4cd008486de5973e0a3f8" }

View file

@ -960,7 +960,7 @@ argument (or the `UV_INDEX` environment variable); to replace the default index
These changes are entirely backwards-compatible with the deprecated `--index-url` and
`--extra-index-url` options, which continue to work as before.
See the [Index](https://docs.astral.sh/uv/configuration/indexes/) documentation for more.
See the [Index](https://docs.astral.sh/uv/concepts/indexes/) documentation for more.
### Enhancements

View file

@ -6,6 +6,8 @@ doc-valid-idents = [
"GraalPy",
"ReFS",
"PyTorch",
"ROCm",
"XPU",
".." # Include the defaults
]
@ -35,7 +37,7 @@ disallowed-methods = [
"std::fs::soft_link",
"std::fs::symlink_metadata",
"std::fs::write",
"std::os::unix::fs::symlink",
"std::os::windows::fs::symlink_dir",
"std::os::windows::fs::symlink_file",
{ path = "std::os::unix::fs::symlink", allow-invalid = true },
{ path = "std::os::windows::fs::symlink_dir", allow-invalid = true },
{ path = "std::os::windows::fs::symlink_file", allow-invalid = true },
]

View file

@ -86,7 +86,7 @@ impl Indexes {
Self(FxHashSet::default())
}
/// Create a new [`AuthIndexUrls`] from an iterator of [`AuthIndexUrl`]s.
/// Create a new [`Indexes`] instance from an iterator of [`Index`]s.
pub fn from_indexes(urls: impl IntoIterator<Item = Index>) -> Self {
let mut index_urls = Self::new();
for url in urls {

View file

@ -18,11 +18,6 @@ workspace = true
doctest = false
bench = false
[[bench]]
name = "distribution-filename"
path = "benches/distribution_filename.rs"
harness = false
[[bench]]
name = "uv"
path = "benches/uv.rs"
@ -34,7 +29,6 @@ uv-client = { workspace = true }
uv-configuration = { workspace = true }
uv-dispatch = { workspace = true }
uv-distribution = { workspace = true }
uv-distribution-filename = { workspace = true }
uv-distribution-types = { workspace = true }
uv-extract = { workspace = true, optional = true }
uv-install-wheel = { workspace = true }
@ -48,8 +42,10 @@ uv-types = { workspace = true }
uv-workspace = { workspace = true }
anyhow = { workspace = true }
codspeed-criterion-compat = { version = "2.7.2", default-features = false, optional = true }
criterion = { version = "0.6.0", default-features = false, features = ["async_tokio"] }
codspeed-criterion-compat = { version = "3.0.2", default-features = false, optional = true }
criterion = { version = "0.6.0", default-features = false, features = [
"async_tokio",
] }
jiff = { workspace = true }
tokio = { workspace = true }

View file

@ -1,168 +0,0 @@
use std::str::FromStr;
use uv_bench::criterion::{
BenchmarkId, Criterion, Throughput, criterion_group, criterion_main, measurement::WallTime,
};
use uv_distribution_filename::WheelFilename;
use uv_platform_tags::{AbiTag, LanguageTag, PlatformTag, Tags};
/// A set of platform tags extracted from burntsushi's Archlinux workstation.
/// We could just re-create these via `Tags::from_env`, but those might differ
/// depending on the platform. This way, we always use the same data. It also
/// lets us assert tag compatibility regardless of where the benchmarks run.
const PLATFORM_TAGS: &[(&str, &str, &str)] = include!("../inputs/platform_tags.rs");
/// A set of wheel names used in the benchmarks below. We pick short and long
/// names, as well as compatible and not-compatibles (with `PLATFORM_TAGS`)
/// names.
///
/// The tuple is (name, filename, compatible) where `name` is a descriptive
/// name for humans used in the benchmark definition. And `filename` is the
/// actual wheel filename we want to benchmark operation on. And `compatible`
/// indicates whether the tags in the wheel filename are expected to be
/// compatible with the tags in `PLATFORM_TAGS`.
const WHEEL_NAMES: &[(&str, &str, bool)] = &[
// This tests a case with a very short name that *is* compatible with
// PLATFORM_TAGS. It only uses one tag for each component (one Python
// version, one ABI and one platform).
(
"flyte-short-compatible",
"ipython-2.1.0-py3-none-any.whl",
true,
),
// This tests a case with a long name that is *not* compatible. That
// is, all platform tags need to be checked against the tags in the
// wheel filename. This is essentially the worst possible practical
// case.
(
"flyte-long-incompatible",
"protobuf-3.5.2.post1-cp36-cp36m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl",
false,
),
// This tests a case with a long name that *is* compatible. We
// expect this to be (on average) quicker because the compatibility
// check stops as soon as a positive match is found. (Where as the
// incompatible case needs to check all tags.)
(
"flyte-long-compatible",
"coverage-6.6.0b1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl",
true,
),
];
/// A list of names that are candidates for wheel filenames but will ultimately
/// fail to parse.
const INVALID_WHEEL_NAMES: &[(&str, &str)] = &[
("flyte-short-extension", "mock-5.1.0.tar.gz"),
(
"flyte-long-extension",
"Pillow-5.4.0.dev0-py3.7-macosx-10.13-x86_64.egg",
),
];
/// Benchmarks the construction of platform tags.
///
/// This only happens ~once per program startup. Originally, construction was
/// trivial. But to speed up `WheelFilename::is_compatible`, we added some
/// extra processing. We thus expect construction to become slower, but we
/// write a benchmark to ensure it is still "reasonable."
fn benchmark_build_platform_tags(c: &mut Criterion<WallTime>) {
let tags: Vec<(LanguageTag, AbiTag, PlatformTag)> = PLATFORM_TAGS
.iter()
.map(|&(py, abi, plat)| {
(
LanguageTag::from_str(py).unwrap(),
AbiTag::from_str(abi).unwrap(),
PlatformTag::from_str(plat).unwrap(),
)
})
.collect();
let mut group = c.benchmark_group("build_platform_tags");
group.bench_function(BenchmarkId::from_parameter("burntsushi-archlinux"), |b| {
b.iter(|| std::hint::black_box(Tags::new(tags.clone())));
});
group.finish();
}
/// Benchmarks `WheelFilename::from_str`. This has been observed to take some
/// non-trivial time in profiling (although, at time of writing, not as much
/// as tag compatibility). In the process of optimizing tag compatibility,
/// we tweaked wheel filename parsing. This benchmark was therefore added to
/// ensure we didn't regress here.
fn benchmark_wheelname_parsing(c: &mut Criterion<WallTime>) {
let mut group = c.benchmark_group("wheelname_parsing");
for (name, filename, _) in WHEEL_NAMES.iter().copied() {
let len = u64::try_from(filename.len()).expect("length fits in u64");
group.throughput(Throughput::Bytes(len));
group.bench_function(BenchmarkId::from_parameter(name), |b| {
b.iter(|| {
filename
.parse::<WheelFilename>()
.expect("valid wheel filename");
});
});
}
group.finish();
}
/// Benchmarks `WheelFilename::from_str` when it fails. This routine is called
/// on every filename in a package's metadata. A non-trivial portion of which
/// are not wheel filenames. Ensuring that the error path is fast is thus
/// probably a good idea.
fn benchmark_wheelname_parsing_failure(c: &mut Criterion<WallTime>) {
let mut group = c.benchmark_group("wheelname_parsing_failure");
for (name, filename) in INVALID_WHEEL_NAMES.iter().copied() {
let len = u64::try_from(filename.len()).expect("length fits in u64");
group.throughput(Throughput::Bytes(len));
group.bench_function(BenchmarkId::from_parameter(name), |b| {
b.iter(|| {
filename
.parse::<WheelFilename>()
.expect_err("invalid wheel filename");
});
});
}
group.finish();
}
/// Benchmarks the `WheelFilename::is_compatible` routine. This was revealed
/// to be the #1 bottleneck in the resolver. The main issue was that the
/// set of platform tags (generated once) is quite large, and the original
/// implementation did an exhaustive search over each of them for each tag in
/// the wheel filename.
fn benchmark_wheelname_tag_compatibility(c: &mut Criterion<WallTime>) {
let tags: Vec<(LanguageTag, AbiTag, PlatformTag)> = PLATFORM_TAGS
.iter()
.map(|&(py, abi, plat)| {
(
LanguageTag::from_str(py).unwrap(),
AbiTag::from_str(abi).unwrap(),
PlatformTag::from_str(plat).unwrap(),
)
})
.collect();
let tags = Tags::new(tags);
let mut group = c.benchmark_group("wheelname_tag_compatibility");
for (name, filename, expected) in WHEEL_NAMES.iter().copied() {
let wheelname: WheelFilename = filename.parse().expect("valid wheel filename");
let len = u64::try_from(filename.len()).expect("length fits in u64");
group.throughput(Throughput::Bytes(len));
group.bench_function(BenchmarkId::from_parameter(name), |b| {
b.iter(|| {
assert_eq!(expected, wheelname.is_compatible(&tags));
});
});
}
group.finish();
}
criterion_group!(
uv_distribution_filename,
benchmark_build_platform_tags,
benchmark_wheelname_parsing,
benchmark_wheelname_parsing_failure,
benchmark_wheelname_tag_compatibility,
);
criterion_main!(uv_distribution_filename);

View file

@ -1,6 +1,6 @@
use std::str::FromStr;
use uv_bench::criterion::black_box;
use std::hint::black_box;
use uv_bench::criterion::{Criterion, criterion_group, criterion_main, measurement::WallTime};
use uv_cache::Cache;
use uv_client::RegistryClientBuilder;
@ -91,7 +91,7 @@ mod resolver {
};
use uv_dispatch::{BuildDispatch, SharedState};
use uv_distribution::DistributionDatabase;
use uv_distribution_types::{DependencyMetadata, IndexLocations};
use uv_distribution_types::{DependencyMetadata, IndexLocations, RequiresPython};
use uv_install_wheel::LinkMode;
use uv_pep440::Version;
use uv_pep508::{MarkerEnvironment, MarkerEnvironmentBuilder};
@ -99,8 +99,8 @@ mod resolver {
use uv_pypi_types::{Conflicts, ResolverMarkerEnvironment};
use uv_python::Interpreter;
use uv_resolver::{
FlatIndex, InMemoryIndex, Manifest, OptionsBuilder, PythonRequirement, RequiresPython,
Resolver, ResolverEnvironment, ResolverOutput,
FlatIndex, InMemoryIndex, Manifest, OptionsBuilder, PythonRequirement, Resolver,
ResolverEnvironment, ResolverOutput,
};
use uv_types::{BuildIsolation, EmptyInstalledPackages, HashStrategy};
use uv_workspace::WorkspaceCache;
@ -206,6 +206,7 @@ mod resolver {
options,
&python_requirement,
markers,
interpreter.markers(),
conflicts,
Some(&TAGS),
&flat_index,

View file

@ -9,21 +9,19 @@ pub use settings::{BuildBackendSettings, WheelDataIncludes};
pub use source_dist::{build_source_dist, list_source_dist};
pub use wheel::{build_editable, build_wheel, list_wheel, metadata};
use std::fs::FileType;
use std::io;
use std::path::{Path, PathBuf};
use std::str::FromStr;
use thiserror::Error;
use tracing::debug;
use walkdir::DirEntry;
use uv_fs::Simplified;
use uv_globfilter::PortableGlobError;
use uv_normalize::PackageName;
use uv_pypi_types::IdentifierParseError;
use uv_pypi_types::{Identifier, IdentifierParseError};
use crate::metadata::ValidationError;
use crate::settings::ModuleName;
#[derive(Debug, Error)]
pub enum Error {
@ -33,8 +31,8 @@ pub enum Error {
Toml(#[from] toml::de::Error),
#[error("Invalid pyproject.toml")]
Validation(#[from] ValidationError),
#[error(transparent)]
Identifier(#[from] IdentifierParseError),
#[error("Invalid module name: {0}")]
InvalidModuleName(String, #[source] IdentifierParseError),
#[error("Unsupported glob expression in: `{field}`")]
PortableGlob {
field: String,
@ -56,33 +54,14 @@ pub enum Error {
#[source]
err: walkdir::Error,
},
#[error("Unsupported file type {:?}: `{}`", _1, _0.user_display())]
UnsupportedFileType(PathBuf, FileType),
#[error("Failed to write wheel zip archive")]
Zip(#[from] zip::result::ZipError),
#[error("Failed to write RECORD file")]
Csv(#[from] csv::Error),
#[error(
"Missing source directory at: `{}`",
_0.user_display()
)]
MissingSrc(PathBuf),
#[error(
"Expected a Python module directory at: `{}`",
_0.user_display()
)]
#[error("Expected a Python module at: `{}`", _0.user_display())]
MissingInitPy(PathBuf),
#[error(
"Missing module directory for `{}` in `{}`. Found: `{}`",
module_name,
src_root.user_display(),
dir_listing.join("`, `")
)]
MissingModuleDir {
module_name: String,
src_root: PathBuf,
dir_listing: Vec<String>,
},
#[error("For namespace packages, `__init__.py[i]` is not allowed in parent directory: `{}`", _0.user_display())]
NotANamespace(PathBuf),
/// Either an absolute path or a parent path through `..`.
#[error("Module root must be inside the project: `{}`", _0.user_display())]
InvalidModuleRoot(PathBuf),
@ -105,6 +84,16 @@ trait DirectoryWriter {
/// Files added through the method are considered generated when listing included files.
fn write_bytes(&mut self, path: &str, bytes: &[u8]) -> Result<(), Error>;
/// Add the file or directory to the path.
fn write_dir_entry(&mut self, entry: &DirEntry, target_path: &str) -> Result<(), Error> {
if entry.file_type().is_dir() {
self.write_directory(target_path)?;
} else {
self.write_file(target_path, entry.path())?;
}
Ok(())
}
/// Add a local file.
fn write_file(&mut self, path: &str, file: &Path) -> Result<(), Error>;
@ -195,12 +184,26 @@ fn check_metadata_directory(
Ok(())
}
/// Resolve the source root, module root and the module name.
/// Returns the source root and the module path with the `__init__.py[i]` below it, while
/// checking the project layout and names.
///
/// Some target platforms have case-sensitive filesystems, while others have case-insensitive
/// filesystems. We always lowercase the package name to derive the default module name, while
/// some users want uppercase letters in their module names. For example, the package name is
/// `pil_util`, but the module `PIL_util`. To make the behavior as consistent as possible across
/// platforms, we require that an uppercase name is given explicitly through
/// `tool.uv.build-backend.module-name`.
///
/// By default, the dist-info-normalized package name is the module name. For
/// dist-info-normalization, the rules are lowercasing, replacing `.` with `_` and
/// replacing `-` with `_`. Since `.` and `-` are not allowed in identifiers, we can use a string
/// comparison with the module name.
fn find_roots(
source_tree: &Path,
pyproject_toml: &PyProjectToml,
relative_module_root: &Path,
module_name: Option<&ModuleName>,
module_name: Option<&str>,
namespace: bool,
) -> Result<(PathBuf, PathBuf), Error> {
let relative_module_root = uv_fs::normalize_path(relative_module_root);
let src_root = source_tree.join(&relative_module_root);
@ -208,93 +211,114 @@ fn find_roots(
return Err(Error::InvalidModuleRoot(relative_module_root.to_path_buf()));
}
let src_root = source_tree.join(&relative_module_root);
let module_root = find_module_root(&src_root, module_name, pyproject_toml.name())?;
Ok((src_root, module_root))
debug!("Source root: {}", src_root.user_display());
if namespace {
// `namespace = true` disables module structure checks.
let module_relative = if let Some(module_name) = module_name {
module_name.split('.').collect::<PathBuf>()
} else {
PathBuf::from(pyproject_toml.name().as_dist_info_name().to_string())
};
debug!("Namespace module path: {}", module_relative.user_display());
return Ok((src_root, module_relative));
}
let module_relative = if let Some(module_name) = module_name {
module_path_from_module_name(&src_root, module_name)?
} else {
find_module_path_from_package_name(&src_root, pyproject_toml.name())?
};
debug!("Module path: {}", module_relative.user_display());
Ok((src_root, module_relative))
}
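To make the normalization rules in the doc comment above concrete: the default module name is the package name lowercased with `.` and `-` mapped to `_`. A minimal standalone sketch — `dist_info_normalize` is a hypothetical helper for illustration; the real code goes through `PackageName::as_dist_info_name`:

```rust
// Hypothetical helper illustrating the dist-info-style normalization described
// above (lowercase, then map `.` and `-` to `_`); not uv's actual implementation.
fn dist_info_normalize(package_name: &str) -> String {
    package_name.to_lowercase().replace('.', "_").replace('-', "_")
}

fn main() {
    // `PIL-Util` defaults to the module `pil_util`; a module with uppercase
    // letters (e.g. `PIL_util`) must be set explicitly via `module-name`.
    assert_eq!(dist_info_normalize("PIL-Util"), "pil_util");
    // Dots are replaced as well, so `cloud.db.schema` becomes `cloud_db_schema`.
    assert_eq!(dist_info_normalize("cloud.db.schema"), "cloud_db_schema");
}
```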
/// Match the module name to its module directory with potentially different casing.
/// Infer stubs packages from package name alone.
///
/// Some target platforms have case-sensitive filesystems, while others have case-insensitive
/// filesystems and we always lower case the package name, our default for the module, while some
/// users want uppercase letters in their module names. For example, the package name is `pil_util`,
/// but the module `PIL_util`.
///
/// By default, the dist-info-normalized package name is the module name. For
/// dist-info-normalization, the rules are lowercasing, replacing `.` with `_` and
/// replace `-` with `_`. Since `.` and `-` are not allowed in identifiers, we can use a string
/// comparison with the module name.
///
/// To make the behavior as consistent as possible across platforms as possible, we require that an
/// upper case name is given explicitly through `tool.uv.module-name`.
///
/// Returns the module root path, the directory below which the `__init__.py` lives.
fn find_module_root(
/// There are potential false positives if someone had a regular package whose name ends with
/// `-stubs`. The `Identifier` checks in `module_path_from_module_name` are covered here by the
/// `PackageName` validation.
fn find_module_path_from_package_name(
src_root: &Path,
module_name: Option<&ModuleName>,
package_name: &PackageName,
) -> Result<PathBuf, Error> {
let (module_name, stubs) = if let Some(module_name) = module_name {
// This name can be uppercase.
match module_name {
ModuleName::Identifier(module_name) => (module_name.to_string(), false),
ModuleName::Stubs(module_name) => (module_name.to_string(), true),
if let Some(stem) = package_name.to_string().strip_suffix("-stubs") {
debug!("Building stubs package instead of a regular package");
let module_name = PackageName::from_str(stem)
.expect("non-empty package name prefix must be valid package name")
.as_dist_info_name()
.to_string();
let module_relative = PathBuf::from(format!("{module_name}-stubs"));
let init_pyi = src_root.join(&module_relative).join("__init__.pyi");
if !init_pyi.is_file() {
return Err(Error::MissingInitPy(init_pyi));
}
Ok(module_relative)
} else {
// Infer stubs packages from package name alone. There are potential false positives if
// someone had a regular package with `-stubs`.
if let Some(stem) = package_name.to_string().strip_suffix("-stubs") {
debug!("Building stubs package instead of a regular package");
let module_name = PackageName::from_str(stem)
.expect("non-empty package name prefix must be valid package name")
.as_dist_info_name()
.to_string();
(format!("{module_name}-stubs"), true)
} else {
// This name is always lowercase.
(package_name.as_dist_info_name().to_string(), false)
// This name is always lowercase.
let module_relative = PathBuf::from(package_name.as_dist_info_name().to_string());
let init_py = src_root.join(&module_relative).join("__init__.py");
if !init_py.is_file() {
return Err(Error::MissingInitPy(init_py));
}
Ok(module_relative)
}
}
/// Determine the relative module path from an explicit module name.
fn module_path_from_module_name(src_root: &Path, module_name: &str) -> Result<PathBuf, Error> {
// This name can be uppercase.
let module_relative = module_name.split('.').collect::<PathBuf>();
// Check if we have a regular module or a namespace.
let (root_name, namespace_segments) =
if let Some((root_name, namespace_segments)) = module_name.split_once('.') {
(
root_name,
namespace_segments.split('.').collect::<Vec<&str>>(),
)
} else {
(module_name, Vec::new())
};
// Check if we have an implementation or a stubs package.
// For stubs for a namespace, the `-stubs` prefix must be on the root.
let stubs = if let Some(stem) = root_name.strip_suffix("-stubs") {
// Check that the stubs belong to a valid module.
Identifier::from_str(stem)
.map_err(|err| Error::InvalidModuleName(module_name.to_string(), err))?;
true
} else {
Identifier::from_str(root_name)
.map_err(|err| Error::InvalidModuleName(module_name.to_string(), err))?;
false
};
let dir = match fs_err::read_dir(src_root) {
Ok(dir_iterator) => dir_iterator.collect::<Result<Vec<_>, _>>()?,
Err(err) if err.kind() == io::ErrorKind::NotFound => {
return Err(Error::MissingSrc(src_root.to_path_buf()));
}
Err(err) => return Err(Error::Io(err)),
};
let module_root = dir.iter().find_map(|entry| {
// TODO(konsti): Do we ever need to check if `dir/{module_name}/__init__.py` exists because
// the wrong casing may be recorded on disk?
if entry
.file_name()
.to_str()
.is_some_and(|file_name| file_name == module_name)
// For a namespace, check that all names below the root are valid.
for segment in namespace_segments {
Identifier::from_str(segment)
.map_err(|err| Error::InvalidModuleName(module_name.to_string(), err))?;
}
// Check that an `__init__.py[i]` exists for the module.
let init_py =
src_root
.join(&module_relative)
.join(if stubs { "__init__.pyi" } else { "__init__.py" });
if !init_py.is_file() {
return Err(Error::MissingInitPy(init_py));
}
// For a namespace, check that the directories above the lowest are namespace directories.
for namespace_dir in module_relative.ancestors().skip(1) {
if src_root.join(namespace_dir).join("__init__.py").exists()
|| src_root.join(namespace_dir).join("__init__.pyi").exists()
{
Some(entry.path())
} else {
None
return Err(Error::NotANamespace(src_root.join(namespace_dir)));
}
});
let init_py = if stubs { "__init__.pyi" } else { "__init__.py" };
let module_root = if let Some(module_root) = module_root {
if module_root.join(init_py).is_file() {
module_root.clone()
} else {
return Err(Error::MissingInitPy(module_root.join(init_py)));
}
} else {
return Err(Error::MissingModuleDir {
module_name,
src_root: src_root.to_path_buf(),
dir_listing: dir
.into_iter()
.filter_map(|entry| Some(entry.file_name().to_str()?.to_string()))
.collect(),
});
};
}
debug!("Module name: `{}`", module_name);
Ok(module_root)
Ok(module_relative)
}
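As a concrete trace of the dotted-name handling above: the module name splits on `.` into a relative path, a `-stubs` suffix on the root segment switches the expected init file to `__init__.pyi`, and every parent directory must remain a bare namespace directory. A simplified, hypothetical sketch of just the name-to-path mapping (no identifier or filesystem checks):

```rust
use std::path::PathBuf;

// Simplified stand-in for the path mapping in `module_path_from_module_name`
// above; it skips the `Identifier`, `__init__` and namespace-directory checks.
fn module_layout(module_name: &str) -> (PathBuf, &'static str) {
    let module_relative: PathBuf = module_name.split('.').collect();
    let root = module_name.split('.').next().unwrap_or(module_name);
    let init = if root.ends_with("-stubs") {
        "__init__.pyi"
    } else {
        "__init__.py"
    };
    (module_relative, init)
}

fn main() {
    // Mirrors the `stubs_namespace` test below: `cloud-stubs.db.schema` maps to
    // `cloud-stubs/db/schema/__init__.pyi` (path separator depends on platform).
    let (path, init) = module_layout("cloud-stubs.db.schema");
    println!("{}", path.join(init).display());
}
```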
#[cfg(test)]
@ -845,7 +869,7 @@ mod tests {
.replace('\\', "/");
assert_snapshot!(
err_message,
@"Missing module directory for `camel_case` in `[TEMP_PATH]/src`. Found: `camelCase`"
@"Expected a Python module at: `[TEMP_PATH]/src/camel_case/__init__.py`"
);
}
@ -872,14 +896,10 @@ mod tests {
let err_message = format_err(&build_err);
assert_snapshot!(
err_message,
@r#"
Invalid pyproject.toml
Caused by: TOML parse error at line 10, column 15
|
10 | module-name = "django@home-stubs"
| ^^^^^^^^^^^^^^^^^^^
Invalid character `@` at position 7 for identifier `django@home`, expected an underscore or an alphanumeric character
"#
@r"
Invalid module name: django@home-stubs
Caused by: Invalid character `@` at position 7 for identifier `django@home`, expected an underscore or an alphanumeric character
"
);
}
@ -914,7 +934,7 @@ mod tests {
.replace('\\', "/");
assert_snapshot!(
err_message,
@"Expected a Python module directory at: `[TEMP_PATH]/src/stuffed_bird-stubs/__init__.pyi`"
@"Expected a Python module at: `[TEMP_PATH]/src/stuffed_bird-stubs/__init__.pyi`"
);
// Create the correct file
@ -956,4 +976,237 @@ mod tests {
let build2 = build(src.path(), dist.path()).unwrap();
assert_eq!(build1.wheel_contents, build2.wheel_contents);
}
/// A simple namespace package with a single root `__init__.py`.
#[test]
fn simple_namespace_package() {
let src = TempDir::new().unwrap();
let pyproject_toml = indoc! {r#"
[project]
name = "simple-namespace-part"
version = "1.0.0"
[tool.uv.build-backend]
module-name = "simple_namespace.part"
[build-system]
requires = ["uv_build>=0.5.15,<0.6"]
build-backend = "uv_build"
"#
};
fs_err::write(src.path().join("pyproject.toml"), pyproject_toml).unwrap();
fs_err::create_dir_all(src.path().join("src").join("simple_namespace").join("part"))
.unwrap();
let dist = TempDir::new().unwrap();
let build_err = build(src.path(), dist.path()).unwrap_err();
let err_message = format_err(&build_err)
.replace(&src.path().user_display().to_string(), "[TEMP_PATH]")
.replace('\\', "/");
assert_snapshot!(
err_message,
@"Expected a Python module at: `[TEMP_PATH]/src/simple_namespace/part/__init__.py`"
);
// Create the correct file
File::create(
src.path()
.join("src")
.join("simple_namespace")
.join("part")
.join("__init__.py"),
)
.unwrap();
// For a namespace package, there must not be an `__init__.py` here.
let bogus_init_py = src
.path()
.join("src")
.join("simple_namespace")
.join("__init__.py");
File::create(&bogus_init_py).unwrap();
let build_err = build(src.path(), dist.path()).unwrap_err();
let err_message = format_err(&build_err)
.replace(&src.path().user_display().to_string(), "[TEMP_PATH]")
.replace('\\', "/");
assert_snapshot!(
err_message,
@"For namespace packages, `__init__.py[i]` is not allowed in parent directory: `[TEMP_PATH]/src/simple_namespace`"
);
fs_err::remove_file(bogus_init_py).unwrap();
let build1 = build(src.path(), dist.path()).unwrap();
assert_snapshot!(build1.source_dist_contents.join("\n"), @r"
simple_namespace_part-1.0.0/
simple_namespace_part-1.0.0/PKG-INFO
simple_namespace_part-1.0.0/pyproject.toml
simple_namespace_part-1.0.0/src
simple_namespace_part-1.0.0/src/simple_namespace
simple_namespace_part-1.0.0/src/simple_namespace/part
simple_namespace_part-1.0.0/src/simple_namespace/part/__init__.py
");
assert_snapshot!(build1.wheel_contents.join("\n"), @r"
simple_namespace/
simple_namespace/part/
simple_namespace/part/__init__.py
simple_namespace_part-1.0.0.dist-info/
simple_namespace_part-1.0.0.dist-info/METADATA
simple_namespace_part-1.0.0.dist-info/RECORD
simple_namespace_part-1.0.0.dist-info/WHEEL
");
// Check that `namespace = true` works too.
let pyproject_toml = indoc! {r#"
[project]
name = "simple-namespace-part"
version = "1.0.0"
[tool.uv.build-backend]
module-name = "simple_namespace.part"
namespace = true
[build-system]
requires = ["uv_build>=0.5.15,<0.6"]
build-backend = "uv_build"
"#
};
fs_err::write(src.path().join("pyproject.toml"), pyproject_toml).unwrap();
let build2 = build(src.path(), dist.path()).unwrap();
assert_eq!(build1, build2);
}
/// A complex namespace package with multiple root `__init__.py` files.
#[test]
fn complex_namespace_package() {
let src = TempDir::new().unwrap();
let pyproject_toml = indoc! {r#"
[project]
name = "complex-namespace"
version = "1.0.0"
[tool.uv.build-backend]
namespace = true
[build-system]
requires = ["uv_build>=0.5.15,<0.6"]
build-backend = "uv_build"
"#
};
fs_err::write(src.path().join("pyproject.toml"), pyproject_toml).unwrap();
fs_err::create_dir_all(
src.path()
.join("src")
.join("complex_namespace")
.join("part_a"),
)
.unwrap();
File::create(
src.path()
.join("src")
.join("complex_namespace")
.join("part_a")
.join("__init__.py"),
)
.unwrap();
fs_err::create_dir_all(
src.path()
.join("src")
.join("complex_namespace")
.join("part_b"),
)
.unwrap();
File::create(
src.path()
.join("src")
.join("complex_namespace")
.join("part_b")
.join("__init__.py"),
)
.unwrap();
let dist = TempDir::new().unwrap();
let build1 = build(src.path(), dist.path()).unwrap();
assert_snapshot!(build1.wheel_contents.join("\n"), @r"
complex_namespace-1.0.0.dist-info/
complex_namespace-1.0.0.dist-info/METADATA
complex_namespace-1.0.0.dist-info/RECORD
complex_namespace-1.0.0.dist-info/WHEEL
complex_namespace/
complex_namespace/part_a/
complex_namespace/part_a/__init__.py
complex_namespace/part_b/
complex_namespace/part_b/__init__.py
");
// Check that setting the name manually works equally.
let pyproject_toml = indoc! {r#"
[project]
name = "complex-namespace"
version = "1.0.0"
[tool.uv.build-backend]
module-name = "complex_namespace"
namespace = true
[build-system]
requires = ["uv_build>=0.5.15,<0.6"]
build-backend = "uv_build"
"#
};
fs_err::write(src.path().join("pyproject.toml"), pyproject_toml).unwrap();
let build2 = build(src.path(), dist.path()).unwrap();
assert_eq!(build1, build2);
}
/// Stubs for a namespace package.
#[test]
fn stubs_namespace() {
let src = TempDir::new().unwrap();
let pyproject_toml = indoc! {r#"
[project]
name = "cloud.db.schema-stubs"
version = "1.0.0"
[tool.uv.build-backend]
module-name = "cloud-stubs.db.schema"
[build-system]
requires = ["uv_build>=0.5.15,<0.6"]
build-backend = "uv_build"
"#
};
fs_err::write(src.path().join("pyproject.toml"), pyproject_toml).unwrap();
fs_err::create_dir_all(
src.path()
.join("src")
.join("cloud-stubs")
.join("db")
.join("schema"),
)
.unwrap();
File::create(
src.path()
.join("src")
.join("cloud-stubs")
.join("db")
.join("schema")
.join("__init__.pyi"),
)
.unwrap();
let dist = TempDir::new().unwrap();
let build = build(src.path(), dist.path()).unwrap();
assert_snapshot!(build.wheel_contents.join("\n"), @r"
cloud-stubs/
cloud-stubs/db/
cloud-stubs/db/schema/
cloud-stubs/db/schema/__init__.pyi
cloud_db_schema_stubs-1.0.0.dist-info/
cloud_db_schema_stubs-1.0.0.dist-info/METADATA
cloud_db_schema_stubs-1.0.0.dist-info/RECORD
cloud_db_schema_stubs-1.0.0.dist-info/WHEEL
");
}
}

View file

@ -1,16 +1,9 @@
use serde::{Deserialize, Deserializer, Serialize, Serializer};
use std::fmt::Display;
use serde::{Deserialize, Serialize};
use std::path::PathBuf;
use std::str::FromStr;
use uv_macros::OptionsMetadata;
use uv_pypi_types::Identifier;
/// Settings for the uv build backend (`uv_build`).
///
/// !!! note
///
/// The uv build backend is currently in preview and may change in any future release.
///
/// Note that those settings only apply when using the `uv_build` backend; other build backends
/// (such as hatchling) have their own configuration.
///
@ -38,6 +31,9 @@ pub struct BuildBackendSettings {
/// `__init__.py`. An exception is stubs packages, whose names end with `-stubs`, with the stem
/// being the module name, and which contain an `__init__.pyi` file.
///
/// For namespace packages with a single module, the path can be dotted, e.g., `foo.bar` or
/// `foo-stubs.bar`.
///
/// Note that using this option runs the risk of creating two packages with different names but
/// the same module names. Installing such packages together leads to unspecified behavior,
/// often with corrupted files or directory trees.
@ -46,7 +42,7 @@ pub struct BuildBackendSettings {
value_type = "str",
example = r#"module-name = "sklearn""#
)]
pub module_name: Option<ModuleName>,
pub module_name: Option<String>,
/// Glob expressions for files and directories to additionally include in the source
/// distribution.
@ -85,6 +81,56 @@ pub struct BuildBackendSettings {
)]
pub wheel_exclude: Vec<String>,
/// Build a namespace package.
///
/// Build a PEP 420 implicit namespace package, allowing more than one root `__init__.py`.
///
/// Use this option when the namespace package contains multiple root `__init__.py` files; for
/// namespace packages with a single root `__init__.py`, use a dotted `module-name` instead.
///
/// To compare dotted `module-name` and `namespace = true`: the first example below can be
/// expressed with `module-name = "cloud.database"`, since there is a single root `__init__.py`
/// under `database`. In the second example, there are three roots (`cloud.database`,
/// `cloud.database_pro`, `billing.modules.database_pro`), so `namespace = true` is required.
///
/// ```text
/// src
/// └── cloud
/// └── database
/// ├── __init__.py
/// ├── query_builder
/// │ └── __init__.py
/// └── sql
/// ├── parser.py
/// └── __init__.py
/// ```
///
/// ```text
/// src
/// ├── cloud
/// │ ├── database
/// │ │ ├── __init__.py
/// │ │ ├── query_builder
/// │ │ │ └── __init__.py
/// │ │ └── sql
/// │ │ ├── __init__.py
/// │ │ └── parser.py
/// │ └── database_pro
/// │ ├── __init__.py
/// │ └── query_builder.py
/// └── billing
/// └── modules
/// └── database_pro
/// ├── __init__.py
/// └── sql.py
/// ```
#[option(
default = r#"false"#,
value_type = "bool",
example = r#"namespace = true"#
)]
pub namespace: bool,
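Putting the two layouts above into configuration terms, a rough sketch using the `toml` crate (the fragments mirror the doc comment; this only parses the values and does not validate the real settings struct):

```rust
// Illustrative only: the first fragment matches the single-root layout above
// (dotted `module-name`), the second the multi-root layout (`namespace = true`).
fn main() -> Result<(), toml::de::Error> {
    let single_root: toml::Value = toml::from_str(
        r#"
        [tool.uv.build-backend]
        module-name = "cloud.database"
        "#,
    )?;
    let multi_root: toml::Value = toml::from_str(
        r#"
        [tool.uv.build-backend]
        namespace = true
        "#,
    )?;
    println!("{single_root}\n{multi_root}");
    Ok(())
}
```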
/// Data includes for wheels.
///
/// Each entry is a directory, whose contents are copied to the matching directory in the wheel
@ -129,88 +175,12 @@ impl Default for BuildBackendSettings {
default_excludes: true,
source_exclude: Vec::new(),
wheel_exclude: Vec::new(),
namespace: false,
data: WheelDataIncludes::default(),
}
}
}
/// Packages come in two kinds: Regular packages, where the name must be a valid Python identifier,
/// and stubs packages.
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum ModuleName {
/// A Python module name, which needs to be a valid Python identifier to be used with `import`.
Identifier(Identifier),
/// A type stubs package, whose name ends with `-stubs` with the stem being the module name.
Stubs(String),
}
impl Display for ModuleName {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
ModuleName::Identifier(module_name) => Display::fmt(module_name, f),
ModuleName::Stubs(module_name) => Display::fmt(module_name, f),
}
}
}
impl<'de> Deserialize<'de> for ModuleName {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: Deserializer<'de>,
{
let module_name = String::deserialize(deserializer)?;
if let Some(stem) = module_name.strip_suffix("-stubs") {
// Check that the stubs belong to a valid module.
Identifier::from_str(stem)
.map(ModuleName::Identifier)
.map_err(serde::de::Error::custom)?;
Ok(ModuleName::Stubs(module_name))
} else {
Identifier::from_str(&module_name)
.map(ModuleName::Identifier)
.map_err(serde::de::Error::custom)
}
}
}
impl Serialize for ModuleName {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
match self {
ModuleName::Identifier(module_name) => module_name.serialize(serializer),
ModuleName::Stubs(module_name) => module_name.serialize(serializer),
}
}
}
#[cfg(feature = "schemars")]
impl schemars::JsonSchema for ModuleName {
fn schema_name() -> String {
"ModuleName".to_string()
}
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema {
schemars::schema::SchemaObject {
instance_type: Some(schemars::schema::InstanceType::String.into()),
string: Some(Box::new(schemars::schema::StringValidation {
// Best-effort Unicode support (https://stackoverflow.com/a/68844380/3549270)
pattern: Some(r"^[_\p{Alphabetic}][_0-9\p{Alphabetic}]*(-stubs)?$".to_string()),
..schemars::schema::StringValidation::default()
})),
metadata: Some(Box::new(schemars::schema::Metadata {
description: Some(
"The name of the module, or the name of a stubs package".to_string(),
),
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
}
}
/// Data includes for wheels.
///
/// See `BuildBackendSettings::data`.

View file

@ -68,16 +68,18 @@ fn source_dist_matcher(
includes.push(globset::escape("pyproject.toml"));
// Check that the source tree contains a module.
let (_, module_root) = find_roots(
let (src_root, module_relative) = find_roots(
source_tree,
pyproject_toml,
&settings.module_root,
settings.module_name.as_ref(),
settings.module_name.as_deref(),
settings.namespace,
)?;
// The wheel must not include any files included by the source distribution (at least until we
// have files generated in the source dist -> wheel build step).
let import_path = uv_fs::normalize_path(
&uv_fs::relative_to(module_root, source_tree).expect("module root is inside source tree"),
&uv_fs::relative_to(src_root.join(module_relative), source_tree)
.expect("module root is inside source tree"),
)
.portable_display()
.to_string();
@ -248,32 +250,16 @@ fn write_source_dist(
.expect("walkdir starts with root");
if !include_matcher.match_path(relative) || exclude_matcher.is_match(relative) {
trace!("Excluding: `{}`", relative.user_display());
trace!("Excluding from sdist: `{}`", relative.user_display());
continue;
}
debug!("Including {}", relative.user_display());
if entry.file_type().is_dir() {
writer.write_directory(
&Path::new(&top_level)
.join(relative)
.portable_display()
.to_string(),
)?;
} else if entry.file_type().is_file() {
writer.write_file(
&Path::new(&top_level)
.join(relative)
.portable_display()
.to_string(),
entry.path(),
)?;
} else {
return Err(Error::UnsupportedFileType(
relative.to_path_buf(),
entry.file_type(),
));
}
let entry_path = Path::new(&top_level)
.join(relative)
.portable_display()
.to_string();
debug!("Adding to sdist: {}", relative.user_display());
writer.write_dir_entry(&entry, &entry_path)?;
}
debug!("Visited {files_visited} files for source dist build");

View file

@ -17,8 +17,7 @@ use uv_warnings::warn_user_once;
use crate::metadata::DEFAULT_EXCLUDES;
use crate::{
BuildBackendSettings, DirectoryWriter, Error, FileList, ListWriter, PyProjectToml,
find_module_root, find_roots,
BuildBackendSettings, DirectoryWriter, Error, FileList, ListWriter, PyProjectToml, find_roots,
};
/// Build a wheel from the source tree and place it in the output directory.
@ -124,15 +123,24 @@ fn write_wheel(
let exclude_matcher = build_exclude_matcher(excludes)?;
debug!("Adding content files to wheel");
let (src_root, module_root) = find_roots(
let (src_root, module_relative) = find_roots(
source_tree,
pyproject_toml,
&settings.module_root,
settings.module_name.as_ref(),
settings.module_name.as_deref(),
settings.namespace,
)?;
// For convenience, have directories for the whole tree in the wheel
for ancestor in module_relative.ancestors().skip(1) {
if ancestor == Path::new("") {
continue;
}
wheel_writer.write_directory(&ancestor.portable_display().to_string())?;
}
let mut files_visited = 0;
for entry in WalkDir::new(module_root)
for entry in WalkDir::new(src_root.join(module_relative))
.sort_by_file_name()
.into_iter()
.filter_entry(|entry| !exclude_matcher.is_match(entry.path()))
@ -156,7 +164,7 @@ fn write_wheel(
.path()
.strip_prefix(source_tree)
.expect("walkdir starts with root");
let wheel_path = entry
let entry_path = entry
.path()
.strip_prefix(&src_root)
.expect("walkdir starts with root");
@ -164,21 +172,10 @@ fn write_wheel(
trace!("Excluding from module: `{}`", match_path.user_display());
continue;
}
let wheel_path = wheel_path.portable_display().to_string();
debug!("Adding to wheel: `{wheel_path}`");
if entry.file_type().is_dir() {
wheel_writer.write_directory(&wheel_path)?;
} else if entry.file_type().is_file() {
wheel_writer.write_file(&wheel_path, entry.path())?;
} else {
// TODO(konsti): We may want to support symlinks, there is support for installing them.
return Err(Error::UnsupportedFileType(
entry.path().to_path_buf(),
entry.file_type(),
));
}
let entry_path = entry_path.portable_display().to_string();
debug!("Adding to wheel: {entry_path}");
wheel_writer.write_dir_entry(&entry, &entry_path)?;
}
debug!("Visited {files_visited} files for wheel build");
@ -267,16 +264,13 @@ pub fn build_editable(
let mut wheel_writer = ZipDirectoryWriter::new_wheel(File::create(&wheel_path)?);
debug!("Adding pth file to {}", wheel_path.user_display());
let src_root = source_tree.join(&settings.module_root);
if !src_root.starts_with(source_tree) {
return Err(Error::InvalidModuleRoot(settings.module_root.clone()));
}
// Check that a module root exists in the directory we're linking from the `.pth` file
find_module_root(
&src_root,
settings.module_name.as_ref(),
pyproject_toml.name(),
let (src_root, _module_relative) = find_roots(
source_tree,
&pyproject_toml,
&settings.module_root,
settings.module_name.as_deref(),
settings.namespace,
)?;
wheel_writer.write_bytes(
@ -514,23 +508,12 @@ fn wheel_subdir_from_globs(
continue;
}
let relative_licenses = Path::new(target)
let license_path = Path::new(target)
.join(relative)
.portable_display()
.to_string();
if entry.file_type().is_dir() {
wheel_writer.write_directory(&relative_licenses)?;
} else if entry.file_type().is_file() {
debug!("Adding {} file: `{}`", globs_field, relative.user_display());
wheel_writer.write_file(&relative_licenses, entry.path())?;
} else {
// TODO(konsti): We may want to support symlinks, there is support for installing them.
return Err(Error::UnsupportedFileType(
entry.path().to_path_buf(),
entry.file_type(),
));
}
debug!("Adding for {}: `{}`", globs_field, relative.user_display());
wheel_writer.write_dir_entry(&entry, &license_path)?;
}
Ok(())
}

View file

@ -17,6 +17,7 @@ doctest = false
workspace = true
[dependencies]
uv-cache-key = { workspace = true }
uv-configuration = { workspace = true }
uv-distribution = { workspace = true }
uv-distribution-types = { workspace = true }

View file

@ -25,11 +25,14 @@ use tempfile::TempDir;
use tokio::io::AsyncBufReadExt;
use tokio::process::Command;
use tokio::sync::{Mutex, Semaphore};
use tracing::{Instrument, debug, info_span, instrument};
use tracing::{Instrument, debug, info_span, instrument, warn};
use uv_cache_key::cache_digest;
use uv_configuration::PreviewMode;
use uv_configuration::{BuildKind, BuildOutput, ConfigSettings, SourceStrategy};
use uv_distribution::BuildRequires;
use uv_distribution_types::{IndexLocations, Requirement, Resolution};
use uv_fs::LockedFile;
use uv_fs::{PythonExt, Simplified};
use uv_pep440::Version;
use uv_pep508::PackageName;
@ -200,6 +203,11 @@ impl Pep517Backend {
{import}
"#, backend_path = backend_path_encoded}
}
fn is_setuptools(&self) -> bool {
// either `setuptools.build_meta` or `setuptools.build_meta:__legacy__`
self.backend.split(':').next() == Some("setuptools.build_meta")
}
}
/// Uses an [`Rc`] internally, clone freely.
@ -278,6 +286,7 @@ impl SourceBuild {
mut environment_variables: FxHashMap<OsString, OsString>,
level: BuildOutput,
concurrent_builds: usize,
preview: PreviewMode,
) -> Result<Self, Error> {
let temp_dir = build_context.cache().venv_dir()?;
@ -325,6 +334,8 @@ impl SourceBuild {
false,
false,
false,
false,
preview,
)?
};
@ -430,6 +441,31 @@ impl SourceBuild {
})
}
/// Acquire a lock on the source tree, if necessary.
async fn acquire_lock(&self) -> Result<Option<LockedFile>, Error> {
// Depending on the command, setuptools puts `*.egg-info`, `build/`, and `dist/` in the
// source tree, and concurrent invocations of setuptools using the same source dir can
// stomp on each other. We need to lock something to fix that, but we don't want to dump a
// `.lock` file into the source tree that the user will need to .gitignore. Take a global
// proxy lock instead.
let mut source_tree_lock = None;
if self.pep517_backend.is_setuptools() {
debug!("Locking the source tree for setuptools");
let canonical_source_path = self.source_tree.canonicalize()?;
let lock_path = env::temp_dir().join(format!(
"uv-setuptools-{}.lock",
cache_digest(&canonical_source_path)
));
source_tree_lock = LockedFile::acquire(lock_path, self.source_tree.to_string_lossy())
.await
.inspect_err(|err| {
warn!("Failed to acquire build lock: {err}");
})
.ok();
}
Ok(source_tree_lock)
}
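The lock-path scheme in `acquire_lock` above keys a file in the system temp directory on a digest of the canonicalized source tree, so nothing is written into the tree itself. A simplified sketch, with std's `DefaultHasher` standing in for uv's `cache_digest` and a hypothetical helper name:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};
use std::path::{Path, PathBuf};

// Derive a per-source-tree lock path in the temp directory; `DefaultHasher`
// replaces uv's `cache_digest` for illustration only.
fn setuptools_lock_path(source_tree: &Path) -> std::io::Result<PathBuf> {
    let canonical = source_tree.canonicalize()?;
    let mut hasher = DefaultHasher::new();
    canonical.hash(&mut hasher);
    Ok(std::env::temp_dir().join(format!("uv-setuptools-{:016x}.lock", hasher.finish())))
}

fn main() -> std::io::Result<()> {
    println!("{}", setuptools_lock_path(Path::new("."))?.display());
    Ok(())
}
```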
async fn get_resolved_requirements(
build_context: &impl BuildContext,
source_build_context: SourceBuildContext,
@ -600,6 +636,9 @@ impl SourceBuild {
return Ok(Some(metadata_dir.clone()));
}
// Lock the source tree, if necessary.
let _lock = self.acquire_lock().await?;
// Hatch allows for highly dynamic customization of metadata via hooks. In such cases, Hatch
// can't uphold the PEP 517 contract, in that the metadata Hatch would return by
// `prepare_metadata_for_build_wheel` isn't guaranteed to match that of the built wheel.
@ -712,16 +751,15 @@ impl SourceBuild {
pub async fn build(&self, wheel_dir: &Path) -> Result<String, Error> {
// The build scripts run with the extracted root as cwd, so they need the absolute path.
let wheel_dir = std::path::absolute(wheel_dir)?;
let filename = self.pep517_build(&wheel_dir, &self.pep517_backend).await?;
let filename = self.pep517_build(&wheel_dir).await?;
Ok(filename)
}
/// Perform a PEP 517 build for a wheel or source distribution (sdist).
async fn pep517_build(
&self,
output_dir: &Path,
pep517_backend: &Pep517Backend,
) -> Result<String, Error> {
async fn pep517_build(&self, output_dir: &Path) -> Result<String, Error> {
// Lock the source tree, if necessary.
let _lock = self.acquire_lock().await?;
// Write the hook output to a file so that we can read it back reliably.
let outfile = self
.temp_dir
@ -733,7 +771,7 @@ impl SourceBuild {
BuildKind::Sdist => {
debug!(
r#"Calling `{}.build_{}("{}", {})`"#,
pep517_backend.backend,
self.pep517_backend.backend,
self.build_kind,
output_dir.escape_for_python(),
self.config_settings.escape_for_python(),
@ -746,7 +784,7 @@ impl SourceBuild {
with open("{}", "w") as fp:
fp.write(sdist_filename)
"#,
pep517_backend.backend_import(),
self.pep517_backend.backend_import(),
self.build_kind,
output_dir.escape_for_python(),
self.config_settings.escape_for_python(),
@ -762,7 +800,7 @@ impl SourceBuild {
});
debug!(
r#"Calling `{}.build_{}("{}", {}, {})`"#,
pep517_backend.backend,
self.pep517_backend.backend,
self.build_kind,
output_dir.escape_for_python(),
self.config_settings.escape_for_python(),
@ -776,7 +814,7 @@ impl SourceBuild {
with open("{}", "w") as fp:
fp.write(wheel_filename)
"#,
pep517_backend.backend_import(),
self.pep517_backend.backend_import(),
self.build_kind,
output_dir.escape_for_python(),
self.config_settings.escape_for_python(),
@ -806,7 +844,7 @@ impl SourceBuild {
return Err(Error::from_command_output(
format!(
"Call to `{}.build_{}` failed",
pep517_backend.backend, self.build_kind
self.pep517_backend.backend, self.build_kind
),
&output,
self.level,
@ -821,7 +859,7 @@ impl SourceBuild {
return Err(Error::from_command_output(
format!(
"Call to `{}.build_{}` failed",
pep517_backend.backend, self.build_kind
self.pep517_backend.backend, self.build_kind
),
&output,
self.level,

View file

@ -1,6 +1,6 @@
[package]
name = "uv-build"
version = "0.7.10"
version = "0.7.19"
edition.workspace = true
rust-version.workspace = true
homepage.workspace = true

View file

@ -1,6 +1,6 @@
[project]
name = "uv-build"
version = "0.7.10"
version = "0.7.19"
description = "The uv build backend"
authors = [{ name = "Astral Software Inc.", email = "hey@astral.sh" }]
requires-python = ">=3.8"

View file

@ -0,0 +1,2 @@
# It is important to retain compatibility with old versions in the build backend
target-version = "py37"

View file

@ -13,7 +13,6 @@ pub trait CompatArgs {
/// For example, users often pass `--allow-unsafe`, which is unnecessary with uv. But it's a
/// nice user experience to warn, rather than fail, when users pass `--allow-unsafe`.
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PipCompileCompatArgs {
#[clap(long, hide = true)]
allow_unsafe: bool,
@ -159,7 +158,6 @@ impl CompatArgs for PipCompileCompatArgs {
///
/// These represent a subset of the `pip list` interface that uv supports by default.
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PipListCompatArgs {
#[clap(long, hide = true)]
disable_pip_version_check: bool,
@ -184,7 +182,6 @@ impl CompatArgs for PipListCompatArgs {
///
/// These represent a subset of the `pip-sync` interface that uv supports by default.
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PipSyncCompatArgs {
#[clap(short, long, hide = true)]
ask: bool,
@ -268,7 +265,6 @@ enum Resolver {
///
/// These represent a subset of the `virtualenv` interface that uv supports by default.
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct VenvCompatArgs {
#[clap(long, hide = true)]
clear: bool,
@ -327,7 +323,6 @@ impl CompatArgs for VenvCompatArgs {
///
/// These represent a subset of the `pip install` interface that uv supports by default.
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PipInstallCompatArgs {
#[clap(long, hide = true)]
disable_pip_version_check: bool,
@ -361,7 +356,6 @@ impl CompatArgs for PipInstallCompatArgs {
///
/// These represent a subset of the `pip` interface that exists on all commands.
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PipGlobalCompatArgs {
#[clap(long, hide = true)]
disable_pip_version_check: bool,

View file

@ -85,7 +85,6 @@ const STYLES: Styles = Styles::styled()
disable_version_flag = true
)]
#[command(styles=STYLES)]
#[allow(clippy::struct_excessive_bools)]
pub struct Cli {
#[command(subcommand)]
pub command: Box<Commands>,
@ -133,7 +132,6 @@ pub struct TopLevelArgs {
#[derive(Parser, Debug, Clone)]
#[command(next_help_heading = "Global options", next_display_order = 1000)]
#[allow(clippy::struct_excessive_bools)]
pub struct GlobalArgs {
#[arg(
global = true,
@ -526,7 +524,6 @@ pub struct HelpArgs {
#[derive(Args)]
#[command(group = clap::ArgGroup::new("operation"))]
#[allow(clippy::struct_excessive_bools)]
pub struct VersionArgs {
/// Set the project version to this value
///
@ -657,7 +654,6 @@ pub struct SelfUpdateArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct CacheNamespace {
#[command(subcommand)]
pub command: CacheCommand,
@ -687,14 +683,12 @@ pub enum CacheCommand {
}
#[derive(Args, Debug)]
#[allow(clippy::struct_excessive_bools)]
pub struct CleanArgs {
/// The packages to remove from the cache.
pub package: Vec<PackageName>,
}
#[derive(Args, Debug)]
#[allow(clippy::struct_excessive_bools)]
pub struct PruneArgs {
/// Optimize the cache for persistence in a continuous integration environment, like GitHub
/// Actions.
@ -714,7 +708,6 @@ pub struct PruneArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PipNamespace {
#[command(subcommand)]
pub command: PipCommand,
@ -1095,7 +1088,6 @@ fn parse_maybe_string(input: &str) -> Result<Maybe<String>, String> {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
#[command(group = clap::ArgGroup::new("sources").required(true).multiple(true))]
pub struct PipCompileArgs {
/// Include all packages listed in the given `requirements.in` files.
@ -1443,7 +1435,6 @@ pub struct PipCompileArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PipSyncArgs {
/// Include all packages listed in the given `requirements.txt` files.
///
@ -1491,7 +1482,7 @@ pub struct PipSyncArgs {
/// Hash-checking mode introduces a number of additional constraints:
///
/// - Git dependencies are not supported.
/// - Editable installs are not supported.
/// - Editable installations are not supported.
/// - Local dependencies are not supported, unless they point to a specific wheel (`.whl`) or
/// source archive (`.zip`, `.tar.gz`), as opposed to a directory.
#[arg(
@ -1700,7 +1691,6 @@ pub struct PipSyncArgs {
#[derive(Args)]
#[command(group = clap::ArgGroup::new("sources").required(true).multiple(true))]
#[allow(clippy::struct_excessive_bools)]
pub struct PipInstallArgs {
/// Install all listed packages.
///
@ -1801,7 +1791,7 @@ pub struct PipInstallArgs {
/// Hash-checking mode introduces a number of additional constraints:
///
/// - Git dependencies are not supported.
/// - Editable installs are not supported.
/// - Editable installations are not supported.
/// - Local dependencies are not supported, unless they point to a specific wheel (`.whl`) or
/// source archive (`.zip`, `.tar.gz`), as opposed to a directory.
#[arg(
@ -2015,7 +2005,6 @@ pub struct PipInstallArgs {
#[derive(Args)]
#[command(group = clap::ArgGroup::new("sources").required(true).multiple(true))]
#[allow(clippy::struct_excessive_bools)]
pub struct PipUninstallArgs {
/// Uninstall all listed packages.
#[arg(group = "sources")]
@ -2104,7 +2093,6 @@ pub struct PipUninstallArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PipFreezeArgs {
/// Exclude any editable packages from output.
#[arg(long)]
@ -2159,7 +2147,6 @@ pub struct PipFreezeArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PipListArgs {
/// Only include editable projects.
#[arg(short, long)]
@ -2235,7 +2222,6 @@ pub struct PipListArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PipCheckArgs {
/// The Python interpreter for which packages should be checked.
///
@ -2271,7 +2257,6 @@ pub struct PipCheckArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PipShowArgs {
/// The package(s) to display.
pub package: Vec<PackageName>,
@ -2325,7 +2310,6 @@ pub struct PipShowArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PipTreeArgs {
/// Show the version constraint(s) imposed on each package.
#[arg(long)]
@ -2382,7 +2366,6 @@ pub struct PipTreeArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct BuildArgs {
/// The directory from which distributions should be built, or a source
/// distribution archive to build into a wheel.
@ -2471,7 +2454,7 @@ pub struct BuildArgs {
/// Hash-checking mode introduces a number of additional constraints:
///
/// - Git dependencies are not supported.
/// - Editable installs are not supported.
/// - Editable installations are not supported.
/// - Local dependencies are not supported, unless they point to a specific wheel (`.whl`) or
/// source archive (`.zip`, `.tar.gz`), as opposed to a directory.
#[arg(
@ -2529,7 +2512,6 @@ pub struct BuildArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct VenvArgs {
/// The Python interpreter to use for the virtual environment.
///
@ -2725,7 +2707,6 @@ pub enum AuthorFrom {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct InitArgs {
/// The path to use for the project/script.
///
@ -2883,7 +2864,6 @@ pub struct InitArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct RunArgs {
/// Include optional dependencies from the specified extra name.
///
@ -3170,7 +3150,6 @@ pub struct RunArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct SyncArgs {
/// Include optional dependencies from the specified extra name.
///
@ -3427,7 +3406,6 @@ pub struct SyncArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct LockArgs {
/// Check if the lockfile is up-to-date.
///
@ -3489,7 +3467,6 @@ pub struct LockArgs {
#[derive(Args)]
#[command(group = clap::ArgGroup::new("sources").required(true).multiple(true))]
#[allow(clippy::struct_excessive_bools)]
pub struct AddArgs {
/// The packages to add, as PEP 508 requirements (e.g., `ruff==0.5.0`).
#[arg(group = "sources")]
@ -3516,7 +3493,12 @@ pub struct AddArgs {
/// Add the requirements to the development dependency group.
///
/// This option is an alias for `--group dev`.
#[arg(long, conflicts_with("optional"), conflicts_with("group"))]
#[arg(
long,
conflicts_with("optional"),
conflicts_with("group"),
conflicts_with("script")
)]
pub dev: bool,
/// Add the requirements to the package's optional dependencies for the specified extra.
@ -3530,7 +3512,12 @@ pub struct AddArgs {
/// Add the requirements to the specified dependency group.
///
/// These requirements will not be included in the published metadata for the project.
#[arg(long, conflicts_with("dev"), conflicts_with("optional"))]
#[arg(
long,
conflicts_with("dev"),
conflicts_with("optional"),
conflicts_with("script")
)]
pub group: Option<GroupName>,
/// Add the requirements as editable.
@ -3664,7 +3651,6 @@ pub struct AddArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct RemoveArgs {
/// The names of the dependencies to remove (e.g., `ruff`).
#[arg(required = true)]
@ -3677,11 +3663,21 @@ pub struct RemoveArgs {
pub dev: bool,
/// Remove the packages from the project's optional dependencies for the specified extra.
#[arg(long, conflicts_with("dev"), conflicts_with("group"))]
#[arg(
long,
conflicts_with("dev"),
conflicts_with("group"),
conflicts_with("script")
)]
pub optional: Option<ExtraName>,
/// Remove the packages from the specified dependency group.
#[arg(long, conflicts_with("dev"), conflicts_with("optional"))]
#[arg(
long,
conflicts_with("dev"),
conflicts_with("optional"),
conflicts_with("script")
)]
pub group: Option<GroupName>,
/// Avoid syncing the virtual environment after re-locking the project.
@ -3749,7 +3745,6 @@ pub struct RemoveArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct TreeArgs {
/// Show a platform-independent dependency tree.
///
@ -3889,7 +3884,6 @@ pub struct TreeArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct ExportArgs {
/// The format to which `uv.lock` should be exported.
///
@ -4104,7 +4098,6 @@ pub struct ExportArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct ToolNamespace {
#[command(subcommand)]
pub command: ToolCommand,
@ -4197,7 +4190,6 @@ pub enum ToolCommand {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct ToolRunArgs {
/// The command to run.
///
@ -4316,7 +4308,6 @@ pub struct UvxArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct ToolInstallArgs {
/// The package to install commands from.
pub package: String,
@ -4405,7 +4396,6 @@ pub struct ToolInstallArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct ToolListArgs {
/// Whether to display the path to each tool environment and installed executable.
#[arg(long)]
@ -4432,7 +4422,6 @@ pub struct ToolListArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct ToolDirArgs {
/// Show the directory into which `uv tool` will install executables.
///
@ -4451,7 +4440,6 @@ pub struct ToolDirArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct ToolUninstallArgs {
/// The name of the tool to uninstall.
#[arg(required = true)]
@ -4463,7 +4451,6 @@ pub struct ToolUninstallArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct ToolUpgradeArgs {
/// The name of the tool to upgrade, along with an optional version specifier.
#[arg(required = true)]
@ -4693,7 +4680,6 @@ pub struct ToolUpgradeArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PythonNamespace {
#[command(subcommand)]
pub command: PythonCommand,
@ -4728,13 +4714,32 @@ pub enum PythonCommand {
///
/// A `python` executable is not made globally available; managed Python versions are only used
/// in uv commands or in active virtual environments. There is experimental support for adding
/// Python executables to the `PATH` — use the `--preview` flag to enable this behavior.
/// Python executables to a directory on the path — use the `--preview` flag to enable this
/// behavior and `uv python dir --bin` to retrieve the target directory.
///
/// Multiple Python versions may be requested.
///
/// See `uv help python` to view supported request formats.
Install(PythonInstallArgs),
/// Upgrade installed Python versions to the latest supported patch release (requires the
/// `--preview` flag).
///
/// A target Python minor version to upgrade may be provided, e.g., `3.13`. Multiple versions
/// may be provided to perform more than one upgrade.
///
/// If no target version is provided, then uv will upgrade all managed CPython versions.
///
/// During an upgrade, uv will not uninstall outdated patch versions.
///
/// When an upgrade is performed, virtual environments created by uv will automatically
/// use the new version. However, if the virtual environment was created before the
/// upgrade functionality was added, it will continue to use the old Python version; to enable
/// upgrades, the environment must be recreated.
///
/// Upgrades are not yet supported for alternative implementations, like PyPy.
Upgrade(PythonUpgradeArgs),
/// Search for a Python installation.
///
/// Displays the path to the Python executable.
@ -4763,7 +4768,8 @@ pub enum PythonCommand {
/// The Python installation directory may be overridden with `$UV_PYTHON_INSTALL_DIR`.
///
/// To view the directory where uv installs Python executables instead, use the `--bin` flag.
/// Note that Python executables are only installed when preview mode is enabled.
/// The Python executable directory may be overridden with `$UV_PYTHON_BIN_DIR`. Note that
/// Python executables are only installed when preview mode is enabled.
Dir(PythonDirArgs),
/// Uninstall Python versions.
@ -4771,7 +4777,6 @@ pub enum PythonCommand {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PythonListArgs {
/// A Python request to filter by.
///
@ -4826,7 +4831,6 @@ pub struct PythonListArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PythonDirArgs {
/// Show the directory into which `uv python` will install Python executables.
///
@ -4844,7 +4848,6 @@ pub struct PythonDirArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PythonInstallArgs {
/// The directory to store the Python installation in.
///
@ -4923,7 +4926,50 @@ pub struct PythonInstallArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PythonUpgradeArgs {
/// The directory Python installations are stored in.
///
/// If provided, `UV_PYTHON_INSTALL_DIR` will need to be set for subsequent operations for uv to
/// discover the Python installation.
///
/// See `uv python dir` to view the current Python installation directory. Defaults to
/// `~/.local/share/uv/python`.
#[arg(long, short, env = EnvVars::UV_PYTHON_INSTALL_DIR)]
pub install_dir: Option<PathBuf>,
/// The Python minor version(s) to upgrade.
///
/// If no target version is provided, then uv will upgrade all managed CPython versions.
#[arg(env = EnvVars::UV_PYTHON)]
pub targets: Vec<String>,
/// Set the URL to use as the source for downloading Python installations.
///
/// The provided URL will replace
/// `https://github.com/astral-sh/python-build-standalone/releases/download` in, e.g.,
/// `https://github.com/astral-sh/python-build-standalone/releases/download/20240713/cpython-3.12.4%2B20240713-aarch64-apple-darwin-install_only.tar.gz`.
///
/// Distributions can be read from a local directory by using the `file://` URL scheme.
#[arg(long, env = EnvVars::UV_PYTHON_INSTALL_MIRROR)]
pub mirror: Option<String>,
/// Set the URL to use as the source for downloading PyPy installations.
///
/// The provided URL will replace `https://downloads.python.org/pypy` in, e.g.,
/// `https://downloads.python.org/pypy/pypy3.8-v7.3.7-osx64.tar.bz2`.
///
/// Distributions can be read from a local directory by using the `file://` URL scheme.
#[arg(long, env = EnvVars::UV_PYPY_INSTALL_MIRROR)]
pub pypy_mirror: Option<String>,
/// URL pointing to JSON of custom Python installations.
///
/// Note that currently, only local paths are supported.
#[arg(long, env = EnvVars::UV_PYTHON_DOWNLOADS_JSON_URL)]
pub python_downloads_json_url: Option<String>,
}
#[derive(Args)]
pub struct PythonUninstallArgs {
/// The directory where the Python was installed.
#[arg(long, short, env = EnvVars::UV_PYTHON_INSTALL_DIR)]
@ -4941,7 +4987,6 @@ pub struct PythonUninstallArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PythonFindArgs {
/// The Python request.
///
@ -4990,7 +5035,6 @@ pub struct PythonFindArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct PythonPinArgs {
/// The Python version request.
///
@ -5032,10 +5076,13 @@ pub struct PythonPinArgs {
/// directory, this version will be used instead.
#[arg(long)]
pub global: bool,
/// Remove the Python version pin.
#[arg(long, conflicts_with = "request", conflicts_with = "resolved")]
pub rm: bool,
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct GenerateShellCompletionArgs {
/// The shell to generate the completion script for
pub shell: clap_complete_command::Shell,
@ -5074,7 +5121,6 @@ pub struct GenerateShellCompletionArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct IndexArgs {
/// The URLs to use when resolving dependencies, in addition to the default index.
///
@ -5084,6 +5130,9 @@ pub struct IndexArgs {
/// All indexes provided via this flag take priority over the index specified by
/// `--default-index` (which defaults to PyPI). When multiple `--index` flags are provided,
/// earlier values take priority.
///
/// Index names are not supported as values. Relative paths must be disambiguated from index
/// names with `./` or `../` on Unix or `.\\`, `..\\`, `./` or `../` on Windows.
//
// The nested Vec structure (`Vec<Vec<Maybe<Index>>>`) is required for clap's
// value parsing mechanism, which processes one value at a time, in order to handle
@ -5149,7 +5198,6 @@ pub struct IndexArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct RefreshArgs {
/// Refresh all cached data.
#[arg(
@ -5175,7 +5223,6 @@ pub struct RefreshArgs {
}
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct BuildOptionsArgs {
/// Don't build source distributions.
///
@ -5231,7 +5278,6 @@ pub struct BuildOptionsArgs {
/// Arguments that are used by commands that need to install (but not resolve) packages.
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct InstallerArgs {
#[command(flatten)]
pub index_args: IndexArgs,
@ -5373,7 +5419,6 @@ pub struct InstallerArgs {
/// Arguments that are used by commands that need to resolve (but not install) packages.
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct ResolverArgs {
#[command(flatten)]
pub index_args: IndexArgs,
@ -5540,7 +5585,6 @@ pub struct ResolverArgs {
/// Arguments that are used by commands that need to resolve and install packages.
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct ResolverInstallerArgs {
#[command(flatten)]
pub index_args: IndexArgs,
@ -5757,7 +5801,6 @@ pub struct ResolverInstallerArgs {
/// Arguments that are used by commands that need to fetch from the Simple API.
#[derive(Args)]
#[allow(clippy::struct_excessive_bools)]
pub struct FetchArgs {
#[command(flatten)]
pub index_args: IndexArgs,

View file

@ -1,7 +1,10 @@
use anstream::eprintln;
use uv_cache::Refresh;
use uv_configuration::ConfigSettings;
use uv_resolver::PrereleaseMode;
use uv_settings::{Combine, PipOptions, ResolverInstallerOptions, ResolverOptions};
use uv_warnings::owo_colors::OwoColorize;
use crate::{
BuildOptionsArgs, FetchArgs, IndexArgs, InstallerArgs, Maybe, RefreshArgs, ResolverArgs,
@ -9,12 +12,27 @@ use crate::{
};
/// Given a boolean flag pair (like `--upgrade` and `--no-upgrade`), resolve the value of the flag.
pub fn flag(yes: bool, no: bool) -> Option<bool> {
pub fn flag(yes: bool, no: bool, name: &str) -> Option<bool> {
match (yes, no) {
(true, false) => Some(true),
(false, true) => Some(false),
(false, false) => None,
(..) => unreachable!("Clap should make this impossible"),
(..) => {
eprintln!(
"{}{} `{}` and `{}` cannot be used together. \
Boolean flags on different levels are currently not supported \
(https://github.com/clap-rs/clap/issues/6049)",
"error".bold().red(),
":".bold(),
format!("--{name}").green(),
format!("--no-{name}").green(),
);
// No error forwarding since this should eventually be solved on the clap side.
#[allow(clippy::exit)]
{
std::process::exit(2);
}
}
}
}
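The important property of `flag` above is the three-way result: an explicit `--foo` yields `Some(true)`, an explicit `--no-foo` yields `Some(false)`, and neither yields `None`, leaving the option unset. A stripped-down stand-in (without the conflict error path) to show the semantics:

```rust
// Same Some(true)/Some(false)/None semantics as `flag` above, minus the
// conflict reporting; for illustration only.
fn flag_value(yes: bool, no: bool) -> Option<bool> {
    match (yes, no) {
        (true, false) => Some(true),
        (false, true) => Some(false),
        _ => None,
    }
}

fn main() {
    assert_eq!(flag_value(true, false), Some(true)); // --upgrade
    assert_eq!(flag_value(false, true), Some(false)); // --no-upgrade
    assert_eq!(flag_value(false, false), None); // neither flag: option stays unset
}
```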
@ -26,7 +44,7 @@ impl From<RefreshArgs> for Refresh {
refresh_package,
} = value;
Self::from_args(flag(refresh, no_refresh), refresh_package)
Self::from_args(flag(refresh, no_refresh, "no-refresh"), refresh_package)
}
}
@ -53,7 +71,7 @@ impl From<ResolverArgs> for PipOptions {
} = args;
Self {
upgrade: flag(upgrade, no_upgrade),
upgrade: flag(upgrade, no_upgrade, "no-upgrade"),
upgrade_package: Some(upgrade_package),
index_strategy,
keyring_provider,
@ -66,7 +84,7 @@ impl From<ResolverArgs> for PipOptions {
},
config_settings: config_setting
.map(|config_settings| config_settings.into_iter().collect::<ConfigSettings>()),
no_build_isolation: flag(no_build_isolation, build_isolation),
no_build_isolation: flag(no_build_isolation, build_isolation, "build-isolation"),
no_build_isolation_package: Some(no_build_isolation_package),
exclude_newer,
link_mode,
@ -96,16 +114,16 @@ impl From<InstallerArgs> for PipOptions {
} = args;
Self {
reinstall: flag(reinstall, no_reinstall),
reinstall: flag(reinstall, no_reinstall, "reinstall"),
reinstall_package: Some(reinstall_package),
index_strategy,
keyring_provider,
config_settings: config_setting
.map(|config_settings| config_settings.into_iter().collect::<ConfigSettings>()),
no_build_isolation: flag(no_build_isolation, build_isolation),
no_build_isolation: flag(no_build_isolation, build_isolation, "build-isolation"),
exclude_newer,
link_mode,
compile_bytecode: flag(compile_bytecode, no_compile_bytecode),
compile_bytecode: flag(compile_bytecode, no_compile_bytecode, "compile-bytecode"),
no_sources: if no_sources { Some(true) } else { None },
..PipOptions::from(index_args)
}
@ -140,9 +158,9 @@ impl From<ResolverInstallerArgs> for PipOptions {
} = args;
Self {
upgrade: flag(upgrade, no_upgrade),
upgrade: flag(upgrade, no_upgrade, "upgrade"),
upgrade_package: Some(upgrade_package),
reinstall: flag(reinstall, no_reinstall),
reinstall: flag(reinstall, no_reinstall, "reinstall"),
reinstall_package: Some(reinstall_package),
index_strategy,
keyring_provider,
@ -155,11 +173,11 @@ impl From<ResolverInstallerArgs> for PipOptions {
fork_strategy,
config_settings: config_setting
.map(|config_settings| config_settings.into_iter().collect::<ConfigSettings>()),
no_build_isolation: flag(no_build_isolation, build_isolation),
no_build_isolation: flag(no_build_isolation, build_isolation, "build-isolation"),
no_build_isolation_package: Some(no_build_isolation_package),
exclude_newer,
link_mode,
compile_bytecode: flag(compile_bytecode, no_compile_bytecode),
compile_bytecode: flag(compile_bytecode, no_compile_bytecode, "compile-bytecode"),
no_sources: if no_sources { Some(true) } else { None },
..PipOptions::from(index_args)
}
@ -289,7 +307,7 @@ pub fn resolver_options(
.filter_map(Maybe::into_option)
.collect()
}),
upgrade: flag(upgrade, no_upgrade),
upgrade: flag(upgrade, no_upgrade, "no-upgrade"),
upgrade_package: Some(upgrade_package),
index_strategy,
keyring_provider,
@ -303,13 +321,13 @@ pub fn resolver_options(
dependency_metadata: None,
config_settings: config_setting
.map(|config_settings| config_settings.into_iter().collect::<ConfigSettings>()),
no_build_isolation: flag(no_build_isolation, build_isolation),
no_build_isolation: flag(no_build_isolation, build_isolation, "build-isolation"),
no_build_isolation_package: Some(no_build_isolation_package),
exclude_newer,
link_mode,
no_build: flag(no_build, build),
no_build: flag(no_build, build, "build"),
no_build_package: Some(no_build_package),
no_binary: flag(no_binary, binary),
no_binary: flag(no_binary, binary, "binary"),
no_binary_package: Some(no_binary_package),
no_sources: if no_sources { Some(true) } else { None },
}
@ -386,13 +404,13 @@ pub fn resolver_installer_options(
.filter_map(Maybe::into_option)
.collect()
}),
upgrade: flag(upgrade, no_upgrade),
upgrade: flag(upgrade, no_upgrade, "upgrade"),
upgrade_package: if upgrade_package.is_empty() {
None
} else {
Some(upgrade_package)
},
reinstall: flag(reinstall, no_reinstall),
reinstall: flag(reinstall, no_reinstall, "reinstall"),
reinstall_package: if reinstall_package.is_empty() {
None
} else {
@ -410,7 +428,7 @@ pub fn resolver_installer_options(
dependency_metadata: None,
config_settings: config_setting
.map(|config_settings| config_settings.into_iter().collect::<ConfigSettings>()),
no_build_isolation: flag(no_build_isolation, build_isolation),
no_build_isolation: flag(no_build_isolation, build_isolation, "build-isolation"),
no_build_isolation_package: if no_build_isolation_package.is_empty() {
None
} else {
@ -418,14 +436,14 @@ pub fn resolver_installer_options(
},
exclude_newer,
link_mode,
compile_bytecode: flag(compile_bytecode, no_compile_bytecode),
no_build: flag(no_build, build),
compile_bytecode: flag(compile_bytecode, no_compile_bytecode, "compile-bytecode"),
no_build: flag(no_build, build, "build"),
no_build_package: if no_build_package.is_empty() {
None
} else {
Some(no_build_package)
},
no_binary: flag(no_binary, binary),
no_binary: flag(no_binary, binary, "binary"),
no_binary_package: if no_binary_package.is_empty() {
None
} else {

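Throughout this file, the two-argument `flag(yes, no)` helper gains a third string argument naming the flag pair ("upgrade", "build-isolation", "compile-bytecode", ...). The helper itself is not shown in this diff; the sketch below is one plausible reading, assuming the name is only used to report conflicting flags. The signature, return type, and error wording are assumptions, not the crate's actual implementation.

// Hypothetical sketch of a three-argument `flag` helper; not taken from this diff.
fn flag(yes: bool, no: bool, name: &str) -> Option<bool> {
    match (yes, no) {
        (true, false) => Some(true),
        (false, true) => Some(false),
        (false, false) => None,
        // Hypothetical conflict handling: the real helper may report this
        // differently (e.g. as a CLI error rather than a panic).
        (true, true) => panic!("`--{name}` and `--no-{name}` cannot be used together"),
    }
}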

View file

@ -65,3 +65,4 @@ hyper = { version = "1.4.1", features = ["server", "http1"] }
hyper-util = { version = "0.1.8", features = ["tokio"] }
insta = { version = "1.40.0", features = ["filters", "json", "redactions"] }
tokio = { workspace = true }
wiremock = { workspace = true }

View file

@ -6,16 +6,26 @@ use std::sync::Arc;
use std::time::Duration;
use std::{env, io, iter};
use anyhow::anyhow;
use http::{
HeaderMap, HeaderName, HeaderValue, Method, StatusCode,
header::{
AUTHORIZATION, CONTENT_ENCODING, CONTENT_LENGTH, CONTENT_TYPE, COOKIE, LOCATION,
PROXY_AUTHORIZATION, REFERER, TRANSFER_ENCODING, WWW_AUTHENTICATE,
},
};
use itertools::Itertools;
use reqwest::{Client, ClientBuilder, Proxy, Response};
use reqwest::{Client, ClientBuilder, IntoUrl, Proxy, Request, Response, multipart};
use reqwest_middleware::{ClientWithMiddleware, Middleware};
use reqwest_retry::policies::ExponentialBackoff;
use reqwest_retry::{
DefaultRetryableStrategy, RetryTransientMiddleware, Retryable, RetryableStrategy,
};
use tracing::{debug, trace};
use url::ParseError;
use url::Url;
use uv_auth::Credentials;
use uv_auth::{AuthMiddleware, Indexes};
use uv_configuration::{KeyringProviderType, TrustedHost};
use uv_fs::Simplified;
@ -32,6 +42,10 @@ use crate::middleware::OfflineMiddleware;
use crate::tls::read_identity;
pub const DEFAULT_RETRIES: u32 = 3;
/// Maximum number of redirects to follow before giving up.
///
/// This is the default used by [`reqwest`].
const DEFAULT_MAX_REDIRECTS: u32 = 10;
/// Selectively skip parts of the auth middleware, or all of it.
#[derive(Debug, Clone, Copy, Default)]
@ -61,6 +75,31 @@ pub struct BaseClientBuilder<'a> {
default_timeout: Duration,
extra_middleware: Option<ExtraMiddleware>,
proxies: Vec<Proxy>,
redirect_policy: RedirectPolicy,
/// Whether credentials should be propagated during cross-origin redirects.
///
/// A policy allowing propagation is insecure and should only be available for test code.
cross_origin_credential_policy: CrossOriginCredentialsPolicy,
}
/// The policy for handling HTTP redirects.
#[derive(Debug, Default, Clone, Copy)]
pub enum RedirectPolicy {
/// Use reqwest's built-in redirect handling. This bypasses our custom middleware
/// on redirect.
#[default]
BypassMiddleware,
/// Handle redirects manually, re-triggering our custom middleware for each request.
RetriggerMiddleware,
}
impl RedirectPolicy {
pub fn reqwest_policy(self) -> reqwest::redirect::Policy {
match self {
RedirectPolicy::BypassMiddleware => reqwest::redirect::Policy::default(),
RedirectPolicy::RetriggerMiddleware => reqwest::redirect::Policy::none(),
}
}
}
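With `BypassMiddleware`, reqwest follows redirects internally (its default policy, up to 10 hops), so the auth and retry middleware only ever see the original request. `RetriggerMiddleware` maps to `Policy::none()`, so every 3xx response surfaces to `RedirectClientWithMiddleware` below and the follow-up request is re-sent through the full middleware stack. A short sketch of how a caller opts in, assuming the `BaseClientBuilder::new()` constructor whose field defaults appear in the next hunk; the registry client later in this diff does the equivalent.

// Opt into middleware-aware redirect handling (sketch; other builder
// options elided). Mirrors `.redirect(RedirectPolicy::RetriggerMiddleware)`
// in the RegistryClientBuilder change further down.
let client = BaseClientBuilder::new()
    .redirect(RedirectPolicy::RetriggerMiddleware)
    .build();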
/// A list of user-defined middlewares to be applied to the client.
@ -96,6 +135,8 @@ impl BaseClientBuilder<'_> {
default_timeout: Duration::from_secs(30),
extra_middleware: None,
proxies: vec![],
redirect_policy: RedirectPolicy::default(),
cross_origin_credential_policy: CrossOriginCredentialsPolicy::Secure,
}
}
}
@ -173,6 +214,24 @@ impl<'a> BaseClientBuilder<'a> {
self
}
#[must_use]
pub fn redirect(mut self, policy: RedirectPolicy) -> Self {
self.redirect_policy = policy;
self
}
/// Allows credentials to be propagated on cross-origin redirects.
///
/// WARNING: This should only be available for tests. In production code, propagating credentials
/// during cross-origin redirects can lead to security vulnerabilities including credential
/// leakage to untrusted domains.
#[cfg(test)]
#[must_use]
pub fn allow_cross_origin_credentials(mut self) -> Self {
self.cross_origin_credential_policy = CrossOriginCredentialsPolicy::Insecure;
self
}
pub fn is_offline(&self) -> bool {
matches!(self.connectivity, Connectivity::Offline)
}
@ -229,6 +288,7 @@ impl<'a> BaseClientBuilder<'a> {
timeout,
ssl_cert_file_exists,
Security::Secure,
self.redirect_policy,
);
// Create an insecure client that accepts invalid certificates.
@ -237,11 +297,20 @@ impl<'a> BaseClientBuilder<'a> {
timeout,
ssl_cert_file_exists,
Security::Insecure,
self.redirect_policy,
);
// Wrap in any relevant middleware and handle connectivity.
let client = self.apply_middleware(raw_client.clone());
let dangerous_client = self.apply_middleware(raw_dangerous_client.clone());
let client = RedirectClientWithMiddleware {
client: self.apply_middleware(raw_client.clone()),
redirect_policy: self.redirect_policy,
cross_origin_credentials_policy: self.cross_origin_credential_policy,
};
let dangerous_client = RedirectClientWithMiddleware {
client: self.apply_middleware(raw_dangerous_client.clone()),
redirect_policy: self.redirect_policy,
cross_origin_credentials_policy: self.cross_origin_credential_policy,
};
BaseClient {
connectivity: self.connectivity,
@ -258,8 +327,16 @@ impl<'a> BaseClientBuilder<'a> {
/// Share the underlying client between two different middleware configurations.
pub fn wrap_existing(&self, existing: &BaseClient) -> BaseClient {
// Wrap in any relevant middleware and handle connectivity.
let client = self.apply_middleware(existing.raw_client.clone());
let dangerous_client = self.apply_middleware(existing.raw_dangerous_client.clone());
let client = RedirectClientWithMiddleware {
client: self.apply_middleware(existing.raw_client.clone()),
redirect_policy: self.redirect_policy,
cross_origin_credentials_policy: self.cross_origin_credential_policy,
};
let dangerous_client = RedirectClientWithMiddleware {
client: self.apply_middleware(existing.raw_dangerous_client.clone()),
redirect_policy: self.redirect_policy,
cross_origin_credentials_policy: self.cross_origin_credential_policy,
};
BaseClient {
connectivity: self.connectivity,
@ -279,6 +356,7 @@ impl<'a> BaseClientBuilder<'a> {
timeout: Duration,
ssl_cert_file_exists: bool,
security: Security,
redirect_policy: RedirectPolicy,
) -> Client {
// Configure the builder.
let client_builder = ClientBuilder::new()
@ -286,7 +364,8 @@ impl<'a> BaseClientBuilder<'a> {
.user_agent(user_agent)
.pool_max_idle_per_host(20)
.read_timeout(timeout)
.tls_built_in_root_certs(false);
.tls_built_in_root_certs(false)
.redirect(redirect_policy.reqwest_policy());
// If necessary, accept invalid certificates.
let client_builder = match security {
@ -381,9 +460,9 @@ impl<'a> BaseClientBuilder<'a> {
#[derive(Debug, Clone)]
pub struct BaseClient {
/// The underlying HTTP client that enforces valid certificates.
client: ClientWithMiddleware,
client: RedirectClientWithMiddleware,
/// The underlying HTTP client that accepts invalid certificates.
dangerous_client: ClientWithMiddleware,
dangerous_client: RedirectClientWithMiddleware,
/// The HTTP client without middleware.
raw_client: Client,
/// The HTTP client that accepts invalid certificates without middleware.
@ -408,7 +487,7 @@ enum Security {
impl BaseClient {
/// Selects the appropriate client based on the host's trustworthiness.
pub fn for_host(&self, url: &DisplaySafeUrl) -> &ClientWithMiddleware {
pub fn for_host(&self, url: &DisplaySafeUrl) -> &RedirectClientWithMiddleware {
if self.disable_ssl(url) {
&self.dangerous_client
} else {
@ -416,6 +495,12 @@ impl BaseClient {
}
}
/// Executes a request, applying redirect policy.
pub async fn execute(&self, req: Request) -> reqwest_middleware::Result<Response> {
let client = self.for_host(&DisplaySafeUrl::from(req.url().clone()));
client.execute(req).await
}
/// Returns `true` if the host is trusted to use the insecure client.
pub fn disable_ssl(&self, url: &DisplaySafeUrl) -> bool {
self.allow_insecure_host
@ -439,6 +524,326 @@ impl BaseClient {
}
}
/// Wrapper around [`ClientWithMiddleware`] that manages redirects.
#[derive(Debug, Clone)]
pub struct RedirectClientWithMiddleware {
client: ClientWithMiddleware,
redirect_policy: RedirectPolicy,
/// Whether credentials should be preserved during cross-origin redirects.
///
/// WARNING: This should only be available for tests. In production code, preserving credentials
/// during cross-origin redirects can lead to security vulnerabilities including credential
/// leakage to untrusted domains.
cross_origin_credentials_policy: CrossOriginCredentialsPolicy,
}
impl RedirectClientWithMiddleware {
/// Convenience method to make a `GET` request to a URL.
pub fn get<U: IntoUrl>(&self, url: U) -> RequestBuilder {
RequestBuilder::new(self.client.get(url), self)
}
/// Convenience method to make a `POST` request to a URL.
pub fn post<U: IntoUrl>(&self, url: U) -> RequestBuilder {
RequestBuilder::new(self.client.post(url), self)
}
/// Convenience method to make a `HEAD` request to a URL.
pub fn head<U: IntoUrl>(&self, url: U) -> RequestBuilder {
RequestBuilder::new(self.client.head(url), self)
}
/// Executes a request, applying the redirect policy.
pub async fn execute(&self, req: Request) -> reqwest_middleware::Result<Response> {
match self.redirect_policy {
RedirectPolicy::BypassMiddleware => self.client.execute(req).await,
RedirectPolicy::RetriggerMiddleware => self.execute_with_redirect_handling(req).await,
}
}
/// Executes a request. If the response is a redirect (one of HTTP 301, 302, 303, 307, or 308), the
/// request is executed again with the redirect location URL (up to a maximum number of
/// redirects).
///
/// Unlike the built-in reqwest redirect policies, this sends the redirect request through the
/// entire middleware pipeline again.
///
/// See RFC 7231 7.1.2 <https://www.rfc-editor.org/rfc/rfc7231#section-7.1.2> for details on
/// redirect semantics.
async fn execute_with_redirect_handling(
&self,
req: Request,
) -> reqwest_middleware::Result<Response> {
let mut request = req;
let mut redirects = 0;
let max_redirects = DEFAULT_MAX_REDIRECTS;
loop {
let result = self
.client
.execute(request.try_clone().expect("HTTP request must be cloneable"))
.await;
let Ok(response) = result else {
return result;
};
if redirects >= max_redirects {
return Ok(response);
}
let Some(redirect_request) =
request_into_redirect(request, &response, self.cross_origin_credentials_policy)?
else {
return Ok(response);
};
redirects += 1;
request = redirect_request;
}
}
pub fn raw_client(&self) -> &ClientWithMiddleware {
&self.client
}
}
impl From<RedirectClientWithMiddleware> for ClientWithMiddleware {
fn from(item: RedirectClientWithMiddleware) -> ClientWithMiddleware {
item.client
}
}
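The loop above differs from reqwest's built-in behavior in one key way: each hop is re-issued through the entire middleware stack (auth, retries, offline handling) rather than being followed inside reqwest. Below is a self-contained sketch of the same control flow using plain `reqwest` and `url`, with the header handling simplified relative to `request_into_redirect` (GET requests only, and only basic-auth credentials are dropped on cross-origin hops).

use reqwest::{Client, header, redirect::Policy};
use url::Url;

/// Follow redirects manually so each hop can be re-processed (in uv, by the
/// middleware stack). Simplified sketch, not the crate's implementation.
async fn get_following_redirects(
    start: Url,
    mut credentials: Option<(String, String)>,
) -> Result<reqwest::Response, Box<dyn std::error::Error>> {
    // Disable reqwest's built-in redirect handling, as `RedirectPolicy::RetriggerMiddleware` does.
    let client = Client::builder().redirect(Policy::none()).build()?;
    let mut url = start;
    for _ in 0..10 {
        let mut request = client.get(url.clone());
        if let Some((user, password)) = &credentials {
            request = request.basic_auth(user, Some(password));
        }
        let response = request.send().await?;
        if !response.status().is_redirection() {
            return Ok(response);
        }
        let location = response
            .headers()
            .get(header::LOCATION)
            .ok_or("redirect without a Location header")?
            .to_str()?;
        // Relative `Location` values are resolved against the current URL.
        let next = url.join(location)?;
        // Drop credentials when the redirect crosses origins.
        if next.host_str() != url.host_str()
            || next.port_or_known_default() != url.port_or_known_default()
        {
            credentials = None;
        }
        url = next;
    }
    Err("too many redirects".into())
}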
/// Check if this should be a redirect and, if so, return a new redirect request.
///
/// This implementation is based on the [`reqwest`] crate redirect implementation.
/// It takes ownership of the original [`Request`] and mutates it to create the new
/// redirect [`Request`].
fn request_into_redirect(
mut req: Request,
res: &Response,
cross_origin_credentials_policy: CrossOriginCredentialsPolicy,
) -> reqwest_middleware::Result<Option<Request>> {
let original_req_url = DisplaySafeUrl::from(req.url().clone());
let status = res.status();
let should_redirect = match status {
StatusCode::MOVED_PERMANENTLY
| StatusCode::FOUND
| StatusCode::TEMPORARY_REDIRECT
| StatusCode::PERMANENT_REDIRECT => true,
StatusCode::SEE_OTHER => {
// Per RFC 7231, HTTP 303 is intended for the user agent
// to perform a GET or HEAD request to the redirect target.
// Historically, some browsers also changed method from POST
// to GET on 301 or 302, but this is not required by RFC 7231
// and was not intended by the HTTP spec.
*req.body_mut() = None;
for header in &[
TRANSFER_ENCODING,
CONTENT_ENCODING,
CONTENT_TYPE,
CONTENT_LENGTH,
] {
req.headers_mut().remove(header);
}
match *req.method() {
Method::GET | Method::HEAD => {}
_ => {
*req.method_mut() = Method::GET;
}
}
true
}
_ => false,
};
if !should_redirect {
return Ok(None);
}
let location = res
.headers()
.get(LOCATION)
.ok_or(reqwest_middleware::Error::Middleware(anyhow!(
"Server returned redirect (HTTP {status}) without destination URL. This may indicate a server configuration issue"
)))?
.to_str()
.map_err(|_| {
reqwest_middleware::Error::Middleware(anyhow!(
"Invalid HTTP {status} 'Location' value: must only contain visible ascii characters"
))
})?;
let mut redirect_url = match DisplaySafeUrl::parse(location) {
Ok(url) => url,
// Per RFC 7231, URLs should be resolved against the request URL.
Err(ParseError::RelativeUrlWithoutBase) => original_req_url.join(location).map_err(|err| {
reqwest_middleware::Error::Middleware(anyhow!(
"Invalid HTTP {status} 'Location' value `{location}` relative to `{original_req_url}`: {err}"
))
})?,
Err(err) => {
return Err(reqwest_middleware::Error::Middleware(anyhow!(
"Invalid HTTP {status} 'Location' value `{location}`: {err}"
)));
}
};
// Per RFC 7231, fragments must be propagated
if let Some(fragment) = original_req_url.fragment() {
redirect_url.set_fragment(Some(fragment));
}
// Ensure the URL is a valid HTTP URI.
if let Err(err) = redirect_url.as_str().parse::<http::Uri>() {
return Err(reqwest_middleware::Error::Middleware(anyhow!(
"HTTP {status} 'Location' value `{redirect_url}` is not a valid HTTP URI: {err}"
)));
}
if redirect_url.scheme() != "http" && redirect_url.scheme() != "https" {
return Err(reqwest_middleware::Error::Middleware(anyhow!(
"Invalid HTTP {status} 'Location' value `{redirect_url}`: scheme needs to be https or http"
)));
}
let mut headers = HeaderMap::new();
std::mem::swap(req.headers_mut(), &mut headers);
let cross_host = redirect_url.host_str() != original_req_url.host_str()
|| redirect_url.port_or_known_default() != original_req_url.port_or_known_default();
if cross_host {
if cross_origin_credentials_policy == CrossOriginCredentialsPolicy::Secure {
debug!("Received a cross-origin redirect. Removing sensitive headers.");
headers.remove(AUTHORIZATION);
headers.remove(COOKIE);
headers.remove(PROXY_AUTHORIZATION);
headers.remove(WWW_AUTHENTICATE);
}
// If the redirect request is not a cross-origin request and the original request already
// had a Referer header, attempt to set the Referer header for the redirect request.
} else if headers.contains_key(REFERER) {
if let Some(referer) = make_referer(&redirect_url, &original_req_url) {
headers.insert(REFERER, referer);
}
}
// Check if there are credentials on the redirect location itself.
// If so, move them to Authorization header.
if !redirect_url.username().is_empty() {
if let Some(credentials) = Credentials::from_url(&redirect_url) {
let _ = redirect_url.set_username("");
let _ = redirect_url.set_password(None);
headers.insert(AUTHORIZATION, credentials.to_header_value());
}
}
std::mem::swap(req.headers_mut(), &mut headers);
*req.url_mut() = Url::from(redirect_url);
debug!(
"Received HTTP {status}. Redirecting to {}",
DisplaySafeUrl::ref_cast(req.url())
);
Ok(Some(req))
}
/// Return a Referer [`HeaderValue`] according to RFC 7231.
///
/// Return [`None`] if https has been downgraded in the redirect location.
fn make_referer(
redirect_url: &DisplaySafeUrl,
original_url: &DisplaySafeUrl,
) -> Option<HeaderValue> {
if redirect_url.scheme() == "http" && original_url.scheme() == "https" {
return None;
}
let mut referer = original_url.clone();
referer.remove_credentials();
referer.set_fragment(None);
referer.as_str().parse().ok()
}
#[derive(Clone, Copy, Debug, Default, PartialEq, Eq, Hash)]
pub(crate) enum CrossOriginCredentialsPolicy {
/// Do not propagate credentials on cross-origin requests.
#[default]
Secure,
/// Propagate credentials on cross-origin requests.
///
/// WARNING: This should only be available for tests. In production code, preserving credentials
/// during cross-origin redirects can lead to security vulnerabilities including credential
/// leakage to untrusted domains.
#[cfg(test)]
Insecure,
}
/// A builder to construct the properties of a `Request`.
///
/// This wraps [`reqwest_middleware::RequestBuilder`] to ensure that the [`BaseClient`]
/// redirect policy is respected if `send()` is called.
#[derive(Debug)]
#[must_use]
pub struct RequestBuilder<'a> {
builder: reqwest_middleware::RequestBuilder,
client: &'a RedirectClientWithMiddleware,
}
impl<'a> RequestBuilder<'a> {
pub fn new(
builder: reqwest_middleware::RequestBuilder,
client: &'a RedirectClientWithMiddleware,
) -> Self {
Self { builder, client }
}
/// Add a `Header` to this Request.
pub fn header<K, V>(mut self, key: K, value: V) -> Self
where
HeaderName: TryFrom<K>,
<HeaderName as TryFrom<K>>::Error: Into<http::Error>,
HeaderValue: TryFrom<V>,
<HeaderValue as TryFrom<V>>::Error: Into<http::Error>,
{
self.builder = self.builder.header(key, value);
self
}
/// Add a set of Headers to the existing ones on this Request.
///
/// The headers will be merged in to any already set.
pub fn headers(mut self, headers: HeaderMap) -> Self {
self.builder = self.builder.headers(headers);
self
}
#[cfg(not(target_arch = "wasm32"))]
pub fn version(mut self, version: reqwest::Version) -> Self {
self.builder = self.builder.version(version);
self
}
#[cfg_attr(docsrs, doc(cfg(feature = "multipart")))]
pub fn multipart(mut self, multipart: multipart::Form) -> Self {
self.builder = self.builder.multipart(multipart);
self
}
/// Build a `Request`.
pub fn build(self) -> reqwest::Result<Request> {
self.builder.build()
}
/// Constructs the Request and sends it to the target URL, returning a
/// future Response.
pub async fn send(self) -> reqwest_middleware::Result<Response> {
self.client.execute(self.build()?).await
}
pub fn raw_builder(&self) -> &reqwest_middleware::RequestBuilder {
&self.builder
}
}
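The wrapper exists so that `send()` funnels through `RedirectClientWithMiddleware::execute`, which is what applies the redirect policy; calling the underlying `reqwest_middleware` builder directly would bypass it. A usage sketch built only from the methods shown above (types as imported at the top of this file); the URL and header value are placeholders, not endpoints used by the crate.

// `client` is a RedirectClientWithMiddleware; `send()` routes through
// `execute`, so the configured RedirectPolicy applies to this request too.
async fn fetch_simple_index(
    client: &RedirectClientWithMiddleware,
) -> reqwest_middleware::Result<Response> {
    client
        .get("https://example.invalid/simple/")
        .header("Accept", "application/vnd.pypi.simple.v1+json")
        .send()
        .await
}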
/// Extends [`DefaultRetryableStrategy`], to log transient request failures and additional retry cases.
pub struct UvRetryableStrategy;
@ -528,3 +933,204 @@ fn find_source<E: Error + 'static>(orig: &dyn Error) -> Option<&E> {
fn find_sources<E: Error + 'static>(orig: &dyn Error) -> impl Iterator<Item = &E> {
iter::successors(find_source::<E>(orig), |&err| find_source(err))
}
#[cfg(test)]
mod tests {
use super::*;
use anyhow::Result;
use reqwest::{Client, Method};
use wiremock::matchers::method;
use wiremock::{Mock, MockServer, ResponseTemplate};
use crate::base_client::request_into_redirect;
#[tokio::test]
async fn test_redirect_preserves_authorization_header_on_same_origin() -> Result<()> {
for status in &[301, 302, 303, 307, 308] {
let server = MockServer::start().await;
Mock::given(method("GET"))
.respond_with(
ResponseTemplate::new(*status)
.insert_header("location", format!("{}/redirect", server.uri())),
)
.mount(&server)
.await;
let request = Client::new()
.get(server.uri())
.basic_auth("username", Some("password"))
.build()
.unwrap();
assert!(request.headers().contains_key(AUTHORIZATION));
let response = Client::builder()
.redirect(reqwest::redirect::Policy::none())
.build()
.unwrap()
.execute(request.try_clone().unwrap())
.await
.unwrap();
let redirect_request =
request_into_redirect(request, &response, CrossOriginCredentialsPolicy::Secure)?
.unwrap();
assert!(redirect_request.headers().contains_key(AUTHORIZATION));
}
Ok(())
}
#[tokio::test]
async fn test_redirect_preserves_fragment() -> Result<()> {
for status in &[301, 302, 303, 307, 308] {
let server = MockServer::start().await;
Mock::given(method("GET"))
.respond_with(
ResponseTemplate::new(*status)
.insert_header("location", format!("{}/redirect", server.uri())),
)
.mount(&server)
.await;
let request = Client::new()
.get(format!("{}#fragment", server.uri()))
.build()
.unwrap();
let response = Client::builder()
.redirect(reqwest::redirect::Policy::none())
.build()
.unwrap()
.execute(request.try_clone().unwrap())
.await
.unwrap();
let redirect_request =
request_into_redirect(request, &response, CrossOriginCredentialsPolicy::Secure)?
.unwrap();
assert!(
redirect_request
.url()
.fragment()
.is_some_and(|fragment| fragment == "fragment")
);
}
Ok(())
}
#[tokio::test]
async fn test_redirect_removes_authorization_header_on_cross_origin() -> Result<()> {
for status in &[301, 302, 303, 307, 308] {
let server = MockServer::start().await;
Mock::given(method("GET"))
.respond_with(
ResponseTemplate::new(*status)
.insert_header("location", "https://cross-origin.com/simple"),
)
.mount(&server)
.await;
let request = Client::new()
.get(server.uri())
.basic_auth("username", Some("password"))
.build()
.unwrap();
assert!(request.headers().contains_key(AUTHORIZATION));
let response = Client::builder()
.redirect(reqwest::redirect::Policy::none())
.build()
.unwrap()
.execute(request.try_clone().unwrap())
.await
.unwrap();
let redirect_request =
request_into_redirect(request, &response, CrossOriginCredentialsPolicy::Secure)?
.unwrap();
assert!(!redirect_request.headers().contains_key(AUTHORIZATION));
}
Ok(())
}
#[tokio::test]
async fn test_redirect_303_changes_post_to_get() -> Result<()> {
let server = MockServer::start().await;
Mock::given(method("POST"))
.respond_with(
ResponseTemplate::new(303)
.insert_header("location", format!("{}/redirect", server.uri())),
)
.mount(&server)
.await;
let request = Client::new()
.post(server.uri())
.basic_auth("username", Some("password"))
.build()
.unwrap();
assert_eq!(request.method(), Method::POST);
let response = Client::builder()
.redirect(reqwest::redirect::Policy::none())
.build()
.unwrap()
.execute(request.try_clone().unwrap())
.await
.unwrap();
let redirect_request =
request_into_redirect(request, &response, CrossOriginCredentialsPolicy::Secure)?
.unwrap();
assert_eq!(redirect_request.method(), Method::GET);
Ok(())
}
#[tokio::test]
async fn test_redirect_no_referer_if_disabled() -> Result<()> {
for status in &[301, 302, 303, 307, 308] {
let server = MockServer::start().await;
Mock::given(method("GET"))
.respond_with(
ResponseTemplate::new(*status)
.insert_header("location", format!("{}/redirect", server.uri())),
)
.mount(&server)
.await;
let request = Client::builder()
.referer(false)
.build()
.unwrap()
.get(server.uri())
.basic_auth("username", Some("password"))
.build()
.unwrap();
assert!(!request.headers().contains_key(REFERER));
let response = Client::builder()
.redirect(reqwest::redirect::Policy::none())
.build()
.unwrap()
.execute(request.try_clone().unwrap())
.await
.unwrap();
let redirect_request =
request_into_redirect(request, &response, CrossOriginCredentialsPolicy::Secure)?
.unwrap();
assert!(!redirect_request.headers().contains_key(REFERER));
}
Ok(())
}
}

View file

@ -1,4 +1,3 @@
use std::fmt::{Debug, Display, Formatter};
use std::time::{Duration, SystemTime};
use std::{borrow::Cow, path::Path};
@ -100,44 +99,62 @@ where
}
}
/// Either a cached client error or a (user specified) error from the callback
/// Dispatch type: Either a cached client error or a (user specified) error from the callback
pub enum CachedClientError<CallbackError: std::error::Error + 'static> {
Client(Error),
Callback(CallbackError),
Client {
retries: Option<u32>,
err: Error,
},
Callback {
retries: Option<u32>,
err: CallbackError,
},
}
impl<CallbackError: std::error::Error + 'static> Display for CachedClientError<CallbackError> {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
impl<CallbackError: std::error::Error + 'static> CachedClientError<CallbackError> {
/// Attach the number of retries to the error context.
///
/// Adds to the existing retry count, if any, in case different layers retried.
fn with_retries(self, retries: u32) -> Self {
match self {
CachedClientError::Client(err) => write!(f, "{err}"),
CachedClientError::Callback(err) => write!(f, "{err}"),
CachedClientError::Client {
retries: existing_retries,
err,
} => CachedClientError::Client {
retries: Some(existing_retries.unwrap_or_default() + retries),
err,
},
CachedClientError::Callback {
retries: existing_retries,
err,
} => CachedClientError::Callback {
retries: Some(existing_retries.unwrap_or_default() + retries),
err,
},
}
}
}
impl<CallbackError: std::error::Error + 'static> Debug for CachedClientError<CallbackError> {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
fn retries(&self) -> Option<u32> {
match self {
CachedClientError::Client(err) => write!(f, "{err:?}"),
CachedClientError::Callback(err) => write!(f, "{err:?}"),
CachedClientError::Client { retries, .. } => *retries,
CachedClientError::Callback { retries, .. } => *retries,
}
}
}
impl<CallbackError: std::error::Error + 'static> std::error::Error
for CachedClientError<CallbackError>
{
fn source(&self) -> Option<&(dyn std::error::Error + 'static)> {
fn error(&self) -> &dyn std::error::Error {
match self {
CachedClientError::Client(err) => Some(err),
CachedClientError::Callback(err) => Some(err),
CachedClientError::Client { err, .. } => err,
CachedClientError::Callback { err, .. } => err,
}
}
}
impl<CallbackError: std::error::Error + 'static> From<Error> for CachedClientError<CallbackError> {
fn from(error: Error) -> Self {
Self::Client(error)
Self::Client {
retries: None,
err: error,
}
}
}
@ -145,15 +162,35 @@ impl<CallbackError: std::error::Error + 'static> From<ErrorKind>
for CachedClientError<CallbackError>
{
fn from(error: ErrorKind) -> Self {
Self::Client(error.into())
Self::Client {
retries: None,
err: error.into(),
}
}
}
impl<E: Into<Self> + std::error::Error + 'static> From<CachedClientError<E>> for Error {
/// Attach retry error context, if there were retries.
fn from(error: CachedClientError<E>) -> Self {
match error {
CachedClientError::Client(error) => error,
CachedClientError::Callback(error) => error.into(),
CachedClientError::Client {
retries: Some(retries),
err,
} => ErrorKind::RequestWithRetries {
source: Box::new(err.into_kind()),
retries,
}
.into(),
CachedClientError::Client { retries: None, err } => err,
CachedClientError::Callback {
retries: Some(retries),
err,
} => ErrorKind::RequestWithRetries {
source: Box::new(err.into().into_kind()),
retries,
}
.into(),
CachedClientError::Callback { retries: None, err } => err.into(),
}
}
}
@ -385,7 +422,7 @@ impl CachedClient {
let data = response_callback(response)
.boxed_local()
.await
.map_err(|err| CachedClientError::Callback(err))?;
.map_err(|err| CachedClientError::Callback { retries: None, err })?;
let Some(cache_policy) = cache_policy else {
return Ok(data.into_target());
};
@ -486,7 +523,6 @@ impl CachedClient {
debug!("Sending revalidation request for: {url}");
let response = self
.0
.for_host(&url)
.execute(req)
.instrument(info_span!("revalidation_request", url = url.as_str()))
.await
@ -527,12 +563,23 @@ impl CachedClient {
let cache_policy_builder = CachePolicyBuilder::new(&req);
let response = self
.0
.for_host(&url)
.execute(req)
.await
.map_err(|err| ErrorKind::from_reqwest_middleware(url.clone(), err))?
.error_for_status()
.map_err(|err| ErrorKind::from_reqwest(url.clone(), err))?;
.map_err(|err| ErrorKind::from_reqwest_middleware(url.clone(), err))?;
let retry_count = response
.extensions()
.get::<reqwest_retry::RetryCount>()
.map(|retries| retries.value());
if let Err(status_error) = response.error_for_status_ref() {
return Err(CachedClientError::<Error>::Client {
retries: retry_count,
err: ErrorKind::from_reqwest(url, status_error).into(),
}
.into());
}
let cache_policy = cache_policy_builder.build(&response);
let cache_policy = if cache_policy.to_archived().is_storable() {
Some(Box::new(cache_policy))
@ -579,7 +626,7 @@ impl CachedClient {
cache_control: CacheControl,
response_callback: Callback,
) -> Result<Payload::Target, CachedClientError<CallBackError>> {
let mut n_past_retries = 0;
let mut past_retries = 0;
let start_time = SystemTime::now();
let retry_policy = self.uncached().retry_policy();
loop {
@ -587,11 +634,20 @@ impl CachedClient {
let result = self
.get_cacheable(fresh_req, cache_entry, cache_control, &response_callback)
.await;
// Check if the middleware already performed retries
let middleware_retries = match &result {
Err(err) => err.retries().unwrap_or_default(),
Ok(_) => 0,
};
if result
.as_ref()
.is_err_and(|err| is_extended_transient_error(err))
.is_err_and(|err| is_extended_transient_error(err.error()))
{
let retry_decision = retry_policy.should_retry(start_time, n_past_retries);
// If middleware already retried, consider that in our retry budget
let total_retries = past_retries + middleware_retries;
let retry_decision = retry_policy.should_retry(start_time, total_retries);
if let reqwest_retry::RetryDecision::Retry { execute_after } = retry_decision {
debug!(
"Transient failure while handling response from {}; retrying...",
@ -601,10 +657,15 @@ impl CachedClient {
.duration_since(SystemTime::now())
.unwrap_or_else(|_| Duration::default());
tokio::time::sleep(duration).await;
n_past_retries += 1;
past_retries += 1;
continue;
}
}
if past_retries > 0 {
return result.map_err(|err| err.with_retries(past_retries));
}
return result;
}
}
@ -622,7 +683,7 @@ impl CachedClient {
cache_entry: &CacheEntry,
response_callback: Callback,
) -> Result<Payload, CachedClientError<CallBackError>> {
let mut n_past_retries = 0;
let mut past_retries = 0;
let start_time = SystemTime::now();
let retry_policy = self.uncached().retry_policy();
loop {
@ -630,12 +691,20 @@ impl CachedClient {
let result = self
.skip_cache(fresh_req, cache_entry, &response_callback)
.await;
// Check if the middleware already performed retries
let middleware_retries = match &result {
Err(err) => err.retries().unwrap_or_default(),
_ => 0,
};
if result
.as_ref()
.err()
.is_some_and(|err| is_extended_transient_error(err))
.is_some_and(|err| is_extended_transient_error(err.error()))
{
let retry_decision = retry_policy.should_retry(start_time, n_past_retries);
let total_retries = past_retries + middleware_retries;
let retry_decision = retry_policy.should_retry(start_time, total_retries);
if let reqwest_retry::RetryDecision::Retry { execute_after } = retry_decision {
debug!(
"Transient failure while handling response from {}; retrying...",
@ -645,10 +714,15 @@ impl CachedClient {
.duration_since(SystemTime::now())
.unwrap_or_else(|_| Duration::default());
tokio::time::sleep(duration).await;
n_past_retries += 1;
past_retries += 1;
continue;
}
}
if past_retries > 0 {
return result.map_err(|err| err.with_retries(past_retries));
}
return result;
}
}
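Both retry loops now fold the middleware's own retry count (read from the `reqwest_retry::RetryCount` response extension earlier in this file) into the handler-level count before consulting the retry policy, so the two layers draw from a single budget. A condensed restatement of that decision step; the `RetryPolicy` trait import path is assumed to be re-exported by `reqwest_retry`, everything else appears verbatim in this diff.

use std::time::SystemTime;
// `RetryDecision` and `policies::ExponentialBackoff` are used via these paths
// elsewhere in this diff; the `RetryPolicy` re-export is an assumption.
use reqwest_retry::{RetryDecision, RetryPolicy, policies::ExponentialBackoff};

// Retries already performed inside the middleware count against the same
// budget as retries performed by these handler-level loops.
fn next_attempt(
    retry_policy: &ExponentialBackoff,
    start_time: SystemTime,
    past_retries: u32,
    middleware_retries: u32,
) -> RetryDecision {
    let total_retries = past_retries + middleware_retries;
    retry_policy.should_retry(start_time, total_retries)
}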

View file

@ -197,6 +197,13 @@ pub enum ErrorKind {
#[error("Failed to fetch: `{0}`")]
WrappedReqwestError(DisplaySafeUrl, #[source] WrappedReqwestError),
/// Add the number of failed retries to the error.
#[error("Request failed after {retries} retries")]
RequestWithRetries {
source: Box<ErrorKind>,
retries: u32,
},
#[error("Received some unexpected JSON from {}", url)]
BadJson {
source: serde_json::Error,

View file

@ -246,7 +246,7 @@ impl<'a> FlatIndexClient<'a> {
.collect();
Ok(FlatIndexEntries::from_entries(files))
}
Err(CachedClientError::Client(err)) if err.is_offline() => {
Err(CachedClientError::Client { err, .. }) if err.is_offline() => {
Ok(FlatIndexEntries::offline())
}
Err(err) => Err(err.into()),

View file

@ -21,7 +21,6 @@ use crate::rkyvutil::OwnedArchive;
rkyv::Serialize,
)]
#[rkyv(derive(Debug))]
#[allow(clippy::struct_excessive_bools)]
pub struct CacheControl {
// directives for requests and responses
/// * <https://www.rfc-editor.org/rfc/rfc9111.html#name-max-age>

View file

@ -1,6 +1,6 @@
pub use base_client::{
AuthIntegration, BaseClient, BaseClientBuilder, DEFAULT_RETRIES, ExtraMiddleware,
UvRetryableStrategy, is_extended_transient_error,
RedirectClientWithMiddleware, RequestBuilder, UvRetryableStrategy, is_extended_transient_error,
};
pub use cached_client::{CacheControl, CachedClient, CachedClientError, DataWithCachePolicy};
pub use error::{Error, ErrorKind, WrappedReqwestError};

View file

@ -10,7 +10,6 @@ use futures::{FutureExt, StreamExt, TryStreamExt};
use http::{HeaderMap, StatusCode};
use itertools::Either;
use reqwest::{Proxy, Response};
use reqwest_middleware::ClientWithMiddleware;
use rustc_hash::FxHashMap;
use tokio::sync::{Mutex, Semaphore};
use tracing::{Instrument, debug, info_span, instrument, trace, warn};
@ -35,15 +34,15 @@ use uv_redacted::DisplaySafeUrl;
use uv_small_str::SmallString;
use uv_torch::TorchStrategy;
use crate::base_client::{BaseClientBuilder, ExtraMiddleware};
use crate::base_client::{BaseClientBuilder, ExtraMiddleware, RedirectPolicy};
use crate::cached_client::CacheControl;
use crate::flat_index::FlatIndexEntry;
use crate::html::SimpleHtml;
use crate::remote_metadata::wheel_metadata_from_remote_zip;
use crate::rkyvutil::OwnedArchive;
use crate::{
BaseClient, CachedClient, CachedClientError, Error, ErrorKind, FlatIndexClient,
FlatIndexEntries,
BaseClient, CachedClient, Error, ErrorKind, FlatIndexClient, FlatIndexEntries,
RedirectClientWithMiddleware,
};
/// A builder for an [`RegistryClient`].
@ -152,9 +151,23 @@ impl<'a> RegistryClientBuilder<'a> {
self
}
/// Allows credentials to be propagated on cross-origin redirects.
///
/// WARNING: This should only be available for tests. In production code, propagating credentials
/// during cross-origin redirects can lead to security vulnerabilities including credential
/// leakage to untrusted domains.
#[cfg(test)]
#[must_use]
pub fn allow_cross_origin_credentials(mut self) -> Self {
self.base_client_builder = self.base_client_builder.allow_cross_origin_credentials();
self
}
pub fn build(self) -> RegistryClient {
// Build a base client
let builder = self.base_client_builder;
let builder = self
.base_client_builder
.redirect(RedirectPolicy::RetriggerMiddleware);
let client = builder.build();
@ -251,7 +264,7 @@ impl RegistryClient {
}
/// Return the [`BaseClient`] used by this client.
pub fn uncached_client(&self, url: &DisplaySafeUrl) -> &ClientWithMiddleware {
pub fn uncached_client(&self, url: &DisplaySafeUrl) -> &RedirectClientWithMiddleware {
self.client.uncached().for_host(url)
}
@ -558,7 +571,7 @@ impl RegistryClient {
let simple_request = self
.uncached_client(url)
.get(Url::from(url.clone()))
.header("Accept-Encoding", "gzip")
.header("Accept-Encoding", "gzip, deflate, zstd")
.header("Accept", MediaType::accepts())
.build()
.map_err(|err| ErrorKind::from_reqwest(url.clone(), err))?;
@ -607,18 +620,16 @@ impl RegistryClient {
.boxed_local()
.instrument(info_span!("parse_simple_api", package = %package_name))
};
self.cached_client()
let simple = self
.cached_client()
.get_cacheable_with_retry(
simple_request,
cache_entry,
cache_control,
parse_simple_response,
)
.await
.map_err(|err| match err {
CachedClientError::Client(err) => err,
CachedClientError::Callback(err) => err,
})
.await?;
Ok(simple)
}
/// Fetch the [`SimpleMetadata`] from a local file, using a PEP 503-compatible directory
@ -900,15 +911,13 @@ impl RegistryClient {
.map_err(|err| ErrorKind::AsyncHttpRangeReader(url.clone(), err))?;
trace!("Getting metadata for {filename} by range request");
let text = wheel_metadata_from_remote_zip(filename, url, &mut reader).await?;
let metadata =
ResolutionMetadata::parse_metadata(text.as_bytes()).map_err(|err| {
Error::from(ErrorKind::MetadataParseError(
filename.clone(),
url.to_string(),
Box::new(err),
))
})?;
Ok::<ResolutionMetadata, CachedClientError<Error>>(metadata)
ResolutionMetadata::parse_metadata(text.as_bytes()).map_err(|err| {
Error::from(ErrorKind::MetadataParseError(
filename.clone(),
url.to_string(),
Box::new(err),
))
})
}
.boxed_local()
.instrument(info_span!("read_metadata_range_request", wheel = %filename))
@ -1222,12 +1231,191 @@ impl Connectivity {
mod tests {
use std::str::FromStr;
use url::Url;
use uv_normalize::PackageName;
use uv_pypi_types::{JoinRelativeError, SimpleJson};
use uv_redacted::DisplaySafeUrl;
use crate::{SimpleMetadata, SimpleMetadatum, html::SimpleHtml};
use uv_cache::Cache;
use wiremock::matchers::{basic_auth, method, path_regex};
use wiremock::{Mock, MockServer, ResponseTemplate};
use crate::RegistryClientBuilder;
type Error = Box<dyn std::error::Error>;
async fn start_test_server(username: &'static str, password: &'static str) -> MockServer {
let server = MockServer::start().await;
Mock::given(method("GET"))
.and(basic_auth(username, password))
.respond_with(ResponseTemplate::new(200))
.mount(&server)
.await;
Mock::given(method("GET"))
.respond_with(ResponseTemplate::new(401))
.mount(&server)
.await;
server
}
#[tokio::test]
async fn test_redirect_to_server_with_credentials() -> Result<(), Error> {
let username = "user";
let password = "password";
let auth_server = start_test_server(username, password).await;
let auth_base_url = DisplaySafeUrl::parse(&auth_server.uri())?;
let redirect_server = MockServer::start().await;
// Configure the redirect server to respond with a 302 to the auth server
Mock::given(method("GET"))
.respond_with(
ResponseTemplate::new(302).insert_header("Location", format!("{auth_base_url}")),
)
.mount(&redirect_server)
.await;
let redirect_server_url = DisplaySafeUrl::parse(&redirect_server.uri())?;
let cache = Cache::temp()?;
let registry_client = RegistryClientBuilder::new(cache)
.allow_cross_origin_credentials()
.build();
let client = registry_client.cached_client().uncached();
assert_eq!(
client
.for_host(&redirect_server_url)
.get(redirect_server.uri())
.send()
.await?
.status(),
401,
"Requests should fail if credentials are missing"
);
let mut url = redirect_server_url.clone();
let _ = url.set_username(username);
let _ = url.set_password(Some(password));
assert_eq!(
client
.for_host(&redirect_server_url)
.get(Url::from(url))
.send()
.await?
.status(),
200,
"Requests should succeed if credentials are present"
);
Ok(())
}
#[tokio::test]
async fn test_redirect_root_relative_url() -> Result<(), Error> {
let username = "user";
let password = "password";
let redirect_server = MockServer::start().await;
// Configure the redirect server to respond with a 307 with a relative URL.
Mock::given(method("GET"))
.and(path_regex("/foo/"))
.respond_with(
ResponseTemplate::new(307).insert_header("Location", "/bar/baz/".to_string()),
)
.mount(&redirect_server)
.await;
Mock::given(method("GET"))
.and(path_regex("/bar/baz/"))
.and(basic_auth(username, password))
.respond_with(ResponseTemplate::new(200))
.mount(&redirect_server)
.await;
let redirect_server_url = DisplaySafeUrl::parse(&redirect_server.uri())?.join("foo/")?;
let cache = Cache::temp()?;
let registry_client = RegistryClientBuilder::new(cache)
.allow_cross_origin_credentials()
.build();
let client = registry_client.cached_client().uncached();
let mut url = redirect_server_url.clone();
let _ = url.set_username(username);
let _ = url.set_password(Some(password));
assert_eq!(
client
.for_host(&url)
.get(Url::from(url))
.send()
.await?
.status(),
200,
"Requests should succeed for relative URL"
);
Ok(())
}
#[tokio::test]
async fn test_redirect_relative_url() -> Result<(), Error> {
let username = "user";
let password = "password";
let redirect_server = MockServer::start().await;
// Configure the redirect server to respond with a 307 with a relative URL.
Mock::given(method("GET"))
.and(path_regex("/foo/bar/baz/"))
.and(basic_auth(username, password))
.respond_with(ResponseTemplate::new(200))
.mount(&redirect_server)
.await;
Mock::given(method("GET"))
.and(path_regex("/foo/"))
.and(basic_auth(username, password))
.respond_with(
ResponseTemplate::new(307).insert_header("Location", "bar/baz/".to_string()),
)
.mount(&redirect_server)
.await;
let cache = Cache::temp()?;
let registry_client = RegistryClientBuilder::new(cache)
.allow_cross_origin_credentials()
.build();
let client = registry_client.cached_client().uncached();
let redirect_server_url = DisplaySafeUrl::parse(&redirect_server.uri())?.join("foo/")?;
let mut url = redirect_server_url.clone();
let _ = url.set_username(username);
let _ = url.set_password(Some(password));
assert_eq!(
client
.for_host(&url)
.get(Url::from(url))
.send()
.await?
.status(),
200,
"Requests should succeed for relative URL"
);
Ok(())
}
#[test]
fn ignore_failing_files() {
// 1.7.7 has an invalid requires-python field (double comma), 1.7.8 is valid

View file

@ -4,7 +4,7 @@ use uv_pep508::PackageName;
use crate::{PackageNameSpecifier, PackageNameSpecifiers};
#[derive(Copy, Clone, Debug, Default, PartialEq, Eq)]
#[derive(Copy, Clone, Debug, Default, PartialEq, Eq, Hash)]
pub enum BuildKind {
/// A PEP 517 wheel build.
#[default]

View file

@ -295,6 +295,15 @@ pub struct DependencyGroupsWithDefaults {
}
impl DependencyGroupsWithDefaults {
/// Do not enable any groups
///
/// Many places in the code need to know what dependency-groups are active,
/// but various commands or subsystems never enable any dependency-groups,
/// in which case they want this.
pub fn none() -> Self {
DependencyGroups::default().with_defaults(DefaultGroups::default())
}
/// Returns `true` if the specification was enabled, and *only* because it was a default
pub fn contains_because_default(&self, group: &GroupName) -> bool {
self.cur.contains(group) && !self.prev.contains(group)

View file

@ -263,6 +263,14 @@ pub struct ExtrasSpecificationWithDefaults {
}
impl ExtrasSpecificationWithDefaults {
/// Do not enable any extras
///
/// Many places in the code need to know what extras are active,
/// but various commands or subsystems never enable any extras,
/// in which case they want this.
pub fn none() -> Self {
ExtrasSpecification::default().with_defaults(DefaultExtras::default())
}
/// Returns `true` if the specification was enabled, and *only* because it was a default
pub fn contains_because_default(&self, extra: &ExtraName) -> bool {
self.cur.contains(extra) && !self.prev.contains(extra)

View file

@ -1,3 +1,5 @@
#[cfg(feature = "schemars")]
use std::borrow::Cow;
use std::str::FromStr;
use uv_pep508::PackageName;
@ -63,28 +65,16 @@ impl<'de> serde::Deserialize<'de> for PackageNameSpecifier {
#[cfg(feature = "schemars")]
impl schemars::JsonSchema for PackageNameSpecifier {
fn schema_name() -> String {
"PackageNameSpecifier".to_string()
fn schema_name() -> Cow<'static, str> {
Cow::Borrowed("PackageNameSpecifier")
}
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema {
schemars::schema::SchemaObject {
instance_type: Some(schemars::schema::InstanceType::String.into()),
string: Some(Box::new(schemars::schema::StringValidation {
// See: https://packaging.python.org/en/latest/specifications/name-normalization/#name-format
pattern: Some(
r"^(:none:|:all:|([a-zA-Z0-9]|[a-zA-Z0-9][a-zA-Z0-9._-]*[a-zA-Z0-9]))$"
.to_string(),
),
..schemars::schema::StringValidation::default()
})),
metadata: Some(Box::new(schemars::schema::Metadata {
description: Some("The name of a package, or `:all:` or `:none:` to select or omit all packages, respectively.".to_string()),
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
fn json_schema(_gen: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::json_schema!({
"type": "string",
"pattern": r"^(:none:|:all:|([a-zA-Z0-9]|[a-zA-Z0-9][a-zA-Z0-9._-]*[a-zA-Z0-9]))$",
"description": "The name of a package, or `:all:` or `:none:` to select or omit all packages, respectively.",
})
}
}
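This impl (and the `RequiredVersion`, `TrustedHost`, and `IndexUrl` impls that follow) moves from the schemars 0.8 `SchemaObject` builders to the schemars 1.x API: `schema_name` returns `Cow<'static, str>` and `json_schema` builds a `schemars::Schema` with the `json_schema!` macro. The same shape applied to a hypothetical newtype, purely for reference; `MyToken` does not exist in the codebase.

// Hypothetical example of the schemars 1.x impl shape used throughout these files.
struct MyToken(String);

#[cfg(feature = "schemars")]
impl schemars::JsonSchema for MyToken {
    fn schema_name() -> std::borrow::Cow<'static, str> {
        std::borrow::Cow::Borrowed("MyToken")
    }

    fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
        schemars::json_schema!({
            "type": "string",
            "description": "A hypothetical token type, shown only to illustrate the new impl shape."
        })
    }
}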

View file

@ -1,5 +1,6 @@
use std::fmt::Formatter;
use std::str::FromStr;
#[cfg(feature = "schemars")]
use std::borrow::Cow;
use std::{fmt::Formatter, str::FromStr};
use uv_pep440::{Version, VersionSpecifier, VersionSpecifiers, VersionSpecifiersParseError};
@ -36,20 +37,15 @@ impl FromStr for RequiredVersion {
#[cfg(feature = "schemars")]
impl schemars::JsonSchema for RequiredVersion {
fn schema_name() -> String {
String::from("RequiredVersion")
fn schema_name() -> Cow<'static, str> {
Cow::Borrowed("RequiredVersion")
}
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema {
schemars::schema::SchemaObject {
instance_type: Some(schemars::schema::InstanceType::String.into()),
metadata: Some(Box::new(schemars::schema::Metadata {
description: Some("A version specifier, e.g. `>=0.5.0` or `==0.5.0`.".to_string()),
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::json_schema!({
"type": "string",
"description": "A version specifier, e.g. `>=0.5.0` or `==0.5.0`."
})
}
}

View file

@ -1,4 +1,6 @@
#[derive(Debug, Default, Clone, Copy, PartialEq, Eq, serde::Serialize, serde::Deserialize)]
#[derive(
Debug, Default, Clone, Copy, PartialEq, Eq, Hash, serde::Serialize, serde::Deserialize,
)]
#[serde(rename_all = "kebab-case", deny_unknown_fields)]
pub enum SourceStrategy {
/// Use `tool.uv.sources` when resolving dependencies.

View file

@ -226,6 +226,10 @@ pub enum TargetTriple {
#[serde(rename = "aarch64-manylinux_2_40")]
#[serde(alias = "aarch64-manylinux240")]
Aarch64Manylinux240,
/// A wasm32 target using the Pyodide 2024 platform. Meant for use with Python 3.12.
#[cfg_attr(feature = "clap", value(name = "wasm32-pyodide2024"))]
Wasm32Pyodide2024,
}
impl TargetTriple {
@ -450,6 +454,13 @@ impl TargetTriple {
},
Arch::Aarch64,
),
Self::Wasm32Pyodide2024 => Platform::new(
Os::Pyodide {
major: 2024,
minor: 0,
},
Arch::Wasm32,
),
}
}
@ -490,6 +501,7 @@ impl TargetTriple {
Self::Aarch64Manylinux238 => "aarch64",
Self::Aarch64Manylinux239 => "aarch64",
Self::Aarch64Manylinux240 => "aarch64",
Self::Wasm32Pyodide2024 => "wasm32",
}
}
@ -530,6 +542,7 @@ impl TargetTriple {
Self::Aarch64Manylinux238 => "Linux",
Self::Aarch64Manylinux239 => "Linux",
Self::Aarch64Manylinux240 => "Linux",
Self::Wasm32Pyodide2024 => "Emscripten",
}
}
@ -570,6 +583,10 @@ impl TargetTriple {
Self::Aarch64Manylinux238 => "",
Self::Aarch64Manylinux239 => "",
Self::Aarch64Manylinux240 => "",
// This is the value Emscripten gives for its version:
// https://github.com/emscripten-core/emscripten/blob/4.0.8/system/lib/libc/emscripten_syscall_stubs.c#L63
// It doesn't really seem to mean anything? But for completeness we include it here.
Self::Wasm32Pyodide2024 => "#1",
}
}
@ -610,6 +627,9 @@ impl TargetTriple {
Self::Aarch64Manylinux238 => "",
Self::Aarch64Manylinux239 => "",
Self::Aarch64Manylinux240 => "",
// This is the Emscripten compiler version for Pyodide 2024.
// See https://pyodide.org/en/stable/development/abi.html#pyodide-2024-0
Self::Wasm32Pyodide2024 => "3.1.58",
}
}
@ -650,6 +670,7 @@ impl TargetTriple {
Self::Aarch64Manylinux238 => "posix",
Self::Aarch64Manylinux239 => "posix",
Self::Aarch64Manylinux240 => "posix",
Self::Wasm32Pyodide2024 => "posix",
}
}
@ -690,6 +711,7 @@ impl TargetTriple {
Self::Aarch64Manylinux238 => "linux",
Self::Aarch64Manylinux239 => "linux",
Self::Aarch64Manylinux240 => "linux",
Self::Wasm32Pyodide2024 => "emscripten",
}
}
@ -730,6 +752,7 @@ impl TargetTriple {
Self::Aarch64Manylinux238 => true,
Self::Aarch64Manylinux239 => true,
Self::Aarch64Manylinux240 => true,
Self::Wasm32Pyodide2024 => false,
}
}

View file

@ -62,7 +62,7 @@ pub static RAYON_PARALLELISM: AtomicUsize = AtomicUsize::new(0);
/// `LazyLock::force(&RAYON_INITIALIZE)`.
pub static RAYON_INITIALIZE: LazyLock<()> = LazyLock::new(|| {
rayon::ThreadPoolBuilder::new()
.num_threads(RAYON_PARALLELISM.load(Ordering::SeqCst))
.num_threads(RAYON_PARALLELISM.load(Ordering::Relaxed))
.stack_size(min_stack_size())
.build_global()
.expect("failed to initialize global rayon pool");

View file

@ -1,4 +1,6 @@
use serde::{Deserialize, Deserializer};
#[cfg(feature = "schemars")]
use std::borrow::Cow;
use std::str::FromStr;
use url::Url;
@ -143,20 +145,15 @@ impl std::fmt::Display for TrustedHost {
#[cfg(feature = "schemars")]
impl schemars::JsonSchema for TrustedHost {
fn schema_name() -> String {
"TrustedHost".to_string()
fn schema_name() -> Cow<'static, str> {
Cow::Borrowed("TrustedHost")
}
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema {
schemars::schema::SchemaObject {
instance_type: Some(schemars::schema::InstanceType::String.into()),
metadata: Some(Box::new(schemars::schema::Metadata {
description: Some("A host or host-port pair.".to_string()),
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::json_schema!({
"type": "string",
"description": "A host or host-port pair."
})
}
}

View file

@ -4,7 +4,7 @@ use clap::Parser;
use tracing::info;
use uv_cache::{Cache, CacheArgs};
use uv_configuration::Concurrency;
use uv_configuration::{Concurrency, PreviewMode};
use uv_python::{EnvironmentPreference, PythonEnvironment, PythonRequest};
#[derive(Parser)]
@ -26,6 +26,7 @@ pub(crate) async fn compile(args: CompileArgs) -> anyhow::Result<()> {
&PythonRequest::default(),
EnvironmentPreference::OnlyVirtual,
&cache,
PreviewMode::Disabled,
)?
.into_interpreter();
interpreter.sys_executable().to_path_buf()

View file

@ -21,7 +21,7 @@ pub(crate) fn main(args: &Args) -> anyhow::Result<()> {
let filename = "environment.md";
let reference_path = PathBuf::from(ROOT_DIR)
.join("docs")
.join("configuration")
.join("reference")
.join(filename);
match args.mode {

View file

@ -3,7 +3,7 @@ use std::path::PathBuf;
use anstream::println;
use anyhow::{Result, bail};
use pretty_assertions::StrComparison;
use schemars::{JsonSchema, schema_for};
use schemars::JsonSchema;
use serde::Deserialize;
use uv_settings::Options as SettingsOptions;
@ -91,7 +91,10 @@ const REPLACEMENTS: &[(&str, &str)] = &[
/// Generate the JSON schema for the combined options as a string.
fn generate() -> String {
let schema = schema_for!(CombinedOptions);
let settings = schemars::generate::SchemaSettings::draft07();
let generator = schemars::SchemaGenerator::new(settings);
let schema = generator.into_root_schema_for::<CombinedOptions>();
let mut output = serde_json::to_string_pretty(&schema).unwrap();
for (value, replacement) in REPLACEMENTS {

View file

@ -11,7 +11,7 @@ use crate::ROOT_DIR;
use crate::generate_all::Mode;
/// Contains current supported targets
const TARGETS_YML_URL: &str = "https://raw.githubusercontent.com/astral-sh/python-build-standalone/refs/tags/20250529/cpython-unix/targets.yml";
const TARGETS_YML_URL: &str = "https://raw.githubusercontent.com/astral-sh/python-build-standalone/refs/tags/20250702/cpython-unix/targets.yml";
#[derive(clap::Args)]
pub(crate) struct Args {
@ -130,7 +130,7 @@ async fn generate() -> Result<String> {
output.push_str("//! DO NOT EDIT\n");
output.push_str("//!\n");
output.push_str("//! Generated with `cargo run dev generate-sysconfig-metadata`\n");
output.push_str("//! Targets from <https://github.com/astral-sh/python-build-standalone/blob/20250529/cpython-unix/targets.yml>\n");
output.push_str("//! Targets from <https://github.com/astral-sh/python-build-standalone/blob/20250702/cpython-unix/targets.yml>\n");
output.push_str("//!\n");
// Disable clippy/fmt

View file

@ -11,6 +11,7 @@ use itertools::Itertools;
use rustc_hash::FxHashMap;
use thiserror::Error;
use tracing::{debug, instrument, trace};
use uv_build_backend::check_direct_build;
use uv_build_frontend::{SourceBuild, SourceBuildContext};
use uv_cache::Cache;
@ -35,8 +36,8 @@ use uv_resolver::{
PythonRequirement, Resolver, ResolverEnvironment,
};
use uv_types::{
AnyErrorBuild, BuildContext, BuildIsolation, BuildStack, EmptyInstalledPackages, HashStrategy,
InFlight,
AnyErrorBuild, BuildArena, BuildContext, BuildIsolation, BuildStack, EmptyInstalledPackages,
HashStrategy, InFlight,
};
use uv_workspace::WorkspaceCache;
@ -179,6 +180,10 @@ impl BuildContext for BuildDispatch<'_> {
&self.shared_state.git
}
fn build_arena(&self) -> &BuildArena<SourceBuild> {
&self.shared_state.build_arena
}
fn capabilities(&self) -> &IndexCapabilities {
&self.shared_state.capabilities
}
@ -226,6 +231,7 @@ impl BuildContext for BuildDispatch<'_> {
.build(),
&python_requirement,
ResolverEnvironment::specific(marker_env),
self.interpreter.markers(),
// Conflicting groups only make sense when doing universal resolution.
Conflicts::empty(),
Some(tags),
@ -432,6 +438,7 @@ impl BuildContext for BuildDispatch<'_> {
self.build_extra_env_vars.clone(),
build_output,
self.concurrency.builds,
self.preview,
)
.boxed_local()
.await?;
@ -446,12 +453,6 @@ impl BuildContext for BuildDispatch<'_> {
build_kind: BuildKind,
version_id: Option<&'data str>,
) -> Result<Option<DistFilename>, BuildDispatchError> {
// Direct builds are a preview feature with the uv build backend.
if self.preview.is_disabled() {
trace!("Preview is disabled, not checking for direct build");
return Ok(None);
}
let source_tree = if let Some(subdir) = subdirectory {
source.join(subdir)
} else {
@ -519,6 +520,8 @@ pub struct SharedState {
index: InMemoryIndex,
/// The downloaded distributions.
in_flight: InFlight,
/// Build directories for any PEP 517 builds executed during resolution or installation.
build_arena: BuildArena<SourceBuild>,
}
impl SharedState {
@ -531,6 +534,7 @@ impl SharedState {
Self {
git: self.git.clone(),
capabilities: self.capabilities.clone(),
build_arena: self.build_arena.clone(),
..Default::default()
}
}
@ -554,4 +558,9 @@ impl SharedState {
pub fn capabilities(&self) -> &IndexCapabilities {
&self.capabilities
}
/// Return the [`BuildArena`] used by the [`SharedState`].
pub fn build_arena(&self) -> &BuildArena<SourceBuild> {
&self.build_arena
}
}

View file

@ -27,7 +27,6 @@ rkyv = { workspace = true, features = ["smallvec-1"] }
serde = { workspace = true }
smallvec = { workspace = true }
thiserror = { workspace = true }
url = { workspace = true }
[dev-dependencies]
insta = { version = "1.40.0" }

View file

@ -5,7 +5,6 @@ use std::str::FromStr;
use memchr::memchr;
use serde::{Deserialize, Deserializer, Serialize, Serializer, de};
use thiserror::Error;
use url::Url;
use uv_cache_key::cache_digest;
use uv_normalize::{InvalidNameError, PackageName};
@ -300,29 +299,6 @@ impl WheelFilename {
}
}
impl TryFrom<&Url> for WheelFilename {
type Error = WheelFilenameError;
fn try_from(url: &Url) -> Result<Self, Self::Error> {
let filename = url
.path_segments()
.ok_or_else(|| {
WheelFilenameError::InvalidWheelFileName(
url.to_string(),
"URL must have a path".to_string(),
)
})?
.next_back()
.ok_or_else(|| {
WheelFilenameError::InvalidWheelFileName(
url.to_string(),
"URL must contain a filename".to_string(),
)
})?;
Self::from_str(filename)
}
}
impl<'de> Deserialize<'de> for WheelFilename {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where

View file

@ -29,6 +29,7 @@ uv-platform-tags = { workspace = true }
uv-pypi-types = { workspace = true }
uv-redacted = { workspace = true }
uv-small-str = { workspace = true }
uv-warnings = { workspace = true }
arcstr = { workspace = true }
bitflags = { workspace = true }

View file

@ -26,7 +26,11 @@ impl std::fmt::Display for SourceAnnotation {
write!(f, "{project_name} ({})", path.portable_display())
}
RequirementOrigin::Group(path, project_name, group) => {
write!(f, "{project_name} ({}:{group})", path.portable_display())
if let Some(project_name) = project_name {
write!(f, "{project_name} ({}:{group})", path.portable_display())
} else {
write!(f, "({}:{group})", path.portable_display())
}
}
RequirementOrigin::Workspace => {
write!(f, "(workspace)")
@ -45,11 +49,15 @@ impl std::fmt::Display for SourceAnnotation {
}
RequirementOrigin::Group(path, project_name, group) => {
// Group is not used for override
write!(
f,
"--override {project_name} ({}:{group})",
path.portable_display()
)
if let Some(project_name) = project_name {
write!(
f,
"--override {project_name} ({}:{group})",
path.portable_display()
)
} else {
write!(f, "--override ({}:{group})", path.portable_display())
}
}
RequirementOrigin::Workspace => {
write!(f, "--override (workspace)")

View file

@ -1,3 +1,4 @@
use std::borrow::Cow;
use std::fmt::{self, Display, Formatter};
use std::str::FromStr;
@ -160,16 +161,33 @@ impl UrlString {
.unwrap_or(self.as_ref())
}
/// Return the [`UrlString`] with any fragments removed.
/// Return the [`UrlString`] (as a [`Cow`]) with any fragments removed.
#[must_use]
pub fn without_fragment(&self) -> Self {
Self(
self.as_ref()
.split_once('#')
.map(|(path, _)| path)
.map(SmallString::from)
.unwrap_or_else(|| self.0.clone()),
)
pub fn without_fragment(&self) -> Cow<'_, Self> {
self.as_ref()
.split_once('#')
.map(|(path, _)| Cow::Owned(UrlString(SmallString::from(path))))
.unwrap_or(Cow::Borrowed(self))
}
/// Return the [`UrlString`] (as a [`Cow`]) with trailing slash removed.
///
/// This matches the semantics of [`Url::pop_if_empty`], which will not trim a trailing slash if
/// it's the only path segment, e.g., `https://example.com/` would be unchanged.
#[must_use]
pub fn without_trailing_slash(&self) -> Cow<'_, Self> {
self.as_ref()
.strip_suffix('/')
.filter(|path| {
// Only strip the trailing slash if there's _another_ slash that isn't
// part of the scheme.
path.split_once("://")
.map(|(_scheme, rest)| rest)
.unwrap_or(path)
.contains('/')
})
.map(|path| Cow::Owned(UrlString(SmallString::from(path))))
.unwrap_or(Cow::Borrowed(self))
}
}
@ -252,16 +270,51 @@ mod tests {
#[test]
fn without_fragment() {
// Borrows a URL without a fragment
let url = UrlString("https://example.com/path".into());
assert_eq!(&*url.without_fragment(), &url);
assert!(matches!(url.without_fragment(), Cow::Borrowed(_)));
// Removes the fragment if present on the URL
let url = UrlString("https://example.com/path?query#fragment".into());
assert_eq!(
url.without_fragment(),
UrlString("https://example.com/path?query".into())
&*url.without_fragment(),
&UrlString("https://example.com/path?query".into())
);
assert!(matches!(url.without_fragment(), Cow::Owned(_)));
}
let url = UrlString("https://example.com/path#fragment".into());
assert_eq!(url.base_str(), "https://example.com/path");
#[test]
fn without_trailing_slash() {
// Borrows a URL without a slash
let url = UrlString("https://example.com/path".into());
assert_eq!(url.base_str(), "https://example.com/path");
assert_eq!(&*url.without_trailing_slash(), &url);
assert!(matches!(url.without_trailing_slash(), Cow::Borrowed(_)));
// Removes the trailing slash if present on the URL
let url = UrlString("https://example.com/path/".into());
assert_eq!(
&*url.without_trailing_slash(),
&UrlString("https://example.com/path".into())
);
assert!(matches!(url.without_trailing_slash(), Cow::Owned(_)));
// Does not remove a trailing slash if it's the only path segment
let url = UrlString("https://example.com/".into());
assert_eq!(&*url.without_trailing_slash(), &url);
assert!(matches!(url.without_trailing_slash(), Cow::Borrowed(_)));
// Does not remove a trailing slash if it's the only path segment with a missing scheme
let url = UrlString("example.com/".into());
assert_eq!(&*url.without_trailing_slash(), &url);
assert!(matches!(url.without_trailing_slash(), Cow::Borrowed(_)));
// Removes the trailing slash when the scheme is missing
let url = UrlString("example.com/path/".into());
assert_eq!(
&*url.without_trailing_slash(),
&UrlString("example.com/path".into())
);
assert!(matches!(url.without_trailing_slash(), Cow::Owned(_)));
}
}
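
The new `Cow`-returning helpers avoid an allocation when the URL is already in canonical form. A minimal standalone sketch of the same borrow-or-allocate behavior, written against plain `&str` so it runs without the uv crates:

```rust
use std::borrow::Cow;

/// Strip one trailing slash, unless it is the only slash outside the scheme.
fn without_trailing_slash(url: &str) -> Cow<'_, str> {
    url.strip_suffix('/')
        .filter(|path| {
            // Only strip when another slash remains outside the scheme, so a
            // bare origin like `https://example.com/` is left untouched.
            path.split_once("://")
                .map(|(_scheme, rest)| rest)
                .unwrap_or(path)
                .contains('/')
        })
        .map(|path| Cow::Owned(path.to_string()))
        .unwrap_or(Cow::Borrowed(url))
}

fn main() {
    assert!(matches!(
        without_trailing_slash("https://example.com/"),
        Cow::Borrowed(_)
    ));
    assert_eq!(
        without_trailing_slash("https://example.com/simple/"),
        "https://example.com/simple"
    );
}
```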

View file

@ -12,6 +12,7 @@ use url::{ParseError, Url};
use uv_pep508::{Scheme, VerbatimUrl, VerbatimUrlError, split_scheme};
use uv_redacted::DisplaySafeUrl;
use uv_warnings::warn_user;
use crate::{Index, IndexStatusCodeStrategy, Verbatim};
@ -37,6 +38,8 @@ impl IndexUrl {
///
/// If no root directory is provided, relative paths are resolved against the current working
/// directory.
///
/// Normalizes non-file URLs by removing trailing slashes for consistency.
pub fn parse(path: &str, root_dir: Option<&Path>) -> Result<Self, IndexUrlError> {
let url = match split_scheme(path) {
Some((scheme, ..)) => {
@ -92,20 +95,15 @@ impl IndexUrl {
#[cfg(feature = "schemars")]
impl schemars::JsonSchema for IndexUrl {
fn schema_name() -> String {
"IndexUrl".to_string()
fn schema_name() -> Cow<'static, str> {
Cow::Borrowed("IndexUrl")
}
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema {
schemars::schema::SchemaObject {
instance_type: Some(schemars::schema::InstanceType::String.into()),
metadata: Some(Box::new(schemars::schema::Metadata {
description: Some("The URL of an index to use for fetching packages (e.g., `https://pypi.org/simple`), or a local path.".to_string()),
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::json_schema!({
"type": "string",
"description": "The URL of an index to use for fetching packages (e.g., `https://pypi.org/simple`), or a local path."
})
}
}
@ -140,6 +138,30 @@ impl IndexUrl {
Cow::Owned(url)
}
}
/// Warn user if the given URL was provided as an ambiguous relative path.
///
/// This is a temporary warning. Ambiguous values will not be
/// accepted in the future.
pub fn warn_on_disambiguated_relative_path(&self) {
let Self::Path(verbatim_url) = &self else {
return;
};
if let Some(path) = verbatim_url.given() {
if !is_disambiguated_path(path) {
if cfg!(windows) {
warn_user!(
"Relative paths passed to `--index` or `--default-index` should be disambiguated from index names (use `.\\{path}` or `./{path}`). Support for ambiguous values will be removed in the future"
);
} else {
warn_user!(
"Relative paths passed to `--index` or `--default-index` should be disambiguated from index names (use `./{path}`). Support for ambiguous values will be removed in the future"
);
}
}
}
}
}
impl Display for IndexUrl {
@ -162,6 +184,28 @@ impl Verbatim for IndexUrl {
}
}
/// Checks if a path is disambiguated.
///
/// Disambiguated paths are absolute paths, paths with valid schemes,
/// and paths starting with "./" or "../" on Unix or ".\\", "..\\",
/// "./", or "../" on Windows.
fn is_disambiguated_path(path: &str) -> bool {
if cfg!(windows) {
if path.starts_with(".\\") || path.starts_with("..\\") || path.starts_with('/') {
return true;
}
}
if path.starts_with("./") || path.starts_with("../") || Path::new(path).is_absolute() {
return true;
}
// Check if the path has a scheme (like `file://`)
if let Some((scheme, _)) = split_scheme(path) {
return Scheme::parse(scheme).is_some();
}
// This is an ambiguous relative path
false
}
/// An error that can occur when parsing an [`IndexUrl`].
#[derive(Error, Debug)]
pub enum IndexUrlError {
@ -214,13 +258,20 @@ impl<'de> serde::de::Deserialize<'de> for IndexUrl {
}
impl From<VerbatimUrl> for IndexUrl {
fn from(url: VerbatimUrl) -> Self {
fn from(mut url: VerbatimUrl) -> Self {
if url.scheme() == "file" {
Self::Path(Arc::new(url))
} else if *url.raw() == *PYPI_URL {
Self::Pypi(Arc::new(url))
} else {
Self::Url(Arc::new(url))
// Remove trailing slashes for consistency. They'll be re-added if necessary when
// querying the Simple API.
if let Ok(mut path_segments) = url.raw_mut().path_segments_mut() {
path_segments.pop_if_empty();
}
if *url.raw() == *PYPI_URL {
Self::Pypi(Arc::new(url))
} else {
Self::Url(Arc::new(url))
}
}
}
}
@ -411,6 +462,19 @@ impl<'a> IndexLocations {
indexes
}
}
/// Add all authenticated sources to the cache.
pub fn cache_index_credentials(&self) {
for index in self.allowed_indexes() {
if let Some(credentials) = index.credentials() {
let credentials = Arc::new(credentials);
uv_auth::store_credentials(index.raw_url(), credentials.clone());
if let Some(root_url) = index.root_url() {
uv_auth::store_credentials(&root_url, credentials.clone());
}
}
}
}
}
impl From<&IndexLocations> for uv_auth::Indexes {
@ -511,30 +575,23 @@ impl<'a> IndexUrls {
/// iterator.
pub fn defined_indexes(&'a self) -> impl Iterator<Item = &'a Index> + 'a {
if self.no_index {
Either::Left(std::iter::empty())
} else {
Either::Right(
{
let mut seen = FxHashSet::default();
self.indexes
.iter()
.filter(move |index| {
index.name.as_ref().is_none_or(|name| seen.insert(name))
})
.filter(|index| !index.default)
}
.chain({
let mut seen = FxHashSet::default();
self.indexes
.iter()
.filter(move |index| {
index.name.as_ref().is_none_or(|name| seen.insert(name))
})
.find(|index| index.default)
.into_iter()
}),
)
return Either::Left(std::iter::empty());
}
let mut seen = FxHashSet::default();
let (non_default, default) = self
.indexes
.iter()
.filter(move |index| {
if let Some(name) = &index.name {
seen.insert(name)
} else {
true
}
})
.partition::<Vec<_>, _>(|index| !index.default);
Either::Right(non_default.into_iter().chain(default))
}
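
The rewritten `defined_indexes` now walks the index list once: deduplicate by name, then partition into non-default and default entries so the default index sorts last. A self-contained sketch of that idiom with plain tuples standing in for `Index` values (every entry here is named, unlike the real code, which only deduplicates named indexes):

```rust
use std::collections::HashSet;

fn main() {
    // (name, is_default) pairs standing in for `Index` entries.
    let indexes = [
        ("torch", false),
        ("internal", false),
        ("torch", true), // Duplicate name: dropped by the dedup pass.
        ("pypi", true),
    ];

    let mut seen = HashSet::new();
    let (non_default, default): (Vec<_>, Vec<_>) = indexes
        .into_iter()
        .filter(|(name, _)| seen.insert(*name))
        .partition(|(_, is_default)| !*is_default);

    // Non-default indexes first, the default index last.
    let ordered: Vec<_> = non_default.into_iter().chain(default).collect();
    assert_eq!(
        ordered,
        [("torch", false), ("internal", false), ("pypi", true)]
    );
}
```
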
/// Return the `--no-index` flag.
@ -632,3 +689,41 @@ impl IndexCapabilities {
.insert(Flags::FORBIDDEN);
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_index_url_parse_valid_paths() {
// Absolute path
assert!(is_disambiguated_path("/absolute/path"));
// Relative path
assert!(is_disambiguated_path("./relative/path"));
assert!(is_disambiguated_path("../../relative/path"));
if cfg!(windows) {
// Windows absolute path
assert!(is_disambiguated_path("C:/absolute/path"));
// Windows relative path
assert!(is_disambiguated_path(".\\relative\\path"));
assert!(is_disambiguated_path("..\\..\\relative\\path"));
}
}
#[test]
fn test_index_url_parse_ambiguous_paths() {
// Test single-segment ambiguous path
assert!(!is_disambiguated_path("index"));
// Test multi-segment ambiguous path
assert!(!is_disambiguated_path("relative/path"));
}
#[test]
fn test_index_url_parse_with_schemes() {
assert!(is_disambiguated_path("file:///absolute/path"));
assert!(is_disambiguated_path("https://registry.com/simple/"));
assert!(is_disambiguated_path(
"git+https://github.com/example/repo.git"
));
}
}
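
The `From<VerbatimUrl>` change above relies on `Url::path_segments_mut` and `pop_if_empty` to drop a trailing slash without ever touching the root path. A small sketch of that behavior using the `url` crate directly (assuming it is available, as it already is elsewhere in the workspace):

```rust
use url::Url;

fn main() -> Result<(), url::ParseError> {
    // A nested path loses its trailing slash.
    let mut index = Url::parse("https://example.com/simple/")?;
    if let Ok(mut segments) = index.path_segments_mut() {
        segments.pop_if_empty();
    }
    assert_eq!(index.as_str(), "https://example.com/simple");

    // The root path keeps its slash: the empty segment *is* the root.
    let mut root = Url::parse("https://example.com/")?;
    if let Ok(mut segments) = root.path_segments_mut() {
        segments.pop_if_empty();
    }
    assert_eq!(root.as_str(), "https://example.com/");
    Ok(())
}
```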

View file

@ -73,6 +73,7 @@ pub use crate::pip_index::*;
pub use crate::prioritized_distribution::*;
pub use crate::requested::*;
pub use crate::requirement::*;
pub use crate::requires_python::*;
pub use crate::resolution::*;
pub use crate::resolved::*;
pub use crate::specified_requirement::*;
@ -100,6 +101,7 @@ mod pip_index;
mod prioritized_distribution;
mod requested;
mod requirement;
mod requires_python;
mod resolution;
mod resolved;
mod specified_requirement;

View file

@ -3,6 +3,8 @@
//! flags set.
use serde::{Deserialize, Deserializer, Serialize};
#[cfg(feature = "schemars")]
use std::borrow::Cow;
use std::path::Path;
use crate::{Index, IndexUrl};
@ -50,14 +52,14 @@ macro_rules! impl_index {
#[cfg(feature = "schemars")]
impl schemars::JsonSchema for $name {
fn schema_name() -> String {
fn schema_name() -> Cow<'static, str> {
IndexUrl::schema_name()
}
fn json_schema(
r#gen: &mut schemars::r#gen::SchemaGenerator,
) -> schemars::schema::Schema {
IndexUrl::json_schema(r#gen)
generator: &mut schemars::generate::SchemaGenerator,
) -> schemars::Schema {
IndexUrl::json_schema(generator)
}
}
};

View file

@ -11,7 +11,7 @@ use uv_platform_tags::{AbiTag, IncompatibleTag, LanguageTag, PlatformTag, TagPri
use uv_pypi_types::{HashDigest, Yanked};
use crate::{
InstalledDist, KnownPlatform, RegistryBuiltDist, RegistryBuiltWheel, RegistrySourceDist,
File, InstalledDist, KnownPlatform, RegistryBuiltDist, RegistryBuiltWheel, RegistrySourceDist,
ResolvedDistRef,
};
@ -557,6 +557,20 @@ impl PrioritizedDist {
self.0.best_wheel_index.map(|i| &self.0.wheels[i])
}
/// Returns an iterator of all wheels and the source distribution, if any.
pub fn files(&self) -> impl Iterator<Item = &File> {
self.0
.wheels
.iter()
.map(|(wheel, _)| wheel.file.as_ref())
.chain(
self.0
.source
.as_ref()
.map(|(source_dist, _)| source_dist.file.as_ref()),
)
}
/// Returns an iterator over all Python tags for the distribution.
pub fn python_tags(&self) -> impl Iterator<Item = LanguageTag> + '_ {
self.0
@ -817,7 +831,7 @@ pub fn implied_markers(filename: &WheelFilename) -> MarkerTree {
tag_marker.and(MarkerTree::expression(MarkerExpression::String {
key: MarkerValueString::PlatformMachine,
operator: MarkerOperator::Equal,
value: arcstr::literal!("x86_64"),
value: arcstr::literal!("AMD64"),
}));
marker.or(tag_marker);
}
@ -911,7 +925,7 @@ mod tests {
);
assert_markers(
"numpy-2.2.1-cp313-cp313t-win_amd64.whl",
"sys_platform == 'win32' and platform_machine == 'x86_64'",
"sys_platform == 'win32' and platform_machine == 'AMD64'",
);
assert_markers(
"numpy-2.2.1-cp313-cp313t-win_arm64.whl",

View file

@ -1,6 +1,6 @@
use std::collections::Bound;
use pubgrub::Range;
use version_ranges::Ranges;
use uv_distribution_filename::WheelFilename;
use uv_pep440::{
@ -66,15 +66,8 @@ impl RequiresPython {
) -> Option<Self> {
// Convert to PubGrub range and perform an intersection.
let range = specifiers
.into_iter()
.map(|specifier| release_specifiers_to_ranges(specifier.clone()))
.fold(None, |range: Option<Range<Version>>, requires_python| {
if let Some(range) = range {
Some(range.intersection(&requires_python))
} else {
Some(requires_python)
}
})?;
.map(|specs| release_specifiers_to_ranges(specs.clone()))
.reduce(|acc, r| acc.intersection(&r))?;
// If the intersection is empty, return `None`.
if range.is_empty() {
@ -97,12 +90,12 @@ impl RequiresPython {
pub fn split(&self, bound: Bound<Version>) -> Option<(Self, Self)> {
let RequiresPythonRange(.., upper) = &self.range;
let upper = Range::from_range_bounds((bound, upper.clone().into()));
let upper = Ranges::from_range_bounds((bound, upper.clone().into()));
let lower = upper.complement();
// Intersect left and right with the existing range.
let lower = lower.intersection(&Range::from(self.range.clone()));
let upper = upper.intersection(&Range::from(self.range.clone()));
let lower = lower.intersection(&Ranges::from(self.range.clone()));
let upper = upper.intersection(&Ranges::from(self.range.clone()));
if lower.is_empty() || upper.is_empty() {
None
@ -353,7 +346,7 @@ impl RequiresPython {
/// a lock file are deserialized and turned into a `ResolutionGraph`, the
/// markers are "complexified" to put the `requires-python` assumption back
/// into the marker explicitly.
pub(crate) fn simplify_markers(&self, marker: MarkerTree) -> MarkerTree {
pub fn simplify_markers(&self, marker: MarkerTree) -> MarkerTree {
let (lower, upper) = (self.range().lower(), self.range().upper());
marker.simplify_python_versions(lower.as_ref(), upper.as_ref())
}
@ -373,7 +366,7 @@ impl RequiresPython {
/// ```text
/// python_full_version >= '3.8' and python_full_version < '3.12'
/// ```
pub(crate) fn complexify_markers(&self, marker: MarkerTree) -> MarkerTree {
pub fn complexify_markers(&self, marker: MarkerTree) -> MarkerTree {
let (lower, upper) = (self.range().lower(), self.range().upper());
marker.complexify_python_versions(lower.as_ref(), upper.as_ref())
}
@ -537,7 +530,7 @@ pub struct RequiresPythonRange(LowerBound, UpperBound);
impl RequiresPythonRange {
/// Initialize a [`RequiresPythonRange`] from a [`Range`].
pub fn from_range(range: &Range<Version>) -> Self {
pub fn from_range(range: &Ranges<Version>) -> Self {
let (lower, upper) = range
.bounding_range()
.map(|(lower_bound, upper_bound)| (lower_bound.cloned(), upper_bound.cloned()))
@ -575,9 +568,9 @@ impl Default for RequiresPythonRange {
}
}
impl From<RequiresPythonRange> for Range<Version> {
impl From<RequiresPythonRange> for Ranges<Version> {
fn from(value: RequiresPythonRange) -> Self {
Range::from_range_bounds::<(Bound<Version>, Bound<Version>), _>((
Ranges::from_range_bounds::<(Bound<Version>, Bound<Version>), _>((
value.0.into(),
value.1.into(),
))
@ -592,21 +585,18 @@ impl From<RequiresPythonRange> for Range<Version> {
/// a simplified marker, one must re-contextualize it by adding the
/// `requires-python` constraint back to the marker.
#[derive(Clone, Copy, Debug, Default, Eq, PartialEq, PartialOrd, Ord, serde::Deserialize)]
pub(crate) struct SimplifiedMarkerTree(MarkerTree);
pub struct SimplifiedMarkerTree(MarkerTree);
impl SimplifiedMarkerTree {
/// Simplifies the given markers by assuming the given `requires-python`
/// bound is true.
pub(crate) fn new(
requires_python: &RequiresPython,
marker: MarkerTree,
) -> SimplifiedMarkerTree {
pub fn new(requires_python: &RequiresPython, marker: MarkerTree) -> SimplifiedMarkerTree {
SimplifiedMarkerTree(requires_python.simplify_markers(marker))
}
/// Complexifies the given markers by adding the given `requires-python` as
/// a constraint to these simplified markers.
pub(crate) fn into_marker(self, requires_python: &RequiresPython) -> MarkerTree {
pub fn into_marker(self, requires_python: &RequiresPython) -> MarkerTree {
requires_python.complexify_markers(self.0)
}
@ -614,12 +604,12 @@ impl SimplifiedMarkerTree {
///
/// This only returns `None` when the underlying marker is always true,
/// i.e., it matches all possible marker environments.
pub(crate) fn try_to_string(self) -> Option<String> {
pub fn try_to_string(self) -> Option<String> {
self.0.try_to_string()
}
/// Returns the underlying marker tree without re-complexifying them.
pub(crate) fn as_simplified_marker_tree(self) -> MarkerTree {
pub fn as_simplified_marker_tree(self) -> MarkerTree {
self.0
}
}
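
The `intersect` refactor above replaces a manual `fold` over `Option<Range>` with `reduce`, intersecting all per-specifier ranges and returning `None` when there are none. A minimal sketch of the same shape using `version_ranges::Ranges` over `u32` (a stand-in for `Version`, purely to keep the example self-contained; the constructor names are the ones used elsewhere in these diffs):

```rust
use version_ranges::Ranges;

fn main() {
    // Hypothetical per-specifier ranges, e.g. `>= 8` and `< 13`.
    let specifiers = [
        Ranges::higher_than(8u32),
        Ranges::strictly_lower_than(13u32),
    ];

    // Intersect them all, or get `None` when there are no specifiers at all.
    let intersection = specifiers
        .into_iter()
        .reduce(|acc, range| acc.intersection(&range));

    assert!(intersection.is_some_and(|range| !range.is_empty()));
}
```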

View file

@ -1,3 +1,5 @@
#[cfg(feature = "schemars")]
use std::borrow::Cow;
use std::ops::Deref;
use http::StatusCode;
@ -136,17 +138,17 @@ impl<'de> Deserialize<'de> for SerializableStatusCode {
#[cfg(feature = "schemars")]
impl schemars::JsonSchema for SerializableStatusCode {
fn schema_name() -> String {
"StatusCode".to_string()
fn schema_name() -> Cow<'static, str> {
Cow::Borrowed("StatusCode")
}
fn json_schema(r#gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema {
let mut schema = r#gen.subschema_for::<u16>().into_object();
schema.metadata().description = Some("HTTP status code (100-599)".to_string());
schema.number().minimum = Some(100.0);
schema.number().maximum = Some(599.0);
schema.into()
fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::json_schema!({
"type": "number",
"minimum": 100,
"maximum": 599,
"description": "HTTP status code (100-599)"
})
}
}

View file

@ -644,8 +644,8 @@ impl<'a, Context: BuildContext> DistributionDatabase<'a, Context> {
})
.await
.map_err(|err| match err {
CachedClientError::Callback(err) => err,
CachedClientError::Client(err) => Error::Client(err),
CachedClientError::Callback { err, .. } => err,
CachedClientError::Client { err, .. } => Error::Client(err),
})?;
// If the archive is missing the required hashes, or has since been removed, force a refresh.
@ -663,8 +663,8 @@ impl<'a, Context: BuildContext> DistributionDatabase<'a, Context> {
.skip_cache_with_retry(self.request(url)?, &http_entry, download)
.await
.map_err(|err| match err {
CachedClientError::Callback(err) => err,
CachedClientError::Client(err) => Error::Client(err),
CachedClientError::Callback { err, .. } => err,
CachedClientError::Client { err, .. } => Error::Client(err),
})
})
.await?
@ -811,8 +811,8 @@ impl<'a, Context: BuildContext> DistributionDatabase<'a, Context> {
})
.await
.map_err(|err| match err {
CachedClientError::Callback(err) => err,
CachedClientError::Client(err) => Error::Client(err),
CachedClientError::Callback { err, .. } => err,
CachedClientError::Client { err, .. } => Error::Client(err),
})?;
// If the archive is missing the required hashes, or has since been removed, force a refresh.
@ -830,8 +830,8 @@ impl<'a, Context: BuildContext> DistributionDatabase<'a, Context> {
.skip_cache_with_retry(self.request(url)?, &http_entry, download)
.await
.map_err(|err| match err {
CachedClientError::Callback(err) => err,
CachedClientError::Client(err) => Error::Client(err),
CachedClientError::Callback { err, .. } => err,
CachedClientError::Client { err, .. } => Error::Client(err),
})
})
.await?

View file

@ -108,6 +108,8 @@ pub enum Error {
CacheHeal(String, HashAlgorithm),
#[error("The source distribution requires Python {0}, but {1} is installed")]
RequiresPython(VersionSpecifiers, Version),
#[error("Failed to identify base Python interpreter")]
BaseInterpreter(#[source] std::io::Error),
/// A generic request middleware error happened while making a request.
/// Refer to the error message for more details.

View file

@ -4,7 +4,7 @@ pub use error::Error;
pub use index::{BuiltWheelIndex, RegistryWheelIndex};
pub use metadata::{
ArchiveMetadata, BuildRequires, FlatRequiresDist, LoweredRequirement, LoweringError, Metadata,
MetadataError, RequiresDist,
MetadataError, RequiresDist, SourcedDependencyGroups,
};
pub use reporter::Reporter;
pub use source::prune;

View file

@ -0,0 +1,208 @@
use std::collections::BTreeMap;
use std::path::{Path, PathBuf};
use uv_configuration::SourceStrategy;
use uv_distribution_types::{IndexLocations, Requirement};
use uv_normalize::{GroupName, PackageName};
use uv_workspace::dependency_groups::FlatDependencyGroups;
use uv_workspace::pyproject::{Sources, ToolUvSources};
use uv_workspace::{
DiscoveryOptions, MemberDiscovery, VirtualProject, WorkspaceCache, WorkspaceError,
};
use crate::metadata::{GitWorkspaceMember, LoweredRequirement, MetadataError};
/// Like [`crate::RequiresDist`] but only supporting dependency-groups.
///
/// PEP 735 says:
///
/// > A pyproject.toml file with only `[dependency-groups]` and no other tables is valid.
///
/// This is a special carveout to enable users to adopt dependency-groups without having
/// to learn about projects. It is supported by `pip install --group`, and thus interfaces
/// like `uv pip install --group` must also support it for interop and conformance.
///
/// On paper this is trivial to support because dependency-groups are so self-contained
/// that they're basically a `requirements.txt` embedded within a pyproject.toml, so it's
/// fine to just grab that section and handle it independently.
///
/// However several uv extensions make this complicated, notably, as of this writing:
///
/// * tool.uv.sources
/// * tool.uv.index
///
/// These fields may also be present in the pyproject.toml, and, critically,
/// may be defined and inherited in a parent workspace pyproject.toml.
///
/// Therefore, we need to gracefully degrade from a full workspacey situation all
/// the way down to one of these stub pyproject.tomls the PEP defines. This is why
/// we avoid going through `RequiresDist` -- we don't want to muddy up the "compile a package"
/// logic with support for non-project/workspace pyproject.tomls, and we don't want to
/// muddy this logic up with setuptools fallback modes that `RequiresDist` wants.
///
/// (We used to shove this feature into that path, and then we would see there's no metadata
/// and try to run setuptools to try to desperately find any metadata, and then error out.)
#[derive(Debug, Clone)]
pub struct SourcedDependencyGroups {
pub name: Option<PackageName>,
pub dependency_groups: BTreeMap<GroupName, Box<[Requirement]>>,
}
impl SourcedDependencyGroups {
/// Lower by considering `tool.uv` in `pyproject.toml` if present, used for Git and directory
/// dependencies.
pub async fn from_virtual_project(
pyproject_path: &Path,
git_member: Option<&GitWorkspaceMember<'_>>,
locations: &IndexLocations,
source_strategy: SourceStrategy,
cache: &WorkspaceCache,
) -> Result<Self, MetadataError> {
let discovery = DiscoveryOptions {
stop_discovery_at: git_member.map(|git_member| {
git_member
.fetch_root
.parent()
.expect("git checkout has a parent")
.to_path_buf()
}),
members: match source_strategy {
SourceStrategy::Enabled => MemberDiscovery::default(),
SourceStrategy::Disabled => MemberDiscovery::None,
},
};
// The subsequent API takes an absolute path to the dir the pyproject is in
let empty = PathBuf::new();
let absolute_pyproject_path =
std::path::absolute(pyproject_path).map_err(WorkspaceError::Normalize)?;
let project_dir = absolute_pyproject_path.parent().unwrap_or(&empty);
let project = VirtualProject::discover_defaulted(project_dir, &discovery, cache).await?;
// Collect the dependency groups.
let dependency_groups =
FlatDependencyGroups::from_pyproject_toml(project.root(), project.pyproject_toml())?;
// If sources/indexes are disabled we can just stop here
let SourceStrategy::Enabled = source_strategy else {
return Ok(Self {
name: project.project_name().cloned(),
dependency_groups: dependency_groups
.into_iter()
.map(|(name, group)| {
let requirements = group
.requirements
.into_iter()
.map(Requirement::from)
.collect();
(name, requirements)
})
.collect(),
});
};
// Collect any `tool.uv.index` entries.
let empty = vec![];
let project_indexes = project
.pyproject_toml()
.tool
.as_ref()
.and_then(|tool| tool.uv.as_ref())
.and_then(|uv| uv.index.as_deref())
.unwrap_or(&empty);
// Collect any `tool.uv.sources` and `tool.uv.dev_dependencies` from `pyproject.toml`.
let empty = BTreeMap::default();
let project_sources = project
.pyproject_toml()
.tool
.as_ref()
.and_then(|tool| tool.uv.as_ref())
.and_then(|uv| uv.sources.as_ref())
.map(ToolUvSources::inner)
.unwrap_or(&empty);
// Now that we've resolved the dependency groups, we can validate that each source references
// a valid extra or group, if present.
Self::validate_sources(project_sources, &dependency_groups)?;
// Lower the dependency groups.
let dependency_groups = dependency_groups
.into_iter()
.map(|(name, group)| {
let requirements = group
.requirements
.into_iter()
.flat_map(|requirement| {
let requirement_name = requirement.name.clone();
let group = name.clone();
let extra = None;
LoweredRequirement::from_requirement(
requirement,
project.project_name(),
project.root(),
project_sources,
project_indexes,
extra,
Some(&group),
locations,
project.workspace(),
git_member,
)
.map(move |requirement| match requirement {
Ok(requirement) => Ok(requirement.into_inner()),
Err(err) => Err(MetadataError::GroupLoweringError(
group.clone(),
requirement_name.clone(),
Box::new(err),
)),
})
})
.collect::<Result<Box<_>, _>>()?;
Ok::<(GroupName, Box<_>), MetadataError>((name, requirements))
})
.collect::<Result<BTreeMap<_, _>, _>>()?;
Ok(Self {
name: project.project_name().cloned(),
dependency_groups,
})
}
/// Validate the sources.
///
/// If a source is requested with `group`, ensure that the relevant dependency is
/// present in the relevant `dependency-groups` section.
fn validate_sources(
sources: &BTreeMap<PackageName, Sources>,
dependency_groups: &FlatDependencyGroups,
) -> Result<(), MetadataError> {
for (name, sources) in sources {
for source in sources.iter() {
if let Some(group) = source.group() {
// If the group doesn't exist at all, error.
let Some(flat_group) = dependency_groups.get(group) else {
return Err(MetadataError::MissingSourceGroup(
name.clone(),
group.clone(),
));
};
// If there is no such requirement with the group, error.
if !flat_group
.requirements
.iter()
.any(|requirement| requirement.name == *name)
{
return Err(MetadataError::IncompleteSourceGroup(
name.clone(),
group.clone(),
));
}
}
}
}
Ok(())
}
}
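
A hedged sketch of how a `uv pip install --group`-style caller might drive this API, based only on the signature above; it assumes `IndexLocations` and `WorkspaceCache` provide `Default` and that `GroupName` implements `Display`, none of which is shown in this diff:

```rust
use std::path::Path;

use uv_configuration::SourceStrategy;
use uv_distribution::{MetadataError, SourcedDependencyGroups};
use uv_distribution_types::IndexLocations;
use uv_workspace::WorkspaceCache;

/// List the dependency groups of a (possibly stub) `pyproject.toml`.
async fn list_groups(pyproject: &Path) -> Result<(), MetadataError> {
    let groups = SourcedDependencyGroups::from_virtual_project(
        pyproject,
        None, // Not a Git workspace member.
        &IndexLocations::default(),
        SourceStrategy::Enabled,
        &WorkspaceCache::default(),
    )
    .await?;

    for (group, requirements) in &groups.dependency_groups {
        println!("{group}: {} requirement(s)", requirements.len());
    }
    Ok(())
}
```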

View file

@ -13,7 +13,7 @@ use uv_git_types::{GitReference, GitUrl, GitUrlParseError};
use uv_normalize::{ExtraName, GroupName, PackageName};
use uv_pep440::VersionSpecifiers;
use uv_pep508::{MarkerTree, VerbatimUrl, VersionOrUrl, looks_like_git_repository};
use uv_pypi_types::{ConflictItem, ParsedUrlError, VerbatimParsedUrl};
use uv_pypi_types::{ConflictItem, ParsedGitUrl, ParsedUrlError, VerbatimParsedUrl};
use uv_redacted::DisplaySafeUrl;
use uv_workspace::Workspace;
use uv_workspace::pyproject::{PyProjectToml, Source, Sources};
@ -700,17 +700,23 @@ fn path_source(
};
if is_dir {
if let Some(git_member) = git_member {
let git = git_member.git_source.git.clone();
let subdirectory = uv_fs::relative_to(install_path, git_member.fetch_root)
.expect("Workspace member must be relative");
let subdirectory = uv_fs::normalize_path_buf(subdirectory);
let subdirectory = if subdirectory == PathBuf::new() {
None
} else {
Some(subdirectory.into_boxed_path())
};
let url = DisplaySafeUrl::from(ParsedGitUrl {
url: git.clone(),
subdirectory: subdirectory.clone(),
});
return Ok(RequirementSource::Git {
git: git_member.git_source.git.clone(),
subdirectory: if subdirectory == PathBuf::new() {
None
} else {
Some(subdirectory.into_boxed_path())
},
url,
git,
subdirectory,
url: VerbatimUrl::from_url(url),
});
}

View file

@ -12,11 +12,13 @@ use uv_workspace::dependency_groups::DependencyGroupError;
use uv_workspace::{WorkspaceCache, WorkspaceError};
pub use crate::metadata::build_requires::BuildRequires;
pub use crate::metadata::dependency_groups::SourcedDependencyGroups;
pub use crate::metadata::lowering::LoweredRequirement;
pub use crate::metadata::lowering::LoweringError;
pub use crate::metadata::requires_dist::{FlatRequiresDist, RequiresDist};
mod build_requires;
mod dependency_groups;
mod lowering;
mod requires_dist;

View file

@ -6,7 +6,7 @@ use rustc_hash::FxHashSet;
use uv_configuration::SourceStrategy;
use uv_distribution_types::{IndexLocations, Requirement};
use uv_normalize::{DEV_DEPENDENCIES, ExtraName, GroupName, PackageName};
use uv_normalize::{ExtraName, GroupName, PackageName};
use uv_pep508::MarkerTree;
use uv_workspace::dependency_groups::FlatDependencyGroups;
use uv_workspace::pyproject::{Sources, ToolUvSources};
@ -107,41 +107,10 @@ impl RequiresDist {
SourceStrategy::Disabled => &empty,
};
// Collect the dependency groups.
let dependency_groups = {
// First, collect `tool.uv.dev_dependencies`
let dev_dependencies = project_workspace
.current_project()
.pyproject_toml()
.tool
.as_ref()
.and_then(|tool| tool.uv.as_ref())
.and_then(|uv| uv.dev_dependencies.as_ref());
// Then, collect `dependency-groups`
let dependency_groups = project_workspace
.current_project()
.pyproject_toml()
.dependency_groups
.iter()
.flatten()
.collect::<BTreeMap<_, _>>();
// Flatten the dependency groups.
let mut dependency_groups =
FlatDependencyGroups::from_dependency_groups(&dependency_groups)
.map_err(|err| err.with_dev_dependencies(dev_dependencies))?;
// Add the `dev` group, if `dev-dependencies` is defined.
if let Some(dev_dependencies) = dev_dependencies {
dependency_groups
.entry(DEV_DEPENDENCIES.clone())
.or_insert_with(Vec::new)
.extend(dev_dependencies.clone());
}
dependency_groups
};
let dependency_groups = FlatDependencyGroups::from_pyproject_toml(
project_workspace.current_project().root(),
project_workspace.current_project().pyproject_toml(),
)?;
// Now that we've resolved the dependency groups, we can validate that each source references
// a valid extra or group, if present.
@ -150,9 +119,10 @@ impl RequiresDist {
// Lower the dependency groups.
let dependency_groups = dependency_groups
.into_iter()
.map(|(name, requirements)| {
.map(|(name, flat_group)| {
let requirements = match source_strategy {
SourceStrategy::Enabled => requirements
SourceStrategy::Enabled => flat_group
.requirements
.into_iter()
.flat_map(|requirement| {
let requirement_name = requirement.name.clone();
@ -182,9 +152,11 @@ impl RequiresDist {
)
})
.collect::<Result<Box<_>, _>>(),
SourceStrategy::Disabled => {
Ok(requirements.into_iter().map(Requirement::from).collect())
}
SourceStrategy::Disabled => Ok(flat_group
.requirements
.into_iter()
.map(Requirement::from)
.collect()),
}?;
Ok::<(GroupName, Box<_>), MetadataError>((name, requirements))
})
@ -265,7 +237,7 @@ impl RequiresDist {
if let Some(group) = source.group() {
// If the group doesn't exist at all, error.
let Some(dependencies) = dependency_groups.get(group) else {
let Some(flat_group) = dependency_groups.get(group) else {
return Err(MetadataError::MissingSourceGroup(
name.clone(),
group.clone(),
@ -273,7 +245,8 @@ impl RequiresDist {
};
// If there is no such requirement with the group, error.
if !dependencies
if !flat_group
.requirements
.iter()
.any(|requirement| requirement.name == *name)
{

View file

@ -43,7 +43,7 @@ use uv_normalize::PackageName;
use uv_pep440::{Version, release_specifiers_to_ranges};
use uv_platform_tags::Tags;
use uv_pypi_types::{HashAlgorithm, HashDigest, HashDigests, PyProjectToml, ResolutionMetadata};
use uv_types::{BuildContext, BuildStack, SourceBuildTrait};
use uv_types::{BuildContext, BuildKey, BuildStack, SourceBuildTrait};
use uv_workspace::pyproject::ToolUvSources;
use crate::distribution_database::ManagedClient;
@ -728,8 +728,8 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
})
.await
.map_err(|err| match err {
CachedClientError::Callback(err) => err,
CachedClientError::Client(err) => Error::Client(err),
CachedClientError::Callback { err, .. } => err,
CachedClientError::Client { err, .. } => Error::Client(err),
})?;
// If the archive is missing the required hashes, force a refresh.
@ -747,8 +747,8 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
)
.await
.map_err(|err| match err {
CachedClientError::Callback(err) => err,
CachedClientError::Client(err) => Error::Client(err),
CachedClientError::Callback { err, .. } => err,
CachedClientError::Client { err, .. } => Error::Client(err),
})
})
.await
@ -1583,7 +1583,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
client
.unmanaged
.uncached_client(resource.git.repository())
.clone(),
.raw_client(),
)
.await
{
@ -1860,13 +1860,22 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
}
};
// If the URL is already precise, return it.
if self.build_context.git().get_precise(git).is_some() {
debug!("Precise commit already known: {source}");
return Ok(());
}
// If this is GitHub URL, attempt to resolve to a precise commit using the GitHub API.
if self
.build_context
.git()
.github_fast_path(
git,
client.unmanaged.uncached_client(git.repository()).clone(),
client
.unmanaged
.uncached_client(git.repository())
.raw_client(),
)
.await?
.is_some()
@ -2084,8 +2093,8 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
)
.await
.map_err(|err| match err {
CachedClientError::Callback(err) => err,
CachedClientError::Client(err) => Error::Client(err),
CachedClientError::Callback { err, .. } => err,
CachedClientError::Client { err, .. } => Error::Client(err),
})
})
.await
@ -2267,6 +2276,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
fs::create_dir_all(&cache_shard)
.await
.map_err(Error::CacheWrite)?;
// Try a direct build if that isn't disabled and the uv build backend is used.
let disk_filename = if let Some(name) = self
.build_context
@ -2287,27 +2297,73 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
// In the uv build backend, the normalized filename and the disk filename are the same.
name.to_string()
} else {
self.build_context
.setup_build(
source_root,
subdirectory,
source_root,
Some(&source.to_string()),
source.as_dist(),
source_strategy,
if source.is_editable() {
BuildKind::Editable
} else {
BuildKind::Wheel
},
BuildOutput::Debug,
self.build_stack.cloned().unwrap_or_default(),
)
.await
.map_err(|err| Error::Build(err.into()))?
.wheel(temp_dir.path())
.await
.map_err(Error::Build)?
// Identify the base Python interpreter to use in the cache key.
let base_python = if cfg!(unix) {
self.build_context
.interpreter()
.find_base_python()
.map_err(Error::BaseInterpreter)?
} else {
self.build_context
.interpreter()
.to_base_python()
.map_err(Error::BaseInterpreter)?
};
let build_kind = if source.is_editable() {
BuildKind::Editable
} else {
BuildKind::Wheel
};
let build_key = BuildKey {
base_python: base_python.into_boxed_path(),
source_root: source_root.to_path_buf().into_boxed_path(),
subdirectory: subdirectory
.map(|subdirectory| subdirectory.to_path_buf().into_boxed_path()),
source_strategy,
build_kind,
};
if let Some(builder) = self.build_context.build_arena().remove(&build_key) {
debug!("Creating build environment for: {source}");
let wheel = builder.wheel(temp_dir.path()).await.map_err(Error::Build)?;
// Store the build context.
self.build_context.build_arena().insert(build_key, builder);
wheel
} else {
debug!("Reusing existing build environment for: {source}");
let builder = self
.build_context
.setup_build(
source_root,
subdirectory,
source_root,
Some(&source.to_string()),
source.as_dist(),
source_strategy,
if source.is_editable() {
BuildKind::Editable
} else {
BuildKind::Wheel
},
BuildOutput::Debug,
self.build_stack.cloned().unwrap_or_default(),
)
.await
.map_err(|err| Error::Build(err.into()))?;
// Build the wheel.
let wheel = builder.wheel(temp_dir.path()).await.map_err(Error::Build)?;
// Store the build context.
self.build_context.build_arena().insert(build_key, builder);
wheel
}
};
// Read the metadata from the wheel.
@ -2362,6 +2418,26 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
}
}
// Identify the base Python interpreter to use in the cache key.
let base_python = if cfg!(unix) {
self.build_context
.interpreter()
.find_base_python()
.map_err(Error::BaseInterpreter)?
} else {
self.build_context
.interpreter()
.to_base_python()
.map_err(Error::BaseInterpreter)?
};
// Determine whether this is an editable or non-editable build.
let build_kind = if source.is_editable() {
BuildKind::Editable
} else {
BuildKind::Wheel
};
// Set up the builder.
let mut builder = self
.build_context
@ -2372,11 +2448,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
Some(&source.to_string()),
source.as_dist(),
source_strategy,
if source.is_editable() {
BuildKind::Editable
} else {
BuildKind::Wheel
},
build_kind,
BuildOutput::Debug,
self.build_stack.cloned().unwrap_or_default(),
)
@ -2385,6 +2457,21 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
// Build the metadata.
let dist_info = builder.metadata().await.map_err(Error::Build)?;
// Store the build context.
self.build_context.build_arena().insert(
BuildKey {
base_python: base_python.into_boxed_path(),
source_root: source_root.to_path_buf().into_boxed_path(),
subdirectory: subdirectory
.map(|subdirectory| subdirectory.to_path_buf().into_boxed_path()),
source_strategy,
build_kind,
},
builder,
);
// Return the `.dist-info` directory, if it exists.
let Some(dist_info) = dist_info else {
return Ok(None);
};
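
The hunks above check a build environment out of the `BuildArena` under a `BuildKey` and check it back in after producing the wheel or metadata, so the next build for the same key can reuse it. A minimal sketch of that check-out/check-in pattern, with a `Mutex<HashMap>` standing in for the arena:

```rust
use std::collections::HashMap;
use std::hash::Hash;
use std::sync::Mutex;

struct Arena<K, V>(Mutex<HashMap<K, V>>);

impl<K: Hash + Eq, V> Arena<K, V> {
    fn new() -> Self {
        Self(Mutex::new(HashMap::new()))
    }
    /// Check an entry out, leaving the slot empty while it is in use.
    fn remove(&self, key: &K) -> Option<V> {
        self.0.lock().unwrap().remove(key)
    }
    /// Check the entry back in for the next consumer.
    fn insert(&self, key: K, value: V) {
        self.0.lock().unwrap().insert(key, value);
    }
}

fn main() {
    let arena: Arena<&str, String> = Arena::new();

    // First build: nothing cached, so a fresh environment is created.
    let env = arena
        .remove(&"build-key")
        .unwrap_or_else(|| String::from("fresh build environment"));
    arena.insert("build-key", env);

    // Second build with the same key reuses the stored environment.
    assert!(arena.remove(&"build-key").is_some());
}
```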

View file

@ -2,11 +2,11 @@ use std::{ffi::OsString, path::PathBuf};
#[derive(Debug, thiserror::Error)]
pub enum Error {
#[error(transparent)]
#[error("Failed to read from zip file")]
Zip(#[from] zip::result::ZipError),
#[error(transparent)]
#[error("Failed to read from zip file")]
AsyncZip(#[from] async_zip::error::ZipError),
#[error(transparent)]
#[error("I/O operation failed during extraction")]
Io(#[from] std::io::Error),
#[error(
"The top-level of the archive must only contain a list directory, but it contains: {0:?}"

View file

@ -16,7 +16,6 @@ doctest = false
workspace = true
[dependencies]
dunce = { workspace = true }
either = { workspace = true }
encoding_rs_io = { workspace = true }

View file

@ -575,8 +575,33 @@ pub fn is_temporary(path: impl AsRef<Path>) -> bool {
.is_some_and(|name| name.starts_with(".tmp"))
}
/// Checks if the grandparent directory of the given executable is the base
/// of a virtual environment.
///
/// The procedure described in PEP 405 includes checking both the parent and
/// grandparent directory of an executable, but in practice we've found this to
/// be unnecessary.
pub fn is_virtualenv_executable(executable: impl AsRef<Path>) -> bool {
executable
.as_ref()
.parent()
.and_then(Path::parent)
.is_some_and(is_virtualenv_base)
}
/// Returns `true` if a path is the base path of a virtual environment,
/// indicated by the presence of a `pyvenv.cfg` file.
///
/// The procedure described in PEP 405 includes scanning `pyvenv.cfg`
/// for a `home` key, but in practice we've found this to be
/// unnecessary.
pub fn is_virtualenv_base(path: impl AsRef<Path>) -> bool {
path.as_ref().join("pyvenv.cfg").is_file()
}
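
These helpers rely on the standard virtual environment layout, where the interpreter sits two levels below the directory containing `pyvenv.cfg`. A tiny illustration of the grandparent step with `std::path` only:

```rust
use std::path::Path;

fn main() {
    // Typical layout:
    //   .venv/
    //   ├── pyvenv.cfg
    //   └── bin/python
    let executable = Path::new(".venv/bin/python");
    let base = executable.parent().and_then(Path::parent);
    assert_eq!(base, Some(Path::new(".venv")));
    // `is_virtualenv_base` would then look for `.venv/pyvenv.cfg`.
}
```
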
/// A file lock that is automatically released when dropped.
#[derive(Debug)]
#[must_use]
pub struct LockedFile(fs_err::File);
impl LockedFile {

View file

@ -277,21 +277,6 @@ fn normalized(path: &Path) -> PathBuf {
normalized
}
/// Like `fs_err::canonicalize`, but avoids attempting to resolve symlinks on Windows.
pub fn canonicalize_executable(path: impl AsRef<Path>) -> std::io::Result<PathBuf> {
let path = path.as_ref();
debug_assert!(
path.is_absolute(),
"path must be absolute: {}",
path.display()
);
if cfg!(windows) {
Ok(path.to_path_buf())
} else {
fs_err::canonicalize(path)
}
}
/// Compute a path describing `path` relative to `base`.
///
/// `lib/python/site-packages/foo/__init__.py` and `lib/python/site-packages` -> `foo/__init__.py`
@ -345,11 +330,11 @@ pub struct PortablePathBuf(Box<Path>);
#[cfg(feature = "schemars")]
impl schemars::JsonSchema for PortablePathBuf {
fn schema_name() -> String {
PathBuf::schema_name()
fn schema_name() -> Cow<'static, str> {
Cow::Borrowed("PortablePathBuf")
}
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema {
fn json_schema(_gen: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
PathBuf::json_schema(_gen)
}
}

View file

@ -17,7 +17,7 @@ fn get_binary_type(path: &Path) -> windows::core::Result<u32> {
.chain(Some(0))
.collect::<Vec<u16>>();
// SAFETY: winapi call
unsafe { GetBinaryTypeW(PCWSTR(name.as_ptr()), &mut binary_type)? };
unsafe { GetBinaryTypeW(PCWSTR(name.as_ptr()), &raw mut binary_type)? };
Ok(binary_type)
}

View file

@ -5,31 +5,36 @@ use thiserror::Error;
/// Unique identity of any Git object (commit, tree, blob, tag).
///
/// Note this type does not validate whether the input is a valid hash.
/// This type's `FromStr` implementation validates that it's exactly 40 hex characters, i.e. a
/// full-length git commit.
///
/// If Git's SHA-256 support becomes more widespread in the future (in particular if GitHub ever
/// adds support), we might need to make this an enum.
#[derive(Debug, Copy, Clone, PartialEq, Eq, PartialOrd, Ord, Hash)]
pub struct GitOid {
len: usize,
bytes: [u8; 40],
}
impl GitOid {
/// Return the string representation of an object ID.
pub fn as_str(&self) -> &str {
str::from_utf8(&self.bytes[..self.len]).unwrap()
str::from_utf8(&self.bytes).unwrap()
}
/// Return a truncated representation, i.e., the first 16 characters of the SHA.
pub fn as_short_str(&self) -> &str {
self.as_str().get(..16).unwrap_or(self.as_str())
&self.as_str()[..16]
}
}
#[derive(Debug, Error, PartialEq)]
pub enum OidParseError {
#[error("Object ID can be at most 40 hex characters")]
TooLong,
#[error("Object ID cannot be parsed from empty string")]
Empty,
#[error("Object ID must be exactly 40 hex characters")]
WrongLength,
#[error("Object ID must be valid hex characters")]
NotHex,
}
impl FromStr for GitOid {
@ -40,17 +45,17 @@ impl FromStr for GitOid {
return Err(OidParseError::Empty);
}
if s.len() > 40 {
return Err(OidParseError::TooLong);
if s.len() != 40 {
return Err(OidParseError::WrongLength);
}
let mut out = [0; 40];
out[..s.len()].copy_from_slice(s.as_bytes());
if !s.chars().all(|ch| ch.is_ascii_hexdigit()) {
return Err(OidParseError::NotHex);
}
Ok(GitOid {
len: s.len(),
bytes: out,
})
let mut bytes = [0; 40];
bytes.copy_from_slice(s.as_bytes());
Ok(GitOid { bytes })
}
}
@ -101,11 +106,20 @@ mod tests {
#[test]
fn git_oid() {
GitOid::from_str("4a23745badf5bf5ef7928f1e346e9986bd696d82").unwrap();
GitOid::from_str("4A23745BADF5BF5EF7928F1E346E9986BD696D82").unwrap();
assert_eq!(GitOid::from_str(""), Err(OidParseError::Empty));
assert_eq!(
GitOid::from_str(&str::repeat("a", 41)),
Err(OidParseError::TooLong)
Err(OidParseError::WrongLength)
);
assert_eq!(
GitOid::from_str(&str::repeat("a", 39)),
Err(OidParseError::WrongLength)
);
assert_eq!(
GitOid::from_str(&str::repeat("x", 40)),
Err(OidParseError::NotHex)
);
}
}

View file

@ -20,6 +20,8 @@ use uv_redacted::DisplaySafeUrl;
use uv_static::EnvVars;
use uv_version::version;
use crate::rate_limit::{GITHUB_RATE_LIMIT_STATUS, is_github_rate_limited};
/// A file that, if present, indicates that `git reset` has been done and a repo
/// checkout is ready to go. See [`GitCheckout::reset`] for why we need this.
const CHECKOUT_READY_LOCK: &str = ".ok";
@ -787,7 +789,15 @@ fn github_fast_path(
}
};
let url = format!("https://api.github.com/repos/{owner}/{repo}/commits/{github_branch_name}");
// Check if we're rate-limited by GitHub before determining the FastPathRev
if GITHUB_RATE_LIMIT_STATUS.is_active() {
debug!("Skipping GitHub fast path attempt for: {url} (rate-limited)");
return Ok(FastPathRev::Indeterminate);
}
let base_url = std::env::var(EnvVars::UV_GITHUB_FAST_PATH_URL)
.unwrap_or("https://api.github.com/repos".to_owned());
let url = format!("{base_url}/{owner}/{repo}/commits/{github_branch_name}");
let runtime = tokio::runtime::Builder::new_current_thread()
.enable_all()
@ -807,6 +817,11 @@ fn github_fast_path(
let response = request.send().await?;
if is_github_rate_limited(&response) {
// Mark that we are being rate-limited by GitHub
GITHUB_RATE_LIMIT_STATUS.activate();
}
// GitHub returns a 404 if the repository does not exist, and a 422 if it exists but GitHub
// is unable to resolve the requested revision.
response.error_for_status_ref()?;

View file

@ -7,5 +7,6 @@ pub use crate::source::{Fetch, GitSource, Reporter};
mod credentials;
mod git;
mod rate_limit;
mod resolver;
mod source;

View file

@ -0,0 +1,37 @@
use reqwest::{Response, StatusCode};
use std::sync::atomic::{AtomicBool, Ordering};
/// A global state on whether we are being rate-limited by GitHub's REST API.
/// If we are, avoid "fast-path" attempts.
pub(crate) static GITHUB_RATE_LIMIT_STATUS: GitHubRateLimitStatus = GitHubRateLimitStatus::new();
/// GitHub REST API rate limit status tracker.
///
/// ## Assumptions
///
/// The rate limit timeout duration is much longer than the runtime of a `uv` command.
/// And so we do not need to invalidate this state based on `x-ratelimit-reset`.
#[derive(Debug)]
pub(crate) struct GitHubRateLimitStatus(AtomicBool);
impl GitHubRateLimitStatus {
const fn new() -> Self {
Self(AtomicBool::new(false))
}
pub(crate) fn activate(&self) {
self.0.store(true, Ordering::Relaxed);
}
pub(crate) fn is_active(&self) -> bool {
self.0.load(Ordering::Relaxed)
}
}
/// Determine if GitHub is applying rate-limiting based on the response
pub(crate) fn is_github_rate_limited(response: &Response) -> bool {
// HTTP 403 and 429 are possible status codes in the event of a primary or secondary rate limit.
// Source: https://docs.github.com/en/rest/using-the-rest-api/troubleshooting-the-rest-api?apiVersion=2022-11-28#rate-limit-errors
let status_code = response.status();
status_code == StatusCode::FORBIDDEN || status_code == StatusCode::TOO_MANY_REQUESTS
}
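
For context, a sketch of how callers combine the two items above around a GitHub API request: bail out early once rate-limited, and latch the flag when a rate-limit response is observed. It assumes it lives in this crate next to the definitions above, and `commit_sha_fast_path` itself is purely illustrative:

```rust
use reqwest::Client;

async fn commit_sha_fast_path(client: &Client, api_url: &str) -> Option<String> {
    // Skip the fast path entirely once we know we are rate-limited.
    if GITHUB_RATE_LIMIT_STATUS.is_active() {
        return None;
    }
    let response = client.get(api_url).send().await.ok()?;
    if is_github_rate_limited(&response) {
        // Latch the flag for the remainder of this uv invocation.
        GITHUB_RATE_LIMIT_STATUS.activate();
        return None;
    }
    response.text().await.ok()
}
```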

View file

@ -12,9 +12,13 @@ use tracing::debug;
use uv_cache_key::{RepositoryUrl, cache_digest};
use uv_fs::LockedFile;
use uv_git_types::{GitHubRepository, GitOid, GitReference, GitUrl};
use uv_static::EnvVars;
use uv_version::version;
use crate::{Fetch, GitSource, Reporter};
use crate::{
Fetch, GitSource, Reporter,
rate_limit::{GITHUB_RATE_LIMIT_STATUS, is_github_rate_limited},
};
#[derive(Debug, thiserror::Error)]
pub enum GitResolverError {
@ -45,6 +49,21 @@ impl GitResolver {
self.0.get(reference)
}
pub fn get_precise(&self, url: &GitUrl) -> Option<GitOid> {
// If the URL is already precise, return it.
if let Some(precise) = url.precise() {
return Some(precise);
}
// If we know the precise commit already, return it.
let reference = RepositoryReference::from(url);
if let Some(precise) = self.get(&reference) {
return Some(*precise);
}
None
}
/// Resolve a Git URL to a specific commit without performing any Git operations.
///
/// Returns a [`GitOid`] if the URL has already been resolved (i.e., is available in the cache),
@ -52,18 +71,15 @@ impl GitResolver {
pub async fn github_fast_path(
&self,
url: &GitUrl,
client: ClientWithMiddleware,
client: &ClientWithMiddleware,
) -> Result<Option<GitOid>, GitResolverError> {
let reference = RepositoryReference::from(url);
// If the URL is already precise, return it.
if let Some(precise) = url.precise() {
return Ok(Some(precise));
if std::env::var_os(EnvVars::UV_NO_GITHUB_FAST_PATH).is_some() {
return Ok(None);
}
// If we know the precise commit already, return it.
if let Some(precise) = self.get(&reference) {
return Ok(Some(*precise));
// If the URL is already precise or we know the precise commit, return it.
if let Some(precise) = self.get_precise(url) {
return Ok(Some(precise));
}
// If the URL is a GitHub URL, attempt to resolve it via the GitHub API.
@ -72,13 +88,21 @@ impl GitResolver {
return Ok(None);
};
// Check if we're rate-limited by GitHub, before determining the Git reference
if GITHUB_RATE_LIMIT_STATUS.is_active() {
debug!("Rate-limited by GitHub. Skipping GitHub fast path attempt for: {url}");
return Ok(None);
}
// Determine the Git reference.
let rev = url.reference().as_rev();
let url = format!("https://api.github.com/repos/{owner}/{repo}/commits/{rev}");
let github_api_base_url = std::env::var(EnvVars::UV_GITHUB_FAST_PATH_URL)
.unwrap_or("https://api.github.com/repos".to_owned());
let github_api_url = format!("{github_api_base_url}/{owner}/{repo}/commits/{rev}");
debug!("Querying GitHub for commit at: {url}");
let mut request = client.get(&url);
debug!("Querying GitHub for commit at: {github_api_url}");
let mut request = client.get(&github_api_url);
request = request.header("Accept", "application/vnd.github.3.sha");
request = request.header(
"User-Agent",
@ -86,13 +110,20 @@ impl GitResolver {
);
let response = request.send().await?;
if !response.status().is_success() {
let status = response.status();
if !status.is_success() {
// Returns a 404 if the repository does not exist, and a 422 if GitHub is unable to
// resolve the requested rev.
debug!(
"GitHub API request failed for: {url} ({})",
"GitHub API request failed for: {github_api_url} ({})",
response.status()
);
if is_github_rate_limited(&response) {
// Mark that we are being rate-limited by GitHub
GITHUB_RATE_LIMIT_STATUS.activate();
}
return Ok(None);
}
@ -103,7 +134,7 @@ impl GitResolver {
// Insert the resolved URL into the in-memory cache. This ensures that subsequent fetches
// resolve to the same precise commit.
self.insert(reference, precise);
self.insert(RepositoryReference::from(url), precise);
Ok(Some(precise))
}
@ -112,7 +143,7 @@ impl GitResolver {
pub async fn fetch(
&self,
url: &GitUrl,
client: ClientWithMiddleware,
client: impl Into<ClientWithMiddleware>,
disable_ssl: bool,
offline: bool,
cache: PathBuf,

View file

@ -11,11 +11,11 @@ use reqwest_middleware::ClientWithMiddleware;
use tracing::{debug, instrument};
use uv_cache_key::{RepositoryUrl, cache_digest};
use uv_git_types::GitUrl;
use uv_git_types::{GitOid, GitReference, GitUrl};
use uv_redacted::DisplaySafeUrl;
use crate::GIT_STORE;
use crate::git::GitRemote;
use crate::git::{GitDatabase, GitRemote};
/// A remote Git source that can be checked out locally.
pub struct GitSource {
@ -86,40 +86,59 @@ impl GitSource {
Cow::Borrowed(self.git.repository())
};
let remote = GitRemote::new(&remote);
let (db, actual_rev, task) = match (self.git.precise(), remote.db_at(&db_path).ok()) {
// If we have a locked revision, and we have a preexisting database
// which has that revision, then no update needs to happen.
(Some(rev), Some(db)) if db.contains(rev) => {
debug!("Using existing Git source `{}`", self.git.repository());
(db, rev, None)
// Fetch the commit, if we don't already have it. Wrapping this section in a closure makes
// it easier to short-circuit this in the cases where we do have the commit.
let (db, actual_rev, maybe_task) = || -> Result<(GitDatabase, GitOid, Option<usize>)> {
let git_remote = GitRemote::new(&remote);
let maybe_db = git_remote.db_at(&db_path).ok();
// If we have a locked revision, and we have a pre-existing database which has that
// revision, then no update needs to happen.
if let (Some(rev), Some(db)) = (self.git.precise(), &maybe_db) {
if db.contains(rev) {
debug!("Using existing Git source `{}`", self.git.repository());
return Ok((maybe_db.unwrap(), rev, None));
}
}
// ... otherwise we use this state to update the git database. Note
// that we still check for being offline here, for example in the
// situation that we have a locked revision but the database
// doesn't have it.
(locked_rev, db) => {
debug!("Updating Git source `{}`", self.git.repository());
// Report the checkout operation to the reporter.
let task = self.reporter.as_ref().map(|reporter| {
reporter.on_checkout_start(remote.url(), self.git.reference().as_rev())
});
let (db, actual_rev) = remote.checkout(
&db_path,
db,
self.git.reference(),
locked_rev,
&self.client,
self.disable_ssl,
self.offline,
)?;
(db, actual_rev, task)
// If the revision isn't locked, but it looks like it might be an exact commit hash,
// and we do have a pre-existing database, then check whether it is, in fact, a commit
// hash. If so, treat it like it's locked.
if let Some(db) = &maybe_db {
if let GitReference::BranchOrTagOrCommit(maybe_commit) = self.git.reference() {
if let Ok(oid) = maybe_commit.parse::<GitOid>() {
if db.contains(oid) {
// This reference is an exact commit. Treat it like it's
// locked.
debug!("Using existing Git source `{}`", self.git.repository());
return Ok((maybe_db.unwrap(), oid, None));
}
}
}
}
};
// ... otherwise, we use this state to update the Git database. Note that we still check
// for being offline here, for example in the situation that we have a locked revision
// but the database doesn't have it.
debug!("Updating Git source `{}`", self.git.repository());
// Report the checkout operation to the reporter.
let task = self.reporter.as_ref().map(|reporter| {
reporter.on_checkout_start(git_remote.url(), self.git.reference().as_rev())
});
let (db, actual_rev) = git_remote.checkout(
&db_path,
maybe_db,
self.git.reference(),
self.git.precise(),
&self.client,
self.disable_ssl,
self.offline,
)?;
Ok((db, actual_rev, task))
}()?;
// Don't use the full hash, in order to contribute less to reaching the
// path length limit on Windows.
@ -137,9 +156,9 @@ impl GitSource {
db.copy_to(actual_rev, &checkout_path)?;
// Report the checkout operation to the reporter.
if let Some(task) = task {
if let Some(task) = maybe_task {
if let Some(reporter) = self.reporter.as_ref() {
reporter.on_checkout_complete(remote.url(), actual_rev.as_str(), task);
reporter.on_checkout_complete(remote.as_ref(), actual_rev.as_str(), task);
}
}

View file

@ -85,7 +85,7 @@ impl GlobDirFilter {
/// don't end up including any child.
pub fn match_directory(&self, path: &Path) -> bool {
let Some(dfa) = &self.dfa else {
return false;
return true;
};
// Allow the root path

View file

@ -26,20 +26,19 @@ PEP 440 has a lot of unintuitive features, including:
- An epoch that you can prefix the version with, e.g., `1!1.2.3`. Lower epoch always means lower
version (`1.0 <=2!0.1`)
* post versions, which can be attached to both stable releases and pre-releases
* dev versions, which can be attached to both stable releases and pre-releases. When attached to a
- Post versions, which can be attached to both stable releases and pre-releases
- Dev versions, which can be attached to both stable releases and pre-releases. When attached to a
  pre-release the dev version is ordered just below the normal pre-release, however when attached to
  a stable version, the dev version is sorted before a pre-release
* pre-release handling is a mess: "Pre-releases of any kind, including developmental releases, are
- Pre-release handling is a mess: "Pre-releases of any kind, including developmental releases, are
implicitly excluded from all version specifiers, unless they are already present on the system,
explicitly requested by the user, or if the only available version that satisfies the version
specifier is a pre-release.". This means that we can't say whether a specifier matches without
also looking at the environment
* pre-release vs. pre-release incl. dev is fuzzy
* local versions on top of all the others, which are added with a + and have implicitly typed string
- Pre-release vs. pre-release incl. dev is fuzzy
- Local versions on top of all the others, which are added with a + and have implicitly typed string
and number segments
* no semver-caret (`^`), but a pseudo-semver tilde (`~=`)
* ordering contradicts matching: We have, e.g., `1.0+local > 1.0` when sorting, but `==1.0` matches
- No semver-caret (`^`), but a pseudo-semver tilde (`~=`)
- Ordering contradicts matching: We have, e.g., `1.0+local > 1.0` when sorting, but `==1.0` matches
`1.0+local`. While the ordering of versions itself is a total order the version matching needs to
catch all sorts of special cases
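
The last bullet's ordering-versus-matching quirk can be shown directly (a sketch assuming `uv_pep440` keeps the pep440-rs-style `FromStr` impls and `VersionSpecifier::contains`):

```rust
use std::str::FromStr;

use uv_pep440::{Version, VersionSpecifier};

fn main() {
    // Ordering says the local version is strictly greater...
    let plain = Version::from_str("1.0").unwrap();
    let local = Version::from_str("1.0+local").unwrap();
    assert!(local > plain);

    // ...yet the specifier `==1.0` still matches it.
    let specifier = VersionSpecifier::from_str("==1.0").unwrap();
    assert!(specifier.contains(&local));
}
```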

View file

@ -34,7 +34,7 @@ pub use {
VersionPatternParseError,
},
version_specifier::{
VersionSpecifier, VersionSpecifierBuildError, VersionSpecifiers,
TildeVersionSpecifier, VersionSpecifier, VersionSpecifierBuildError, VersionSpecifiers,
VersionSpecifiersParseError,
},
};

View file

@ -610,6 +610,24 @@ impl Version {
Self::new(self.release().iter().copied())
}
/// Return the version with any segments apart from the release removed, with trailing zeroes
/// trimmed.
#[inline]
#[must_use]
pub fn only_release_trimmed(&self) -> Self {
if let Some(last_non_zero) = self.release().iter().rposition(|segment| *segment != 0) {
if last_non_zero == self.release().len() {
// Already trimmed.
self.clone()
} else {
Self::new(self.release().iter().take(last_non_zero + 1).copied())
}
} else {
// `0` is a valid version.
Self::new([0])
}
}
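
A short worked example of the trimming semantics (assuming the usual `FromStr`/`Display` impls on `Version`): only the release segments survive, trailing zeroes are dropped, and an all-zero release collapses to `0`:

```rust
use std::str::FromStr;

use uv_pep440::Version;

fn main() {
    let version = Version::from_str("1.2.0.0rc1").unwrap();
    assert_eq!(version.only_release_trimmed().to_string(), "1.2");

    let zero = Version::from_str("0.0.0").unwrap();
    assert_eq!(zero.only_release_trimmed().to_string(), "0");
}
```
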
/// Return the version with trailing `.0` release segments removed.
///
/// # Panics

View file

@ -132,7 +132,7 @@ impl From<VersionSpecifier> for Ranges<Version> {
pub fn release_specifiers_to_ranges(specifiers: VersionSpecifiers) -> Ranges<Version> {
let mut range = Ranges::full();
for specifier in specifiers {
range = range.intersection(&release_specifier_to_range(specifier));
range = range.intersection(&release_specifier_to_range(specifier, false));
}
range
}
@ -148,67 +148,57 @@ pub fn release_specifiers_to_ranges(specifiers: VersionSpecifiers) -> Ranges<Ver
/// is allowed for projects that declare `requires-python = ">3.13"`.
///
/// See: <https://github.com/pypa/pip/blob/a432c7f4170b9ef798a15f035f5dfdb4cc939f35/src/pip/_internal/resolution/resolvelib/candidates.py#L540>
pub fn release_specifier_to_range(specifier: VersionSpecifier) -> Ranges<Version> {
pub fn release_specifier_to_range(specifier: VersionSpecifier, trim: bool) -> Ranges<Version> {
let VersionSpecifier { operator, version } = specifier;
// Note(konsti): We switched strategies to trimmed for the markers, but we don't want to cause
// churn in lockfile requires-python, so we only trim for markers.
let version_trimmed = if trim {
version.only_release_trimmed()
} else {
version.only_release()
};
match operator {
Operator::Equal => {
let version = version.only_release();
Ranges::singleton(version)
}
Operator::ExactEqual => {
let version = version.only_release();
Ranges::singleton(version)
}
Operator::NotEqual => {
let version = version.only_release();
Ranges::singleton(version).complement()
}
// Trailing zeroes are not semantically relevant.
Operator::Equal => Ranges::singleton(version_trimmed),
Operator::ExactEqual => Ranges::singleton(version_trimmed),
Operator::NotEqual => Ranges::singleton(version_trimmed).complement(),
Operator::LessThan => Ranges::strictly_lower_than(version_trimmed),
Operator::LessThanEqual => Ranges::lower_than(version_trimmed),
Operator::GreaterThan => Ranges::strictly_higher_than(version_trimmed),
Operator::GreaterThanEqual => Ranges::higher_than(version_trimmed),
// Trailing zeroes are semantically relevant.
Operator::TildeEqual => {
let release = version.release();
let [rest @ .., last, _] = &*release else {
unreachable!("~= must have at least two segments");
};
let upper = Version::new(rest.iter().chain([&(last + 1)]));
let version = version.only_release();
Ranges::from_range_bounds(version..upper)
}
Operator::LessThan => {
let version = version.only_release();
Ranges::strictly_lower_than(version)
}
Operator::LessThanEqual => {
let version = version.only_release();
Ranges::lower_than(version)
}
Operator::GreaterThan => {
let version = version.only_release();
Ranges::strictly_higher_than(version)
}
Operator::GreaterThanEqual => {
let version = version.only_release();
Ranges::higher_than(version)
Ranges::from_range_bounds(version_trimmed..upper)
}
Operator::EqualStar => {
let low = version.only_release();
// For (not-)equal-star, trailing zeroes before the star are semantically relevant, so don't trim.
let low_full = version.only_release();
let high = {
let mut high = low.clone();
let mut high = low_full.clone();
let mut release = high.release().to_vec();
*release.last_mut().unwrap() += 1;
high = high.with_release(release);
high
};
Ranges::from_range_bounds(low..high)
Ranges::from_range_bounds(version..high)
}
Operator::NotEqualStar => {
let low = version.only_release();
// For (not-)equal-star, trailing zeroes before the star are semantically relevant, so don't trim.
let low_full = version.only_release();
let high = {
let mut high = low.clone();
let mut high = low_full.clone();
let mut release = high.release().to_vec();
*release.last_mut().unwrap() += 1;
high = high.with_release(release);
high
};
Ranges::from_range_bounds(low..high).complement()
Ranges::from_range_bounds(version..high).complement()
}
}
}
@ -223,8 +213,8 @@ impl LowerBound {
/// These bounds use release-only semantics when comparing versions.
pub fn new(bound: Bound<Version>) -> Self {
Self(match bound {
Bound::Included(version) => Bound::Included(version.only_release()),
Bound::Excluded(version) => Bound::Excluded(version.only_release()),
Bound::Included(version) => Bound::Included(version.only_release_trimmed()),
Bound::Excluded(version) => Bound::Excluded(version.only_release_trimmed()),
Bound::Unbounded => Bound::Unbounded,
})
}
@ -358,8 +348,8 @@ impl UpperBound {
/// These bounds use release-only semantics when comparing versions.
pub fn new(bound: Bound<Version>) -> Self {
Self(match bound {
Bound::Included(version) => Bound::Included(version.only_release()),
Bound::Excluded(version) => Bound::Excluded(version.only_release()),
Bound::Included(version) => Bound::Included(version.only_release_trimmed()),
Bound::Excluded(version) => Bound::Excluded(version.only_release_trimmed()),
Bound::Unbounded => Bound::Unbounded,
})
}

View file
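
A sketch of the operator split above, assuming `release_specifier_to_range` and `release_specifiers_to_ranges` are re-exported at the crate root as used elsewhere in uv. For `~=` the trailing zero is semantically relevant because it decides the upper bound, so `~=3.9` and `~=3.9.0` expand to different ranges:

```rust
use std::str::FromStr;

use uv_pep440::{
    release_specifier_to_range, release_specifiers_to_ranges, VersionSpecifier, VersionSpecifiers,
};

fn main() {
    // `~=3.9` means `>=3.9, <4`.
    let tilde = release_specifier_to_range(VersionSpecifier::from_str("~=3.9").unwrap(), false);
    let expanded = release_specifiers_to_ranges(VersionSpecifiers::from_str(">=3.9, <4").unwrap());
    assert!(tilde == expanded);

    // `~=3.9.0` means `>=3.9.0, <3.10`.
    let tilde = release_specifier_to_range(VersionSpecifier::from_str("~=3.9.0").unwrap(), false);
    let expanded =
        release_specifiers_to_ranges(VersionSpecifiers::from_str(">=3.9.0, <3.10").unwrap());
    assert!(tilde == expanded);
}
```

Passing `trim: true`, as the marker code later in this diff does, additionally strips trailing zeroes from the bounds so that `3.9` and `3.9.0` render identically.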

@ -80,24 +80,38 @@ impl VersionSpecifiers {
// Add specifiers for the holes between the bounds.
for (lower, upper) in bounds {
match (next, lower) {
let specifier = match (next, lower) {
// Ex) [3.7, 3.8.5), (3.8.5, 3.9] -> >=3.7,!=3.8.5,<=3.9
(Bound::Excluded(prev), Bound::Excluded(lower)) if prev == lower => {
specifiers.push(VersionSpecifier::not_equals_version(prev.clone()));
Some(VersionSpecifier::not_equals_version(prev.clone()))
}
// Ex) [3.7, 3.8), (3.8, 3.9] -> >=3.7,!=3.8.*,<=3.9
(Bound::Excluded(prev), Bound::Included(lower))
if prev.release().len() == 2
&& *lower.release() == [prev.release()[0], prev.release()[1] + 1] =>
{
specifiers.push(VersionSpecifier::not_equals_star_version(prev.clone()));
}
_ => {
#[cfg(feature = "tracing")]
warn!(
"Ignoring unsupported gap in `requires-python` version: {next:?} -> {lower:?}"
);
(Bound::Excluded(prev), Bound::Included(lower)) => {
match *prev.only_release_trimmed().release() {
[major] if *lower.only_release_trimmed().release() == [major, 1] => {
Some(VersionSpecifier::not_equals_star_version(Version::new([
major, 0,
])))
}
[major, minor]
if *lower.only_release_trimmed().release() == [major, minor + 1] =>
{
Some(VersionSpecifier::not_equals_star_version(Version::new([
major, minor,
])))
}
_ => None,
}
}
_ => None,
};
if let Some(specifier) = specifier {
specifiers.push(specifier);
} else {
#[cfg(feature = "tracing")]
warn!(
"Ignoring unsupported gap in `requires-python` version: {next:?} -> {lower:?}"
);
}
next = upper;
}
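
The gap handling above recognizes exactly two shapes of hole between bounds. A standalone sketch (the free function `gap_to_not_equals_star` is hypothetical and simply mirrors the match above, using only APIs that appear in this diff):

```rust
use std::str::FromStr;

use uv_pep440::{Version, VersionSpecifier};

/// Hypothetical restatement of the gap handling above: map the hole between an excluded upper
/// bound `prev` and an included lower bound `lower` onto a not-equals-star specifier, if the hole
/// spans exactly one `X.Y.*` slice.
fn gap_to_not_equals_star(prev: &Version, lower: &Version) -> Option<VersionSpecifier> {
    match *prev.only_release_trimmed().release() {
        // Ex) a gap from `>3` to `>=3.1` is `!=3.0.*`.
        [major] if *lower.only_release_trimmed().release() == [major, 1] => Some(
            VersionSpecifier::not_equals_star_version(Version::new([major, 0])),
        ),
        // Ex) a gap from `>3.8` to `>=3.9` is `!=3.8.*`.
        [major, minor] if *lower.only_release_trimmed().release() == [major, minor + 1] => Some(
            VersionSpecifier::not_equals_star_version(Version::new([major, minor])),
        ),
        _ => None,
    }
}

fn main() {
    let specifier = gap_to_not_equals_star(&Version::new([3, 8]), &Version::new([3, 9])).unwrap();
    assert_eq!(specifier, VersionSpecifier::from_str("!=3.8.*").unwrap());

    // A wider hole cannot be expressed as a single `!=X.Y.*` specifier.
    assert!(gap_to_not_equals_star(&Version::new([3, 8]), &Version::new([3, 10])).is_none());
}
```

Any other hole cannot be expressed as a single `!=X.Y.*` specifier, which is why the code above falls through to the tracing warning instead.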
@ -348,6 +362,33 @@ impl VersionSpecifier {
Ok(Self { operator, version })
}
/// Remove all non-release parts of the version.
///
/// The marker decision diagram relies on the assumption that the negation of a marker tree is
/// the complement of the marker space. However, pre-release versions violate this assumption.
///
/// For example, the marker `python_full_version > '3.9' or python_full_version <= '3.9'`
/// does not match `python_full_version == 3.9.0a0` and so cannot simplify to `true`. However,
/// its negation, `python_full_version > '3.9' and python_full_version <= '3.9'`, also does not
/// match `3.9.0a0` and simplifies to `false`, which violates the algebra decision diagrams
/// rely on. For this reason we ignore pre-release versions entirely when evaluating markers.
///
/// Note that `python_version` cannot take on pre-release values as it is truncated to just the
/// major and minor version segments. Thus using release-only specifiers is definitely necessary
/// for `python_version` to fully simplify any ranges, such as
/// `python_version > '3.9' or python_version <= '3.9'`, which is always `true` for
/// `python_version`. For `python_full_version` however, this decision is a semantic change.
///
/// For Python versions, the major.minor is considered the API version, so unlike the rules
/// for package versions in PEP 440, Python `3.9.0a0` is considered acceptable for `>= "3.9"`.
#[must_use]
pub fn only_release(self) -> Self {
Self {
operator: self.operator,
version: self.version.only_release(),
}
}
/// `==<version>`
pub fn equals_version(version: Version) -> Self {
Self {
@ -416,7 +457,7 @@ impl VersionSpecifier {
&self.operator
}
/// Get the version, e.g. `<=` in `<= 2.0.0`
/// Get the version, e.g. `2.0.0` in `<= 2.0.0`
pub fn version(&self) -> &Version {
&self.version
}
@ -442,14 +483,23 @@ impl VersionSpecifier {
(Some(VersionSpecifier::equals_version(v1.clone())), None)
}
// `v >= 3.7 && v < 3.8` is equivalent to `v == 3.7.*`
(Bound::Included(v1), Bound::Excluded(v2))
if v1.release().len() == 2
&& *v2.release() == [v1.release()[0], v1.release()[1] + 1] =>
{
(
Some(VersionSpecifier::equals_star_version(v1.clone())),
None,
)
(Bound::Included(v1), Bound::Excluded(v2)) => {
match *v1.only_release_trimmed().release() {
[major] if *v2.only_release_trimmed().release() == [major, 1] => {
let version = Version::new([major, 0]);
(Some(VersionSpecifier::equals_star_version(version)), None)
}
[major, minor]
if *v2.only_release_trimmed().release() == [major, minor + 1] =>
{
let version = Version::new([major, minor]);
(Some(VersionSpecifier::equals_star_version(version)), None)
}
_ => (
VersionSpecifier::from_lower_bound(&Bound::Included(v1.clone())),
VersionSpecifier::from_upper_bound(&Bound::Excluded(v2.clone())),
),
}
}
(lower, upper) => (
VersionSpecifier::from_lower_bound(lower),
@ -627,7 +677,15 @@ impl FromStr for VersionSpecifier {
// operator but we don't know yet if it has a star
let operator = s.eat_while(['=', '!', '~', '<', '>']);
if operator.is_empty() {
return Err(ParseErrorKind::MissingOperator.into());
// Attempt to parse the rest of the input as a version so that `MissingOperator` can suggest
// a fix, e.g., `==3.12`. If the rest is not a valid version, no suggestion is added.
s.eat_while(|c: char| c.is_whitespace());
let version = s.eat_while(|c: char| !c.is_whitespace());
s.eat_while(|c: char| c.is_whitespace());
return Err(ParseErrorKind::MissingOperator(VersionOperatorBuildError {
version_pattern: VersionPattern::from_str(version).ok(),
})
.into());
}
let operator = Operator::from_str(operator).map_err(ParseErrorKind::InvalidOperator)?;
s.eat_while(|c: char| c.is_whitespace());
@ -695,6 +753,25 @@ impl std::fmt::Display for VersionSpecifierBuildError {
}
}
#[derive(Clone, Debug, Eq, PartialEq)]
struct VersionOperatorBuildError {
version_pattern: Option<VersionPattern>,
}
impl std::error::Error for VersionOperatorBuildError {}
impl std::fmt::Display for VersionOperatorBuildError {
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(f, "Unexpected end of version specifier, expected operator")?;
if let Some(version_pattern) = &self.version_pattern {
let version_specifier =
VersionSpecifier::from_pattern(Operator::Equal, version_pattern.clone()).unwrap();
write!(f, ". Did you mean `{version_specifier}`?")?;
}
Ok(())
}
}
/// The specific kind of error that can occur when building a version specifier
/// from an operator and version pair.
#[derive(Clone, Debug, Eq, PartialEq)]
@ -748,9 +825,7 @@ impl std::fmt::Display for VersionSpecifierParseError {
ParseErrorKind::InvalidOperator(ref err) => err.fmt(f),
ParseErrorKind::InvalidVersion(ref err) => err.fmt(f),
ParseErrorKind::InvalidSpecifier(ref err) => err.fmt(f),
ParseErrorKind::MissingOperator => {
write!(f, "Unexpected end of version specifier, expected operator")
}
ParseErrorKind::MissingOperator(ref err) => err.fmt(f),
ParseErrorKind::MissingVersion => {
write!(f, "Unexpected end of version specifier, expected version")
}
@ -768,7 +843,7 @@ enum ParseErrorKind {
InvalidOperator(OperatorParseError),
InvalidVersion(VersionPatternParseError),
InvalidSpecifier(VersionSpecifierBuildError),
MissingOperator,
MissingOperator(VersionOperatorBuildError),
MissingVersion,
InvalidTrailing(String),
}
@ -813,6 +888,90 @@ pub(crate) fn parse_version_specifiers(
Ok(version_ranges)
}
/// A simple `~=` version specifier with a major, minor and (optional) patch version, e.g., `~=3.13`
/// or `~=3.13.0`.
#[derive(Clone, Debug)]
pub struct TildeVersionSpecifier<'a> {
inner: Cow<'a, VersionSpecifier>,
}
impl<'a> TildeVersionSpecifier<'a> {
/// Create a new [`TildeVersionSpecifier`] from a [`VersionSpecifier`] value.
///
/// Returns [`None`] if the operator is not [`Operator::TildeEqual`], if the version does not have
/// exactly two or three release segments, or if it has pre-release, post, or local segments.
pub fn from_specifier(specifier: VersionSpecifier) -> Option<TildeVersionSpecifier<'a>> {
TildeVersionSpecifier::new(Cow::Owned(specifier))
}
/// Create a new [`TildeVersionSpecifier`] from a [`VersionSpecifier`] reference.
///
/// See [`TildeVersionSpecifier::from_specifier`].
pub fn from_specifier_ref(
specifier: &'a VersionSpecifier,
) -> Option<TildeVersionSpecifier<'a>> {
TildeVersionSpecifier::new(Cow::Borrowed(specifier))
}
fn new(specifier: Cow<'a, VersionSpecifier>) -> Option<Self> {
if specifier.operator != Operator::TildeEqual {
return None;
}
if specifier.version().release().len() < 2 || specifier.version().release().len() > 3 {
return None;
}
if specifier.version().any_prerelease()
|| specifier.version().is_local()
|| specifier.version().is_post()
{
return None;
}
Some(Self { inner: specifier })
}
/// Whether a patch version is present in this tilde version specifier.
pub fn has_patch(&self) -> bool {
self.inner.version.release().len() == 3
}
/// Construct the lower and upper bounding version specifiers for this tilde version specifier,
/// e.g., for `~=3.13` this would return `>=3.13` and `<4` and for `~=3.13.0` it would
/// return `>=3.13.0` and `<3.14`.
pub fn bounding_specifiers(&self) -> (VersionSpecifier, VersionSpecifier) {
let release = self.inner.version().release();
let lower = self.inner.version.clone();
let upper = if self.has_patch() {
Version::new([release[0], release[1] + 1])
} else {
Version::new([release[0] + 1])
};
(
VersionSpecifier::greater_than_equal_version(lower),
VersionSpecifier::less_than_version(upper),
)
}
/// Construct a new tilde `VersionSpecifier` with the given patch version, replacing any existing patch version.
pub fn with_patch_version(&self, patch: u64) -> TildeVersionSpecifier {
let mut release = self.inner.version.release().to_vec();
if self.has_patch() {
release.pop();
}
release.push(patch);
TildeVersionSpecifier::from_specifier(
VersionSpecifier::from_version(Operator::TildeEqual, Version::new(release))
.expect("We should always derive a valid new version specifier"),
)
.expect("We should always derive a new tilde version specifier")
}
}
impl std::fmt::Display for TildeVersionSpecifier<'_> {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", self.inner)
}
}
#[cfg(test)]
mod tests {
use std::{cmp::Ordering, str::FromStr};
@ -1356,6 +1515,32 @@ mod tests {
);
}
#[test]
fn test_parse_specifier_missing_operator_error() {
let result = VersionSpecifiers::from_str("3.12");
assert_eq!(
result.unwrap_err().to_string(),
indoc! {"
Failed to parse version: Unexpected end of version specifier, expected operator. Did you mean `==3.12`?:
3.12
^^^^
"}
);
}
#[test]
fn test_parse_specifier_missing_operator_invalid_version_error() {
let result = VersionSpecifiers::from_str("blergh");
assert_eq!(
result.unwrap_err().to_string(),
indoc! {r"
Failed to parse version: Unexpected end of version specifier, expected operator:
blergh
^^^^^^
"}
);
}
#[test]
fn test_non_star_after_star() {
let result = VersionSpecifiers::from_str("== 0.9.*.1");
@ -1386,7 +1571,10 @@ mod tests {
let result = VersionSpecifiers::from_str("blergh");
assert_eq!(
result.unwrap_err().inner.err,
ParseErrorKind::MissingOperator.into(),
ParseErrorKind::MissingOperator(VersionOperatorBuildError {
version_pattern: None
})
.into(),
);
}
@ -1395,7 +1583,13 @@ mod tests {
fn test_invalid_specifier() {
let specifiers = [
// Operator-less specifier
("2.0", ParseErrorKind::MissingOperator.into()),
(
"2.0",
ParseErrorKind::MissingOperator(VersionOperatorBuildError {
version_pattern: VersionPattern::from_str("2.0").ok(),
})
.into(),
),
// Invalid operator
(
"=>2.0",
@ -1735,7 +1929,9 @@ mod tests {
fn error_message_version_specifiers_parse_error() {
let specs = ">=1.2.3, 5.4.3, >=3.4.5";
let err = VersionSpecifierParseError {
kind: Box::new(ParseErrorKind::MissingOperator),
kind: Box::new(ParseErrorKind::MissingOperator(VersionOperatorBuildError {
version_pattern: VersionPattern::from_str("5.4.3").ok(),
})),
};
let inner = Box::new(VersionSpecifiersParseErrorInner {
err,
@ -1748,7 +1944,7 @@ mod tests {
assert_eq!(
err.to_string(),
"\
Failed to parse version: Unexpected end of version specifier, expected operator:
Failed to parse version: Unexpected end of version specifier, expected operator. Did you mean `==5.4.3`?:
>=1.2.3, 5.4.3, >=3.4.5
^^^^^^
"

View file
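
A usage sketch for the new `TildeVersionSpecifier` helper above (test-style, using only the methods introduced in this hunk and assuming the crate-root re-export added to `lib.rs` earlier in this diff):

```rust
use std::str::FromStr;

use uv_pep440::{TildeVersionSpecifier, VersionSpecifier};

fn main() {
    // `~=3.13` has no patch segment and is bounded by `>=3.13, <4`.
    let spec = VersionSpecifier::from_str("~=3.13").unwrap();
    let tilde = TildeVersionSpecifier::from_specifier(spec).unwrap();
    assert!(!tilde.has_patch());

    let (lower, upper) = tilde.bounding_specifiers();
    assert_eq!(lower, VersionSpecifier::from_str(">=3.13").unwrap());
    assert_eq!(upper, VersionSpecifier::from_str("<4").unwrap());

    // Adding a patch narrows the bounds to `>=3.13.2, <3.14`.
    let patched = tilde.with_patch_version(2);
    let (lower, upper) = patched.bounding_specifiers();
    assert_eq!(lower, VersionSpecifier::from_str(">=3.13.2").unwrap());
    assert_eq!(upper, VersionSpecifier::from_str("<3.14").unwrap());

    // Anything that is not a plain two- or three-segment `~=` specifier is rejected.
    let not_tilde = VersionSpecifier::from_str(">=3.13").unwrap();
    assert!(TildeVersionSpecifier::from_specifier(not_tilde).is_none());
}
```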

@ -41,7 +41,7 @@ version-ranges = { workspace = true }
[dev-dependencies]
insta = { version = "1.40.0" }
serde_json = { version = "1.0.128" }
serde_json = { workspace = true }
tracing-test = { version = "0.2.5" }
[features]

View file

@ -16,6 +16,8 @@
#![warn(missing_docs)]
#[cfg(feature = "schemars")]
use std::borrow::Cow;
use std::error::Error;
use std::fmt::{Debug, Display, Formatter};
use std::path::Path;
@ -334,22 +336,15 @@ impl Reporter for TracingReporter {
#[cfg(feature = "schemars")]
impl<T: Pep508Url> schemars::JsonSchema for Requirement<T> {
fn schema_name() -> String {
"Requirement".to_string()
fn schema_name() -> Cow<'static, str> {
Cow::Borrowed("Requirement")
}
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema {
schemars::schema::SchemaObject {
instance_type: Some(schemars::schema::InstanceType::String.into()),
metadata: Some(Box::new(schemars::schema::Metadata {
description: Some(
"A PEP 508 dependency specifier, e.g., `ruff >= 0.6.0`".to_string(),
),
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
fn json_schema(_gen: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::json_schema!({
"type": "string",
"description": "A PEP 508 dependency specifier, e.g., `ruff >= 0.6.0`"
})
}
}

View file

@ -172,7 +172,7 @@ impl InternerGuard<'_> {
),
// Normalize `python_version` markers to `python_full_version` nodes.
MarkerValueVersion::PythonVersion => {
match python_version_to_full_version(normalize_specifier(specifier)) {
match python_version_to_full_version(specifier.only_release()) {
Ok(specifier) => (
Variable::Version(CanonicalMarkerValueVersion::PythonFullVersion),
Edges::from_specifier(specifier),
@ -1214,7 +1214,7 @@ impl Edges {
/// Returns the [`Edges`] for a version specifier.
fn from_specifier(specifier: VersionSpecifier) -> Edges {
let specifier = release_specifier_to_range(normalize_specifier(specifier));
let specifier = release_specifier_to_range(specifier.only_release(), true);
Edges::Version {
edges: Edges::from_range(&specifier),
}
@ -1227,9 +1227,9 @@ impl Edges {
let mut range: Ranges<Version> = versions
.into_iter()
.map(|version| {
let specifier = VersionSpecifier::equals_version(version.clone());
let specifier = VersionSpecifier::equals_version(version.only_release());
let specifier = python_version_to_full_version(specifier)?;
Ok(release_specifier_to_range(normalize_specifier(specifier)))
Ok(release_specifier_to_range(specifier, true))
})
.flatten_ok()
.collect::<Result<Ranges<_>, NodeId>>()?;
@ -1526,57 +1526,62 @@ impl Edges {
}
}
// Normalize a [`VersionSpecifier`] before adding it to the tree.
fn normalize_specifier(specifier: VersionSpecifier) -> VersionSpecifier {
let (operator, version) = specifier.into_parts();
// The decision diagram relies on the assumption that the negation of a marker tree is
// the complement of the marker space. However, pre-release versions violate this assumption.
//
// For example, the marker `python_full_version > '3.9' or python_full_version <= '3.9'`
// does not match `python_full_version == 3.9.0a0` and so cannot simplify to `true`. However,
// its negation, `python_full_version > '3.9' and python_full_version <= '3.9'`, also does not
// match `3.9.0a0` and simplifies to `false`, which violates the algebra decision diagrams
// rely on. For this reason we ignore pre-release versions entirely when evaluating markers.
//
// Note that `python_version` cannot take on pre-release values as it is truncated to just the
// major and minor version segments. Thus using release-only specifiers is definitely necessary
// for `python_version` to fully simplify any ranges, such as `python_version > '3.9' or python_version <= '3.9'`,
// which is always `true` for `python_version`. For `python_full_version` however, this decision
// is a semantic change.
let mut release = &*version.release();
// Strip any trailing `0`s.
//
// The [`Version`] type ignores trailing `0`s for equality, but still preserves them in its
// [`Display`] output. We must normalize all versions by stripping trailing `0`s to remove the
// distinction between versions like `3.9` and `3.9.0`. Otherwise, their output would depend on
// which form was added to the global marker interner first.
//
// Note that we cannot strip trailing `0`s for star equality, as `==3.0.*` is different from `==3.*`.
if !operator.is_star() {
if let Some(end) = release.iter().rposition(|segment| *segment != 0) {
if end > 0 {
release = &release[..=end];
}
}
}
VersionSpecifier::from_version(operator, Version::new(release)).unwrap()
}
/// Returns the equivalent `python_full_version` specifier for a `python_version` specifier.
///
/// Returns `Err` with a constant node if the equivalent comparison is always `true` or `false`.
fn python_version_to_full_version(specifier: VersionSpecifier) -> Result<VersionSpecifier, NodeId> {
// Trailing zeroes matter only for (not-)equals-star and tilde-equals. Once those two cases are
// handled below, we can work with the trimmed release.
if specifier.operator().is_star() {
// Input python_version python_full_version
// ==3.* 3.* 3.*
// ==3.0.* 3.0 3.0.*
// ==3.0.0.* 3.0 3.0.*
// ==3.9.* 3.9 3.9.*
// ==3.9.0.* 3.9 3.9.*
// ==3.9.0.0.* 3.9 3.9.*
// ==3.9.1.* FALSE FALSE
// ==3.9.1.0.* FALSE FALSE
// ==3.9.1.0.0.* FALSE FALSE
return match &*specifier.version().release() {
// `3.*`
[_major] => Ok(specifier),
// Ex) `3.9.*`, `3.9.0.*`, or `3.9.0.0.*`
[major, minor, rest @ ..] if rest.iter().all(|x| *x == 0) => {
let python_version = Version::new([major, minor]);
// Unwrap safety: A star operator with two version segments is always valid.
Ok(VersionSpecifier::from_version(*specifier.operator(), python_version).unwrap())
}
// Ex) `3.9.1.*` or `3.9.0.1.*`
_ => Err(NodeId::FALSE),
};
}
if *specifier.operator() == Operator::TildeEqual {
// python_version python_full_version
// ~=3 (not possible)
// ~= 3.0 >= 3.0, < 4.0
// ~= 3.9 >= 3.9, < 4.0
// ~= 3.9.0 == 3.9.*
// ~= 3.9.1 FALSE
// ~= 3.9.0.0 == 3.9.*
// ~= 3.9.0.1 FALSE
return match &*specifier.version().release() {
// Ex) `3.0`, `3.7`
[_major, _minor] => Ok(specifier),
// Ex) `3.9.0` or `3.9.0.0`
[major, minor, rest @ ..] if rest.iter().all(|x| *x == 0) => {
let python_version = Version::new([major, minor]);
Ok(VersionSpecifier::equals_star_version(python_version))
}
// Ex) `3.9.1` or `3.9.0.1`
_ => Err(NodeId::FALSE),
};
}
// Extract the major and minor version segments if the specifier contains exactly
// those segments, or if it contains a major segment with an implied minor segment of `0`.
let major_minor = match *specifier.version().release() {
// For star operators, we cannot add a trailing `0`.
//
// `python_version == 3.*` is equivalent to `python_full_version == 3.*`. Adding a
// trailing `0` would result in `python_version == 3.0.*`, which is incorrect.
[_major] if specifier.operator().is_star() => return Ok(specifier),
let major_minor = match *specifier.version().only_release_trimmed().release() {
// Add a trailing `0` for the minor version, which is implied.
// For example, `python_version == 3` matches `3.0.1`, `3.0.2`, etc.
[major] => Some((major, 0)),
@ -1614,9 +1619,10 @@ fn python_version_to_full_version(specifier: VersionSpecifier) -> Result<Version
VersionSpecifier::less_than_version(Version::new([major, minor + 1]))
}
// `==3.7.*`, `!=3.7.*`, `~=3.7` already represent the equivalent `python_full_version`
// comparison.
Operator::EqualStar | Operator::NotEqualStar | Operator::TildeEqual => specifier,
Operator::EqualStar | Operator::NotEqualStar | Operator::TildeEqual => {
// Handled above.
unreachable!()
}
})
} else {
let [major, minor, ..] = *specifier.version().release() else {
@ -1624,13 +1630,14 @@ fn python_version_to_full_version(specifier: VersionSpecifier) -> Result<Version
};
Ok(match specifier.operator() {
// `python_version` cannot have more than two release segments, so equality is impossible.
Operator::Equal | Operator::ExactEqual | Operator::EqualStar | Operator::TildeEqual => {
// `python_version` cannot have more than two release segments, and we know that the
// segments beyond major.minor aren't all zeroes, so equality is impossible.
Operator::Equal | Operator::ExactEqual => {
return Err(NodeId::FALSE);
}
// Similarly, inequalities are always `true`.
Operator::NotEqual | Operator::NotEqualStar => return Err(NodeId::TRUE),
Operator::NotEqual => return Err(NodeId::TRUE),
// `python_version {<,<=} 3.7.8` is equivalent to `python_full_version < 3.8`.
Operator::LessThan | Operator::LessThanEqual => {
@ -1641,6 +1648,11 @@ fn python_version_to_full_version(specifier: VersionSpecifier) -> Result<Version
Operator::GreaterThan | Operator::GreaterThanEqual => {
VersionSpecifier::greater_than_equal_version(Version::new([major, minor + 1]))
}
Operator::EqualStar | Operator::NotEqualStar | Operator::TildeEqual => {
// Handled above.
unreachable!()
}
})
}
}

View file
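
The normalization above is easiest to see through the public marker API. A sketch, assuming `MarkerTree` is exported from `uv_pep508` and that `try_to_string` renders the simplified tree as in the tests further down:

```rust
use std::str::FromStr;

use uv_pep508::MarkerTree;

fn main() {
    // `python_version == '3.9'` covers every `3.9.x`, i.e. `python_full_version == '3.9.*'`.
    let tree = MarkerTree::from_str("python_version == '3.9'").unwrap();
    assert_eq!(tree.try_to_string().unwrap(), "python_full_version == '3.9.*'");

    // A non-zero third release segment can never match the truncated `python_version`.
    let tree = MarkerTree::from_str("python_version == '3.9.1'").unwrap();
    assert!(tree.is_false());
}
```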

@ -64,8 +64,8 @@ fn collect_dnf(
continue;
}
// Detect whether the range for this edge can be simplified as a star inequality.
if let Some(specifier) = star_range_inequality(&range) {
// Detect whether the range for this edge can be simplified as a star specifier.
if let Some(specifier) = star_range_specifier(&range) {
path.push(MarkerExpression::Version {
key: marker.key().into(),
specifier,
@ -343,22 +343,34 @@ where
Some(excluded)
}
/// Returns `Some` if the version expression can be simplified as a star inequality with the given
/// specifier.
/// Returns `Some` if the version range can be simplified as a star specifier.
///
/// For example, `python_full_version < '3.8' or python_full_version >= '3.9'` can be simplified to
/// `python_full_version != '3.8.*'`.
fn star_range_inequality(range: &Ranges<Version>) -> Option<VersionSpecifier> {
/// Only for the two bounds case not covered by [`VersionSpecifier::from_release_only_bounds`].
///
/// For negative ranges like `python_full_version < '3.8' or python_full_version >= '3.9'`,
/// returns `!= '3.8.*'`.
fn star_range_specifier(range: &Ranges<Version>) -> Option<VersionSpecifier> {
if range.iter().count() != 2 {
return None;
}
// Check for negative star range: two segments [(Unbounded, Excluded(v1)), (Included(v2), Unbounded)]
let (b1, b2) = range.iter().collect_tuple()?;
match (b1, b2) {
((Bound::Unbounded, Bound::Excluded(v1)), (Bound::Included(v2), Bound::Unbounded))
if v1.release().len() == 2
&& *v2.release() == [v1.release()[0], v1.release()[1] + 1] =>
{
Some(VersionSpecifier::not_equals_star_version(v1.clone()))
if let ((Bound::Unbounded, Bound::Excluded(v1)), (Bound::Included(v2), Bound::Unbounded)) =
(b1, b2)
{
match *v1.only_release_trimmed().release() {
[major] if *v2.release() == [major, 1] => {
Some(VersionSpecifier::not_equals_star_version(Version::new([
major, 0,
])))
}
[major, minor] if *v2.release() == [major, minor + 1] => {
Some(VersionSpecifier::not_equals_star_version(v1.clone()))
}
_ => None,
}
_ => None,
} else {
None
}
}

View file
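
The negative case handled by `star_range_specifier` above can be exercised end to end through the marker API as well (a sketch; the exact rendered string assumes the same formatting as the tests in this diff):

```rust
use std::str::FromStr;

use uv_pep508::MarkerTree;

fn main() {
    // A hole of exactly one minor version collapses to a not-equals-star specifier.
    let tree =
        MarkerTree::from_str("python_full_version < '3.8' or python_full_version >= '3.9'")
            .unwrap();
    assert_eq!(
        tree.try_to_string().unwrap(),
        "python_full_version != '3.8.*'"
    );
}
```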

@ -1707,23 +1707,15 @@ impl Display for MarkerTreeContents {
#[cfg(feature = "schemars")]
impl schemars::JsonSchema for MarkerTree {
fn schema_name() -> String {
"MarkerTree".to_string()
fn schema_name() -> Cow<'static, str> {
Cow::Borrowed("MarkerTree")
}
fn json_schema(_gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema {
schemars::schema::SchemaObject {
instance_type: Some(schemars::schema::InstanceType::String.into()),
metadata: Some(Box::new(schemars::schema::Metadata {
description: Some(
"A PEP 508-compliant marker expression, e.g., `sys_platform == 'Darwin'`"
.to_string(),
),
..schemars::schema::Metadata::default()
})),
..schemars::schema::SchemaObject::default()
}
.into()
fn json_schema(_generator: &mut schemars::generate::SchemaGenerator) -> schemars::Schema {
schemars::json_schema!({
"type": "string",
"description": "A PEP 508-compliant marker expression, e.g., `sys_platform == 'Darwin'`"
})
}
}
@ -2279,13 +2271,13 @@ mod test {
#[test]
fn test_marker_simplification() {
assert_false("python_version == '3.9.1'");
assert_false("python_version == '3.9.0.*'");
assert_true("python_version != '3.9.1'");
// Technically these are valid substring comparisons, but we do not allow them.
// e.g., using a version with patch components with `python_version` is considered
// impossible to satisfy since the value is truncated at the minor version
assert_false("python_version in '3.9.0'");
// This is an edge case that happens to be supported, but is not critical to support.
assert_simplifies(
"python_version in '3.9.0'",
"python_full_version == '3.9.*'",
);
// e.g., using a version that is not PEP 440 compliant is considered arbitrary
assert_true("python_version in 'foo'");
// e.g., including `*` versions, which would require tracking a version specifier
@ -2295,16 +2287,25 @@ mod test {
assert_true("python_version in '3.9,3.10'");
assert_true("python_version in '3.9 or 3.10'");
// e.g, when one of the values cannot be true
// TODO(zanieb): This seems like a quirk of the `python_full_version` normalization, this
// should just act as though the patch version isn't present
assert_false("python_version in '3.9 3.10.0 3.11'");
// This is an edge case that happens to be supported, but is not critical to support.
assert_simplifies(
"python_version in '3.9 3.10.0 3.11'",
"python_full_version >= '3.9' and python_full_version < '3.12'",
);
assert_simplifies("python_version == '3.9'", "python_full_version == '3.9.*'");
assert_simplifies(
"python_version == '3.9.0'",
"python_full_version == '3.9.*'",
);
assert_simplifies(
"python_version == '3.9.0.*'",
"python_full_version == '3.9.*'",
);
assert_simplifies(
"python_version == '3.*'",
"python_full_version >= '3' and python_full_version < '4'",
);
// `<version> in`
// e.g., when the range is not contiguous
@ -2515,7 +2516,7 @@ mod test {
#[test]
fn test_simplification_extra_versus_other() {
// Here, the `extra != 'foo'` cannot be simplified out, because
// `extra == 'foo'` can be true even when `extra == 'bar`' is true.
// `extra == 'foo'` can be true even when `extra == 'bar'` is true.
assert_simplifies(
r#"extra != "foo" and (extra == "bar" or extra == "baz")"#,
"(extra == 'bar' and extra != 'foo') or (extra == 'baz' and extra != 'foo')",
@ -2536,6 +2537,68 @@ mod test {
);
}
#[test]
fn test_python_version_equal_star() {
// Input, equivalent with python_version, equivalent with python_full_version
let cases = [
("3.*", "3.*", "3.*"),
("3.0.*", "3.0", "3.0.*"),
("3.0.0.*", "3.0", "3.0.*"),
("3.9.*", "3.9", "3.9.*"),
("3.9.0.*", "3.9", "3.9.*"),
("3.9.0.0.*", "3.9", "3.9.*"),
];
for (input, equal_python_version, equal_python_full_version) in cases {
assert_eq!(
m(&format!("python_version == '{input}'")),
m(&format!("python_version == '{equal_python_version}'")),
"{input} {equal_python_version}"
);
assert_eq!(
m(&format!("python_version == '{input}'")),
m(&format!(
"python_full_version == '{equal_python_full_version}'"
)),
"{input} {equal_python_full_version}"
);
}
let cases_false = ["3.9.1.*", "3.9.1.0.*", "3.9.1.0.0.*"];
for input in cases_false {
assert!(
m(&format!("python_version == '{input}'")).is_false(),
"{input}"
);
}
}
#[test]
fn test_tilde_equal_normalization() {
assert_eq!(
m("python_version ~= '3.10.0'"),
m("python_version >= '3.10.0' and python_version < '3.11.0'")
);
// `python_version` values only carry two release segments and get padded with a trailing zero,
// so a `~=` specifier with a non-zero patch version can never match
assert_eq!(m("python_version ~= '3.10.1'"), MarkerTree::FALSE);
assert_eq!(
m("python_version ~= '3.10'"),
m("python_version >= '3.10' and python_version < '4.0'")
);
assert_eq!(
m("python_full_version ~= '3.10.0'"),
m("python_full_version >= '3.10.0' and python_full_version < '3.11.0'")
);
assert_eq!(
m("python_full_version ~= '3.10'"),
m("python_full_version >= '3.10' and python_full_version < '4.0'")
);
}
/// This tests marker implication.
///
/// Specifically, these test cases come from a [bug] where `foo` and `bar`
@ -3332,4 +3395,32 @@ mod test {
]
);
}
/// Case a: There is no version `3` (no trailing zero) in the interner yet.
#[test]
fn marker_normalization_a() {
let left_tree = m("python_version == '3.0.*'");
let left = left_tree.try_to_string().unwrap();
let right = "python_full_version == '3.0.*'";
assert_eq!(left, right, "{left} != {right}");
}
/// Case b: There is already a version `3` (no trailing zero) in the interner.
#[test]
fn marker_normalization_b() {
m("python_version >= '3' and python_version <= '3.0'");
let left_tree = m("python_version == '3.0.*'");
let left = left_tree.try_to_string().unwrap();
let right = "python_full_version == '3.0.*'";
assert_eq!(left, right, "{left} != {right}");
}
#[test]
fn marker_normalization_c() {
let left_tree = MarkerTree::from_str("python_version == '3.10.0.*'").unwrap();
let left = left_tree.try_to_string().unwrap();
let right = "python_full_version == '3.10.*'";
assert_eq!(left, right, "{left} != {right}");
}
}

View file

@ -12,8 +12,8 @@ pub enum RequirementOrigin {
File(PathBuf),
/// The requirement was provided via a local project (e.g., a `pyproject.toml` file).
Project(PathBuf, PackageName),
/// The requirement was provided via a local project (e.g., a `pyproject.toml` file).
Group(PathBuf, PackageName, GroupName),
/// The requirement was provided via a local project's group (e.g., a `pyproject.toml` file).
Group(PathBuf, Option<PackageName>, GroupName),
/// The requirement was provided via a workspace.
Workspace,
}

View file

@ -18,11 +18,16 @@ use uv_redacted::DisplaySafeUrl;
use crate::Pep508Url;
/// A wrapper around [`Url`] that preserves the original string.
///
/// The original string is not preserved after serialization/deserialization.
#[derive(Debug, Clone, Eq)]
pub struct VerbatimUrl {
/// The parsed URL.
url: DisplaySafeUrl,
/// The URL as it was provided by the user.
///
/// Even if originally set, this will be [`None`] after
/// serialization/deserialization.
given: Option<ArcStr>,
}
@ -106,10 +111,8 @@ impl VerbatimUrl {
let (path, fragment) = split_fragment(&path);
// Convert to a URL.
let mut url = DisplaySafeUrl::from(
Url::from_file_path(path.clone())
.unwrap_or_else(|()| panic!("path is absolute: {}", path.display())),
);
let mut url = DisplaySafeUrl::from_file_path(path.clone())
.unwrap_or_else(|()| panic!("path is absolute: {}", path.display()));
// Set the fragment, if it exists.
if let Some(fragment) = fragment {
@ -168,6 +171,11 @@ impl VerbatimUrl {
&self.url
}
/// Return a mutable reference to the underlying [`DisplaySafeUrl`].
pub fn raw_mut(&mut self) -> &mut DisplaySafeUrl {
&mut self.url
}
/// Convert a [`VerbatimUrl`] into a [`DisplaySafeUrl`].
pub fn to_url(&self) -> DisplaySafeUrl {
self.url.clone()

View file

@ -19,9 +19,9 @@ checksum = "b5aba8db14291edd000dfcc4d620c7ebfb122c613afb886ca8803fa4e128a20a"
[[package]]
name = "libmimalloc-sys"
version = "0.1.42"
version = "0.1.43"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ec9d6fac27761dabcd4ee73571cdb06b7022dc99089acbe5435691edffaac0f4"
checksum = "bf88cd67e9de251c1781dbe2f641a1a3ad66eaae831b8a2c38fbdc5ddae16d4d"
dependencies = [
"cc",
"libc",
@ -29,9 +29,9 @@ dependencies = [
[[package]]
name = "mimalloc"
version = "0.1.46"
version = "0.1.47"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "995942f432bbb4822a7e9c3faa87a695185b0d09273ba85f097b54f4e458f2af"
checksum = "b1791cbe101e95af5764f06f20f6760521f7158f69dbf9d6baf941ee1bf6bc40"
dependencies = [
"libmimalloc-sys",
]

View file

@ -43,6 +43,7 @@ pub enum Os {
Manylinux { major: u16, minor: u16 },
Musllinux { major: u16, minor: u16 },
Windows,
Pyodide { major: u16, minor: u16 },
Macos { major: u16, minor: u16 },
FreeBsd { release: String },
NetBsd { release: String },
@ -67,6 +68,7 @@ impl fmt::Display for Os {
Self::Illumos { .. } => write!(f, "illumos"),
Self::Haiku { .. } => write!(f, "haiku"),
Self::Android { .. } => write!(f, "android"),
Self::Pyodide { .. } => write!(f, "pyodide"),
}
}
}
@ -109,6 +111,7 @@ pub enum Arch {
S390X,
LoongArch64,
Riscv64,
Wasm32,
}
impl fmt::Display for Arch {
@ -126,6 +129,7 @@ impl fmt::Display for Arch {
Self::S390X => write!(f, "s390x"),
Self::LoongArch64 => write!(f, "loongarch64"),
Self::Riscv64 => write!(f, "riscv64"),
Self::Wasm32 => write!(f, "wasm32"),
}
}
}
@ -168,7 +172,7 @@ impl Arch {
// manylinux_2_36
Self::LoongArch64 => Some(36),
// unsupported
Self::Powerpc | Self::Armv5TEL | Self::Armv6L => None,
Self::Powerpc | Self::Armv5TEL | Self::Armv6L | Self::Wasm32 => None,
}
}
@ -187,6 +191,7 @@ impl Arch {
Self::S390X => "s390x",
Self::LoongArch64 => "loongarch64",
Self::Riscv64 => "riscv64",
Self::Wasm32 => "wasm32",
}
}

View file

@ -71,6 +71,8 @@ pub enum PlatformTag {
Illumos { release_arch: SmallString },
/// Ex) `solaris_11_4_x86_64`
Solaris { release_arch: SmallString },
/// Ex) `pyodide_2024_0_wasm32`
Pyodide { major: u16, minor: u16 },
}
impl PlatformTag {
@ -97,6 +99,7 @@ impl PlatformTag {
PlatformTag::Haiku { .. } => Some("Haiku"),
PlatformTag::Illumos { .. } => Some("Illumos"),
PlatformTag::Solaris { .. } => Some("Solaris"),
PlatformTag::Pyodide { .. } => Some("Pyodide"),
}
}
}
@ -262,6 +265,7 @@ impl std::fmt::Display for PlatformTag {
Self::Haiku { release_arch } => write!(f, "haiku_{release_arch}"),
Self::Illumos { release_arch } => write!(f, "illumos_{release_arch}"),
Self::Solaris { release_arch } => write!(f, "solaris_{release_arch}_64bit"),
Self::Pyodide { major, minor } => write!(f, "pyodide_{major}_{minor}_wasm32"),
}
}
}
@ -616,6 +620,35 @@ impl FromStr for PlatformTag {
});
}
if let Some(rest) = s.strip_prefix("pyodide_") {
let mid =
rest.strip_suffix("_wasm32")
.ok_or_else(|| ParsePlatformTagError::InvalidArch {
platform: "pyodide",
tag: s.to_string(),
})?;
let underscore = memchr::memchr(b'_', mid.as_bytes()).ok_or_else(|| {
ParsePlatformTagError::InvalidFormat {
platform: "pyodide",
tag: s.to_string(),
}
})?;
let major: u16 = mid[..underscore].parse().map_err(|_| {
ParsePlatformTagError::InvalidMajorVersion {
platform: "pyodide",
tag: s.to_string(),
}
})?;
let minor: u16 = mid[underscore + 1..].parse().map_err(|_| {
ParsePlatformTagError::InvalidMinorVersion {
platform: "pyodide",
tag: s.to_string(),
}
})?;
return Ok(Self::Pyodide { major, minor });
}
Err(ParsePlatformTagError::UnknownFormat(s.to_string()))
}
}
@ -900,6 +933,27 @@ mod tests {
);
}
#[test]
fn pyodide_platform() {
let tag = PlatformTag::Pyodide {
major: 2024,
minor: 0,
};
assert_eq!(
PlatformTag::from_str("pyodide_2024_0_wasm32").as_ref(),
Ok(&tag)
);
assert_eq!(tag.to_string(), "pyodide_2024_0_wasm32");
assert_eq!(
PlatformTag::from_str("pyodide_2024_0_wasm64"),
Err(ParsePlatformTagError::InvalidArch {
platform: "pyodide",
tag: "pyodide_2024_0_wasm64".to_string()
})
);
}
#[test]
fn unknown_platform() {
assert_eq!(

Some files were not shown because too many files have changed in this diff.