This is a minor documentation update to a recently added section
"Package priority" in the pip compatibility guide. The aim of this PR is
clear up two things which I think the current paragraph implies but I
don't think are (always) true:
1. That pip doesn't use provided order to prioritize resolution
2. That uv relies solely on provided order to prioritize resolution
What is true, at least for now, is that pip has more heuristics than uv for
prioritizing packages during resolution, so I've tried to rework the paragraph
to make it clear why changing the order might lead uv to a different
resolution, whereas for pip it might not make a difference.
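For illustration only (the package names and constraints below are made up, not taken from the guide), swapping the order of root requirements can change which package uv prioritizes first during resolution, while pip's additional heuristics may produce the same result either way:
```
$ printf 'torch\nnumpy<2\n' > requirements.in
$ uv pip compile requirements.in   # uv considers torch's constraints first
$ printf 'numpy<2\ntorch\n' > requirements.in
$ uv pip compile requirements.in   # numpy<2 is now prioritized, which may change the result
```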
As described in #4242, we're currently incorrectly downloading glibc
python-build-standalone builds on musl targets, but we also can't fix this by
using musl python-build-standalone builds on musl targets, since the musl
builds are effectively broken.
We reintroduce the libc detection previously removed in #2381, using it
to detect which libc is the current one before we have a Python
interpreter. I changed the strategy a bit to support an empty `PATH`,
which we use in the tests.
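As a rough sketch (this is not the exact logic used in the PR), one way to tell which libc a Linux host uses, without needing a Python interpreter, is to check for musl's dynamic loader:
```
# musl systems ship a loader named /lib/ld-musl-<arch>.so.1; glibc systems do not.
if ls /lib/ld-musl-*.so.1 >/dev/null 2>&1; then
  echo "musl"
else
  echo "glibc (or unknown)"
fi
```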
For simplicity, I've decided to just filter out the musl
python-build-standalone archives from the list of available archives,
given this is temporary. This means we show the same error message as if
we don't have a build for the platform. We could also add a dedicated
error message for musl.
Fixes #4242
## Test Plan
Tested manually.
On my Ubuntu host, Python downloads continue to pass:
```
target/x86_64-unknown-linux-musl/debug/uv python install
```
On Alpine, we fail:
```
$ docker run -it --rm -v .:/io alpine /io/target/x86_64-unknown-linux-musl/debug/uv python install
Searching for Python installations
error: No download found for request: cpython-any-linux-x86_64-musl
```
## Summary
We now respect the `environments` field in `uv pip compile --universal`,
e.g.:
```toml
[tool.uv]
environments = ["platform_system == 'Emscripten'"]
```
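For example (the output file name here is illustrative), the declared environments are now taken into account during a universal resolution of that `pyproject.toml`:
```
$ uv pip compile --universal pyproject.toml -o requirements.txt
```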
Closes https://github.com/astral-sh/uv/issues/6641.
## Summary
The following Dockerfile command fails:
```
[...]
RUN --mount=from=uv,source=/uv,target=/bin/uv \
cd /opt/opencti-connector-webhook && \
uv pip install --system -r requirements.txt && \
apk del git build-base
[...]
```
Result:
```
yo@opencti:~/connectors/stream/webhook$ docker build -t opencti/connector-webhook:d .
[+] Building 1.0s (3/3) FINISHED docker:default
=> [internal] load build definition from Dockerfile 0.1s
=> => transferring dockerfile: 557B 0.1s
=> ERROR [internal] load metadata for docker.io/library/uv:latest 0.8s
=> [internal] load metadata for docker.io/library/python:3.11-alpine 0.8s
------
> [internal] load metadata for docker.io/library/uv:latest:
------
ERROR: failed to solve: uv: failed to resolve source metadata for docker.io/library/uv:latest: pull access denied, repository does not exist or may require authorization: server message: insufficient_scope: authorization failed
```
Fix:
```
[...]
RUN --mount=from=ghcr.io/astral-sh/uv,source=/uv,target=/bin/uv \
cd /opt/opencti-connector-webhook && \
uv pip install --system -r requirements.txt && \
apk del git build-base
[...]
```
## Test Plan
```
$ docker --version
Docker version 26.0.0, build 2ae903e
$ date
Mon Aug 26 20:31:53 UTC 2024
$ docker build -t opencti/connector-webhook:e .
[+] Building 41.8s (13/13) FINISHED docker:default
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 587B 0.0s
=> [internal] load metadata for ghcr.io/astral-sh/uv:latest 0.5s
=> [internal] load metadata for docker.io/library/python:3.11-alpine 0.5s
=> [internal] load .dockerignore 0.1s
=> => transferring context: 2B 0.0s
=> [stage-0 1/6] FROM docker.io/library/python:3.11-alpine@sha256:700b4aa84090748aafb348fc042b5970abb0a73c8f1b4fcfe0f4e3c2a4a9fcca 0.0s
=> [internal] load build context 0.1s
=> => transferring context: 130B 0.0s
=> CACHED FROM ghcr.io/astral-sh/uv:latest@sha256:f6b18f4a7408c5244374b00c8832089258d130f7a77a38807348072e714ffa0c 0.0s
=> CACHED [stage-0 2/6] COPY src /opt/opencti-connector-webhook 0.0s
=> CACHED [stage-0 3/6] RUN apk --no-cache add git build-base libmagic libffi-dev libxml2-dev libxslt-dev 0.0s
=> [stage-0 4/6] RUN --mount=from=ghcr.io/astral-sh/uv,source=/uv,target=/bin/uv cd /opt/opencti-connector-webhook && uv pip install --system -r requirements.txt 38.3s
=> [stage-0 5/6] COPY entrypoint.sh / 0.1s
=> [stage-0 6/6] RUN chmod +x /entrypoint.sh 0.8s
=> exporting to image 1.7s
=> => exporting layers 1.6s
=> => writing image sha256:aa6810f883d104c838f35e848c0d7d8b4df5c7c3929f18a88b7139d0ec892a0b 0.0s
=> => naming to docker.io/opencti/connector-webhook:e 0.0s
```
## Summary
This is similar to https://github.com/astral-sh/uv/pull/6171 but more
expansive... _Anywhere_ that we test requirements for platform
compatibility, we _need_ to respect the resolver-friendly markers. In
fixing the motivating issue (#6621), I also realized that we had a bunch
of bugs here around `pip install` with `--python-platform` and
`--python-version`, because we always performed our `satisfy` and `Plan`
operations on the interpreter's markers, not the adjusted markers!
Closes https://github.com/astral-sh/uv/issues/6621.
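As an illustration of the affected flags (the package name is just an example), an install targeting another platform or Python version now checks compatibility against the adjusted markers rather than the current interpreter's:
```
$ uv pip install --python-platform linux --python-version 3.12 numpy
```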
For various reasons, I have a preference for out-of-tree virtual
environments. Things just work if I symlink, but I don't know that this
is guaranteed, so I thought I'd add a test for it. It looks like there's
another code path that matters (`FoundInterpreter::discover ->
PythonEnvironment::from_root`) for the higher-level commands, but I
couldn't spot a good place to test that.
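Roughly the setup being exercised (the paths are made up for the example): create the environment out of tree and symlink it into the project as `.venv`:
```
$ uv venv ~/venvs/my-project
$ ln -s ~/venvs/my-project .venv
$ uv pip install anyio   # the environment is resolved through the symlink
```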
Related discussion:
https://github.com/astral-sh/uv/issues/1495#issuecomment-1950442191 /
https://github.com/astral-sh/uv/issues/1578#issuecomment-1949911871
## Summary
This changes the behavior of the per-dependency build-isolation
override a bit: if the dist name is known, it is now passed into the
`SourceBuild::Setup` function. This allows the override to work for
projects without a `pyproject.toml`, like `detectron2`, by using the
specified requirement name. Previously, only the `pyproject.toml` name
could be used, which these projects lack. An example use case is
given in the *Test Plan* section.
Additionally, `no_build_isolation_package` has been added to
`InstallerSettingsRef` and is now used in `sync` and other commands, as this
was not done yet.
This is useful if you want to **non**-isolate a single package, even
ones without a proper `pyproject.toml`.
## Test Plan
With the following `pyproject.toml`:
```toml
[project]
name = "detectron-uv"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.12"
dependencies = [
"detectron2",
"setuptools",
"torch",
]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.uv.sources]
detectron2 = { git = "https://github.com/facebookresearch/detectron2", rev = "bcfd464d0c810f0442d91a349c0f6df945467143" }
[tool.uv]
no-build-isolation-package = ["detectron2"]
```
The package `detectron2` is now correctly **non**-isolated. Before,
because the logic depended on getting the name from the
`pyproject.toml`, which `detectron2` lacks, you would get a message
that the source could not be built. This was because the build would
still be *isolated* in that case.
With these changes you can now install using (given that you are inside
a workspace with a venv):
```
uv pip install torch setuptools
uv sync
```
This would previously fail with something like:
```
error: Failed to prepare distributions
Caused by: Failed to fetch wheel: detectron2 @ git+https://github.com/facebookresearch/detectron2@bcfd464d0c810f0442d91a349c0f6df945467143
Caused by: Build backend failed to determine extra requires with `build_wheel()` with exit status: 1
--- stdout:
--- stderr:
Traceback (most recent call last):
File "<string>", line 14, in <module>
File "/Users/tdejager/Library/Caches/uv/builds-v0/.tmptloDcZ/lib/python3.12/site-packages/setuptools/build_meta.py", line 332, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=[])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/tdejager/Library/Caches/uv/builds-v0/.tmptloDcZ/lib/python3.12/site-packages/setuptools/build_meta.py", line 302, in _get_build_requires
self.run_setup()
File "/Users/tdejager/Library/Caches/uv/builds-v0/.tmptloDcZ/lib/python3.12/site-packages/setuptools/build_meta.py", line 502, in run_setup
super().run_setup(setup_script=setup_script)
File "/Users/tdejager/Library/Caches/uv/builds-v0/.tmptloDcZ/lib/python3.12/site-packages/setuptools/build_meta.py", line 318, in run_setup
exec(code, locals())
File "<string>", line 10, in <module>
ModuleNotFoundError: No module named 'torch'
---
Caused by: This error likely indicates that detectron2 @ git+https://github.com/facebookresearch/detectron2@bcfd464d0c810f0442d91a349c0f6df945467143 depends on torch, but doesn't declare it as a build dependency. If detectron2 @ git+https://github.com/facebookresearch/detectron2@bcfd464d0c810f0442d91a349c0f6df945467143 is a first-party package, consider adding torch to its `build-system.requires`. Otherwise, `uv pip install torch` into the environment and re-run with `--no-build-isolation`.
```
**Edit**:
Fixed some wording; I had used *isolated* where it should be **non**-isolated.
## Summary
Fixes: #6615
Currently, some packages are not installable with `uv`, like `ziglang`
on Linux.
Everything is described in the issue! 😄
## Test Plan
I added a unit test for the problematic use case.
I also checked that the existing unit tests still pass, in order to
ensure backward compatibility.
## Summary
This was an oversight. The existing test was (correctly) failing, but
for the wrong reason (failing to build the package during _resolution_).
Updates the snapshot for the deployment from
https://github.com/astral-sh/pypi-proxy/pull/9. For a while now, we've
only been failing on file requests, not registry requests, because the
proxy auth was set up incorrectly.
As a non-shell-wizard, I was unfamiliar with the `EOF` syntax used in
the existing example (just above the one I added). I thought including
an example where the output of `echo` is piped to `uv run` might be more
accessible. As a bonus, it should work across more shells: the `EOF`
example doesn't work in fish because fish [doesn't support
heredocs](https://fishshell.com/docs/current/fish_for_bash_users.html#heredocs),
while the `echo` example does.
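Something along these lines (the exact snippet in the docs may differ):
```
$ echo 'print("hello world")' | uv run -
```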
Feel free to ignore if unwanted.
For users who were previously using absolute paths in the `pyproject.toml`,
this is a behavior change: we now convert all absolute paths
in `path` entries to relative paths. Since I assume that no one relies
on absolute paths in their lockfiles (they are intended to be portable),
I'm tagging this as a bugfix.
Closes https://github.com/astral-sh/uv/pull/6438
Fixes https://github.com/astral-sh/uv/issues/6371
## Summary
This PR parallelizes multi-platform builds using multiple workers (hence
the new docker-build / docker-publish jobs); this seems to save about 8
minutes.
This is partial work extracted from
https://github.com/astral-sh/uv/pull/6053 that is standalone.