The script stumbled over a newline introduced in
https://github.com/pypi/warehouse/pull/18266 (which is valid).
Also fixed: Don't read versions for the same package from other indexes.
We were using `project_name` here instead of `target`; using the latter and
reading from a single index also simplifies the code.
This PR provides a script that uses environment variables to determine
which registries to test. It is used to run automated registry tests in
CI for AWS, Azure, GCP, Artifactory, GitLab, Cloudsmith, and Gemfury.
You must configure the following env vars for each registry:
```
UV_TEST_<registry_name>_URL URL for the registry
UV_TEST_<registry_name>_TOKEN authentication token
UV_TEST_<registry_name>_PKG private package to install
```
The username defaults to `__token__` but can optionally be set with:
```
UV_TEST_<registry_name>_USERNAME
```
For each configured registry, the test will attempt to install the
specified package. Some registries can fall back to PyPI internally, so
it's important to choose a package that only exists in the registry you
are testing.
Currently, a test is considered successful if the line ` + <package_name>`
appears in the output. Since we don't know ahead of time which package is
being installed, we can't assert on an exact expected output. The advantage
is that anyone can run this locally, though they would need access to the
registries they want to test.
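As a sketch of that check (a hypothetical helper, not the script's exact code; credentials are embedded in the index URL for brevity):
```python
import os
import subprocess


def install_succeeded(registry: str) -> bool:
    """Try installing the configured private package from one registry.

    `registry` is the `<registry_name>` part of the env var scheme above.
    """
    url = os.environ[f"UV_TEST_{registry}_URL"]
    token = os.environ[f"UV_TEST_{registry}_TOKEN"]
    package = os.environ[f"UV_TEST_{registry}_PKG"]
    username = os.environ.get(f"UV_TEST_{registry}_USERNAME", "__token__")
    # Embed the credentials in the URL so uv picks them up.
    authed_url = url.replace("https://", f"https://{username}:{token}@", 1)
    result = subprocess.run(
        ["uv", "pip", "install", "--index-url", authed_url, package],
        capture_output=True,
        text=True,
    )
    # The success criterion described above: uv printed " + <package_name>".
    return f" + {package}" in (result.stdout + result.stderr)
```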
You can also use the `--use-op` command line argument to derive these
test env vars from a 1Password vault (default is "RegistryTests" but can
be configured with `--op-vault`). It will look at all items in the vault
with names following the pattern `UV_TEST_<registry_name>` and will
derive the env vars as follows:
```
`UV_TEST_<registry_name>_USERNAME` from the `username` field
`UV_TEST_<registry_name>_TOKEN` from the `password` field
`UV_TEST_<registry_name>_URL` from a field with the label `url`
`UV_TEST_<registry_name>_PKG` from a field with the label `pkg`
```
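A rough sketch of that derivation (assuming the 1Password CLI v2 `op item get … --format json` interface; error handling omitted):
```python
import json
import os
import subprocess


def load_registry_env(item: str, vault: str = "RegistryTests") -> None:
    """Populate UV_TEST_* env vars from a 1Password item, e.g. `UV_TEST_FOO`."""
    raw = subprocess.check_output(
        ["op", "item", "get", item, "--vault", vault, "--format", "json"]
    )
    fields = {f.get("label"): f.get("value") for f in json.loads(raw)["fields"]}
    os.environ[f"{item}_USERNAME"] = fields.get("username") or "__token__"
    os.environ[f"{item}_TOKEN"] = fields["password"]
    os.environ[f"{item}_URL"] = fields["url"]
    os.environ[f"{item}_PKG"] = fields["pkg"]
```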
For the case where there is no matching wheel on sync, we previously
added a note about which wheels are available versus which platform you
are on. We extend this error message to link directly to
`tool.uv.required-environments`, which otherwise has a discoverability
problem.
On Linux (setting `tool.uv.required-environments` doesn't actually help in
this case, but it's a clear example):
```
[project]
name = "debug"
version = "0.1.0"
requires-python = "==3.10.*"
dependencies = ["tensorflow-macos>=2.13.1"]
```
```
Resolved 41 packages in 24ms
error: Distribution `tensorflow-macos==2.16.2 @ registry+https://pypi.org/simple` can't be installed because it doesn't have a source distribution or wheel for the current platform
hint: You're on Linux (`manylinux_2_39_x86_64`), but there are no wheels for the current platform, consider configuring `tool.uv.required-environments`.
hint: `tensorflow-macos` (v2.16.2) only has wheels for the following platform: `macosx_12_0_arm64`.
```

---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
## Summary
This adds GraalPy download metadata so that `uv python install graalpy`
works. See https://github.com/astral-sh/uv/issues/13114
## Test Plan
The existing integration test was changed to test this functionality.
Some registries (like Azure Artifacts) can require you to authenticate
separately for every package URL if you do not authenticate for the
/simple endpoint. These changes make the auth middleware aware of index
URL endpoints and attempt to fetch keyring credentials for such an index
URL when making a request to any URL that it's a prefix of.
The current uv behavior is to cache credentials either at the request
URL or realm level. But with these changes, we also need to cache
credentials at the index level. Note that when uv does not detect an
index URL for a request URL, it will continue to apply the old behavior.
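Conceptually, the index detection is a URL-prefix check; a minimal Python sketch (the middleware itself is Rust):
```python
def index_for(request_url: str, index_urls: list[str]) -> str | None:
    """Find the configured index URL that a request URL falls under, if any."""
    for index in index_urls:
        prefix = index if index.endswith("/") else index + "/"
        if request_url == index or request_url.startswith(prefix):
            # Credentials cached or keyring-fetched for `index` are reused here.
            return index
    # No matching index: fall back to the old request-URL/realm behavior.
    return None
```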
Addresses part of #4056. Closes #4583. Closes #11236. Closes #11391. Closes #11507.
I wanted to consolidate these anyway, and apparently it's a huge pain to
make a Windows task fail early via GitHub's PowerShell setup, so I
implemented this in Python instead.
Previously, we required a username to perform a fetch from the keyring
because the `keyring` CLI only supported fetching a password for a given
service and username. Unfortunately, this is different from the keyring
Python API which supported fetching a username _and_ password for a
given service. We can't (easily) use the Python API because we don't
expect `keyring` to be installed in a specific environment during
network requests. This means that we did not have parity with `pip`.
Way back in https://github.com/jaraco/keyring/pull/678 we got a `--mode
creds` flag added to `keyring`'s CLI which supports parity with the
Python API. Since `keyring` is expensive to invoke and we cannot be
certain that users are on the latest version of keyring, we've not added
support for invoking keyring with this flag. However, now that we have a
mode that says authentication is _required_ for an index (#11896), we
might as well _try_ to invoke keyring with `--mode creds` when there is
no username. This will address use-cases where the username is
non-constant and move us closer to `pip` parity.
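Conceptually, the fallback invocation looks like this (a sketch; the output shape is an assumption, and uv's real code is Rust):
```python
import subprocess


def keyring_creds(service_url: str) -> tuple[str, str] | None:
    """Try `keyring --mode creds`; older keyring versions reject the flag."""
    try:
        output = subprocess.check_output(
            ["keyring", "--mode", "creds", "get", service_url],
            text=True,
            stderr=subprocess.DEVNULL,
        )
    except (FileNotFoundError, subprocess.CalledProcessError):
        return None  # keyring missing, too old for --mode, or no credentials.
    lines = output.splitlines()
    if len(lines) < 2:
        return None
    # Assumed output shape: username on the first line, password on the second.
    return lines[0], lines[1]
```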
I think it's important for the breaking changes to be at the _top_ of
the file instead of the bottom.
Now that it's not being rendered by GitHub's Releases markdown, we can
remove the prettier ignores.
In the publish client, we have to set the client retries to 0, as the
retry middleware is incompatible with upload bodies. However, this also
sets `client.retry_policy()` to a zero-retry policy, so we need to
construct our own policy.
Fixes #12027
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
uv itself is a large package with many dependencies and lots of
features. To build a package using the uv build backend, you shouldn't
have to download and install the entirety of uv. For platforms where we
don't provide wheels, it should be possible and fast to compile the uv
build backend. To that end, we're introducing a Python package that
contains a trimmed-down version of uv with only the build backend and a
minimal dependency tree in Rust.
The `uv_build` package is published from CI just like uv itself. It is
part of the workspace, but its own binary has far fewer dependencies.
We're using `cargo deny` to enforce that the network stack is not part
of the dependencies. A new build profile ensures we get the minimum
possible binary size for a Rust binary.
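A consumer would then declare the slim package in its build requirements instead of uv itself; roughly (version bounds illustrative):
```toml
[build-system]
requires = ["uv_build>=0.6,<0.7"]
build-backend = "uv_build"
```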
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
## Summary
To stay within quota, the top-pypi-packages dataset now has just under 30
days of data, so the filename has been updated. Both filenames will be
available for a while. See
https://github.com/hugovk/top-pypi-packages/pull/46.
## Test Plan
```sh
curl https://hugovk.github.io/top-pypi-packages/top-pypi-packages-30-days.min.json | jq -r
curl https://hugovk.github.io/top-pypi-packages/top-pypi-packages.min.json | jq -r ".rows | .
diff pypi_8k_downloads_original.txt pypi_8k_downloads_new.txt
```
Fixes #11793
On Windows, trying to read a file inside something that is not a directory
but another file results in a "not found" error, while on Unix we get a
"not a directory" error. We now explicitly check whether something included
in a workspace glob is a non-directory to fix the behavior on Windows.
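The difference is easy to reproduce; a minimal sketch:
```python
from pathlib import Path

Path("file.txt").write_text("a regular file, not a directory")
try:
    # Read "through" the regular file, as a workspace glob might.
    Path("file.txt/pyproject.toml").read_text()
except NotADirectoryError:
    print("Unix: ENOTDIR")
except FileNotFoundError:
    print("Windows: reported as not found")
```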
First of all, I want to test automatic managed installs (see #10913) and
need to set that up. Second of all, some tests were _implicitly_
downloading interpreters instead of using the one from their context —
which is unexpected and naughty and very slow.
## Summary
In preview mode on Windows, register and unregister the managed python-build-standalone installations in the Windows registry, following PEP 514.
We write the values defined in the PEP plus the download URL and hash. We add an entry when installing a version, remove the entry when uninstalling, and remove all values when uninstalling with `--all`. We update entries only by overwriting existing values; there is no "syncing" involved.
Since they are not official builds, the python-build-standalone installations get a company prefix: `py -V:Astral/CPython3.13.1` works, while `py -3.13` doesn't.
```
$ py --list-paths
-V:3.12 * C:\Users\Konsti\AppData\Local\Programs\Python\Python312\python.exe
-V:3.11.9 C:\Users\Konsti\.pyenv\pyenv-win\versions\3.11.9\python.exe
-V:3.11 C:\Users\micro\AppData\Local\Programs\Python\Python311\python.exe
-V:3.8 C:\Users\micro\AppData\Local\Programs\Python\Python38\python.exe
-V:Astral/CPython3.13.1 C:\Users\Konsti\AppData\Roaming\uv\data\python\cpython-3.13.1-windows-x86_64-none\python.exe
```
Registry errors are reported but not fatal, except for operations on the company key, since it's not bound to any specific Python interpreter.
On uninstallation, we prune registry entries that have no matching Python installation (i.e. broken entries).
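For reference, reading one of these PEP 514 entries back (a sketch using the standard library's `winreg`; the key path follows the example above):
```python
import winreg

KEY = r"Software\Python\Astral\CPython3.13.1\InstallPath"
with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY) as key:
    # `ExecutablePath` is one of the values PEP 514 defines.
    executable, _kind = winreg.QueryValueEx(key, "ExecutablePath")
print(executable)
```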
The code uses the official `windows-registry` crate rather than the `winreg` crate.
Best reviewed commit-by-commit.
## Test Plan
We're reusing an existing system check to test different (un)installation scenarios.
## Summary
After we resolve, we filter out any wheels that aren't applicable for
the target platforms. So, e.g., we remove macOS wheels if we find that
the user only asked to solve for Windows.
This PR extends the same logic to architectures, so that we filter out
ARM-only wheels when the user is only solving for x86, etc.
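A minimal sketch of the idea (real platform-tag handling is much more involved):
```python
def keep_wheel(wheel_tags: set[str], target_tags: set[str]) -> bool:
    """Keep a wheel only if one of its platform tags could match a target."""
    return "any" in wheel_tags or bool(wheel_tags & target_tags)
```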
Closes #10571.
## Summary
Adds a regular-expression-based version filter to the Python mirror script.
## Test Plan
Manually using `uv run ./scripts/create-python-mirror.py --name cpython
--arch x86_64 --os linux --version "3.13.\d+$"`
## Summary
The architecture details in `crates/uv-python/download-metadata.json` is
now a dictionary with family and variant data, whereas it used to be a
string. This patches the architecture filter in
`scripts/create-python-mirror.py` to support both scenarios.
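Illustratively (key names per the description above; shown as Python literals):
```python
# Old schema: a plain string.
arch = "x86_64"
# New schema: a dictionary with family and variant data.
arch = {"family": "x86_64", "variant": None}
```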
## Test Plan
Tested manually using `uv run ./scripts/create-python-mirror.py --name
cpython --arch x86_64 --os linux --from-all-history`
## Summary
A revival of an old idea (#9344) that I have slightly more confidence in
now. I abandoned this idea because (1) it couldn't capture that, e.g.,
`platform_system == 'Windows' and sys_platform == 'foo'` (or some other
unknown value) are disjoint, and (2) I thought that Android returned
`"android"` for one of `sys_platform` or `platform_system`, which
would've made this logic incorrect.
However, it looks like Android... doesn't do that? And the values here
are almost always in a small, known set. So in the end, the tradeoffs
here actually seem pretty good.
Vis-a-vis our current solution, this can (e.g.) _simplify out_
expressions like `sys_platform == 'win32' or platform_system ==
'Windows'`.
Make the local packse workflow work again:
```
# In packse:
uv run --extra index --extra serve packse serve --no-hash scenarios &
# In uv:
UV_TEST_INDEX_URL="http://localhost:3141/simple/" ./scripts/scenarios/generate.py
```
Bugs fixed:
* The default scenario pattern didn't match anything.
* The snapshot update test command had been wrong since the tests were
centralized.
* Snapshot update failures were not reported.
When publishing, we currently ask the user to set `--publish-url` to the
upload URL and `--check-url` to the simple index URL, or the equivalent
configuration keys. But that's redundant with the `[[tool.uv.index]]`
declaration. Instead, we extend `[[tool.uv.index]]` with a `publish-url`
entry and allow passing `uv publish --index <name>`.
`uv publish --index <name>` requires the `pyproject.toml` to be present
when publishing, unlike using `--publish-url ... --check-url ...` which
can be used e.g. in CI without a checkout step. `--index` also always
uses the check URL feature to aid upload consistency.
The documentation tries to explain both approaches together, which
overlap for the check URL feature.
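For example (index name and URLs are placeholders):
```toml
[[tool.uv.index]]
name = "internal"
url = "https://example.com/simple/"
publish-url = "https://example.com/upload/"
```
With that in place, `uv publish --index internal` uploads to the `publish-url` and checks against the `url`.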
Fixes #8864
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
## Summary
I added `crates/uv-python/create-mirror.py` to make it easy to download
all the needed files to create a mirror for Python distributions in an
offline environment.
The script also has an option to iterate over the git history of
`download-metadata.json` to make sure we have all the files needed for
all uv versions.
## Test Plan
```
uv run create-mirror.py --from-all-history --os linux --arch x86_64 --name cpython
2024-10-25 01:31:12,973 - INFO - Starting download of 466 files.
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 466/466 [06:11<00:00, 1.26file/s]
Successfully downloaded: 466
now you can run UV_PYTHON_INSTALL_MIRROR='file:///home/meitar/dev/uv/crates/uv-python/mirror' uv python install
```
Then I checked (the `unshare` command makes sure that the process doesn't
have any network access):
```
UV_PYTHON_INSTALL_MIRROR=file:///home/meitar/dev/uv/crates/uv-python/mirror sudo -E unshare -n /home/meitar/.local/bin/uv python install 3.13
Searching for Python versions matching: Python 3.13
Installed Python 3.13.0 in 2.91s
+ cpython-3.13.0-linux-x86_64-gnu
```
---------
Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
When adding excludes, we usually don't want to include Python cache
files; in fact, I haven't seen any project in my ecosystem research that
would want any of `__pycache__`, `*.pyc`, or `*.pyo` to be included. By
moving them behind a `default-excludes` toggle, they remain active even
when custom excludes are defined, but can be deactivated if the user so
chooses.
With includes and excludes being this small again, we can roll back the
anchoring difference between includes and excludes and always use
anchored globs (i.e., you would need to use `**/build-*.h` below).
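Deactivating the defaults would look roughly like this (the boolean shape is an assumption based on the toggle described above):
```toml
[tool.uv.build-backend]
default-excludes = false
```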
An example `pyproject.toml` with custom settings after this change:
```toml
[project]
name = "foo"
version = "0.1.0"
readme = "README.md"
license-files = ["LICENSE*", "third-party-licenses/*"]

[tool.uv.build-backend]
# A file we need for the source dist -> wheel step, but not in the wheel itself (currently unused)
source-include = ["data/build-script.py"]
# A temporary or generated file we want to ignore
source-exclude = ["/src/foo/not-packaged.txt"]
# Headers are build-only
wheel-exclude = ["build-*.h"]

[tool.uv.build-backend.data]
scripts = "scripts"
data = "assets"
headers = "header"

[build-system]
requires = ["uv>=0.5.5,<0.6"]
build-backend = "uv"
```
When building the source distribution, we always need to include
`pyproject.toml` and the module; when building the wheel, we always
include the module but nothing else at the top level. Since we only allow
a single module per wheel, there are no specific wheel includes. This
means we have source includes, source excludes, and wheel excludes, but
no wheel includes: the wheel contents are defined by the module root,
plus the metadata files and data directories handled separately.
Extra source dist includes are currently unused (they can't end up in
the wheel at the moment), but it makes sense to model them here; they
will be needed for any sort of procedural build step.
This results in the following fields being relevant for inclusion and
exclusion:
* `pyproject.toml` (always included in the source dist)
* project.readme: PEP 621
* project.license-files: PEP 639
* module_root: `Path`
* source_include: `Vec<Glob>`
* source_exclude: `Vec<Glob>`
* wheel_exclude: `Vec<Glob>`
* data: `Map<KnownDataName, Path>`
An opinionated choice is that the wheel excludes always contain the
source excludes: otherwise, you could have a path A in the source tree
that gets included when building the wheel directly from the source
tree, but not when going through the source dist as an intermediary,
because A is in the source excludes but not in the wheel excludes. This
has been a source of errors previously.
In the process, I fixed a bug where we would skip directories and only
include files, and were missing license files due to absolute globs.
When trying to upload without a password but with the keyring, check
that the keyring has a password for the upload URL and username and warn
if it doesn't.
Fixes #8781
We were previously not uploading all metadata in the form data of an
upload request in the legacy API. Notably, we were missing the PEP 639
license-files field.
I had to switch to pdm due to https://github.com/pypa/hatch/issues/1828
Allow including data files in wheels, configured through
`pyproject.toml`. This configuration is currently only read in the build
backend. We'd only start using it in the frontend when we're adding a
fast path.
Each data entry is a directory, whose contents are copied to the
matching directory in the wheel in
`<name>-<version>.data/(purelib|platlib|headers|scripts|data)`. Upon
installation, this data is moved to its target location, as defined by
<https://docs.python.org/3.12/library/sysconfig.html#installation-paths>:
- `data`: Installed over the virtual environment root. Warning: This
may overwrite existing files!
- `scripts`: Installed to the directory for executables, `<venv>/bin` on
Unix or `<venv>\Scripts` on Windows. This directory is added to PATH
when the virtual environment is activated or when using `uv run`, so
this data type can be used to install additional binaries. Consider
using `project.scripts` instead for starting Python code.
- `headers`: Installed to the include directory, where compilers
building Python packages with this package as a build requirement will
search for header files.
- `purelib` and `platlib`: Installed to the `site-packages` directory.
It is not recommended to use these two options.
For simplicity, for now we're just defining a directory to be copied for
each data type, while using the glob-based include mechanism in the
background. We thereby introduce a third mechanism next to the main
includes and the PEP 639 mechanism, which is not what we should finalize
on.
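So a configuration is just one directory per data type, e.g. (directory names are placeholders):
```toml
[tool.uv.build-backend.data]
scripts = "scripts"
headers = "include"
data = "assets"
```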
When building source distributions, we need to include the readme, so it
can become part of the METADATA body when building the wheel. We also
need to support the license files from PEP 639. When building the source
distribution, we copy those files relative to their origin; when building
the wheel, we copy them to `.dist-info/licenses`.
The test for idempotence in wheel building is merged into the file
listing test, which also covers that source tree -> source dist -> wheel
is equivalent to source tree -> wheel, though we do need to check file
inclusion more rigorously here.
Best reviewed commit-by-commit