Fixes #11793
On Windows, trying to read a file inside something that is not a directory but
another file results in a not-found error, while on Unix we get a not-a-directory
error. To fix the behavior on Windows, we now check explicitly
whether something included in a workspace glob is a non-directory.
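A minimal sketch of the check, assuming a hypothetical helper name and a Rust recent enough to have `io::ErrorKind::NotADirectory` stabilized (illustrative only, not uv's actual code):
```rust
use std::io;
use std::path::Path;

/// Reject glob matches that exist but are not directories, so Windows
/// reports the same not-a-directory error that Unix produces.
fn check_glob_entry(path: &Path) -> io::Result<()> {
    if path.exists() && !path.is_dir() {
        return Err(io::Error::new(
            io::ErrorKind::NotADirectory,
            format!("`{}` is not a directory", path.display()),
        ));
    }
    Ok(())
}
```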
First of all, I want to test automatic managed installs (see #10913) and
need to set that up. Second of all, some tests were _implicitly_
downloading interpreters instead of using the one from their context —
which is unexpected and naughty and very slow.
## Summary
In preview mode on Windows, register and unregister the managed python-build-standalone installations in the Windows registry, following PEP 514.
We write the values defined in the PEP plus the download URL and hash. We add an entry when installing a version, remove the entry when uninstalling, and remove all entries when uninstalling with `--all`. We update entries only by overwriting existing values; there is no "syncing" involved.
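A rough sketch of the registration step (illustrative only: the value names beyond the PEP and the per-user key choice are assumptions, not necessarily what uv writes):
```rust
use windows_registry::{CURRENT_USER, Result};

/// Register a managed interpreter under the PEP 514 key layout,
/// `HKEY_CURRENT_USER\Software\Python\<Company>\<Tag>`.
fn register(version: &str, executable: &str, url: &str, sha256: &str) -> Result<()> {
    let tag = format!(r"Software\Python\Astral\CPython{version}");
    let key = CURRENT_USER.create(&tag)?;
    key.set_string("DisplayName", &format!("CPython {version} (Astral)"))?;
    key.set_string("Version", version)?;
    // Values beyond the PEP: record where the build came from.
    key.set_string("DownloadUrl", url)?;
    key.set_string("DownloadSha256", sha256)?;
    let install = CURRENT_USER.create(format!(r"{tag}\InstallPath"))?;
    install.set_string("ExecutablePath", executable)?;
    Ok(())
}
```
Uninstalling reverses this by deleting the tag key; `--all` removes every tag under the company key.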
Since they are not official builds, the python-build-standalone installations get a company prefix: `py -V:Astral/CPython3.13.1` works, but `py -3.13` doesn't.
```
$ py --list-paths
-V:3.12 * C:\Users\Konsti\AppData\Local\Programs\Python\Python312\python.exe
-V:3.11.9 C:\Users\Konsti\.pyenv\pyenv-win\versions\3.11.9\python.exe
-V:3.11 C:\Users\micro\AppData\Local\Programs\Python\Python311\python.exe
-V:3.8 C:\Users\micro\AppData\Local\Programs\Python\Python38\python.exe
-V:Astral/CPython3.13.1 C:\Users\Konsti\AppData\Roaming\uv\data\python\cpython-3.13.1-windows-x86_64-none\python.exe
```
Registry errors are reported but not fatal, except for operations on the company key, since it isn't bound to any specific Python interpreter.
On uninstallation, we prune registry entries that have no matching Python installation (i.e. broken entries).
The code uses the official `windows_registry` crate rather than the `winreg` crate.
Best reviewed commit-by-commit.
## Test Plan
We're reusing an existing system check to test different (un)installation scenarios.
## Summary
After we resolve, we filter out any wheels that aren't applicable for
the target platforms. So, e.g., we remove macOS wheels if we find that
the user only asked to solve for Windows.
This PR extends the same logic to architectures, so that we filter out
ARM-only wheels when the user is only solving for x86, etc.
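A sketch of the idea on plain tag strings (uv's real filtering works on parsed platform tags; suffix matching here is a simplification, e.g. it doesn't equate `amd64` with `x86_64`):
```rust
/// Keep a wheel only if its platform tag is universal or matches one of
/// the architectures we're solving for.
fn keep_wheel(platform_tag: &str, target_arches: &[&str]) -> bool {
    platform_tag == "any"
        || target_arches
            .iter()
            .any(|arch| platform_tag.ends_with(arch))
}

// E.g., solving only for x86_64 keeps `manylinux_2_17_x86_64` and drops
// `macosx_11_0_arm64`:
// assert!(!keep_wheel("macosx_11_0_arm64", &["x86_64"]));
```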
Closes #10571.
## Summary
Adds a regular-expression-based version filter to the Python mirror script.
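The shape of the filter, sketched in Rust with the `regex` crate (the script itself is Python, and `matching_versions` is an illustrative name):
```rust
use regex::Regex;

/// Keep only the versions matching the user-provided `--version` pattern.
fn matching_versions<'a>(pattern: &str, versions: &[&'a str]) -> Vec<&'a str> {
    let re = Regex::new(pattern).expect("invalid --version pattern");
    versions.iter().copied().filter(|v| re.is_match(v)).collect()
}

// matching_versions(r"3\.13\.\d+$", &["3.13.1", "3.12.8"]) == vec!["3.13.1"]
```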
## Test Plan
Manually using `uv run ./scripts/create-python-mirror.py --name cpython
--arch x86_64 --os linux --version "3.13.\d+$"`
## Summary
The architecture details in `crates/uv-python/download-metadata.json` are
now a dictionary with family and variant data, whereas they used to be a
plain string. This patches the architecture filter in
`scripts/create-python-mirror.py` to support both formats.
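For illustration, the two shapes side by side, sketched with serde (the script itself is Python; the `variant` example value is an assumption):
```rust
use serde::Deserialize;

/// Accept both the old string form (`"arch": "x86_64"`) and the new
/// dictionary form (`"arch": {"family": "x86_64", "variant": "v2"}`).
#[derive(Deserialize)]
#[serde(untagged)]
enum Arch {
    Name(String),
    Detailed {
        family: String,
        #[serde(default)]
        variant: Option<String>,
    },
}

impl Arch {
    /// The family name, regardless of which format was parsed.
    fn family(&self) -> &str {
        match self {
            Arch::Name(name) => name,
            Arch::Detailed { family, .. } => family,
        }
    }
}
```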
## Test Plan
Tested manually using `uv run ./scripts/create-python-mirror.py --name
cpython --arch x86_64 --os linux --from-all-history`
## Summary
A revival of an old idea (#9344) that I have slightly more confidence in
now. I abandoned this idea because (1) it couldn't capture that, e.g.,
`platform_system == 'Windows' and sys_platform == 'foo'` (or some other
unknown value) are disjoint, and (2) I thought that Android returned
`"android"` for one of `sys_platform` or `platform_system`, which
would've made this logic incorrect.
However, it looks like Android... doesn't do that? And the values here
are almost always in a small, known set. So in the end, the tradeoffs
here actually seem pretty good.
Vis-a-vis our current solution, this can (e.g.) _simplify out_
expressions like `sys_platform == 'win32' or platform_system ==
'Windows'`.
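A minimal sketch of the underlying equivalence table (not uv's actual representation): both marker keys normalize to one platform, which is what lets the simplifier prove disjointness and redundancy.
```rust
/// Map a (marker key, value) pair onto a normalized platform, when the
/// value is one of the small set of well-known ones.
fn normalize(key: &str, value: &str) -> Option<&'static str> {
    match (key, value) {
        ("sys_platform", "win32") | ("platform_system", "Windows") => Some("windows"),
        ("sys_platform", "darwin") | ("platform_system", "Darwin") => Some("macos"),
        ("sys_platform", "linux") | ("platform_system", "Linux") => Some("linux"),
        // Unknown values (e.g. `sys_platform == 'foo'`) stay opaque.
        _ => None,
    }
}
```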
Make the local packse workflow work again:
```
# In packse:
uv run --extra index --extra serve packse serve --no-hash scenarios &
# In uv:
UV_TEST_INDEX_URL="http://localhost:3141/simple/" ./scripts/scenarios/generate.py
```
Bugs fixed:
* The default scenario pattern didn't match anything.
* The snapshot update test command had been wrong since the test
centralization.
* Snapshot update failures were not reported.
When publishing, we currently ask the user to set `--publish-url` to the
upload URL and `--check-url` to the simple index URL, or the equivalent
configuration keys. But that's redundant with the `[[tool.uv.index]]`
declaration. Instead, we extend `[[tool.uv.index]]` with a `publish-url`
entry and allow passing `uv publish --index <name>`.
`uv publish --index <name>` requires the `pyproject.toml` to be present
when publishing, unlike using `--publish-url ... --check-url ...` which
can be used e.g. in CI without a checkout step. `--index` also always
uses the check URL feature to aid upload consistency.
The documentation tries to explain both approaches together, which
overlap for the check URL feature.
Fixes #8864
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
## Summary
I added `crates/uv-python/create-mirror.py` to make it easy to download
all the needed files to create a mirror for Python distributions in an
offline environment.
The script also has an option to iterate over the git history of
`download-metadata.json` to make sure we have all the files needed for
all uv versions.
## Test Plan
```
uv run create-mirror.py --from-all-history --os linux --arch x86_64 --name cpython
2024-10-25 01:31:12,973 - INFO - Starting download of 466 files.
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 466/466 [06:11<00:00, 1.26file/s]
Successfully downloaded: 466
now you can run UV_PYTHON_INSTALL_MIRROR='file:///home/meitar/dev/uv/crates/uv-python/mirror' uv python install
```
Then I checked (the `unshare` command makes sure that the process doesn't
have any network access):
```
UV_PYTHON_INSTALL_MIRROR=file:///home/meitar/dev/uv/crates/uv-python/mirror sudo -E unshare -n /home/meitar/.local/bin/uv python install 3.13
Searching for Python versions matching: Python 3.13
Installed Python 3.13.0 in 2.91s
+ cpython-3.13.0-linux-x86_64-gnu
```
---------
Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
When adding excludes, we usually don't want to include Python cache
files; on the contrary, I haven't seen any project in my ecosystem
research that would want any of `__pycache__`, `*.pyc`, or `*.pyo` to be
included. By moving them behind a `default-excludes` toggle, they remain
active even when custom excludes are defined, but can be deactivated
if the user so chooses.
With includes and excludes being this small again, we can roll back the
include-exclude anchored difference to always using anchored globs (i.e.
you would need to use `**/build-*.h` below).
A `pyproject.toml` with custom settings after the change is applied:
```toml
[project]
name = "foo"
version = "0.1.0"
readme = "README.md"
license-files = ["LICENSE*", "third-party-licenses/*"]
[tool.uv.build-backend]
# A file we need for the source dist -> wheel step, but not in the wheel itself (currently unused)
source-include = ["data/build-script.py"]
# A temporary or generated file we want to ignore
source-exclude = ["/src/foo/not-packaged.txt"]
# Headers are build-only
wheel-exclude = ["build-*.h"]
[tool.uv.build-backend.data]
scripts = "scripts"
data = "assets"
headers = "header"
[build-system]
requires = ["uv>=0.5.5,<0.6"]
build-backend = "uv"
```
When building the source distribution, we always need to include
`pyproject.toml` and the module; when building the wheel, we always
include the module but nothing else at the top level. Since we only allow a
single module per wheel, there are no specific wheel
includes. This means we have source includes, source excludes, and wheel
excludes, but no wheel includes: the wheel contents are defined by the module root,
plus the metadata files and data directories handled separately.
Extra source dist includes are currently unused (they can't end up in
the wheel currently), but it makes sense to model them here; they will
be needed for any sort of procedural build step.
This results in the following fields being relevant for inclusions and
exclusions (sketched as a struct after the list):
* `pyproject.toml` (always included in the source dist)
* project.readme: PEP 621
* project.license-files: PEP 639
* module_root: `Path`
* source_include: `Vec<Glob>`
* source_exclude: `Vec<Glob>`
* wheel_exclude: `Vec<Glob>`
* data: `Map<KnownDataName, Path>`
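A rough sketch of that field set (illustrative names, not the build backend's actual types):
```rust
use std::collections::BTreeMap;
use std::path::PathBuf;

/// The inclusion/exclusion-relevant settings described above.
struct BuildSettings {
    /// PEP 621 readme, from `[project]`.
    readme: Option<PathBuf>,
    /// PEP 639 license files, from `[project]`.
    license_files: Vec<String>,
    module_root: PathBuf,
    source_include: Vec<String>, // globs
    source_exclude: Vec<String>, // globs
    wheel_exclude: Vec<String>,  // globs; always extended by `source_exclude`
    /// Known data directories (scripts, headers, ...) to copy.
    data: BTreeMap<String, PathBuf>,
}
```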
An opinionated choice is that the wheel excludes always contain the
source excludes: otherwise, you could have a path A in the source tree
that gets included when building the wheel directly from the source
tree, but not when going through the source dist as an intermediary,
because A is in the source excludes but not in the wheel excludes. This has
been a source of errors previously.
In the process, I fixed a bug where we would skip directories and only
include the files, and were missing license files due to absolute globs.
When trying to upload without a password but with the keyring, check
that the keyring has a password for the upload URL and username and warn
if it doesn't.
Fixes #8781
We were previously not uploading all metadata in the form data of an
upload request in the legacy API. Notably, we were missing the PEP 639
license-files field.
I had to switch to pdm due to https://github.com/pypa/hatch/issues/1828
Allow including data files in wheels, configured through
`pyproject.toml`. This configuration is currently only read in the build
backend. We'd only start using it in the frontend when we're adding a
fast path.
Each data entry is a directory, whose contents are copied to the
matching directory in the wheel in
`<name>-<version>.data/(purelib|platlib|headers|scripts|data)`. Upon
installation, this data is moved to its target location, as defined by
<https://docs.python.org/3.12/library/sysconfig.html#installation-paths>:
- `data`: Installed over the virtual environment root. Warning: This
may overwrite existing files!
- `scripts`: Installed to the directory for executables, `<venv>/bin` on
Unix or `<venv>\Scripts` on Windows. This directory is added to PATH
when the virtual environment is activated or when using `uv run`, so
this data type can be used to install additional binaries. Consider
using `project.scripts` instead for starting Python code.
- `headers`: Installed to the include directory, where compilers
building Python packages with this package as built requirement will
search for header files.
- `purelib` and `platlib`: Installed to the `site-packages` directory.
It is not recommended to use these two options.
For simplicity, for now we're just defining a directory to be copied for
each data directory, while using the glob based include mechanism in the
background. We thereby introduce a third mechanism next to the main
includes and the PEP 639 mechanism, which is not what we should finalize
on.
When building source distributions, we need to include the readme, so it
can become part of the `METADATA` body when building the wheel. We also need
to support the license files from PEP 639. When building the source
distribution, we copy those files relative to their origin; when building
the wheel, we copy them to `.dist-info/licenses`.
The test for idempotence in wheel building is merged into the file
listing test, which also covers that source tree -> source dist -> wheel
is equivalent to source tree -> wheel, though we do need stronger checks
for file inclusion here.
Best reviewed commit-by-commit
A first milestone: source tree -> source dist -> wheel -> install works.
This PR adds a test for this.
There's obviously a lot still missing, including basics such as the
Readme inclusion.
Implement a full working version of local version semantics. The (AFAIA)
major move towards this was implemented in #2430. This added support
such that the version specifier `torch==2.1.0+cpu` would install
`torch@2.1.0+cpu` and consider `torch@2.1.0+cpu` a valid way to satisfy
the requirement `torch==2.1.0` in further dependency resolution.
In this feature, we more fully support local version semantics. Namely,
we now allow `torch==2.1.0` to install `torch@2.1.0+cpu` regardless of
whether `torch@2.1.0` (no local tag) actually exists.
We do this by adding an internal-only `Max` value to local versions that
compares greater than all other local versions. Then we can translate
`torch==2.1.0` into bounds: greater than 2.1.0 with no local tag and
less than 2.1.0 with the `Max` local tag.
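A sketch of the ordering trick (uv's real version type is richer, and PEP 440 local segment comparison is more involved than plain strings):
```rust
/// The derived `Ord` compares variants in declaration order, so `Max`
/// sits above every real local segment list.
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord)]
enum LocalVersion {
    /// A real local tag such as `cpu` (simplified to strings here).
    Segments(Vec<String>),
    /// Internal-only sentinel, never parsed from user input.
    Max,
}

fn main() {
    // `2.1.0+cpu` falls between `2.1.0` (empty local) and `2.1.0+Max`.
    let cpu = LocalVersion::Segments(vec!["cpu".into()]);
    assert!(LocalVersion::Segments(vec![]) < cpu);
    assert!(cpu < LocalVersion::Max);
}
```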
Depends on https://github.com/astral-sh/packse/pull/227.
## Summary
This PR declares and documents, under a common struct, all environment
variables that are used in one way or another in `uv`, whether
internally, externally, or transitively.
I think over time, as uv has grown, many environment variables have been
introduced. It's harder to know which ones exist, which ones
are missing, what they're used for, or where they're used across the
code. The docs only document a handful of them; for the others, you'd have
to dive into the code and inspect across crates to know where
they're used or where they're relevant.
This PR is a starting attempt to unify them, make it easier to discover
which ones we have, and maybe unlock future possibilities in automating
documentation generation for them.
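A minimal sketch of the pattern (uv's actual struct covers far more variables; the docs here are illustrative):
```rust
/// Declares all environment variables used across uv, with documentation.
pub struct EnvVars;

impl EnvVars {
    /// Equivalent to the `--cache-dir` command-line argument.
    pub const UV_CACHE_DIR: &'static str = "UV_CACHE_DIR";
    /// Mirror to download managed Python installations from.
    pub const UV_PYTHON_INSTALL_MIRROR: &'static str = "UV_PYTHON_INSTALL_MIRROR";
}
```
Call sites then reference `EnvVars::UV_CACHE_DIR` instead of scattering string literals across crates.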
I think we can split them out into multiple structs later for better
organization, but given the high influx of PRs and the new environment
variables being introduced or re-used, it would be hard to organize them
all into their proper namespaced structs now while this is all happening,
given merge conflicts and/or keeping up to date.
I don't think this has any impact on performance, as they should all
still be inlined, although it may affect local build times on changes to
the environment vars, since more crates would likely need a rebuild. Lastly,
some of them are declared but not used in the code, for example those in
`build.rs`. I left them declared because I still think it's useful to at
least have a reference.
Did I miss any? Are their initial docs cohesive?
Note: `uv-static` is a terrible name for a new crate; thoughts? Other
names considered were `uv-vars` and `uv-consts`.
## Test Plan
Existing tests
As per
https://matklad.github.io/2021/02/27/delete-cargo-integration-tests.html.
Before this change, there were 91 separate integration test binaries.
(As discussed on Discord — I've done the `uv` crate; there are still a few
more commits coming before this is mergeable, and I want to see how it
performs in CI and locally.)
## Summary
I started learning `uv` by inspecting the source code.
I've noticed that your shell scripts are very good! Which is rare!
## Test Plan
I propose to add `shellcheck` to the CI.
It is a great tool to help finding bugs and style issues in shell code.
Technical details:
- This CI job will only run when any `.sh` files are changed (or when the
job definition file changes)
- It takes just seconds, even on a local machine:
```
» time shellcheck -S style **/*.sh
shellcheck -S style **/*.sh 0.02s user 0.05s system 61% cpu 0.123 total
```
- It is easy to use. For example, I just fixed the single problem you
had in your code with `# shellcheck disable=SC1091`
- I have been using this tool for around 8 years now and haven't had any
issues. Examples:
ca899f3b69/.github/workflows/test.yml (L22-L27)
and
https://github.com/wemake-services/wemake-django-template/blob/master/.github/workflows/shellcheck.yml
But, I understand that build / lint tools are very subjective. So, feel
free to close :)