Commit graph

23 commits

Author SHA1 Message Date
konsti
9bb0679618
Fix nightly cfg checker warnings (#3932)
Fixes these two warnings on nightly:

```
warning: unexpected `cfg` condition name: `codspeed`
 --> crates/bench/src/lib.rs:5:15
  |
5 |     #[cfg(not(codspeed))]
  |               ^^^^^^^^ help: found config with similar value: `feature = "codspeed"`
  |
  = help: expected names are: `clippy`, `debug_assertions`, `doc`, `docsrs`, `doctest`, `feature`, `miri`, `overflow_checks`, `panic`, `proc_macro`, `relocation_model`, `rustfmt`, `sanitize`, `sanitizer_cfi_generalize_pointers`, `sanitizer_cfi_normalize_integers`, `target_abi`, `target_arch`, `target_endian`, `target_env`, `target_family`, `target_feature`, `target_has_atomic`, `target_has_atomic_equal_alignment`, `target_has_atomic_load_store`, `target_os`, `target_pointer_width`, `target_thread_local`, `target_vendor`, `test`, `ub_checks`, `unix`, and `windows`
  = help: consider using a Cargo feature instead
  = help: or consider adding in `Cargo.toml` the `check-cfg` lint config for the lint:
           [lints.rust]
           unexpected_cfgs = { level = "warn", check-cfg = ['cfg(codspeed)'] }
  = help: or consider adding `println!("cargo::rustc-check-cfg=cfg(codspeed)");` to the top of the `build.rs`
  = note: see <https://doc.rust-lang.org/nightly/rustc/check-cfg/cargo-specifics.html> for more information about checking conditional configuration
  = note: `#[warn(unexpected_cfgs)]` on by default

warning: unexpected `cfg` condition name: `codspeed`
 --> crates/bench/src/lib.rs:8:11
  |
8 |     #[cfg(codspeed)]
  |           ^^^^^^^^ help: found config with similar value: `feature = "codspeed"`
  |
  = help: consider using a Cargo feature instead
  = help: or consider adding in `Cargo.toml` the `check-cfg` lint config for the lint:
           [lints.rust]
           unexpected_cfgs = { level = "warn", check-cfg = ['cfg(codspeed)'] }
  = help: or consider adding `println!("cargo::rustc-check-cfg=cfg(codspeed)");` to the top of the `build.rs`
  = note: see <https://doc.rust-lang.org/nightly/rustc/check-cfg/cargo-specifics.html> for more information about checking conditional configuration
```

```
warning: unexpected `cfg` condition value: `unix`
 --> crates/uv-extract/src/tar.rs:6:16
  |
6 | #[cfg_attr(not(target_os = "unix"), allow(dead_code))]
  |                ^^^^^^^^^^^^^^^^^^
  |
  = note: expected values for `target_os` are: `aix`, `android`, `cuda`, `dragonfly`, `emscripten`, `espidf`, `freebsd`, `fuchsia`, `haiku`, `hermit`, `horizon`, `hurd`, `illumos`, `ios`, `l4re`, `linux`, `macos`, `netbsd`, `none`, `nto`, `openbsd`, `psp`, `redox`, `solaris`, `solid_asp3`, `teeos`, `tvos`, `uefi`, `unknown`, `visionos`, `vita`, `vxworks`, `wasi`, `watchos`, and `windows` and 2 more
  = note: see <https://doc.rust-lang.org/nightly/rustc/check-cfg/cargo-specifics.html> for more information about checking conditional configuration
  = note: requested on the command line with `-W unexpected-cfgs`
```
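
The second warning comes from using `unix` as a `target_os` value; `unix` is a built-in cfg predicate in its own right, not a valid `target_os` value. A minimal sketch of that kind of fix (the attribute is what matters; the function is a hypothetical stand-in, not necessarily the exact change in this PR):

```rust
// `unix` is a target-family predicate, so it is spelled `not(unix)` rather
// than `not(target_os = "unix")`, which rustc flags as an unexpected value.
#[cfg_attr(not(unix), allow(dead_code))]
fn unix_only_helper() {}
```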

2024-05-31 09:35:52 +00:00
Charlie Marsh
2c3a6796aa
Remove unused seek methods (#3526) 2024-05-11 17:31:32 +00:00
DaniPopes
c59cb1381a
Use into_par_iter instead of par_bridge (#3435)
## Summary

Use the native rayon range iterator instead of bridging the standard
library's.
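
As a minimal illustration of the difference (hypothetical loop body, not the benchmark code):

```rust
use rayon::prelude::*;

fn main() {
    // Bridging a std iterator loses rayon's knowledge of the range's
    // length, so work is handed out item by item.
    let bridged: u64 = (0..1_000u64).par_bridge().map(|i| i * i).sum();

    // Rayon's native range iterator splits the range directly.
    let native: u64 = (0..1_000u64).into_par_iter().map(|i| i * i).sum();

    assert_eq!(bridged, native);
}
```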
2024-05-07 19:38:35 +00:00
konsti
bed730571d
Fix single crate tokio features (#3234)
Previously, `uv-auth` would fail to compile due to a missing `process`
feature. I chose to make all tokio features we use top-level features,
so we can share the tokio cache between all test invocations.
2024-04-24 08:55:15 +00:00
Sergey Kolosov
d2551bb2bd
Add support for .tar.bz2 source distributions (#3069)
## Summary

Source distributions in the `.tar.bz2` format are still relatively common
in existing code bases; the most prominent examples are the Twisted source
distributions up to version 20.3.0. Since upgrading Twisted to a more recent
version is often not an option for a given project, we add support for
`.tar.bz2` here so that `uv` can remain a drop-in replacement for `pip` in
these projects.
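
A minimal sketch of what bzip2-aware extraction can look like, assuming the `async-compression` crate (with the `bzip2` and `tokio` features) and `tokio-tar`; uv's actual extraction code may differ:

```rust
use async_compression::tokio::bufread::BzDecoder;
use tokio::io::AsyncBufRead;

/// Untar a `.tar.bz2` stream into `dst` (sketch; error handling elided).
async fn untar_bz2<R: AsyncBufRead + Unpin>(
    reader: R,
    dst: &std::path::Path,
) -> std::io::Result<()> {
    let decompressed = BzDecoder::new(reader);
    let mut archive = tokio_tar::Archive::new(decompressed);
    archive.unpack(dst).await
}
```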

## Test Plan

The feature was tested both by adding the corresponding test coverage,
and by directly installing a package of interest under a Python version
that doesn't have the corresponding wheel:

```sh
cargo run venv -p python3.8
cargo run pip install Twisted==20.3.0 --no-cache
```

The `--no-cache` argument in the example above clears any cached record of
the requirements being unsatisfiable, which may have been stored during a
previous attempt to install this package with a `uv` version that didn't yet
implement this feature.
2024-04-16 18:34:55 +00:00
Charlie Marsh
1f3b5bb093
Add hash-checking support to install and sync (#2945)
## Summary

This PR adds support for hash-checking mode in `pip install` and `pip
sync`. It's a large change, both in terms of the size of the diff and
the modifications in behavior, but it's also one that's hard to merge in
pieces (at least, with any test coverage) since it needs to work
end-to-end to be useful and testable.

Here are some of the most important highlights:

- We store hashes in the cache. Where we previously stored pointers to
unzipped wheels in the `archives` directory, we now store pointers with
a set of known hashes. So every pointer to an unzipped wheel also
includes its known hashes.
- By default, we don't compute any hashes. If the user runs with
`--require-hashes`, and the cache doesn't contain those hashes, we
invalidate the cache, redownload the wheel, and compute the hashes as we
go. For users that don't run with `--require-hashes`, there will be no
change in performance. For users that _do_, the only change will be if
they don't run with `--generate-hashes` -- then they may see some
repeated work between resolution and installation, if they use `pip
compile` then `pip sync`.
- Many of the distribution types now include a `hashes` field, like
`CachedDist` and `LocalWheel`.
- Our behavior is similar to pip, in that we enforce hashes when pulling
any remote distributions, and when pulling from our own cache. Like pip,
though, we _don't_ enforce hashes if a distribution is _already_
installed.
- Hash validity is enforced in a few different places:
1. During resolution, we enforce hash validity based on the hashes
reported by the registry. If we need to access a source distribution,
though, we then enforce hash validity at that point too, prior to
running any untrusted code. (This is enforced in the distribution
database.)
2. In the install plan, we _only_ add cached distributions that have
matching hashes. If a cached distribution is missing any hashes, or the
hashes don't match, we don't return them from the install plan.
3. In the downloader, we _only_ return distributions with matching
hashes.
4. The final combination of "things we install" are: (1) the wheels from
the cache, and (2) the downloaded wheels. So this ensures that we never
install any mismatching distributions.
- Like pip, if `--require-hashes` is provided, we require that _all_
distributions are pinned with either `==` or a direct URL. We also
require that _all_ distributions have hashes.
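
As a rough illustration of the verification step itself (a hypothetical helper using the `sha2` and `hex` crates, not uv's actual implementation):

```rust
use sha2::{Digest, Sha256};

/// Return true if `data` matches the expected hex-encoded SHA-256 digest,
/// e.g. one reported by the registry or recorded in the cache pointer.
fn matches_expected_sha256(data: &[u8], expected_hex: &str) -> bool {
    hex::encode(Sha256::digest(data)) == expected_hex.to_ascii_lowercase()
}
```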

There are a few notable TODOs:

- We don't support hash-checking mode for unnamed requirements. These
should be _somewhat_ rare, though, since `pip compile` never outputs
unnamed requirements. I can fix this; it's just some additional work.
- We don't automatically enable `--require-hashes` when a hash exists in
the requirements file. We require `--require-hashes` to be passed explicitly.

Closes #474.

## Test Plan

I'd like to add some tests for registries that report incorrect hashes,
but otherwise: `cargo test`
2024-04-10 19:09:03 +00:00
Zanie Blue
44e39bdca3
Replace Python bootstrapping script with Rust implementation (#2842)
See https://github.com/astral-sh/uv/issues/2617

Note this also includes:
- #2918 
- #2931 (pending)

A first step towards Python toolchain management in Rust.

First, we add a new crate to manage Python download metadata:

- Adds a new `uv-toolchain` crate
- Adds Rust structs for Python version download metadata
- Duplicates the script which downloads Python version metadata
- Adds a script to generate Rust code from the JSON metadata
- Adds a utility to download and extract the Python version

I explored some alternatives, like a build script using `serde` and
`uneval` to automatically construct the code from our structs, but deemed
that too heavy. Unlike Rye, I don't generate the Rust directly from the web
requests; instead, there's an intermediate JSON layer to speed up iteration
on the Rust types.
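
For illustration, a hypothetical shape for those generated types (field names are illustrative, not uv's actual `uv-toolchain` structs):

```rust
/// Hypothetical metadata describing one downloadable Python build,
/// as it might be generated from the JSON metadata.
#[derive(Debug, Clone, Copy)]
pub struct PythonDownload {
    pub major: u8,
    pub minor: u8,
    pub patch: u8,
    pub os: &'static str,
    pub arch: &'static str,
    pub url: &'static str,
    pub sha256: Option<&'static str>,
}
```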

Next, we add a `uv-dev` command, `fetch-python`, to download Python
versions per the bootstrapping script.

- Downloads a requested version or reads from `.python-versions`
- Extracts to `UV_BOOTSTRAP_DIR`
- Links executables for path extension

This command is not really intended to be user-facing, but it's a good
PoC for the `uv-toolchain` API. Hash checking (via the sha256) isn't
implemented yet; we can do that in a follow-up.

Finally, we remove the `scripts/bootstrap` directory, update CI to use
the new command, and update the CONTRIBUTING docs.

<img width="1023" alt="Screenshot 2024-04-08 at 17 12 15"
src="57bd3cf1-7477-4bb8-a8e9-802a00d772cb">
2024-04-10 11:22:41 -05:00
Zanie Blue
bdeab55193
Add extract support for zstd (#2861)
We need this to extract toolchain downloads
2024-04-08 15:34:08 -05:00
Charlie Marsh
dc2c289dff
Upgrade rs-async-zip to support data descriptors (#2809)
## Summary

Upgrading `rs-async-zip` enables us to support data descriptors in
streaming. This both greatly improves performance for indexes that use
data descriptors _and_ ensures that we support them in a few other
places (e.g., zipped source distributions created in Finder).

Closes #2808.
2024-04-04 01:31:40 +00:00
Micha Reiser
acbee166c0
Remove unused dependencies (#2543)
## Summary

I tried out `cargo shear` to see if there are any unused dependencies
that `cargo udeps` isn't reporting. It turned out, there are a few. This
PR removes those dependencies.

## Test Plan

`cargo build`
2024-03-19 13:10:10 -04:00
Charlie Marsh
a267a501b6
Add Seek fallback for zip files (#2320)
## Summary

Some zip files can't be streamed; in particular, `rs-async-zip` doesn't
support data descriptors right now (though it may in the future). This
PR adds a fallback path for such zips that downloads the entire zip file
to disk, then unzips it from disk (which gives us `Seek`).
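
A rough sketch of that fallback shape, assuming the synchronous `zip` and `tempfile` crates for the on-disk path (illustrative only; uv streams the download rather than buffering it in memory):

```rust
use std::io::Write;

/// If streaming fails (e.g. the zip relies on data descriptors), write the
/// payload to a temporary file and unzip it with a `Seek`-based reader.
fn unzip_with_fallback(payload: &[u8], dst: &std::path::Path) -> anyhow::Result<()> {
    if let Err(err) = try_stream_unzip(payload, dst) {
        eprintln!("streaming unzip failed ({err}), falling back to disk");
        let mut tmp = tempfile::NamedTempFile::new()?;
        tmp.write_all(payload)?;
        zip::ZipArchive::new(tmp.reopen()?)?.extract(dst)?;
    }
    Ok(())
}

/// Hypothetical streaming path standing in for the async unzip.
fn try_stream_unzip(_payload: &[u8], _dst: &std::path::Path) -> anyhow::Result<()> {
    anyhow::bail!("data descriptors not supported")
}
```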

Closes https://github.com/astral-sh/uv/issues/2216.

## Test Plan

`cargo run pip install --extra-index-url https://buf.build/gen/python
hashb_foxglove_protocolbuffers_python==25.3.0.1.20240226043130+465630478360
--force-reinstall -n`
2024-03-10 11:39:28 -04:00
Jonathon Belotti
e16140a849
Address #2220 (slow download perf against PyPi mirror) (#2319)
## Summary

Addressing the extremely slow performance detailed in
https://github.com/astral-sh/uv/issues/2220. There are two changes to
increase download performance:

1. setting `accept-encoding: identity`, in the spirit of
https://github.com/pypa/pip/pull/1688
2. increasing buffer from 8KiB to 128KiB. 

### 1. accept-encoding: identity

I think this related `pip` PR has a good explanation of what's going on:
https://github.com/pypa/pip/pull/1688

```
  # We use Accept-Encoding: identity here because requests
  # defaults to accepting compressed responses. This breaks in
  # a variety of ways depending on how the server is configured.
  # - Some servers will notice that the file isn't a compressible
  #   file and will leave the file alone and with an empty
  #   Content-Encoding
  # - Some servers will notice that the file is already
  #   compressed and will leave the file alone and will add a
  #   Content-Encoding: gzip header
  # - Some servers won't notice anything at all and will take
  #   a file that's already been compressed and compress it again
  #   and set the Content-Encoding: gzip header
```

The `files.pythonhosted.org` server is the 1st kind. Example debug log I
added in `uv` when installing against PyPI:

<img width="1459" alt="image"
src="ef10d758-46aa-4c8e-9dba-47f33437401b">

(there is no `content-encoding` header in this response, the `whl`
hasn't been compressed, and there is a content-length header)

Our internal mirror is the third case. It does seem sensible that our
mirror should be modified to act like the 1st kind. But `uv` should
handle all three cases like `pip` does.

### 2. buffer increase

In https://github.com/astral-sh/uv/issues/2220 I observed that `pip`'s
downloading was causing up to 128KiB flushes in our mirror.

After fix 1, `uv` was still only causing up to 8KiB flushes, and was
slower to download than `pip`. Increasing this buffer from the default
8KiB led to a download performance improvement against our mirror and
the expected 128KiB flushes.
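
A minimal sketch of both changes in terms of `reqwest` and `tokio` (illustrative; uv's actual client and writer setup differ):

```rust
use futures::TryStreamExt;
use tokio::io::{AsyncWriteExt, BufWriter};

/// Download `url` to `dst`, asking for an identity encoding and writing
/// through a 128 KiB buffer instead of the 8 KiB default (sketch only).
async fn download(url: &str, dst: &std::path::Path) -> anyhow::Result<()> {
    let response = reqwest::Client::new()
        .get(url)
        // Don't let the server (re)compress the already-compressed wheel.
        .header("accept-encoding", "identity")
        .send()
        .await?
        .error_for_status()?;

    let file = tokio::fs::File::create(dst).await?;
    let mut writer = BufWriter::with_capacity(128 * 1024, file);

    let mut stream = response.bytes_stream();
    while let Some(chunk) = stream.try_next().await? {
        writer.write_all(&chunk).await?;
    }
    writer.flush().await?;
    Ok(())
}
```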

## Test Plan

Ran benchmarking as instructed by @charliermarsh 

<img width="1447" alt="image"
src="840d9c8d-4b98-4bfa-89f3-073a2dec1f23">

No performance improvement or regression.
2024-03-09 19:49:29 -05:00
Taniguchi Yasufumi
70e877d11c
Add fs_err to disallowed_method in clippy.toml (#1950)
## Summary

Resolve #1916

---------

Co-authored-by: konsti <konstin@mailbox.org>
2024-02-26 14:15:07 +00:00
danieleades
8d721830db
Clippy pedantic (#1963)
Address a few pedantic lints

The lints are split into separate commits so they can be reviewed
individually.

I've not added enforcement for any of these lints, but that could be
added if desirable.
2024-02-25 14:04:05 -05:00
Charlie Marsh
a1f50418fd
Avoid erroring for source distributions with symlinks in archive (#1944)
## Summary

For context, we have three extraction paths:

- untar (async) - used for any `.tar.gz`, local or remote.
- unzip (async) - used to unzip remote wheels, or local or remote source
distributions.
- unzip (sync) - used to unzip locally-available wheels into the cache.

We use three different crates for these:

- [`tokio-tar`](https://github.com/vorot93/tokio-tar)
- [`async-zip`](https://github.com/Majored/rs-async-zip)
- [`zip-rs`](https://github.com/zip-rs/zip)

These all seem to have different support for symlinks:

- `tokio-tar` tries to create a symlink (which works fine on Unix but
errors on Windows, since we typically don't have elevated permissions).
- `async-zip` _seems_ to write the target contents directly to the file
(which is what we want).
- `zip-rs` _apparently_ writes the _name_ of the target to the file
(which isn't what we want).

Thankfully, symlinks are not allowed in wheels
(https://github.com/pypa/pip/issues/5919,
https://discuss.python.org/t/symbolic-links-in-wheels/1945), so we can
ignore `zip-rs`.

For `tokio-tar`, we now _skip_ (and warn) if we see a symlink on
Windows. We could do what pip does, and recursively copy, but it's
difficult because we don't have `Seek` on the file. (Alternatively, we
could use hard links and junctions, though those also might need to
exist already.) Let's see how far this gets us.
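
As a sketch of that skip-and-warn behavior (illustrative, in terms of `tokio-tar`'s entry API; not the exact diff):

```rust
use futures::StreamExt;

/// Unpack entries, skipping symlinks on Windows instead of erroring out
/// (sketch; the real code also handles permissions and proper warnings).
async fn unpack_skipping_symlinks<R>(
    mut archive: tokio_tar::Archive<R>,
    dst: &std::path::Path,
) -> std::io::Result<()>
where
    R: tokio::io::AsyncRead + Unpin,
{
    let mut entries = archive.entries()?;
    while let Some(entry) = entries.next().await {
        let mut entry = entry?;
        if cfg!(windows) && entry.header().entry_type().is_symlink() {
            eprintln!("warning: skipping symlink {:?}", entry.path()?);
            continue;
        }
        entry.unpack_in(dst).await?;
    }
    Ok(())
}
```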

(We also no longer attempt to set permissions on symlinks on Unix, which
caused another failure.)

Closes https://github.com/astral-sh/uv/issues/1858.
2024-02-24 03:22:13 +00:00
Charlie Marsh
19890feb77
Preserve executable bit when untarring archives (#1790)
## Summary

Closes https://github.com/astral-sh/uv/issues/1767.
2024-02-21 14:18:44 +00:00
Charlie Marsh
88a0c13865
Use async unzip for local source distributions (#1809)
## Summary

We currently maintain separate untar methods for sync and async, but we
only use the sync version when the user provides a local source
distribution. (Otherwise, we untar as we download the distribution.) In
my testing, this is actually slower anyway:

```
❯ python -m scripts.bench \
        --uv-path ./target/release/main \
        --uv-path ./target/release/uv \
        ./requirements.in --benchmark resolve-cold --min-runs 50
Benchmark 1: ./target/release/main (resolve-cold)
  Time (mean ± σ):     835.2 ms ± 107.4 ms    [User: 346.0 ms, System: 151.3 ms]
  Range (min … max):   639.2 ms … 1051.0 ms    50 runs

Benchmark 2: ./target/release/uv (resolve-cold)
  Time (mean ± σ):     750.7 ms ±  91.9 ms    [User: 345.7 ms, System: 149.4 ms]
  Range (min … max):   637.9 ms … 905.7 ms    50 runs

Summary
  './target/release/uv (resolve-cold)' ran
    1.11 ± 0.20 times faster than './target/release/main (resolve-cold)'
```
2024-02-21 14:11:37 +00:00
Andrew Gallant
d6aaf7be30 uv-extract: call target.as_ref() once
This is just a style thing.
2024-02-20 10:57:51 -05:00
konsti
9a720a87c8
Only preserve the executable bit (#1743)
A file in a zip can set arbitrary unix permissions, but we, like pip,
want to preserve only the executable bit and otherwise use the OS
defaults.

This should be faster for wheels with many files since we now avoid the
blocking fs call to set the permissions in most cases.
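
A rough sketch of the rule (illustrative; the mode would come from the zip entry, e.g. `ZipFile::unix_mode()`, and the real change covers the async path too):

```rust
/// Honor only the executable bit recorded in the archive; every other
/// permission bit is left to the OS defaults (here simply 0o755).
#[cfg(unix)]
fn set_executable_if_needed(
    unix_mode: Option<u32>,
    path: &std::path::Path,
) -> std::io::Result<()> {
    use std::os::unix::fs::PermissionsExt;

    if let Some(mode) = unix_mode {
        if mode & 0o111 != 0 {
            std::fs::set_permissions(path, std::fs::Permissions::from_mode(0o755))?;
        }
    }
    Ok(())
}
```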

Fixes #1740.
2024-02-20 16:41:05 +01:00
konsti
f7722c02c1
Don't preserve timestamp in streaming unzip (#1749)
## Summary

Don't preserve mtime to work around alexcrichton/tar-rs#349. Same as
#634 except for the streaming unzip.

Fixes #1748.

## Test Plan

Added the tomli source dist as a test case.
2024-02-20 14:54:28 +00:00
Charlie Marsh
340cb67a8b
Allow non-nested archives for hexdump and others (#1564)
## Summary

#1562

It turns out that `hexdump` uses an invalid source distribution format
whereby the contents aren't nested in a top-level directory -- instead,
they're all just flattened at the top-level. In looking at pip's source
(51de88ca64/src/pip/_internal/utils/unpacking.py (L62)),
it only strips the top-level directory if all entries have the same
directory prefix (i.e., if it's the only thing in the directory). This
PR accommodates these "invalid" distributions.
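
A sketch of that check over the extraction target (illustrative; uv's actual helper differs):

```rust
use std::path::{Path, PathBuf};

/// If the extracted archive contains exactly one top-level directory,
/// return it; otherwise return the root itself (the flattened hexdump case).
fn strip_single_top_level_dir(root: &Path) -> std::io::Result<PathBuf> {
    let entries = std::fs::read_dir(root)?.collect::<std::io::Result<Vec<_>>>()?;
    match entries.as_slice() {
        [only] if only.file_type()?.is_dir() => Ok(only.path()),
        _ => Ok(root.to_path_buf()),
    }
}
```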

I can't find any history on this method in `pip`. It looks like it dates
back over 15 years ago, to before `pip` was even called `pip`.

Closes https://github.com/astral-sh/uv/issues/1376.
2024-02-16 23:17:36 -05:00
Robin Krahl
f9a9f53476
Improve error message for invalid sdist archives (#1389)
This PR improves the error message for the problem described in
https://github.com/astral-sh/uv/issues/1376. The original output
duplicates the actual error message and includes lots of noise
(`DirEntry { inner: DirEntry(...) }`).

```
$ uv pip install hexdump==3.3
error: Failed to download and build: hexdump==3.3
  Caused by: Failed to extract source distribution: The top level of the archive must only contain a list directory, but it contains: [DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/__main__.py") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/hexdump.py") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/data") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/PKG-INFO") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/setup.py") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/README.txt") }]
  Caused by: The top level of the archive must only contain a list directory, but it contains: [DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/__main__.py") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/hexdump.py") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/data") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/PKG-INFO") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/setup.py") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/README.txt") }]
```

This PR removes the duplication and `DirEntry` internals so that the
error message is easier to grasp:

```
$ uv pip install hexdump==3.3
error: Failed to download and build: hexdump==3.3
  Caused by: Failed to extract source distribution
  Caused by: The top level of the archive must only contain a list directory, but it contains: ["__main__.py", "hexdump.py", "data", "PKG-INFO", "setup.py", "README.txt"]
```
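
The cleaner listing boils down to formatting the entry names rather than `Debug`-printing whole `DirEntry` values, roughly:

```rust
/// Collect just the file names for the error message instead of
/// Debug-formatting whole `DirEntry` values (illustrative).
fn entry_names(entries: &[std::fs::DirEntry]) -> Vec<String> {
    entries
        .iter()
        .map(|entry| entry.file_name().to_string_lossy().into_owned())
        .collect()
}
```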
2024-02-15 18:03:23 -06:00
Zanie Blue
2586f655bb
Rename to uv (#1302)
First, replace all usages in files in-place. I used my editor for this.
If someone wants to add a one-liner that'd be fun.

Then, update directory and file names:

```
# Run twice for nested directories
find . -type d -print0 | xargs -0 rename s/puffin/uv/g
find . -type d -print0 | xargs -0 rename s/puffin/uv/g

# Update files
find . -type f -print0 | xargs -0 rename s/puffin/uv/g
```

Then add all the files again

```
# Add all the files again
git add crates
git add python/uv

# This one needs a force-add
git add -f crates/uv-trampoline
```
2024-02-15 11:19:46 -06:00