## Summary
I found some of these too bare (e.g., when they _just_ show a package
name with no other information). For me, this makes it easier to
differentiate error message copy from data, but I'm open to other
opinions. Take a look at the fixture changes and LMK!
Previously, we had `pypi_types::DirectUrl` (the PyPA-spec
`direct_url.json` format) and `distribution_types::DirectUrl` (an enum of
all the URL types we support). This led to confusion, so I'm
renaming the latter to the more appropriate `ParsedUrl`.
Add a dedicated error type for direct URL parsing. This change is broken
out from the new uv requirement type, which uses direct URL parsing
internally.
## Summary
Resolves https://github.com/astral-sh/uv/issues/3106
## Test Plan
Added a simple test asserting that the password provided in
`UV_INDEX_URL` is hidden in the output, as expected.
## Summary
In all of these ID types, we pass values to `cache_key::digest` prior to
passing to `DistributionId` or `ResourceId`. The `DistributionId` and
`ResourceId` are then hashed later, since they're used as keys in hash
maps.
It seems wasteful to hash the value, then hash the hashed value, so this
PR modifies those structs to be enums over a fixed set of types.
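Roughly, instead of wrapping a digest of the value, the ID now holds the value itself and derives `Hash`, so the hash map hashes it exactly once. A sketch with placeholder variants:
```rust
use std::path::PathBuf;

/// Illustrative only: the real enum covers the fixed set of value types that
/// previously went through `cache_key::digest` before being wrapped.
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
pub enum DistributionId {
    Url(String),
    Path(PathBuf),
}
```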
## Summary
This PR enables `--require-hashes` with unnamed requirements. The key
change is that `PackageId` becomes `VersionId` (since it refers to a
package at a specific version), and the new `PackageId` consists of
_either_ a package name _or_ a URL. The hashes are keyed by `PackageId`,
so we can generate the `RequiredHashes` before we have names for all
packages, and enforce them throughout.
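In sketch form (the payload types here are placeholders, not the crate's actual ones):
```rust
/// Stand-in for uv's `PackageName` type.
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
pub struct PackageName(String);

/// A package is keyed by either its name or, for unnamed requirements, the
/// URL it was requested from.
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
pub enum PackageId {
    Name(PackageName),
    Url(String),
}
```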
Closes #2979.
## Summary
This PR enables hash generation for URL requirements when the user
provides `--generate-hashes` to `pip compile`. While we include the
hashes from the registry already, today, we omit hashes for URLs.
To power hash generation, we introduce a `HashPolicy` abstraction:
```rust
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum HashPolicy<'a> {
    /// No hash policy is specified.
    None,
    /// Hashes should be generated (specifically, a SHA-256 hash), but not validated.
    Generate,
    /// Hashes should be validated against a pre-defined list of hashes. If necessary, hashes
    /// should be generated so as to ensure that the archive is valid.
    Validate(&'a [HashDigest]),
}
```
All of the methods on the distribution database now accept this policy,
instead of accepting `&'a [HashDigest]`.
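As a rough illustration of how callers branch on the policy (a sketch, not the database's actual API; `HashDigest` is the digest type referenced above):
```rust
/// The expected digests a download must be validated against, if any.
fn digests_to_validate<'a>(policy: HashPolicy<'a>) -> Option<&'a [HashDigest]> {
    match policy {
        // No hashing work at all.
        HashPolicy::None => None,
        // Hash while streaming the archive, but don't compare against anything.
        HashPolicy::Generate => None,
        // Hash while streaming and compare against the expected digests.
        HashPolicy::Validate(expected) => Some(expected),
    }
}
```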
Closes #2378.
## Summary
This represents a change to `--require-hashes` in the event that we
don't find a matching hash from the registry. The behavior in this PR is
closer to pip's.
Prior to this PR, if a distribution had no reported hash, or only
mismatched hashes, we would mark it as incompatible. Now, we mark it as
compatible, but we use the hash-agreement as part of the ordering, such
that we prefer any distribution with a matching hash, then any
distribution with no hash, then any distribution with a mismatched hash.
As a result, if an index reports incorrect hashes, but the user provides
the correct one, resolution now succeeds, where it would've failed.
Similarly, if an index omits hashes altogether, but the user provides
the correct one, resolution now succeeds, where it would've failed.
If we end up picking a distribution whose hash ultimately doesn't match,
we'll reject it later, after resolution.
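One way to picture the new ordering (a sketch; in practice this folds into the resolver's existing candidate ranking):
```rust
/// Variants are declared best-to-worst, so deriving `Ord` lets a sort place
/// distributions with a matching hash first, then those with no reported
/// hash, then those with a mismatched hash.
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
enum HashComparison {
    Matched,
    Missing,
    Mismatched,
}
```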
## Summary
This PR adds support for hash-checking mode in `pip install` and `pip
sync`. It's a large change, both in terms of the size of the diff and
the modifications in behavior, but it's also one that's hard to merge in
pieces (at least, with any test coverage) since it needs to work
end-to-end to be useful and testable.
Here are some of the most important highlights:
- We store hashes in the cache. Where we previously stored pointers to
unzipped wheels in the `archives` directory, we now store pointers with
a set of known hashes, so every pointer to an unzipped wheel also
includes its known hashes. (A sketch of this pointer shape follows the list.)
- By default, we don't compute any hashes. If the user runs with
`--require-hashes`, and the cache doesn't contain those hashes, we
invalidate the cache, redownload the wheel, and compute the hashes as we
go. For users that don't run with `--require-hashes`, there will be no
change in performance. For users that _do_, the only change will be if
they don't run with `--generate-hashes` -- then they may see some
repeated work between resolution and installation, if they use `pip
compile` then `pip sync`.
- Many of the distribution types now include a `hashes` field, like
`CachedDist` and `LocalWheel`.
- Our behavior is similar to pip, in that we enforce hashes when pulling
any remote distributions, and when pulling from our own cache. Like pip,
though, we _don't_ enforce hashes if a distribution is _already_
installed.
- Hash validity is enforced in a few different places:
1. During resolution, we enforce hash validity based on the hashes
reported by the registry. If we need to access a source distribution,
though, we then enforce hash validity at that point too, prior to
running any untrusted code. (This is enforced in the distribution
database.)
2. In the install plan, we _only_ add cached distributions that have
matching hashes. If a cached distribution is missing any hashes, or the
hashes don't match, we don't return them from the install plan.
3. In the downloader, we _only_ return distributions with matching
hashes.
4. The final combination of "things we install" are: (1) the wheels from
the cache, and (2) the downloaded wheels. So this ensures that we never
install any mismatching distributions.
- Like pip, if `--require-hashes` is provided, we require that _all_
distributions are pinned with either `==` or a direct URL. We also
require that _all_ distributions have hashes.
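For the cache pointers mentioned in the first bullet, the stored shape is roughly this (field names are illustrative, and `HashDigest` is aliased here as a stand-in for uv's digest type):
```rust
use std::path::PathBuf;

/// Stand-in for uv's digest type.
type HashDigest = String;

/// Illustrative sketch of a pointer to an unzipped wheel in the cache's
/// `archives` directory, now carrying the hashes known for the original
/// archive (empty if none were ever computed).
#[derive(Debug, Clone)]
struct CachedArchive {
    /// Path to the unzipped wheel.
    path: PathBuf,
    /// Hashes recorded when the archive was downloaded, if any.
    hashes: Vec<HashDigest>,
}
```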
There are a few notable TODOs:
- We don't support hash-checking mode for unnamed requirements. These
should be _somewhat_ rare, though, since `pip compile` never outputs
unnamed requirements. I can fix this; it's just some additional work.
- We don't automatically enable `--require-hashes` when a hash exists in
the requirements file; the user must pass `--require-hashes` explicitly.
Closes #474.
## Test Plan
I'd like to add some tests for registries that report incorrect hashes,
but otherwise: `cargo test`
## Summary
If we build a source distribution from the registry, and the version
doesn't match that of the filename, we should error, just as we do for
mismatched package names. However, we should also backtrack here, which
we didn't previously.
Closes https://github.com/astral-sh/uv/issues/2953.
## Test Plan
Verified that `cargo run pip install docutils --verbose --no-cache
--reinstall` installs `docutils==0.21` instead of the invalid
`docutils==0.21.post1`.
In the logs, I see:
```
WARN Unable to extract metadata for docutils: Package metadata version `0.21` does not match given version `0.21.post1`
```
## Summary
Right now, we have a `Hashes` representation that looks like:
```rust
/// A dictionary mapping a hash name to a hex encoded digest of the file.
///
/// PEP 691 says multiple hashes can be included and the interpretation is left to the client.
#[derive(Debug, Clone, Eq, PartialEq, Default, Deserialize)]
pub struct Hashes {
    pub md5: Option<Box<str>>,
    pub sha256: Option<Box<str>>,
    pub sha384: Option<Box<str>>,
    pub sha512: Option<Box<str>>,
}
```
It stems from the PyPI API, which returns a dictionary of hashes.
We tend to pass these around as a `Vec<Hashes>`, which is a bit strange
because each entry in that vector could itself contain multiple hashes.
It also makes it difficult to ask questions like "Is
`sha256:ab21378ca980a8` in the set of hashes?"
This PR instead treats `Hashes` as the PyPI-facing type, and uses a new
`HashDigest` type (passed around as `Vec<HashDigest>`) everywhere in our own APIs.
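A sketch of the flattening (the `HashDigest` shape here is my assumption; only the `Hashes` fields come from the snippet above):
```rust
/// Assumed shape for the flattened type: one algorithm plus one hex digest.
#[derive(Debug, Clone, PartialEq, Eq)]
pub struct HashDigest {
    pub algorithm: &'static str,
    pub digest: Box<str>,
}

impl Hashes {
    /// Flatten the PyPI-style dictionary into individual digests, so a question
    /// like "is this digest in the set?" becomes a simple `contains` check.
    pub fn into_digests(self) -> Vec<HashDigest> {
        let mut digests = Vec::new();
        if let Some(md5) = self.md5 {
            digests.push(HashDigest { algorithm: "md5", digest: md5 });
        }
        if let Some(sha256) = self.sha256 {
            digests.push(HashDigest { algorithm: "sha256", digest: sha256 });
        }
        if let Some(sha384) = self.sha384 {
            digests.push(HashDigest { algorithm: "sha384", digest: sha384 });
        }
        if let Some(sha512) = self.sha512 {
            digests.push(HashDigest { algorithm: "sha512", digest: sha512 });
        }
        digests
    }
}
```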
## Summary
We have a heuristic in `File` that attempts to detect whether a URL is
absolute or relative. However, `contains("://")` is prone to false
positives. In the linked issues, the URLs look like:
```
/packages/5a/d8/4d75d1e4287ad9d051aab793c68f902c9c55c4397636b5ee540ebd15aedf/pytz-2005k.tar.bz2?hash=597b596dc1c2c130cd0a57a043459c3bd6477c640c07ac34ca3ce8eed7e6f30c&remote=4d75d1e428/pytz-2005k.tar.bz2 (sha256)=597b596dc1c2c130cd0a57a043459c3bd6477c640c07ac34ca3ce8eed7e6f30c
```
Which is relative, but includes `://`.
Instead, we should determine whether the URL has a _scheme_, matching
what the `Url` crate does internally.
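For illustration, a scheme check in the spirit of what the `Url` crate's parser accepts (RFC 3986: an ASCII letter, then letters, digits, `+`, `-`, or `.`, terminated by `:`); this is a sketch, not the code in the PR:
```rust
/// Returns true if `s` starts with a URL scheme per RFC 3986, which is what
/// an absolute URL requires.
fn has_scheme(s: &str) -> bool {
    let Some((scheme, _)) = s.split_once(':') else {
        return false;
    };
    let mut chars = scheme.chars();
    matches!(chars.next(), Some(c) if c.is_ascii_alphabetic())
        && chars.all(|c| c.is_ascii_alphanumeric() || matches!(c, '+' | '-' | '.'))
}
```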
Closes https://github.com/astral-sh/uv/issues/2899.
With pubgrub being fast for complex ranges, we can now compute the next
n candidates without taking a performance hit. This speeds up a cold-cache
resolution of `urllib3<1.25.4` `boto3` from roughly 40s-50s to ~2s. See the
docstrings for details on the heuristics.
---
We need two parts to the prefetching: first we look for compatible
versions, then we fall back to flat next versions. After we've selected a
boto3 version, there is only one compatible botocore version remaining,
so we won't find other compatible candidates to prefetch. We see
this as a pattern where we only prefetch boto3 (stacked bars), but not
botocore (sequential requests between the stacked bars).

The risk is that we're completely wrong with the guess and cause a lot
of useless network requests. I think this is acceptable, since this
mechanism only triggers when we're already on the bad path, and within a
few seconds we should have fetched all versions anyway (assuming a fast
index like PyPI).
---
It would be even better if the pubgrub state were copy-on-write so we
could simulate more progress than we actually have; currently, we're
guessing what the next version is, which could be completely wrong, but I
think this is still a valuable heuristic.
Fixes #170.
## Summary
I noticed in #2769 that I was now stripping `.git` suffixes from Git
URLs after resolving to a precise commit. This PR cleans up the internal
caching to use a better canonical representation: a `RepositoryUrl`
along with a `GitReference`, instead of a `GitUrl` which can contain
non-canonical data. This gives us better fidelity (preserving the
`.git` suffix, along with any casing the user provided when defining the
URL) and is overall cleaner and more robust.
## Summary
Ensures that if we resolve any distributions before the resolver, we
cache the metadata in-memory.
_Also_ ensures that we lock (important!) when resolving Git
distributions.
Previously, we did not consider installed distributions as candidates
while performing resolution. Here, we update the resolver to use
installed distributions that satisfy requirements instead of pulling new
distributions from the registry.
The implementation details are as follows:
- We now provide `SitePackages` to the `CandidateSelector`
- If an installed distribution satisfies the requirement, we prefer it
over remote distributions
- We do not want to allow installed distributions in some cases, i.e.,
upgrade and reinstall
- We address this by introducing an `Exclusions` type which tracks
installed packages to ignore during selection
- There's a new `ResolvedDist` wrapper with `Installed(InstalledDist)`
and `Installable(Dist)` variants
- This lets us pass already installed distributions throughout the
resolver
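A sketch of that wrapper, using the variant names from the last bullet (`InstalledDist` and `Dist` are uv's existing distribution types):
```rust
/// Either a distribution that's already installed in the environment, or one
/// that would need to be fetched and installed.
#[derive(Debug, Clone)]
pub enum ResolvedDist {
    Installed(InstalledDist),
    Installable(Dist),
}
```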
The user-facing behavior is thoroughly covered in the tests, but
briefly:
- Installing a package that depends on an already-installed package
prefers the local version over the index
- Installing a package with a name that matches an already-installed URL
package does not reinstall from the index
- Reinstalling (`--reinstall`) a package by name _will_ pull from the
index even if an already-installed URL package is present
- To reinstall the URL package, you must specify the URL in the request
Closes https://github.com/astral-sh/uv/issues/1661
Addresses:
- https://github.com/astral-sh/uv/issues/1476
- https://github.com/astral-sh/uv/issues/1856
- https://github.com/astral-sh/uv/issues/2093
- https://github.com/astral-sh/uv/issues/2282
- https://github.com/astral-sh/uv/issues/2383
- https://github.com/astral-sh/uv/issues/2560
## Test Plan
- [x] Reproduction at `charlesnicholson/uv-pep420-bug` passes
- [x] Unit test for editable package
([#1476](https://github.com/astral-sh/uv/issues/1476))
- [x] Unit test for previously installed package with empty registry
- [x] Unit test for local non-editable package
- [x] Unit test for new version available locally but not in registry
([#2093](https://github.com/astral-sh/uv/issues/2093))
- ~~[ ] Unit test for wheel not available in registry but already
installed locally
([#2282](https://github.com/astral-sh/uv/issues/2282))~~ (seems
complicated and not worthwhile)
- [x] Unit test for install from URL dependency then with matching
version ([#2383](https://github.com/astral-sh/uv/issues/2383))
- [x] Unit test for install of new package that depends on installed
package does not change version
([#2560](https://github.com/astral-sh/uv/issues/2560))
- [x] Unit test that `pip compile` does _not_ consider installed
packages
## Summary
This PR enables the resolver to "accept" URLs, prereleases, and local
version specifiers for direct dependencies of path dependencies. As a
result, `uv pip install .` and `uv pip install -e .` now behave
identically, in that neither has a restriction on URL dependencies and
the like.
Closes https://github.com/astral-sh/uv/issues/2643.
Closes https://github.com/astral-sh/uv/issues/1853.
## Summary
`uv` was failing to install requirements defined like:
```
file://localhost/Users/crmarsh/Downloads/iniconfig-2.0.0-py3-none-any.whl
```
Closes https://github.com/astral-sh/uv/issues/2652.
## Summary
Closes Issue:
- https://github.com/astral-sh/uv/issues/2626
## Test Plan
```
cargo run -- pip install -r dev-requirements.txt -r requirements.txt
```
where both requirements files have the same `--index-url`.
## Summary
This PR enables the source distribution database to be used with unnamed
requirements (i.e., URLs without a package name). The (significant)
upside here is that we can now use PEP 517 hooks to resolve unnamed
requirement metadata _and_ reuse any computation in the cache.
The changes to `crates/uv-distribution/src/source/mod.rs` are quite
extensive, but mostly mechanical. The core idea is that we introduce a
new `BuildableSource` abstraction, which can either be a distribution,
or an unnamed URL:
```rust
/// A reference to a source that can be built into a built distribution.
///
/// This can either be a distribution (e.g., a package on a registry) or a direct URL.
///
/// Distributions can _also_ point to URLs in lieu of a registry; however, the primary distinction
/// here is that a distribution will always include a package name, while a URL will not.
#[derive(Debug, Clone, Copy)]
pub enum BuildableSource<'a> {
    Dist(&'a SourceDist),
    Url(SourceUrl<'a>),
}
```
All the methods on the source distribution database now accept
`BuildableSource`. `BuildableSource` has a `name()` method, but it
returns `Option<&PackageName>`, and everything is required to work with
and without a package name.
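For instance, the `name()` accessor is roughly of this shape (a sketch; the `Dist` arm deferring to the distribution's own name accessor is my assumption):
```rust
impl<'a> BuildableSource<'a> {
    /// The package name, if this source is a named distribution.
    pub fn name(&self) -> Option<&PackageName> {
        match self {
            Self::Dist(dist) => Some(dist.name()),
            Self::Url(_) => None,
        }
    }
}
```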
The main drawback of this approach (which isn't a terrible one) is that
we can no longer include the package name in the cache route. (We do
continue to use the package name for registry-based distributions, since
those always have a name.) The package name was included in the cache
route for two reasons: (1) it's nice for debugging; and (2) we use it to
power `uv cache clean flask`, to identify the entries that are relevant
for Flask.
To solve this, I changed the `uv cache clean` code to look one level
deeper. So, when we want to determine whether to remove the cache entry
for a given URL, we now look into the directory to see if there are any
wheels that match the package name. This isn't as nice, but it does work
(and we have test coverage for it -- all passing).
I also considered removing the package name from the cache routes for
non-registry _wheels_, for consistency... But, it would require a cache
bump, and it didn't feel important enough to merit that.
Scott Schafer gave me the idea: we can avoid repeating the path for
workspace dependencies everywhere if we declare them in the virtual
package once and treat them as workspace dependencies from there on.
## Summary
This PR changes our user-facing representation for paths to use relative
paths, when the path is within the current working directory. This
mirrors what we do in Ruff. (If the path is _outside_ the current
working directory, we print an absolute path.)
Before:
```shell
❯ uv venv .venv2
Using Python 3.12.2 interpreter at: /Users/crmarsh/workspace/uv/.venv/bin/python3
Creating virtualenv at: .venv2
Activate with: source .venv2/bin/activate
```
After:
```shell
❯ cargo run venv .venv2
Finished dev [unoptimized + debuginfo] target(s) in 0.15s
Running `target/debug/uv venv .venv2`
Using Python 3.12.2 interpreter at: .venv/bin/python3
Creating virtualenv at: .venv2
Activate with: source .venv2/bin/activate
```
Note that we still want to use the existing `.simplified_display()`
anywhere that the path is being simplified, but _still_ intended for
machine consumption (e.g., when passing to `.current_dir()`).
## Summary
I tried out `cargo shear` to see if there are any unused dependencies
that `cargo udeps` isn't reporting. It turns out there are a few. This
PR removes those dependencies.
## Test Plan
`cargo build`
## Summary
Right now, the middleware doesn't apply credentials that were
_originally_ sourced from a URL. This requires that we call
`with_url_encoded_auth` whenever we create a request to ensure that any
credentials that were passed in as part of an index URL (for example)
are respected.
This PR modifies `uv-auth` to instead apply those credentials in the
middleware itself. This seems preferable to me. As far as I can tell, we
can _only_ add in-URL credentials to the store ourselves (since in-URL
credentials are converted to headers by the time they reach the
middleware). And if we ever _didn't_ apply those credentials to new
URLs, it'd be a bug in the logic that precedes the middleware (i.e., us
forgetting to call `with_url_encoded_auth`).
## Test Plan
`cargo run pip install` with an authenticated index.
## Summary
The authentication middleware extracts in-URL credentials from URLs that
pass through it; however, by the time a request reaches the store, the
credentials will have already been removed, and relocated to the header.
So we were never propagating in-URL credentials.
This PR adds an explicit pass wherein we pass in-URL credentials to the
store prior to doing any work.
Closes https://github.com/astral-sh/uv/issues/2444.
## Test Plan
`cargo run pip install` against an authenticated AWS registry.
## Summary
Adds basic keyring auth support for `uv` commands. Adds a clone of `pip`'s
`--keyring-provider subprocess` argument (using the `keyring` CLI tool).
See issue: https://github.com/astral-sh/uv/issues/1520
## Test Plan
It's hard to write full-suite unit tests due to the reliance on
`process::Command` for the `keyring` CLI.
Manually tested end-to-end in a project with GCP Artifact Registry using
a keyring password:
```bash
➜ uv pip uninstall watchdog
Uninstalled 1 package in 46ms
- watchdog==4.0.0
➜ cargo run -- pip install --index-url https://<redacted>/python/simple/ --extra-index-url https://<redacted>/pypi-mirror/simple/ watchdog
Finished dev [unoptimized + debuginfo] target(s) in 0.18s
Running `target/debug/uv pip install --index-url 'https://<redacted>/python/simple/' --extra-index-url 'https://<redacted>/pypi-mirror/simple/' watchdog`
error: HTTP status client error (401 Unauthorized) for url (https://<redacted>/pypi-mirror/simple/watchdog/)
➜ cargo run -- pip install --keyring-provider subprocess --index-url https://<redacted>/python/simple/ --extra-index-url https://<redacted>/pypi-mirror/simple/ watchdog
Finished dev [unoptimized + debuginfo] target(s) in 0.17s
Running `target/debug/uv pip install --keyring-provider subprocess --index-url 'https://<redacted>/python/simple/' --extra-index-url 'https://<redacted>/pypi-mirror/simple/' watchdog`
Resolved 1 package in 2.34s
Installed 1 package in 27ms
+ watchdog==4.0.0
```
`requirements.txt`
```
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
# .bin/generate-requirements
#
--index-url https://<redacted>/python/simple/
--extra-index-url https://<redacted>/pypi-mirror/simple/
...
```
```bash
➜ cargo run -- pip install --keyring-provider subprocess -r requirements.txt
Finished dev [unoptimized + debuginfo] target(s) in 0.19s
Running `target/debug/uv pip install --keyring-provider subprocess -r requirements.txt`
Resolved 205 packages in 23.52s
Built <redacted>
...
Downloaded 47 packages in 19.32s
Installed 195 packages in 276ms
+ <redacted>
...
```
---------
Co-authored-by: Thomas Gilgenast <thomas@vant.ai>
Co-authored-by: Zanie Blue <contact@zanie.dev>
## Summary
This PR ensures that we expand environment variables _before_ sniffing
for the URL scheme (e.g., `file://` vs. `https://` vs. something else).
Closes https://github.com/astral-sh/uv/issues/2375.
Extends the "compatibility" types introduced in #1293 to apply to source
distributions as well as wheels.
- We now track the most-relevant incompatible source distribution
- Exclude-newer, Python-requirement, and yanked incompatibilities are all
tracked in the new model (this lets us remove `DistMetadata`!)
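In sketch form, the tracked source-distribution incompatibilities mirror the wheel ones (variant names here are illustrative):
```rust
/// Illustrative variants for the incompatibilities listed above; the real
/// variants likely carry payloads (e.g., the required Python version).
#[derive(Debug, Clone, PartialEq, Eq)]
enum IncompatibleSource {
    /// Published after the `--exclude-newer` cutoff.
    ExcludeNewer,
    /// Requires a Python version the target interpreter doesn't satisfy.
    RequiresPython,
    /// The file was yanked from the index.
    Yanked,
}
```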
## Summary
PyPI now supports Metadata 2.2, which means distributions with Metadata
2.2-compliant metadata will start to appear. The upside is that if a
source distribution includes a `PKG-INFO` file with (1) a metadata
version of 2.2 or greater, and (2) no dynamic fields (at least, of the
fields we rely on), we can read the metadata from the `PKG-INFO` file
directly rather than running _any_ of the PEP 517 build hooks.
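The gating condition is roughly this (a sketch under assumed argument names; the real check lives in uv's metadata handling):
```rust
/// Whether a `PKG-INFO` file can be trusted without running PEP 517 hooks:
/// Metadata-Version 2.2 or newer, and no `Dynamic` fields declared (uv only
/// actually cares about the dynamic fields it relies on).
fn pkg_info_is_usable(metadata_version: &str, dynamic_fields: &[String]) -> bool {
    let Some((major, minor)) = metadata_version.split_once('.') else {
        return false;
    };
    let (Ok(major), Ok(minor)) = (major.parse::<u32>(), minor.parse::<u32>()) else {
        return false;
    };
    (major, minor) >= (2, 2) && dynamic_fields.is_empty()
}
```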
Closes https://github.com/astral-sh/uv/issues/2009.
Previously, `uv` would always prioritize the index given by
`--index-url`. It would then try any indexes after that given by zero
or more `--extra-index-url` flags. This differed from `pip` in that any
priority was given at all, whereas `pip` doesn't guarantee any priority
ordering of indexes.
We could go in the direction of mimicking `pip`'s behavior here, but at
present it has issues with dependency confusion attacks, where packages
may get installed from indexes you don't control. More specifically,
there is an issue of different trust levels. See discussion in #171 and
[PEP-0708] for more on the security impact.
In contrast, `uv` will only select versions for a package from a single
index. That is, even if `foo` is in indexes `a` and `b`, it will
only consider the versions from the index that it checks first. This
probably helps with respect to dependency confusion attacks, but also
means that `uv` doesn't quite cover all of the same use cases as `pip`.
In this PR, we retain the notion of prioritizing indexes, but
tweak it so that PyPI is preferred last as opposed to first. Or
more precisely, the `--index-url` flag specifies a fallback index,
not the primary index, and is deprioritized beneath every index
specified by `--extra-index-url`. The ordering among indexes given by
`--extra-index-url` remains the same: earlier indexes are prioritized
over later indexes.
While this tweak likely won't hit all use cases, I believe it will
resolve some of the most common pain points without exacerbating
dependency confusion problems.
Ref #171, Fixes #1377, Fixes #1451, Fixes #1600
[PEP-0708]: https://peps.python.org/pep-0708/
## Summary
Internal refactor to `PrioritizedDistribution` that I think should
reduce its size, although the motivation here is simplicity, not perf.
Instead of storing:
```rust
/// The highest-priority, installable wheel for the package version.
compatible_wheel: Option<(DistMetadata, TagPriority)>,
/// The most-relevant, incompatible wheel for the package version.
incompatible_wheel: Option<(DistMetadata, IncompatibleWheel)>,
```
We now store:
```rust
wheel: Option<(DistMetadata, WheelCompatibility)>,
```
Where `WheelCompatibility` is an enum of `TagPriority` or
`IncompatibleWheel`.
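In other words (variant names are my guess; the text above only pins down the payload types):
```rust
/// Sketch: either a compatible wheel with its tag priority, or the
/// most-relevant reason the best wheel is incompatible.
enum WheelCompatibility {
    Compatible(TagPriority),
    Incompatible(IncompatibleWheel),
}
```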
## Summary
This also preserves the environment variables in the output file, e.g.:
```
Resolved 1 package in 216ms
# This file was autogenerated by uv via the following command:
# uv pip compile requirements.in --emit-index-url
--index-url https://test.pypi.org/${SUFFIX}
requests==2.5.4.1
```
I'm torn on whether that's correct or undesirable here.
Closes #2035.
Hi, love your work on `uv` 👋!
Opening a draft PR early to check whether there are any existing Rust
table-formatting libraries that I'm unaware of (either already in
`uv`/`ruff`, or in the Rust ecosystem) before spending much time
reinventing the wheel myself and cleaning it up. Any other pointers are
also welcome (e.g., on the editable filtering).
Editable project locations in `uv pip list` include the file scheme
(`file://`), whereas it is omitted in `pip list`. Is this desired, or
should it replicate pip?
## Summary
Implementation for #1401.
The `--editable` flag is implemented.
`--outdated` and `--uptodate` are out of scope for this PR (they require
latest-version information and the wheel/sdist type).
## Test Plan
Not yet implemented, as I couldn't locate the tests for `uv pip freeze`.
Should we compare against `pip` in
`scripts/compare_with_pip/compare_with_pip.py`?
Address a few pedantic lints.
The lints are separated into separate commits so they can be reviewed
individually.
I've not added enforcement for any of these lints, but that could be
added if desirable.