## Summary
This is annoying both locally and in CI. If anyone wants to fuss with the
filters to fix it, that's fine too, but IMO it's better to disable than
leave it enabled on macOS for now.
When using `tool.uv.sources`, we warn when requirements are missing a bound,
i.e. at least a lower version constraint.
When using a library, the symbols you import were introduced in
different versions, creating an implicit lower bound. This warning makes
that bound explicit. This is crucial to prevent backtracking resolvers from
selecting an ancient version that isn't compatible (or worse, doesn't
build), and it's a performance optimization on top.
This feature is gated to `tool.uv.sources` (as it should have been to
begin with for #3263/#3443) to not unnecessarily break legacy workflows.
It is also helpful specifically when using a `tool.uv.sources` section
that contains constraints that are not published to PyPI, e.g. for
workspace dependencies. We can adjust those later, e.g. to not constrain
workspace dependencies with `publish = false`, but I think it's the
right setting to start with.
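To make "bound" concrete, here is a minimal, self-contained sketch (simplified stand-in types, not uv's actual check in `pep440_rs`): any `>`, `>=`, `==`, `===`, or `~=` specifier provides a lower bound, while an empty specifier set or a pure upper bound does not.
```rust
// Simplified stand-ins for PEP 440 specifiers; uv's real types live in pep440_rs.
#[derive(Clone, Copy)]
enum Op {
    Gt,
    Ge,
    Eq,
    ExactEq,
    TildeEq,
    Lt,
    Le,
    Ne,
}

/// A requirement has a lower bound if any of its specifiers pins it from below.
fn has_lower_bound(specifiers: &[Op]) -> bool {
    specifiers
        .iter()
        .copied()
        .any(|op| matches!(op, Op::Gt | Op::Ge | Op::Eq | Op::ExactEq | Op::TildeEq))
}

fn main() {
    assert!(!has_lower_bound(&[]));              // "tqdm": would warn
    assert!(!has_lower_bound(&[Op::Lt]));        // "tqdm <5": would warn
    assert!(has_lower_bound(&[Op::Ge, Op::Lt])); // "tqdm >=4.66.2,<5": fine
}
```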
## Summary
These aren't intended for production use; instead, I'm just trying to
frame out the overall data flows and code-sharing for these commands. We
now have `uv sync` (sync the environment to match the lockfile, without
refreshing or resolving) and `uv lock` (generate the lockfile). Both
_require_ a virtual environment to exist (something we should change).
`uv sync`, `uv run`, and `uv lock` all share code for the underlying
subroutines (resolution and installation), so the commands themselves
are relatively small (~100 lines) and mostly consist of reading
arguments and such.
`uv lock` and `uv sync` don't really work yet, because we have
no way to include the project itself in the lockfile (that's a TODO in
the lockfile implementation).
Closes https://github.com/astral-sh/uv/issues/3432.
We would previously show the parsed version when erroring due to
trailing content after a valid version, which can look different from
the input. E.g., when encountering `0.1-bulbasaur`, we would display:
```
after parsing '0.1b0', found 'ulbasaur', which is not part of a valid version
```
By storing the input string instead of the parsed version, we now show:
```
after parsing '0.1-b', found 'ulbasaur', which is not part of a valid version
```
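A minimal sketch of the idea (type and field names are illustrative, not uv's actual error type): keep the raw input plus the length of the valid prefix, and echo both back verbatim in the error message.
```rust
// Illustrative error type: store the raw input and how far parsing got, then
// split the original string when rendering the message.
struct TrailingContentError {
    input: String,   // e.g. "0.1-bulbasaur", exactly as written
    consumed: usize, // length of the valid prefix, e.g. "0.1-b".len()
}

impl std::fmt::Display for TrailingContentError {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        let (parsed, rest) = self.input.split_at(self.consumed);
        write!(
            f,
            "after parsing '{parsed}', found '{rest}', which is not part of a valid version"
        )
    }
}

fn main() {
    let err = TrailingContentError {
        input: "0.1-bulbasaur".to_string(),
        consumed: "0.1-b".len(),
    };
    // Prints the input as written, not the normalized '0.1b0'.
    println!("{err}");
}
```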
It turns out setuptools often uses Metadata-Version 2.1 in their
PKG-INFO:
4e766834d7/setuptools/dist.py (L64)
`Metadata23` requires Metadata-Version of at least 2.2.
This means that uv doesn't actually recognise legacy editable
installations produced by the most common way you'd actually get legacy
editable installations (it works great for most legacy editables I make at
work, though!)
Anyway, over here we only need the version and don't care about anything
else. Rather than make a `Metadata21`, I just add a version field to
`Metadata10`. The one slightly tricky thing is that only
Metadata-Version 1.2 and greater guarantee that the [version number is
PEP 440 compatible](https://peps.python.org/pep-0345/#version), so I
store the version in `Metadata10` as a `String` and only parse to
`Version` at time of use.
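A rough sketch of that shape (the field set is simplified and not uv's exact struct; it assumes the `pep440_rs` crate's `Version` with its `FromStr` impl):
```rust
use std::str::FromStr;

use pep440_rs::Version;

// Field set is simplified; the point is that `version` stays a raw string.
struct Metadata10 {
    name: String,
    version: Option<String>, // may predate PEP 440, e.g. paramiko's 2004-era versions
}

impl Metadata10 {
    /// Parse lazily; callers that need a real `Version` decide how to handle failures.
    fn parsed_version(&self) -> Option<Version> {
        self.version
            .as_deref()
            .and_then(|raw| Version::from_str(raw).ok())
    }
}
```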
Also, did you know that back in 2004, paramiko had a Pokémon-based
versioning system?
Pubgrub got a new feature where all unavailability is a custom type, instead
of the previously reasonless `UnavailableDependencies` and our custom `String`
type (https://github.com/pubgrub-rs/pubgrub/pull/208). This PR
introduces an `UnavailableReason` that tracks either an entire package
being unusable, or a specific version. The error messages now also track
this difference properly.
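Roughly, the distinction looks like this (variants simplified; the real enum wraps dedicated reason types rather than strings):
```rust
// Simplified sketch of the two cases tracked by the new reason type.
enum UnavailableReason {
    /// The entire package is unusable, e.g. it wasn't found in any index.
    Package(String),
    /// Only this specific version is unusable, e.g. its source distribution failed to build.
    Version(String),
}
```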
The pubgrub commit is our `main` rebased onto the merged
https://github.com/pubgrub-rs/pubgrub/pull/208; I'll push
`konsti/main-rebase-generic-reason` to `main` after checking for rebase
problems.
## Summary
It's not clear to me that this should exist at all, but it's causing
errors in projects that don't use `tool.uv.sources`, so we should
definitely remove it for now.
## Summary
We already have a global `--isolated`, which means "ignore any on-disk
configuration". I think we should reuse this for the "ignore the
workspace" setting in `uv run`, rather than `--no-workspace`.
I've also merged the existing `--isolated` and `--no-workspace`
behaviors in `uv run` into a single flag. We may not need separate flags
for this, since the current intent seems to be "ignore the workspace
environment"? Though we could always re-add later.
Closes https://github.com/astral-sh/uv/issues/3421.
Resolves [#3419](https://github.com/astral-sh/uv/issues/3419)
## Summary
Add compatibility arguments to the `pip install` command and hint the user to
create a venv for the `--user` arg.
## Test Plan
Tested it locally.
```bash
cargo run pip install --user flask
Compiling uv v0.1.39 (/home/ahmedilyas/uv/crates/uv)
Finished `dev` profile [unoptimized + debuginfo] target(s) in 8.96s
Running `target/debug/uv pip install --user flask`
error: pip install's `--user` is unsupported (use a virtual environment instead).
```
This change allows switching out the url type for requirements. The
original idea was to allow different types for different requirement
origins, so that core metadata reads can ban non-PEP 508 requirements
while we only allow them for requirements.txt. This didn't work out
because we expect `&Requirement`s from all sources to match.
I also tried to split `pep508_rs` into a PEP 508 compliant crate and
our own extensions, but they are too tightly coupled.
I still think this change is an improvement, as it reduces the hardcoded
dependence on `VerbatimUrl`.
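As a simplified, self-contained illustration of the direction (these are not pep508_rs' actual definitions), the requirement becomes generic over its url type rather than hardcoding a verbatim-url type:
```rust
// Simplified, self-contained sketch: the url representation is a type parameter.
struct Requirement<Url> {
    name: String,
    specifiers: Vec<String>,
    url: Option<Url>,
}

/// The url exactly as written, e.g. in a requirements.txt file.
struct VerbatimUrl(String);

/// A hypothetical stricter, already-parsed representation for core metadata.
struct ParsedUrl {
    scheme: String,
    rest: String,
}

// Different origins could, in principle, pick different url types...
type TxtRequirement = Requirement<VerbatimUrl>;
type MetadataRequirement = Requirement<ParsedUrl>;
// ...though in practice all call sites currently expect the same `&Requirement` type.
```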
We now correctly emit relative paths in `uv pip compile` with
`tool.uv.sources` path inputs.
`tool.uv.sources` is mainly intended to be used with the uv lock file
over requirements.txt, but it's good to have basic `uv pip` support
working.
Fixes #3366
## Summary
All of the resolver code is run on the main thread, so a lot of the
`Send` bounds and uses of `DashMap` and `Arc` are unnecessary. We could
also switch to using single-threaded versions of `Mutex` and `Notify` in
some places, but there isn't really a crate providing those that I would
be comfortable using.
The `Arc` in `OnceMap` can't easily be removed because of the uv-auth
code, which uses the
[reqwest-middleware](https://docs.rs/reqwest-middleware/latest/reqwest_middleware/trait.Middleware.html)
crate, which seems to add unnecessary `Send` bounds because of
`async-trait`. We could duplicate the code and create a `OnceMapLocal`
variant, but I don't feel that's worth it.
## Summary
Users often find themselves dropped into environments that contain
`.egg-info` packages. While we won't support installing these, it's not
hard to support identifying them (e.g., in `pip freeze`) and
_uninstalling_ them.
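As a loose sketch of the identification step (assuming the common `name-version-pytag.egg-info` directory naming; not uv's actual implementation):
```rust
// Recover a name and version from an `.egg-info` directory name such as
// "pip-24.0-py3.12.egg-info", so it can show up in `pip freeze` and be uninstalled.
fn egg_info_name_and_version(file_name: &str) -> Option<(String, String)> {
    let stem = file_name.strip_suffix(".egg-info")?;
    let mut parts = stem.splitn(3, '-');
    let name = parts.next()?.to_string();
    let version = parts.next()?.to_string();
    Some((name, version))
}

fn main() {
    assert_eq!(
        egg_info_name_and_version("pip-24.0-py3.12.egg-info"),
        Some(("pip".to_string(), "24.0".to_string()))
    );
}
```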
Closes https://github.com/astral-sh/uv/issues/2841.
Closes #2928.
Closes #3341.
## Test Plan
Ran `cargo run pip freeze --python
/opt/homebrew/Caskroom/miniforge/base/envs/TEST/bin/python`, with an
environment that includes `pip` as an `.egg-info`
(`/opt/homebrew/Caskroom/miniforge/base/envs/TEST/lib/python3.12/site-packages/pip-24.0-py3.12.egg-info`):
```
cffi @ file:///Users/runner/miniforge3/conda-bld/cffi_1696001825047/work
pip==24.0
pycparser @ file:///home/conda/feedstock_root/build_artifacts/pycparser_1711811537435/work
setuptools==69.5.1
wheel==0.43.0
```
Then ran `cargo run pip uninstall`, verified that `pip` was uninstalled,
and confirmed that it was no longer listed in `pip freeze`.
Fixes #3371
It seems like uv doesn't proactively enforce Python 3.8+ and in most cases just
issues a warning. This PR keeps that property, only adding the new check
when it is known to fail. I checked the imports in this file and the
other ones seem fine.
## Summary
Refreshes some of the activation scripts, and fixes some bugs in
`activate_this.py` that were likely the result of some erroneous
copy-pasting.
Closes https://github.com/astral-sh/uv/issues/3346.
## Test Plan
```
❯ python
Python 3.12.0 (main, Feb 28 2024, 09:44:16) [Clang 15.0.0 (clang-1500.1.0.2.5)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import httpx
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'httpx'
>>> activator = '.venv/bin/activate_this.py'
>>> with open(activator) as f:
... exec(f.read(), {'__file__': activator})
...
>>> import httpx
```
Only allow using `tool.uv.sources` in preview mode; the design isn't
finalized yet.
Not sure what to label this; do we want a preview section and label for
the release notes?
## Summary
We need to partition the editable and non-editable requirements. As-is,
`editable = true` requirements were still being installed as
non-editable.
## Summary
We were writing the build dependencies into the `--target` directory,
which both made builds fail and led to them leaking into the user's
directory.
Closes https://github.com/astral-sh/uv/issues/3349.
## Introduction
PEP 621 is limited. Specifically, it lacks
* Relative path support
* Editable support
* Workspace support
* Index pinning or any sort of index specification
The semantics of urls are a custom extension: PEP 440 does not specify
how to use git references or subdirectories; instead, pip has a custom
stringly-typed format. We need to somehow support these while still staying
compatible with PEP 621.
## `tool.uv.sources`
Drawing inspiration from cargo, poetry, and rye, we add `tool.uv.sources`
and (for now stub only) `tool.uv.workspace`:
```toml
[project]
name = "albatross"
version = "0.1.0"
dependencies = [
    "tqdm >=4.66.2,<5",
    "torch ==2.2.2",
    "transformers[torch] >=4.39.3,<5",
    "importlib_metadata >=7.1.0,<8; python_version < '3.10'",
    "mollymawk ==0.1.0"
]

[tool.uv.sources]
tqdm = { git = "https://github.com/tqdm/tqdm", rev = "cc372d09dcd5a5eabdc6ed4cf365bdb0be004d44" }
importlib_metadata = { url = "https://github.com/python/importlib_metadata/archive/refs/tags/v7.1.0.zip" }
torch = { index = "torch-cu118" }
mollymawk = { workspace = true }

[tool.uv.workspace]
include = [
    "packages/mollymawk"
]

[tool.uv.indexes]
torch-cu118 = "https://download.pytorch.org/whl/cu118"
```
See `docs/specifying_dependencies.md` for a detailed explanation of the
format. The basic gist is that `project.dependencies` is what ends up on
PyPI, while `tool.uv.sources` are your non-published additions. We do
support the full range of PEP 508; we just hide it in the docs and
prefer the exploded table for easier readability and less confusion with
actual url parts.
This format should eventually be able to subsume requirements.txt's
current use cases. While we will continue to support the legacy `uv pip`
interface, this is a piece of uv's own top-level interface. Together
with `uv run` and a lockfile format, you should only need to write
`pyproject.toml` and run `uv run`, which generates/uses/updates your
lockfile behind the scenes, with no pip-style requirements involved. It
also lays the groundwork for implementing index pinning.
## Changes
This PR implements:
* Reading and lowering `project.dependencies`,
`project.optional-dependencies` and `tool.uv.sources` into a new
requirements format, including:
  * Git dependencies
  * Url dependencies
  * Path dependencies, including relative and editable
* `pip install` integration
* Error reporting for invalid `tool.uv.sources`
* JSON Schema integration (works in PyCharm, see below)
* Draft user-level docs (see `docs/specifying_dependencies.md`)
It does not implement:
* `pip compile` testing (deprioritized in favor of our own lockfile)
* Index pinning (stub definitions only)
* Development dependencies
* Workspace support (stub definitions only)
* Overrides in pyproject.toml
* Patching/replacing dependencies
One technically breaking change is that we now require a user-provided
pyproject.toml to be valid with respect to PEP 621. Included files still fall
back to PEP 517. That means a pyproject.toml provided directly must be
valid, while `pip install -r requirements.txt` with `-e .` as content
falls back to PEP 517 as before.
## Implementation
The `pep508` requirement is replaced by a new `UvRequirement` (name up
for bikeshedding, not particularly attached to the uv prefix). The still
existing `pep508_rs::Requirement` type uses a url format copied from pip's
requirements.txt and doesn't appropriately capture all the features we
want/need to support. The bulk of the diff is changing the requirement
type throughout the codebase.
We still use `VerbatimUrl` in many places, where we would expect a
parsed/decomposed url type, specifically:
* Reading core metadata, except for top-level pyproject.toml files; we fail a
step later instead if the url isn't supported.
* Allowed `Urls`.
* `PackageId` with a custom `CanonicalUrl` comparison, instead of
canonicalizing urls eagerly.
* `PubGrubPackage`: We eventually convert the `VerbatimUrl` back to a
`Dist` (`Dist::from_url`), instead of remembering the url.
* Source dist types: We use verbatim url even though we know and require
that these are supported urls we can and have parsed.
I tried to improve the situation by replacing `VerbatimUrl`, but
that would require massive, invasive changes (see e.g.
https://github.com/astral-sh/uv/pull/3253). A main problem is the ref
`VersionOrUrl` and applying overrides, which assume the same
requirement/url type everywhere. In its current form, this PR increases
this tech debt.
I've tried to split off PRs and commits, but the main refactoring is
still a single monolithic commit to make it compile and the tests pass.
## Demo
Adding
d1ae3b85d5/pyproject.json
as a JSON Schema (draft 7) to PyCharm for `pyproject.toml`, you can try the IDE
support already:

[dove.webm](c293c272-c80b-459d-8c95-8c46a8d198a1)
In *some* places in our crates, `serde` (and `rkyv`) are optional
dependencies. I believe this was done out of reasons of "good sense,"
that is, it follows a Rust ecosystem pattern where serde integration
tends to be an opt-in crate feature. (And similarly for `rkyv`.)
However, ultimately, `uv` itself requires `serde` and `rkyv` to
function. Since our crates are strictly internal, there are limited
consumers for our crates without `serde` (and `rkyv`) enabled. I think
one possibility is that optional `serde` (and `rkyv`) integration means
that someone can do this:
```
cargo test -p pep440_rs
```
And this will run tests _without_ `serde` or `rkyv` enabled. That in
turn could lead to faster iteration time by reducing compile times. But
I'm not sure this is worth supporting. The iterative compilation times of
individual crates are probably fast enough in debug mode, even with
`serde` and `rkyv` enabled. Namely, `serde` and `rkyv` themselves
shouldn't need to be re-compiled in most cases. On `main`:
```
from-scratch: `cargo test -p pep440_rs --lib` 0.685
incremental: `cargo test -p pep440_rs --lib` 0.278s
from-scratch: `cargo test -p pep440_rs --features serde,rkyv --lib` 3.948s
incremental: `cargo test -p pep440_rs --features serde,rkyv --lib` 0.321s
```
So while a from-scratch build does take significantly longer, an
incremental build is about the same.
The benefit of doing this change is two-fold:
1. It brings our crates into alignment with "reality." In particular,
some crates were _implicitly_ relying on `serde` being enabled
without explicitly declaring it. This technically means that our
`Cargo.toml`s were wrong in some cases, but it is hard to observe it
because of feature unification in a Cargo workspace.
2. We no longer need to deal with the cognitive burden of writing
`#[cfg_attr(feature = "serde", ...)]` everywhere (see the sketch below).
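An illustrative before/after on a placeholder struct (not a real uv type):
```rust
// Before: the derive is gated behind an optional feature.
mod before {
    #[cfg_attr(feature = "serde", derive(serde::Serialize, serde::Deserialize))]
    pub struct Version {
        pub release: Vec<u64>,
    }
}

// After: serde is a required dependency, so the derive is unconditional.
mod after {
    #[derive(serde::Serialize, serde::Deserialize)]
    pub struct Version {
        pub release: Vec<u64>,
    }
}
```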
This PR principally adds a routine for converting a `Lock` to a
`Resolution`, where a `Resolution` is a map of package names pinned to
a specific version.
I'm not sure that a `Resolution` is ultimately what we want here (we
might need more stuff), but this was the quickest route I could find to
plug a `Lock` into our existing `uv pip install` infrastructure.
This commit also does a little refactoring of the `Lock` types. The
main thing is to permit extra state on some of the types (like a
`by_id` map on `Lock` for quick lookups of distributions) that aren't
included in the serialization format of a `Lock`. We achieve this
by defining separate `Wire` types that are automatically converted
to-and-from via `serde`.
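A minimal sketch of that pattern (field names are illustrative, not the actual lockfile schema): the `Wire` type is what serde sees, and the richer in-memory type is derived from it via `#[serde(from = ...)]`.
```rust
use std::collections::HashMap;

use serde::Deserialize;

/// The on-disk form: exactly what gets (de)serialized.
#[derive(Deserialize)]
struct LockWire {
    distributions: Vec<Distribution>,
}

#[derive(Deserialize)]
struct Distribution {
    id: String,
    version: String,
}

/// The in-memory form: carries a derived lookup index that never hits the file.
#[derive(Deserialize)]
#[serde(from = "LockWire")]
struct Lock {
    distributions: Vec<Distribution>,
    by_id: HashMap<String, usize>,
}

impl From<LockWire> for Lock {
    fn from(wire: LockWire) -> Self {
        // Rebuild the extra state from the wire form instead of storing it.
        let by_id = wire
            .distributions
            .iter()
            .enumerate()
            .map(|(index, dist)| (dist.id.clone(), index))
            .collect();
        Lock {
            distributions: wire.distributions,
            by_id,
        }
    }
}
```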
Note that like with the lock file format types themselves, we leave a
few `todo!()` expressions around. The main idea is to get something
minimally working without spending too much effort here. (A fair bit
of refactoring will be required to generate a lock file, and it's
not clear how much this code will wind up needing to change anyway.)
In particular, we only handle the case of installing wheels from a
registry.
A demonstration of the full flow:
```
$ cat requirements.in
anyio
$ cargo run -p uv -- pip compile -p3.10 requirements.in --unstable-uv-lock-file
$ uv venv
$ cargo run -p uv -- pip install --unstable-uv-lock-file anyio -r requirements.in
Installed 5 packages in 7ms
+ anyio==4.3.0
+ exceptiongroup==1.2.1
+ idna==3.7
+ sniffio==1.3.1
+ typing-extensions==4.11.0
```
In order to install from a lock file, we start from the root and do a
breadth first traversal over its dependencies. We aren't yet filtering
on marker expressions (since they aren't in the lock file yet), but we
should be able to add that in the future. In so doing, the traversal
should select only the subset of distributions relevant for the current
platform.
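A self-contained sketch of that traversal (not uv's actual code; marker filtering is omitted, since markers aren't in the lockfile yet):
```rust
use std::collections::{BTreeMap, BTreeSet, VecDeque};

/// Collect the package names reachable from `root` via a breadth-first walk of
/// the locked dependency graph; only these distributions get installed.
fn reachable_from(root: &str, deps: &BTreeMap<&str, Vec<&str>>) -> BTreeSet<String> {
    let mut seen = BTreeSet::new();
    let mut queue = VecDeque::from([root]);
    while let Some(name) = queue.pop_front() {
        if seen.insert(name.to_string()) {
            // Marker filtering would slot in here once markers land in the lockfile.
            for dep in deps.get(name).into_iter().flatten().copied() {
                queue.push_back(dep);
            }
        }
    }
    seen
}

fn main() {
    let deps = BTreeMap::from([
        ("anyio", vec!["idna", "sniffio"]),
        ("idna", vec![]),
        ("sniffio", vec![]),
        ("unrelated", vec![]),
    ]);
    // "unrelated" is locked but not reachable from the root, so it isn't installed.
    assert_eq!(reachable_from("anyio", &deps).len(), 3);
}
```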
Split out of #3266
The "selector" concept doesn't seem well enough defined as-is. For
example, `PythonVersion` belongs there but isn't present. Going for
smaller modules instead.