mirror of https://github.com/astral-sh/uv.git (synced 2025-08-04 19:08:04 +00:00)

# Add basic `tool.uv.sources` support (#3263)
## Introduction

PEP 621 is limited. Specifically, it lacks:

* Relative path support
* Editable support
* Workspace support
* Index pinning or any sort of index specification

The semantics of URLs are a custom extension; PEP 440 does not specify how to use Git references or subdirectories. Instead, pip has a custom stringly format. We need to somehow support these while still staying compatible with PEP 621.

## `tool.uv.sources`

Drawing inspiration from cargo, poetry, and rye, we add `tool.uv.sources` and (for now, stub only) `tool.uv.workspace`:

```toml
[project]
name = "albatross"
version = "0.1.0"
dependencies = [
  "tqdm >=4.66.2,<5",
  "torch ==2.2.2",
  "transformers[torch] >=4.39.3,<5",
  "importlib_metadata >=7.1.0,<8; python_version < '3.10'",
  "mollymawk ==0.1.0"
]

[tool.uv.sources]
tqdm = { git = "https://github.com/tqdm/tqdm", rev = "cc372d09dcd5a5eabdc6ed4cf365bdb0be004d44" }
importlib_metadata = { url = "https://github.com/python/importlib_metadata/archive/refs/tags/v7.1.0.zip" }
torch = { index = "torch-cu118" }
mollymawk = { workspace = true }

[tool.uv.workspace]
include = [
  "packages/mollymawk"
]

[tool.uv.indexes]
torch-cu118 = "https://download.pytorch.org/whl/cu118"
```

See `docs/specifying_dependencies.md` for a detailed explanation of the format. The basic gist is that `project.dependencies` is what ends up on PyPI, while `tool.uv.sources` are your non-published additions. We do support the full range of PEP 508; we just hide it in the docs and prefer the exploded table for easier readability and less confusion with actual URL parts.

This format should eventually be able to subsume requirements.txt's current use cases. While we will continue to support the legacy `uv pip` interface, this is a piece of uv's own top-level interface. Together with `uv run` and a lockfile format, you should only need to write `pyproject.toml` and run `uv run`, which generates/uses/updates your lockfile behind the scenes, with no more pip-style requirements involved. It also lays the groundwork for implementing index pinning.

## Changes

This PR implements:

* Reading and lowering `project.dependencies`, `project.optional-dependencies`, and `tool.uv.sources` into a new requirements format, including:
  * Git dependencies
  * URL dependencies
  * Path dependencies, including relative and editable
* `pip install` integration
* Error reporting for invalid `tool.uv.sources`
* JSON schema integration (works in PyCharm, see below)
* Draft user-level docs (see `docs/specifying_dependencies.md`)

It does not implement:

* `pip compile` testing (deprioritized in favor of our own lockfile)
* Index pinning (stub definitions only)
* Development dependencies
* Workspace support (stub definitions only)
* Overrides in pyproject.toml
* Patching/replacing dependencies

One technically breaking change is that we now require a user-provided pyproject.toml to be valid with respect to PEP 621, while included files still fall back to PEP 517. That means `pip install -r requirements.txt` requires the project's pyproject.toml to be valid, while `pip install -r requirements.txt` where the file contains `-e .` falls back to PEP 517 as before.

## Implementation

The `pep508` requirement is replaced by a new `UvRequirement` (name up for bikeshedding; not particularly attached to the uv prefix). The still-existing `pep508_rs::Requirement` type is a URL format copied from pip's requirements.txt and doesn't appropriately capture all the features we want/need to support. The bulk of the diff is changing the requirement type throughout the codebase.
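As a minimal sketch of the new entry point (not part of this PR's text; it assumes the workspace crates from this diff and mirrors the conversion used in the updated `bench` crate below): a PEP 508 string is parsed first, then lowered into the new requirement type, which carries a structured `RequirementSource` instead of a stringly URL.

```rust
use std::str::FromStr;

use distribution_types::{Requirement, RequirementSource};

fn main() {
    // A plain registry requirement lowers to `RequirementSource::Registry`.
    let tqdm = Requirement::from_pep508(
        pep508_rs::Requirement::from_str("tqdm >=4.66.2,<5").unwrap(),
    )
    .unwrap();
    assert!(matches!(tqdm.source, RequirementSource::Registry { .. }));

    // A `git+` URL lowers to `RequirementSource::Git`, with the repository,
    // reference, and subdirectory split out of the verbatim URL.
    let flask = Requirement::from_pep508(
        pep508_rs::Requirement::from_str(
            "flask @ git+https://github.com/pallets/flask.git@2.0.0",
        )
        .unwrap(),
    )
    .unwrap();
    assert!(matches!(flask.source, RequirementSource::Git { .. }));
}
```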
We still use `VerbatimUrl` in many places where we would expect a parsed/decomposed URL type, specifically:

* Reading core metadata, except for top-level pyproject.toml files; we fail a step later instead if the URL isn't supported.
* Allowed `Urls`.
* `PackageId` with a custom `CanonicalUrl` comparison, instead of canonicalizing URLs eagerly.
* `PubGrubPackage`: We eventually convert the `VerbatimUrl` back to a `Dist` (`Dist::from_url`), instead of remembering the URL.
* Source dist types: We use a verbatim URL even though we know and require that these are supported URLs we can and have parsed.

I tried to improve the situation by replacing `VerbatimUrl`, but these changes would require massive, invasive changes (see e.g. https://github.com/astral-sh/uv/pull/3253). A main problem is the ref `VersionOrUrl` and applying overrides, which assume the same requirement/URL type everywhere. In its current form, this PR increases this tech debt. I've tried to split off PRs and commits, but the main refactoring is still a single monolithic commit to make it compile and the tests pass.

## Demo

Adding d1ae3b85d5/pyproject.json as a JSON schema (v7) to PyCharm for `pyproject.toml`, you can try the IDE support already: [dove.webm](c293c272-c80b-459d-8c95-8c46a8d198a1)
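The parsed/decomposed URL type mentioned above is `ParsedUrl`. A small round-trip sketch, patterned on the tests in `crates/distribution-types/src/parsed_url.rs` further down in this diff (the `main` wrapper and the `anyhow` error plumbing are illustrative, not code from this PR):

```rust
use distribution_types::{ParsedGitUrl, ParsedUrl};
use url::Url;

fn main() -> anyhow::Result<()> {
    // A PEP 508-style Git URL, as it appears after the `@` in `foo @ <url>`.
    let url = Url::parse("git+https://github.com/pallets/flask.git@2.0.0#subdirectory=pkg_dir")?;

    // `TryFrom<Url>` decomposes it: the `git+` prefix, the `@2.0.0` reference,
    // and the `#subdirectory=` fragment become structured fields.
    let parsed = ParsedUrl::try_from(url.clone())?;
    if let ParsedUrl::Git(ParsedGitUrl { subdirectory, .. }) = &parsed {
        assert_eq!(subdirectory.as_deref(), Some(std::path::Path::new("pkg_dir")));
    }

    // `Url::from` reassembles the verbatim form, so the round-trip is lossless
    // (except for extra fragments such as `#egg=`, per the TODO in the tests).
    assert_eq!(Url::from(parsed), url);
    Ok(())
}
```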
This commit is contained in: parent 2ffb252498, commit 4f87edbe66.
87 changed files with 3039 additions and 1007 deletions
Cargo.lock (generated, 21 lines changed)
```diff
@@ -1118,6 +1118,7 @@ dependencies = [
 "distribution-filename",
 "fs-err",
+"git2",
 "indexmap",
 "itertools 0.12.1",
 "once_cell",
 "pep440_rs",
@@ -2522,6 +2523,7 @@ name = "pep508_rs"
 version = "0.4.2"
 dependencies = [
 "derivative",
+"indexmap",
 "insta",
 "log",
 "once_cell",
@@ -2537,6 +2539,7 @@ dependencies = [
 "unicode-width",
 "url",
 "uv-fs",
+"uv-git",
 "uv-normalize",
 ]
@@ -3070,10 +3073,12 @@ version = "0.0.1"
 dependencies = [
 "anyhow",
 "assert_fs",
+"distribution-types",
 "fs-err",
 "indoc",
 "insta",
 "itertools 0.12.1",
+"pep440_rs",
 "pep508_rs",
 "regex",
 "reqwest",
@@ -3081,6 +3086,7 @@ dependencies = [
 "serde",
 "tempfile",
 "test-case",
+"thiserror",
 "tokio",
 "tracing",
 "unscanny",
@@ -3446,6 +3452,7 @@ dependencies = [
 "schemars_derive",
 "serde",
 "serde_json",
+"url",
 ]

[[package]]
@@ -4482,6 +4489,7 @@ dependencies = [
 "filetime",
 "flate2",
 "fs-err",
 "indexmap",
 "indicatif",
 "indoc",
 "insta",
@@ -4490,6 +4498,7 @@ dependencies = [
 "miette",
 "mimalloc",
 "owo-colors",
 "pep440_rs",
 "pep508_rs",
 "platform-tags",
 "predicates",
@@ -4661,6 +4670,7 @@ version = "0.0.1"
 dependencies = [
 "anyhow",
 "clap",
 "distribution-types",
 "itertools 0.12.1",
 "pep508_rs",
 "platform-tags",
@@ -4669,6 +4679,7 @@ dependencies = [
 "serde",
 "serde_json",
 "uv-auth",
 "uv-cache",
 "uv-normalize",
 ]

@@ -4714,6 +4725,7 @@ dependencies = [
 "uv-installer",
 "uv-interpreter",
 "uv-normalize",
 "uv-requirements",
 "uv-resolver",
 "uv-types",
 "uv-workspace",
@@ -4850,6 +4862,7 @@ version = "0.0.1"
 dependencies = [
 "anyhow",
 "async-channel",
 "distribution-filename",
 "distribution-types",
 "fs-err",
 "futures",
@@ -4876,6 +4889,7 @@ dependencies = [
 "uv-fs",
 "uv-interpreter",
 "uv-normalize",
 "uv-requirements",
 "uv-types",
 "uv-warnings",
 "walkdir",
@@ -4943,11 +4957,17 @@ dependencies = [
 "distribution-types",
 "fs-err",
 "futures",
 "glob",
 "indexmap",
 "indoc",
 "insta",
 "path-absolutize",
 "pep440_rs",
 "pep508_rs",
 "pypi-types",
 "requirements-txt",
 "rustc-hash",
 "schemars",
 "serde",
 "thiserror",
 "toml",
@@ -4957,6 +4977,7 @@ dependencies = [
 "uv-configuration",
 "uv-distribution",
 "uv-fs",
 "uv-git",
 "uv-normalize",
 "uv-resolver",
 "uv-types",
```
```diff
@@ -116,7 +116,7 @@ rmp-serde = { version = "1.1.2" }
 rust-netrc = { version = "0.1.1" }
 rustc-hash = { version = "1.1.0" }
 same-file = { version = "1.0.6" }
-schemars = { version = "0.8.16" }
+schemars = { version = "0.8.16", features = ["url"] }
 seahash = { version = "4.1.0" }
 serde = { version = "1.0.197" }
 serde_json = { version = "1.0.114" }
```
```diff
@@ -2,8 +2,7 @@ use std::str::FromStr;

 use bench::criterion::black_box;
 use bench::criterion::{criterion_group, criterion_main, measurement::WallTime, Criterion};

-use pep508_rs::Requirement;
+use distribution_types::Requirement;
 use uv_cache::Cache;
 use uv_client::RegistryClientBuilder;
 use uv_resolver::Manifest;
@@ -15,7 +14,10 @@ fn resolve_warm_jupyter(c: &mut Criterion<WallTime>) {
         .unwrap();

     let cache = &Cache::from_path(".cache").unwrap();
-    let manifest = &Manifest::simple(vec![Requirement::from_str("jupyter").unwrap()]);
+    let manifest = &Manifest::simple(vec![Requirement::from_pep508(
+        pep508_rs::Requirement::from_str("jupyter").unwrap(),
+    )
+    .unwrap()]);
     let client = &RegistryClientBuilder::new(cache.clone()).build();

     let run = || {
@@ -35,13 +37,14 @@ criterion_group!(uv, resolve_warm_jupyter);
 criterion_main!(uv);

 mod resolver {
-    use anyhow::Result;
-    use once_cell::sync::Lazy;
     use std::path::{Path, PathBuf};
     use std::str::FromStr;

-    use distribution_types::{IndexLocations, Resolution, SourceDist};
-    use pep508_rs::{MarkerEnvironment, Requirement, StringVersion};
+    use anyhow::Result;
+    use once_cell::sync::Lazy;
+
+    use distribution_types::{IndexLocations, Requirement, Resolution, SourceDist};
+    use pep508_rs::{MarkerEnvironment, StringVersion};
     use platform_tags::{Arch, Os, Platform, Tags};
     use uv_cache::Cache;
     use uv_client::RegistryClient;
```
```diff
@@ -26,6 +26,7 @@ uv-normalize = { workspace = true }
 anyhow = { workspace = true }
 fs-err = { workspace = true }
+git2 = { workspace = true }
 indexmap = { workspace = true }
 itertools = { workspace = true }
 once_cell = { workspace = true }
 rkyv = { workspace = true }
```
```diff
@@ -1,6 +1,6 @@
 use std::path::{Path, PathBuf};

-use anyhow::Result;
+use anyhow::{anyhow, Result};

 use distribution_filename::WheelFilename;
 use pep508_rs::VerbatimUrl;
@@ -111,10 +111,14 @@ impl CachedDist {
             assert_eq!(dist.url.scheme(), "file", "{}", dist.url);
             Ok(Some(ParsedUrl::LocalFile(ParsedLocalFileUrl {
                 url: dist.url.raw().clone(),
                 path: dist
                     .url
                     .to_file_path()
                     .map_err(|()| anyhow!("Invalid path in file URL"))?,
                 editable: dist.editable,
             })))
         } else {
-            Ok(Some(ParsedUrl::try_from(dist.url.raw())?))
+            Ok(Some(ParsedUrl::try_from(dist.url.to_url())?))
         }
     }
 }
```
```diff
@@ -1,4 +1,6 @@
 use url::Url;

+use pep508_rs::VerbatimUrl;
 use uv_normalize::PackageName;

 #[derive(thiserror::Error, Debug)]
@@ -23,4 +25,7 @@ pub enum Error {

     #[error("Requested package name `{0}` does not match `{1}` in the distribution filename: {2}")]
     PackageNameMismatch(PackageName, PackageName, String),
+
+    #[error("Only directories can be installed as editable, not filenames: `{0}`")]
+    EditableFile(VerbatimUrl),
 }
```
```diff
@@ -7,6 +7,7 @@ use tracing::warn;
 use url::Url;

 use pep440_rs::Version;
+use pypi_types::DirectUrl;
 use uv_fs::Simplified;
 use uv_normalize::PackageName;

@@ -32,6 +33,7 @@ pub struct InstalledRegistryDist {
 pub struct InstalledDirectUrlDist {
     pub name: PackageName,
     pub version: Version,
+    pub direct_url: Box<DirectUrl>,
     pub url: Url,
     pub editable: bool,
     pub path: PathBuf,
@@ -60,7 +62,8 @@ impl InstalledDist {
                 Ok(url) => Ok(Some(Self::Url(InstalledDirectUrlDist {
                     name,
                     version,
-                    editable: matches!(&direct_url, pypi_types::DirectUrl::LocalDirectory { dir_info, .. } if dir_info.editable == Some(true)),
+                    editable: matches!(&direct_url, DirectUrl::LocalDirectory { dir_info, .. } if dir_info.editable == Some(true)),
+                    direct_url: Box::new(direct_url),
                     url,
                     path: path.to_path_buf(),
                 }))),
@@ -101,12 +104,12 @@ impl InstalledDist {
     }

     /// Read the `direct_url.json` file from a `.dist-info` directory.
-    pub fn direct_url(path: &Path) -> Result<Option<pypi_types::DirectUrl>> {
+    pub fn direct_url(path: &Path) -> Result<Option<DirectUrl>> {
         let path = path.join("direct_url.json");
         let Ok(file) = fs_err::File::open(path) else {
             return Ok(None);
         };
-        let direct_url = serde_json::from_reader::<fs_err::File, pypi_types::DirectUrl>(file)?;
+        let direct_url = serde_json::from_reader::<fs_err::File, DirectUrl>(file)?;
         Ok(Some(direct_url))
     }
```
```diff
@@ -56,8 +56,10 @@ pub use crate::index_url::*;
 pub use crate::installed::*;
 pub use crate::parsed_url::*;
 pub use crate::prioritized_distribution::*;
+pub use crate::requirement::*;
 pub use crate::resolution::*;
 pub use crate::resolved::*;
+pub use crate::specified_requirement::*;
 pub use crate::traits::*;

 mod any;
@@ -72,8 +74,10 @@ mod index_url;
 mod installed;
 mod parsed_url;
 mod prioritized_distribution;
+mod requirement;
 mod resolution;
 mod resolved;
+mod specified_requirement;
 mod traits;

 #[derive(Debug, Clone)]
@@ -228,8 +232,8 @@ impl Dist {
     }

     /// A remote built distribution (`.whl`) or source distribution from a `http://` or `https://`
-    /// url.
-    fn from_http_url(name: PackageName, url: VerbatimUrl) -> Result<Dist, Error> {
+    /// URL.
+    pub fn from_http_url(name: PackageName, url: VerbatimUrl) -> Result<Dist, Error> {
         if Path::new(url.path())
             .extension()
             .is_some_and(|ext| ext.eq_ignore_ascii_case("whl"))
@@ -256,8 +260,12 @@ impl Dist {
         }
     }

-    /// A local built or source distribution from a `file://` url.
-    fn from_file_url(name: PackageName, url: VerbatimUrl) -> Result<Dist, Error> {
+    /// A local built or source distribution from a `file://` URL.
+    pub fn from_file_url(
+        name: PackageName,
+        url: VerbatimUrl,
+        editable: bool,
+    ) -> Result<Dist, Error> {
         // Store the canonicalized path, which also serves to validate that it exists.
         let path = match url
             .to_file_path()
@@ -285,6 +293,10 @@ impl Dist {
             ));
         }

+        if editable {
+            return Err(Error::EditableFile(url));
+        }
+
         Ok(Self::Built(BuiltDist::Path(PathBuiltDist {
             filename,
             url,
@@ -295,7 +307,7 @@ impl Dist {
             name,
             url,
             path,
-            editable: false,
+            editable,
         })))
     }
 }
@@ -305,11 +317,12 @@ impl Dist {
         Ok(Self::Source(SourceDist::Git(GitSourceDist { name, url })))
     }

+    // TODO(konsti): We should carry the parsed URL through the codebase.
     /// Create a [`Dist`] for a URL-based distribution.
     pub fn from_url(name: PackageName, url: VerbatimUrl) -> Result<Self, Error> {
         match Scheme::parse(url.scheme()) {
             Some(Scheme::Http | Scheme::Https) => Self::from_http_url(name, url),
-            Some(Scheme::File) => Self::from_file_url(name, url),
+            Some(Scheme::File) => Self::from_file_url(name, url, false),
             Some(Scheme::GitSsh | Scheme::GitHttps) => Self::from_git_url(name, url),
             Some(Scheme::GitGit | Scheme::GitHttp) => Err(Error::UnsupportedScheme(
                 url.scheme().to_owned(),
```
```diff
@@ -1,5 +1,6 @@
-use anyhow::{Error, Result};
+use std::path::PathBuf;
+
+use anyhow::{Error, Result};
 use thiserror::Error;
 use url::Url;

@@ -15,13 +16,11 @@ pub enum ParsedUrlError {
     GitShaParse(Url, #[source] git2::Error),
     #[error("Not a valid URL: `{0}`")]
     UrlParse(String, #[source] url::ParseError),
-    #[error("Missing `git+` prefix for Git URL: `{0}`")]
-    MissingUrlPrefix(Url),
 }

 /// We support three types of URLs for distributions:
 /// * The path to a file or directory (`file://`)
-/// * A git repository (`git+https://` or `git+ssh://`), optionally with a subdirectory and/or
+/// * A Git repository (`git+https://` or `git+ssh://`), optionally with a subdirectory and/or
 ///   string to checkout.
 /// * A remote archive (`https://`), optional with a subdirectory (source dist only)
 /// A URL in a requirement `foo @ <url>` must be one of the above.
@@ -39,45 +38,38 @@ pub enum ParsedUrl {
 ///
 /// Examples:
 /// * `file:///home/ferris/my_project`
-#[derive(Debug)]
+#[derive(Debug, Eq, PartialEq)]
 pub struct ParsedLocalFileUrl {
     pub url: Url,
     pub path: PathBuf,
     pub editable: bool,
 }

-/// A git repository url
+/// A Git repository URL.
 ///
 /// Examples:
 /// * `git+https://git.example.com/MyProject.git`
 /// * `git+https://git.example.com/MyProject.git@v1.0#egg=pkg&subdirectory=pkg_dir`
-#[derive(Debug)]
+#[derive(Debug, Eq, PartialEq)]
 pub struct ParsedGitUrl {
     pub url: GitUrl,
     pub subdirectory: Option<PathBuf>,
 }

-/// An archive url
-///
-/// Examples:
-/// * wheel: `https://download.pytorch.org/whl/torch-2.0.1-cp39-cp39-manylinux2014_aarch64.whl#sha256=423e0ae257b756bb45a4b49072046772d1ad0c592265c5080070e0767da4e490`
-/// * source dist, correctly named: `https://files.pythonhosted.org/packages/62/06/d5604a70d160f6a6ca5fd2ba25597c24abd5c5ca5f437263d177ac242308/tqdm-4.66.1.tar.gz`
-/// * source dist, only extension recognizable: `https://github.com/foo-labs/foo/archive/master.zip#egg=pkg&subdirectory=packages/bar`
-#[derive(Debug)]
-pub struct ParsedArchiveUrl {
-    pub url: Url,
-    pub subdirectory: Option<PathBuf>,
-}
-
-impl TryFrom<&Url> for ParsedGitUrl {
+impl TryFrom<Url> for ParsedGitUrl {
     type Error = ParsedUrlError;

-    fn try_from(url_in: &Url) -> Result<Self, Self::Error> {
-        let subdirectory = get_subdirectory(url_in);
+    /// Supports URLS with and without the `git+` prefix.
+    ///
+    /// When the URL includes a prefix, it's presumed to come from a PEP 508 requirement; when it's
+    /// excluded, it's presumed to come from `tool.uv.sources`.
+    fn try_from(url_in: Url) -> Result<Self, Self::Error> {
+        let subdirectory = get_subdirectory(&url_in);

         let url = url_in
             .as_str()
             .strip_prefix("git+")
-            .ok_or_else(|| ParsedUrlError::MissingUrlPrefix(url_in.clone()))?;
+            .unwrap_or(url_in.as_str());
         let url = Url::parse(url).map_err(|err| ParsedUrlError::UrlParse(url.to_string(), err))?;
         let url = GitUrl::try_from(url)
             .map_err(|err| ParsedUrlError::GitShaParse(url_in.clone(), err))?;
@@ -85,12 +77,22 @@ impl TryFrom<&Url> for ParsedGitUrl {
     }
 }

-impl From<&Url> for ParsedArchiveUrl {
-    fn from(url: &Url) -> Self {
-        Self {
-            url: url.clone(),
-            subdirectory: get_subdirectory(url),
-        }
+/// An archive URL.
+///
+/// Examples:
+/// * wheel: `https://download.pytorch.org/whl/torch-2.0.1-cp39-cp39-manylinux2014_aarch64.whl#sha256=423e0ae257b756bb45a4b49072046772d1ad0c592265c5080070e0767da4e490`
+/// * source dist, correctly named: `https://files.pythonhosted.org/packages/62/06/d5604a70d160f6a6ca5fd2ba25597c24abd5c5ca5f437263d177ac242308/tqdm-4.66.1.tar.gz`
+/// * source dist, only extension recognizable: `https://github.com/foo-labs/foo/archive/master.zip#egg=pkg&subdirectory=packages/bar`
+#[derive(Debug, Eq, PartialEq)]
+pub struct ParsedArchiveUrl {
+    pub url: Url,
+    pub subdirectory: Option<PathBuf>,
+}
+
+impl From<Url> for ParsedArchiveUrl {
+    fn from(url: Url) -> Self {
+        let subdirectory = get_subdirectory(&url);
+        Self { url, subdirectory }
     }
 }

@@ -109,15 +111,15 @@ fn get_subdirectory(url: &Url) -> Option<PathBuf> {
 }

 /// Return the Git reference of the given URL, if it exists.
-pub fn git_reference(url: &Url) -> Result<Option<GitSha>, Error> {
+pub fn git_reference(url: Url) -> Result<Option<GitSha>, Error> {
     let ParsedGitUrl { url, .. } = ParsedGitUrl::try_from(url)?;
     Ok(url.precise())
 }

-impl TryFrom<&Url> for ParsedUrl {
+impl TryFrom<Url> for ParsedUrl {
     type Error = ParsedUrlError;

-    fn try_from(url: &Url) -> Result<Self, Self::Error> {
+    fn try_from(url: Url) -> Result<Self, Self::Error> {
         if let Some((prefix, ..)) = url.scheme().split_once('+') {
             match prefix {
                 "git" => Ok(Self::Git(ParsedGitUrl::try_from(url)?)),
@@ -127,8 +129,12 @@ impl TryFrom<&Url> for ParsedUrl {
                 }),
             }
         } else if url.scheme().eq_ignore_ascii_case("file") {
+            let path = url
+                .to_file_path()
+                .map_err(|()| ParsedUrlError::InvalidFileUrl(url.clone()))?;
             Ok(Self::LocalFile(ParsedLocalFileUrl {
-                url: url.clone(),
+                url,
+                path,
+                editable: false,
             }))
         } else {
@@ -239,33 +245,38 @@ mod tests {

     #[test]
     fn direct_url_from_url() -> Result<()> {
-        let expected = Url::parse("file:///path/to/directory")?;
-        let actual = Url::from(ParsedUrl::try_from(&expected)?);
-        assert_eq!(expected, actual);
-
         let expected = Url::parse("git+https://github.com/pallets/flask.git")?;
-        let actual = Url::from(ParsedUrl::try_from(&expected)?);
+        let actual = Url::from(ParsedUrl::try_from(expected.clone())?);
         assert_eq!(expected, actual);

         let expected = Url::parse("git+https://github.com/pallets/flask.git#subdirectory=pkg_dir")?;
-        let actual = Url::from(ParsedUrl::try_from(&expected)?);
+        let actual = Url::from(ParsedUrl::try_from(expected.clone())?);
         assert_eq!(expected, actual);

         let expected = Url::parse("git+https://github.com/pallets/flask.git@2.0.0")?;
-        let actual = Url::from(ParsedUrl::try_from(&expected)?);
+        let actual = Url::from(ParsedUrl::try_from(expected.clone())?);
         assert_eq!(expected, actual);

         let expected =
             Url::parse("git+https://github.com/pallets/flask.git@2.0.0#subdirectory=pkg_dir")?;
-        let actual = Url::from(ParsedUrl::try_from(&expected)?);
+        let actual = Url::from(ParsedUrl::try_from(expected.clone())?);
         assert_eq!(expected, actual);

         // TODO(charlie): Preserve other fragments.
         let expected =
             Url::parse("git+https://github.com/pallets/flask.git#egg=flask&subdirectory=pkg_dir")?;
-        let actual = Url::from(ParsedUrl::try_from(&expected)?);
+        let actual = Url::from(ParsedUrl::try_from(expected.clone())?);
         assert_ne!(expected, actual);

         Ok(())
     }
+
+    #[test]
+    #[cfg(unix)]
+    fn direct_url_from_url_absolute() -> Result<()> {
+        let expected = Url::parse("file:///path/to/directory")?;
+        let actual = Url::from(ParsedUrl::try_from(expected.clone())?);
+        assert_eq!(expected, actual);
+        Ok(())
+    }
 }
```
crates/distribution-types/src/requirement.rs (new file, +198)

```rust
use std::fmt::{Display, Formatter};
use std::path::PathBuf;

use indexmap::IndexMap;
use url::Url;

use pep440_rs::VersionSpecifiers;
use pep508_rs::{MarkerEnvironment, MarkerTree, VerbatimUrl, VersionOrUrl};
use uv_git::GitReference;
use uv_normalize::{ExtraName, PackageName};

use crate::{ParsedUrl, ParsedUrlError};

/// The requirements of a distribution, an extension over PEP 508's requirements.
#[derive(Debug, Clone, Eq, PartialEq)]
pub struct Requirements {
    pub dependencies: Vec<Requirement>,
    pub optional_dependencies: IndexMap<ExtraName, Vec<Requirement>>,
}

/// A representation of dependency on a package, an extension over a PEP 508 requirement.
///
/// The main change is using [`RequirementSource`] to represent all supported package sources over
/// [`pep508_rs::VersionOrUrl`], which collapses all URL sources into a single stringly type.
#[derive(Hash, Debug, Clone, Eq, PartialEq)]
pub struct Requirement {
    pub name: PackageName,
    pub extras: Vec<ExtraName>,
    pub marker: Option<MarkerTree>,
    pub source: RequirementSource,
}

impl Requirement {
    /// Returns whether the markers apply for the given environment.
    pub fn evaluate_markers(&self, env: &MarkerEnvironment, extras: &[ExtraName]) -> bool {
        if let Some(marker) = &self.marker {
            marker.evaluate(env, extras)
        } else {
            true
        }
    }

    /// Convert a [`pep508_rs::Requirement`] to a [`Requirement`].
    pub fn from_pep508(requirement: pep508_rs::Requirement) -> Result<Self, ParsedUrlError> {
        let source = match requirement.version_or_url {
            None => RequirementSource::Registry {
                specifier: VersionSpecifiers::empty(),
                index: None,
            },
            // The most popular case: Just a name, a version range and maybe extras.
            Some(VersionOrUrl::VersionSpecifier(specifier)) => RequirementSource::Registry {
                specifier,
                index: None,
            },
            Some(VersionOrUrl::Url(url)) => {
                let direct_url = ParsedUrl::try_from(url.to_url())?;
                RequirementSource::from_parsed_url(direct_url, url)
            }
        };
        Ok(Requirement {
            name: requirement.name,
            extras: requirement.extras,
            marker: requirement.marker,
            source,
        })
    }
}

impl Display for Requirement {
    /// Display the [`Requirement`], with the intention of being shown directly to a user, rather
    /// than for inclusion in a `requirements.txt` file.
    fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
        write!(f, "{}", self.name)?;
        if !self.extras.is_empty() {
            write!(
                f,
                "[{}]",
                self.extras
                    .iter()
                    .map(ToString::to_string)
                    .collect::<Vec<_>>()
                    .join(",")
            )?;
        }
        match &self.source {
            RequirementSource::Registry { specifier, index } => {
                write!(f, "{specifier}")?;
                if let Some(index) = index {
                    write!(f, " (index: {index})")?;
                }
            }
            RequirementSource::Url { url, .. } => {
                write!(f, " @ {url}")?;
            }
            RequirementSource::Git {
                url: _,
                repository,
                reference,
                subdirectory,
            } => {
                write!(f, " @ git+{repository}")?;
                if let Some(reference) = reference.as_str() {
                    write!(f, "@{reference}")?;
                }
                if let Some(subdirectory) = subdirectory {
                    writeln!(f, "#subdirectory={}", subdirectory.display())?;
                }
            }
            RequirementSource::Path { url, .. } => {
                write!(f, " @ {url}")?;
            }
        }
        if let Some(marker) = &self.marker {
            write!(f, " ; {marker}")?;
        }
        Ok(())
    }
}

/// The different locations we can install a distribution from: version specifier (from an index),
/// HTTP(S) URL, Git repository, and path.
///
/// We store both the parsed fields (such as the plain url and the subdirectory) and the joined
/// PEP 508 style url (e.g. `file:///<path>#subdirectory=<subdirectory>`) since we need both in
/// different locations.
#[derive(Hash, Debug, Clone, Eq, PartialEq)]
pub enum RequirementSource {
    /// The requirement has a version specifier, such as `foo >1,<2`.
    Registry {
        specifier: VersionSpecifiers,
        /// Choose a version from the index with this name.
        index: Option<String>,
    },
    // TODO(konsti): Track and verify version specifier from `project.dependencies` matches the
    // version in remote location.
    /// A remote `http://` or `https://` URL, either a built distribution,
    /// e.g. `foo @ https://example.org/foo-1.0-py3-none-any.whl`, or a source distribution,
    /// e.g. `foo @ https://example.org/foo-1.0.zip`.
    Url {
        /// For source distributions, the path to the distribution if it is not in the archive
        /// root.
        subdirectory: Option<PathBuf>,
        /// The remote location of the archive file, without subdirectory fragment.
        location: Url,
        /// The PEP 508 style URL in the format
        /// `<scheme>://<domain>/<path>#subdirectory=<subdirectory>`.
        url: VerbatimUrl,
    },
    /// A remote Git repository, either over HTTPS or over SSH.
    Git {
        /// The repository URL (without the `git+` prefix).
        repository: Url,
        /// Optionally, the revision, tag, or branch to use.
        reference: GitReference,
        /// The path to the source distribution if it is not in the repository root.
        subdirectory: Option<PathBuf>,
        /// The PEP 508 style url in the format
        /// `git+<scheme>://<domain>/<path>@<rev>#subdirectory=<subdirectory>`.
        url: VerbatimUrl,
    },
    /// A local built or source distribution, either from a path or a `file://` URL. It can either
    /// be a binary distribution (a `.whl` file), a source distribution archive (a `.zip` or
    /// `.tar.gz` file) or a source tree (a directory with a pyproject.toml in it, or a legacy
    /// source distribution with only a setup.py but no pyproject.toml in it).
    Path {
        path: PathBuf,
        /// For a source tree (a directory), whether to install as an editable.
        editable: Option<bool>,
        /// The PEP 508 style URL in the format
        /// `file:///<path>#subdirectory=<subdirectory>`.
        url: VerbatimUrl,
    },
}

impl RequirementSource {
    /// Construct a [`RequirementSource`] for a URL source, given a URL parsed into components and
    /// the PEP 508 string (after the `@`) as [`VerbatimUrl`].
    pub fn from_parsed_url(parsed_url: ParsedUrl, url: VerbatimUrl) -> Self {
        match parsed_url {
            ParsedUrl::LocalFile(local_file) => RequirementSource::Path {
                path: local_file.path,
                url,
                editable: None,
            },
            ParsedUrl::Git(git) => RequirementSource::Git {
                url,
                repository: git.url.repository().clone(),
                reference: git.url.reference().clone(),
                subdirectory: git.subdirectory,
            },
            ParsedUrl::Archive(archive) => RequirementSource::Url {
                url,
                location: archive.url,
                subdirectory: archive.subdirectory,
            },
        }
    }
}
```
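As orientation for the new file above, a hypothetical usage sketch (not a test from this PR; it assumes the workspace crates) building and displaying a registry-backed `Requirement`:

```rust
use std::str::FromStr;

use distribution_types::{Requirement, RequirementSource};
use pep440_rs::VersionSpecifiers;
use uv_normalize::{ExtraName, PackageName};

fn main() {
    let requirement = Requirement {
        name: PackageName::from_str("transformers").unwrap(),
        extras: vec![ExtraName::from_str("torch").unwrap()],
        marker: None,
        source: RequirementSource::Registry {
            specifier: VersionSpecifiers::from_str(">=4.39.3,<5").unwrap(),
            index: None,
        },
    };
    // Renders the name, extras, and specifier for user-facing output (not for
    // writing back to a requirements.txt), roughly `transformers[torch]>=4.39.3,<5`.
    println!("{requirement}");
}
```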
```diff
@@ -1,9 +1,10 @@
 use rustc_hash::FxHashMap;

-use pep508_rs::Requirement;
 use uv_normalize::PackageName;

-use crate::{BuiltDist, Dist, InstalledDist, Name, ResolvedDist, SourceDist};
+use crate::{
+    BuiltDist, Dist, Name, ParsedGitUrl, Requirement, RequirementSource, ResolvedDist, SourceDist,
+};

 /// A set of packages pinned at specific versions.
 #[derive(Debug, Default, Clone)]
@@ -59,96 +60,85 @@ impl Resolution {
     /// Return the set of [`Requirement`]s that this resolution represents, exclusive of any
     /// editable requirements.
     pub fn requirements(&self) -> Vec<Requirement> {
-        let mut requirements = self
+        let mut requirements: Vec<_> = self
             .0
             .values()
             // Remove editable requirements
             .filter(|dist| !dist.is_editable())
-            .map(|dist| Requirement::from(dist.clone()))
-            .collect::<Vec<_>>();
+            .map(Requirement::from)
+            .collect();
         requirements.sort_unstable_by(|a, b| a.name.cmp(&b.name));
         requirements
     }
 }

-impl From<Dist> for Requirement {
-    fn from(dist: Dist) -> Self {
-        match dist {
-            Dist::Built(BuiltDist::Registry(wheel)) => Self {
-                name: wheel.filename.name,
-                extras: vec![],
-                version_or_url: Some(pep508_rs::VersionOrUrl::VersionSpecifier(
-                    pep440_rs::VersionSpecifiers::from(
-                        pep440_rs::VersionSpecifier::equals_version(wheel.filename.version),
+impl From<&ResolvedDist> for Requirement {
+    fn from(resolved_dist: &ResolvedDist) -> Self {
+        let source = match resolved_dist {
+            ResolvedDist::Installable(dist) => match dist {
+                Dist::Built(BuiltDist::Registry(wheel)) => RequirementSource::Registry {
+                    specifier: pep440_rs::VersionSpecifiers::from(
+                        pep440_rs::VersionSpecifier::equals_version(wheel.filename.version.clone()),
                     ),
-                )),
-                marker: None,
-            },
-
-            Dist::Built(BuiltDist::DirectUrl(wheel)) => Self {
-                name: wheel.filename.name,
-                extras: vec![],
-                version_or_url: Some(pep508_rs::VersionOrUrl::Url(wheel.url)),
-                marker: None,
-            },
-            Dist::Built(BuiltDist::Path(wheel)) => Self {
-                name: wheel.filename.name,
-                extras: vec![],
-                version_or_url: Some(pep508_rs::VersionOrUrl::Url(wheel.url)),
-                marker: None,
-            },
-            Dist::Source(SourceDist::Registry(sdist)) => Self {
-                name: sdist.filename.name,
-                extras: vec![],
-                version_or_url: Some(pep508_rs::VersionOrUrl::VersionSpecifier(
-                    pep440_rs::VersionSpecifiers::from(
-                        pep440_rs::VersionSpecifier::equals_version(sdist.filename.version),
+                    index: None,
+                },
+                Dist::Built(BuiltDist::DirectUrl(wheel)) => {
+                    let mut location = wheel.url.to_url();
+                    location.set_fragment(None);
+                    RequirementSource::Url {
+                        url: wheel.url.clone(),
+                        location,
+                        subdirectory: None,
+                    }
+                }
+                Dist::Built(BuiltDist::Path(wheel)) => RequirementSource::Path {
+                    path: wheel.path.clone(),
+                    url: wheel.url.clone(),
+                    editable: None,
+                },
+                Dist::Source(SourceDist::Registry(sdist)) => RequirementSource::Registry {
+                    specifier: pep440_rs::VersionSpecifiers::from(
+                        pep440_rs::VersionSpecifier::equals_version(sdist.filename.version.clone()),
                     ),
-                )),
-                marker: None,
+                    index: None,
+                },
+                Dist::Source(SourceDist::DirectUrl(sdist)) => {
+                    let mut location = sdist.url.to_url();
+                    location.set_fragment(None);
+                    RequirementSource::Url {
+                        url: sdist.url.clone(),
+                        location,
+                        subdirectory: None,
+                    }
+                }
+                Dist::Source(SourceDist::Git(sdist)) => {
+                    let git_url = ParsedGitUrl::try_from(sdist.url.to_url())
+                        .expect("urls must be valid at this point");
+                    RequirementSource::Git {
+                        url: sdist.url.clone(),
+                        repository: git_url.url.repository().clone(),
+                        reference: git_url.url.reference().clone(),
+                        subdirectory: git_url.subdirectory,
+                    }
+                }
+                Dist::Source(SourceDist::Path(sdist)) => RequirementSource::Path {
+                    path: sdist.path.clone(),
+                    url: sdist.url.clone(),
+                    editable: None,
+                },
             },
-            Dist::Source(SourceDist::DirectUrl(sdist)) => Self {
-                name: sdist.name,
-                extras: vec![],
-                version_or_url: Some(pep508_rs::VersionOrUrl::Url(sdist.url)),
-                marker: None,
-            },
-            Dist::Source(SourceDist::Git(sdist)) => Self {
-                name: sdist.name,
-                extras: vec![],
-                version_or_url: Some(pep508_rs::VersionOrUrl::Url(sdist.url)),
-                marker: None,
-            },
-            Dist::Source(SourceDist::Path(sdist)) => Self {
-                name: sdist.name,
-                extras: vec![],
-                version_or_url: Some(pep508_rs::VersionOrUrl::Url(sdist.url)),
-                marker: None,
-            },
-        }
-    }
-}
-
-impl From<InstalledDist> for Requirement {
-    fn from(dist: InstalledDist) -> Self {
-        Self {
-            name: dist.name().clone(),
+            ResolvedDist::Installed(dist) => RequirementSource::Registry {
+                specifier: pep440_rs::VersionSpecifiers::from(
+                    pep440_rs::VersionSpecifier::equals_version(dist.version().clone()),
+                ),
+                index: None,
+            },
+        };
+        Requirement {
+            name: resolved_dist.name().clone(),
             extras: vec![],
-            version_or_url: Some(pep508_rs::VersionOrUrl::VersionSpecifier(
-                pep440_rs::VersionSpecifiers::from(pep440_rs::VersionSpecifier::equals_version(
-                    dist.version().clone(),
-                )),
-            )),
             marker: None,
-        }
-    }
-}
-
-impl From<ResolvedDist> for Requirement {
-    fn from(dist: ResolvedDist) -> Self {
-        match dist {
-            ResolvedDist::Installable(dist) => dist.into(),
-            ResolvedDist::Installed(dist) => dist.into(),
+            source,
         }
     }
 }
```
crates/distribution-types/src/specified_requirement.rs (new file, +74)
```rust
use std::fmt::{Display, Formatter};

use pep508_rs::{MarkerEnvironment, UnnamedRequirement};
use uv_normalize::ExtraName;

use crate::{ParsedUrl, ParsedUrlError, Requirement, RequirementSource};

/// An [`UnresolvedRequirement`] with additional metadata from `requirements.txt`, currently only
/// hashes but in the future also editable and similar information.
#[derive(Debug, Clone, Eq, PartialEq, Hash)]
pub struct UnresolvedRequirementSpecification {
    /// The actual requirement.
    pub requirement: UnresolvedRequirement,
    /// Hashes of the downloadable packages.
    pub hashes: Vec<String>,
}

/// A requirement read from a `requirements.txt` or `pyproject.toml` file.
///
/// It is considered unresolved as we still need to query the URL for the `Unnamed` variant to
/// resolve the requirement name.
///
/// Analog to `RequirementsTxtRequirement` but with `distribution_types::Requirement` instead of
/// `pep508_rs::Requirement`.
#[derive(Hash, Debug, Clone, Eq, PartialEq)]
pub enum UnresolvedRequirement {
    /// The uv-specific superset over PEP 508 requirement specifiers, incorporating
    /// `tool.uv.sources`.
    Named(Requirement),
    /// A PEP 508-like, direct URL dependency specifier.
    Unnamed(UnnamedRequirement),
}

impl Display for UnresolvedRequirement {
    fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
        match self {
            Self::Named(requirement) => write!(f, "{requirement}"),
            Self::Unnamed(requirement) => write!(f, "{requirement}"),
        }
    }
}

impl UnresolvedRequirement {
    /// Returns whether the markers apply for the given environment.
    pub fn evaluate_markers(&self, env: &MarkerEnvironment, extras: &[ExtraName]) -> bool {
        match self {
            Self::Named(requirement) => requirement.evaluate_markers(env, extras),
            Self::Unnamed(requirement) => requirement.evaluate_markers(env, extras),
        }
    }

    /// Returns the extras for the requirement.
    pub fn extras(&self) -> &[ExtraName] {
        match self {
            Self::Named(requirement) => requirement.extras.as_slice(),
            Self::Unnamed(requirement) => requirement.extras.as_slice(),
        }
    }

    /// Return the version specifier or URL for the requirement.
    pub fn source(&self) -> Result<RequirementSource, ParsedUrlError> {
        // TODO(konsti): This is a bad place to raise errors, we should have parsed the url earlier.
        match self {
            Self::Named(requirement) => Ok(requirement.source.clone()),
            Self::Unnamed(requirement) => {
                let parsed_url = ParsedUrl::try_from(requirement.url.to_url())?;
                Ok(RequirementSource::from_parsed_url(
                    parsed_url,
                    requirement.url.clone(),
                ))
            }
        }
    }
}
```
```diff
@@ -49,7 +49,12 @@ impl std::ops::Deref for VersionSpecifiers {
 }

 impl VersionSpecifiers {
-    /// Whether all specifiers match the given version
+    /// Matches all versions.
+    pub fn empty() -> Self {
+        Self(Vec::new())
+    }
+
+    /// Whether all specifiers match the given version.
     pub fn contains(&self, version: &Version) -> bool {
         self.iter().all(|specifier| specifier.contains(version))
     }
```
```diff
@@ -19,9 +19,11 @@ crate-type = ["cdylib", "rlib"]
 [dependencies]
 pep440_rs = { workspace = true }
 uv-fs = { workspace = true }
+uv-git = { workspace = true }
 uv-normalize = { workspace = true }

 derivative = { workspace = true }
+indexmap = { workspace = true }
 once_cell = { workspace = true }
 pyo3 = { workspace = true, optional = true, features = ["abi3", "extension-module"] }
 pyo3-log = { workspace = true, optional = true }
```
```diff
@@ -20,6 +20,10 @@ pub enum DirectUrl {
     /// {"archive_info": {"hash": "sha256=75909db2664838d015e3d9139004ee16711748a52c8f336b52882266540215d8", "hashes": {"sha256": "75909db2664838d015e3d9139004ee16711748a52c8f336b52882266540215d8"}}, "url": "https://files.pythonhosted.org/packages/b8/8b/31273bf66016be6ad22bb7345c37ff350276cfd46e389a0c2ac5da9d9073/wheel-0.41.2-py3-none-any.whl"}
     /// ```
     ArchiveUrl {
+        /// The URL without parsed information (such as the Git revision or subdirectory).
+        ///
+        /// For example, for `pip install git+https://github.com/tqdm/tqdm@cc372d09dcd5a5eabdc6ed4cf365bdb0be004d44#subdirectory=.`,
+        /// the URL is `https://github.com/tqdm/tqdm`.
         url: String,
         archive_info: ArchiveInfo,
         #[serde(skip_serializing_if = "Option::is_none")]
```
```diff
@@ -13,6 +13,8 @@ license = { workspace = true }
 workspace = true

 [dependencies]
+distribution-types = { workspace = true }
+pep440_rs = { workspace = true }
 pep508_rs = { workspace = true, features = ["non-pep508-extensions"] }
 uv-client = { workspace = true }
 uv-fs = { workspace = true }
@@ -25,6 +27,7 @@ regex = { workspace = true }
 reqwest = { workspace = true, optional = true }
 reqwest-middleware = { workspace = true, optional = true }
 serde = { workspace = true }
+thiserror = { workspace = true }
 tracing = { workspace = true }
 unscanny = { workspace = true }
 url = { workspace = true }
```
```diff
@@ -40,16 +40,17 @@ use std::io;
 use std::path::{Path, PathBuf};
 use std::str::FromStr;

-use serde::{Deserialize, Serialize};
 use tracing::instrument;
 use unscanny::{Pattern, Scanner};
 use url::Url;

-use pep508_rs::{
-    expand_env_vars, split_scheme, strip_host, Extras, Pep508Error, Pep508ErrorSource, Requirement,
-    Scheme, VerbatimUrl,
+use distribution_types::{
+    ParsedUrlError, Requirement, UnresolvedRequirement, UnresolvedRequirementSpecification,
 };
+use pep508_rs::{
+    expand_env_vars, split_scheme, strip_host, Extras, Pep508Error, Pep508ErrorSource, Scheme,
+    VerbatimUrl,
+};
-pub use requirement::RequirementsTxtRequirement;
 #[cfg(feature = "http")]
-use uv_client::BaseClient;
+use uv_client::BaseClientBuilder;
@@ -58,6 +59,8 @@ use uv_fs::{normalize_url_path, Simplified};
 use uv_normalize::ExtraName;
 use uv_warnings::warn_user;

+pub use crate::requirement::{RequirementsTxtRequirement, RequirementsTxtRequirementError};
+
 mod requirement;

 /// We emit one of those for each requirements.txt entry
@@ -294,23 +297,34 @@ impl Display for EditableRequirement {
     }
 }

-/// A [Requirement] with additional metadata from the requirements.txt, currently only hashes but in
-/// the future also editable an similar information
-#[derive(Debug, Deserialize, Clone, Eq, PartialEq, Hash, Serialize)]
+/// A [Requirement] with additional metadata from the `requirements.txt`, currently only hashes but in
+/// the future also editable and similar information.
+#[derive(Debug, Clone, Eq, PartialEq, Hash)]
 pub struct RequirementEntry {
-    /// The actual PEP 508 requirement
+    /// The actual PEP 508 requirement.
     pub requirement: RequirementsTxtRequirement,
-    /// Hashes of the downloadable packages
+    /// Hashes of the downloadable packages.
     pub hashes: Vec<String>,
 }

-impl Display for RequirementEntry {
-    fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
-        write!(f, "{}", self.requirement)?;
-        for hash in &self.hashes {
-            write!(f, " --hash {hash}")?;
-        }
-        Ok(())
+// We place the impl here instead of next to `UnresolvedRequirementSpecification` because
+// `UnresolvedRequirementSpecification` is defined in `distribution-types` and `requirements-txt`
+// depends on `distribution-types`.
+impl TryFrom<RequirementEntry> for UnresolvedRequirementSpecification {
+    type Error = ParsedUrlError;
+
+    fn try_from(value: RequirementEntry) -> Result<Self, Self::Error> {
+        Ok(Self {
+            requirement: match value.requirement {
+                RequirementsTxtRequirement::Named(named) => {
+                    UnresolvedRequirement::Named(Requirement::from_pep508(named)?)
+                }
+                RequirementsTxtRequirement::Unnamed(unnamed) => {
+                    UnresolvedRequirement::Unnamed(unnamed)
+                }
+            },
+            hashes: value.hashes,
+        })
     }
 }
@@ -320,7 +334,7 @@ pub struct RequirementsTxt {
     /// The actual requirements with the hashes.
     pub requirements: Vec<RequirementEntry>,
     /// Constraints included with `-c`.
-    pub constraints: Vec<Requirement>,
+    pub constraints: Vec<pep508_rs::Requirement>,
     /// Editables with `-e`.
     pub editables: Vec<EditableRequirement>,
     /// The index URL, specified with `--index-url`.
@@ -486,7 +500,7 @@ impl RequirementsTxt {
             // _requirements_, but we don't want to support that.
             for entry in sub_constraints.requirements {
                 match entry.requirement {
-                    RequirementsTxtRequirement::Pep508(requirement) => {
+                    RequirementsTxtRequirement::Named(requirement) => {
                         data.constraints.push(requirement);
                     }
                     RequirementsTxtRequirement::Unnamed(_) => {
@@ -782,8 +796,15 @@ fn parse_requirement_and_hashes(
     }

     let requirement =
-        RequirementsTxtRequirement::parse(requirement, working_dir).map_err(|err| {
-            match err.message {
+        RequirementsTxtRequirement::parse(requirement, working_dir).map_err(|err| match err {
+            RequirementsTxtRequirementError::ParsedUrl(err) => {
+                RequirementsTxtParserError::ParsedUrl {
+                    source: err,
+                    start,
+                    end,
+                }
+            }
+            RequirementsTxtRequirementError::Pep508(err) => match err.message {
                 Pep508ErrorSource::String(_) | Pep508ErrorSource::UrlError(_) => {
                     RequirementsTxtParserError::Pep508 {
                         source: err,
@@ -798,7 +819,7 @@ fn parse_requirement_and_hashes(
                         end,
                     }
                 }
-            }
+            },
         })?;

     let hashes = if has_hashes {
@@ -934,6 +955,11 @@ pub enum RequirementsTxtParserError {
         start: usize,
         end: usize,
     },
+    ParsedUrl {
+        source: Box<ParsedUrlError>,
+        start: usize,
+        end: usize,
+    },
     Subfile {
         source: Box<RequirementsTxtFileError>,
         start: usize,
@@ -1011,6 +1037,11 @@ impl RequirementsTxtParserError {
                 start: start + offset,
                 end: end + offset,
             },
+            Self::ParsedUrl { source, start, end } => Self::ParsedUrl {
+                source,
+                start: start + offset,
+                end: end + offset,
+            },
             Self::Subfile { source, start, end } => Self::Subfile {
                 source,
                 start: start + offset,
@@ -1061,6 +1092,9 @@ impl Display for RequirementsTxtParserError {
             Self::Pep508 { start, .. } => {
                 write!(f, "Couldn't parse requirement at position {start}")
             }
+            Self::ParsedUrl { start, .. } => {
+                write!(f, "Couldn't parse URL at position {start}")
+            }
             Self::Subfile { start, .. } => {
                 write!(f, "Error parsing included file at position {start}")
             }
@@ -1092,6 +1126,7 @@ impl std::error::Error for RequirementsTxtParserError {
             Self::UnnamedConstraint { .. } => None,
             Self::UnsupportedRequirement { source, .. } => Some(source),
             Self::Pep508 { source, .. } => Some(source),
+            Self::ParsedUrl { source, .. } => Some(source),
             Self::Subfile { source, .. } => Some(source.as_ref()),
             Self::Parser { .. } => None,
             Self::NonUnicodeUrl { .. } => None,
@@ -1179,6 +1214,13 @@ impl Display for RequirementsTxtFileError {
                     self.file.user_display(),
                 )
             }
+            RequirementsTxtParserError::ParsedUrl { start, .. } => {
+                write!(
+                    f,
+                    "Couldn't parse URL in `{}` at position {start}",
+                    self.file.user_display(),
+                )
+            }
             RequirementsTxtParserError::Subfile { start, .. } => {
                 write!(
                     f,
@@ -1726,7 +1768,7 @@ mod test {
         RequirementsTxt {
             requirements: [
                 RequirementEntry {
-                    requirement: Pep508(
+                    requirement: Named(
                         Requirement {
                             name: PackageName(
                                 "flask",
@@ -1780,7 +1822,7 @@ mod test {
         RequirementsTxt {
             requirements: [
                 RequirementEntry {
-                    requirement: Pep508(
+                    requirement: Named(
                         Requirement {
                             name: PackageName(
                                 "flask",
@@ -1962,7 +2004,7 @@ mod test {
        RequirementsTxt {
            requirements: [
                RequirementEntry {
-                    requirement: Pep508(
+                    requirement: Named(
                        Requirement {
                            name: PackageName(
                                "httpx",
@@ -1975,7 +2017,7 @@ mod test {
                    hashes: [],
                },
                RequirementEntry {
-                    requirement: Pep508(
+                    requirement: Named(
                        Requirement {
                            name: PackageName(
                                "flask",
@@ -2001,7 +2043,7 @@ mod test {
                    ],
                },
                RequirementEntry {
-                    requirement: Pep508(
+                    requirement: Named(
                        Requirement {
                            name: PackageName(
                                "requests",
@@ -2027,7 +2069,7 @@ mod test {
                    ],
                },
                RequirementEntry {
-                    requirement: Pep508(
+                    requirement: Named(
                        Requirement {
                            name: PackageName(
                                "black",
@@ -2051,7 +2093,7 @@ mod test {
                    hashes: [],
                },
                RequirementEntry {
-                    requirement: Pep508(
+                    requirement: Named(
                        Requirement {
                            name: PackageName(
                                "mypy",
```
@ -1,106 +1,40 @@
|
|||
use pep508_rs::{
|
||||
MarkerEnvironment, MarkerTree, Pep508Error, Pep508ErrorSource, Requirement, UnnamedRequirement,
|
||||
VersionOrUrl, VersionOrUrlRef,
|
||||
};
|
||||
use serde::{Deserialize, Serialize};
|
||||
use std::fmt::{Display, Formatter};
|
||||
use std::path::Path;
|
||||
use std::str::FromStr;
|
||||
use uv_normalize::ExtraName;
|
||||
|
||||
use thiserror::Error;
|
||||
|
||||
use distribution_types::ParsedUrlError;
|
||||
use pep508_rs::{Pep508Error, Pep508ErrorSource, UnnamedRequirement};
|
||||
|
||||
/// A requirement specifier in a `requirements.txt` file.
|
||||
#[derive(Hash, Debug, Clone, Eq, PartialEq, Serialize, Deserialize)]
|
||||
///
|
||||
/// Analog to `SpecifiedRequirement` but with `pep508_rs::Requirement` instead of
|
||||
/// `distribution_types::Requirement`.
|
||||
#[derive(Hash, Debug, Clone, Eq, PartialEq)]
|
||||
pub enum RequirementsTxtRequirement {
|
||||
/// A PEP 508-compliant dependency specifier.
|
||||
Pep508(Requirement),
|
||||
/// The uv-specific superset over PEP 508 requirements specifier incorporating
|
||||
/// `tool.uv.sources`.
|
||||
Named(pep508_rs::Requirement),
|
||||
/// A PEP 508-like, direct URL dependency specifier.
|
||||
Unnamed(UnnamedRequirement),
|
||||
}
|
||||
|
||||
impl Display for RequirementsTxtRequirement {
|
||||
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
|
||||
match self {
|
||||
Self::Pep508(requirement) => write!(f, "{requirement}"),
|
||||
         Self::Unnamed(requirement) => write!(f, "{requirement}"),
     }
 }
-
-impl RequirementsTxtRequirement {
-    /// Returns whether the markers apply for the given environment
-    pub fn evaluate_markers(&self, env: &MarkerEnvironment, extras: &[ExtraName]) -> bool {
-        match self {
-            Self::Pep508(requirement) => requirement.evaluate_markers(env, extras),
-            Self::Unnamed(requirement) => requirement.evaluate_markers(env, extras),
-        }
-    }
-
-    /// Returns the extras for the requirement.
-    pub fn extras(&self) -> &[ExtraName] {
-        match self {
-            Self::Pep508(requirement) => requirement.extras.as_slice(),
-            Self::Unnamed(requirement) => requirement.extras.as_slice(),
-        }
-    }
-
-    /// Returns the markers for the requirement.
-    pub fn markers(&self) -> Option<&MarkerTree> {
-        match self {
-            Self::Pep508(requirement) => requirement.marker.as_ref(),
-            Self::Unnamed(requirement) => requirement.marker.as_ref(),
-        }
-    }
-
-    /// Return the version specifier or URL for the requirement.
-    pub fn version_or_url(&self) -> Option<VersionOrUrlRef> {
-        match self {
-            Self::Pep508(requirement) => match requirement.version_or_url.as_ref() {
-                Some(VersionOrUrl::VersionSpecifier(specifiers)) => {
-                    Some(VersionOrUrlRef::VersionSpecifier(specifiers))
-                }
-                Some(VersionOrUrl::Url(url)) => Some(VersionOrUrlRef::Url(url)),
-                None => None,
-            },
-            Self::Unnamed(requirement) => Some(VersionOrUrlRef::Url(&requirement.url)),
-        }
-    }
-}
-
-impl From<Requirement> for RequirementsTxtRequirement {
-    fn from(requirement: Requirement) -> Self {
-        Self::Pep508(requirement)
-    }
-}
-
-impl From<UnnamedRequirement> for RequirementsTxtRequirement {
-    fn from(requirement: UnnamedRequirement) -> Self {
-        Self::Unnamed(requirement)
-    }
-}
-
-impl FromStr for RequirementsTxtRequirement {
-    type Err = Pep508Error;
-
-    /// Parse a requirement as seen in a `requirements.txt` file.
-    fn from_str(input: &str) -> Result<Self, Self::Err> {
-        match Requirement::from_str(input) {
-            Ok(requirement) => Ok(Self::Pep508(requirement)),
-            Err(err) => match err.message {
-                Pep508ErrorSource::UnsupportedRequirement(_) => {
-                    Ok(Self::Unnamed(UnnamedRequirement::from_str(input)?))
-                }
-                _ => Err(err),
-            },
-        }
-    }
-}
+
+#[derive(Debug, Error)]
+pub enum RequirementsTxtRequirementError {
+    #[error(transparent)]
+    ParsedUrl(#[from] Box<ParsedUrlError>),
+    #[error(transparent)]
+    Pep508(#[from] Pep508Error),
+}

 impl RequirementsTxtRequirement {
     /// Parse a requirement as seen in a `requirements.txt` file.
-    pub fn parse(input: &str, working_dir: impl AsRef<Path>) -> Result<Self, Pep508Error> {
+    pub fn parse(
+        input: &str,
+        working_dir: impl AsRef<Path>,
+    ) -> Result<Self, RequirementsTxtRequirementError> {
         // Attempt to parse as a PEP 508-compliant requirement.
-        match Requirement::parse(input, &working_dir) {
-            Ok(requirement) => Ok(Self::Pep508(requirement)),
+        match pep508_rs::Requirement::parse(input, &working_dir) {
+            Ok(requirement) => Ok(Self::Named(requirement)),
             Err(err) => match err.message {
                 Pep508ErrorSource::UnsupportedRequirement(_) => {
                     // If that fails, attempt to parse as a direct URL requirement.

@@ -109,7 +43,7 @@ impl RequirementsTxtRequirement {
                         &working_dir,
                     )?))
                 }
-                _ => Err(err),
+                _ => Err(RequirementsTxtRequirementError::Pep508(err)),
             },
         }
     }
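The parse above stays two-stage: try the named PEP 508 grammar first, and fall back to an unnamed URL/path requirement only when the parser reports `UnsupportedRequirement` — the change is that the fallback now surfaces a dedicated error type instead of reusing `Pep508Error`. A minimal, self-contained sketch of that control flow (the `Parsed` enum and the `parse_named` heuristic are illustrative stand-ins, not uv's real types):

```rust
/// Stand-in for `RequirementsTxtRequirement`: either a named PEP 508
/// requirement or a bare URL/path with no name.
#[derive(Debug)]
enum Parsed {
    Named(String),
    Unnamed(String),
}

#[derive(Debug)]
enum ParseError {
    Unsupported,
}

/// Toy heuristic standing in for `pep508_rs::Requirement::parse`: URLs and
/// paths are not valid PEP 508, so they fail with the "unsupported" class.
fn parse_named(input: &str) -> Result<String, ParseError> {
    if input.contains("://") || input.starts_with('.') || input.starts_with('/') {
        Err(ParseError::Unsupported)
    } else {
        Ok(input.to_string())
    }
}

/// The same two-stage shape as `RequirementsTxtRequirement::parse`.
fn parse_requirement(input: &str) -> Result<Parsed, ParseError> {
    match parse_named(input) {
        Ok(named) => Ok(Parsed::Named(named)),
        // In uv this arm only matches `Pep508ErrorSource::UnsupportedRequirement`;
        // every other parse error is propagated.
        Err(ParseError::Unsupported) => Ok(Parsed::Unnamed(input.to_string())),
    }
}

fn main() {
    println!("{:?}", parse_requirement("tqdm >=4.66.2,<5"));
    println!("{:?}", parse_requirement("./dist/example-0.1.0-py3-none-any.whl"));
}
```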
@@ -5,7 +5,7 @@ expression: actual
 RequirementsTxt {
     requirements: [
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "numpy",
@@ -29,7 +29,7 @@ RequirementsTxt {
             hashes: [],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "pandas",
@@ -53,7 +53,7 @@ RequirementsTxt {
             hashes: [],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "python-dateutil",
@@ -77,7 +77,7 @@ RequirementsTxt {
             hashes: [],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "pytz",
@@ -101,7 +101,7 @@ RequirementsTxt {
             hashes: [],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "six",
@@ -125,7 +125,7 @@ RequirementsTxt {
             hashes: [],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "tzdata",

@@ -5,7 +5,7 @@ expression: actual
 RequirementsTxt {
     requirements: [
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "django-debug-toolbar",

@@ -5,7 +5,7 @@ expression: actual
 RequirementsTxt {
     requirements: [
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "django",
@@ -29,7 +29,7 @@ RequirementsTxt {
             hashes: [],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "pytz",

@@ -5,7 +5,7 @@ expression: actual
 RequirementsTxt {
     requirements: [
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "numpy",
@@ -18,7 +18,7 @@ RequirementsTxt {
             hashes: [],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "pandas",

@@ -5,7 +5,7 @@ expression: actual
 RequirementsTxt {
     requirements: [
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "inflection",
@@ -29,7 +29,7 @@ RequirementsTxt {
             hashes: [],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "upsidedown",
@@ -53,7 +53,7 @@ RequirementsTxt {
             hashes: [],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "numpy",
@@ -66,7 +66,7 @@ RequirementsTxt {
             hashes: [],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "pandas",

@@ -5,7 +5,7 @@ expression: actual
 RequirementsTxt {
     requirements: [
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "tomli",
@@ -18,7 +18,7 @@ RequirementsTxt {
             hashes: [],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "numpy",

@@ -5,7 +5,7 @@ expression: actual
 RequirementsTxt {
     requirements: [
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "tomli",

@@ -5,7 +5,7 @@ expression: actual
 RequirementsTxt {
     requirements: [
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "werkzeug",
@@ -58,7 +58,7 @@ RequirementsTxt {
             ],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "urllib3",
@@ -111,7 +111,7 @@ RequirementsTxt {
             ],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "ansicon",
@@ -175,7 +175,7 @@ RequirementsTxt {
             ],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "requests-oauthlib",
@@ -229,7 +229,7 @@ RequirementsTxt {
             ],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "psycopg2",

@@ -5,7 +5,7 @@ expression: actual
 RequirementsTxt {
     requirements: [
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "tqdm",
@@ -29,7 +29,7 @@ RequirementsTxt {
             hashes: [],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "tomli-w",

@@ -5,7 +5,7 @@ expression: actual
 RequirementsTxt {
     requirements: [
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "numpy",
@@ -18,7 +18,7 @@ RequirementsTxt {
             hashes: [],
         },
         RequirementEntry {
-            requirement: Pep508(
+            requirement: Named(
                 Requirement {
                     name: PackageName(
                         "pandas",
@@ -25,9 +25,9 @@ use tokio::process::Command;
 use tokio::sync::Mutex;
 use tracing::{debug, info_span, instrument, Instrument};

-use distribution_types::Resolution;
+use distribution_types::{ParsedUrlError, Requirement, Resolution};
 use pep440_rs::Version;
-use pep508_rs::{PackageName, Requirement};
+use pep508_rs::PackageName;
 use uv_configuration::{BuildKind, ConfigSettings, SetupPyStrategy};
 use uv_fs::{PythonExt, Simplified};
 use uv_interpreter::{Interpreter, PythonEnvironment};

@@ -54,14 +54,18 @@ static WHEEL_NOT_FOUND_RE: Lazy<Regex> =
 static DEFAULT_BACKEND: Lazy<Pep517Backend> = Lazy::new(|| Pep517Backend {
     backend: "setuptools.build_meta:__legacy__".to_string(),
     backend_path: None,
-    requirements: vec![Requirement::from_str("setuptools >= 40.8.0").unwrap()],
+    requirements: vec![Requirement::from_pep508(
+        pep508_rs::Requirement::from_str("setuptools >= 40.8.0").unwrap(),
+    )
+    .unwrap()],
 });

 /// The requirements for `--legacy-setup-py` builds.
 static SETUP_PY_REQUIREMENTS: Lazy<[Requirement; 2]> = Lazy::new(|| {
     [
-        Requirement::from_str("setuptools >= 40.8.0").unwrap(),
-        Requirement::from_str("wheel").unwrap(),
+        Requirement::from_pep508(pep508_rs::Requirement::from_str("setuptools >= 40.8.0").unwrap())
+            .unwrap(),
+        Requirement::from_pep508(pep508_rs::Requirement::from_str("wheel").unwrap()).unwrap(),
     ]
 });

@@ -100,6 +104,8 @@ pub enum Error {
     },
     #[error("Failed to build PATH for build script")]
     BuildScriptPath(#[source] env::JoinPathsError),
+    #[error("Failed to parse requirements from build backend")]
+    DirectUrl(#[source] ParsedUrlError),
 }

 #[derive(Debug)]

@@ -223,7 +229,7 @@ pub struct Project {
 #[serde(rename_all = "kebab-case")]
 pub struct BuildSystem {
     /// PEP 508 dependencies required to execute the build system.
-    pub requires: Vec<Requirement>,
+    pub requires: Vec<pep508_rs::Requirement>,
     /// A string naming a Python object that will be used to perform the build.
     pub build_backend: Option<String>,
     /// Specify that their backend code is hosted in-tree, this key contains a list of directories.

@@ -571,7 +577,12 @@ impl SourceBuild {
                 .build_backend
                 .unwrap_or_else(|| "setuptools.build_meta:__legacy__".to_string()),
             backend_path: build_system.backend_path,
-            requirements: build_system.requires,
+            requirements: build_system
+                .requires
+                .into_iter()
+                .map(Requirement::from_pep508)
+                .collect::<Result<_, _>>()
+                .map_err(|err| Box::new(Error::DirectUrl(err)))?,
         }
     } else {
         // If a `pyproject.toml` is present, but `[build-system]` is missing, proceed with

@@ -943,7 +954,7 @@ async fn create_pep517_build_environment(
     })?;

     // Deserialize the requirements from the output file.
-    let extra_requires: Vec<Requirement> = serde_json::from_slice(&contents).map_err(|err| {
+    let extra_requires: Vec<pep508_rs::Requirement> = serde_json::from_slice::<Vec<pep508_rs::Requirement>>(&contents).map_err(|err| {
         Error::from_command_output(
             format!(
                 "Build backend failed to return extra requires with `get_requires_for_build_{build_kind}`: {err}"

@@ -952,6 +963,11 @@ async fn create_pep517_build_environment(
             version_id,
         )
     })?;
+    let extra_requires: Vec<_> = extra_requires
+        .into_iter()
+        .map(Requirement::from_pep508)
+        .collect::<Result<_, _>>()
+        .map_err(Error::DirectUrl)?;

     // Some packages (such as tqdm 4.66.1) list only extra requires that have already been part of
     // the pyproject.toml requires (in this case, `wheel`). We can skip doing the whole resolution

@@ -962,7 +978,7 @@ async fn create_pep517_build_environment(
         .any(|req| !pep517_backend.requirements.contains(req))
     {
         debug!("Installing extra requirements for build backend");
-        let requirements: Vec<Requirement> = pep517_backend
+        let requirements: Vec<_> = pep517_backend
            .requirements
            .iter()
            .cloned()
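The recurring shape in this file: build-backend requirements arrive as `pep508_rs::Requirement` (that is what PEP 517/518 hand us) and are lowered with `Requirement::from_pep508`, which is fallible because turning a verbatim URL into a typed source can fail, so collections go through `collect::<Result<_, _>>()`. A standalone sketch of that fallible bulk conversion (all types here are stand-ins, not the real uv types):

```rust
#[derive(Debug)]
struct Pep508Requirement(String);

#[derive(Debug)]
struct Requirement(String);

#[derive(Debug)]
struct ParsedUrlError(String);

/// Stand-in for `Requirement::from_pep508`: the conversion is fallible
/// because lowering a verbatim URL into a typed source can fail.
fn from_pep508(req: Pep508Requirement) -> Result<Requirement, ParsedUrlError> {
    if req.0.starts_with("git+") && !req.0.contains("://") {
        return Err(ParsedUrlError(req.0));
    }
    Ok(Requirement(req.0))
}

fn main() -> Result<(), ParsedUrlError> {
    let requires = vec![
        Pep508Requirement("setuptools >= 40.8.0".into()),
        Pep508Requirement("wheel".into()),
    ];
    // The same `map(...).collect::<Result<_, _>>()` shape as in the diff:
    // the first conversion error aborts the whole collection.
    let requirements: Vec<Requirement> = requires
        .into_iter()
        .map(from_pep508)
        .collect::<Result<_, _>>()?;
    println!("{requirements:?}");
    Ok(())
}
```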
@@ -13,9 +13,11 @@ license = { workspace = true }
 workspace = true

 [dependencies]
+distribution-types = { workspace = true }
 pep508_rs = { workspace = true }
 platform-tags = { workspace = true }
 uv-auth = { workspace = true }
 uv-cache = { workspace = true }
 uv-normalize = { workspace = true }

 anyhow = { workspace = true }
@@ -1,8 +1,8 @@
 use std::hash::BuildHasherDefault;

+use distribution_types::Requirement;
 use rustc_hash::FxHashMap;

-use pep508_rs::Requirement;
 use uv_normalize::PackageName;

 /// A set of constraints for a set of requirements.
@@ -3,7 +3,7 @@ use std::hash::BuildHasherDefault;
 use itertools::Either;
 use rustc_hash::FxHashMap;

-use pep508_rs::Requirement;
+use distribution_types::Requirement;
 use uv_normalize::PackageName;

 /// A set of overrides for a set of requirements.
@@ -30,6 +30,7 @@ uv-fs = { workspace = true }
 uv-installer = { workspace = true }
 uv-interpreter = { workspace = true }
 uv-normalize = { workspace = true }
+uv-requirements = { workspace = true, features = ["schemars"] }
 uv-resolver = { workspace = true }
 uv-types = { workspace = true }
 uv-workspace = { workspace = true, features = ["schemars"] }
@@ -3,12 +3,26 @@ use std::path::PathBuf;
 use anstream::println;
 use anyhow::{bail, Result};
 use pretty_assertions::StrComparison;
-use schemars::schema_for;
+use schemars::{schema_for, JsonSchema};
+use serde::Deserialize;

 use uv_workspace::Options;

 use crate::ROOT_DIR;

+#[derive(Deserialize, JsonSchema)]
+#[serde(deny_unknown_fields)]
+#[allow(dead_code)]
+// The names and docstrings of this struct and the types it contains are used as `title` and
+// `description` in uv.schema.json, see https://github.com/SchemaStore/schemastore/blob/master/editor-features.md#title-as-an-expected-object-type
+/// Metadata and configuration for uv.
+struct ToolUv {
+    #[serde(flatten)]
+    options: Options,
+    #[serde(flatten)]
+    dep_spec: uv_requirements::pyproject::ToolUv,
+}
+
 #[derive(clap::Args)]
 pub(crate) struct GenerateJsonSchemaArgs {
     /// Write the generated table to stdout (rather than to `uv.schema.json`).

@@ -30,7 +44,7 @@ enum Mode {
 }

 pub(crate) fn main(args: &GenerateJsonSchemaArgs) -> Result<()> {
-    let schema = schema_for!(Options);
+    let schema = schema_for!(ToolUv);
     let schema_string = serde_json::to_string_pretty(&schema).unwrap();
     let filename = "uv.schema.json";
     let schema_path = PathBuf::from(ROOT_DIR).join(filename);
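The schema is now derived from a wrapper that flattens uv's settings and the dependency specification into a single `tool.uv` object. A self-contained sketch of the `#[serde(flatten)]` + schemars pattern, with toy fields standing in for the real options (assumes `schemars`, `serde`, and `serde_json` as dependencies):

```rust
use schemars::{schema_for, JsonSchema};
use serde::Deserialize;

#[derive(Deserialize, JsonSchema)]
struct Options {
    /// Toy stand-in for a settings field.
    index_url: Option<String>,
}

#[derive(Deserialize, JsonSchema)]
struct DepSpec {
    /// Toy stand-in for `tool.uv.sources`.
    sources: Option<std::collections::BTreeMap<String, String>>,
}

/// Both halves are flattened, so the generated schema describes one flat
/// `tool.uv` object rather than two nested ones.
#[derive(Deserialize, JsonSchema)]
struct ToolUv {
    #[serde(flatten)]
    options: Options,
    #[serde(flatten)]
    dep_spec: DepSpec,
}

fn main() {
    let schema = schema_for!(ToolUv);
    println!("{}", serde_json::to_string_pretty(&schema).unwrap());
}
```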
@@ -3,14 +3,12 @@ use std::path::PathBuf;

 use anstream::println;
 use anyhow::{Context, Result};

 use clap::{Parser, ValueEnum};
 use fs_err::File;
 use itertools::Itertools;
 use petgraph::dot::{Config as DotConfig, Dot};

-use distribution_types::{FlatIndexLocation, IndexLocations, IndexUrl, Resolution};
-use pep508_rs::Requirement;
+use distribution_types::{FlatIndexLocation, IndexLocations, IndexUrl, Requirement, Resolution};
 use uv_cache::{Cache, CacheArgs};
 use uv_client::{FlatIndexClient, RegistryClientBuilder};
 use uv_configuration::{ConfigSettings, NoBinary, NoBuild, SetupPyStrategy};

@@ -29,7 +27,7 @@ pub(crate) enum ResolveCliFormat {

 #[derive(Parser)]
 pub(crate) struct ResolveCliArgs {
-    requirements: Vec<Requirement>,
+    requirements: Vec<pep508_rs::Requirement>,
     /// Write debug output in DOT format for graphviz to this file
     #[clap(long)]
     graphviz: Option<PathBuf>,

@@ -101,7 +99,13 @@ pub(crate) async fn resolve_cli(args: ResolveCliArgs) -> Result<()> {
     // Copied from `BuildDispatch`
     let tags = venv.interpreter().tags()?;
     let resolver = Resolver::new(
-        Manifest::simple(args.requirements.clone()),
+        Manifest::simple(
+            args.requirements
+                .iter()
+                .cloned()
+                .map(Requirement::from_pep508)
+                .collect::<Result<_, _>>()?,
+        ),
         Options::default(),
         venv.interpreter().markers(),
         venv.interpreter(),
@@ -10,9 +10,9 @@ use tokio::time::Instant;
 use tracing::{info, info_span, Span};
 use tracing_indicatif::span_ext::IndicatifSpanExt;

-use distribution_types::IndexLocations;
+use distribution_types::{IndexLocations, Requirement};
 use pep440_rs::{Version, VersionSpecifier, VersionSpecifiers};
-use pep508_rs::{Requirement, VersionOrUrl};
+use pep508_rs::VersionOrUrl;
 use uv_cache::{Cache, CacheArgs};
 use uv_client::{OwnedArchive, RegistryClient, RegistryClientBuilder};
 use uv_configuration::{ConfigSettings, NoBinary, NoBuild, SetupPyStrategy};

@@ -68,10 +68,10 @@ pub(crate) async fn resolve_many(args: ResolveManyArgs) -> Result<()> {
     let tf_models_nightly = PackageName::from_str("tf-models-nightly").unwrap();
     let lines = data
         .lines()
-        .map(Requirement::from_str)
+        .map(pep508_rs::Requirement::from_str)
         .filter_ok(|req| req.name != tf_models_nightly);

-    let requirements: Vec<Requirement> = if let Some(limit) = args.limit {
+    let requirements: Vec<pep508_rs::Requirement> = if let Some(limit) = args.limit {
         lines.take(limit).collect::<Result<_, _>>()?
     } else {
         lines.collect::<Result<_, _>>()?

@@ -127,7 +127,7 @@ pub(crate) async fn resolve_many(args: ResolveManyArgs) -> Result<()> {
                 let equals_version = VersionOrUrl::VersionSpecifier(
                     VersionSpecifiers::from(VersionSpecifier::equals_version(version)),
                 );
-                Requirement {
+                pep508_rs::Requirement {
                     name: requirement.name,
                     extras: requirement.extras,
                     version_or_url: Some(equals_version),

@@ -140,7 +140,11 @@ pub(crate) async fn resolve_many(args: ResolveManyArgs) -> Result<()> {
                 requirement
             };

-            let result = build_dispatch.resolve(&[requirement.clone()]).await;
+            let result = build_dispatch
+                .resolve(&[
+                    Requirement::from_pep508(requirement.clone()).expect("Invalid requirement")
+                ])
+                .await;
             (requirement.to_string(), start.elapsed(), result)
         }
     })
@@ -12,8 +12,7 @@ use itertools::Itertools;
 use rustc_hash::FxHashMap;
 use tracing::{debug, instrument};

-use distribution_types::{IndexLocations, Name, Resolution, SourceDist};
-use pep508_rs::Requirement;
+use distribution_types::{IndexLocations, Name, Requirement, Resolution, SourceDist};
 use uv_build::{SourceBuild, SourceBuildContext};
 use uv_cache::Cache;
 use uv_client::RegistryClient;
@@ -67,7 +67,8 @@ pub(crate) async fn fetch_git_archive(
     )
     .map_err(Error::CacheWrite)?;

-    let ParsedGitUrl { url, subdirectory } = ParsedGitUrl::try_from(url).map_err(Box::new)?;
+    let ParsedGitUrl { url, subdirectory } =
+        ParsedGitUrl::try_from(url.clone()).map_err(Box::new)?;

     // Fetch the Git repository.
     let source = if let Some(reporter) = reporter {

@@ -95,7 +96,8 @@ pub(crate) async fn resolve_precise(
     cache: &Cache,
     reporter: Option<&Arc<dyn Reporter>>,
 ) -> Result<Option<Url>, Error> {
-    let ParsedGitUrl { url, subdirectory } = ParsedGitUrl::try_from(url).map_err(Box::new)?;
+    let ParsedGitUrl { url, subdirectory } =
+        ParsedGitUrl::try_from(url.clone()).map_err(Box::new)?;

     // If the Git reference already contains a complete SHA, short-circuit.
     if url.precise().is_some() {

@@ -154,7 +156,7 @@ pub(crate) async fn resolve_precise(
 /// This method will only return precise URLs for URLs that have already been resolved via
 /// [`resolve_precise`].
 pub fn to_precise(url: &Url) -> Option<Url> {
-    let ParsedGitUrl { url, subdirectory } = ParsedGitUrl::try_from(url).ok()?;
+    let ParsedGitUrl { url, subdirectory } = ParsedGitUrl::try_from(url.clone()).ok()?;
     let resolved_git_refs = RESOLVED_GIT_REFS.lock().unwrap();
     let reference = RepositoryReference::new(&url);
     let precise = resolved_git_refs.get(&reference)?;

@@ -182,12 +184,12 @@ fn is_same_reference_impl<'a>(
     resolved_refs: &FxHashMap<RepositoryReference, GitSha>,
 ) -> bool {
     // Convert `a` to a Git URL, if possible.
-    let Ok(a_git) = ParsedGitUrl::try_from(&Url::from(CanonicalUrl::new(a))) else {
+    let Ok(a_git) = ParsedGitUrl::try_from(Url::from(CanonicalUrl::new(a))) else {
         return false;
     };

     // Convert `b` to a Git URL, if possible.
-    let Ok(b_git) = ParsedGitUrl::try_from(&Url::from(CanonicalUrl::new(b))) else {
+    let Ok(b_git) = ParsedGitUrl::try_from(Url::from(CanonicalUrl::new(b))) else {
         return false;
     };
@@ -95,7 +95,7 @@ impl<'a> BuiltWheelIndex<'a> {
             return None;
         }

-        let Ok(Some(git_sha)) = git_reference(&source_dist.url) else {
+        let Ok(Some(git_sha)) = git_reference(source_dist.url.to_url()) else {
             return None;
         };
@@ -135,7 +135,8 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
             }
             BuildableSource::Dist(SourceDist::DirectUrl(dist)) => {
                 let filename = dist.filename().expect("Distribution must have a filename");
-                let ParsedArchiveUrl { url, subdirectory } = ParsedArchiveUrl::from(dist.url.raw());
+                let ParsedArchiveUrl { url, subdirectory } =
+                    ParsedArchiveUrl::from(dist.url.to_url());

                 // For direct URLs, cache directly under the hash of the URL itself.
                 let cache_shard = self

@@ -186,7 +187,8 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
                     .url
                     .filename()
                     .expect("Distribution must have a filename");
-                let ParsedArchiveUrl { url, subdirectory } = ParsedArchiveUrl::from(resource.url);
+                let ParsedArchiveUrl { url, subdirectory } =
+                    ParsedArchiveUrl::from(resource.url.clone());

                 // For direct URLs, cache directly under the hash of the URL itself.
                 let cache_shard = self

@@ -284,7 +286,8 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
             }
             BuildableSource::Dist(SourceDist::DirectUrl(dist)) => {
                 let filename = dist.filename().expect("Distribution must have a filename");
-                let ParsedArchiveUrl { url, subdirectory } = ParsedArchiveUrl::from(dist.url.raw());
+                let ParsedArchiveUrl { url, subdirectory } =
+                    ParsedArchiveUrl::from(dist.url.to_url());

                 // For direct URLs, cache directly under the hash of the URL itself.
                 let cache_shard = self

@@ -328,7 +331,8 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
                     .url
                     .filename()
                     .expect("Distribution must have a filename");
-                let ParsedArchiveUrl { url, subdirectory } = ParsedArchiveUrl::from(resource.url);
+                let ParsedArchiveUrl { url, subdirectory } =
+                    ParsedArchiveUrl::from(resource.url.clone());

                 // For direct URLs, cache directly under the hash of the URL itself.
                 let cache_shard = self
@@ -14,7 +14,8 @@ mod util;
 /// A URL reference to a Git repository.
 #[derive(Debug, Clone, PartialEq, Eq, Hash)]
 pub struct GitUrl {
-    /// The URL of the Git repository, with any query parameters and fragments removed.
+    /// The URL of the Git repository, with any query parameters, fragments, and leading `git+`
+    /// removed.
     repository: Url,
     /// The reference to the commit to use, which could be a branch, tag or revision.
     reference: GitReference,

@@ -23,6 +24,14 @@ pub struct GitUrl {
 }

 impl GitUrl {
+    pub fn new(repository: Url, reference: GitReference) -> Self {
+        Self {
+            repository,
+            reference,
+            precise: None,
+        }
+    }
+
     #[must_use]
     pub fn with_precise(mut self, precise: GitSha) -> Self {
         self.precise = Some(precise);
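`GitUrl::new` plus the consuming `with_precise` keep the `precise` field private while letting callers pin a resolved SHA. A standalone sketch of the same builder shape (`GitSha` and `GitReference` here are simplified stand-ins for the uv-git types, not their real definitions):

```rust
/// Simplified stand-ins for `uv_git::{GitSha, GitReference}`.
#[derive(Debug, Clone)]
struct GitSha(String);

#[derive(Debug, Clone)]
enum GitReference {
    Rev(String),
}

#[derive(Debug, Clone)]
struct GitUrl {
    repository: String,
    reference: GitReference,
    precise: Option<GitSha>,
}

impl GitUrl {
    /// Mirror of the new constructor: `precise` always starts out unset.
    fn new(repository: String, reference: GitReference) -> Self {
        Self { repository, reference, precise: None }
    }

    /// `#[must_use]` + consuming `self` means a pinned SHA can't be
    /// silently dropped by ignoring the return value.
    #[must_use]
    fn with_precise(mut self, precise: GitSha) -> Self {
        self.precise = Some(precise);
        self
    }
}

fn main() {
    let sha = "cc372d09dcd5a5eabdc6ed4cf365bdb0be004d44";
    let url = GitUrl::new(
        "https://github.com/tqdm/tqdm".to_string(),
        GitReference::Rev(sha.to_string()),
    )
    .with_precise(GitSha(sha.to_string()));
    println!("{url:?}");
}
```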
@@ -13,6 +13,7 @@ license = { workspace = true }
 workspace = true

 [dependencies]
+distribution-filename = { workspace = true }
 distribution-types = { workspace = true }
 install-wheel-rs = { workspace = true, default-features = false }
 pep440_rs = { workspace = true }

@@ -28,6 +29,7 @@ uv-extract = { workspace = true }
 uv-fs = { workspace = true }
 uv-interpreter = { workspace = true }
 uv-normalize = { workspace = true }
+uv-requirements = { workspace = true }
 uv-types = { workspace = true }
 uv-warnings = { workspace = true }
@@ -1,16 +1,18 @@
 use std::collections::hash_map::Entry;
 use std::hash::BuildHasherDefault;
+use std::path::Path;
 use std::str::FromStr;

 use anyhow::{bail, Result};
 use rustc_hash::FxHashMap;
 use tracing::{debug, warn};

-use distribution_types::Hashed;
+use distribution_filename::WheelFilename;
 use distribution_types::{
-    BuiltDist, CachedDirectUrlDist, CachedDist, Dist, IndexLocations, InstalledDist,
-    InstalledMetadata, InstalledVersion, Name, SourceDist,
+    CachedDirectUrlDist, CachedDist, DirectUrlBuiltDist, DirectUrlSourceDist, Error, GitSourceDist,
+    Hashed, IndexLocations, InstalledDist, InstalledMetadata, InstalledVersion, Name,
+    PathBuiltDist, PathSourceDist, RemoteSource, Requirement, RequirementSource, Verbatim,
 };
-use pep508_rs::{Requirement, VersionOrUrl, VersionOrUrlRef};
 use platform_tags::Tags;
 use uv_cache::{ArchiveTimestamp, Cache, CacheBucket, WheelCache};
 use uv_configuration::{NoBinary, Reinstall};

@@ -144,7 +146,7 @@ impl<'a> Planner<'a> {
         }

         // If we see the same requirement twice, then we have a conflict.
-        let specifier = Specifier::NonEditable(requirement.version_or_url.as_ref());
+        let specifier = Specifier::NonEditable(&requirement.source);
         match seen.entry(requirement.name.clone()) {
             Entry::Occupied(value) => {
                 if value.get() == &specifier {

@@ -183,14 +185,7 @@ impl<'a> Planner<'a> {
             match installed_dists.as_slice() {
                 [] => {}
                 [distribution] => {
-                    match RequirementSatisfaction::check(
-                        distribution,
-                        requirement
-                            .version_or_url
-                            .as_ref()
-                            .map(VersionOrUrlRef::from),
-                        requirement,
-                    )? {
+                    match RequirementSatisfaction::check(distribution, &requirement.source)? {
                         RequirementSatisfaction::Mismatch => {}
                         RequirementSatisfaction::Satisfied => {
                             debug!("Requirement already installed: {distribution}");

@@ -211,23 +206,14 @@ impl<'a> Planner<'a> {
             }

             if cache.must_revalidate(&requirement.name) {
-                debug!("Must revalidate requirement: {requirement}");
+                debug!("Must revalidate requirement: {}", requirement.name);
                 remote.push(requirement.clone());
                 continue;
             }

             // Identify any cached distributions that satisfy the requirement.
-            match requirement.version_or_url.as_ref() {
-                None => {
-                    if let Some((_version, distribution)) =
-                        registry_index.get(&requirement.name).next()
-                    {
-                        debug!("Requirement already cached: {distribution}");
-                        cached.push(CachedDist::Registry(distribution.clone()));
-                        continue;
-                    }
-                }
-                Some(VersionOrUrl::VersionSpecifier(specifier)) => {
+            match &requirement.source {
+                RequirementSource::Registry { specifier, .. } => {
                     if let Some((_version, distribution)) = registry_index
                         .get(&requirement.name)
                         .find(|(version, _)| specifier.contains(version))

@@ -237,40 +223,158 @@ impl<'a> Planner<'a> {
                         continue;
                     }
                 }
-                Some(VersionOrUrl::Url(url)) => {
-                    match Dist::from_url(requirement.name.clone(), url.clone())? {
-                        Dist::Built(BuiltDist::Registry(_)) => {
-                            // Nothing to do.
+                RequirementSource::Url { url, .. } => {
+                    // Check if we have a wheel or a source distribution.
+                    if Path::new(url.path())
+                        .extension()
+                        .is_some_and(|ext| ext.eq_ignore_ascii_case("whl"))
+                    {
+                        // Validate that the name in the wheel matches that of the requirement.
+                        let filename = WheelFilename::from_str(&url.filename()?)?;
+                        if filename.name != requirement.name {
+                            return Err(Error::PackageNameMismatch(
+                                requirement.name.clone(),
+                                filename.name,
+                                url.verbatim().to_string(),
+                            )
+                            .into());
+                        }
-                        Dist::Source(SourceDist::Registry(_)) => {
-                            // Nothing to do.
+
+                        let wheel = DirectUrlBuiltDist {
+                            filename,
+                            url: url.clone(),
+                        };
+
+                        if !wheel.filename.is_compatible(tags) {
+                            bail!(
+                                "A URL dependency is incompatible with the current platform: {}",
+                                wheel.url
+                            );
+                        }
-                        Dist::Built(BuiltDist::DirectUrl(wheel)) => {
-                            if !wheel.filename.is_compatible(tags) {
-                                bail!(
-                                    "A URL dependency is incompatible with the current platform: {}",
-                                    wheel.url
+
+                        if no_binary {
+                            bail!(
+                                "A URL dependency points to a wheel which conflicts with `--no-binary`: {}",
+                                wheel.url
+                            );
+                        }
+
+                        // Find the exact wheel from the cache, since we know the filename in
+                        // advance.
+                        let cache_entry = cache
+                            .shard(
+                                CacheBucket::Wheels,
+                                WheelCache::Url(&wheel.url).wheel_dir(wheel.name().as_ref()),
+                            )
+                            .entry(format!("{}.http", wheel.filename.stem()));
+
+                        // Read the HTTP pointer.
+                        if let Some(pointer) = HttpArchivePointer::read_from(&cache_entry)? {
+                            let archive = pointer.into_archive();
+                            if archive.satisfies(hasher.get(&wheel)) {
+                                let cached_dist = CachedDirectUrlDist::from_url(
+                                    wheel.filename,
+                                    wheel.url,
+                                    archive.hashes,
+                                    cache.archive(&archive.id),
                                 );
+
+                                debug!("URL wheel requirement already cached: {cached_dist}");
+                                cached.push(CachedDist::Url(cached_dist));
+                                continue;
                             }
+                        }
+                    } else {
+                        let sdist = DirectUrlSourceDist {
+                            name: requirement.name.clone(),
+                            url: url.clone(),
+                        };
+                        // Find the most-compatible wheel from the cache, since we don't know
+                        // the filename in advance.
+                        if let Some(wheel) = built_index.url(&sdist)? {
+                            let cached_dist = wheel.into_url_dist(url.clone());
+                            debug!("URL source requirement already cached: {cached_dist}");
+                            cached.push(CachedDist::Url(cached_dist));
+                            continue;
+                        }
+                    }
+                }
+                RequirementSource::Git { url, .. } => {
+                    let sdist = GitSourceDist {
+                        name: requirement.name.clone(),
+                        url: url.clone(),
+                    };
+                    // Find the most-compatible wheel from the cache, since we don't know
+                    // the filename in advance.
+                    if let Some(wheel) = built_index.git(&sdist) {
+                        let cached_dist = wheel.into_url_dist(url.clone());
+                        debug!("Git source requirement already cached: {cached_dist}");
+                        cached.push(CachedDist::Url(cached_dist));
+                        continue;
+                    }
+                }
+                RequirementSource::Path { url, .. } => {
+                    // Store the canonicalized path, which also serves to validate that it exists.
+                    let path = match url
+                        .to_file_path()
+                        .map_err(|()| Error::UrlFilename(url.to_url()))?
+                        .canonicalize()
+                    {
+                        Ok(path) => path,
+                        Err(err) if err.kind() == std::io::ErrorKind::NotFound => {
+                            return Err(Error::NotFound(url.to_url()).into());
+                        }
+                        Err(err) => return Err(err.into()),
+                    };
-                            if no_binary {
-                                bail!(
-                                    "A URL dependency points to a wheel which conflicts with `--no-binary`: {}",
-                                    wheel.url
-                                );
-                            }
+                    // Check if we have a wheel or a source distribution.
+                    if path
+                        .extension()
+                        .is_some_and(|ext| ext.eq_ignore_ascii_case("whl"))
+                    {
+                        // Validate that the name in the wheel matches that of the requirement.
+                        let filename = WheelFilename::from_str(&url.filename()?)?;
+                        if filename.name != requirement.name {
+                            return Err(Error::PackageNameMismatch(
+                                requirement.name.clone(),
+                                filename.name,
+                                url.verbatim().to_string(),
+                            )
+                            .into());
+                        }
-                            // Find the exact wheel from the cache, since we know the filename in
-                            // advance.
-                            let cache_entry = cache
-                                .shard(
-                                    CacheBucket::Wheels,
-                                    WheelCache::Url(&wheel.url).wheel_dir(wheel.name().as_ref()),
-                                )
-                                .entry(format!("{}.http", wheel.filename.stem()));
+                        let wheel = PathBuiltDist {
+                            filename,
+                            url: url.clone(),
+                            path,
+                        };
-                            // Read the HTTP pointer.
-                            if let Some(pointer) = HttpArchivePointer::read_from(&cache_entry)? {
+                        if !wheel.filename.is_compatible(tags) {
+                            bail!(
+                                "A path dependency is incompatible with the current platform: {}",
+                                wheel.path.user_display()
+                            );
+                        }
+
+                        if no_binary {
+                            bail!(
+                                "A path dependency points to a wheel which conflicts with `--no-binary`: {}",
+                                wheel.url
+                            );
+                        }
+
+                        // Find the exact wheel from the cache, since we know the filename in
+                        // advance.
+                        let cache_entry = cache
+                            .shard(
+                                CacheBucket::Wheels,
+                                WheelCache::Url(&wheel.url).wheel_dir(wheel.name().as_ref()),
+                            )
+                            .entry(format!("{}.rev", wheel.filename.stem()));
+
+                        if let Some(pointer) = LocalArchivePointer::read_from(&cache_entry)? {
+                            let timestamp = ArchiveTimestamp::from_file(&wheel.path)?;
+                            if pointer.is_up_to_date(timestamp) {
+                                let archive = pointer.into_archive();
+                                if archive.satisfies(hasher.get(&wheel)) {
+                                    let cached_dist = CachedDirectUrlDist::from_url(

@@ -280,86 +384,26 @@ impl<'a> Planner<'a> {
                                         cache.archive(&archive.id),
                                     );

-                                    debug!("URL wheel requirement already cached: {cached_dist}");
+                                    debug!("Path wheel requirement already cached: {cached_dist}");
                                     cached.push(CachedDist::Url(cached_dist));
                                     continue;
                                 }
                             }
                         }
-                        Dist::Built(BuiltDist::Path(wheel)) => {
-                            if !wheel.filename.is_compatible(tags) {
-                                bail!(
-                                    "A path dependency is incompatible with the current platform: {}",
-                                    wheel.path.user_display()
-                                );
-                            }
-
-                            if no_binary {
-                                bail!(
-                                    "A path dependency points to a wheel which conflicts with `--no-binary`: {}",
-                                    wheel.url
-                                );
-                            }
-
-                            // Find the exact wheel from the cache, since we know the filename in
-                            // advance.
-                            let cache_entry = cache
-                                .shard(
-                                    CacheBucket::Wheels,
-                                    WheelCache::Url(&wheel.url).wheel_dir(wheel.name().as_ref()),
-                                )
-                                .entry(format!("{}.rev", wheel.filename.stem()));
-
-                            if let Some(pointer) = LocalArchivePointer::read_from(&cache_entry)? {
-                                let timestamp = ArchiveTimestamp::from_file(&wheel.path)?;
-                                if pointer.is_up_to_date(timestamp) {
-                                    let archive = pointer.into_archive();
-                                    if archive.satisfies(hasher.get(&wheel)) {
-                                        let cached_dist = CachedDirectUrlDist::from_url(
-                                            wheel.filename,
-                                            wheel.url,
-                                            archive.hashes,
-                                            cache.archive(&archive.id),
-                                        );
-
-                                        debug!(
-                                            "Path wheel requirement already cached: {cached_dist}"
-                                        );
-                                        cached.push(CachedDist::Url(cached_dist));
-                                        continue;
-                                    }
-                                }
-                            }
-                        }
-                        Dist::Source(SourceDist::DirectUrl(sdist)) => {
-                            // Find the most-compatible wheel from the cache, since we don't know
-                            // the filename in advance.
-                            if let Some(wheel) = built_index.url(&sdist)? {
-                                let cached_dist = wheel.into_url_dist(url.clone());
-                                debug!("URL source requirement already cached: {cached_dist}");
-                                cached.push(CachedDist::Url(cached_dist));
-                                continue;
-                            }
-                        }
-                        Dist::Source(SourceDist::Path(sdist)) => {
-                            // Find the most-compatible wheel from the cache, since we don't know
-                            // the filename in advance.
-                            if let Some(wheel) = built_index.path(&sdist)? {
-                                let cached_dist = wheel.into_url_dist(url.clone());
-                                debug!("Path source requirement already cached: {cached_dist}");
-                                cached.push(CachedDist::Url(cached_dist));
-                                continue;
-                            }
-                        }
-                        Dist::Source(SourceDist::Git(sdist)) => {
-                            // Find the most-compatible wheel from the cache, since we don't know
-                            // the filename in advance.
-                            if let Some(wheel) = built_index.git(&sdist) {
-                                let cached_dist = wheel.into_url_dist(url.clone());
-                                debug!("Git source requirement already cached: {cached_dist}");
-                                cached.push(CachedDist::Url(cached_dist));
-                                continue;
-                            }
-                        }
+                    } else {
+                        let sdist = PathSourceDist {
+                            name: requirement.name.clone(),
+                            url: url.clone(),
+                            path,
+                            editable: false,
+                        };
+                        // Find the most-compatible wheel from the cache, since we don't know
+                        // the filename in advance.
+                        if let Some(wheel) = built_index.path(&sdist)? {
+                            let cached_dist = wheel.into_url_dist(url.clone());
+                            debug!("Path source requirement already cached: {cached_dist}");
+                            cached.push(CachedDist::Url(cached_dist));
+                            continue;
+                        }
+                    }
+                }

@@ -405,7 +449,7 @@ enum Specifier<'a> {
     /// An editable requirement, marked by the installed version of the package.
     Editable(InstalledVersion<'a>),
     /// A non-editable requirement, marked by the version or URL specifier.
-    NonEditable(Option<&'a VersionOrUrl>),
+    NonEditable(&'a RequirementSource),
 }

 #[derive(Debug, Default)]
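The net effect of the plan.rs changes: instead of round-tripping through `Dist::from_url`, the planner matches directly on the requirement's typed source, and each source maps to one cache-lookup strategy. A condensed, illustrative dispatch (stand-in types; the real arms also validate wheel names and platform tags, as in the hunks above):

```rust
use std::path::PathBuf;

/// Stand-in for distribution-types' `RequirementSource`.
enum RequirementSource {
    Registry { specifier: String },
    Url { url: String },
    Git { url: String },
    Path { path: PathBuf },
}

/// Each typed source maps to one cache strategy, with no URL re-parsing.
fn plan(source: &RequirementSource) -> String {
    match source {
        RequirementSource::Registry { specifier } => {
            format!("find a cached registry wheel matching `{specifier}`")
        }
        RequirementSource::Url { url } if url.ends_with(".whl") => {
            format!("exact cache lookup for {url}: the wheel filename is known in advance")
        }
        RequirementSource::Url { url } => {
            format!("most-compatible lookup among wheels built from the sdist at {url}")
        }
        RequirementSource::Git { url } => {
            format!("most-compatible lookup keyed by the git URL {url}")
        }
        RequirementSource::Path { path } => {
            format!("canonicalize {}, then wheel vs. sdist lookup", path.display())
        }
    }
}

fn main() {
    let sources = [
        RequirementSource::Registry { specifier: ">=4.66.2, <5".to_string() },
        RequirementSource::Url { url: "https://example.com/pkg-1.0-py3-none-any.whl".to_string() },
        RequirementSource::Git { url: "https://github.com/tqdm/tqdm".to_string() },
        RequirementSource::Path { path: PathBuf::from("./packages/mollymawk") },
    ];
    for source in &sources {
        println!("{}", plan(source));
    }
}
```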
@@ -2,9 +2,8 @@ use anyhow::Result;
 use std::fmt::Debug;
 use tracing::trace;

-use distribution_types::InstalledDist;
-use pep508_rs::VersionOrUrlRef;
+use distribution_types::{InstalledDirectUrlDist, InstalledDist, RequirementSource};
+use pypi_types::{DirInfo, DirectUrl, VcsInfo, VcsKind};
 use uv_cache::{ArchiveTarget, ArchiveTimestamp};

 #[derive(Debug, Copy, Clone)]

@@ -18,53 +17,139 @@ impl RequirementSatisfaction {
     /// Returns true if a requirement is satisfied by an installed distribution.
     ///
     /// Returns an error if IO fails during a freshness check for a local path.
-    pub(crate) fn check(
-        distribution: &InstalledDist,
-        version_or_url: Option<VersionOrUrlRef>,
-        requirement: impl Debug,
-    ) -> Result<Self> {
+    pub(crate) fn check(distribution: &InstalledDist, source: &RequirementSource) -> Result<Self> {
         trace!(
-            "Comparing installed with requirement: {:?} {:?}",
+            "Comparing installed with source: {:?} {:?}",
             distribution,
-            requirement
+            source
         );
         // Filter out already-installed packages.
-        match version_or_url {
-            // Accept any version of the package.
-            None => return Ok(Self::Satisfied),
-
+        match source {
             // If the requirement comes from a registry, check by name.
-            Some(VersionOrUrlRef::VersionSpecifier(version_specifier)) => {
-                if version_specifier.contains(distribution.version()) {
+            RequirementSource::Registry { specifier, .. } => {
+                if specifier.contains(distribution.version()) {
                     return Ok(Self::Satisfied);
                 }
+                Ok(Self::Mismatch)
+            }
+            RequirementSource::Url {
+                // We use the location since `direct_url.json` also stores this URL, e.g.
+                // `pip install git+https://github.com/tqdm/tqdm@cc372d09dcd5a5eabdc6ed4cf365bdb0be004d44#subdirectory=.`
+                // records `"url": "https://github.com/tqdm/tqdm"` in `direct_url.json`.
+                location: requested_url,
+                subdirectory: requested_subdirectory,
+                url: _,
+            } => {
+                let InstalledDist::Url(InstalledDirectUrlDist {
+                    direct_url,
+                    editable,
+                    ..
+                }) = &distribution
+                else {
+                    return Ok(Self::Mismatch);
+                };
+                let DirectUrl::ArchiveUrl {
+                    url: installed_url,
+                    archive_info: _,
+                    subdirectory: installed_subdirectory,
+                } = direct_url.as_ref()
+                else {
+                    return Ok(Self::Mismatch);
+                };

-            // If the requirement comes from a direct URL, check by URL.
-            Some(VersionOrUrlRef::Url(url)) => {
-                if let InstalledDist::Url(installed) = &distribution {
-                    if !installed.editable && &installed.url == url.raw() {
-                        // If the requirement came from a local path, check freshness.
-                        return if let Some(archive) = (url.scheme() == "file")
-                            .then(|| url.to_file_path().ok())
-                            .flatten()
-                        {
-                            if ArchiveTimestamp::up_to_date_with(
-                                &archive,
-                                ArchiveTarget::Install(distribution),
-                            )? {
-                                return Ok(Self::Satisfied);
-                            }
-                            Ok(Self::OutOfDate)
-                        } else {
-                            // Otherwise, assume the requirement is up-to-date.
-                            Ok(Self::Satisfied)
-                        };
+                if *editable {
+                    return Ok(Self::Mismatch);
                 }
+
+                if &requested_url.to_string() != installed_url
+                    || requested_subdirectory != installed_subdirectory
+                {
+                    return Ok(Self::Mismatch);
                 }
+
+                // If the requirement came from a local path, check freshness.
+                if requested_url.scheme() == "file" {
+                    if let Ok(archive) = requested_url.to_file_path() {
+                        if !ArchiveTimestamp::up_to_date_with(
+                            &archive,
+                            ArchiveTarget::Install(distribution),
+                        )? {
+                            return Ok(Self::OutOfDate);
+                        }
+                    }
+                }
+
+                // Otherwise, assume the requirement is up-to-date.
+                Ok(Self::Satisfied)
+            }
+            RequirementSource::Git {
+                url: _,
+                repository: requested_repository,
+                reference: requested_reference,
+                subdirectory: requested_subdirectory,
+            } => {
+                let InstalledDist::Url(InstalledDirectUrlDist { direct_url, .. }) = &distribution
+                else {
+                    return Ok(Self::Mismatch);
+                };
+                let DirectUrl::VcsUrl {
+                    url: installed_url,
+                    vcs_info:
+                        VcsInfo {
+                            vcs: VcsKind::Git,
+                            requested_revision: installed_reference,
+                            commit_id: _,
+                        },
+                    subdirectory: installed_subdirectory,
+                } = direct_url.as_ref()
+                else {
+                    return Ok(Self::Mismatch);
+                };
+                if &requested_repository.to_string() != installed_url
+                    || requested_subdirectory != installed_subdirectory
+                {
+                    return Ok(Self::Mismatch);
+                }
+                if installed_reference.as_deref() != requested_reference.as_str() {
+                    return Ok(Self::OutOfDate);
+                }
+
+                Ok(Self::Satisfied)
+            }
+            RequirementSource::Path {
+                path,
+                url: requested_url,
+                editable: requested_editable,
+            } => {
+                let InstalledDist::Url(InstalledDirectUrlDist { direct_url, .. }) = &distribution
+                else {
+                    return Ok(Self::Mismatch);
+                };
+                let DirectUrl::LocalDirectory {
+                    url: installed_url,
+                    dir_info:
+                        DirInfo {
+                            editable: installed_editable,
+                        },
+                } = direct_url.as_ref()
+                else {
+                    return Ok(Self::Mismatch);
+                };

+                if &requested_url.to_string() != installed_url
+                    || requested_editable.unwrap_or_default()
+                        != installed_editable.unwrap_or_default()
+                {
+                    return Ok(Self::Mismatch);
+                }
+
+                if !ArchiveTimestamp::up_to_date_with(path, ArchiveTarget::Install(distribution))? {
+                    return Ok(Self::OutOfDate);
+                }
+
+                // Otherwise, assume the requirement is up-to-date.
+                Ok(Self::Satisfied)
+            }
+        }
-
-        Ok(Self::Mismatch)
     }
 }
@@ -7,10 +7,11 @@ use fs_err as fs;
 use rustc_hash::{FxHashMap, FxHashSet};
 use url::Url;

-use distribution_types::{InstalledDist, InstalledMetadata, InstalledVersion, Name};
+use distribution_types::{
+    InstalledDist, Name, Requirement, UnresolvedRequirement, UnresolvedRequirementSpecification,
+};
 use pep440_rs::{Version, VersionSpecifiers};
-use pep508_rs::{Requirement, VerbatimUrl, VersionOrUrlRef};
-use requirements_txt::{EditableRequirement, RequirementEntry, RequirementsTxtRequirement};
+use requirements_txt::EditableRequirement;
 use uv_cache::{ArchiveTarget, ArchiveTimestamp};
 use uv_interpreter::PythonEnvironment;
 use uv_normalize::PackageName;

@@ -113,25 +114,6 @@ impl<'a> SitePackages<'a> {
         self.distributions.iter().flatten()
     }

-    /// Returns an iterator over the installed distributions, represented as requirements.
-    pub fn requirements(&self) -> impl Iterator<Item = Requirement> + '_ {
-        self.iter().map(|dist| Requirement {
-            name: dist.name().clone(),
-            extras: vec![],
-            version_or_url: Some(match dist.installed_version() {
-                InstalledVersion::Version(version) => {
-                    pep508_rs::VersionOrUrl::VersionSpecifier(pep440_rs::VersionSpecifiers::from(
-                        pep440_rs::VersionSpecifier::equals_version(version.clone()),
-                    ))
-                }
-                InstalledVersion::Url(url, ..) => {
-                    pep508_rs::VersionOrUrl::Url(VerbatimUrl::unknown(url.clone()))
-                }
-            }),
-            marker: None,
-        })
-    }
-
     /// Returns the installed distributions for a given package.
     pub fn get_packages(&self, name: &PackageName) -> Vec<&InstalledDist> {
         let Some(indexes) = self.by_name.get(name) else {

@@ -297,11 +279,11 @@ impl<'a> SitePackages<'a> {
     /// Returns if the installed packages satisfy the given requirements.
     pub fn satisfies(
         &self,
-        requirements: &[RequirementEntry],
+        requirements: &[UnresolvedRequirementSpecification],
         editables: &[EditableRequirement],
         constraints: &[Requirement],
     ) -> Result<SatisfiesResult> {
-        let mut stack = Vec::<RequirementEntry>::with_capacity(requirements.len());
+        let mut stack = Vec::with_capacity(requirements.len());
         let mut seen =
             FxHashSet::with_capacity_and_hasher(requirements.len(), BuildHasherDefault::default());

@@ -350,8 +332,10 @@ impl<'a> SitePackages<'a> {
                 self.venv.interpreter().markers(),
                 &requirement.extras,
             ) {
-                let dependency = RequirementEntry {
-                    requirement: RequirementsTxtRequirement::Pep508(dependency),
+                let dependency = UnresolvedRequirementSpecification {
+                    requirement: UnresolvedRequirement::Named(
+                        Requirement::from_pep508(dependency)?,
+                    ),
                     hashes: vec![],
                 };
                 if seen.insert(dependency.clone()) {

@@ -370,42 +354,32 @@ impl<'a> SitePackages<'a> {
         // Verify that all non-editable requirements are met.
         while let Some(entry) = stack.pop() {
             let installed = match &entry.requirement {
-                RequirementsTxtRequirement::Pep508(requirement) => {
-                    self.get_packages(&requirement.name)
-                }
-                RequirementsTxtRequirement::Unnamed(requirement) => {
-                    self.get_urls(requirement.url.raw())
-                }
+                UnresolvedRequirement::Named(requirement) => self.get_packages(&requirement.name),
+                UnresolvedRequirement::Unnamed(requirement) => self.get_urls(requirement.url.raw()),
             };
             match installed.as_slice() {
                 [] => {
                     // The package isn't installed.
-                    return Ok(SatisfiesResult::Unsatisfied(entry.to_string()));
+                    return Ok(SatisfiesResult::Unsatisfied(entry.requirement.to_string()));
                 }
                 [distribution] => {
                     match RequirementSatisfaction::check(
                         distribution,
-                        entry.requirement.version_or_url(),
-                        &entry.requirement,
+                        &entry.requirement.source()?,
                     )? {
                         RequirementSatisfaction::Mismatch | RequirementSatisfaction::OutOfDate => {
-                            return Ok(SatisfiesResult::Unsatisfied(entry.to_string()))
+                            return Ok(SatisfiesResult::Unsatisfied(entry.requirement.to_string()))
                         }
                         RequirementSatisfaction::Satisfied => {}
                     }
                     // Validate that the installed version satisfies the constraints.
                     for constraint in constraints {
-                        match RequirementSatisfaction::check(
-                            distribution,
-                            constraint
-                                .version_or_url
-                                .as_ref()
-                                .map(VersionOrUrlRef::from),
-                            constraint,
-                        )? {
+                        match RequirementSatisfaction::check(distribution, &constraint.source)? {
                             RequirementSatisfaction::Mismatch
                             | RequirementSatisfaction::OutOfDate => {
-                                return Ok(SatisfiesResult::Unsatisfied(constraint.to_string()))
+                                return Ok(SatisfiesResult::Unsatisfied(
+                                    entry.requirement.to_string(),
+                                ))
                             }
                             RequirementSatisfaction::Satisfied => {}
                         }

@@ -422,8 +396,10 @@ impl<'a> SitePackages<'a> {
                 self.venv.interpreter().markers(),
                 entry.requirement.extras(),
             ) {
-                let dependency = RequirementEntry {
-                    requirement: RequirementsTxtRequirement::Pep508(dependency),
+                let dependency = UnresolvedRequirementSpecification {
+                    requirement: UnresolvedRequirement::Named(
+                        Requirement::from_pep508(dependency)?,
+                    ),
                     hashes: vec![],
                 };
                 if seen.insert(dependency.clone()) {

@@ -434,7 +410,7 @@ impl<'a> SitePackages<'a> {
             }
             _ => {
                 // There are multiple installed distributions for the same package.
-                return Ok(SatisfiesResult::Unsatisfied(entry.to_string()));
+                return Ok(SatisfiesResult::Unsatisfied(entry.requirement.to_string()));
             }
         }
     }

@@ -451,7 +427,7 @@ pub enum SatisfiesResult {
     /// All requirements are recursively satisfied.
     Fresh {
         /// The flattened set (transitive closure) of all requirements checked.
-        recursive_requirements: FxHashSet<RequirementEntry>,
+        recursive_requirements: FxHashSet<UnresolvedRequirementSpecification>,
     },
     /// We found an unsatisfied requirement. Since we exit early, we only know about the first
     /// unsatisfied requirement.

@@ -487,7 +463,7 @@ pub enum Diagnostic {
         /// The package that is missing a dependency.
         package: PackageName,
         /// The dependency that is missing.
-        requirement: Requirement,
+        requirement: pep508_rs::Requirement,
     },
     IncompatibleDependency {
         /// The package that has an incompatible dependency.

@@ -495,7 +471,7 @@ pub enum Diagnostic {
         /// The version of the package that is installed.
         version: Version,
         /// The dependency that is incompatible.
-        requirement: Requirement,
+        requirement: pep508_rs::Requirement,
     },
     DuplicatePackage {
         /// The package that has multiple installed distributions.
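As the satisfies hunks above show, satisfaction checks now compare the typed source against what pip/uv recorded in `direct_url.json` (PEP 610) at install time: URL plus subdirectory for archives, repository plus requested revision for Git. A standalone sketch of the archive-URL comparison (simplified types; the freshness check for `file://` URLs is elided):

```rust
/// What the requirement asks for (stand-in for `RequirementSource::Url`).
struct Requested {
    location: String,
    subdirectory: Option<String>,
}

/// What `direct_url.json` recorded at install time (PEP 610 `ArchiveUrl`).
struct Installed {
    url: String,
    subdirectory: Option<String>,
}

#[derive(Debug, PartialEq)]
enum Satisfaction {
    Satisfied,
    Mismatch,
}

/// Mirror of the archive-URL arm: equal URL and equal subdirectory mean the
/// installed distribution already satisfies the requirement.
fn check(requested: &Requested, installed: &Installed) -> Satisfaction {
    if requested.location == installed.url && requested.subdirectory == installed.subdirectory {
        Satisfaction::Satisfied
    } else {
        Satisfaction::Mismatch
    }
}

fn main() {
    let requested = Requested {
        location: "https://github.com/python/importlib_metadata/archive/refs/tags/v7.1.0.zip"
            .into(),
        subdirectory: None,
    };
    let installed = Installed {
        url: "https://github.com/python/importlib_metadata/archive/refs/tags/v7.1.0.zip".into(),
        subdirectory: None,
    };
    assert_eq!(check(&requested, &installed), Satisfaction::Satisfied);
    println!("{:?}", check(&requested, &installed));
}
```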
@@ -13,12 +13,14 @@ license.workspace = true
 cache-key = { workspace = true }
 distribution-filename = { workspace = true }
 distribution-types = { workspace = true }
 pep440_rs = { workspace = true }
 pep508_rs = { workspace = true }
 pypi-types = { workspace = true }
 requirements-txt = { workspace = true, features = ["reqwest"] }
 uv-client = { workspace = true }
 uv-distribution = { workspace = true }
 uv-fs = { workspace = true }
+uv-git = { workspace = true }
 uv-normalize = { workspace = true }
 uv-resolver = { workspace = true, features = ["clap"] }
 uv-types = { workspace = true }

@@ -31,13 +33,23 @@ console = { workspace = true }
 ctrlc = { workspace = true }
 fs-err = { workspace = true, features = ["tokio"] }
 futures = { workspace = true }
+glob = { workspace = true }
 indexmap = { workspace = true }
+path-absolutize = { workspace = true }
 rustc-hash = { workspace = true }
+schemars = { workspace = true, optional = true }
 serde = { workspace = true }
 thiserror = { workspace = true }
 toml = { workspace = true }
 tracing = { workspace = true }
 url = { workspace = true }

+[features]
+schemars = ["dep:schemars"]
+
+[dev-dependencies]
+indoc = "2.0.5"
+insta = "1.38.0"
+
 [lints]
 workspace = true
@@ -6,7 +6,7 @@ pub use crate::unnamed::*;

mod confirm;
mod lookahead;
-mod pyproject;
+pub mod pyproject;
mod source_tree;
mod sources;
mod specification;

@@ -5,8 +5,11 @@ use futures::StreamExt;
use rustc_hash::FxHashSet;
use thiserror::Error;

-use distribution_types::{BuiltDist, Dist, DistributionMetadata, LocalEditable, SourceDist};
-use pep508_rs::{MarkerEnvironment, Requirement, VersionOrUrl};
+use distribution_types::{
+    BuiltDist, Dist, DistributionMetadata, GitSourceDist, LocalEditable, Requirement,
+    RequirementSource, Requirements, SourceDist,
+};
+use pep508_rs::MarkerEnvironment;
use pypi_types::Metadata23;
use uv_client::RegistryClient;
use uv_configuration::{Constraints, Overrides};

@@ -22,6 +25,8 @@ pub enum LookaheadError {
    DownloadAndBuild(SourceDist, #[source] uv_distribution::Error),
    #[error(transparent)]
    UnsupportedUrl(#[from] distribution_types::Error),
+   #[error(transparent)]
+   InvalidRequirement(#[from] distribution_types::ParsedUrlError),
}

/// A resolver for resolving lookahead requirements from direct URLs.

@@ -48,7 +53,7 @@ pub struct LookaheadResolver<'a, Context: BuildContext + Send + Sync> {
    /// The overrides for the project.
    overrides: &'a Overrides,
    /// The editable requirements for the project.
-   editables: &'a [(LocalEditable, Metadata23)],
+   editables: &'a [(LocalEditable, Metadata23, Requirements)],
    /// The required hashes for the project.
    hasher: &'a HashStrategy,
    /// The in-memory index for resolving dependencies.

@@ -64,7 +69,7 @@ impl<'a, Context: BuildContext + Send + Sync> LookaheadResolver<'a, Context> {
        requirements: &'a [Requirement],
        constraints: &'a Constraints,
        overrides: &'a Overrides,
-       editables: &'a [(LocalEditable, Metadata23)],
+       editables: &'a [(LocalEditable, Metadata23, Requirements)],
        hasher: &'a HashStrategy,
        context: &'a Context,
        client: &'a RegistryClient,

@@ -100,21 +105,27 @@ impl<'a, Context: BuildContext + Send + Sync> LookaheadResolver<'a, Context> {
        let mut seen = FxHashSet::default();

        // Queue up the initial requirements.
-       let mut queue: VecDeque<Requirement> = self
+       let mut queue: VecDeque<_> = self
            .constraints
            .apply(self.overrides.apply(self.requirements))
            .filter(|requirement| requirement.evaluate_markers(markers, &[]))
-           .chain(self.editables.iter().flat_map(|(editable, metadata)| {
-               self.constraints
-                   .apply(self.overrides.apply(&metadata.requires_dist))
-                   .filter(|requirement| requirement.evaluate_markers(markers, &editable.extras))
-           }))
+           .chain(
+               self.editables
+                   .iter()
+                   .flat_map(|(editable, _metadata, requirements)| {
+                       self.constraints
+                           .apply(self.overrides.apply(&requirements.dependencies))
+                           .filter(|requirement| {
+                               requirement.evaluate_markers(markers, &editable.extras)
+                           })
+                   }),
+           )
            .cloned()
            .collect();

        while !queue.is_empty() || !futures.is_empty() {
            while let Some(requirement) = queue.pop_front() {
-               if matches!(requirement.version_or_url, Some(VersionOrUrl::Url(_))) {
+               if !matches!(requirement.source, RequirementSource::Registry { .. }) {
                    if seen.insert(requirement.clone()) {
                        futures.push(self.lookahead(requirement));
                    }

@@ -144,14 +155,22 @@ impl<'a, Context: BuildContext + Send + Sync> LookaheadResolver<'a, Context> {
        &self,
        requirement: Requirement,
    ) -> Result<Option<RequestedRequirements>, LookaheadError> {
-       // Determine whether the requirement represents a local distribution.
-       let Some(VersionOrUrl::Url(url)) = requirement.version_or_url.as_ref() else {
-           return Ok(None);
+       // Determine whether the requirement represents a local distribution and convert to a
+       // buildable distribution.
+       let dist = match requirement.source {
+           RequirementSource::Registry { .. } => return Ok(None),
+           RequirementSource::Url { url, .. } => Dist::from_http_url(requirement.name, url)?,
+           RequirementSource::Git { url, .. } => Dist::Source(SourceDist::Git(GitSourceDist {
+               name: requirement.name,
+               url,
+           })),
+           RequirementSource::Path {
+               path: _,
+               url,
+               editable: _,
+           } => Dist::from_file_url(requirement.name, url, false)?,
        };

-       // Convert to a buildable distribution.
-       let dist = Dist::from_url(requirement.name, url.clone())?;
-
        // Fetch the metadata for the distribution.
        let requires_dist = {
            let id = dist.version_id();

@@ -168,7 +187,13 @@ impl<'a, Context: BuildContext + Send + Sync> LookaheadResolver<'a, Context> {
            })
        {
            // If the metadata is already in the index, return it.
-           archive.metadata.requires_dist.clone()
+           archive
+               .metadata
+               .requires_dist
+               .iter()
+               .cloned()
+               .map(Requirement::from_pep508)
+               .collect::<Result<_, _>>()?
        } else {
            // Run the PEP 517 build process to extract metadata from the source distribution.
            let archive = self

@@ -189,6 +214,9 @@ impl<'a, Context: BuildContext + Send + Sync> LookaheadResolver<'a, Context> {
                .insert_metadata(id, MetadataResponse::Found(archive));

            requires_dist
+               .into_iter()
+               .map(Requirement::from_pep508)
+               .collect::<Result<_, _>>()?
        }
    };

@@ -1,23 +1,82 @@
-use indexmap::IndexMap;
-use rustc_hash::FxHashSet;
-use serde::{Deserialize, Serialize};
+//! Reading from `pyproject.toml`
+//! * `project.{dependencies,optional-dependencies}`,
+//! * `tool.uv.sources` and
+//! * `tool.uv.workspace`
+//!
+//! and lowering them into a dependency specification.
+
+use std::collections::HashMap;
+use std::io;
+use std::ops::Deref;
+use std::path::{Path, PathBuf};
+use std::str::FromStr;

-use pep508_rs::Requirement;
-use pypi_types::LenientRequirement;
+use glob::Pattern;
+use indexmap::IndexMap;
+use path_absolutize::Absolutize;
+use rustc_hash::FxHashSet;
+use serde::{Deserialize, Serialize};
+use thiserror::Error;
+use url::Url;
+
+use distribution_types::{ParsedUrlError, Requirement, RequirementSource, Requirements};
+use pep508_rs::{VerbatimUrl, VersionOrUrl};
+use uv_fs::Simplified;
+use uv_git::GitReference;
+use uv_normalize::{ExtraName, PackageName};

use crate::ExtrasSpecification;

+#[derive(Debug, Error)]
+pub enum Pep621Error {
+    #[error(transparent)]
+    Pep508(#[from] pep508_rs::Pep508Error),
+    #[error("You need to specify a `[project]` section to use `[tool.uv.sources]`")]
+    MissingProjectSection,
+    #[error("pyproject.toml section is declared as dynamic, but must be static: `{0}`")]
+    CantBeDynamic(&'static str),
+    #[error("Failed to parse entry for: `{0}`")]
+    LoweringError(PackageName, #[source] LoweringError),
+}
+
+/// An error parsing and merging `tool.uv.sources` with
+/// `project.{dependencies,optional-dependencies}`.
+#[derive(Debug, Error)]
+pub enum LoweringError {
+    #[error("Invalid URL structure")]
+    DirectUrl(#[from] Box<ParsedUrlError>),
+    #[error("Unsupported path (can't convert to URL): `{}`", _0.user_display())]
+    PathToUrl(PathBuf),
+    #[error("The package is not included as workspace package in `tool.uv.workspace`")]
+    UndeclaredWorkspacePackage,
+    #[error("You need to specify a version constraint")]
+    UnconstrainedVersion,
+    #[error("You can only use one of rev, tag or branch")]
+    MoreThanOneGitRef,
+    #[error("You can't combine these options in `tool.uv.sources`")]
+    InvalidEntry,
+    #[error(transparent)]
+    InvalidUrl(#[from] url::ParseError),
+    #[error("You can't combine a url in `project` with `tool.uv.sources`")]
+    ConflictingUrls,
+    /// Note: Infallible on unix and windows.
+    #[error("Could not normalize path: `{0}`")]
+    AbsolutizeError(String, #[source] io::Error),
+    #[error("Fragments are not allowed in URLs: `{0}`")]
+    ForbiddenFragment(Url),
+}
+
/// A `pyproject.toml` as specified in PEP 517.
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq)]
#[serde(rename_all = "kebab-case")]
-pub(crate) struct PyProjectToml {
+pub struct PyProjectToml {
    /// Project metadata
-   pub(crate) project: Option<Project>,
+   pub project: Option<Project>,
+   /// Uv additions
+   pub tool: Option<Tool>,
}

-/// PEP 621 project metadata.
+/// PEP 621 project metadata (`project`).
///
/// This is a subset of the full metadata specification, and only includes the fields that are
/// relevant for extracting static requirements.

@@ -25,19 +84,137 @@ pub(crate) struct PyProjectToml {
/// See <https://packaging.python.org/en/latest/specifications/pyproject-toml>.
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq)]
#[serde(rename_all = "kebab-case")]
-pub(crate) struct Project {
+pub struct Project {
    /// The name of the project
-   pub(crate) name: PackageName,
+   pub name: PackageName,
    /// Project dependencies
-   pub(crate) dependencies: Option<Vec<String>>,
+   pub dependencies: Option<Vec<String>>,
    /// Optional dependencies
-   pub(crate) optional_dependencies: Option<IndexMap<ExtraName, Vec<String>>>,
+   pub optional_dependencies: Option<IndexMap<ExtraName, Vec<String>>>,
    /// Specifies which fields listed by PEP 621 were intentionally unspecified
    /// so another tool can/will provide such metadata dynamically.
-   pub(crate) dynamic: Option<Vec<String>>,
+   pub dynamic: Option<Vec<String>>,
}

-/// The PEP 621 project metadata, with static requirements extracted in advance.
+/// `tool`.
+#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq)]
+#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
+pub struct Tool {
+    pub uv: Option<ToolUv>,
+}
+
+/// `tool.uv`.
+#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq)]
+#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
+#[serde(deny_unknown_fields)]
+pub struct ToolUv {
+    pub sources: Option<HashMap<PackageName, Source>>,
+    pub workspace: Option<ToolUvWorkspace>,
+}
+
+/// `tool.uv.workspace`.
+#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq)]
+#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
+#[serde(deny_unknown_fields)]
+pub struct ToolUvWorkspace {
+    pub members: Option<Vec<SerdePattern>>,
+    pub exclude: Option<Vec<SerdePattern>>,
+}
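Since `tool.uv.workspace` is still a stub, here is a minimal sketch of the shape the struct above deserializes. The member paths are made up, and glob syntax follows the `glob` crate:

```toml
[tool.uv.workspace]
# Globs, serialized as plain strings via SerdePattern (defined below).
members = ["packages/*"]
exclude = ["packages/experimental"]
```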
/// (De)serialize globs as strings.
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq)]
pub struct SerdePattern(#[serde(with = "serde_from_and_to_string")] pub Pattern);

#[cfg(feature = "schemars")]
impl schemars::JsonSchema for SerdePattern {
    fn schema_name() -> String {
        <String as schemars::JsonSchema>::schema_name()
    }

    fn json_schema(gen: &mut schemars::gen::SchemaGenerator) -> schemars::schema::Schema {
        <String as schemars::JsonSchema>::json_schema(gen)
    }
}

impl Deref for SerdePattern {
    type Target = Pattern;

    fn deref(&self) -> &Self::Target {
        &self.0
    }
}

/// A `tool.uv.sources` value.
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq)]
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
#[serde(untagged, deny_unknown_fields)]
pub enum Source {
    /// A remote git repository, either over HTTPS or over SSH.
    ///
    /// Example:
    /// ```toml
    /// flask = { git = "https://github.com/pallets/flask", tag = "3.0.0" }
    /// ```
    Git {
        git: Url,
        /// The path to the directory with the `pyproject.toml` if it is not in the archive root.
        subdirectory: Option<String>,
        // Only one of the three may be used; we validate this later for a better error message.
        rev: Option<String>,
        tag: Option<String>,
        branch: Option<String>,
    },
    /// A remote `http://` or `https://` URL, either a wheel (`.whl`) or a source distribution
    /// (`.zip`, `.tar.gz`).
    ///
    /// Example:
    /// ```toml
    /// flask = { url = "https://files.pythonhosted.org/packages/61/80/ffe1da13ad9300f87c93af113edd0638c75138c42a0994becfacac078c06/flask-3.0.3-py3-none-any.whl" }
    /// ```
    Url {
        url: Url,
        /// For source distributions, the path to the directory with the `pyproject.toml` if it is
        /// not in the archive root.
        subdirectory: Option<String>,
    },
    /// The path to a dependency. It can either be a wheel (a `.whl` file), a source distribution
    /// as an archive (a `.zip` or `.tar.gz` file), or a source distribution as a directory (a
    /// directory with a `pyproject.toml` in it, or a legacy directory with only a `setup.py` but
    /// no `pyproject.toml` in it).
    Path {
        path: String,
        /// `false` by default.
        editable: Option<bool>,
    },
    /// When using a version as a requirement, you can optionally pin the requirement to an index
    /// you defined, e.g. `torch` after configuring `torch` to
    /// `https://download.pytorch.org/whl/cu118`.
    Registry {
        // TODO(konstin): The string is more-or-less a placeholder
        index: String,
    },
    /// A dependency on another package in the workspace.
    Workspace {
        workspace: bool,
        /// `true` by default.
        editable: Option<bool>,
    },
    /// Show a better error message for invalid combinations of options.
    CatchAll {
        git: String,
        subdirectory: Option<String>,
        rev: Option<String>,
        tag: Option<String>,
        branch: Option<String>,
        url: String,
        patch: String,
        index: String,
        workspace: bool,
    },
}
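Taken together, the variants above correspond to `tool.uv.sources` entries like the following. All package names, URLs, and paths here are illustrative, one entry per variant:

```toml
[tool.uv.sources]
# Source::Git: at most one of `rev`, `tag`, and `branch` is accepted.
foo = { git = "https://example.com/org/foo", tag = "v1.2.0" }
# Source::Url: a wheel or a source distribution archive.
bar = { url = "https://example.com/bar-1.0.0-py3-none-any.whl" }
# Source::Path: `editable` defaults to false.
baz = { path = "../baz", editable = true }
# Source::Registry: pins a versioned requirement to a named index.
quux = { index = "my-index" }
# Source::Workspace: the path is looked up in `tool.uv.workspace`.
corge = { workspace = true }
```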
/// The PEP 621 project metadata, with static requirements extracted in advance, joined
/// with `tool.uv.sources`.
#[derive(Debug)]
pub(crate) struct Pep621Metadata {
    /// The name of the project.

@@ -48,12 +225,6 @@ pub(crate) struct Pep621Metadata {
    pub(crate) used_extras: FxHashSet<ExtraName>,
}

-#[derive(thiserror::Error, Debug)]
-pub(crate) enum Pep621Error {
-    #[error(transparent)]
-    Pep508(#[from] pep508_rs::Pep508Error),
-}
-
impl Pep621Metadata {
    /// Extract the static [`Pep621Metadata`] from a [`Project`] and [`ExtrasSpecification`], if
    /// possible.

@@ -63,71 +234,298 @@ impl Pep621Metadata {
    ///
    /// Returns an error if the requirements are not valid PEP 508 requirements.
    pub(crate) fn try_from(
-       project: Project,
+       pyproject: PyProjectToml,
        extras: &ExtrasSpecification,
+       project_dir: &Path,
+       workspace_sources: &HashMap<PackageName, Source>,
+       workspace_packages: &HashMap<PackageName, String>,
    ) -> Result<Option<Self>, Pep621Error> {
+       let project_sources = pyproject
+           .tool
+           .as_ref()
+           .and_then(|tool| tool.uv.as_ref())
+           .and_then(|uv| uv.sources.clone());
+
+       let has_sources = project_sources.is_some() || !workspace_sources.is_empty();
+
+       let Some(project) = pyproject.project else {
+           return if has_sources {
+               Err(Pep621Error::MissingProjectSection)
+           } else {
+               Ok(None)
+           };
+       };
        if let Some(dynamic) = project.dynamic.as_ref() {
            // If the project specifies dynamic dependencies, we can't extract the requirements.
            if dynamic.iter().any(|field| field == "dependencies") {
-               return Ok(None);
+               return if has_sources {
+                   Err(Pep621Error::CantBeDynamic("project.dependencies"))
+               } else {
+                   Ok(None)
+               };
            }
            // If we requested extras, and the project specifies dynamic optional dependencies, we
            // can't extract the requirements.
            if !extras.is_empty() && dynamic.iter().any(|field| field == "optional-dependencies") {
-               return Ok(None);
+               return if has_sources {
+                   Err(Pep621Error::CantBeDynamic("project.optional-dependencies"))
+               } else {
+                   Ok(None)
+               };
            }
        }

-       let name = project.name;
+       let requirements = lower_requirements(
+           &project.dependencies.unwrap_or_default(),
+           &project.optional_dependencies.unwrap_or_default(),
+           &project.name,
+           project_dir,
+           &project_sources.unwrap_or_default(),
+           workspace_sources,
+           workspace_packages,
+       )?;

-       // Parse out the project requirements.
-       let mut requirements = project
-           .dependencies
-           .unwrap_or_default()
-           .iter()
-           .map(String::as_str)
-           .map(|s| LenientRequirement::from_str(s).map(Requirement::from))
-           .collect::<Result<Vec<_>, _>>()?;
+       let mut requirements_with_extras = requirements.dependencies;

        // Include any optional dependencies specified in `extras`.
        let mut used_extras = FxHashSet::default();
        if !extras.is_empty() {
-           if let Some(optional_dependencies) = project.optional_dependencies {
-               // Parse out the optional dependencies.
-               let optional_dependencies = optional_dependencies
-                   .into_iter()
-                   .map(|(extra, requirements)| {
-                       let requirements = requirements
-                           .iter()
-                           .map(String::as_str)
-                           .map(|s| LenientRequirement::from_str(s).map(Requirement::from))
-                           .collect::<Result<Vec<_>, _>>()?;
-                       Ok::<(ExtraName, Vec<Requirement>), Pep621Error>((extra, requirements))
-                   })
-                   .collect::<Result<IndexMap<_, _>, _>>()?;
-
-               // Include the optional dependencies if the extras are requested.
-               for (extra, optional_requirements) in &optional_dependencies {
-                   if extras.contains(extra) {
-                       used_extras.insert(extra.clone());
-                       requirements.extend(flatten_extra(
-                           &name,
-                           optional_requirements,
-                           &optional_dependencies,
-                       ));
-                   }
+           // Include the optional dependencies if the extras are requested.
+           for (extra, optional_requirements) in &requirements.optional_dependencies {
+               if extras.contains(extra) {
+                   used_extras.insert(extra.clone());
+                   requirements_with_extras.extend(flatten_extra(
+                       &project.name,
+                       optional_requirements,
+                       &requirements.optional_dependencies,
+                   ));
                }
            }
        }

        Ok(Some(Self {
-           name,
-           requirements,
+           name: project.name,
+           requirements: requirements_with_extras,
            used_extras,
        }))
    }
}

pub(crate) fn lower_requirements(
    dependencies: &[String],
    optional_dependencies: &IndexMap<ExtraName, Vec<String>>,
    project_name: &PackageName,
    project_dir: &Path,
    project_sources: &HashMap<PackageName, Source>,
    workspace_sources: &HashMap<PackageName, Source>,
    workspace_packages: &HashMap<PackageName, String>,
) -> Result<Requirements, Pep621Error> {
    let dependencies = dependencies
        .iter()
        .map(|dependency| {
            let requirement = pep508_rs::Requirement::from_str(dependency)?;
            let name = requirement.name.clone();
            lower_requirement(
                requirement,
                project_name,
                project_dir,
                project_sources,
                workspace_sources,
                workspace_packages,
            )
            .map_err(|err| Pep621Error::LoweringError(name, err))
        })
        .collect::<Result<_, Pep621Error>>()?;
    let optional_dependencies = optional_dependencies
        .iter()
        .map(|(extra_name, dependencies)| {
            let dependencies: Vec<_> = dependencies
                .iter()
                .map(|dependency| {
                    let requirement = pep508_rs::Requirement::from_str(dependency)?;
                    let name = requirement.name.clone();
                    lower_requirement(
                        requirement,
                        project_name,
                        project_dir,
                        project_sources,
                        workspace_sources,
                        workspace_packages,
                    )
                    .map_err(|err| Pep621Error::LoweringError(name, err))
                })
                .collect::<Result<_, Pep621Error>>()?;
            Ok((extra_name.clone(), dependencies))
        })
        .collect::<Result<_, Pep621Error>>()?;
    Ok(Requirements {
        dependencies,
        optional_dependencies,
    })
}

/// Combine `project.dependencies`/`project.optional-dependencies` with `tool.uv.sources`.
pub(crate) fn lower_requirement(
    requirement: pep508_rs::Requirement,
    project_name: &PackageName,
    project_dir: &Path,
    project_sources: &HashMap<PackageName, Source>,
    workspace_sources: &HashMap<PackageName, Source>,
    workspace_packages: &HashMap<PackageName, String>,
) -> Result<Requirement, LoweringError> {
    let source = project_sources
        .get(&requirement.name)
        .or(workspace_sources.get(&requirement.name))
        .cloned();
    if !matches!(
        source,
        Some(Source::Workspace {
            // By using toml, we technically support `workspace = false`.
            workspace: true,
            ..
        })
    ) && workspace_packages.contains_key(&requirement.name)
    {
        return Err(LoweringError::UndeclaredWorkspacePackage);
    }

    let Some(source) = source else {
        // Support recursive editable inclusions. TODO(konsti): This is a workspace feature.
        return if requirement.version_or_url.is_none() && &requirement.name != project_name {
            Err(LoweringError::UnconstrainedVersion)
        } else {
            Ok(Requirement::from_pep508(requirement).map_err(Box::new)?)
        };
    };

    let source = match source {
        Source::Git {
            git,
            subdirectory,
            rev,
            tag,
            branch,
        } => {
            if matches!(requirement.version_or_url, Some(VersionOrUrl::Url(_))) {
                return Err(LoweringError::ConflictingUrls);
            }
            // TODO(konsti): We know better than this enum
            let reference = match (rev, tag, branch) {
                (None, None, None) => GitReference::DefaultBranch,
                (Some(rev), None, None) => {
                    if rev.len() == 40 {
                        GitReference::FullCommit(rev)
                    } else {
                        GitReference::BranchOrTagOrCommit(rev)
                    }
                }
                (None, Some(tag), None) => GitReference::BranchOrTag(tag),
                (None, None, Some(branch)) => GitReference::BranchOrTag(branch),
                _ => return Err(LoweringError::MoreThanOneGitRef),
            };

            let mut url = Url::parse(&format!("git+{git}"))?;
            let mut given = git.to_string();
            if let Some(rev) = reference.as_str() {
                url.set_path(&format!("{}@{}", url.path(), rev));
                given = format!("{given}@{rev}");
            }
            if let Some(subdirectory) = &subdirectory {
                url.set_fragment(Some(&format!("subdirectory={subdirectory}")));
                given = format!("{given}#subdirectory={subdirectory}");
            }
            let url = VerbatimUrl::from_url(url).with_given(given);
            let repository = url.to_url().clone();
            RequirementSource::Git {
                url,
                repository,
                reference,
                subdirectory: subdirectory.map(PathBuf::from),
            }
        }
        Source::Url { url, subdirectory } => {
            if matches!(requirement.version_or_url, Some(VersionOrUrl::Url(_))) {
                return Err(LoweringError::ConflictingUrls);
            }

            let mut verbatim_url = url.clone();
            if verbatim_url.fragment().is_some() {
                return Err(LoweringError::ForbiddenFragment(url));
            }
            if let Some(subdirectory) = &subdirectory {
                verbatim_url.set_fragment(Some(subdirectory));
            }

            let verbatim_url = VerbatimUrl::from_url(verbatim_url);
            RequirementSource::Url {
                location: url,
                subdirectory: subdirectory.map(PathBuf::from),
                url: verbatim_url,
            }
        }
        Source::Path { path, editable } => {
            if matches!(requirement.version_or_url, Some(VersionOrUrl::Url(_))) {
                return Err(LoweringError::ConflictingUrls);
            }
            path_source(path, project_dir, editable)?
        }
        Source::Registry { index } => match requirement.version_or_url {
            None => return Err(LoweringError::UnconstrainedVersion),
            Some(VersionOrUrl::VersionSpecifier(version)) => RequirementSource::Registry {
                specifier: version,
                index: Some(index),
            },
            Some(VersionOrUrl::Url(_)) => return Err(LoweringError::ConflictingUrls),
        },
        Source::Workspace {
            workspace,
            editable,
        } => {
            if matches!(requirement.version_or_url, Some(VersionOrUrl::Url(_))) {
                return Err(LoweringError::ConflictingUrls);
            }
            if !workspace {
                todo!()
            }
            let path = workspace_packages
                .get(&requirement.name)
                .ok_or(LoweringError::UndeclaredWorkspacePackage)?
                .clone();
            path_source(path, project_dir, editable)?
        }
        Source::CatchAll { .. } => {
            // This is better than a serde error about not matching any enum variant
            return Err(LoweringError::InvalidEntry);
        }
    };
    Ok(Requirement {
        name: requirement.name,
        extras: requirement.extras,
        marker: requirement.marker,
        source,
    })
}
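The `Source::Git` arm above reassembles a pip-style direct URL from the exploded table. Assuming the lowering behaves as written, an entry like the following (hypothetical repository) produces the verbatim URL shown in the comment:

```toml
[tool.uv.sources]
# Lowered to: git+https://example.com/org/foo@v1.2.0#subdirectory=python
foo = { git = "https://example.com/org/foo", tag = "v1.2.0", subdirectory = "python" }
```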
/// Convert a path string to a path source.
fn path_source(
    path: String,
    project_dir: &Path,
    editable: Option<bool>,
) -> Result<RequirementSource, LoweringError> {
    let url = VerbatimUrl::parse_path(&path, project_dir);
    let path_buf = PathBuf::from(&path);
    let path_buf = path_buf
        .absolutize_from(project_dir)
        .map_err(|err| LoweringError::AbsolutizeError(path, err))?
        .to_path_buf();
    Ok(RequirementSource::Path {
        path: path_buf,
        url,
        editable,
    })
}
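Because `path_source` absolutizes relative paths against the directory of the `pyproject.toml` being read, a sibling project can be referenced relatively. A minimal sketch with a made-up layout:

```toml
[tool.uv.sources]
# With this pyproject.toml at /home/user/project/pyproject.toml, `../foo`
# is absolutized to /home/user/foo before being converted into a URL.
foo = { path = "../foo", editable = true }
```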
/// Given an extra in a project that may contain references to the project
/// itself, flatten it into a list of requirements.
///

@@ -189,3 +587,302 @@ fn flatten_extra(
        &mut FxHashSet::default(),
    )
}

/// <https://github.com/serde-rs/serde/issues/1316#issue-332908452>
mod serde_from_and_to_string {
    use std::fmt::Display;
    use std::str::FromStr;

    use serde::{de, Deserialize, Deserializer, Serializer};

    pub(super) fn serialize<T, S>(value: &T, serializer: S) -> Result<S::Ok, S::Error>
    where
        T: Display,
        S: Serializer,
    {
        serializer.collect_str(value)
    }

    pub(super) fn deserialize<'de, T, D>(deserializer: D) -> Result<T, D::Error>
    where
        T: FromStr,
        T::Err: Display,
        D: Deserializer<'de>,
    {
        String::deserialize(deserializer)?
            .parse()
            .map_err(de::Error::custom)
    }
}

#[cfg(test)]
mod test {
    use std::path::Path;

    use anyhow::Context;
    use indoc::indoc;
    use insta::assert_snapshot;

    use uv_fs::Simplified;

    use crate::{ExtrasSpecification, RequirementsSpecification};

    fn from_source(
        contents: &str,
        path: impl AsRef<Path>,
        extras: &ExtrasSpecification,
    ) -> anyhow::Result<RequirementsSpecification> {
        let path = uv_fs::absolutize_path(path.as_ref())?;
        RequirementsSpecification::parse_direct_pyproject_toml(contents, extras, path.as_ref())
            .with_context(|| format!("Failed to parse `{}`", path.user_display()))
    }

    fn format_err(input: &str) -> String {
        let err = from_source(input, "pyproject.toml", &ExtrasSpecification::None).unwrap_err();
        let mut causes = err.chain();
        let mut message = String::new();
        message.push_str(&format!("error: {}\n", causes.next().unwrap()));
        for err in causes {
            message.push_str(&format!("  Caused by: {err}\n"));
        }
        message
    }

    #[test]
    fn conflict_project_and_sources() {
        let input = indoc! {r#"
            [project]
            name = "foo"
            version = "0.0.0"
            dependencies = [
              "tqdm @ git+https://github.com/tqdm/tqdm",
            ]

            [tool.uv.sources]
            tqdm = { url = "https://files.pythonhosted.org/packages/a5/d6/502a859bac4ad5e274255576cd3e15ca273cdb91731bc39fb840dd422ee9/tqdm-4.66.0-py3-none-any.whl" }
        "#};

        assert_snapshot!(format_err(input), @r###"
        error: Failed to parse `pyproject.toml`
          Caused by: Failed to parse entry for: `tqdm`
          Caused by: You can't combine a url in `project` with `tool.uv.sources`
        "###);
    }

    #[test]
    fn too_many_git_specs() {
        let input = indoc! {r#"
            [project]
            name = "foo"
            version = "0.0.0"
            dependencies = [
              "tqdm",
            ]

            [tool.uv.sources]
            tqdm = { git = "https://github.com/tqdm/tqdm", rev = "baaaaaab", tag = "v1.0.0" }
        "#};

        assert_snapshot!(format_err(input), @r###"
        error: Failed to parse `pyproject.toml`
          Caused by: Failed to parse entry for: `tqdm`
          Caused by: You can only use one of rev, tag or branch
        "###);
    }

    #[test]
    fn too_many_git_typo() {
        let input = indoc! {r#"
            [project]
            name = "foo"
            version = "0.0.0"
            dependencies = [
              "tqdm",
            ]

            [tool.uv.sources]
            tqdm = { git = "https://github.com/tqdm/tqdm", ref = "baaaaaab" }
        "#};

        // TODO(konsti): This should tell you the set of valid fields
        assert_snapshot!(format_err(input), @r###"
        error: Failed to parse `pyproject.toml`
          Caused by: TOML parse error at line 9, column 8
          |
        9 | tqdm = { git = "https://github.com/tqdm/tqdm", ref = "baaaaaab" }
          |        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        data did not match any variant of untagged enum Source

        "###);
    }

    #[test]
    fn you_cant_mix_those() {
        let input = indoc! {r#"
            [project]
            name = "foo"
            version = "0.0.0"
            dependencies = [
              "tqdm",
            ]

            [tool.uv.sources]
            tqdm = { path = "tqdm", index = "torch" }
        "#};

        // TODO(konsti): This should tell you the set of valid fields
        assert_snapshot!(format_err(input), @r###"
        error: Failed to parse `pyproject.toml`
          Caused by: TOML parse error at line 9, column 8
          |
        9 | tqdm = { path = "tqdm", index = "torch" }
          |        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        data did not match any variant of untagged enum Source

        "###);
    }

    #[test]
    fn missing_constraint() {
        let input = indoc! {r#"
            [project]
            name = "foo"
            version = "0.0.0"
            dependencies = [
              "tqdm",
            ]
        "#};

        assert_snapshot!(format_err(input), @r###"
        error: Failed to parse `pyproject.toml`
          Caused by: Failed to parse entry for: `tqdm`
          Caused by: You need to specify a version constraint
        "###);
    }

    #[test]
    fn invalid_syntax() {
        let input = indoc! {r#"
            [project]
            name = "foo"
            version = "0.0.0"
            dependencies = [
              "tqdm ==4.66.0",
            ]

            [tool.uv.sources]
            tqdm = { url = invalid url to tqdm-4.66.0-py3-none-any.whl" }
        "#};

        assert_snapshot!(format_err(input), @r###"
        error: Failed to parse `pyproject.toml`
          Caused by: TOML parse error at line 9, column 16
          |
        9 | tqdm = { url = invalid url to tqdm-4.66.0-py3-none-any.whl" }
          |                ^
        invalid string
        expected `"`, `'`

        "###);
    }

    #[test]
    fn invalid_url() {
        let input = indoc! {r#"
            [project]
            name = "foo"
            version = "0.0.0"
            dependencies = [
              "tqdm ==4.66.0",
            ]

            [tool.uv.sources]
            tqdm = { url = "§invalid#+#*Ä" }
        "#};

        assert_snapshot!(format_err(input), @r###"
        error: Failed to parse `pyproject.toml`
          Caused by: TOML parse error at line 9, column 8
          |
        9 | tqdm = { url = "§invalid#+#*Ä" }
          |        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        data did not match any variant of untagged enum Source

        "###);
    }

    #[test]
    fn workspace_and_url_spec() {
        let input = indoc! {r#"
            [project]
            name = "foo"
            version = "0.0.0"
            dependencies = [
              "tqdm @ git+https://github.com/tqdm/tqdm",
            ]

            [tool.uv.sources]
            tqdm = { workspace = true }
        "#};

        assert_snapshot!(format_err(input), @r###"
        error: Failed to parse `pyproject.toml`
          Caused by: Failed to parse entry for: `tqdm`
          Caused by: You can't combine a url in `project` with `tool.uv.sources`
        "###);
    }

    #[test]
    fn missing_workspace_package() {
        let input = indoc! {r#"
            [project]
            name = "foo"
            version = "0.0.0"
            dependencies = [
              "tqdm ==4.66.0",
            ]

            [tool.uv.sources]
            tqdm = { workspace = true }
        "#};

        assert_snapshot!(format_err(input), @r###"
        error: Failed to parse `pyproject.toml`
          Caused by: Failed to parse entry for: `tqdm`
          Caused by: The package is not included as workspace package in `tool.uv.workspace`
        "###);
    }

    #[test]
    fn cant_be_dynamic() {
        let input = indoc! {r#"
            [project]
            name = "foo"
            version = "0.0.0"
            dynamic = [
              "dependencies"
            ]

            [tool.uv.sources]
            tqdm = { workspace = true }
        "#};

        assert_snapshot!(format_err(input), @r###"
        error: Failed to parse `pyproject.toml`
          Caused by: pyproject.toml section is declared as dynamic, but must be static: `project.dependencies`
        "###);
    }

    #[test]
    fn missing_project_section() {
        let input = indoc! {"
            [tool.uv.sources]
            tqdm = { workspace = true }
        "};

        assert_snapshot!(format_err(input), @r###"
        error: Failed to parse `pyproject.toml`
          Caused by: You need to specify a `[project]` section to use `[tool.uv.sources]`
        "###);
    }
}

@@ -5,8 +5,10 @@ use anyhow::{Context, Result};
use futures::{StreamExt, TryStreamExt};
use url::Url;

-use distribution_types::{BuildableSource, HashPolicy, PathSourceUrl, SourceUrl, VersionId};
-use pep508_rs::Requirement;
+use distribution_types::{
+    BuildableSource, HashPolicy, PathSourceUrl, Requirement, SourceUrl, VersionId,
+};
use uv_client::RegistryClient;
use uv_distribution::{DistributionDatabase, Reporter};
use uv_fs::Simplified;

@@ -67,11 +69,15 @@ impl<'a, Context: BuildContext + Send + Sync> SourceTreeResolver<'a, Context> {
            .buffered(50)
            .try_collect()
            .await?;
-       Ok(requirements.into_iter().flatten().collect())
+       Ok(requirements
+           .into_iter()
+           .flatten()
+           .map(Requirement::from_pep508)
+           .collect::<Result<_, _>>()?)
    }

    /// Infer the package name for a given "unnamed" requirement.
-   async fn resolve_source_tree(&self, source_tree: &Path) -> Result<Vec<Requirement>> {
+   async fn resolve_source_tree(&self, source_tree: &Path) -> Result<Vec<pep508_rs::Requirement>> {
        // Convert to a buildable source.
        let path = fs_err::canonicalize(source_tree).with_context(|| {
            format!(

@@ -138,7 +144,7 @@ impl<'a, Context: BuildContext + Send + Sync> SourceTreeResolver<'a, Context> {
            ExtrasSpecification::All => Ok(metadata
                .requires_dist
                .into_iter()
-               .map(|requirement| Requirement {
+               .map(|requirement| pep508_rs::Requirement {
                    marker: requirement
                        .marker
                        .and_then(|marker| marker.simplify_extras(&metadata.provides_extras)),

@@ -148,7 +154,7 @@ impl<'a, Context: BuildContext + Send + Sync> SourceTreeResolver<'a, Context> {
            ExtrasSpecification::Some(extras) => Ok(metadata
                .requires_dist
                .into_iter()
-               .map(|requirement| Requirement {
+               .map(|requirement| pep508_rs::Requirement {
                    marker: requirement
                        .marker
                        .and_then(|marker| marker.simplify_extras(extras)),

@@ -1,12 +1,15 @@
-use std::path::PathBuf;
+use std::collections::HashMap;
+use std::path::{Path, PathBuf};

use anyhow::{Context, Result};
use rustc_hash::FxHashSet;
-use tracing::instrument;
+use tracing::{debug, instrument};

use cache_key::CanonicalUrl;
-use distribution_types::{FlatIndexLocation, IndexUrl};
-use pep508_rs::Requirement;
+use distribution_types::{
+    FlatIndexLocation, IndexUrl, Requirement, UnresolvedRequirement,
+    UnresolvedRequirementSpecification,
+};
use requirements_txt::{
    EditableRequirement, FindLink, RequirementEntry, RequirementsTxt, RequirementsTxtRequirement,
};

@@ -23,11 +26,11 @@ pub struct RequirementsSpecification {
    /// The name of the project specifying requirements.
    pub project: Option<PackageName>,
    /// The requirements for the project.
-   pub requirements: Vec<RequirementEntry>,
+   pub requirements: Vec<UnresolvedRequirementSpecification>,
    /// The constraints for the project.
    pub constraints: Vec<Requirement>,
    /// The overrides for the project.
-   pub overrides: Vec<RequirementEntry>,
+   pub overrides: Vec<UnresolvedRequirementSpecification>,
    /// Package to install as editable installs
    pub editables: Vec<EditableRequirement>,
    /// The source trees from which to extract requirements.

@@ -62,10 +65,12 @@ impl RequirementsSpecification {
            .with_context(|| format!("Failed to parse `{name}`"))?;
        Self {
            project: None,
-           requirements: vec![RequirementEntry {
-               requirement,
-               hashes: vec![],
-           }],
+           requirements: vec![UnresolvedRequirementSpecification::try_from(
+               RequirementEntry {
+                   requirement,
+                   hashes: vec![],
+               },
+           )?],
            constraints: vec![],
            overrides: vec![],
            editables: vec![],

@@ -103,8 +108,16 @@ impl RequirementsSpecification {
            RequirementsTxt::parse(path, std::env::current_dir()?, client_builder).await?;
        Self {
            project: None,
-           requirements: requirements_txt.requirements,
-           constraints: requirements_txt.constraints,
+           requirements: requirements_txt
+               .requirements
+               .into_iter()
+               .map(UnresolvedRequirementSpecification::try_from)
+               .collect::<Result<_, _>>()?,
+           constraints: requirements_txt
+               .constraints
+               .into_iter()
+               .map(Requirement::from_pep508)
+               .collect::<Result<_, _>>()?,
            overrides: vec![],
            editables: requirements_txt.editables,
            source_trees: vec![],

@@ -129,70 +142,13 @@ impl RequirementsSpecification {
                }
            }
            RequirementsSource::PyprojectToml(path) => {
-               let contents = uv_fs::read_to_string(path).await?;
-               let pyproject = toml::from_str::<PyProjectToml>(&contents)
-                   .with_context(|| format!("Failed to parse `{}`", path.user_display()))?;
-
-               // Attempt to read metadata from the `pyproject.toml` directly.
-               //
-               // If we fail to extract the PEP 621 metadata, fall back to treating it as a source
-               // tree, as there are some cases where the `pyproject.toml` may not be a valid PEP
-               // 621 file, but might still resolve under PEP 517. (If the source tree doesn't
-               // resolve under PEP 517, we'll catch that later.)
-               //
-               // For example, Hatch's "Context formatting" API is not compliant with PEP 621, as
-               // it expects dynamic processing by the build backend for the static metadata
-               // fields. See: https://hatch.pypa.io/latest/config/context/
-               if let Some(project) = pyproject
-                   .project
-                   .and_then(|project| Pep621Metadata::try_from(project, extras).ok().flatten())
-               {
-                   Self {
-                       project: Some(project.name),
-                       requirements: project
-                           .requirements
-                           .into_iter()
-                           .map(|requirement| RequirementEntry {
-                               requirement: RequirementsTxtRequirement::Pep508(requirement),
-                               hashes: vec![],
-                           })
-                           .collect(),
-                       constraints: vec![],
-                       overrides: vec![],
-                       editables: vec![],
-                       source_trees: vec![],
-                       extras: project.used_extras,
-                       index_url: None,
-                       extra_index_urls: vec![],
-                       no_index: false,
-                       find_links: vec![],
-                       no_binary: NoBinary::default(),
-                       no_build: NoBuild::default(),
-                   }
-               } else {
-                   let path = fs_err::canonicalize(path)?;
-                   let source_tree = path.parent().ok_or_else(|| {
-                       anyhow::anyhow!(
-                           "The file `{}` appears to be a `pyproject.toml` file, which must be in a directory",
-                           path.user_display()
-                       )
-                   })?;
-                   Self {
-                       project: None,
-                       requirements: vec![],
-                       constraints: vec![],
-                       overrides: vec![],
-                       editables: vec![],
-                       source_trees: vec![source_tree.to_path_buf()],
-                       extras: FxHashSet::default(),
-                       index_url: None,
-                       extra_index_urls: vec![],
-                       no_index: false,
-                       find_links: vec![],
-                       no_binary: NoBinary::default(),
-                       no_build: NoBuild::default(),
-                   }
-               }
+               let contents = uv_fs::read_to_string(&path).await?;
+               // We need to use this path as the base for the relative paths inside
+               // pyproject.toml, so we need the absolute path instead of a potentially
+               // relative path. E.g. with `foo = { path = "../foo" }`, we will join
+               // `../foo` onto this path.
+               let path = uv_fs::absolutize_path(path)?;
+               Self::parse_direct_pyproject_toml(&contents, extras, path.as_ref())
+                   .with_context(|| format!("Failed to parse `{}`", path.user_display()))?
            }
            RequirementsSource::SetupPy(path) | RequirementsSource::SetupCfg(path) => {
                let path = fs_err::canonicalize(path)?;

@@ -221,6 +177,63 @@ impl RequirementsSpecification {
        })
    }

    /// Attempt to read metadata from the `pyproject.toml` directly.
    ///
    /// Since we only use this path for directly included `pyproject.toml` files, we are strict
    /// about PEP 621 and don't allow invalid `project.dependencies` (e.g., Hatch's relative
    /// path support).
    pub(crate) fn parse_direct_pyproject_toml(
        contents: &str,
        extras: &ExtrasSpecification,
        path: &Path,
    ) -> Result<Self> {
        let pyproject = toml::from_str::<PyProjectToml>(contents)?;

        let workspace_sources = HashMap::default();
        let workspace_packages = HashMap::default();
        let project_dir = path
            .parent()
            .context("pyproject.toml has no parent directory")?;
        match Pep621Metadata::try_from(
            pyproject,
            extras,
            project_dir,
            &workspace_sources,
            &workspace_packages,
        ) {
            Ok(Some(project)) => Ok(Self {
                project: Some(project.name),
                requirements: project
                    .requirements
                    .into_iter()
                    .map(|requirement| UnresolvedRequirementSpecification {
                        requirement: UnresolvedRequirement::Named(requirement),
                        hashes: vec![],
                    })
                    .collect(),
                extras: project.used_extras,
                ..Self::default()
            }),
            Ok(None) => {
                debug!("Dynamic pyproject.toml at: `{}`", path.user_display());
                let path = fs_err::canonicalize(path)?;
                let source_tree = path.parent().ok_or_else(|| {
                    anyhow::anyhow!(
                        "The file `{}` appears to be a `pyproject.toml` file, which must be in a directory",
                        path.user_display()
                    )
                })?;
                Ok(Self {
                    project: None,
                    requirements: vec![],
                    source_trees: vec![source_tree.to_path_buf()],
                    ..Self::default()
                })
            }
            Err(err) => Err(err.into()),
        }
    }
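To make the `Ok(None)` fallback concrete: a directly included `pyproject.toml` like the following (hypothetical) declares its dependencies as dynamic, so no static metadata is extracted and the parent directory is handed off as a source tree instead:

```toml
[project]
name = "foo"
version = "0.1.0"
# Dynamic dependencies mean a PEP 517 build is needed to learn the requirements.
dynamic = ["dependencies"]
```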

    /// Read the combined requirements and constraints from a set of sources.
    pub async fn from_sources(
        requirements: &[RequirementsSource],

@@ -271,10 +284,10 @@ impl RequirementsSpecification {
        let source = Self::from_source(source, extras, client_builder).await?;
        for entry in source.requirements {
            match entry.requirement {
-               RequirementsTxtRequirement::Pep508(requirement) => {
+               UnresolvedRequirement::Named(requirement) => {
                    spec.constraints.push(requirement);
                }
-               RequirementsTxtRequirement::Unnamed(requirement) => {
+               UnresolvedRequirement::Unnamed(requirement) => {
                    return Err(anyhow::anyhow!(
                        "Unnamed requirements are not allowed as constraints (found: `{requirement}`)"
                    ));

@@ -10,12 +10,11 @@ use tracing::debug;

use distribution_filename::{SourceDistFilename, WheelFilename};
use distribution_types::{
-    BuildableSource, DirectSourceUrl, GitSourceUrl, PathSourceUrl, RemoteSource, SourceUrl,
-    VersionId,
+    BuildableSource, DirectSourceUrl, GitSourceUrl, PathSourceUrl, RemoteSource, Requirement,
+    SourceUrl, UnresolvedRequirement, UnresolvedRequirementSpecification, VersionId,
};
-use pep508_rs::{Requirement, Scheme, UnnamedRequirement, VersionOrUrl};
+use pep508_rs::{Scheme, UnnamedRequirement, VersionOrUrl};
use pypi_types::Metadata10;
-use requirements_txt::{RequirementEntry, RequirementsTxtRequirement};
use uv_client::RegistryClient;
use uv_distribution::{DistributionDatabase, Reporter};
use uv_normalize::PackageName;

@@ -25,7 +24,7 @@ use uv_types::{BuildContext, HashStrategy};
/// Like [`RequirementsSpecification`], but with concrete names for all requirements.
pub struct NamedRequirementsResolver<'a, Context: BuildContext + Send + Sync> {
    /// The requirements for the project.
-   requirements: Vec<RequirementEntry>,
+   requirements: Vec<UnresolvedRequirementSpecification>,
    /// Whether to check hashes for distributions.
    hasher: &'a HashStrategy,
    /// The in-memory index for resolving dependencies.

@@ -37,7 +36,7 @@ pub struct NamedRequirementsResolver<'a, Context: BuildContext + Send + Sync> {
impl<'a, Context: BuildContext + Send + Sync> NamedRequirementsResolver<'a, Context> {
    /// Instantiate a new [`NamedRequirementsResolver`] for a given set of requirements.
    pub fn new(
-       requirements: Vec<RequirementEntry>,
+       requirements: Vec<UnresolvedRequirementSpecification>,
        hasher: &'a HashStrategy,
        context: &'a Context,
        client: &'a RegistryClient,

@@ -71,10 +70,10 @@ impl<'a, Context: BuildContext + Send + Sync> NamedRequirementsResolver<'a, Cont
        futures::stream::iter(requirements)
            .map(|entry| async {
                match entry.requirement {
-                   RequirementsTxtRequirement::Pep508(requirement) => Ok(requirement),
-                   RequirementsTxtRequirement::Unnamed(requirement) => {
-                       Self::resolve_requirement(requirement, hasher, index, &database).await
-                   }
+                   UnresolvedRequirement::Named(requirement) => Ok(requirement),
+                   UnresolvedRequirement::Unnamed(requirement) => Ok(Requirement::from_pep508(
+                       Self::resolve_requirement(requirement, hasher, index, &database).await?,
+                   )?),
                }
            })
            .buffered(50)

@@ -88,7 +87,7 @@ impl<'a, Context: BuildContext + Send + Sync> NamedRequirementsResolver<'a, Cont
        hasher: &HashStrategy,
        index: &InMemoryIndex,
        database: &DistributionDatabase<'a, Context>,
-   ) -> Result<Requirement> {
+   ) -> Result<pep508_rs::Requirement> {
        // If the requirement is a wheel, extract the package name from the wheel filename.
        //
        // Ex) `anyio-4.3.0-py3-none-any.whl`

@@ -97,7 +96,7 @@ impl<'a, Context: BuildContext + Send + Sync> NamedRequirementsResolver<'a, Cont
            .is_some_and(|ext| ext.eq_ignore_ascii_case("whl"))
        {
            let filename = WheelFilename::from_str(&requirement.url.filename()?)?;
-           return Ok(Requirement {
+           return Ok(pep508_rs::Requirement {
                name: filename.name,
                extras: requirement.extras,
                version_or_url: Some(VersionOrUrl::Url(requirement.url)),

@@ -115,7 +114,7 @@ impl<'a, Context: BuildContext + Send + Sync> NamedRequirementsResolver<'a, Cont
            .ok()
            .and_then(|filename| SourceDistFilename::parsed_normalized_filename(&filename).ok())
        {
-           return Ok(Requirement {
+           return Ok(pep508_rs::Requirement {
                name: filename.name,
                extras: requirement.extras,
                version_or_url: Some(VersionOrUrl::Url(requirement.url)),

@@ -142,7 +141,7 @@ impl<'a, Context: BuildContext + Send + Sync> NamedRequirementsResolver<'a, Cont
                path = path.display(),
                name = metadata.name
            );
-           return Ok(Requirement {
+           return Ok(pep508_rs::Requirement {
                name: metadata.name,
                extras: requirement.extras,
                version_or_url: Some(VersionOrUrl::Url(requirement.url)),

@@ -162,7 +161,7 @@ impl<'a, Context: BuildContext + Send + Sync> NamedRequirementsResolver<'a, Cont
                path = path.display(),
                name = project.name
            );
-           return Ok(Requirement {
+           return Ok(pep508_rs::Requirement {
                name: project.name,
                extras: requirement.extras,
                version_or_url: Some(VersionOrUrl::Url(requirement.url)),

@@ -179,7 +178,7 @@ impl<'a, Context: BuildContext + Send + Sync> NamedRequirementsResolver<'a, Cont
                path = path.display(),
                name = name
            );
-           return Ok(Requirement {
+           return Ok(pep508_rs::Requirement {
                name,
                extras: requirement.extras,
                version_or_url: Some(VersionOrUrl::Url(requirement.url)),

@@ -207,7 +206,7 @@ impl<'a, Context: BuildContext + Send + Sync> NamedRequirementsResolver<'a, Cont
                path = path.display(),
                name = name
            );
-           return Ok(Requirement {
+           return Ok(pep508_rs::Requirement {
                name,
                extras: requirement.extras,
                version_or_url: Some(VersionOrUrl::Url(requirement.url)),

@@ -265,7 +264,7 @@ impl<'a, Context: BuildContext + Send + Sync> NamedRequirementsResolver<'a, Cont
            }
        };

-       Ok(Requirement {
+       Ok(pep508_rs::Requirement {
            name,
            extras: requirement.extras,
            version_or_url: Some(VersionOrUrl::Url(requirement.url)),

@@ -2,32 +2,40 @@ use std::hash::BuildHasherDefault;

use rustc_hash::FxHashMap;

-use distribution_types::LocalEditable;
+use distribution_types::{LocalEditable, Requirements};
use pypi_types::Metadata23;
use uv_normalize::PackageName;

/// A set of editable packages, indexed by package name.
#[derive(Debug, Default, Clone)]
-pub(crate) struct Editables(FxHashMap<PackageName, (LocalEditable, Metadata23)>);
+pub(crate) struct Editables(FxHashMap<PackageName, (LocalEditable, Metadata23, Requirements)>);

impl Editables {
    /// Create a new set of editables from a set of requirements.
-   pub(crate) fn from_requirements(requirements: Vec<(LocalEditable, Metadata23)>) -> Self {
+   pub(crate) fn from_requirements(
+       requirements: Vec<(LocalEditable, Metadata23, Requirements)>,
+   ) -> Self {
        let mut editables =
            FxHashMap::with_capacity_and_hasher(requirements.len(), BuildHasherDefault::default());
-       for (editable_requirement, metadata) in requirements {
-           editables.insert(metadata.name.clone(), (editable_requirement, metadata));
+       for (editable_requirement, metadata, requirements) in requirements {
+           editables.insert(
+               metadata.name.clone(),
+               (editable_requirement, metadata, requirements),
+           );
        }
        Self(editables)
    }

    /// Get the editable for a package.
-   pub(crate) fn get(&self, name: &PackageName) -> Option<&(LocalEditable, Metadata23)> {
+   pub(crate) fn get(
+       &self,
+       name: &PackageName,
+   ) -> Option<&(LocalEditable, Metadata23, Requirements)> {
        self.0.get(name)
    }

    /// Iterate over all editables.
-   pub(crate) fn iter(&self) -> impl Iterator<Item = &(LocalEditable, Metadata23)> {
+   pub(crate) fn iter(&self) -> impl Iterator<Item = &(LocalEditable, Metadata23, Requirements)> {
        self.0.values()
    }
}

@@ -10,7 +10,8 @@ use pubgrub::report::{DefaultStringReporter, DerivationTree, External, Reporter};
use rustc_hash::FxHashMap;

use distribution_types::{
-    BuiltDist, IndexLocations, InstalledDist, PathBuiltDist, PathSourceDist, SourceDist,
+    BuiltDist, IndexLocations, InstalledDist, ParsedUrlError, PathBuiltDist, PathSourceDist,
+    SourceDist,
};
use once_map::OnceMap;
use pep440_rs::Version;

@@ -97,6 +98,10 @@ pub enum ResolveError {
    #[error("In `--require-hashes` mode, all requirements must be pinned upfront with `==`, but found: `{0}`")]
    UnhashedPackage(PackageName),

+   // TODO(konsti): Attach the distribution that contained the invalid requirement as error source.
+   #[error("Failed to parse requirements")]
+   DirectUrl(#[from] Box<ParsedUrlError>),
+
    /// Something unexpected happened.
    #[error("{0}")]
    Failure(String),

@@ -1,6 +1,6 @@
-use distribution_types::LocalEditable;
+use distribution_types::{LocalEditable, Requirement, Requirements};
use either::Either;
-use pep508_rs::{MarkerEnvironment, Requirement};
+use pep508_rs::MarkerEnvironment;
use pypi_types::Metadata23;
use uv_configuration::{Constraints, Overrides};
use uv_normalize::PackageName;

@@ -34,7 +34,7 @@ pub struct Manifest {
    ///
    /// The requirements of the editables should be included in resolution as if they were
    /// direct requirements in their own right.
-   pub(crate) editables: Vec<(LocalEditable, Metadata23)>,
+   pub(crate) editables: Vec<(LocalEditable, Metadata23, Requirements)>,

    /// The installed packages to exclude from consideration during resolution.
    ///

@@ -58,7 +58,7 @@ impl Manifest {
        overrides: Overrides,
        preferences: Vec<Preference>,
        project: Option<PackageName>,
-       editables: Vec<(LocalEditable, Metadata23)>,
+       editables: Vec<(LocalEditable, Metadata23, Requirements)>,
        exclusions: Exclusions,
        lookaheads: Vec<RequestedRequirements>,
    ) -> Self {

@@ -112,9 +112,9 @@ impl Manifest {
                    requirement.evaluate_markers(markers, lookahead.extras())
                })
            })
-           .chain(self.editables.iter().flat_map(|(editable, metadata)| {
+           .chain(self.editables.iter().flat_map(|(editable, _metadata, requirements)| {
                self.overrides
-                   .apply(&metadata.requires_dist)
+                   .apply(&requirements.dependencies)
                    .filter(|requirement| {
                        requirement.evaluate_markers(markers, &editable.extras)
                    })

@@ -159,7 +159,7 @@ impl Manifest {
        &'a self,
        markers: &'a MarkerEnvironment,
        mode: DependencyMode,
-   ) -> impl Iterator<Item = &Requirement> {
+   ) -> impl Iterator<Item = &PackageName> {
        match mode {
            // Include direct requirements, dependencies of editables, and transitive dependencies
            // of local packages.

@@ -174,25 +174,29 @@ impl Manifest {
                        requirement.evaluate_markers(markers, lookahead.extras())
                    })
                })
-               .chain(self.editables.iter().flat_map(|(editable, metadata)| {
-                   self.overrides
-                       .apply(&metadata.requires_dist)
-                       .filter(|requirement| {
-                           requirement.evaluate_markers(markers, &editable.extras)
-                       })
-               }))
+               .chain(self.editables.iter().flat_map(
+                   |(editable, _metadata, uv_requirements)| {
+                       self.overrides.apply(&uv_requirements.dependencies).filter(
+                           |requirement| {
+                               requirement.evaluate_markers(markers, &editable.extras)
+                           },
+                       )
+                   },
+               ))
                .chain(
                    self.overrides
                        .apply(&self.requirements)
                        .filter(|requirement| requirement.evaluate_markers(markers, &[])),
-               ),
+               )
+               .map(|requirement| &requirement.name),
+           ),

            // Restrict to the direct requirements.
            DependencyMode::Direct => Either::Right(
                self.overrides
                    .apply(self.requirements.iter())
-                   .filter(|requirement| requirement.evaluate_markers(markers, &[])),
+                   .filter(|requirement| requirement.evaluate_markers(markers, &[]))
+                   .map(|requirement| &requirement.name),
            ),
        }
    }

|
|
@@ -3,9 +3,9 @@ use std::str::FromStr;
 use rustc_hash::FxHashMap;
 use tracing::trace;

+use distribution_types::{ParsedUrlError, Requirement, RequirementSource};
 use pep440_rs::{Operator, Version};
-use pep508_rs::UnnamedRequirement;
-use pep508_rs::{MarkerEnvironment, Requirement, VersionOrUrl};
+use pep508_rs::{MarkerEnvironment, UnnamedRequirement};
 use pypi_types::{HashDigest, HashError};
 use requirements_txt::{RequirementEntry, RequirementsTxtRequirement};
 use uv_normalize::PackageName;

@@ -16,6 +16,8 @@ pub enum PreferenceError {
     Bare(UnnamedRequirement),
     #[error(transparent)]
     Hash(#[from] HashError),
+    #[error(transparent)]
+    ParsedUrl(#[from] Box<ParsedUrlError>),
 }

 /// A pinned requirement, as extracted from a `requirements.txt` file.

@@ -30,7 +32,9 @@ impl Preference {
     pub fn from_entry(entry: RequirementEntry) -> Result<Self, PreferenceError> {
         Ok(Self {
             requirement: match entry.requirement {
-                RequirementsTxtRequirement::Pep508(requirement) => requirement,
+                RequirementsTxtRequirement::Named(requirement) => {
+                    Requirement::from_pep508(requirement).map_err(Box::new)?
+                }
                 RequirementsTxtRequirement::Unnamed(requirement) => {
                     return Err(PreferenceError::Bare(requirement));
                 }

@@ -96,10 +100,9 @@ impl Preferences {
                     );
                     return None;
                 }
-                match requirement.version_or_url.as_ref() {
-                    Some(VersionOrUrl::VersionSpecifier(version_specifiers)) =>
-                    {
-                        let [version_specifier] = version_specifiers.as_ref() else {
+                match &requirement.source {
+                    RequirementSource::Registry { specifier, ..} => {
+                        let [version_specifier] = specifier.as_ref() else {
                             trace!(
                                 "Excluding {requirement} from preferences due to multiple version specifiers."
                             );

@@ -119,15 +122,12 @@ impl Preferences {
                         },
                         ))
                     }
-                    Some(VersionOrUrl::Url(_)) => {
+                    RequirementSource::Url {..} | RequirementSource::Git { .. } | RequirementSource::Path { .. }=> {
                         trace!(
                             "Excluding {requirement} from preferences due to URL dependency."
                         );
                         None
                     }
-                    _ => {
-                        None
-                    }
                 }
             })
             .collect(),
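The preferences change above is representative of the whole PR: matching on a structured source enum instead of the old `Option<VersionOrUrl>`. A hedged sketch of that filtering rule, where the enum is an illustrative simplification of `distribution_types::RequirementSource` (the real variants also carry index names, git references, subdirectories, and editable flags):

```rust
// Illustrative, simplified mirror of the new source enum.
enum RequirementSource {
    Registry { specifier: Vec<String>, index: Option<String> },
    Url { url: String },
    Git { url: String },
    Path { path: std::path::PathBuf },
}

// A requirement survives as a preference only when it is a registry
// requirement pinned by exactly one specifier; URL-like sources are
// dropped, mirroring the hunk above.
fn preference_specifier(source: &RequirementSource) -> Option<&str> {
    match source {
        RequirementSource::Registry { specifier, .. } => match specifier.as_slice() {
            [single] => Some(single.as_str()),
            _ => None,
        },
        RequirementSource::Url { .. }
        | RequirementSource::Git { .. }
        | RequirementSource::Path { .. } => None,
    }
}
```

Because the enum is exhaustive, the old catch-all `_ => None` arm disappears and the compiler enforces that every source kind is handled.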
@@ -1,6 +1,7 @@
+use distribution_types::RequirementSource;
 use rustc_hash::FxHashSet;

-use pep508_rs::{MarkerEnvironment, VersionOrUrl};
+use pep508_rs::MarkerEnvironment;
 use uv_normalize::PackageName;

 use crate::{DependencyMode, Manifest};

@@ -66,16 +67,11 @@ impl PreReleaseStrategy {
                 manifest
                     .requirements(markers, dependencies)
                     .filter(|requirement| {
-                        let Some(version_or_url) = &requirement.version_or_url else {
+                        let RequirementSource::Registry { specifier, .. } = &requirement.source
+                        else {
                             return false;
                         };
-                        let version_specifiers = match version_or_url {
-                            VersionOrUrl::VersionSpecifier(version_specifiers) => {
-                                version_specifiers
-                            }
-                            VersionOrUrl::Url(_) => return false,
-                        };
-                        version_specifiers
+                        specifier
                             .iter()
                             .any(pep440_rs::VersionSpecifier::any_prerelease)
                     })

@@ -86,16 +82,11 @@ impl PreReleaseStrategy {
                 manifest
                     .requirements(markers, dependencies)
                     .filter(|requirement| {
-                        let Some(version_or_url) = &requirement.version_or_url else {
+                        let RequirementSource::Registry { specifier, .. } = &requirement.source
+                        else {
                             return false;
                         };
-                        let version_specifiers = match version_or_url {
-                            VersionOrUrl::VersionSpecifier(version_specifiers) => {
-                                version_specifiers
-                            }
-                            VersionOrUrl::Url(_) => return false,
-                        };
-                        version_specifiers
+                        specifier
                             .iter()
                             .any(pep440_rs::VersionSpecifier::any_prerelease)
                     })
@@ -3,9 +3,9 @@ use pubgrub::range::Range;
 use rustc_hash::FxHashSet;
 use tracing::warn;

-use distribution_types::Verbatim;
+use distribution_types::{Requirement, RequirementSource, Verbatim};
 use pep440_rs::Version;
-use pep508_rs::{MarkerEnvironment, Requirement, VersionOrUrl};
+use pep508_rs::MarkerEnvironment;
 use uv_configuration::{Constraints, Overrides};
 use uv_normalize::{ExtraName, PackageName};

@@ -185,19 +185,14 @@ fn to_pubgrub(
     urls: &Urls,
     locals: &Locals,
 ) -> Result<(PubGrubPackage, Range<Version>), ResolveError> {
-    match requirement.version_or_url.as_ref() {
-        // The requirement has no specifier (e.g., `flask`).
-        None => Ok((
-            PubGrubPackage::from_package(requirement.name.clone(), extra, urls),
-            Range::full(),
-        )),
-
-        // The requirement has a specifier (e.g., `flask>=1.0`).
-        Some(VersionOrUrl::VersionSpecifier(specifiers)) => {
+    match &requirement.source {
+        RequirementSource::Registry { specifier, .. } => {
+            // TODO(konsti): We're currently losing the index information here, but we need
+            // either pass it to `PubGrubPackage` or the `ResolverProvider` beforehand.
             // If the specifier is an exact version, and the user requested a local version that's
             // more precise than the specifier, use the local version instead.
             let version = if let Some(expected) = locals.get(&requirement.name) {
-                specifiers
+                specifier
                     .iter()
                     .map(|specifier| {
                         Locals::map(expected, specifier)

@@ -208,7 +203,7 @@ fn to_pubgrub(
                         range.intersection(&specifier.into())
                     })?
             } else {
-                specifiers
+                specifier
                     .iter()
                     .map(PubGrubSpecifier::try_from)
                     .fold_ok(Range::full(), |range, specifier| {

@@ -221,13 +216,11 @@ fn to_pubgrub(
                 version,
             ))
         }
-
-        // The requirement has a URL (e.g., `flask @ file:///path/to/flask`).
-        Some(VersionOrUrl::Url(url)) => {
+        RequirementSource::Url { url, .. } => {
             let Some(expected) = urls.get(&requirement.name) else {
                 return Err(ResolveError::DisallowedUrl(
                     requirement.name.clone(),
-                    url.verbatim().to_string(),
+                    url.to_string(),
                 ));
             };

@@ -235,7 +228,49 @@ fn to_pubgrub(
                 return Err(ResolveError::ConflictingUrlsTransitive(
                     requirement.name.clone(),
                     expected.verbatim().to_string(),
-                    url.verbatim().to_string(),
+                    url.to_string(),
                 ));
             }

+            Ok((
+                PubGrubPackage::Package(requirement.name.clone(), extra, Some(expected.clone())),
+                Range::full(),
+            ))
+        }
+        RequirementSource::Git { url, .. } => {
+            let Some(expected) = urls.get(&requirement.name) else {
+                return Err(ResolveError::DisallowedUrl(
+                    requirement.name.clone(),
+                    url.to_string(),
+                ));
+            };
+
+            if !Urls::is_allowed(expected, url) {
+                return Err(ResolveError::ConflictingUrlsTransitive(
+                    requirement.name.clone(),
+                    expected.verbatim().to_string(),
+                    url.to_string(),
+                ));
+            }
+
+            Ok((
+                PubGrubPackage::Package(requirement.name.clone(), extra, Some(expected.clone())),
+                Range::full(),
+            ))
+        }
+        RequirementSource::Path { url, .. } => {
+            let Some(expected) = urls.get(&requirement.name) else {
+                return Err(ResolveError::DisallowedUrl(
+                    requirement.name.clone(),
+                    url.to_string(),
+                ));
+            };
+
+            if !Urls::is_allowed(expected, url) {
+                return Err(ResolveError::ConflictingUrlsTransitive(
+                    requirement.name.clone(),
+                    expected.verbatim().to_string(),
+                    url.to_string(),
+                ));
+            }
+
@@ -13,8 +13,8 @@ use pubgrub::type_aliases::SelectedDependencies;
 use rustc_hash::{FxHashMap, FxHashSet};

 use distribution_types::{
-    Dist, DistributionMetadata, IndexUrl, LocalEditable, Name, ResolvedDist, Verbatim, VersionId,
-    VersionOrUrl,
+    Dist, DistributionMetadata, IndexUrl, LocalEditable, Name, ParsedUrlError, Requirement,
+    ResolvedDist, Verbatim, VersionId, VersionOrUrl,
 };
 use once_map::OnceMap;
 use pep440_rs::Version;

@@ -90,7 +90,8 @@ impl ResolutionGraph {
             match package {
                 PubGrubPackage::Package(package_name, None, None) => {
                     // Create the distribution.
-                    let pinned_package = if let Some((editable, _)) = editables.get(package_name) {
+                    let pinned_package = if let Some((editable, _, _)) = editables.get(package_name)
+                    {
                         Dist::from_editable(package_name.clone(), editable.clone())?.into()
                     } else {
                         pins.get(package_name, version)

@@ -123,7 +124,8 @@ impl ResolutionGraph {
                 }
                 PubGrubPackage::Package(package_name, None, Some(url)) => {
                     // Create the distribution.
-                    let pinned_package = if let Some((editable, _)) = editables.get(package_name) {
+                    let pinned_package = if let Some((editable, _, _)) = editables.get(package_name)
+                    {
                         Dist::from_editable(package_name.clone(), editable.clone())?
                     } else {
                         let url = to_precise(url)

@@ -156,7 +158,7 @@ impl ResolutionGraph {
                     // Validate that the `extra` exists.
                     let dist = PubGrubDistribution::from_registry(package_name, version);

-                    if let Some((editable, metadata)) = editables.get(package_name) {
+                    if let Some((editable, metadata, _)) = editables.get(package_name) {
                         if metadata.provides_extras.contains(extra) {
                             extras
                                 .entry(package_name.clone())

@@ -210,7 +212,7 @@ impl ResolutionGraph {
                     // Validate that the `extra` exists.
                     let dist = PubGrubDistribution::from_url(package_name, url);

-                    if let Some((editable, metadata)) = editables.get(package_name) {
+                    if let Some((editable, metadata, _)) = editables.get(package_name) {
                         if metadata.provides_extras.contains(extra) {
                             extras
                                 .entry(package_name.clone())

@@ -379,7 +381,7 @@ impl ResolutionGraph {
         manifest: &Manifest,
         index: &InMemoryIndex,
         marker_env: &MarkerEnvironment,
-    ) -> pep508_rs::MarkerTree {
+    ) -> Result<pep508_rs::MarkerTree, Box<ParsedUrlError>> {
         use pep508_rs::{
             MarkerExpression, MarkerOperator, MarkerTree, MarkerValue, MarkerValueString,
             MarkerValueVersion,

@@ -449,7 +451,14 @@ impl ResolutionGraph {
                     dist.version_id()
                 )
             };
-            for req in manifest.apply(&archive.metadata.requires_dist) {
+            let requirements: Vec<_> = archive
+                .metadata
+                .requires_dist
+                .iter()
+                .cloned()
+                .map(Requirement::from_pep508)
+                .collect::<Result<_, _>>()?;
+            for req in manifest.apply(requirements.iter()) {
                 let Some(ref marker_tree) = req.marker else {
                     continue;
                 };

@@ -462,7 +471,7 @@ impl ResolutionGraph {
             manifest
                 .editables
                 .iter()
-                .flat_map(|(_, metadata)| &metadata.requires_dist),
+                .flat_map(|(_, _, uv_requirements)| &uv_requirements.dependencies),
         );
         for direct_req in manifest.apply(direct_reqs) {
             let Some(ref marker_tree) = direct_req.marker else {

@@ -495,7 +504,7 @@ impl ResolutionGraph {
             };
             conjuncts.push(MarkerTree::Expression(expr));
         }
-        MarkerTree::And(conjuncts)
+        Ok(MarkerTree::And(conjuncts))
     }

     pub fn lock(&self) -> Result<Lock, LockError> {

@@ -651,7 +660,7 @@ impl std::fmt::Display for DisplayResolutionGraph<'_> {
                 return None;
             }

-            let node = if let Some((editable, _)) = self.resolution.editables.get(name) {
+            let node = if let Some((editable, _, _)) = self.resolution.editables.get(name) {
                 Node::Editable(name, editable)
             } else if self.include_extras {
                 Node::Distribution(
@@ -46,7 +46,7 @@ impl ResolutionStrategy {
             ResolutionMode::LowestDirect => Self::LowestDirect(
                 manifest
                     .user_requirements(markers, dependencies)
-                    .map(|requirement| requirement.name.clone())
+                    .cloned()
                     .collect(),
             ),
         }
@@ -1,12 +1,12 @@
+use std::iter;
 use std::str::FromStr;

-use either::Either;
 use rustc_hash::FxHashMap;

 use distribution_filename::{SourceDistFilename, WheelFilename};
-use distribution_types::RemoteSource;
+use distribution_types::{RemoteSource, RequirementSource};
 use pep440_rs::{Operator, Version, VersionSpecifier, VersionSpecifierBuildError};
-use pep508_rs::{MarkerEnvironment, VersionOrUrl};
+use pep508_rs::MarkerEnvironment;
 use uv_normalize::PackageName;

 use crate::{DependencyMode, Manifest};

@@ -29,10 +29,8 @@ impl Locals {
         // Add all direct requirements and constraints. There's no need to look for conflicts,
         // since conflicts will be enforced by the solver.
         for requirement in manifest.requirements(markers, dependencies) {
-            if let Some(version_or_url) = requirement.version_or_url.as_ref() {
-                for local in iter_locals(version_or_url) {
-                    required.insert(requirement.name.clone(), local);
-                }
+            for local in iter_locals(&requirement.source) {
+                required.insert(requirement.name.clone(), local);
             }
         }

@@ -143,12 +141,14 @@ fn is_compatible(expected: &Version, provided: &Version) -> bool {

 /// If a [`VersionSpecifier`] contains exact equality specifiers for a local version, returns an
 /// iterator over the local versions.
-fn iter_locals(version_or_url: &VersionOrUrl) -> impl Iterator<Item = Version> + '_ {
-    match version_or_url {
+fn iter_locals(source: &RequirementSource) -> Box<dyn Iterator<Item = Version> + '_> {
+    match source {
         // Extract all local versions from specifiers that require an exact version (e.g.,
         // `==1.0.0+local`).
-        VersionOrUrl::VersionSpecifier(specifiers) => Either::Left(
-            specifiers
+        RequirementSource::Registry {
+            specifier: version, ..
+        } => Box::new(
+            version
                 .iter()
                 .filter(|specifier| {
                     matches!(specifier.operator(), Operator::Equal | Operator::ExactEqual)

@@ -158,29 +158,40 @@ fn iter_locals(version_or_url: &VersionOrUrl) -> impl Iterator<Item = Version> + '_ {
         ),
         // Exact a local version from a URL, if it includes a fully-qualified filename (e.g.,
         // `torch-2.2.1%2Bcu118-cp311-cp311-linux_x86_64.whl`).
-        VersionOrUrl::Url(url) => Either::Right(
+        RequirementSource::Url { url, .. } => Box::new(
             url.filename()
                 .ok()
                 .and_then(|filename| {
                     if let Ok(filename) = WheelFilename::from_str(&filename) {
-                        if filename.version.is_local() {
-                            Some(filename.version)
-                        } else {
-                            None
-                        }
+                        Some(filename.version)
                     } else if let Ok(filename) =
                         SourceDistFilename::parsed_normalized_filename(&filename)
                     {
-                        if filename.version.is_local() {
-                            Some(filename.version)
-                        } else {
-                            None
-                        }
+                        Some(filename.version)
                     } else {
                         None
                     }
                 })
-                .into_iter(),
+                .into_iter()
+                .filter(pep440_rs::Version::is_local),
         ),
+        RequirementSource::Git { .. } => Box::new(iter::empty()),
+        RequirementSource::Path { path, .. } => Box::new(
+            path.file_name()
+                .and_then(|filename| {
+                    let filename = filename.to_string_lossy();
+                    if let Ok(filename) = WheelFilename::from_str(&filename) {
+                        Some(filename.version)
+                    } else if let Ok(filename) =
+                        SourceDistFilename::parsed_normalized_filename(&filename)
+                    {
+                        Some(filename.version)
+                    } else {
+                        None
+                    }
+                })
+                .into_iter()
+                .filter(pep440_rs::Version::is_local),
+        ),
     }
 }

@@ -192,47 +203,59 @@ mod tests {
     use anyhow::Result;
     use url::Url;

+    use distribution_types::{ParsedUrl, RequirementSource};
     use pep440_rs::{Operator, Version, VersionSpecifier, VersionSpecifiers};
-    use pep508_rs::{VerbatimUrl, VersionOrUrl};
+    use pep508_rs::VerbatimUrl;

     use crate::resolver::locals::{iter_locals, Locals};

     #[test]
     fn extract_locals() -> Result<()> {
         // Extract from a source distribution in a URL.
-        let version_or_url = VersionOrUrl::Url(VerbatimUrl::from_url(Url::parse(
-            "https://example.com/foo-1.0.0+local.tar.gz",
-        )?));
-        let locals: Vec<_> = iter_locals(&version_or_url).collect();
+        let url = VerbatimUrl::from_url(Url::parse("https://example.com/foo-1.0.0+local.tar.gz")?);
+        let source =
+            RequirementSource::from_parsed_url(ParsedUrl::try_from(url.to_url()).unwrap(), url);
+        let locals: Vec<_> = iter_locals(&source).collect();
         assert_eq!(locals, vec![Version::from_str("1.0.0+local")?]);

         // Extract from a wheel in a URL.
-        let version_or_url = VersionOrUrl::Url(VerbatimUrl::from_url(Url::parse(
+        let url = VerbatimUrl::from_url(Url::parse(
             "https://example.com/foo-1.0.0+local-cp39-cp39-linux_x86_64.whl",
-        )?));
-        let locals: Vec<_> = iter_locals(&version_or_url).collect();
+        )?);
+        let source =
+            RequirementSource::from_parsed_url(ParsedUrl::try_from(url.to_url()).unwrap(), url);
+        let locals: Vec<_> = iter_locals(&source).collect();
         assert_eq!(locals, vec![Version::from_str("1.0.0+local")?]);

         // Don't extract anything if the URL is opaque.
-        let version_or_url = VersionOrUrl::Url(VerbatimUrl::from_url(Url::parse(
-            "git+https://example.com/foo/bar",
-        )?));
-        let locals: Vec<_> = iter_locals(&version_or_url).collect();
+        let url = VerbatimUrl::from_url(Url::parse("git+https://example.com/foo/bar")?);
+        let source =
+            RequirementSource::from_parsed_url(ParsedUrl::try_from(url.to_url()).unwrap(), url);
+        let locals: Vec<_> = iter_locals(&source).collect();
         assert!(locals.is_empty());

         // Extract from `==` specifiers.
-        let version_or_url = VersionOrUrl::VersionSpecifier(VersionSpecifiers::from_iter([
+        let version = VersionSpecifiers::from_iter([
             VersionSpecifier::from_version(Operator::GreaterThan, Version::from_str("1.0.0")?)?,
             VersionSpecifier::from_version(Operator::Equal, Version::from_str("1.0.0+local")?)?,
-        ]));
-        let locals: Vec<_> = iter_locals(&version_or_url).collect();
+        ]);
+        let source = RequirementSource::Registry {
+            specifier: version,
+            index: None,
+        };
+        let locals: Vec<_> = iter_locals(&source).collect();
         assert_eq!(locals, vec![Version::from_str("1.0.0+local")?]);

         // Ignore other specifiers.
-        let version_or_url = VersionOrUrl::VersionSpecifier(VersionSpecifiers::from_iter([
-            VersionSpecifier::from_version(Operator::NotEqual, Version::from_str("1.0.0+local")?)?,
-        ]));
-        let locals: Vec<_> = iter_locals(&version_or_url).collect();
+        let version = VersionSpecifiers::from_iter([VersionSpecifier::from_version(
+            Operator::NotEqual,
+            Version::from_str("1.0.0+local")?,
+        )?]);
+        let source = RequirementSource::Registry {
+            specifier: version,
+            index: None,
+        };
+        let locals: Vec<_> = iter_locals(&source).collect();
         assert!(locals.is_empty());

         Ok(())
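The new `Url` and `Path` arms derive local versions from distribution filenames. A rough sketch of that rule, assuming a plain string split in place of the real `WheelFilename`/`SourceDistFilename` parsers (the function name is illustrative):

```rust
// Take the version component of a wheel filename and keep it only when it
// carries a PEP 440 local segment (e.g. `2.2.1+cu118`).
fn local_version_from_wheel_name(filename: &str) -> Option<String> {
    // e.g. "torch-2.2.1+cu118-cp311-cp311-linux_x86_64.whl" -> "2.2.1+cu118"
    let stem = filename.strip_suffix(".whl")?;
    let version = stem.split('-').nth(1)?;
    version.contains('+').then(|| version.to_string())
}

fn main() {
    assert_eq!(
        local_version_from_wheel_name("torch-2.2.1+cu118-cp311-cp311-linux_x86_64.whl"),
        Some("2.2.1+cu118".to_string())
    );
    // No local segment: nothing to pin.
    assert_eq!(local_version_from_wheel_name("tqdm-4.66.2-py3-none-any.whl"), None);
}
```

Git sources intentionally yield nothing here: a commit reference carries no filename to derive a local version from, hence the `iter::empty()` arm.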
@@ -18,11 +18,12 @@ use tracing::{debug, enabled, info_span, instrument, trace, warn, Instrument, Level};

 use distribution_types::{
     BuiltDist, Dist, DistributionMetadata, IncompatibleDist, IncompatibleSource, IncompatibleWheel,
-    InstalledDist, RemoteSource, ResolvedDist, ResolvedDistRef, SourceDist, VersionOrUrl,
+    InstalledDist, RemoteSource, Requirement, ResolvedDist, ResolvedDistRef, SourceDist,
+    VersionOrUrl,
 };
 pub(crate) use locals::Locals;
 use pep440_rs::{Version, MIN_VERSION};
-use pep508_rs::{MarkerEnvironment, Requirement};
+use pep508_rs::MarkerEnvironment;
 use platform_tags::Tags;
 use pypi_types::Metadata23;
 pub(crate) use urls::Urls;

@@ -610,7 +611,7 @@ impl<
         debug!("Searching for a compatible version of {package} @ {url} ({range})");

         // If the dist is an editable, return the version from the editable metadata.
-        if let Some((_local, metadata)) = self.editables.get(package_name) {
+        if let Some((_local, metadata, _)) = self.editables.get(package_name) {
             let version = &metadata.version;

             // The version is incompatible with the requirement.

@@ -830,7 +831,7 @@ impl<
         }

         // Add a dependency on each editable.
-        for (editable, metadata) in self.editables.iter() {
+        for (editable, metadata, _) in self.editables.iter() {
             let package =
                 PubGrubPackage::from_package(metadata.name.clone(), None, &self.urls);
             let version = Range::singleton(metadata.version.clone());

@@ -885,9 +886,16 @@ impl<
         }

         // Determine if the distribution is editable.
-        if let Some((_local, metadata)) = self.editables.get(package_name) {
+        if let Some((_local, metadata, _)) = self.editables.get(package_name) {
+            let requirements: Vec<_> = metadata
+                .requires_dist
+                .iter()
+                .cloned()
+                .map(Requirement::from_pep508)
+                .collect::<Result<_, _>>()
+                .map_err(Box::new)?;
             let constraints = PubGrubDependencies::from_requirements(
-                &metadata.requires_dist,
+                &requirements,
                 &self.constraints,
                 &self.overrides,
                 Some(package_name),

@@ -995,8 +1003,15 @@ impl<
             }
         };

+        let requirements: Vec<_> = metadata
+            .requires_dist
+            .iter()
+            .cloned()
+            .map(Requirement::from_pep508)
+            .collect::<Result<_, _>>()
+            .map_err(Box::new)?;
         let constraints = PubGrubDependencies::from_requirements(
-            &metadata.requires_dist,
+            &requirements,
             &self.constraints,
             &self.overrides,
             Some(package_name),
@@ -1,7 +1,7 @@
 use rustc_hash::FxHashMap;
 use tracing::debug;

-use distribution_types::Verbatim;
+use distribution_types::{RequirementSource, Verbatim};
 use pep508_rs::{MarkerEnvironment, VerbatimUrl};
 use uv_distribution::is_same_reference;
 use uv_normalize::PackageName;

@@ -20,8 +20,8 @@ impl Urls {
     ) -> Result<Self, ResolveError> {
         let mut urls: FxHashMap<PackageName, VerbatimUrl> = FxHashMap::default();

-        // Add the themselves to the list of required URLs.
-        for (editable, metadata) in &manifest.editables {
+        // Add the editables themselves to the list of required URLs.
+        for (editable, metadata, _) in &manifest.editables {
             if let Some(previous) = urls.insert(metadata.name.clone(), editable.url.clone()) {
                 if !is_equal(&previous, &editable.url) {
                     if is_same_reference(&previous, &editable.url) {

@@ -39,12 +39,11 @@ impl Urls {

         // Add all direct requirements and constraints. If there are any conflicts, return an error.
         for requirement in manifest.requirements(markers, dependencies) {
-            if let Some(pep508_rs::VersionOrUrl::Url(url)) = &requirement.version_or_url {
-                if let Some(previous) = urls.insert(requirement.name.clone(), url.clone()) {
-                    if !is_equal(&previous, url) {
-                        if is_same_reference(&previous, url) {
-                            debug!("Allowing {url} as a variant of {previous}");
-                        } else {
+            match &requirement.source {
+                RequirementSource::Registry { .. } => {}
+                RequirementSource::Url { url, .. } | RequirementSource::Path { url, .. } => {
+                    if let Some(previous) = urls.insert(requirement.name.clone(), url.clone()) {
+                        if !is_equal(&previous, url) {
                             return Err(ResolveError::ConflictingUrlsDirect(
                                 requirement.name.clone(),
                                 previous.verbatim().to_string(),

@@ -53,6 +52,21 @@ impl Urls {
                         }
                     }
                 }
+                RequirementSource::Git { url, .. } => {
+                    if let Some(previous) = urls.insert(requirement.name.clone(), url.clone()) {
+                        if !is_equal(&previous, url) {
+                            if is_same_reference(&previous, url) {
+                                debug!("Allowing {url} as a variant of {previous}");
+                            } else {
+                                return Err(ResolveError::ConflictingUrlsDirect(
+                                    requirement.name.clone(),
+                                    previous.verbatim().to_string(),
+                                    url.verbatim().to_string(),
+                                ));
+                            }
+                        }
+                    }
+                }
             }
         }

@@ -1,7 +1,8 @@
+use distribution_types::RequirementSource;
 use rustc_hash::{FxHashMap, FxHashSet};

 use pep440_rs::Version;
-use pep508_rs::{MarkerEnvironment, VersionOrUrl};
+use pep508_rs::MarkerEnvironment;
 use uv_normalize::PackageName;

 use crate::{DependencyMode, Manifest, Preference};

@@ -23,11 +24,10 @@ impl AllowedYanks {
             .requirements(markers, dependencies)
             .chain(manifest.preferences.iter().map(Preference::requirement))
         {
-            let Some(VersionOrUrl::VersionSpecifier(specifiers)) = &requirement.version_or_url
-            else {
+            let RequirementSource::Registry { specifier, .. } = &requirement.source else {
                 continue;
             };
-            let [specifier] = specifiers.as_ref() else {
+            let [specifier] = specifier.as_ref() else {
                 continue;
             };
             if matches!(
@@ -10,8 +10,8 @@ use anyhow::Result;
 use chrono::{DateTime, Utc};
 use once_cell::sync::Lazy;

-use distribution_types::{IndexLocations, Resolution, SourceDist};
-use pep508_rs::{MarkerEnvironment, Requirement, StringVersion};
+use distribution_types::{IndexLocations, Requirement, Resolution, SourceDist};
+use pep508_rs::{MarkerEnvironment, StringVersion};
 use platform_tags::{Arch, Os, Platform, Tags};
 use uv_cache::Cache;
 use uv_client::RegistryClientBuilder;

@@ -154,7 +154,10 @@ macro_rules! assert_snapshot {

 #[tokio::test]
 async fn black() -> Result<()> {
-    let manifest = Manifest::simple(vec![Requirement::from_str("black<=23.9.1").unwrap()]);
+    let manifest = Manifest::simple(vec![Requirement::from_pep508(
+        pep508_rs::Requirement::from_str("black<=23.9.1").unwrap(),
+    )
+    .unwrap()]);
     let options = OptionsBuilder::new()
         .exclude_newer(Some(*EXCLUDE_NEWER))
         .build();

@@ -180,9 +183,10 @@ async fn black() -> Result<()> {

 #[tokio::test]
 async fn black_colorama() -> Result<()> {
-    let manifest = Manifest::simple(vec![
-        Requirement::from_str("black[colorama]<=23.9.1").unwrap()
-    ]);
+    let manifest = Manifest::simple(vec![Requirement::from_pep508(
+        pep508_rs::Requirement::from_str("black[colorama]<=23.9.1").unwrap(),
+    )
+    .unwrap()]);
     let options = OptionsBuilder::new()
         .exclude_newer(Some(*EXCLUDE_NEWER))
         .build();

@@ -211,9 +215,10 @@ async fn black_colorama() -> Result<()> {
 /// Resolve Black with an invalid extra. The resolver should ignore the extra.
 #[tokio::test]
 async fn black_tensorboard() -> Result<()> {
-    let manifest = Manifest::simple(vec![
-        Requirement::from_str("black[tensorboard]<=23.9.1").unwrap()
-    ]);
+    let manifest = Manifest::simple(vec![Requirement::from_pep508(
+        pep508_rs::Requirement::from_str("black[tensorboard]<=23.9.1").unwrap(),
+    )
+    .unwrap()]);
     let options = OptionsBuilder::new()
         .exclude_newer(Some(*EXCLUDE_NEWER))
         .build();

@@ -239,7 +244,10 @@ async fn black_tensorboard() -> Result<()> {

 #[tokio::test]
 async fn black_python_310() -> Result<()> {
-    let manifest = Manifest::simple(vec![Requirement::from_str("black<=23.9.1").unwrap()]);
+    let manifest = Manifest::simple(vec![Requirement::from_pep508(
+        pep508_rs::Requirement::from_str("black<=23.9.1").unwrap(),
+    )
+    .unwrap()]);
     let options = OptionsBuilder::new()
         .exclude_newer(Some(*EXCLUDE_NEWER))
         .build();

@@ -272,10 +280,14 @@ async fn black_python_310() -> Result<()> {
 #[tokio::test]
 async fn black_mypy_extensions() -> Result<()> {
     let manifest = Manifest::new(
-        vec![Requirement::from_str("black<=23.9.1").unwrap()],
-        Constraints::from_requirements(vec![
-            Requirement::from_str("mypy-extensions<0.4.4").unwrap()
-        ]),
+        vec![
+            Requirement::from_pep508(pep508_rs::Requirement::from_str("black<=23.9.1").unwrap())
+                .unwrap(),
+        ],
+        Constraints::from_requirements(vec![Requirement::from_pep508(
+            pep508_rs::Requirement::from_str("mypy-extensions<0.4.4").unwrap(),
+        )
+        .unwrap()]),
         Overrides::default(),
         vec![],
         None,

@@ -311,10 +323,14 @@ async fn black_mypy_extensions() -> Result<()> {
 #[tokio::test]
 async fn black_mypy_extensions_extra() -> Result<()> {
     let manifest = Manifest::new(
-        vec![Requirement::from_str("black<=23.9.1").unwrap()],
-        Constraints::from_requirements(vec![
-            Requirement::from_str("mypy-extensions[extra]<0.4.4").unwrap()
-        ]),
+        vec![
+            Requirement::from_pep508(pep508_rs::Requirement::from_str("black<=23.9.1").unwrap())
+                .unwrap(),
+        ],
+        Constraints::from_requirements(vec![Requirement::from_pep508(
+            pep508_rs::Requirement::from_str("mypy-extensions[extra]<0.4.4").unwrap(),
+        )
+        .unwrap()]),
         Overrides::default(),
         vec![],
         None,

@@ -350,8 +366,14 @@ async fn black_mypy_extensions_extra() -> Result<()> {
 #[tokio::test]
 async fn black_flake8() -> Result<()> {
     let manifest = Manifest::new(
-        vec![Requirement::from_str("black<=23.9.1").unwrap()],
-        Constraints::from_requirements(vec![Requirement::from_str("flake8<1").unwrap()]),
+        vec![
+            Requirement::from_pep508(pep508_rs::Requirement::from_str("black<=23.9.1").unwrap())
+                .unwrap(),
+        ],
+        Constraints::from_requirements(vec![Requirement::from_pep508(
+            pep508_rs::Requirement::from_str("flake8<1").unwrap(),
+        )
+        .unwrap()]),
         Overrides::default(),
         vec![],
         None,

@@ -384,7 +406,10 @@ async fn black_flake8() -> Result<()> {

 #[tokio::test]
 async fn black_lowest() -> Result<()> {
-    let manifest = Manifest::simple(vec![Requirement::from_str("black>21").unwrap()]);
+    let manifest = Manifest::simple(vec![Requirement::from_pep508(
+        pep508_rs::Requirement::from_str("black>21").unwrap(),
+    )
+    .unwrap()]);
     let options = OptionsBuilder::new()
         .resolution_mode(ResolutionMode::Lowest)
         .exclude_newer(Some(*EXCLUDE_NEWER))

@@ -411,7 +436,10 @@ async fn black_lowest() -> Result<()> {

 #[tokio::test]
 async fn black_lowest_direct() -> Result<()> {
-    let manifest = Manifest::simple(vec![Requirement::from_str("black>21").unwrap()]);
+    let manifest = Manifest::simple(vec![Requirement::from_pep508(
+        pep508_rs::Requirement::from_str("black>21").unwrap(),
+    )
+    .unwrap()]);
     let options = OptionsBuilder::new()
         .resolution_mode(ResolutionMode::LowestDirect)
         .exclude_newer(Some(*EXCLUDE_NEWER))

@@ -439,12 +467,12 @@ async fn black_lowest_direct() -> Result<()> {
 #[tokio::test]
 async fn black_respect_preference() -> Result<()> {
     let manifest = Manifest::new(
-        vec![Requirement::from_str("black<=23.9.1")?],
+        vec![Requirement::from_pep508(pep508_rs::Requirement::from_str("black<=23.9.1")?).unwrap()],
         Constraints::default(),
         Overrides::default(),
-        vec![Preference::from_requirement(Requirement::from_str(
-            "black==23.9.0",
-        )?)],
+        vec![Preference::from_requirement(
+            Requirement::from_pep508(pep508_rs::Requirement::from_str("black==23.9.0")?).unwrap(),
+        )],
         None,
         vec![],
         Exclusions::default(),

@@ -477,12 +505,12 @@ async fn black_respect_preference() -> Result<()> {
 #[tokio::test]
 async fn black_ignore_preference() -> Result<()> {
     let manifest = Manifest::new(
-        vec![Requirement::from_str("black<=23.9.1")?],
+        vec![Requirement::from_pep508(pep508_rs::Requirement::from_str("black<=23.9.1")?).unwrap()],
         Constraints::default(),
         Overrides::default(),
-        vec![Preference::from_requirement(Requirement::from_str(
-            "black==23.9.2",
-        )?)],
+        vec![Preference::from_requirement(
+            Requirement::from_pep508(pep508_rs::Requirement::from_str("black==23.9.2")?).unwrap(),
+        )],
         None,
         vec![],
         Exclusions::default(),

@@ -513,7 +541,10 @@ async fn black_ignore_preference() -> Result<()> {

 #[tokio::test]
 async fn black_disallow_prerelease() -> Result<()> {
-    let manifest = Manifest::simple(vec![Requirement::from_str("black<=20.0").unwrap()]);
+    let manifest = Manifest::simple(vec![Requirement::from_pep508(
+        pep508_rs::Requirement::from_str("black<=20.0").unwrap(),
+    )
+    .unwrap()]);
     let options = OptionsBuilder::new()
         .prerelease_mode(PreReleaseMode::Disallow)
         .exclude_newer(Some(*EXCLUDE_NEWER))

@@ -534,7 +565,10 @@ async fn black_disallow_prerelease() -> Result<()> {

 #[tokio::test]
 async fn black_allow_prerelease_if_necessary() -> Result<()> {
-    let manifest = Manifest::simple(vec![Requirement::from_str("black<=20.0").unwrap()]);
+    let manifest = Manifest::simple(vec![Requirement::from_pep508(
+        pep508_rs::Requirement::from_str("black<=20.0").unwrap(),
+    )
+    .unwrap()]);
     let options = OptionsBuilder::new()
         .prerelease_mode(PreReleaseMode::IfNecessary)
         .exclude_newer(Some(*EXCLUDE_NEWER))

@@ -555,7 +589,10 @@ async fn black_allow_prerelease_if_necessary() -> Result<()> {

 #[tokio::test]
 async fn pylint_disallow_prerelease() -> Result<()> {
-    let manifest = Manifest::simple(vec![Requirement::from_str("pylint==2.3.0").unwrap()]);
+    let manifest = Manifest::simple(vec![Requirement::from_pep508(
+        pep508_rs::Requirement::from_str("pylint==2.3.0").unwrap(),
+    )
+    .unwrap()]);
     let options = OptionsBuilder::new()
         .prerelease_mode(PreReleaseMode::Disallow)
         .exclude_newer(Some(*EXCLUDE_NEWER))

@@ -578,7 +615,10 @@ async fn pylint_disallow_prerelease() -> Result<()> {

 #[tokio::test]
 async fn pylint_allow_prerelease() -> Result<()> {
-    let manifest = Manifest::simple(vec![Requirement::from_str("pylint==2.3.0").unwrap()]);
+    let manifest = Manifest::simple(vec![Requirement::from_pep508(
+        pep508_rs::Requirement::from_str("pylint==2.3.0").unwrap(),
+    )
+    .unwrap()]);
     let options = OptionsBuilder::new()
         .prerelease_mode(PreReleaseMode::Allow)
         .exclude_newer(Some(*EXCLUDE_NEWER))

@@ -602,8 +642,10 @@ async fn pylint_allow_prerelease() -> Result<()> {
 #[tokio::test]
 async fn pylint_allow_explicit_prerelease_without_marker() -> Result<()> {
     let manifest = Manifest::simple(vec![
-        Requirement::from_str("pylint==2.3.0").unwrap(),
-        Requirement::from_str("isort>=5.0.0").unwrap(),
+        Requirement::from_pep508(pep508_rs::Requirement::from_str("pylint==2.3.0").unwrap())
+            .unwrap(),
+        Requirement::from_pep508(pep508_rs::Requirement::from_str("isort>=5.0.0").unwrap())
+            .unwrap(),
     ]);
     let options = OptionsBuilder::new()
         .prerelease_mode(PreReleaseMode::Explicit)

@@ -628,8 +670,10 @@ async fn pylint_allow_explicit_prerelease_without_marker() -> Result<()> {
 #[tokio::test]
 async fn pylint_allow_explicit_prerelease_with_marker() -> Result<()> {
     let manifest = Manifest::simple(vec![
-        Requirement::from_str("pylint==2.3.0").unwrap(),
-        Requirement::from_str("isort>=5.0.0b").unwrap(),
+        Requirement::from_pep508(pep508_rs::Requirement::from_str("pylint==2.3.0").unwrap())
+            .unwrap(),
+        Requirement::from_pep508(pep508_rs::Requirement::from_str("isort>=5.0.0b").unwrap())
+            .unwrap(),
     ]);
     let options = OptionsBuilder::new()
         .prerelease_mode(PreReleaseMode::Explicit)

@@ -655,7 +699,10 @@ async fn pylint_allow_explicit_prerelease_with_marker() -> Result<()> {
 /// fail with a pre-release-centric hint.
 #[tokio::test]
 async fn msgraph_sdk() -> Result<()> {
-    let manifest = Manifest::simple(vec![Requirement::from_str("msgraph-sdk==1.0.0").unwrap()]);
+    let manifest = Manifest::simple(vec![Requirement::from_pep508(
+        pep508_rs::Requirement::from_str("msgraph-sdk==1.0.0").unwrap(),
+    )
+    .unwrap()]);
     let options = OptionsBuilder::new()
         .exclude_newer(Some(*EXCLUDE_NEWER))
         .build();
@@ -3,10 +3,12 @@ use std::str::FromStr;
 use rustc_hash::FxHashMap;
 use url::Url;

-use distribution_types::{DistributionMetadata, HashPolicy, PackageId};
-use pep508_rs::{MarkerEnvironment, VersionOrUrl};
+use distribution_types::{
+    DistributionMetadata, HashPolicy, PackageId, Requirement, RequirementSource,
+    UnresolvedRequirement,
+};
+use pep508_rs::MarkerEnvironment;
 use pypi_types::{HashDigest, HashError};
-use requirements_txt::RequirementsTxtRequirement;
 use uv_normalize::PackageName;

 #[derive(Debug, Clone)]

@@ -81,9 +83,9 @@ impl HashStrategy {
         }
     }

-    /// Generate the required hashes from a set of [`RequirementsTxtRequirement`] entries.
+    /// Generate the required hashes from a set of [`UnresolvedRequirement`] entries.
     pub fn from_requirements<'a>(
-        requirements: impl Iterator<Item = (&'a RequirementsTxtRequirement, &'a [String])>,
+        requirements: impl Iterator<Item = (&'a UnresolvedRequirement, &'a [String])>,
         markers: &MarkerEnvironment,
     ) -> Result<Self, HashStrategyError> {
         let mut hashes = FxHashMap::<PackageId, Vec<HashDigest>>::default();

@@ -97,37 +99,10 @@ impl HashStrategy {

             // Every requirement must be either a pinned version or a direct URL.
             let id = match &requirement {
-                RequirementsTxtRequirement::Pep508(requirement) => {
-                    match requirement.version_or_url.as_ref() {
-                        Some(VersionOrUrl::Url(url)) => {
-                            // Direct URLs are always allowed.
-                            PackageId::from_url(url)
-                        }
-                        Some(VersionOrUrl::VersionSpecifier(specifiers)) => {
-                            // Must be a single specifier.
-                            let [specifier] = specifiers.as_ref() else {
-                                return Err(HashStrategyError::UnpinnedRequirement(
-                                    requirement.to_string(),
-                                ));
-                            };
-
-                            // Must be pinned to a specific version.
-                            if *specifier.operator() != pep440_rs::Operator::Equal {
-                                return Err(HashStrategyError::UnpinnedRequirement(
-                                    requirement.to_string(),
-                                ));
-                            }
-
-                            PackageId::from_registry(requirement.name.clone())
-                        }
-                        None => {
-                            return Err(HashStrategyError::UnpinnedRequirement(
-                                requirement.to_string(),
-                            ));
-                        }
-                    }
-                }
-                RequirementsTxtRequirement::Unnamed(requirement) => {
+                UnresolvedRequirement::Named(requirement) => {
+                    uv_requirement_to_package_id(requirement)?
+                }
+                UnresolvedRequirement::Unnamed(requirement) => {
                     // Direct URLs are always allowed.
                     PackageId::from_url(&requirement.url)
                 }

@@ -151,6 +126,31 @@ impl HashStrategy {
     }
 }

+fn uv_requirement_to_package_id(requirement: &Requirement) -> Result<PackageId, HashStrategyError> {
+    Ok(match &requirement.source {
+        RequirementSource::Registry { specifier, .. } => {
+            // Must be a single specifier.
+            let [specifier] = specifier.as_ref() else {
+                return Err(HashStrategyError::UnpinnedRequirement(
+                    requirement.to_string(),
+                ));
+            };
+
+            // Must be pinned to a specific version.
+            if *specifier.operator() != pep440_rs::Operator::Equal {
+                return Err(HashStrategyError::UnpinnedRequirement(
+                    requirement.to_string(),
+                ));
+            }
+
+            PackageId::from_registry(requirement.name.clone())
+        }
+        RequirementSource::Url { url, .. }
+        | RequirementSource::Git { url, .. }
+        | RequirementSource::Path { url, .. } => PackageId::from_url(url),
+    })
+}
+
 #[derive(thiserror::Error, Debug)]
 pub enum HashStrategyError {
     #[error(transparent)]
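The extracted helper centralizes the `--require-hashes` rule. A hedged sketch of the decision it encodes, with simplified stand-in types in place of the real `RequirementSource` and `PackageId`:

```rust
// A named requirement is hashable only when pinned with a single `==`
// specifier; URL, git, and path sources are keyed by their URL instead.
enum Source {
    Registry { specifiers: Vec<(String, String)> }, // (operator, version)
    Url(String),
}

fn hashable_id(name: &str, source: &Source) -> Result<String, String> {
    match source {
        Source::Registry { specifiers } => match specifiers.as_slice() {
            // Must be pinned to a specific version with `==`.
            [(op, version)] if op.as_str() == "==" => Ok(format!("{name}-{version}")),
            _ => Err(format!("unpinned requirement: {name}")),
        },
        // Direct URLs are always allowed.
        Source::Url(url) => Ok(url.clone()),
    }
}
```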
@@ -1,4 +1,4 @@
-use pep508_rs::Requirement;
+use distribution_types::Requirement;
 use uv_normalize::ExtraName;

 /// A set of requirements as requested by a parent requirement.
@@ -3,14 +3,13 @@ use std::path::{Path, PathBuf};

 use anyhow::Result;

-use distribution_types::{IndexLocations, InstalledDist, Resolution, SourceDist};
-
-use pep508_rs::{PackageName, Requirement};
+use distribution_types::{IndexLocations, InstalledDist, Requirement, Resolution, SourceDist};
+use pep508_rs::PackageName;
 use uv_cache::Cache;
+use uv_configuration::{BuildKind, NoBinary, NoBuild, SetupPyStrategy};
 use uv_interpreter::{Interpreter, PythonEnvironment};

 use crate::BuildIsolation;
-use uv_configuration::{BuildKind, NoBinary, NoBuild, SetupPyStrategy};

 /// Avoids cyclic crate dependencies between resolver, installer and builder.
 ///
@@ -16,6 +16,7 @@ workspace = true
 [dependencies]
 distribution-types = { workspace = true }
 install-wheel-rs = { workspace = true, features = ["clap"], default-features = false }
+pep440_rs = { workspace = true }
 pep508_rs = { workspace = true }
 platform-tags = { workspace = true }
 pypi-types = { workspace = true }

@@ -45,6 +46,7 @@ clap = { workspace = true, features = ["derive", "string", "wrap_help"] }
 clap_complete_command = { workspace = true }
 flate2 = { workspace = true, default-features = false }
 fs-err = { workspace = true, features = ["tokio"] }
+indexmap = { workspace = true }
 indicatif = { workspace = true }
 itertools = { workspace = true }
 miette = { workspace = true, features = ["fancy"] }
@@ -1,3 +1,4 @@
+use indexmap::IndexMap;
 use std::borrow::Cow;
 use std::env;
 use std::fmt::Write;

@@ -14,9 +15,12 @@ use owo_colors::OwoColorize;
 use tempfile::tempdir_in;
 use tracing::debug;

-use distribution_types::{IndexLocations, LocalEditable, LocalEditables, Verbatim};
+use distribution_types::{IndexLocations, LocalEditable, LocalEditables, ParsedUrlError, Verbatim};
+use distribution_types::{Requirement, Requirements};
 use install_wheel_rs::linker::LinkMode;
+
 use platform_tags::Tags;
+use pypi_types::Metadata23;
 use requirements_txt::EditableRequirement;
 use uv_auth::store_credentials_from_url;
 use uv_cache::Cache;

@@ -366,17 +370,33 @@ pub(crate) async fn pip_compile(

     // Build all editables.
     let editable_wheel_dir = tempdir_in(cache.root())?;
-    let editables: Vec<_> = downloader
+    let editables: Vec<(LocalEditable, Metadata23, Requirements)> = downloader
         .build_editables(editables, editable_wheel_dir.path())
         .await
         .context("Failed to build editables")?
        .into_iter()
-        .map(|built_editable| (built_editable.editable, built_editable.metadata))
-        .collect();
+        .map(|built_editable| {
+            let requirements = Requirements {
+                dependencies: built_editable
+                    .metadata
+                    .requires_dist
+                    .iter()
+                    .cloned()
+                    .map(Requirement::from_pep508)
+                    .collect::<Result<_, ParsedUrlError>>()?,
+                optional_dependencies: IndexMap::default(),
+            };
+            Ok::<_, ParsedUrlError>((
+                built_editable.editable,
+                built_editable.metadata,
+                requirements,
+            ))
+        })
+        .collect::<Result<_, _>>()?;

     // Validate that the editables are compatible with the target Python version.
     let requirement = PythonRequirement::new(&interpreter, &markers);
-    for (.., metadata) in &editables {
+    for (_, metadata, _) in &editables {
         if let Some(python_requires) = metadata.requires_python.as_ref() {
             if !python_requires.contains(requirement.target()) {
                 return Err(anyhow!(

@@ -518,7 +538,7 @@ pub(crate) async fn pip_compile(
     }

     if include_marker_expression {
-        let relevant_markers = resolution.marker_tree(&manifest, &top_level_index, &markers);
+        let relevant_markers = resolution.marker_tree(&manifest, &top_level_index, &markers)?;
         writeln!(
             writer,
             "{}",
@@ -11,13 +11,16 @@ use tempfile::tempdir_in;
 use tracing::{debug, enabled, Level};

 use distribution_types::{
-    DistributionMetadata, IndexLocations, InstalledMetadata, LocalDist, LocalEditable,
-    LocalEditables, Name, Resolution,
+    DistributionMetadata, IndexLocations, InstalledMetadata, InstalledVersion, LocalDist,
+    LocalEditable, LocalEditables, Name, ParsedUrl, ParsedUrlError, RequirementSource, Resolution,
 };
+use distribution_types::{Requirement, Requirements};
+use indexmap::IndexMap;
 use install_wheel_rs::linker::LinkMode;
-use pep508_rs::{MarkerEnvironment, Requirement};
+use pep440_rs::{VersionSpecifier, VersionSpecifiers};
+use pep508_rs::{MarkerEnvironment, VerbatimUrl};
 use platform_tags::Tags;
-use pypi_types::{Metadata23, Yanked};
+use pypi_types::Yanked;
 use requirements_txt::EditableRequirement;
 use uv_auth::store_credentials_from_url;
 use uv_cache::Cache;

@@ -184,7 +187,7 @@ pub(crate) async fn pip_install(
     if enabled!(Level::DEBUG) {
         for requirement in recursive_requirements
             .iter()
-            .map(ToString::to_string)
+            .map(|entry| entry.requirement.to_string())
             .sorted()
         {
             debug!("Requirement satisfied: {requirement}");

@@ -642,25 +645,62 @@ async fn resolve(

     // Prefer current site packages; filter out packages that are marked for reinstall or upgrade
     let preferences = site_packages
-        .requirements()
-        .filter(|requirement| !exclusions.contains(&requirement.name))
-        .map(Preference::from_requirement)
-        .collect();
+        .iter()
+        .filter(|dist| !exclusions.contains(dist.name()))
+        .map(|dist| {
+            let source = match dist.installed_version() {
+                InstalledVersion::Version(version) => RequirementSource::Registry {
+                    specifier: VersionSpecifiers::from(VersionSpecifier::equals_version(
+                        version.clone(),
+                    )),
+                    // TODO(konstin): track index
+                    index: None,
+                },
+                InstalledVersion::Url(url, _version) => {
+                    let parsed_url = ParsedUrl::try_from(url.clone())?;
+                    RequirementSource::from_parsed_url(
+                        parsed_url,
+                        VerbatimUrl::from_url(url.clone()),
+                    )
+                }
+            };
+            let requirement = Requirement {
+                name: dist.name().clone(),
+                extras: vec![],
+                marker: None,
+                source,
+            };
+            Ok(Preference::from_requirement(requirement))
+        })
+        .collect::<Result<_, _>>()
+        .map_err(Error::UnsupportedInstalledDist)?;

     // Collect constraints and overrides.
     let constraints = Constraints::from_requirements(constraints);
     let overrides = Overrides::from_requirements(overrides);

     // Map the editables to their metadata.
-    let editables: Vec<(LocalEditable, Metadata23)> = editables
+    let editables: Vec<_> = editables
         .iter()
         .map(|built_editable| {
-            (
+            let dependencies: Vec<_> = built_editable
+                .metadata
+                .requires_dist
+                .iter()
+                .cloned()
+                .map(Requirement::from_pep508)
+                .collect::<Result<_, _>>()?;
+            Ok::<_, ParsedUrlError>((
                 built_editable.editable.clone(),
                 built_editable.metadata.clone(),
-            )
+                Requirements {
+                    dependencies,
+                    optional_dependencies: IndexMap::default(),
+                },
+            ))
         })
-        .collect();
+        .collect::<Result<_, _>>()
+        .map_err(|err| Error::ParsedUrl(Box::new(err)))?;

     // Determine any lookahead requirements.
     let lookaheads = match options.dependency_mode {

@@ -1157,6 +1197,12 @@ enum Error {
     #[error(transparent)]
     Lookahead(#[from] uv_requirements::LookaheadError),

+    #[error(transparent)]
+    ParsedUrl(Box<distribution_types::ParsedUrlError>),
+
     #[error(transparent)]
     Anyhow(#[from] anyhow::Error),

+    #[error("Installed distribution has unsupported type")]
+    UnsupportedInstalledDist(#[source] Box<distribution_types::ParsedUrlError>),
 }
@@ -5,9 +5,8 @@ use itertools::{Either, Itertools};
 use owo_colors::OwoColorize;
 use tracing::debug;

-use distribution_types::{InstalledMetadata, Name};
-use pep508_rs::{Requirement, UnnamedRequirement};
-use requirements_txt::RequirementsTxtRequirement;
+use distribution_types::{InstalledMetadata, Name, Requirement, UnresolvedRequirement};
+use pep508_rs::UnnamedRequirement;
 use uv_cache::Cache;
 use uv_client::{BaseClientBuilder, Connectivity};
 use uv_configuration::KeyringProviderType;

@@ -97,8 +96,8 @@ pub(crate) async fn pip_uninstall(
         .requirements
         .into_iter()
         .partition_map(|entry| match entry.requirement {
-            RequirementsTxtRequirement::Pep508(requirement) => Either::Left(requirement),
-            RequirementsTxtRequirement::Unnamed(requirement) => Either::Right(requirement),
+            UnresolvedRequirement::Named(requirement) => Either::Left(requirement),
+            UnresolvedRequirement::Unnamed(requirement) => Either::Right(requirement),
         });

     // Sort and deduplicate the packages, which are keyed by name. Like `pip`, we ignore the
@@ -3,11 +3,13 @@ use crate::commands::ExitStatus;
 use crate::commands::{elapsed, ChangeEvent, ChangeEventKind};
 use crate::printer::Printer;
 use anyhow::{Context, Result};
-use distribution_types::{IndexLocations, InstalledMetadata, LocalDist, Name, Resolution};
+use distribution_types::{
+    IndexLocations, InstalledMetadata, LocalDist, Name, Requirement, Resolution,
+};
 use install_wheel_rs::linker::LinkMode;
 use itertools::Itertools;
 use owo_colors::OwoColorize;
-use pep508_rs::{MarkerEnvironment, PackageName, Requirement};
+use pep508_rs::{MarkerEnvironment, PackageName};
 use platform_tags::Tags;
 use pypi_types::Yanked;
 use std::ffi::OsString;

@@ -236,7 +238,7 @@ async fn environment_for_run(
         "All requirements satisfied: {}",
         recursive_requirements
             .iter()
-            .map(ToString::to_string)
+            .map(|entry| entry.requirement.to_string())
             .sorted()
             .join(" | ")
     );
@@ -10,9 +10,8 @@ use miette::{Diagnostic, IntoDiagnostic};
 use owo_colors::OwoColorize;
 use thiserror::Error;

-use distribution_types::{DistributionMetadata, IndexLocations, Name, ResolvedDist};
+use distribution_types::{DistributionMetadata, IndexLocations, Name, Requirement, ResolvedDist};
 use install_wheel_rs::linker::LinkMode;
-use pep508_rs::Requirement;
 use uv_auth::store_credentials_from_url;
 use uv_cache::Cache;
 use uv_client::{Connectivity, FlatIndexClient, RegistryClientBuilder};

@@ -211,12 +210,21 @@ async fn venv_impl(
         .with_options(OptionsBuilder::new().exclude_newer(exclude_newer).build());

     // Resolve the seed packages.
-    let mut requirements = vec![Requirement::from_str("pip").unwrap()];
+    let mut requirements =
+        vec![
+            Requirement::from_pep508(pep508_rs::Requirement::from_str("pip").unwrap()).unwrap(),
+        ];

     // Only include `setuptools` and `wheel` on Python <3.12
     if interpreter.python_tuple() < (3, 12) {
-        requirements.push(Requirement::from_str("setuptools").unwrap());
-        requirements.push(Requirement::from_str("wheel").unwrap());
+        requirements.push(
+            Requirement::from_pep508(pep508_rs::Requirement::from_str("setuptools").unwrap())
+                .unwrap(),
+        );
+        requirements.push(
+            Requirement::from_pep508(pep508_rs::Requirement::from_str("wheel").unwrap())
+                .unwrap(),
+        );
     }
     let resolution = build_dispatch
         .resolve(&requirements)
@@ -5224,7 +5224,7 @@ fn unsupported_scheme() -> Result<()> {
     ----- stdout -----

     ----- stderr -----
-    error: Unsupported scheme `bzr+https` on URL: bzr+https://example.com/anyio (Bazaar is not supported)
+    error: Unsupported URL prefix `bzr` in URL: `bzr+https://example.com/anyio`
     "###
     );

@@ -5961,12 +5961,12 @@ fn compile_pyproject_toml_recursive_extra() -> Result<()> {
         name = "my-project"
         version = "0.0.1"
         dependencies = [
-            "tomli",
+            "tomli>=2,<3",
         ]

         [project.optional-dependencies]
         test = [
-            "pep517",
+            "pep517>=0.13,<0.14",
             "my-project[dev]"
         ]
         dev = [
@ -184,8 +184,9 @@ fn invalid_pyproject_toml_schema() -> Result<()> {
|
|||
Ok(())
|
||||
}
|
||||
|
||||
/// For user controlled pyproject.toml files, we enforce PEP 621.
|
||||
#[test]
|
||||
fn invalid_pyproject_toml_requirement() -> Result<()> {
|
||||
fn invalid_pyproject_toml_requirement_direct() -> Result<()> {
|
||||
let context = TestContext::new("3.12");
|
||||
let pyproject_toml = context.temp_dir.child("pyproject.toml");
|
||||
pyproject_toml.write_str(
|
||||
|
```diff
@@ -208,7 +209,54 @@ dependencies = ["flask==1.0.x"]
    ----- stdout -----

    ----- stderr -----
-    error: Failed to build: `file://[TEMP_DIR]/`
+    error: Failed to parse `pyproject.toml`
      Caused by: after parsing '1.0', found '.x', which is not part of a valid version
    flask==1.0.x
         ^^^^^^^
    "###
    );

    Ok(())
 }
+
+/// For indirect, non-user controlled pyproject.toml, we don't enforce correctness.
+///
+/// If we fail to extract the PEP 621 metadata, we fall back to treating it as a source
+/// tree, as there are some cases where the `pyproject.toml` may not be a valid PEP
+/// 621 file, but might still resolve under PEP 517. (If the source tree doesn't
+/// resolve under PEP 517, we'll catch that later.)
+///
+/// For example, Hatch's "Context formatting" API is not compliant with PEP 621, as
+/// it expects dynamic processing by the build backend for the static metadata
+/// fields. See: <https://hatch.pypa.io/latest/config/context/>
+#[test]
+fn invalid_pyproject_toml_requirement_indirect() -> Result<()> {
+    let context = TestContext::new("3.12");
+    let pyproject_toml = context.temp_dir.child("path_dep/pyproject.toml");
+    pyproject_toml.write_str(
+        r#"[project]
+name = "project"
+dependencies = ["flask==1.0.x"]
+"#,
+    )?;
+    let requirements_txt = context.temp_dir.child("requirements.txt");
+    requirements_txt.write_str("./path_dep")?;
+
+    let filters = [("exit status", "exit code")]
+        .into_iter()
+        .chain(context.filters())
+        .collect::<Vec<_>>();
+
+    uv_snapshot!(filters, context.install()
+        .arg("-r")
+        .arg("requirements.txt"), @r###"
+    success: false
+    exit_code: 2
+    ----- stdout -----
+
+    ----- stderr -----
+    error: Failed to download and build: `project @ file://[TEMP_DIR]/path_dep`
+      Caused by: Failed to build: `project @ file://[TEMP_DIR]/path_dep`
+      Caused by: Build backend failed to determine extra requires with `build_wheel()` with exit code: 1
+    --- stdout:
+    configuration error: `project.dependencies[0]` must be pep508
```

```diff
@@ -3658,7 +3706,7 @@ fn already_installed_dependent_editable() {
     );
 }

-/// Install an local package that depends on a previously installed local package.
+/// Install a local package that depends on a previously installed local package.
 #[test]
 fn already_installed_local_path_dependent() {
     let context = TestContext::new("3.12");
```

```diff
@@ -4460,7 +4508,7 @@ fn require_hashes_unnamed_repeated() -> Result<()> {
         --hash=sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2 \
         --hash=sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc
         # via anyio
-    "} )?;
+    "})?;

     uv_snapshot!(context.install()
         .arg("-r")
```

```diff
@@ -4540,3 +4588,91 @@ fn require_hashes_override() -> Result<()> {

     Ok(())
 }
+
+#[test]
+fn tool_uv_sources() -> Result<()> {
+    let context = TestContext::new("3.12");
+    // Use a subdir to test path normalization.
+    let require_path = "some_dir/pyproject.toml";
+    let pyproject_toml = context.temp_dir.child(require_path);
+    pyproject_toml.write_str(indoc! {r#"
+        [project]
+        name = "foo"
+        version = "0.0.0"
+        dependencies = [
+          "tqdm>4,<=5",
+          "packaging @ git+https://github.com/pypa/packaging",
+          "poetry_editable",
+          "urllib3 @ https://files.pythonhosted.org/packages/a2/73/a68704750a7679d0b6d3ad7aa8d4da8e14e151ae82e6fee774e6e0d05ec8/urllib3-2.2.1-py3-none-any.whl",
+          # Windows consistency
+          "colorama>0.4,<5",
+        ]
+
+        [project.optional-dependencies]
+        utils = [
+          "boltons==24.0.0"
+        ]
+        dont_install_me = [
+          "borken @ https://example.org/does/not/exist"
+        ]
+
+        [tool.uv.sources]
+        tqdm = { url = "https://files.pythonhosted.org/packages/a5/d6/502a859bac4ad5e274255576cd3e15ca273cdb91731bc39fb840dd422ee9/tqdm-4.66.0-py3-none-any.whl" }
+        boltons = { git = "https://github.com/mahmoud/boltons", rev = "57fbaa9b673ed85b32458b31baeeae230520e4a0" }
+        poetry_editable = { path = "../poetry_editable" }
+    "#})?;
+
+    let project_root = fs_err::canonicalize(std::env::current_dir()?.join("../.."))?;
+    fs_err::create_dir_all(context.temp_dir.join("poetry_editable/poetry_editable"))?;
+    fs_err::copy(
+        project_root.join("scripts/packages/poetry_editable/pyproject.toml"),
+        context.temp_dir.join("poetry_editable/pyproject.toml"),
+    )?;
+    fs_err::copy(
+        project_root.join("scripts/packages/poetry_editable/poetry_editable/__init__.py"),
+        context
+            .temp_dir
+            .join("poetry_editable/poetry_editable/__init__.py"),
+    )?;
+
+    // Install the packages, including the editable path dependency.
+    uv_snapshot!(context.filters(), windows_filters=false, context.install()
+        .arg("-r")
+        .arg(require_path)
+        .arg("--extra")
+        .arg("utils"), @r###"
+    success: true
+    exit_code: 0
+    ----- stdout -----
+
+    ----- stderr -----
+    Resolved 9 packages in [TIME]
+    Downloaded 9 packages in [TIME]
+    Installed 9 packages in [TIME]
+     + anyio==4.3.0
+     + boltons==24.0.1.dev0 (from git+https://github.com/mahmoud/boltons@57fbaa9b673ed85b32458b31baeeae230520e4a0)
+     + colorama==0.4.6
+     + idna==3.6
+     + packaging==24.1.dev0 (from git+https://github.com/pypa/packaging@32deafe8668a2130a3366b98154914d188f3718e)
+     + poetry-editable==0.1.0 (from file://[TEMP_DIR]/poetry_editable)
+     + sniffio==1.3.1
+     + tqdm==4.66.0 (from https://files.pythonhosted.org/packages/a5/d6/502a859bac4ad5e274255576cd3e15ca273cdb91731bc39fb840dd422ee9/tqdm-4.66.0-py3-none-any.whl)
+     + urllib3==2.2.1 (from https://files.pythonhosted.org/packages/a2/73/a68704750a7679d0b6d3ad7aa8d4da8e14e151ae82e6fee774e6e0d05ec8/urllib3-2.2.1-py3-none-any.whl)
+    "###
+    );
+
+    // Re-running the install should find all requirements already satisfied.
+    uv_snapshot!(context.install()
+        .arg("-r")
+        .arg(require_path), @r###"
+    success: true
+    exit_code: 0
+    ----- stdout -----
+
+    ----- stderr -----
+    Resolved 8 packages in [TIME]
+    Audited 8 packages in [TIME]
+    "###
+    );
+    Ok(())
+}
```

**docs/specifying_dependencies.md** (new file, 221 lines)

**Warning: this documentation applies to a future version of uv. Please refer to
[README.md](../README.md) for documentation for the latest release.**

# Specifying dependencies

In uv, dependency specification is divided between two tables: `project.dependencies` and
`tool.uv.sources`.

At a high level, the former is used to define the standards-compliant dependency metadata,
propagated when uploading to PyPI or building a wheel. The latter is used to specify the _sources_
required to install the dependencies, which can come from a Git repository, a URL, a local path, a
different index, etc.

## `project.dependencies`

The `project.dependencies` table represents the dependencies that are used when uploading to PyPI
or building a wheel. Individual dependencies are specified using [PEP 508](#pep-508), and the table
as a whole follows the
[PEP 621](https://packaging.python.org/en/latest/specifications/pyproject-toml/) standard.

You should think of `project.dependencies` as defining the packages that are required for your
project, along with the version constraints that should be used when installing them.

`project.dependencies` is structured as a list in which each entry includes a dependency name and
version, and optionally extras or environment markers for platform-specific packages, as in:

```toml
[project]
name = "albatross"
version = "0.1.0"
dependencies = [
    # Any version in this range
    "tqdm >=4.66.2,<5",
    # Exactly this version of torch
    "torch ==2.2.2",
    # Install transformers with the torch extra
    "transformers[torch] >=4.39.3,<5",
    # Only install this package on older Python versions
    # See the "PEP 508" section for more information
    "importlib_metadata >=7.1.0,<8; python_version < '3.10'",
    "mollymawk ==0.1.0"
]
```

If you only require packages from PyPI or a single `--index-url`, then `project.dependencies` is
all you need. If, however, you depend on local packages, Git dependencies, or packages from a
different index, you should use `tool.uv.sources`.

## `tool.uv.sources`

During development, you may rely on a package that isn't available on PyPI. For example, let's say
that we need to pull in a version of `tqdm` from a specific Git commit, `importlib_metadata` from
a dedicated URL, `torch` from the PyTorch-specific index, and `mollymawk` from our own workspace.

We can express these requirements by enriching the `project.dependencies` table with
`tool.uv.sources`:

```toml
[project]
name = "albatross"
version = "0.1.0"
dependencies = [
    # Any version in this range.
    "tqdm >=4.66.2,<5",
    # Exactly this version of torch.
    "torch ==2.2.2",
    # Install transformers with the torch extra.
    "transformers[torch] >=4.39.3,<5",
    # Only install this package on Python versions prior to 3.10.
    "importlib_metadata >=7.1.0,<8; python_version < '3.10'",
    "mollymawk ==0.1.0"
]

[tool.uv.sources]
# Install a specific Git commit.
tqdm = { git = "https://github.com/tqdm/tqdm", rev = "cc372d09dcd5a5eabdc6ed4cf365bdb0be004d44" }
# Install a remote source distribution (`.zip`, `.tar.gz`) or wheel (`.whl`).
importlib_metadata = { url = "https://github.com/python/importlib_metadata/archive/refs/tags/v7.1.0.zip" }
# Pin a dependency for a specific registry.
torch = { index = "torch-cu118" }
# Use a package included in the same repository (editable installation).
mollymawk = { workspace = true }

# See "Workspaces".
[tool.uv.workspace]
include = [
    "packages/mollymawk"
]

# See "Indexes".
[tool.uv.indexes]
torch-cu118 = "https://download.pytorch.org/whl/cu118"
```

We support the following sources (which are mutually exclusive for a given dependency); a short
sketch follows the list:

- Git: Use `git` with a Git URL, optionally one of `rev`, `tag`, or `branch`, and optionally a
  `subdirectory` if the package isn't in the repository root.
- URL: A `url` key with an `https://` URL to a wheel (ending in `.whl`) or a source distribution
  (ending in `.zip` or `.tar.gz`), and optionally a `subdirectory` if the source distribution isn't
  in the archive root.
- Path: The `path` is an absolute or relative path to a wheel (ending in `.whl`), a source
  distribution (ending in `.zip` or `.tar.gz`), or a directory containing a `pyproject.toml`. We
  recommend using workspaces over manual path dependencies. For directories, you can specify
  `editable = true` for an [editable](#editables) installation.
- Index: Set the `index` key to the name of an index to install the package from that registry
  instead of your default index.
- Workspace: Set `workspace = true` to use the workspace dependency. You need to explicitly require
  all workspace dependencies you use. They are [editable](#editables) by default; specify
  `editable = false` to install them as regular dependencies.
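
For instance, a sketch of a Git source that combines a `tag` with a `subdirectory` (the package
name, repository URL, and paths below are made up for illustration):

```toml
[project]
name = "albatross"
version = "0.1.0"
dependencies = ["example-pkg >=1,<2"]

[tool.uv.sources]
# Hypothetical monorepo layout: the package lives in `packages/example-pkg`
# rather than the repository root, so `subdirectory` points at it.
example-pkg = { git = "https://github.com/example/monorepo", tag = "v1.0.0", subdirectory = "packages/example-pkg" }
```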

Note that if a non-uv project uses this project as a Git- or path-dependency, only
`project.dependencies` is transferred, and you'll need to apply the information in the source table
using the configuration of the other project's package manager.

## Optional dependencies

For libraries, you may want to make certain features and their dependencies optional. For example,
pandas has an [`excel` extra](https://pandas.pydata.org/docs/getting_started/install.html#excel-files)
and a [`plot` extra](https://pandas.pydata.org/docs/getting_started/install.html#visualization), so
that Excel parsers and `matplotlib` are only installed for users who explicitly request them. You
can request both extras with `pandas[plot, excel]`.

Optional dependencies are specified in `[project.optional-dependencies]`, a TOML table that maps
from extra name to its dependencies, following the [PEP 508](#pep-508) syntax.

`tool.uv.sources` applies to this table equally; a sketch follows the example below.

```toml
[project]
name = "pandas"
version = "1.0.0"

[project.optional-dependencies]
plot = [
    "matplotlib>=3.6.3"
]
excel = [
    "odfpy>=1.4.1",
    "openpyxl>=3.1.0",
    "python-calamine>=0.1.7",
    "pyxlsb>=1.0.10",
    "xlrd>=2.0.1",
    "xlsxwriter>=3.0.5"
]
```
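
For example, to pull `python-calamine` from a Git tag instead of the default index (a sketch; the
repository URL and tag here are placeholders, not verified coordinates):

```toml
[tool.uv.sources]
# Hypothetical: redirect one of the `excel` extra's dependencies to a Git source.
python-calamine = { git = "https://github.com/example/python-calamine", tag = "v0.1.7" }
```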

## Development dependencies

_N.B. This feature is not yet implemented._

Unlike optional dependencies, development dependencies are local-only and will _not_ be published
to PyPI or other indexes. As such, development dependencies are included under `[tool.uv]` instead
of `[project]`. `tool.uv.sources` applies to them equally.

```toml
[tool.uv]
dev-dependencies = [
    "pytest >=8.1.1,<9"
]
```

You can also put development dependencies into groups and install them individually:

```toml
[tool.uv.dev-dependencies]
test = [
    "pytest >=8.1.1,<9"
]
lint = [
    "mypy >=1,<2"
]

[tool.uv]
default-dev-dependencies = ["test"]
```

## PEP 508

The [PEP 508](https://peps.python.org/pep-0508/) syntax allows you to specify, in order:

* The dependency name
* The extras you want (optional)
* The version specifier
* An environment marker (optional)

The version specifiers are comma-separated and added together, e.g., `foo >=1.2.3,<2,!=1.4.0` is
interpreted as "a version of `foo` that's at least 1.2.3, but less than 2, and not 1.4.0".

Specifiers are padded with trailing zeros if required, so `foo ==2` matches foo 2.0.0, too.

You can use a star for the last digit with equals, e.g., `foo ==2.1.*` will accept any release from
the 2.1 series. Similarly, `~=` matches where the last digit is equal or higher, e.g., `foo ~=1.2`
is equal to `foo >=1.2,<2`, and `foo ~=1.2.3` is equal to `foo >=1.2.3,<1.3`.

Extras are comma-separated in square brackets between the name and the version, e.g.,
`pandas[excel,plot] ==2.2`.

Some dependencies are only required in specific environments, e.g., a specific Python version or
operating system. For example, to install the `importlib-metadata` backport for the
`importlib.metadata` module, you would use `importlib-metadata >=7.1.0,<8; python_version < '3.10'`.
To install `colorama` on Windows (but omit it on other platforms), use
`colorama >=0.4.6,<5; platform_system == "Windows"`.

You can combine markers with `and`, `or`, and parentheses, e.g., `aiohttp >=3.7.4,<4; (sys_platform != 'win32' or implementation_name != 'pypy') and python_version >= '3.10'`.
Note that versions within markers must be quoted, while versions _outside_ of markers must _not_ be
quoted.
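
Putting the pieces together, the name, extras, version specifiers, and marker appear in that order
within a single requirement string; a small sketch:

```toml
[project]
name = "example"
version = "0.1.0"
dependencies = [
    # name[extras] specifiers; marker: the marker's version is quoted, the specifier's is not.
    "pandas[excel,plot] >=2.2,<3; python_version >= '3.9'",
]
```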

## Editables

A regular installation of a directory with a Python package first builds a wheel and then installs
that wheel into your virtual environment, copying all source files. When you edit the source files,
the virtual environment will contain outdated versions.

Editable installations instead add a link to the project within the virtual environment
(a `.pth` file), which instructs the interpreter to include your sources directly.

There are some limitations to editables (mainly: your build backend needs to support them, and
native modules aren't recompiled before import), but they are useful for development, as your
virtual environment will always use the latest version of your package.

uv uses editable installation for workspace packages and patched dependencies by default.
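
For example, a path dependency installed as editable might look like this (a sketch; the package
name and relative path are placeholders):

```toml
[project]
name = "albatross"
version = "0.1.0"
dependencies = ["my-library"]

[tool.uv.sources]
# `editable = true` links the checkout into the environment instead of copying it,
# so edits to `../my-library` are picked up without reinstalling.
my-library = { path = "../my-library", editable = true }
```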

**uv.schema.json** (generated, 222 lines)

```diff
@@ -1,7 +1,7 @@
 {
   "$schema": "http://json-schema.org/draft-07/schema#",
-  "title": "Options",
-  "description": "A `[tool.uv]` section.",
+  "title": "ToolUv",
+  "description": "Metadata and configuration for uv.",
   "type": "object",
   "properties": {
     "cache-dir": {
```

```diff
@@ -37,6 +37,25 @@
         "boolean",
         "null"
       ]
     },
+    "sources": {
+      "type": [
+        "object",
+        "null"
+      ],
+      "additionalProperties": {
+        "$ref": "#/definitions/Source"
+      }
+    },
+    "workspace": {
+      "anyOf": [
+        {
+          "$ref": "#/definitions/ToolUvWorkspace"
+        },
+        {
+          "type": "null"
+        }
+      ]
+    }
   },
   "additionalProperties": false,
```

```diff
@@ -573,6 +592,180 @@
       }
     ]
   },
+    "Source": {
+      "description": "A `tool.uv.sources` value.",
+      "anyOf": [
+        {
+          "description": "A remote git repository, either over HTTPS or over SSH.\n\nExample: ```toml flask = { git = \"https://github.com/pallets/flask\", tag = \"3.0.0\" } ```",
+          "type": "object",
+          "required": [
+            "git"
+          ],
+          "properties": {
+            "branch": {
+              "type": [
+                "string",
+                "null"
+              ]
+            },
+            "git": {
+              "type": "string",
+              "format": "uri"
+            },
+            "rev": {
+              "type": [
+                "string",
+                "null"
+              ]
+            },
+            "subdirectory": {
+              "description": "The path to the directory with the `pyproject.toml` if it is not in the archive root.",
+              "type": [
+                "string",
+                "null"
+              ]
+            },
+            "tag": {
+              "type": [
+                "string",
+                "null"
+              ]
+            }
+          },
+          "additionalProperties": false
+        },
+        {
+          "description": "A remote `http://` or `https://` URL, either a wheel (`.whl`) or a source distribution (`.zip`, `.tar.gz`).\n\nExample: ```toml flask = { url = \"https://files.pythonhosted.org/packages/61/80/ffe1da13ad9300f87c93af113edd0638c75138c42a0994becfacac078c06/flask-3.0.3-py3-none-any.whl\" } ```",
+          "type": "object",
+          "required": [
+            "url"
+          ],
+          "properties": {
+            "subdirectory": {
+              "description": "For source distributions, the path to the directory with the `pyproject.toml` if it is not in the archive root.",
+              "type": [
+                "string",
+                "null"
+              ]
+            },
+            "url": {
+              "type": "string",
+              "format": "uri"
+            }
+          },
+          "additionalProperties": false
+        },
+        {
+          "description": "The path to a dependency. It can either be a wheel (a `.whl` file), a source distribution as archive (a `.zip` or `.tar.gz` file) or a source distribution as directory (a directory with a pyproject.toml in it, or a legacy directory with only a setup.py but no pyproject.toml in it).",
+          "type": "object",
+          "required": [
+            "path"
+          ],
+          "properties": {
+            "editable": {
+              "description": "`false` by default.",
+              "type": [
+                "boolean",
+                "null"
+              ]
+            },
+            "path": {
+              "type": "string"
+            }
+          },
+          "additionalProperties": false
+        },
+        {
+          "description": "When using a version as requirement, you can optionally pin the requirement to an index you defined, e.g. `torch` after configuring `torch` to `https://download.pytorch.org/whl/cu118`.",
+          "type": "object",
+          "required": [
+            "index"
+          ],
+          "properties": {
+            "index": {
+              "type": "string"
+            }
+          },
+          "additionalProperties": false
+        },
+        {
+          "description": "A dependency on another package in the workspace.",
+          "type": "object",
+          "required": [
+            "workspace"
+          ],
+          "properties": {
+            "editable": {
+              "description": "`true` by default.",
+              "type": [
+                "boolean",
+                "null"
+              ]
+            },
+            "workspace": {
+              "type": "boolean"
+            }
+          },
+          "additionalProperties": false
+        },
+        {
+          "description": "Show a better error message for invalid combinations of options.",
+          "type": "object",
+          "required": [
+            "git",
+            "index",
+            "patch",
+            "url",
+            "workspace"
+          ],
+          "properties": {
+            "branch": {
+              "type": [
+                "string",
+                "null"
+              ]
+            },
+            "git": {
+              "type": "string"
+            },
+            "index": {
+              "type": "string"
+            },
+            "patch": {
+              "type": "string"
+            },
+            "rev": {
+              "type": [
+                "string",
+                "null"
+              ]
+            },
+            "subdirectory": {
+              "type": [
+                "string",
+                "null"
+              ]
+            },
+            "tag": {
+              "type": [
+                "string",
+                "null"
+              ]
+            },
+            "url": {
+              "type": "string"
+            },
+            "workspace": {
+              "type": "boolean"
+            }
+          },
+          "additionalProperties": false
+        }
+      ]
+    },
+    "String": {
+      "type": "string"
+    },
     "TargetTriple": {
       "description": "The supported target triples. Each triple consists of an architecture, vendor, and operating system.\n\nSee: <https://doc.rust-lang.org/nightly/rustc/platform-support.html>",
       "oneOf": [
```

```diff
@@ -675,6 +868,31 @@
           ]
         }
       ]
     },
+    "ToolUvWorkspace": {
+      "description": "`tool.uv.workspace`.",
+      "type": "object",
+      "properties": {
+        "exclude": {
+          "type": [
+            "array",
+            "null"
+          ],
+          "items": {
+            "$ref": "#/definitions/String"
+          }
+        },
+        "members": {
+          "type": [
+            "array",
+            "null"
+          ],
+          "items": {
+            "$ref": "#/definitions/String"
+          }
+        }
+      },
+      "additionalProperties": false
+    }
   }
 }
```