mirror of
https://github.com/astral-sh/uv.git
synced 2025-07-07 21:35:00 +00:00

(or legacy tool.uv.workspace). This cleaves out a dedicated `SourcedDependencyGroups` type based on `RequiresDist`, but with only the `DependencyGroup` handling implemented. This allows `uv pip` to read `dependency-groups` from pyproject.tomls that only have that table defined, per PEP 735, and as implemented by `pip`.

However, we want our implementation to respect various uv features when they're available:

* `tool.uv.sources`
* `tool.uv.index`
* `tool.uv.dependency-groups.mygroup.requires-python` (#13735)

As such, we want to opportunistically detect as much as possible while doing as little as possible when things are missing. The issue with the old `RequiresDist` path was that it fundamentally wanted to build the package, and if `[project]` was missing, it would desperately try to run setuptools on the pyproject.toml to find metadata, making a hash of things.

At the same time, the old code put in a lot of effort to pretend that `uv pip` dependency-groups worked like `uv` dependency-groups, with defaults and non-only semantics, only to separate them back out again. By explicitly separating them, we confidently get the expected behaviour.

Note that dependency-group support is still included in `RequiresDist`, as some `uv` paths still use it. It's unclear to me whether those paths want this same treatment; for now, I conclude they don't.

Fixes #13138
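As a minimal illustration of the file shape this change targets (group names here are hypothetical, chosen for the example), a pyproject.toml may define only a `[dependency-groups]` table per PEP 735, with no `[project]` table at all:

```toml
# A pyproject.toml with no [project] table. Previously, the RequiresDist
# path would fall back to running setuptools to hunt for metadata; the
# dedicated SourcedDependencyGroups path reads this table statically.
[dependency-groups]
# Hypothetical groups for illustration.
lint = ["ruff"]
test = ["pytest", "coverage"]
# PEP 735 allows a group to include other groups.
dev = [{ include-group = "lint" }, { include-group = "test" }]
```

Per the description above, `uv pip` can now resolve groups from such a file (respecting `tool.uv.sources` and friends when present) instead of attempting a doomed build.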
206 lines
7.8 KiB
Rust
use std::borrow::Cow;
use std::path::Path;
use std::sync::Arc;

use anyhow::{Context, Result};
use futures::TryStreamExt;
use futures::stream::FuturesOrdered;
use url::Url;

use uv_configuration::ExtrasSpecification;
use uv_distribution::{DistributionDatabase, FlatRequiresDist, Reporter, RequiresDist};
use uv_distribution_types::Requirement;
use uv_distribution_types::{
    BuildableSource, DirectorySourceUrl, HashGeneration, HashPolicy, SourceUrl, VersionId,
};
use uv_fs::Simplified;
use uv_normalize::{ExtraName, PackageName};
use uv_pep508::RequirementOrigin;
use uv_redacted::DisplaySafeUrl;
use uv_resolver::{InMemoryIndex, MetadataResponse};
use uv_types::{BuildContext, HashStrategy};

#[derive(Debug, Clone)]
pub struct SourceTreeResolution {
    /// The requirements sourced from the source trees.
    pub requirements: Box<[Requirement]>,
    /// The names of the projects that were resolved.
    pub project: PackageName,
    /// The extras used when resolving the requirements.
    pub extras: Box<[ExtraName]>,
}

/// A resolver for requirements specified via source trees.
///
/// Used, e.g., to determine the input requirements when a user specifies a `pyproject.toml`
/// file, which may require running PEP 517 build hooks to extract metadata.
pub struct SourceTreeResolver<'a, Context: BuildContext> {
    /// The extras to include when resolving requirements.
    extras: &'a ExtrasSpecification,
    /// The hash policy to enforce.
    hasher: &'a HashStrategy,
    /// The in-memory index for resolving dependencies.
    index: &'a InMemoryIndex,
    /// The database for fetching and building distributions.
    database: DistributionDatabase<'a, Context>,
}

impl<'a, Context: BuildContext> SourceTreeResolver<'a, Context> {
    /// Instantiate a new [`SourceTreeResolver`] for a given set of `source_trees`.
    pub fn new(
        extras: &'a ExtrasSpecification,
        hasher: &'a HashStrategy,
        index: &'a InMemoryIndex,
        database: DistributionDatabase<'a, Context>,
    ) -> Self {
        Self {
            extras,
            hasher,
            index,
            database,
        }
    }

    /// Set the [`Reporter`] to use for this resolver.
    #[must_use]
    pub fn with_reporter(self, reporter: Arc<dyn Reporter>) -> Self {
        Self {
            database: self.database.with_reporter(reporter),
            ..self
        }
    }

    /// Resolve the requirements from the provided source trees.
    pub async fn resolve(
        self,
        source_trees: impl Iterator<Item = &Path>,
    ) -> Result<Vec<SourceTreeResolution>> {
        let resolutions: Vec<_> = source_trees
            .map(async |source_tree| self.resolve_source_tree(source_tree).await)
            .collect::<FuturesOrdered<_>>()
            .try_collect()
            .await?;
        Ok(resolutions)
    }

    /// Infer the dependencies for a directory dependency.
    async fn resolve_source_tree(&self, path: &Path) -> Result<SourceTreeResolution> {
        let metadata = self.resolve_requires_dist(path).await?;
        let origin = RequirementOrigin::Project(path.to_path_buf(), metadata.name.clone());

        // Determine the extras to include when resolving the requirements.
        let extras = self
            .extras
            .extra_names(metadata.provides_extras.iter())
            .cloned()
            .collect::<Vec<_>>();

        let mut requirements = Vec::new();

        // Flatten any transitive extras and include dependencies
        // (unless something like --only-group was passed)
        requirements.extend(
            FlatRequiresDist::from_requirements(metadata.requires_dist, &metadata.name)
                .into_iter()
                .map(|requirement| Requirement {
                    origin: Some(origin.clone()),
                    marker: requirement.marker.simplify_extras(&extras),
                    ..requirement
                }),
        );

        let requirements = requirements.into_boxed_slice();
        let project = metadata.name;
        let extras = metadata.provides_extras;

        Ok(SourceTreeResolution {
            requirements,
            project,
            extras,
        })
    }

    /// Resolve the [`RequiresDist`] metadata for a given source tree. Attempts to resolve the
    /// requirements without building the distribution, even if the project contains (e.g.) a
    /// dynamic version since, critically, we don't need to install the package itself; only its
    /// dependencies.
    async fn resolve_requires_dist(&self, path: &Path) -> Result<RequiresDist> {
        // Convert to a buildable source.
        let source_tree = fs_err::canonicalize(path).with_context(|| {
            format!(
                "Failed to canonicalize path to source tree: {}",
                path.user_display()
            )
        })?;
        let source_tree = source_tree.parent().ok_or_else(|| {
            anyhow::anyhow!(
                "The file `{}` appears to be a `pyproject.toml`, `setup.py`, or `setup.cfg` file, which must be in a directory",
                path.user_display()
            )
        })?;

        // If the path is a `pyproject.toml`, attempt to extract the requirements statically. The
        // distribution database will do this too, but we can be even more aggressive here since we
        // _only_ need the requirements. So, for example, even if the version is dynamic, we can
        // still extract the requirements without performing a build, unlike in the database where
        // we typically construct a "complete" metadata object.
        if let Some(metadata) = self.database.requires_dist(source_tree).await? {
            return Ok(metadata);
        }

        let Ok(url) = Url::from_directory_path(source_tree).map(DisplaySafeUrl::from) else {
            return Err(anyhow::anyhow!("Failed to convert path to URL"));
        };
        let source = SourceUrl::Directory(DirectorySourceUrl {
            url: &url,
            install_path: Cow::Borrowed(source_tree),
            editable: false,
        });

        // Determine the hash policy. Since we don't have a package name, we perform a
        // manual match.
        let hashes = match self.hasher {
            HashStrategy::None => HashPolicy::None,
            HashStrategy::Generate(mode) => HashPolicy::Generate(*mode),
            HashStrategy::Verify(_) => HashPolicy::Generate(HashGeneration::All),
            HashStrategy::Require(_) => {
                return Err(anyhow::anyhow!(
                    "Hash-checking is not supported for local directories: {}",
                    path.user_display()
                ));
            }
        };

        // Fetch the metadata for the distribution.
        let metadata = {
            let id = VersionId::from_url(source.url());
            if self.index.distributions().register(id.clone()) {
                // Run the PEP 517 build process to extract metadata from the source distribution.
                let source = BuildableSource::Url(source);
                let archive = self.database.build_wheel_metadata(&source, hashes).await?;

                let metadata = archive.metadata.clone();

                // Insert the metadata into the index.
                self.index
                    .distributions()
                    .done(id, Arc::new(MetadataResponse::Found(archive)));

                metadata
            } else {
                let response = self
                    .index
                    .distributions()
                    .wait(&id)
                    .await
                    .expect("missing value for registered task");
                let MetadataResponse::Found(archive) = &*response else {
                    panic!("Failed to find metadata for: {}", path.user_display());
                };
                archive.metadata.clone()
            }
        };

        Ok(RequiresDist::from(metadata))
    }
}