Apply Cache-Control overrides to response, not request headers (#14736)

## Summary

This was just an oversight on my part in the initial implementation: the per-index `Cache-Control` override was applied to the outgoing request headers, but the cache policy is derived from the response's `Cache-Control` header, so an override like `max-age=600` couldn't make an otherwise-uncacheable response cacheable. The override is now applied to the response headers (in `fresh_request` and the revalidation path) before the cache policy is computed.

Closes https://github.com/astral-sh/uv/issues/14719.
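
For illustration, here is a minimal sketch of the pattern the fix adopts, written against the `http` crate directly; the function name and types are illustrative stand-ins, not uv's actual API. The idea is simply that the user's override is written into the response's `Cache-Control` header before any caching decision is derived from it:

```rust
use http::{HeaderValue, Response, header};

/// Sketch only: overwrite the response's `Cache-Control` header with a
/// user-configured value, so downstream cache-policy logic sees the override
/// instead of whatever the server sent.
fn apply_cache_control_override(
    response: &mut Response<Vec<u8>>,
    override_header: Option<&str>,
) -> Result<(), header::InvalidHeaderValue> {
    if let Some(value) = override_header {
        response
            .headers_mut()
            .insert(header::CACHE_CONTROL, HeaderValue::from_str(value)?);
    }
    Ok(())
}

fn main() {
    // The server disables caching, but the user asked for `max-age=600`.
    let mut response = Response::builder()
        .header(header::CACHE_CONTROL, "no-cache,no-store,must-revalidate")
        .body(Vec::new())
        .unwrap();

    apply_cache_control_override(&mut response, Some("max-age=600")).unwrap();

    assert_eq!(
        response.headers().get(header::CACHE_CONTROL),
        Some(&HeaderValue::from_static("max-age=600"))
    );
}
```

In uv itself this happens in `fresh_request` and in the revalidation path (`send_cached_handle_stale`), as shown in the diff below.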

## Test Plan

With:

```toml
[project]
name = "foo"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.13.2"
dependencies = [
]

[[tool.uv.index]]
url = "https://download.pytorch.org/whl/cpu"
cache-control = { api = "max-age=600" }
```

Ran `cargo run lock -vvv` and verified that the PyTorch index response
was cached (whereas it typically returns `cache-control:
no-cache,no-store,must-revalidate`).
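
Schematically, the per-index setting above is resolved by matching the configured index URL and, when present, takes precedence over the freshness-based cache policy. The sketch below uses simplified stand-in types rather than uv's real signatures; the actual lookup lives in the `simple_api_cache_control_for` / `artifact_cache_control_for` accessors added in the diff below, where the `api` key covers Simple API responses and the `files` key covers artifact downloads:

```rust
// Simplified stand-ins for uv's types; not the real definitions.
struct IndexCacheControl {
    api: Option<String>,
    files: Option<String>,
}

struct Index {
    url: String,
    cache_control: Option<IndexCacheControl>,
}

enum CacheControl<'a> {
    /// Fall back to the freshness-based policy.
    None,
    /// Rewrite the response's `Cache-Control` header with this value.
    Override(&'a str),
}

/// Return the Simple API `Cache-Control` override for an index URL, if configured.
fn simple_api_cache_control_for<'a>(indexes: &'a [Index], url: &str) -> Option<&'a str> {
    indexes
        .iter()
        .find(|index| index.url == url)
        .and_then(|index| index.cache_control.as_ref())
        .and_then(|cc| cc.api.as_deref())
}

fn cache_control_for<'a>(indexes: &'a [Index], url: &str) -> CacheControl<'a> {
    match simple_api_cache_control_for(indexes, url) {
        // e.g. `max-age=600` from the config above.
        Some(header) => CacheControl::Override(header),
        None => CacheControl::None,
    }
}

fn main() {
    let indexes = vec![Index {
        url: "https://download.pytorch.org/whl/cpu".to_string(),
        cache_control: Some(IndexCacheControl {
            api: Some("max-age=600".to_string()),
            files: None,
        }),
    }];
    assert!(matches!(
        cache_control_for(&indexes, "https://download.pytorch.org/whl/cpu"),
        CacheControl::Override("max-age=600")
    ));
}
```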

Authored by Charlie Marsh on 2025-07-18 16:32:29 -04:00, committed via GitHub (commit d0efe1ed9c, parent 574aa1ef11).

4 changed files with 201 additions and 50 deletions

View file

@ -304,7 +304,7 @@ impl CachedClient {
.await?
} else {
debug!("No cache entry for: {}", req.url());
let (response, cache_policy) = self.fresh_request(req).await?;
let (response, cache_policy) = self.fresh_request(req, cache_control).await?;
CachedResponse::ModifiedOrNew {
response,
cache_policy,
@ -318,8 +318,13 @@ impl CachedClient {
"Broken fresh cache entry (for payload) at {}, removing: {err}",
cache_entry.path().display()
);
self.resend_and_heal_cache(fresh_req, cache_entry, response_callback)
.await
self.resend_and_heal_cache(
fresh_req,
cache_entry,
cache_control,
response_callback,
)
.await
}
},
CachedResponse::NotModified { cached, new_policy } => {
@ -339,8 +344,13 @@ impl CachedClient {
(for payload) at {}, removing: {err}",
cache_entry.path().display()
);
self.resend_and_heal_cache(fresh_req, cache_entry, response_callback)
.await
self.resend_and_heal_cache(
fresh_req,
cache_entry,
cache_control,
response_callback,
)
.await
}
}
}
@ -355,8 +365,13 @@ impl CachedClient {
// ETag didn't match). We need to make a fresh request.
if response.status() == http::StatusCode::NOT_MODIFIED {
warn!("Server returned unusable 304 for: {}", fresh_req.url());
self.resend_and_heal_cache(fresh_req, cache_entry, response_callback)
.await
self.resend_and_heal_cache(
fresh_req,
cache_entry,
cache_control,
response_callback,
)
.await
} else {
self.run_response_callback(
cache_entry,
@ -379,9 +394,10 @@ impl CachedClient {
&self,
req: Request,
cache_entry: &CacheEntry,
cache_control: CacheControl<'_>,
response_callback: Callback,
) -> Result<Payload, CachedClientError<CallBackError>> {
let (response, cache_policy) = self.fresh_request(req).await?;
let (response, cache_policy) = self.fresh_request(req, cache_control).await?;
let payload = self
.run_response_callback(cache_entry, cache_policy, response, async |resp| {
@ -401,10 +417,11 @@ impl CachedClient {
&self,
req: Request,
cache_entry: &CacheEntry,
cache_control: CacheControl<'_>,
response_callback: Callback,
) -> Result<Payload::Target, CachedClientError<CallBackError>> {
let _ = fs_err::tokio::remove_file(&cache_entry.path()).await;
let (response, cache_policy) = self.fresh_request(req).await?;
let (response, cache_policy) = self.fresh_request(req, cache_control).await?;
self.run_response_callback(cache_entry, cache_policy, response, response_callback)
.await
}
@ -476,20 +493,13 @@ impl CachedClient {
) -> Result<CachedResponse, Error> {
// Apply the cache control header, if necessary.
match cache_control {
CacheControl::None | CacheControl::AllowStale => {}
CacheControl::None | CacheControl::AllowStale | CacheControl::Override(..) => {}
CacheControl::MustRevalidate => {
req.headers_mut().insert(
http::header::CACHE_CONTROL,
http::HeaderValue::from_static("no-cache"),
);
}
CacheControl::Override(value) => {
req.headers_mut().insert(
http::header::CACHE_CONTROL,
http::HeaderValue::from_str(value)
.map_err(|_| ErrorKind::InvalidCacheControl(value.to_string()))?,
);
}
}
Ok(match cached.cache_policy.before_request(&mut req) {
BeforeRequest::Fresh => {
@ -499,8 +509,13 @@ impl CachedClient {
BeforeRequest::Stale(new_cache_policy_builder) => match cache_control {
CacheControl::None | CacheControl::MustRevalidate | CacheControl::Override(_) => {
debug!("Found stale response for: {}", req.url());
self.send_cached_handle_stale(req, cached, new_cache_policy_builder)
.await?
self.send_cached_handle_stale(
req,
cache_control,
cached,
new_cache_policy_builder,
)
.await?
}
CacheControl::AllowStale => {
debug!("Found stale (but allowed) response for: {}", req.url());
@ -513,7 +528,7 @@ impl CachedClient {
"Cached request doesn't match current request for: {}",
req.url()
);
let (response, cache_policy) = self.fresh_request(req).await?;
let (response, cache_policy) = self.fresh_request(req, cache_control).await?;
CachedResponse::ModifiedOrNew {
response,
cache_policy,
@ -525,12 +540,13 @@ impl CachedClient {
async fn send_cached_handle_stale(
&self,
req: Request,
cache_control: CacheControl<'_>,
cached: DataWithCachePolicy,
new_cache_policy_builder: CachePolicyBuilder,
) -> Result<CachedResponse, Error> {
let url = DisplaySafeUrl::from(req.url().clone());
debug!("Sending revalidation request for: {url}");
let response = self
let mut response = self
.0
.execute(req)
.instrument(info_span!("revalidation_request", url = url.as_str()))
@ -538,6 +554,16 @@ impl CachedClient {
.map_err(|err| ErrorKind::from_reqwest_middleware(url.clone(), err))?
.error_for_status()
.map_err(|err| ErrorKind::from_reqwest(url.clone(), err))?;
// If the user set a custom `Cache-Control` header, override it.
if let CacheControl::Override(header) = cache_control {
response.headers_mut().insert(
http::header::CACHE_CONTROL,
http::HeaderValue::from_str(header)
.expect("Cache-Control header must be valid UTF-8"),
);
}
match cached
.cache_policy
.after_response(new_cache_policy_builder, &response)
@ -566,16 +592,26 @@ impl CachedClient {
async fn fresh_request(
&self,
req: Request,
cache_control: CacheControl<'_>,
) -> Result<(Response, Option<Box<CachePolicy>>), Error> {
let url = DisplaySafeUrl::from(req.url().clone());
trace!("Sending fresh {} request for {}", req.method(), url);
let cache_policy_builder = CachePolicyBuilder::new(&req);
let response = self
let mut response = self
.0
.execute(req)
.await
.map_err(|err| ErrorKind::from_reqwest_middleware(url.clone(), err))?;
// If the user set a custom `Cache-Control` header, override it.
if let CacheControl::Override(header) = cache_control {
response.headers_mut().insert(
http::header::CACHE_CONTROL,
http::HeaderValue::from_str(header)
.expect("Cache-Control header must be valid UTF-8"),
);
}
let retry_count = response
.extensions()
.get::<reqwest_retry::RetryCount>()
@ -690,6 +726,7 @@ impl CachedClient {
&self,
req: Request,
cache_entry: &CacheEntry,
cache_control: CacheControl<'_>,
response_callback: Callback,
) -> Result<Payload, CachedClientError<CallBackError>> {
let mut past_retries = 0;
@ -698,7 +735,7 @@ impl CachedClient {
loop {
let fresh_req = req.try_clone().expect("HTTP request must be cloneable");
let result = self
.skip_cache(fresh_req, cache_entry, &response_callback)
.skip_cache(fresh_req, cache_entry, cache_control, &response_callback)
.await;
// Check if the middleware already performed retries

View file

@ -441,6 +441,26 @@ impl<'a> IndexLocations {
}
}
}
/// Return the Simple API cache control header for an [`IndexUrl`], if configured.
pub fn simple_api_cache_control_for(&self, url: &IndexUrl) -> Option<&str> {
for index in &self.indexes {
if index.url() == url {
return index.cache_control.as_ref()?.api.as_deref();
}
}
None
}
/// Return the artifact cache control header for an [`IndexUrl`], if configured.
pub fn artifact_cache_control_for(&self, url: &IndexUrl) -> Option<&str> {
for index in &self.indexes {
if index.url() == url {
return index.cache_control.as_ref()?.files.as_deref();
}
}
None
}
}
impl From<&IndexLocations> for uv_auth::Indexes {

View file

@ -20,7 +20,7 @@ use uv_client::{
};
use uv_distribution_filename::WheelFilename;
use uv_distribution_types::{
BuildableSource, BuiltDist, Dist, HashPolicy, Hashed, InstalledDist, Name, SourceDist,
BuildableSource, BuiltDist, Dist, HashPolicy, Hashed, IndexUrl, InstalledDist, Name, SourceDist,
};
use uv_extract::hash::Hasher;
use uv_fs::write_atomic;
@ -201,6 +201,7 @@ impl<'a, Context: BuildContext> DistributionDatabase<'a, Context> {
match self
.stream_wheel(
url.clone(),
dist.index(),
&wheel.filename,
wheel.file.size,
&wheel_entry,
@ -236,6 +237,7 @@ impl<'a, Context: BuildContext> DistributionDatabase<'a, Context> {
let archive = self
.download_wheel(
url,
dist.index(),
&wheel.filename,
wheel.file.size,
&wheel_entry,
@ -272,6 +274,7 @@ impl<'a, Context: BuildContext> DistributionDatabase<'a, Context> {
match self
.stream_wheel(
wheel.url.raw().clone(),
None,
&wheel.filename,
None,
&wheel_entry,
@ -301,6 +304,7 @@ impl<'a, Context: BuildContext> DistributionDatabase<'a, Context> {
let archive = self
.download_wheel(
wheel.url.raw().clone(),
None,
&wheel.filename,
None,
&wheel_entry,
@ -534,6 +538,7 @@ impl<'a, Context: BuildContext> DistributionDatabase<'a, Context> {
async fn stream_wheel(
&self,
url: DisplaySafeUrl,
index: Option<&IndexUrl>,
filename: &WheelFilename,
size: Option<u64>,
wheel_entry: &CacheEntry,
@ -616,13 +621,24 @@ impl<'a, Context: BuildContext> DistributionDatabase<'a, Context> {
// Fetch the archive from the cache, or download it if necessary.
let req = self.request(url.clone())?;
// Determine the cache control policy for the URL.
let cache_control = match self.client.unmanaged.connectivity() {
Connectivity::Online => CacheControl::from(
self.build_context
.cache()
.freshness(&http_entry, Some(&filename.name), None)
.map_err(Error::CacheRead)?,
),
Connectivity::Online => {
if let Some(header) = index.and_then(|index| {
self.build_context
.locations()
.artifact_cache_control_for(index)
}) {
CacheControl::Override(header)
} else {
CacheControl::from(
self.build_context
.cache()
.freshness(&http_entry, Some(&filename.name), None)
.map_err(Error::CacheRead)?,
)
}
}
Connectivity::Offline => CacheControl::AllowStale,
};
@ -654,7 +670,12 @@ impl<'a, Context: BuildContext> DistributionDatabase<'a, Context> {
.managed(async |client| {
client
.cached_client()
.skip_cache_with_retry(self.request(url)?, &http_entry, download)
.skip_cache_with_retry(
self.request(url)?,
&http_entry,
cache_control,
download,
)
.await
.map_err(|err| match err {
CachedClientError::Callback { err, .. } => err,
@ -671,6 +692,7 @@ impl<'a, Context: BuildContext> DistributionDatabase<'a, Context> {
async fn download_wheel(
&self,
url: DisplaySafeUrl,
index: Option<&IndexUrl>,
filename: &WheelFilename,
size: Option<u64>,
wheel_entry: &CacheEntry,
@ -783,13 +805,24 @@ impl<'a, Context: BuildContext> DistributionDatabase<'a, Context> {
// Fetch the archive from the cache, or download it if necessary.
let req = self.request(url.clone())?;
// Determine the cache control policy for the URL.
let cache_control = match self.client.unmanaged.connectivity() {
Connectivity::Online => CacheControl::from(
self.build_context
.cache()
.freshness(&http_entry, Some(&filename.name), None)
.map_err(Error::CacheRead)?,
),
Connectivity::Online => {
if let Some(header) = index.and_then(|index| {
self.build_context
.locations()
.artifact_cache_control_for(index)
}) {
CacheControl::Override(header)
} else {
CacheControl::from(
self.build_context
.cache()
.freshness(&http_entry, Some(&filename.name), None)
.map_err(Error::CacheRead)?,
)
}
}
Connectivity::Offline => CacheControl::AllowStale,
};
@ -821,7 +854,12 @@ impl<'a, Context: BuildContext> DistributionDatabase<'a, Context> {
.managed(async |client| {
client
.cached_client()
.skip_cache_with_retry(self.request(url)?, &http_entry, download)
.skip_cache_with_retry(
self.request(url)?,
&http_entry,
cache_control,
download,
)
.await
.map_err(|err| match err {
CachedClientError::Callback { err, .. } => err,

View file

@ -32,7 +32,7 @@ use uv_client::{
use uv_configuration::{BuildKind, BuildOutput, ConfigSettings, SourceStrategy};
use uv_distribution_filename::{SourceDistExtension, WheelFilename};
use uv_distribution_types::{
BuildableSource, DirectorySourceUrl, GitSourceUrl, HashPolicy, Hashed, PathSourceUrl,
BuildableSource, DirectorySourceUrl, GitSourceUrl, HashPolicy, Hashed, IndexUrl, PathSourceUrl,
SourceDist, SourceUrl,
};
use uv_extract::hash::Hasher;
@ -148,6 +148,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
self.url(
source,
&url,
Some(&dist.index),
&cache_shard,
None,
dist.ext,
@ -168,6 +169,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
self.url(
source,
&dist.url,
None,
&cache_shard,
dist.subdirectory.as_deref(),
dist.ext,
@ -213,6 +215,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
self.url(
source,
resource.url,
None,
&cache_shard,
resource.subdirectory,
resource.ext,
@ -288,9 +291,18 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
.await;
}
self.url_metadata(source, &url, &cache_shard, None, dist.ext, hashes, client)
.boxed_local()
.await?
self.url_metadata(
source,
&url,
Some(&dist.index),
&cache_shard,
None,
dist.ext,
hashes,
client,
)
.boxed_local()
.await?
}
BuildableSource::Dist(SourceDist::DirectUrl(dist)) => {
// For direct URLs, cache directly under the hash of the URL itself.
@ -302,6 +314,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
self.url_metadata(
source,
&dist.url,
None,
&cache_shard,
dist.subdirectory.as_deref(),
dist.ext,
@ -340,6 +353,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
self.url_metadata(
source,
resource.url,
None,
&cache_shard,
resource.subdirectory,
resource.ext,
@ -395,6 +409,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
&self,
source: &BuildableSource<'data>,
url: &'data DisplaySafeUrl,
index: Option<&'data IndexUrl>,
cache_shard: &CacheShard,
subdirectory: Option<&'data Path>,
ext: SourceDistExtension,
@ -406,7 +421,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
// Fetch the revision for the source distribution.
let revision = self
.url_revision(source, ext, url, cache_shard, hashes, client)
.url_revision(source, ext, url, index, cache_shard, hashes, client)
.await?;
// Before running the build, check that the hashes match.
@ -448,6 +463,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
source,
ext,
url,
index,
&source_dist_entry,
revision,
hashes,
@ -511,6 +527,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
&self,
source: &BuildableSource<'data>,
url: &'data Url,
index: Option<&'data IndexUrl>,
cache_shard: &CacheShard,
subdirectory: Option<&'data Path>,
ext: SourceDistExtension,
@ -521,7 +538,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
// Fetch the revision for the source distribution.
let revision = self
.url_revision(source, ext, url, cache_shard, hashes, client)
.url_revision(source, ext, url, index, cache_shard, hashes, client)
.await?;
// Before running the build, check that the hashes match.
@ -578,6 +595,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
source,
ext,
url,
index,
&source_dist_entry,
revision,
hashes,
@ -689,18 +707,31 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
source: &BuildableSource<'_>,
ext: SourceDistExtension,
url: &Url,
index: Option<&IndexUrl>,
cache_shard: &CacheShard,
hashes: HashPolicy<'_>,
client: &ManagedClient<'_>,
) -> Result<Revision, Error> {
let cache_entry = cache_shard.entry(HTTP_REVISION);
// Determine the cache control policy for the request.
let cache_control = match client.unmanaged.connectivity() {
Connectivity::Online => CacheControl::from(
self.build_context
.cache()
.freshness(&cache_entry, source.name(), source.source_tree())
.map_err(Error::CacheRead)?,
),
Connectivity::Online => {
if let Some(header) = index.and_then(|index| {
self.build_context
.locations()
.artifact_cache_control_for(index)
}) {
CacheControl::Override(header)
} else {
CacheControl::from(
self.build_context
.cache()
.freshness(&cache_entry, source.name(), source.source_tree())
.map_err(Error::CacheRead)?,
)
}
}
Connectivity::Offline => CacheControl::AllowStale,
};
@ -750,6 +781,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
.skip_cache_with_retry(
Self::request(DisplaySafeUrl::from(url.clone()), client)?,
&cache_entry,
cache_control,
download,
)
.await
@ -2056,6 +2088,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
source: &BuildableSource<'_>,
ext: SourceDistExtension,
url: &Url,
index: Option<&IndexUrl>,
entry: &CacheEntry,
revision: Revision,
hashes: HashPolicy<'_>,
@ -2063,6 +2096,28 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
) -> Result<Revision, Error> {
warn!("Re-downloading missing source distribution: {source}");
let cache_entry = entry.shard().entry(HTTP_REVISION);
// Determine the cache control policy for the request.
let cache_control = match client.unmanaged.connectivity() {
Connectivity::Online => {
if let Some(header) = index.and_then(|index| {
self.build_context
.locations()
.artifact_cache_control_for(index)
}) {
CacheControl::Override(header)
} else {
CacheControl::from(
self.build_context
.cache()
.freshness(&cache_entry, source.name(), source.source_tree())
.map_err(Error::CacheRead)?,
)
}
}
Connectivity::Offline => CacheControl::AllowStale,
};
let download = |response| {
async {
// Take the union of the requested and existing hash algorithms.
@ -2096,6 +2151,7 @@ impl<'a, T: BuildContext> SourceDistributionBuilder<'a, T> {
.skip_cache_with_retry(
Self::request(DisplaySafeUrl::from(url.clone()), client)?,
&cache_entry,
cache_control,
download,
)
.await