## Summary

This seems to be one of the most consistent benchmark cases we have in terms of standard deviation:

```
➜ hyperfine "target/profiling/main pip compile scripts/requirements/airflow.in" --runs 200
Benchmark 1: target/profiling/main pip compile scripts/requirements/airflow.in
  Time (mean ± σ):     292.6 ms ±   6.6 ms    [User: 414.1 ms, System: 194.2 ms]
  Range (min … max):   282.7 ms … 320.1 ms    200 runs
```

For smaller benchmarks, scispacy and dtlssocket seem to be a bit more consistent than our current jupyter benchmark, but the jupyter benchmark hasn't given us any problems, so I'll leave it as-is for now.
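For reference, σ/mean in the run above is about 6.6 / 292.6 ≈ 2.3%, which is what makes airflow.in attractive as a primary benchmark. A minimal sketch of how that run could be reproduced is below; the `profiling` cargo profile and the `main` binary name are inferred from the `target/profiling/main` path in the output and may differ in your checkout:

```
# Build an optimized binary first; the `profiling` profile is assumed from
# the `target/profiling/` path above.
cargo build --profile profiling

# Benchmark `pip compile` against the airflow requirements, matching the
# run count from the summary. `--warmup 3` is an optional addition to let
# caches settle before measuring.
hyperfine --runs 200 --warmup 3 \
  "target/profiling/main pip compile scripts/requirements/airflow.in"
```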
Contents of `scripts/requirements/`:

- `compiled/`
- `airflow.in`
- `all-kinds.in`
- `bio_embeddings.in`
- `black.in`
- `boto3.in`
- `dtlssocket.in`
- `flyte.in`
- `home-assistant.in`
- `jupyter.in`
- `meine_stadt_transparent.in`
- `pdm_2193.in`
- `pydantic.in`
- `scispacy.in`
- `slow.in`
- `transformers-extras.in`
- `trio.in`