feat: benchmarking (#999)

* feat: add benchmarking dashboard, CI hook on PR, and store lifetime results

* refactor: change python env to 3.13 in benchmarks

* refactor: add verbosity, use 3.11 for benchmarking

* fix: OSError: [Errno 7] Argument list too long

* refactor: add debug statements

* refactor: remove extraneous -e

* refactor: fix tests and linter errors

* fix: track main package in coverage

* refactor: fix test coverage testing

* refactor: fix repo owner name in benchmark on pushing comment

* refactor: add asv monkeypatch to docs workflow

* refactor: temporarily allow building docs in forks

* refactor: use py 3.13 for benchmarking

* refactor: run only a single benchmark for PRs to speed them up

* refactor: install asv in the docs build workflow

* refactor: use hatch docs env to generate benchmarks in docs CI

* refactor: further attempts to fix the benchmarks docs build

* refactor: move tests

* Add benchmark results for 0.137

* Trigger Build

* Add benchmark results for 0.138

* refactor: set constant machine name when benchmarking

* Add benchmark results for 0.139

* refactor: fix issue with paths too long

* Add benchmark results for 0.140

* docs: update comment

* refactor: remove test benchmarking data

* refactor: fix comment

* refactor: allow the benchmark workflow to write to PRs

* refactor: use personal access token to set up the PR benchmark bot

* refactor: split the benchmark PR flow into two to make it work with PRs from forks

* refactor: update deprecated actions/upload-artifact@v3 to v4

* refactor: fix missing directory in benchmarking workflow

* refactor: fix triggering of second workflow

* refactor: fix workflow finally?

* docs: add comments to cut-offs and direct people to benchmarks PR

---------

Co-authored-by: github-actions <github-actions@github.com>
Author: Juro Oravec, 2025-02-23 16:18:57 +01:00 (committed by GitHub)
Commit: f36581ed86 (parent: dcd4203eea)
90 changed files with 40817 additions and 443 deletions

@@ -0,0 +1,29 @@
from asv_runner.benchmarks.timeraw import TimerawBenchmark, _SeparateProcessTimer


# Fix for https://github.com/airspeed-velocity/asv_runner/pull/44
def _get_timer(self, *param):
    """
    Returns a timer that runs the benchmark function in a separate process.

    #### Parameters
    **param** (`tuple`)
    : The parameters to pass to the benchmark function.

    #### Returns
    **timer** (`_SeparateProcessTimer`)
    : A timer that runs the function in a separate process.
    """
    if param:
        def func():
            # ---------- OUR CHANGES: ADDED RETURN STATEMENT ----------
            return self.func(*param)
            # ---------- OUR CHANGES END ----------
    else:
        func = self.func
    return _SeparateProcessTimer(func)


TimerawBenchmark._get_timer = _get_timer
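
For context, below is a minimal sketch (not part of this commit) of a parameterized raw-timing benchmark that depends on the patched return statement: without it, the code string produced by self.func(*param) would be discarded instead of being handed to _SeparateProcessTimer. The class, method, and parameter names in the sketch are hypothetical.

# Hypothetical benchmark module (e.g. benchmarks/bench_example.py);
# all names here are illustrative only.
class RenderSuite:
    # asv runs the benchmark once for each value in `params`.
    params = [10, 1000]
    param_names = ["n_items"]

    def timeraw_build_list(self, n_items):
        # A `timeraw_*` benchmark returns a string of source code; asv
        # executes and times that code in a fresh subprocess. The patched
        # `return self.func(*param)` above is what forwards this string
        # for parameterized benchmarks.
        return (
            f"items = [str(i) for i in range({n_items})]\n"
            '"".join(items)\n'
        )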