Implement fixture system for benchmarks similar to pytest #170

Closed
2 of 4 tasks
nicholasjng opened this issue Nov 21, 2024 · 0 comments · Fixed by #182
nicholasjng commented Nov 21, 2024

Instead of the current solution of parametrization via default arguments:

import nnbench


@nnbench.benchmark
def add(a: int = 1, b: int = 2) -> int:
    return a + b


@nnbench.benchmark
def mul(a: int = 1, b: int = 2) -> int:
    return a * b

I would like to offer at least partial parametrization of benchmarks via fixtures, meaning that during parameter hydration in the main benchmark loop, we resolve fixture values and place them into the parameters.

Fixtures should also be sourced from a Python config file like conftest.py. For simplicity, we can do the first iteration with a single, global file.

Similarly, scopes do not have to be supported immediately, maybe a session-wide scope works for now.

Design:

# conf.py (name TBD)
import nnbench
import numpy as np

@nnbench.fixture
def model() -> MyModel:
    return MyModel.load("path/to/my_model.npz")


@nnbench.fixture
def validation_data() -> np.ndarray:
    return np.load("path/to/my_val_data.npz")

----------------------------------------------

# meanwhile in benchmark.py:
@nnbench.benchmark
def accuracy(model: MyModel, data: np.ndarray) -> float:
    X, y = data["X"], data["y"]
    return float(np.mean(model(X) == y))  # fraction of correct predictions

Now, running nnbench run benchmark.py should make nnbench go into conf.py, instantiate the fixture values, and add them to the parameters.
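The resolution step during parameter hydration could look roughly like this. A hypothetical sketch: `hydrate_params` and the toy `data`/`total` functions are illustration only, standing in for the fixtures loaded from conf.py and a real benchmark:

```python
import inspect


def hydrate_params(benchmark_fn, fixtures: dict) -> dict:
    """Build the parameter dict for a benchmark: fixture values take
    precedence, default arguments fill the remaining parameters."""
    params = {}
    for name, param in inspect.signature(benchmark_fn).parameters.items():
        if name in fixtures:
            params[name] = fixtures[name]()  # resolve the fixture value
        elif param.default is not inspect.Parameter.empty:
            params[name] = param.default  # fall back to the default argument
    return params


# Stand-ins for a fixture from conf.py and a benchmark from benchmark.py:
def data() -> list:
    return [1, 2, 3]


def total(data: list, scale: int = 2) -> int:
    return sum(data) * scale


params = hydrate_params(total, {"data": data})
print(total(**params))  # 12
```

Here the `data` parameter is hydrated from the fixture while `scale` keeps its default, i.e. the partial parametrization described above.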

Requirements:

  • Add a fixture decorator wrapping a callable, import into top-level package as nnbench.fixture.
  • In the benchmark loop, import from hard-coded path, either conf.py if the input path to nnbench run <path> is a file, or dir/conf.py if the input is a directory.
  • Give an example of advanced configuration in fixtures, maybe via envvars.
  • Document the fixture facility.
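For the advanced-configuration bullet, a fixture body can simply read environment variables. A sketch: the `NNBENCH_MODEL_PATH` variable name is made up for illustration, and in conf.py the function would additionally carry the nnbench.fixture decorator:

```python
import os


def model_path() -> str:
    """Resolve the model path from the environment, with a fallback.

    NNBENCH_MODEL_PATH is a hypothetical variable name; the fallback
    path mirrors the design sketch above.
    """
    return os.environ.get("NNBENCH_MODEL_PATH", "path/to/my_model.npz")


os.environ["NNBENCH_MODEL_PATH"] = "/tmp/override.npz"
print(model_path())  # /tmp/override.npz
```

This keeps fixtures plain Python: no extra configuration machinery is needed, and CI can swap in different models or datasets by setting the variable before invoking nnbench.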
@nicholasjng nicholasjng linked a pull request Dec 3, 2024 that will close this issue