Add dry run option that includes stats #1079

Open
leondz opened this issue Jan 15, 2025 · 0 comments
Labels
architecture (Architectural upgrades), generators (Interfaces with LLMs)

Comments

@leondz
Collaborator

leondz commented Jan 15, 2025

Summary

Allow a dry run that reports stats on the number of generator calls, the number of unique prompts, and the total token count across those prompts.

This might be a CLI command that overrides the config to use a test.Tracking generator, which keeps track of how many prompts it sees and how many tokens those contain, and perhaps returns some modulated version of the input as output so that adaptive probes can still be estimated to some extent (a rough sketch follows below).
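
A base-class-agnostic sketch of the tracking logic, for discussion only: the class and method names (Tracking, generate, report), the whitespace-based token estimate, and the "[dry run]" echo modulation are all illustrative assumptions, not a settled design. In garak itself this would presumably live in the test generators module and do its counting inside the generator's model-call hook.

```python
# Illustrative sketch only: names, token estimation, and the echo behaviour
# are assumptions, not an agreed design for test.Tracking.
from dataclasses import dataclass, field
from typing import List, Set


@dataclass
class TrackingStats:
    calls: int = 0                          # number of generator calls seen
    unique_prompts: Set[str] = field(default_factory=set)
    token_estimate: int = 0                 # crude token total across all prompts


class Tracking:
    """Dry-run stand-in for a real generator: records stats and echoes a
    modulated copy of each prompt so adaptive probes still get output."""

    def __init__(self):
        self.stats = TrackingStats()

    def _estimate_tokens(self, text: str) -> int:
        # whitespace split as a placeholder; a real tokenizer could be plugged in
        return len(text.split())

    def generate(self, prompt: str, generations: int = 1) -> List[str]:
        self.stats.calls += 1
        self.stats.unique_prompts.add(prompt)
        self.stats.token_estimate += self._estimate_tokens(prompt)
        # return a lightly modulated echo instead of a real completion
        return [f"[dry run] {prompt}"] * generations

    def report(self) -> dict:
        return {
            "calls": self.stats.calls,
            "unique_prompts": len(self.stats.unique_prompts),
            "estimated_prompt_tokens": self.stats.token_estimate,
        }
```

A dry-run CLI path could then swap something like this in for the configured generator and print the report() output at the end of the run.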

@leondz added the architecture (Architectural upgrades) and generators (Interfaces with LLMs) labels on Jan 15, 2025
@leondz added this to the 25.02 Efficiency milestone on Jan 15, 2025