
benchmarks/tracking_performances: implement CI #29

Merged: 3 commits merged into master from pr/tracking_performances_ci on Jul 6, 2024

Conversation

@veprbl (Member) commented Jun 30, 2024

This enables tracking_performances as a benchmark running on eicweb. This also adds a Snakefile that can be used interactively. The simulations produced should match the blueprint of the official productions, so that the workflow can be extended to running over campaigns next.
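As a rough illustration of what such an on-the-fly simulation rule could look like, here is a minimal Snakemake sketch. The rule name, output path, and event count are placeholders rather than the PR's actual Snakefile contents; epic_tracking_only.xml and the single-pion gun match what is discussed later in this thread.

```snakemake
# Hypothetical sketch only: rule/file names and the event count are placeholders,
# not the actual Snakefile added by this PR.
rule tracking_performances_sim:
    output:
        "sim_output/pi-_{momentum}GeV.edm4hep.root",
    shell:
        """
ddsim --compactFile $DETECTOR_PATH/epic_tracking_only.xml \
  --numberOfEvents 10000 \
  --enableGun \
  --gun.particle pi- \
  --gun.momentumMin "{wildcards.momentum}*GeV" \
  --gun.momentumMax "{wildcards.momentum}*GeV" \
  --outputFile {output}
"""
```

Interactive use would then be a plain Snakemake invocation requesting a target, e.g. `snakemake --cores 2 sim_output/pi-_5GeV.edm4hep.root` (assuming such a target path).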

@veprbl force-pushed the pr/tracking_performances_ci branch from 1207f8d to 6c3f0ff on June 30, 2024 00:24
@veprbl force-pushed the pr/tracking_performances_ci branch from 6c3f0ff to 399b4c9 on June 30, 2024 00:27
@Simple-Shyam (Contributor) commented:

> This enables tracking_performances as a benchmark running on eicweb. This also adds a Snakefile that can be used interactively. The simulations produced should match the blueprint of the official productions, so that the workflow can be extended to running over campaigns next.

Thanks @veprbl for this work. Is it running? I want to see the output. From Matt's presentation (simulation campaign) and mine (local simulation with the latest tag) in the tracking meeting, you can see that each simulation campaign produces different performances. The idea is that when something major changes, we can run with the latest tag and check the performances; if there is some strange feature, I can easily spot it. You could also make epic_tracking_only.xml the default everywhere with the latest tag. We could even print the graph into the log file if required; then we can match the numbers (momentum resolutions), and they should be the same.

@veprbl (Member, Author) commented Jul 3, 2024

Hi @Simple-Shyam, this is indeed running now. You can check plots from the latest run at https://eicweb.phy.anl.gov/EIC/benchmarks/detector_benchmarks/-/jobs/3474456/artifacts/browse/results/tracking_performances/ . For now, everything uses the latest container and simulations are only generated on the fly; the campaign processing is not implemented in CI anywhere yet. I still need to figure out a good procedure for that. Yes, epic_tracking_only.xml is currently used. Also, regarding benchmarks/tracking_performances/Script_widebin.sh: we don't need it anymore, as all of its logic is now implemented in the Snakefile. We can remove your shell script, or keep it for your convenience (we don't want to force you to switch to Snakemake for your work if you don't want to!). Regarding testing, I think @zsweger figured out in eic/physics_benchmarks#3 that you can write a special JSON file to declare numerical variables and tolerances for them. We could start implementing this for detector benchmarks too.
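A minimal Python sketch of that tolerance idea follows; the field names are assumptions, and the actual schema used in eic/physics_benchmarks#3 may differ.

```python
import json

def check_tolerances(declarations, measured):
    """Return entries whose measured value deviates beyond the declared tolerance."""
    failures = []
    for entry in declarations:
        got = measured[entry["name"]]
        if abs(got - entry["value"]) > entry["tolerance"]:
            failures.append((entry["name"], got, entry["value"], entry["tolerance"]))
    return failures

# Hypothetical declaration-file contents; the field names are assumptions, not
# the schema actually used in eic/physics_benchmarks#3.
decls = json.loads('[{"name": "mom_resol_eta0_5GeV", "value": 0.004, "tolerance": 0.001}]')
print(check_tolerances(decls, {"mom_resol_eta0_5GeV": 0.0041}))  # [] -> within tolerance
```

CI would then fail the job whenever the returned list is non-empty.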

@Simple-Shyam (Contributor) commented:

Thanks @veprbl for this task. It's very nice that I can see the performances now. You can see a decrease at low momentum (nonphysical) and at higher eta, which I am trying to understand. I also shared the code with Torri, which works with the simulation campaign (automatically). Will you remove that script, or should I? For the numbers, if you want, we can print both graphs (truth and real, calling graph->Print()) into a text or log file, which can be compared directly with some reference values visually; alternatively, we could report the difference as a percentage, or a JSON file is also fine.
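For the graph-dump idea, here is a PyROOT sketch; the file path and graph names are made up for illustration, while TGraph::Print() itself simply writes the points to stdout, which the CI job log captures.

```python
# Sketch only: the file path and graph names below are hypothetical.
import ROOT

f = ROOT.TFile.Open("results/tracking_performances/mom_resol.root")
for name in ("gr_resol_truth", "gr_resol_real"):
    gr = f.Get(name)
    gr.Print()  # dumps the (x, y) points, so logs from two runs can be diffed
f.Close()
```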

@Simple-Shyam (Contributor) commented:

I have a further comment about the plots: why did you remove the debug plots directory? If some strange behavior is seen in the tracking performance plots, I go to the one-dimensional Gaussian fits (debug directory) to check whether things are fine. Also, can we increase the number of events (to 50k, just asking?) so that we can see the decreasing behavior at large eta with smaller uncertainties?
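On the event count: if the Snakefile exposed it through Snakemake's built-in config mechanism (an assumption; this thread does not show the actual Snakefile), a 50k-event run could be requested without editing the workflow:

```snakemake
# Hypothetical: assumes the Snakefile reads an "n_events" key from Snakemake's
# built-in `config` dict, with a default for ordinary CI runs.
N_EVENTS = int(config.get("n_events", 10000))
```

which would then be overridden on the command line as `snakemake --cores 2 --config n_events=50000 <target>`.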

@Simple-Shyam merged commit 6b06711 into master on Jul 6, 2024; 2 checks passed.
@veprbl (Member, Author) commented Jul 6, 2024

I didn't remove the debug directory. If you look at the last Snakemake rule, it aggregates outputs into the results/ directory for the CI artifact; there we could implement placing the debug directories as well. In local runs they are always produced at your path (Debug_Plots/).
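A minimal sketch of how that last rule could also carry the debug output into the artifact; the rule and input names other than Debug_Plots/ and results/ are placeholders, not the PR's actual code.

```snakemake
# Hypothetical Snakemake sketch, not the aggregation rule actually in the PR.
rule tracking_performances_collect:
    input:
        plots="Final_Results/",  # placeholder name for the summary plots
        debug="Debug_Plots/",    # the per-bin Gaussian-fit plots
    output:
        directory("results/tracking_performances/"),
    shell:
        """
mkdir -p {output}
cp -r {input.plots} {input.debug} {output}/
"""
```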

@veprbl deleted the pr/tracking_performances_ci branch on July 6, 2024 at 17:52.
@Simple-Shyam (Contributor) commented:

Thank you, Dmitry. This will be helpful for me. I need to check and understand the decrease at low momentum and at large eta.
I will also push the code for DCAT and DCAZ resolution soon; then I won't need to run locally myself and can present these plots directly in the tracking meeting.
