Combining toys from several batch jobs #1948

Answered by kratsg
madbaron asked this question in Q&A
This is a really good question, and it prompted me to refresh myself on what we currently do in pyhf. With v0.6.3, you should be able to do something like this:

```python
# all the same code from your example ...

from functools import partial
from multiprocessing import Pool

from numpy import random

def job(calculator, seed=0):
    # seed each worker differently so its toys are independent
    random.seed(seed)
    return calculator.distributions(0.0)

calculator = pyhf.infer.utils.create_calculator(
    calc_type, data, model, test_stat="q0", track_progress=False, **kwargs
)
test_stat = calculator.teststatistic(0.0)

# note: a bare lambda can't be pickled by multiprocessing, so bind the
# calculator with functools.partial instead; each worker gets its own seed
with Pool(processes=4) as pool:
    res = pool.map(partial(job, calculator), range(10))

from pyhf.infer.calculators import EmpiricalDistribution
sig_plus_bkg_dist = E…
```
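The answer is truncated above, but the underlying idea of pooling the per-job toys into a single empirical distribution can be sketched with plain NumPy (the function and variable names below are illustrative, not pyhf API; pyhf's `EmpiricalDistribution` wraps the same kind of logic around its tensor backends):

```python
import numpy as np

def empirical_pvalue(test_stat, samples):
    """Fraction of toy test statistics at least as extreme as the observed one."""
    samples = np.asarray(samples)
    return np.count_nonzero(samples >= test_stat) / samples.size

# hypothetical per-job results: each batch job returns an array of toy test stats
job_samples = [np.array([0.1, 0.5, 2.3]), np.array([0.2, 1.7, 1.2])]

# pool all jobs into one empirical distribution before computing p-values
pooled = np.concatenate(job_samples)

p = empirical_pvalue(1.0, pooled)  # 3 of 6 toys are >= 1.0 -> 0.5
```

Because the toys are i.i.d. across jobs (given distinct seeds), concatenating the samples first and computing the p-value once is equivalent to having run all the toys in a single job.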

Replies: 1 comment 1 reply

Answer selected by madbaron