uncertainty quantification #10

Open
jyang68sh opened this issue Jan 27, 2022 · 4 comments

Comments

@jyang68sh

Hi nice paper.

I was wondering from the methods, could you get a quantified number of uncertainty for the model output?

@nikitadurasov
Owner

Hey @jyang68sh !

Sure, the whole method is designed to produce uncertainties for a model's outputs, and to do so better than MC-Dropout (and on par with Ensembles).

You can simply drop in Masksembles layers in place of Dropout layers and get uncertainties. Please refer to our examples for more information: link
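
For instance, here is a minimal Keras sketch of that swap (assuming the Masksembles1D/2D(n, scale) constructors exposed by masksembles.keras; the architecture itself is purely illustrative):

import tensorflow as tf
from masksembles.keras import Masksembles1D, Masksembles2D

n = 4  # number of submodels; the training batch size should be divisible by n

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    Masksembles2D(n, 2.0),   # placed where a Dropout layer would normally go
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.Flatten(),
    Masksembles1D(n, 2.0),   # likewise replaces Dropout before the classifier
    tf.keras.layers.Dense(10, activation="softmax"),
])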

Best,
Nikita

@jyang68sh
Author

jyang68sh commented Feb 9, 2022

@nikitadurasov
Sorry, I got a bit confused.

To acquire predictions from the different submodels, one should transform the input (with shape [1, H, W, C]) into a batch (with shape [M, H, W, C]) that consists of M copies of the original input (H: image height, W: image width, C: number of channels).
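
(For reference, a minimal sketch of that tiling step in TensorFlow; model stands in for the trained Masksembles network from the example:)

import tensorflow as tf

x = tf.random.uniform([1, 32, 32, 3])   # single input, shape [1, H, W, C]
M = 4                                   # number of submodels
batch = tf.tile(x, [M, 1, 1, 1])        # M identical copies, shape [M, H, W, C]
predictions = model(batch)              # row i holds submodel i's prediction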

As we can see, Masksembles submodels produce similar predictions for training set samples.

From the given example, I can see that Masksembles produces submodels that give similar predictions, but how can I get the uncertainty of the model's outputs?

@nikitadurasov
Owner

@jyang68sh

For easy or in-distribution samples, Masksembles should indeed produce similar predictions (because our model is confident about them).

E.g., in the notebook above you could run

entropies = -tf.reduce_sum(predictions * tf.math.log(predictions), axis=1)  # predictive entropy of each of the 4 submodels (predictions has shape [num_submodels, num_classes])
uncertainty = tf.reduce_mean(entropies)  # average the submodel entropies to get the final uncertainty

after the last cell to get the final uncertainty for your image. The entropies tensor stores the entropy of each individual submodel, and the final uncertainty is obtained by aggregating those entropies (here, by averaging them).
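
(Not from the notebook, but another common option is to measure the disagreement between the submodels directly, e.g. the variance of their softmax outputs across the M copies:)

mean_prediction = tf.reduce_mean(predictions, axis=0)                        # ensemble-averaged class probabilities
disagreement = tf.reduce_mean(tf.math.reduce_variance(predictions, axis=0))  # spread of the submodels' predictions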

For more information about methods for turning the predictions of several models into an uncertainty estimate, please refer to this paper.

Hope that helps!

@jyang68sh
Author

jyang68sh commented Feb 10, 2022

@nikitadurasov
Thanks for the quick response. Now I see that you have already described the process in the paper. Sorry for the confusion.

I just have a quick question. Masksembles can be used in segmentation methods too, right? But some segmentation methods, e.g. STDC, have both a classification head and a detail head.

Can Masksembles be used in a non-classification head?

Thanks!
