
Question on Mutual Information MI() #38

Open
jonnyschaefer opened this issue Dec 29, 2024 · 1 comment

Comments

@jonnyschaefer

Hello,

I'm having trouble understanding how MI(x, y, xy, unit = "log2") is intended to be used.

The documentation states:

x: a numeric probability vector P(X).
y: a numeric probability vector P(Y).
xy: a numeric joint-probability vector P(X,Y).

How can xy be a vector?
If I understand mutual information correctly, the joint probability should be a 2D matrix, so that xy[i, j] gives the joint probability P(X = i, Y = j), right?

https://en.wikipedia.org/wiki/Mutual_information#In_terms_of_PMFs_for_discrete_distributions

@HajkD
Member

HajkD commented Jan 6, 2025

Hi @jonnyschaefer

Yes, there are two ways the MI() function could have been implemented.

  • Version one: MI() takes the joint probability P(X,Y) as a 2D matrix and internally converts it to a joint-probability vector through indexing.

  • Version two (which is what I implemented): MI() takes the joint probability P(X,Y) directly as a probability vector, after the user has converted their 2D matrix { xy[i,j] } into a joint-probability vector themselves.

I chose version two to give users maximum flexibility on how to define their joint probabilities.
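To make the version-two convention concrete, here is a minimal sketch (in Python, with made-up probabilities) showing that flattening the 2D joint-probability matrix into a vector loses no information: the mutual information computed from the matrix and from the flattened vector agree, as long as the product of marginals is flattened in the same order. The sketch only illustrates the convention; it is not the internals of the R MI() function.

```python
import numpy as np

# Joint probability P(X, Y) as a 2D matrix (rows: X, cols: Y) -- made-up values.
pxy = np.array([[0.25, 0.25],
                [0.10, 0.40]])

# Marginals from the joint distribution.
px = pxy.sum(axis=1)   # P(X)
py = pxy.sum(axis=0)   # P(Y)

# MI from the 2D matrix, following the PMF definition (base-2 log).
mi_matrix = sum(
    pxy[i, j] * np.log2(pxy[i, j] / (px[i] * py[j]))
    for i in range(pxy.shape[0])
    for j in range(pxy.shape[1])
    if pxy[i, j] > 0
)

# Equivalent computation after flattening: the user passes x = P(X),
# y = P(Y), and xy = the joint probabilities as a plain vector.
xy = pxy.flatten()                 # row-major: entry k corresponds to (i, j)
pxpy = np.outer(px, py).flatten()  # product of marginals, same ordering
mi_vector = float(np.sum(xy * np.log2(xy / pxpy)))

print(mi_matrix, mi_vector)
```

The only requirement is that the flattening order of xy matches how the user interprets the pairs (i, j); any consistent ordering gives the same result.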

I hope this clarifies the intended usage.

Any concrete suggestions for improvement are always welcome.

With many thanks and very best wishes,
Hajk
