
What does knowing Y tell us about X?
This Y↔X relationship is quantified by the mutual information I(X; Y):
the "reduction in uncertainty about X given knowledge of Y".
There are at least two approaches to deriving the mutual information:

- Shannon's approach (entropy algebra): I(X; Y) = H(X) − H(X|Y)
- Statistical approach (probability distributions)
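Shannon's approach can be checked numerically. The sketch below, using a small joint distribution chosen purely for illustration, computes H(X) and H(X|Y) from a joint table and takes their difference:

```python
import math

# Illustrative joint distribution p(x, y) for two binary variables
# (rows index x, columns index y); the numbers are an assumption, not from the text.
p = [[0.4, 0.1],
     [0.1, 0.4]]

px = [sum(row) for row in p]                             # marginal p(x)
py = [sum(p[i][j] for i in range(2)) for j in range(2)]  # marginal p(y)

def H(dist):
    """Shannon entropy in bits."""
    return -sum(q * math.log2(q) for q in dist if q > 0)

# H(X|Y) = sum_j p(y_j) * H(X | Y = y_j)
H_X_given_Y = sum(
    py[j] * H([p[i][j] / py[j] for i in range(2)])
    for j in range(2)
)

I = H(px) - H_X_given_Y   # I(X; Y) = H(X) - H(X|Y)
print(round(I, 6))        # ≈ 0.278072 bits
```

Knowing Y here removes about 0.28 of the 1 bit of uncertainty in X.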
Below, the statistical approach is used to derive I(X; Y), which also shows that I(X; Y) = I(Y; X).
Since mutual information is the relative entropy between the joint distribution p(xi, yj) and the product distribution p(xi) p(yj),

$$ I(X; Y) = \sum_i \sum_j p(x_i, y_j)\, \log \frac{p(x_i, y_j)}{p(x_i)\, p(y_j)} $$
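The statistical expression can be evaluated directly as a sum over the joint table. A minimal sketch, assuming the same kind of small illustrative joint distribution as above (the numbers are not from the text):

```python
import math

# I(X; Y) as relative entropy between p(x, y) and p(x) p(y)
p = [[0.4, 0.1],
     [0.1, 0.4]]
px = [sum(row) for row in p]                             # marginal p(x)
py = [sum(p[i][j] for i in range(2)) for j in range(2)]  # marginal p(y)

I = sum(
    p[i][j] * math.log2(p[i][j] / (px[i] * py[j]))
    for i in range(2) for j in range(2)
    if p[i][j] > 0   # 0 log 0 terms are taken as 0
)
print(round(I, 6))   # ≈ 0.278072 bits, matching H(X) - H(X|Y)
```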
Thus, since this expression is symmetric in X and Y,

$$ I(X; Y) = \sum_j \sum_i p(y_j, x_i)\, \log \frac{p(y_j, x_i)}{p(y_j)\, p(x_i)} = I(Y; X) $$
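The symmetry can be verified numerically by transposing the joint table (an illustrative distribution, not from the text) and comparing:

```python
import math

def mutual_info(p):
    """I(X; Y) from a joint table: sum of p log2( p / (p_row * p_col) )."""
    rows, cols = len(p), len(p[0])
    pr = [sum(p[i][j] for j in range(cols)) for i in range(rows)]  # p(x)
    pc = [sum(p[i][j] for i in range(rows)) for j in range(cols)]  # p(y)
    return sum(
        p[i][j] * math.log2(p[i][j] / (pr[i] * pc[j]))
        for i in range(rows) for j in range(cols) if p[i][j] > 0
    )

p = [[0.3, 0.2],
     [0.1, 0.4]]                                       # illustrative p(x, y)
pT = [[p[i][j] for i in range(2)] for j in range(2)]   # transposed: p(y, x)
print(abs(mutual_info(p) - mutual_info(pT)) < 1e-12)   # True: I(X;Y) = I(Y;X)
```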
Partial mutual information
From the above (statistical) expression of I(X; Y), the partial mutual information between a pair of events is

$$ I(x_i; y_j) = \log \frac{p(x_i, y_j)}{p(x_i)\, p(y_j)} $$

so that I(X; Y) is the average of the partial values, weighted by the joint probabilities:

$$ I(X; Y) = \sum_i \sum_j p(x_i, y_j)\, I(x_i; y_j) $$
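A sketch of the partial values and their weighted average, again on an assumed illustrative joint table:

```python
import math

# Partial (per-event) mutual information I(x_i; y_j) and its average
p = [[0.4, 0.1],
     [0.1, 0.4]]
px = [sum(row) for row in p]                             # marginal p(x)
py = [sum(p[i][j] for i in range(2)) for j in range(2)]  # marginal p(y)

# I(x_i; y_j) = log2( p(x_i, y_j) / (p(x_i) p(y_j)) )
partial = [[math.log2(p[i][j] / (px[i] * py[j])) for j in range(2)]
           for i in range(2)]

# Weighting the partial values by p(x_i, y_j) recovers I(X; Y)
I = sum(p[i][j] * partial[i][j] for i in range(2) for j in range(2))
print(round(I, 6))   # ≈ 0.278072
```

Note that individual partial values can be negative (when an event pair is less likely jointly than under independence), but their weighted average, I(X; Y), is always non-negative.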