###############################################################################
# Entropy baseline
#
# Madhavun Candadai
# Dec, 2018
#
# Entropy of a coin flip for different probabilities of HEADS ranging from 0 to
# 1 should give an inverted-U shaped curve
###############################################################################
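As a rough, plain-numpy sketch of what this baseline computes (it does not use the infotheory API; the grid size and variable names are illustrative), the binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p) can be swept over P(HEADS) from 0 to 1:

import numpy as np

def coin_flip_entropy(p_heads):
    """Entropy (in bits) of a coin with P(HEADS) = p_heads."""
    probs = np.array([p_heads, 1.0 - p_heads])
    probs = probs[probs > 0]  # treat 0 * log(0) as 0
    return -np.sum(probs * np.log2(probs))

# Sweep P(HEADS) from 0 to 1; the curve is an inverted U, peaking at
# 1 bit when p = 0.5 and dropping to 0 at p = 0 and p = 1.
p_values = np.linspace(0.0, 1.0, 101)
entropies = [coin_flip_entropy(p) for p in p_values]
print(entropies[0], entropies[50], entropies[-1])  # 0.0, 1.0, 0.0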
//**************************************************************************//
// Infotheory demo to estimate MI between two 2D random vars | |
// https://github.com/madvn/infotheory | |
// | |
// C++ demo | |
// | |
// Madhavun Candadai | |
// Nov 2018 | |
// | |
//**************************************************************************//
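The header above belongs to the C++ demo; as a hedged illustration of the same quantity in plain numpy (not the Infotheory C++ API), the sketch below bins each 2D variable into joint bins and estimates I(X; Y) from the resulting histogram. The bin count, sample size, and correlated test data are assumptions made for the example:

import numpy as np

def discretize_2d(var, bins=10):
    """Map each 2D sample to a single bin index by binning both components."""
    edges = [np.linspace(var[:, d].min(), var[:, d].max(), bins + 1) for d in range(2)]
    idx = [np.digitize(var[:, d], edges[d][1:-1]) for d in range(2)]
    return idx[0] * bins + idx[1]

def mutual_info_discrete(x_ids, y_ids):
    """I(X;Y) in bits from two aligned vectors of discrete labels."""
    joint = np.zeros((x_ids.max() + 1, y_ids.max() + 1))
    for xi, yi in zip(x_ids, y_ids):
        joint[xi, yi] += 1
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

rng = np.random.default_rng(0)
x = rng.uniform(size=(5000, 2))                 # first 2D variable
y = x + rng.normal(scale=0.05, size=x.shape)    # noisy copy -> high MI
print(mutual_info_discrete(discretize_2d(x), discretize_2d(y)))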
###############################################################################
# Mutual Information demo for Python package - infotheory | |
# https://github.com/madvn/infotheory | |
# | |
# Madhavun Candadai | |
# Dec, 2018 | |
# | |
# Mutual information should be high for identical variables, slightly lower for | |
# noisy identical variables and low for random variables | |
###############################################################################
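A minimal plain-numpy illustration of the ordering described above, using a simple histogram MI estimator rather than the infotheory package itself; the sample size, noise level, and bin count are illustrative assumptions:

import numpy as np

def mutual_info_bits(x, y, bins=20):
    """Histogram estimate of I(X;Y) in bits for two 1D samples."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

rng = np.random.default_rng(1)
x = rng.uniform(size=20000)

identical = x                                    # identical variable
noisy = x + rng.normal(scale=0.1, size=x.size)   # noisy copy
unrelated = rng.uniform(size=x.size)             # independent random variable

# Expected ordering: identical > noisy > unrelated (the last is near zero).
for name, y in [("identical", identical), ("noisy", noisy), ("unrelated", unrelated)]:
    print(name, round(mutual_info_bits(x, y), 3))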