Minimizing and Maximizing the Shannon Entropy for Fixed Marginals
Paula Franke, Kay Hamacher, Paul Manns
Published: 2025/9/5
Abstract
The mutual information (MI) between two random variables is an important correlation measure in data analysis. Under fixed marginals, the Shannon entropy of the joint probability distribution is the only variable part of the MI. We minimize and maximize this entropy to obtain the smallest and largest MI attainable for the given marginals, which yields a scaled MI ratio with better comparability across datasets. We present algorithmic approaches and optimal solutions for a set of problem instances based on data from molecular evolution, and we show that this allows us to construct a sensible, systematic correction to raw MI values.
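The quantities discussed in the abstract can be illustrated with a minimal sketch. The paper's actual optimization algorithms are not shown here; the function and variable names (`mutual_information`, `scaled_mi`, `mi_min`, `mi_max`) are illustrative, and the extremal MI values over all joints with the given marginals are assumed to be computed elsewhere.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a probability vector, skipping zero entries."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """MI of a joint distribution given as a 2D list of probabilities:
    I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    px = [sum(row) for row in joint]               # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]         # marginal of Y (column sums)
    hxy = shannon_entropy([p for row in joint for p in row])
    return shannon_entropy(px) + shannon_entropy(py) - hxy

def scaled_mi(joint, mi_min, mi_max):
    """Hypothetical scaled MI ratio in [0, 1]; mi_min and mi_max are the
    minimal and maximal MI over all joints with the same marginals,
    obtained by minimizing/maximizing the joint entropy (not shown here)."""
    mi = mutual_information(joint)
    return (mi - mi_min) / (mi_max - mi_min)

# Independent joint: MI = 0; perfectly coupled joint: MI = 1 bit.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # -> 0.0
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # -> 1.0
```

Both example joints share the same uniform marginals, so they bracket the range that the scaled ratio normalizes over in this toy case.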