The Entropy of Parallel Systems

Temitayo Adefemi

Published: 2025/9/12

Abstract

Ever since Claude Shannon introduced entropy in "A Mathematical Theory of Communication", it has become a buzzword in research circles, with scientists applying it to describe almost any phenomenon reminiscent of disorder. In this paper, we use entropy to describe the incompatibility between components of a computer, which can cause noise and disorder within a parallel cluster. We develop a mathematical theory, based primarily on graph theory and logarithms, that quantifies the entropy of a parallel cluster by accounting for the entropy of each system within it. We then apply this model to calculate the entropy of the Top 10 supercomputers on the Top500 list. Our entropy framework reveals a statistically significant negative correlation between system entropy and computational performance across the world's fastest supercomputers. Most notably, the LINPACK benchmark shows a strong negative correlation (r = -0.7832, p = 0.0077) with our entropy measure, indicating that systems with lower entropy consistently achieve higher computational efficiency. This relationship is further supported by moderate correlations with MLPerf mixed-precision benchmarks (r = -0.6234) and HPCC composite scores (r = -0.5890), suggesting that the framework's applicability extends beyond traditional dense linear algebra workloads.
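The abstract does not reproduce the entropy formula or the correlation procedure, so the sketch below is a minimal, hypothetical illustration only. It assumes a Shannon-style entropy over normalized component-incompatibility edge weights (the function cluster_entropy and the Dirichlet-generated weights are illustrative assumptions, not the paper's definitions) and shows how a Pearson r and p-value of the kind reported above could be computed with scipy.stats.pearsonr on synthetic data.

```python
import numpy as np
from scipy.stats import pearsonr

def cluster_entropy(incompatibility_weights):
    """Shannon-style entropy over normalized incompatibility weights.

    Hypothetical proxy for the paper's graph-based measure: model the
    cluster as a graph whose edges carry incompatibility weights w_ij,
    normalize them into a distribution p, and return -sum(p * log2(p)).
    """
    w = np.asarray(incompatibility_weights, dtype=float)
    p = w / w.sum()
    p = p[p > 0]                            # convention: 0 * log 0 = 0
    return float(-(p * np.log2(p)).sum())

# Synthetic, illustrative inputs only -- NOT the paper's measured data.
rng = np.random.default_rng(42)
entropies = np.array([
    cluster_entropy(rng.dirichlet(np.full(20, a)))  # a controls skew:
    for a in np.linspace(0.2, 5.0, 10)              # low a -> low entropy
])
# Mock benchmark scores that decrease with entropy, plus noise.
linpack = 1500.0 - 300.0 * entropies + rng.normal(0.0, 30.0, size=10)

r, p_value = pearsonr(entropies, linpack)   # Pearson r and its p-value
print(f"r = {r:.4f}, p = {p_value:.4f}")
```

On synthetic inputs like these, the printed r is strongly negative, mirroring the shape (though not the values) of the LINPACK result reported in the abstract.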
