Entropy, diversity and magnitude: a survey, with questions

Date
2024/05/10 Fri 15:00 - 17:00
Room
Building 3, Room 108
Speaker
Emily Roff
Affiliation
University of Edinburgh, Osaka University
Abstract

It is a basic fact about entropy that the size of a set controls how much disorder it can support: on a finite set X, Shannon entropy is maximized by the uniform distribution p, whose entropy is H(p) = log(#X). A dynamical version of this relationship between entropy and cardinality can be seen in the variational principle that relates the measure-theoretic entropy to the topological entropy of a dynamical system.
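For concreteness, here is a small numerical sketch of that maximum-entropy fact (the helper shannon_entropy and the choice of test distributions are purely illustrative, not part of the talk):

    # Illustration: on a finite set X, Shannon entropy is maximized by the
    # uniform distribution, whose entropy is log(#X).
    import numpy as np

    def shannon_entropy(p):
        """H(p) = -sum_i p_i log p_i, with the convention 0 log 0 = 0."""
        p = np.asarray(p, dtype=float)
        nz = p[p > 0]
        return -np.sum(nz * np.log(nz))

    n = 5                                   # X has 5 points
    uniform = np.full(n, 1.0 / n)
    rng = np.random.default_rng(0)
    samples = rng.dirichlet(np.ones(n), size=1000)   # random distributions on X

    print(shannon_entropy(uniform), np.log(n))        # equal: H(uniform) = log(#X)
    print(max(shannon_entropy(p) for p in samples))   # never exceeds log(#X)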

The starting point of this talk is the idea that if X is not just a set but a finite metric space, then a reasonable measure of its “size” ought to take into account not just the number of points, but also the distances between them. Similarly, a reasonable measure of “disorder” for probability distributions on X ought to take into account how their mass is clustered within the space. Taking these ideas seriously leads to the notions of magnitude and diversity (due to Leinster) and turns out to link information theory, category theory and theoretical ecology. A maximum diversity theorem relating diversity to magnitude is proved for finite metric spaces in [1] and extended to compact metric spaces in [2].
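To make the objects in question concrete, here is a minimal sketch of the definitions, following Leinster's conventions as I understand them: the similarity matrix of a finite metric space has entries Z[i,j] = exp(-d(x_i, x_j)), the magnitude is the sum of the entries of Z^{-1} when Z is invertible, and the diversity of order q of a distribution p is D_q(p) = (sum_i p_i (Zp)_i^(q-1))^(1/(1-q)) for q != 1. The function names and the example space below are illustrative, not taken from [1] or [2]:

    import numpy as np

    def magnitude(d):
        """Magnitude of a finite metric space with distance matrix d:
        the sum of the entries of the inverse similarity matrix Z^{-1}."""
        Z = np.exp(-np.asarray(d, dtype=float))
        return np.linalg.inv(Z).sum()

    def diversity(d, p, q):
        """Similarity-sensitive diversity of order q of the distribution p."""
        Z = np.exp(-np.asarray(d, dtype=float))
        p = np.asarray(p, dtype=float)
        Zp = Z @ p
        s = p > 0                                 # restrict to the support of p
        if q == 1:                                # limiting case q -> 1
            return np.exp(-np.sum(p[s] * np.log(Zp[s])))
        return np.sum(p[s] * Zp[s] ** (q - 1)) ** (1.0 / (1.0 - q))

    # Three points: two close together, one far away.
    d = np.array([[0.0, 0.1, 5.0],
                  [0.1, 0.0, 5.0],
                  [5.0, 5.0, 0.0]])
    p = np.array([0.25, 0.25, 0.5])
    print(magnitude(d))
    print(diversity(d, p, 0), diversity(d, p, 1), diversity(d, p, 2))

By the maximum diversity theorem of [1], the supremum of D_q(p) over all distributions p is the same for every order q, and it is this common value that the theorem relates to magnitude.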

In this talk I will tell that story and formulate the question of how the notion of diversity might be made dynamical.

[1] Leinster and Meckes. Maximizing diversity in biology and beyond. Entropy 18(3):88, 2016.
[2] Leinster and Roff. The maximum entropy of a metric space. Q. J. Math. 72(4):1271–1309, 2021.