You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For all things related to this archive refer to the same person.
Tai-Danae Bradley: Entropy as an Operad Derivation
This talk features a small connection between information theory, algebra, and topology: namely, a correspondence between Shannon entropy and derivations of the operad of topological simplices. We will begin with a brief review of operads and their representations, with topological simplices and the real line as the main example. We then give a general definition for a derivation of an operad in any category, with values in an abelian bimodule over the operad. The main result is that Shannon entropy defines a derivation of the operad of topological simplices, and that for every derivation of this operad there exists a point at which it is given by a constant multiple of Shannon entropy. We show this is compatible with, and relies heavily on, a well-known characterization of entropy given by Faddeev in 1956 and a recent variation given by Leinster.
Zoom: https://topos-institute.zoom.us/j/84392523736?pwd=bjdVS09wZXVscjQ0QUhTdGhvZ3pUdz09
YouTube: https://youtu.be/_cAEfQQcELA
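For anyone who wants to see the derivation property in the abstract concretely: the operadic composite of a distribution p with distributions q_1, ..., q_n is the concatenated distribution (p_1 q_1, ..., p_n q_n), and Shannon entropy satisfies the chain rule H(p ∘ (q_1, ..., q_n)) = H(p) + Σ_i p_i H(q_i), which is exactly the Leibniz-like condition in question. A minimal numerical sketch (not code from the talk, just an illustration):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats, ignoring zero-probability outcomes."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def operad_compose(p, qs):
    """Operadic composition of distributions: substitute q_i into the i-th
    outcome of p, giving the concatenated distribution (p_1*q_1, ..., p_n*q_n)."""
    return np.concatenate([p_i * np.asarray(q_i, dtype=float) for p_i, q_i in zip(p, qs)])

# Example: a distribution on 2 outcomes, refined by distributions on 3 and 2 outcomes.
p = np.array([0.4, 0.6])
qs = [np.array([0.2, 0.3, 0.5]), np.array([0.7, 0.3])]

lhs = shannon_entropy(operad_compose(p, qs))
rhs = shannon_entropy(p) + sum(p_i * shannon_entropy(q_i) for p_i, q_i in zip(p, qs))
print(lhs, rhs)  # the two values agree: the chain rule is the Leibniz-like property
```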
I have found a bit of time off from work to try to follow some of the many interesting courses on entropy and CT that have appeared recently.
https://twitter.com/bblfish/status/1528969976950005760
"Tutorial on Categorical Semantics of Entropy" by @johncarlosbaez and @math3ma, starting with Shannon entropy. This looks like it could be the course that was missing from Stiegler's philosophical work on negentropy @ArsIndustrialis https://www.youtube.com/watch?v=5phJVSWdWg4
- The 🐟 BabelFish (@bblfish)
My recent interest in entropy came from listening to Bernard Stiegler, a well-known French philosopher (with quite an amazing history: he spent 5 years in prison after robbing a few banks in the 70s, studied philosophy there, came out and did his PhD under Derrida on technics and time). He liked the long-term views of humanity, starting 2 million years ago with the ability of pre-humans to control fire, which then changed them genetically over the ages, leading to our ancestors. That story is captured by the Greek myth of Prometheus stealing fire from the gods and giving it to humans, for which he was condemned to be chained to a rock and have an eagle eat his liver out every day, which would then grow back (the liver in French is foie, which is homophonic with foi, the word we may translate as faith).
https://eduscol.education.fr/odysseum/le-mythe-de-promethee
In the last 5 years or so Bernard Stiegler developed the theme of entropy, emphasising negentropy in relation to Shannon's work as well as its connection to questions about what life is. Physical entropy has as its endpoint the heat death of the universe, and life seems to be a local negentropic bubble that goes in the opposite direction to the physical system, creating more and more order, whereas physical entropy is more about the creation of disorder (and hence why we have to keep organising ourselves, cleaning up, etc.).
Is this notion of order vs disorder captured by the definitions of entropy that are being put forward recently?
My guess is that @John Baez's definition on "compositional thermostatics" is taking this into account via the notion of Symmetric Monoidal Categories.
Symmetric monoidal structures on a category are another formalism that allows one to discuss morphisms with multiple inputs, and to permute these inputs [12], and in fact there is a strong relationship between operads and symmetric monoidal categories. Every symmetric monoidal category has an underlying operad.
https://arxiv.org/pdf/2111.10315.pdf
@Brendan Fong clearly relates symmetric monoidal categories to biology, and to open and closed systems, in his thesis
https://twitter.com/bblfish/status/1398230682237911041
@johncarlosbaez @ejpatters Ah yes, the preface to Brendan Fong's thesis "The Algebra of Open and Interconnected Systems" https://arxiv.org/abs/1609.05382 should be of great interest to Philosophers as well as #linkeddata folks interested in what is behind the Open World Assumption. cc @DJRoss70 https://twitter.com/bblfish/status/1398230682237911041/photo/1
- The 🐟 BabelFish (@bblfish)
Stiegler's belief is that these negentropic discoveries require rethinking all of philosophy and many other sciences as well, such as economics. A key thinker in economics he mentioned a lot was Georgescu-Roegen, who studied under Schumpeter of "creative destruction" fame and who then wrote "The Entropy Law and the Economic Process". Stiegler thought that modern economists and politics had not yet correctly integrated this thinking, perhaps because the definitions of entropy were so muddled in the 1980s. So perhaps these new analyses of entropy being put forward, linked to economics explained as game theory (@Jules Hedges), can fix the misunderstandings that stalled thinking on the subject?
https://en.wikipedia.org/wiki/Nicholas_Georgescu-Roegen
I get the feeling that the book Entropy and Diversity: The Axiomatic Approach on the arXiv may be the best starting point.
Is this notion of order vs disorder captured by the definitions of entropy that are being put forward recently?
I don't know any new definition of entropy being put forward recently, just new ways of understanding the usual ones.
I don't think "order vs disorder" is a very helpful way to understand entropy. At the very least you have to be extremely careful. For example, the entropy of a frozen dead cat is much, much less than that of a warm live cat, and the latter is just a very tiny bit less than that of a warm dead cat.
For my thoughts on entropy, check out my talk video and slides. Tom Leinster's book is really good, but of course much longer. He's not a physicist, so for a physics perspective you should go somewhere else (and not to my talk; I didn't have time for any physics).
John Baez said:
I don't think "order vs disorder" is a very helpful way to understand entropy. At the very least you have to be extremely careful. For example, the entropy of a frozen dead cat is much, much less than that of a warm live cat, and the latter is just a very tiny bit less than that of a warm dead cat.
Or take some oil and water and shake it up. It separates again over time because in the separated state it has a higher entropy than the mixed state (due to being very slightly warmer), even though the mixed state is intuitively more disordered.
Where do you find out what the entropy of a dead cat, a frozen one, mixed oil and vinegar, etc are?
I was wondering: what is the entropy of the web with and without the index? (Does that make a difference?) Say, before AltaVista crawled the web and released its index in 1994. I guess the operadic tools should be useful here, as we are combining entropies.
I guess, to answer my question, there are research departments and journals on entropy where one goes to ask these questions. :-) I just found The Entropy Universe, which gives a whole history of entropy formulations (but only for time-series data?)
Henry Story said:
Where do you find out what the entropy of a dead cat, a frozen one, mixed oil and vinegar, etc are?
As for dead, frozen, etc. cats: you don't need to think about cats per se to understand which one has more entropy. The entropy of any sort of meat is about the same as a function of temperature, and the difference in entropy due to being alive or dead is tiny.
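A rough back-of-envelope check of the frozen vs. warm comparison, treating the cat as a few kilograms of water with standard approximate constants (the mass, temperatures, and constants below are all assumptions; only the order of magnitude matters):

```python
import math

# Entropy gained on warming ~4 kg of "mostly water" from -18 °C to 37 °C:
# dS = m*c*dT/T integrated over each phase, plus the melting term L/T.
m = 4000.0          # grams, assumed mass
c_ice = 2.1         # J/(g*K), approximate heat capacity of ice
c_water = 4.2       # J/(g*K), approximate heat capacity of liquid water
L_fus = 334.0       # J/g, latent heat of fusion of water
T_frozen, T_melt, T_warm = 255.0, 273.0, 310.0   # kelvin

dS = m * (c_ice * math.log(T_melt / T_frozen)
          + L_fus / T_melt
          + c_water * math.log(T_warm / T_melt))
print(f"roughly {dS/1000:.0f} kJ/K")   # several kJ/K for a ~4 kg cat
```

Several kilojoules per kelvin is the scale of the frozen-vs-warm difference, which is why the alive-vs-dead difference mentioned above is negligible by comparison.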
Just wondering: have the presentations of the recent two-day Seminar on the Categorical Semantics of Entropy (also scse day 2) succeeded in unifying the concepts of entropy that evolved over 160 years, as described in the 2021 article The Entropy Universe?
Diagram of the history of entropy
About oil and water, I was cheating a bit: we know the end result must have a higher entropy since otherwise the separation of oil and water wouldn't be a spontaneous process. (I'm assuming it would indeed be spontaneous if the experiment were performed in an adiabatic container - I think probably if you do it in the kitchen it's closer to adiabatic than isothermal.) It probably is the sort of thing one could look up though, if one knew the right terms to search for.
When oil and water separate they get slightly warmer since the gravitational potential energy of the water at the top of the jar gets converted to kinetic energy as the water falls, and then by friction to thermal energy when the water settles at the bottom.
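A back-of-envelope version of that argument, with made-up kitchen-scale numbers (the water mass, drop height, and temperature are all assumptions): the released gravitational potential energy ends up as heat, and heat ΔQ delivered at temperature T produces entropy ΔS = ΔQ/T.

```python
# Order-of-magnitude estimate of the entropy produced when the water settles.
g = 9.81          # m/s^2
m_water = 0.25    # kg of water in the jar (assumed)
h = 0.05          # m, average height the water drops while settling (assumed)
T = 293.0         # K, room temperature (assumed)

dQ = m_water * g * h       # released potential energy ending up as heat
dS = dQ / T                # entropy produced by dumping that heat at temperature T
print(f"dQ ≈ {dQ:.2f} J, dS ≈ {dS:.1e} J/K")
```

The result is tiny (around 10^-4 J/K), which fits the remark that the separated state is only "very slightly warmer".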
@John Baez said:
I don't know any new definition of entropy being put forward recently, just new ways of understanding the usual ones.
See Physicists Rewrite the Fundamental Law That Leads to Disorder for research basing entropy on the flow of quantum information.
Daniel Geisler said:
See Physicists Rewrite the Fundamental Law That Leads to Disorder for research basing entropy on the flow of quantum information.
Now that is really hot off the press: Just two days old! :-)
On the second day of scse @Tom Mainiero and @Arthur Parzygnat spoke about quantum entropy. They may have some interesting views on that research and article. Perhaps even @David Spivak will be interested, as he had the feeling that his Poly take on entropy was losing something and did not quite tie in with dynamical systems.