You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For all things related to this archive refer to the same person.
distinguishing the parts of disciplines-of-application that require some creativity from the aspects that are "trivially trivial" (a matter of paperwork).
really, this sounds like it applies to the stuff you've called "monoidal" as well
to some extent
sarahzrf said:
distinguishing the parts of disciplines-of-application that require some creativity from the aspects that are "trivially trivial" (a matter of paperwork).
really, this sounds like it applies to the stuff you've called "monoidal" as well
Yes, I think Freyd's famous quote ("Perhaps the purpose of categorical algebra is to make that which is trivial, trivially trivial") was intended to apply to all of category theory.
I was really upset/amused by Timothy Gowers's response to that quote: he basically said "I'm happy enough for it to be trivial."
sounds like someone isn't a programmer!
oh, that's ambiguous: i'm referring to him, not you
He's a Fields-medal-winning mathematician. I think he's working on computer programs that find proofs, now.
(If you don't follow him on Twitter, maybe you should. If you do, I guess I didn't get that from what you said - sorry.)
He may not be doing programming intensively enough to want to trivialize the trivial.
Or, more likely, he was just thinking about math when someone told him that quote.
To most mathematicians that quote is a pretty crappy advertisement for category theory, like "If you learn this complicated stuff, you can prove trivial things even more trivially!!!"
John Baez said:
Or, more likely, he was just thinking about math when someone told him that quote.
i meant something more like this :>
programmers are constantly trying to Reduce Boilerplate
Right.
Good mathematicians let their unconscious mind do a lot of that work.
Whenever I talk to him he seems to still be interested in computer-aided proofs. I think the last time was October 2018 when I gave a talk in Cambridge.
(I didn't mean to say he'd stopped doing his usual work.)
I guess I should ask him how active he is in the computer-aided proof business.
He was working with some guy trying to do this stuff. He's not into "proof assistants" - he wants computers to prove their own theorems, I think. I sorta forget.
Yeah, okay, that conference is more about the kind of stuff I mean.
Okay. Maybe Gowers is still working with Ganesalingam, maybe he's stopped.
John Baez said:
To most mathematicians that quote is a pretty crappy advertisement for category theory, like "If you learn this complicated stuff, you can prove trivial things even more trivially!!!"
Yes, I heard Peter May say several times that it would be better to talk about "making that which is formal, formally formal", because not everything formal is trivial.
Although to be fair, I doubt that Freyd intended his remark as an "advertisement" for category theory...
Mike Shulman said:
John Baez said:
To most mathematicians that quote is a pretty crappy advertisement for category theory, like "If you learn this complicated stuff, you can prove trivial things even more trivially!!!"
Yes, I heard Peter May say several times that it would be better to talk about "making that which is formal, formally formal", because not everything formal is trivial.
Although to be fair, I doubt that Freyd intended his remark as an "advertisement" for category theory...
I like that "formally formal" version... That seems like a much more accurate way to describe what is happening.
Jonathan Sterling said:
Mike Shulman said:
John Baez said:
To most mathematicians that quote is a pretty crappy advertisement for category theory, like "If you learn this complicated stuff, you can prove trivial things even more trivially!!!"
Yes, I heard Peter May say several times that it would be better to talk about "making that which is formal, formally formal", because not everything formal is trivial.
Although to be fair, I doubt that Freyd intended his remark as an "advertisement" for category theory...
I like that "formally formal" version... That seems like a much more accurate way to describe what is happening.
I think a "generic mathematician" (to appropriate Kevin Buzzard's phrase) would find that description every bit as much of a turn-off as "trivially trivial". The following exegesis of "trivially trivial" appears in the nLab:
This can be taken to mean that one thing category theory does is help make the softer bits seem utterly natural and obvious, so as to quickly get to the heart of the matter, isolating the hard nuggets, which one may then attack with abandon. This is an invaluable service category theory performs for mathematics; therefore, category theory is plain good pragmatics.
I cannot speak for what is going on in "generic mathematics", but let me just say that I feel the spirit of this quip is distinguishing signal from noise, and I do not really care what words we use to describe that ;-) I think the nLab exegesis is quite nice.
All I will say is that in the discipline of logic, there is historically quite an unbelievable amount of noise (more noise, I think, than in other mathematical disciplines) and category theory has historically proved itself a good means to "turn down the volume" so that we can hear the whispering voices of the parts that matter.
I think Sarah and Mike are right that this trait is not specific to one style or facet of category theory... I meant to indicate before just where my experience lies, not where the exclusive strengths of category theory lie.
Regarding "trivially trivial" I think @John Baez is right that this is a crappy advertisement for category theory in mathematical disciplines... But I think there may be some underestimation of how big the mountains-ex-molehills are in some areas: I am a type theorist and a logician by training, and our molehills have grown so large that half of us write 100-page reports proving things that are trivial, and the other half of us write those proofs in 2 lines but probably couldn't reconstruct the details if we were asked to. So we have a real need for a scientific way to find out what things are really trivial, and then realize this in a technical sense. As soon as we finish doing this, we may quickly reach the limit of the areas of our field that can be reduced to some High Energy Category Paperwork, but we aren't remotely there yet.
So, the "trivially trivial" aspect of categories may be more important in some disciplines than in others. Necessarily, the "trivially trivial" aspect decreases in importance over time, in correlation with the scientific development of the field.
Jonathan Sterling said:
So we have a real need for a scientific way to find out what things are really trivial, and then realize this in a technical sense. As soon as we finish doing this, we may quickly reach the limit of the areas of our field that can be reduced to some High Energy Category Paperwork, but we aren't remotely there yet.
That sounds like reduction of accidental complexity (as opposed to essential complexity), a distinction discussed in some functional programming circles.
"Flattening mountains of knowledge" has also come to mind, an academic metaphor for making knowledge easier for later "climbers" to grasp.
I find all those (incl. topical) notions inspiring and insightful.
Gregory Chaitin's take on mathematics as a whole is that the cost of the powerful portions of mathematics is that large sections of mathematics are essentially random truth.
I am a type theorist and a logician by training, and our molehills have grown so large that half of us write 100-page reports proving things that are trivial, and the other half of us write those proofs in 2 lines but probably couldn't reconstruct the details if we were asked to.
just wanna chime in that this matches my experience :upside_down:
When I first read David Spivak's work on Functorial Databases I thought to myself: "Category theory is actually a database of concepts!" and could hardly believe it. Is that a good interpretation? If so, its role could be conceived of as a map to help one find one's way around. Maps are extremely useful, needless to say.
But the random truths are mostly useless: what we really need in math are results that we understand. For example, the Collatz Conjecture could be true just "by chance" (and Terry Tao has recently proved some results vaguely in this direction) but this conjecture is not anything we really need to know.
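For anyone who hasn't played with it, the Collatz map is trivial to compute even though the conjecture resists proof; here's a minimal Python sketch (the function name is my own) that counts the steps a number takes to reach 1:

```python
def collatz_steps(n):
    """Count iterations of the Collatz map (n -> n/2 if even, 3n+1 if odd)
    needed to reach 1. The conjecture asserts this always terminates."""
    steps = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

print(collatz_steps(6))   # 8
print(collatz_steps(27))  # 111 -- already wildly longer than its neighbors
```

The erratic step counts (27 takes 111 steps while 26 and 28 settle quickly) are exactly the kind of behavior that makes "true by chance" seem plausible here.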
speak for yourself
John Baez said:
But the random truths are mostly useless: what we really need in math are results that we understand. For example, the Collatz Conjecture could be true just "by chance" (and Terry Tao has recently proved some results vaguely in this direction) but this conjecture is not anything we really need to know.
As we've seen with all the biggest problems in recent times, the structures and techniques that will be developed to solve it will likely be more impactful than this specific result itself.
Right! We were talking about Chaitin's observation that there are a large number of "randomly true" facts in mathematics: facts that are "true" - let's say even provable - but for no good reason.
For example, if you name any recursive function $f$, infinitely many provable statements in arithmetic have their shortest proof of length $\ge f(n)$, where $n$ is the length of the statement. If we take $f$ to be a rapidly growing function like Ackermann's function, we can claim most of these statements have "no good reason" for being true. (Some might be provable more quickly using interesting axioms, but others presumably only using an exhaustive search of some sort.)
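To make "rapidly growing" concrete, here is a minimal Python sketch of the standard two-argument Ackermann function (my own transcription of the usual recursive definition); even tiny inputs outrun any primitive recursive bound:

```python
def ackermann(m, n):
    """Two-argument Ackermann function: total and computable,
    but grows faster than any primitive recursive function."""
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

print(ackermann(2, 3))  # 9
print(ackermann(3, 3))  # 61
# ackermann(4, 2) is already 2^65536 - 3, far beyond naive recursion here.
```

So a statement whose shortest proof has length at least ackermann(4, n) for a statement of length n is "true" in a sense no human argument will ever certify directly.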
If you can find a long proof of a statement but that proof is full of ideas that are interesting in their own right, then... well, the statement was worth proving, even if the benefit of knowing it's true is vastly exceeded by the cost of proving it!