You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For all things related to this archive refer to the same person.
There's also a bit of categorical numerical analysis, which I sometimes think about
Where?
The key name is Michael Robinson
A lot of numerical analysis morally looks like sheaf theory if you squint at it a bit - you're dealing with "approximate real numbers"
There's some papers making that precise
Thanks
There's also been some sporadic ideas along the lines of "diagrams that commute up to approximate equality". The obvious way to do it is to use categories enriched in metric spaces, so every hom-set has a metric and you can consistently say that two paths through a diagram are at distance at most ε
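To make "commutes up to ε" concrete, here's a toy Python sketch. The sup-distance on sampled real functions, the `commutes_up_to` helper, and the example square are all invented here for illustration; they're not from any library or paper.

```python
# Toy sketch: "commutes up to eps" for a square of real-valued maps.
# The hom-distance is a sup-distance estimated on finite sample points.

def dist(f, g, samples):
    """Estimated sup-distance between two maps on a finite sample."""
    return max(abs(f(x) - g(x)) for x in samples)

def commutes_up_to(eps, path1, path2, samples):
    """Do two composite paths through a diagram agree up to eps?"""
    def compose(fs):
        def composite(x):
            for step in fs:
                x = step(x)
            return x
        return composite
    return dist(compose(path1), compose(path2), samples) <= eps

# A square that commutes only approximately:
f = lambda x: 2 * x          # top edge
g = lambda x: x + 0.001      # right edge
h = lambda x: x              # left edge
k = lambda x: 2 * x          # bottom edge
samples = [i / 10 for i in range(11)]
print(commutes_up_to(0.01, [f, g], [h, k], samples))   # True: within 0.01
print(commutes_up_to(1e-6, [f, g], [h, k], samples))   # False
```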
There's also a thing called "metagories" if I remember the name correctly, which is a similar idea to that but a bit different. I forgot everything about what they are, but I believe some people think they're the right way to do it
Stuff like that is (going to be) important in some areas of applied category theory that deal with continuous things, possibly with a computer
The idea of a diagram commuting up to epsilon seems pretty interesting. Maybe you could even have a sequence of diagrams which converge to a commuting diagram.
But that seems more like an application of analysis to CT vs. the converse
Yeah, that's probably true, but it ought to be on-topic for this stream either way
Although for such a thing maybe an internal category makes more sense than an enriched one
Does the category of metric spaces have the structure you need to do that... finite limits if I remember correctly? Hum, I guess it does... the "subset of a cartesian product" construction probably works for pullbacks of metric spaces?
I never thought about inheriting a metric to an arbitrary subset before, it feels like a nasty thing to do for some reason, but I can't think of anything that goes wrong
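For what it's worth, the "subset of a cartesian product" construction can be spelled out directly on finite metric spaces. A small Python sketch (all names invented here), using the max-metric on the product, which is the Cartesian one in Met:

```python
# Sketch: pullback of (finite) metric spaces as a metrized subset of the
# product, with the metric inherited from the max-metric on X x Y.

def pullback(X, Y, dX, dY, f, g):
    """Objects: pairs (x, y) with f(x) == g(y); metric inherited from X x Y."""
    P = [(x, y) for x in X for y in Y if f(x) == g(y)]
    def dP(p, q):
        return max(dX(p[0], q[0]), dY(p[1], q[1]))
    return P, dP

X = [0, 1, 2]
Y = [0, 2, 4]
dX = lambda a, b: abs(a - b)
dY = dX
f = lambda x: x % 2          # both legs map into a two-point set
g = lambda y: y % 2
P, dP = pullback(X, Y, dX, dY, f, g)
print(P)                     # the six pairs with matching parity
print(dP((0, 0), (2, 4)))    # max(2, 4) = 4
```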
Jules Hedges said:
I never thought about inheriting a metric to an arbitrary subset before, it feels like a nasty thing to do for some reason, but I can't think of anything that goes wrong
It's fine, it just won't preserve nice properties like completeness, compactness, and connectedness in general
In any case I don't have any ready-to-go intuition for what categories internal to Met look like (whereas I think I have a sort of idea what categories enriched in Met look like)
Assuming that the morphisms between metric spaces are in particular continuous, the subsets that would arise in the formation of finite limits are cut out by equations between continuous functions, so they are always closed subsets, and not arbitrary ones.
I think the big distinction is that when enriching in Met, you'd want to use the tensor product (which gives rise to a closed monoidal structure, unlike the Cartesian product), but obviously for internal categories you'd have to use the Cartesian product.
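Concretely, the two monoidal structures already differ in the metric they put on a product of two spaces: the Cartesian product carries the max (sup) metric, the tensor product the sum metric. A minimal illustration:

```python
# The two monoidal products on Met, on a product of two copies of R:
# Cartesian product = max-metric, tensor product = sum-metric.
dX = lambda a, b: abs(a - b)
d_cart = lambda p, q: max(dX(p[0], q[0]), dX(p[1], q[1]))
d_tens = lambda p, q: dX(p[0], q[0]) + dX(p[1], q[1])
print(d_cart((0, 0), (1, 2)))  # 2
print(d_tens((0, 0), (1, 2)))  # 3
```

The sum-metric is what makes evaluation and currying nonexpansive, which is why the tensor (not the Cartesian) structure is closed.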
Fawzi Hreiki said:
But that seems more like an application of analysis to CT vs. the converse
I think there's also stuff going the other way, where such up-to-ε tools in CT are used to study some Banach space stuff, e.g. https://arxiv.org/abs/2006.01399
Jules Hedges said:
There's also a thing called "metagories" if I remember the name correctly, which is a similar idea to that but a bit different. I forgot everything about what they are, but I believe some people think they're the right way to do it
Metagories are cool. I vaguely recall that for any ε ≥ 0 and any triangle of morphisms f, g, h you can say "h = g∘f up to ε".
So, you have a graph with objects as vertices and morphisms as edges... but then for any ε ≥ 0 you get a set of triangles which mean "h = g∘f up to ε". Then there's a consistency condition involving tetrahedra.
This simplicial stuff makes me feel something non-ad hoc is going on here, though I haven't figured out what.
(Your distances can take values in any commutative quantale.)
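Here's a toy numerical model of that tetrahedron condition: morphisms are nonexpansive real functions, and the "distortion" of a triangle (f, g; h) is measured as a sampled sup-distance between g∘f and h. The exact axiom in the metagories paper may differ; treat this as a guess at its shape.

```python
# Toy model: morphisms = nonexpansive (1-Lipschitz) real functions, so
# post-composition doesn't blow up errors. All names are illustrative.

samples = [i / 20 for i in range(21)]

def dist(u, v):
    return max(abs(u(x) - v(x)) for x in samples)

def rho(f, g, h):
    """Distortion of the triangle asserting 'h = g after f, up to rho'."""
    return dist(lambda x: g(f(x)), h)

f = lambda x: 0.5 * x
g = lambda x: x + 0.1
h = lambda x: 0.9 * x
m = lambda x: 0.5 * x + 0.09   # approximates g.f
n = lambda x: 0.9 * x + 0.08   # approximates h.g
p = lambda x: 0.45 * x + 0.1   # approximates h.g.f

# Tetrahedron-style inequality: the distortion of one face is bounded by
# the sum of the distortions of the other three faces.
lhs = rho(m, h, p)
rhs = rho(f, g, m) + rho(g, h, n) + rho(f, n, p)
print(lhs <= rhs + 1e-12)      # True in this model
```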
I've had the sense that metagories should be special cases of "fuzzy simplicial sets", where the existence of a simplex with edges f, g, h with "fuzziness" q(ε) witnesses the fact that h = g∘f up to ε. (Here q is just any contravariant order-isomorphism [0,∞] → [0,1].)
Then the coherence property on tetrahedra should be some sort of "fuzzy inner horn extension property"
Metric approximate categories, as introduced in this paper and nick-named metagories (with stress on the second syllable)
this seems like a lost cause from the start
This concept does sound cool, though. There's a bunch of "approximate exactness" stuff that goes into the proof of the main theorem in the theory of liquid condensed modules, things of the form "if df = 0 (or maybe just if df is small) then f is close to something of the form dg", and it feels like there ought to be some more conceptual framework that's suitable for thinking about things like these.
Potentially related to the notion of "approximately injective" object in the paper linked to by @Martti Karvonen, for example
I am not sure if this is completely on topic, but in the following paper, the authors consider string diagrams which are not equal, but merely ε-close: https://arxiv.org/pdf/1704.08668.pdf
Reid Barton said:
This concept does sound cool, though. There's a bunch of "approximate exactness" stuff that goes into the proof of the main theorem in the theory of liquid condensed modules, things of the form "if df = 0 (or maybe just if df is small) then f is close to something of the form dg", and it feels like there ought to be some more conceptual framework that's suitable for thinking about things like these.
"Liquid condensed modules" definitely sounds like something out of physics or engineering
Eigil Rischel said:
I've had the sense that metagories should be special cases of "fuzzy simplicial sets", where the existence of a simplex with edges f, g, h with "fuzziness" q(ε) witnesses the fact that h = g∘f up to ε. (Here q is just any contravariant order-isomorphism [0,∞] → [0,1].)
Then the coherence property on tetrahedra should be some sort of "fuzzy inner horn extension property"
One should also probably allow the set of vertices (objects) and edges (morphisms) to depend on ε, with more objects and more morphisms coming into existence as one increases ε. Of course, a morphism can't pop into existence until its objects have, and a commuting triangle can't pop into existence until its morphisms have. So, when ε ≤ ε′ one gets an inclusion of simplicial sets X_ε ↪ X_{ε′}.
This should be connected to persistent homology and magnitude homology.
Does this giant hand-wave sound approximately right... metagories are to simplicial as Met-enriched categories are to globular?
maybe for sufficiently large ε :upside_down:
It seems like in a metagory you don't have any composition operation on morphisms. Instead what you can "compose" is this "approximately commutes" property of triangles. So this seems less directly related to the notion of a category than a Met-enriched category does.
For example, if we only allow distances 0 and ∞, then a Met-enriched category would just reduce to an ordinary category (since a metric space with a metric that takes values 0 and ∞ is just a set as a discrete space), while a metagory is something that has triangles that may or may not commute, and you can paste commuting triangles in certain ways, but there's nothing that says that for any f and g there's a commuting triangle with two edges f and g.
The simplicial nerve of such a thing would satisfy the inner horn lifting condition in dimension 3 (and by definition in higher dimensions) but not in dimension 2
That's not something I have a name for. So a metagory is like a metric version of that.
@David Spivak also has this paper on "fuzzy simplicial sets": http://math.mit.edu/~dspivak/files/metric_realization.pdf (But yes, this should all hook up with the general topic of "persistence")
John Baez said:
Eigil Rischel said:
I've had the sense that metagories should be special cases of "fuzzy simplicial sets", where the existence of a simplex with edges f, g, h with "fuzziness" q(ε) witnesses the fact that h = g∘f up to ε. (Here q is just any contravariant order-isomorphism [0,∞] → [0,1].)
Then the coherence property on tetrahedra should be some sort of "fuzzy inner horn extension property"
One should also probably allow the set of vertices (objects) and edges (morphisms) to depend on ε, with more objects and more morphisms coming into existence as one increases ε. Of course, a morphism can't pop into existence until its objects have, and a commuting triangle can't pop into existence until its morphisms have. So, when ε ≤ ε′ one gets an inclusion of simplicial sets X_ε ↪ X_{ε′}.
Yes, indeed - there's an obvious notion of metric simplicial set, namely a functor from the poset ([0,∞], ≤) to simplicial sets.
Then there should be something like a fuzzy Segal property one can put on this to recover a notion of fuzzy category.
The interpretation of X_ε being something like "the n-simplices of X_ε are the n-simplices of error at most ε".
(Maybe the maps X_ε → X_{ε′} should be assumed injective, in this interpretation? But maybe not)
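One familiar instance of such a functor is the Vietoris-Rips-style construction on a finite metric space, where raising the tolerance only adds simplices, giving the inclusions as ε grows. A Python sketch in dimension ≤ 1 only, names invented here:

```python
# Sketch: a "metric simplicial set" in dimension <= 1, as a monotone
# family of graphs indexed by a tolerance eps (Vietoris-Rips style).

points = {"a": 0.0, "b": 0.1, "c": 1.0}

def d(p, q):
    return abs(points[p] - points[q])

def X(eps):
    """Vertices and edges that exist at tolerance eps."""
    verts = set(points)
    edges = {(p, q) for p in points for q in points if p < q and d(p, q) <= eps}
    return verts, edges

# Monotone: raising eps only adds simplices, giving inclusions X(e) -> X(e').
v0, e0 = X(0.05)
v1, e1 = X(0.5)
print(e0 <= e1)    # True: X(0.05) includes into X(0.5)
print(sorted(e1))  # [('a', 'b')]
```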
One reason it might be useful to have "maps with error", i.e. why the map X_ε → X_{ε′} might not be surjective, is because of commutative diagrams.
I.e. if you form the "approximate arrow category", morphisms should be commutative diagrams - so maybe "morphisms of error ε" should be diagrams that commute up to ε (this might just be exactly what you get if you take the internal hom object)
Interesting, so then by the same argument, it would also make sense to have objects with error, as well.
Yes. Higher-dimensional diagram categories would be one example.
Right, Reid!
I guess it's good to not require that the map of simplicial sets that we get when ε ≤ ε′ be a monomorphism.
So, as we increase the error tolerance ε, new objects can appear: "erroneous" objects that only exist if you have a big enough error tolerance. But they can also merge: that is, two objects can count as the same object if you have a big enough error tolerance.
Jules Hedges said:
There's also been some sporadic ideas along the lines of "diagrams that commute up to approximate equality". The obvious way to do it is to use categories enriched in metric spaces, so every hom-set has a metric and you can consistently say that two paths through a diagram are at distance at most ε
Diagrams that commute up to ε came up in C*-algebras, and there were important uses in the classification program. I really wanted to consider generalizations of these ideas back when I wasn't mathematically dead, but I didn't know what to learn from either analysis or category theory. I still think there's something to considering this idea, but whenever I mentioned it to anyone back in the day, no one seemed interested. I look forward to any development of ideas here.
John Baez said:
Right, Reid!
I guess it's good to not require that the map of simplicial sets that we get when ε ≤ ε′ be a monomorphism.
So, as we increase the error tolerance ε, new objects can appear: "erroneous" objects that only exist if you have a big enough error tolerance. But they can also merge: that is, two objects can count as the same object if you have a big enough error tolerance.
I guess one problem, or at least weird thing, with this is that, if r : X_ε → X_{ε′} is the restriction map, the relation r(x) = r(y) is an equivalence relation - but obviously in a metric space, the relation d(x, y) ≤ ε is not transitive.
It should be "skew-transitive" (if that's the right word) for the right way of combining errors quantitatively... the same version of transitivity you get in the axioms of a convex set (i.e. algebras of a probability monad)
Actually that may not be true, consider that a wild guess
I don't think it's a "problem" that d(x, y) ≤ ε is not transitive except at ε = 0. Transitivity at ε = 0 is a special case of the triangle inequality d(x, z) ≤ d(x, y) + d(y, z), and there must be an appallingly abstract way to say this fact using Lawvere's formulation of metric spaces as enriched categories. Something about change of base...
Well, I guess the obvious interpretation of the statement "x and y become indistinguishable at error tolerance ε" would be that d(x, y) ≤ ε - but the above shows that that interpretation can't be right. But I agree that it's not necessarily a big issue.
Well, it just means that you can't consider the approximation error as a constant forever. You need to keep track of how approximation errors evolve through a calculation
Which is probably something like the 0th thing you learn in analysis, I'd guess
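A one-line numeric check of the point: ε-closeness fails transitivity, but errors add via the triangle inequality, which is the "skew-transitivity" mentioned above.

```python
# eps-closeness isn't transitive, but "within eps" and "within delta"
# compose to "within eps + delta" by the triangle inequality.

x, y, z = 0.0, 0.9, 1.8
eps = 1.0

close = lambda a, b, tol: abs(a - b) <= tol

print(close(x, y, eps), close(y, z, eps))  # True True
print(close(x, z, eps))                    # False: transitivity fails
print(close(x, z, eps + eps))              # True: the errors add instead
```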
Okay, now I get it: Eigil was reacting to me saying:
So, as we increase the error tolerance ϵ, new objects can appear: "erroneous" objects that only exist if you have a big enough error tolerance. But they can also merge: that is, two objects can count as the same object if you have a big enough error tolerance.
So yeah, it's a bit weird. If we define a "persistent simplicial set" to be a functor from the poset ([0,∞], ≤) to simplicial sets, we get simplicial sets X_ε and maps X_ε → X_{ε′} when ε ≤ ε′, and two simplices in X_ε can be mapped to the same simplex in X_{ε′}. That's what I meant by merging.
But this is not the same as identifying two things whenever the distance between them is ≤ ε. That wouldn't make sense.
Yes, that's what I was commenting on.
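As a footnote, merging does make sense if one clusters by chains of small gaps (a transitive closure) rather than by raw ε-closeness, which is exactly why the two notions differ. A toy 1-d Python sketch, purely illustrative: the induced map X(ε) → X(ε′) on clusters is genuinely non-injective for large ε′.

```python
# X(eps) = points of a finite 1-d metric space, clustered by the
# transitive closure of "gap <= eps" (NOT by raw eps-closeness, which
# isn't an equivalence relation). Assumes 1-d points, sorted below.

points = [0.0, 0.1, 0.2, 1.0]

def classes(eps):
    """Merge consecutive sorted points whose gap is <= eps."""
    blocks = [[p] for p in sorted(points)]
    merged = [blocks[0]]
    for b in blocks[1:]:
        if b[0] - merged[-1][-1] <= eps:
            merged[-1].extend(b)
        else:
            merged.append(b)
    return [tuple(b) for b in merged]

print(len(classes(0.05)))  # 4 classes: nothing merges
print(len(classes(0.15)))  # 2 classes: 0.0, 0.1, 0.2 merge; 1.0 stays alone
```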