You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For all things related to this archive refer to the same person.
I’m trying to understand some basic ideas in topology.
Topology is often presented as a way to “talk about space”, but it can be studied strictly algebraically.
You have a set of subsets, which are “closed” under the operations of union and intersection.
However, union and intersection can be defined in terms of each other: $A \cup B = (A^c \cap B^c)^c$. I think set comprehension can be expressed in terms of an implication arrow: $A \cup B = \{x : x \notin A \rightarrow x \in B\}$. We can use De Morgan's law to show that $(A \cup B)^c = A^c \cap B^c$. Negate both sides to get $A \cup B = (A^c \cap B^c)^c$. That says, "if it is not the case that $x$ is in neither $A$ nor $B$, then $x$ must be in either $A$ or $B$".
Now, I believe you can also get rid of the negation operator using a similar strategy. These are the arguments behind the idea that a Boolean algebra can actually be defined in terms of a single binary operator.
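For what it's worth, the NAND claim is easy to check mechanically. Here is a quick Python sanity check (the function names are mine) that NOT, AND, and OR, and hence every Boolean operation, can each be expressed using NAND alone:

```python
def nand(a: bool, b: bool) -> bool:
    return not (a and b)

# The three standard connectives, each built from NAND only.
def not_(a):
    return nand(a, a)                    # a NAND a = NOT a

def and_(a, b):
    return nand(nand(a, b), nand(a, b))  # NOT (a NAND b) = a AND b

def or_(a, b):
    return nand(nand(a, a), nand(b, b))  # De Morgan: (NOT a) NAND (NOT b) = a OR b

# Exhaustive check over all truth assignments.
for a in (False, True):
    for b in (False, True):
        assert not_(a) == (not a)
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
```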
So, when a topology is viewed as an algebraic structure, can it be expressed in terms of a single law of composition on a set of elements?
I guess you're thinking of the definition using only NAND? With a law of composition on a set of elements, you have more flexibility, because the input can be much larger than that of a binary operator.
You could look at an operator which takes as input a family of subsets of a set and do this:
But it's cheating, because you in fact have an infinite family of operators, one for each indexing set.
On the other hand, a more algebraic definition of a topological space is given by the Kuratowski axioms. It uses a unary operation which corresponds to taking the closure of a subset, the binary union and the empty set, but also the symbols $=$ and $\subseteq$. Note that when you use $\subseteq$, you can replace it by $\cup$ and $=$, since $A \subseteq B$ iff $A \cup B = B$.
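To make the Kuratowski axioms concrete, here is a small Python check (the finite topology is an arbitrary example of mine) that the closure operator of a topology, defined as the smallest closed set containing a given set, satisfies K1–K4:

```python
from itertools import chain, combinations

X = frozenset({0, 1, 2})
opens = [frozenset(), frozenset({0}), frozenset({0, 1}), X]  # a topology on X
closed = [X - U for U in opens]                              # its closed sets

def cl(A):
    """Closure of A: the intersection of all closed sets containing A."""
    return frozenset.intersection(*[C for C in closed if A <= C])

def subsets(S):
    return [frozenset(c) for c in chain.from_iterable(
        combinations(sorted(S), r) for r in range(len(S) + 1))]

assert cl(frozenset()) == frozenset()          # K1: cl(∅) = ∅
for A in subsets(X):
    assert A <= cl(A)                          # K2: extensivity
    assert cl(cl(A)) == cl(A)                  # K3: idempotence
    for B in subsets(X):
        assert cl(A | B) == cl(A) | cl(B)      # K4: closure preserves binary unions
```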
You can probably use an operator on $\mathcal{P}(X)$, where I use the usual notation $\overline{A}$ for the closure instead of the $\mathbf{c}$ of Wikipedia.
Now I believe you can write the axioms for a topological space using this operator, the empty set, and the symbol $=$.
It is not extremely satisfying, because it still uses three different symbols, compared to two for a Boolean algebra (NAND and equality). Moreover, the important one takes four inputs, compared to two for a Boolean algebra.
Note that we probably can't use exactly "a law of composition on a set of elements". As you know, everything here is in terms of subsets of a fixed set, so the primitive objects are not exactly elements but subsets of that fixed set. It is not exactly like in algebra.
Well, by looking more at the Wikipedia page, I now think we can maybe do better.
Screenshot-2024-04-14-at-11.28.44AM.png
You can maybe define a single operator of the type for some nonnegative integer and rewrite this equation in terms of this operator only.
But I think we will still have to put $\emptyset$ in the entries of our operator in order to reobtain all the axioms of a topological space, as is required when you use this single axiom, if you look at this Wikipedia page.
Anyway, this is probably the best we can do to answer your question.
The final answer should be this:
If you define the operator by:
then the preceding equation can be written:
So that a topological space can be defined as a set together with an operator such that:
It looks funny but I think it's correct (tell me if I made a mistake!).
Hmm, it doesn't really work, because there is no way to get back to the closure, but it must be fixable by allowing more inputs.
This one must be correct:
Define the operator by:
Now Pervin's equation can be rewritten:
So that a topological space can be defined as a set together with an operator such that:
for every .
To get back to a topological space in terms of closure, define:
and then follow what's said on the Wikipedia page about Pervin's axiomatization and Kuratowski axioms in general to get back to the usual definition.
I'm still a bit surprised by what I wrote, but I invite you to verify everything and tell me if it doesn't work!
Damn, there is still a mistake.
Because
But maybe it's correct because you should get at some point. I don't know.
Brr, I think it's all very confused, but it may or may not be possible to get a definition in this style that works. I'll think about this.
Now I think none of this makes sense but I tried :upside_down:
Maybe I'll have an "epiphany" later :sweat_smile:
Maybe not what you're looking for, but (classically) you can define compact Hausdorff spaces as algebraic structures on sets, by specifying what every ultrafilter converges to (https://ncatlab.org/nlab/show/compactum#algebras). You can weaken this definition to get all topological spaces by only giving a relation that says which ultrafilters converge to which points (https://ncatlab.org/nlab/show/relational+beta-module).
Jean-Baptiste Vienney said:
Maybe I'll have an "epiphany" later :sweat_smile:
Thanks, yeah, I appreciate you writing all this out; it'll give me lots to work through and think about.
Max New said:
Maybe not what you're looking for but (classically) you can define compact Hausdorff spaces as algebraic structure on sets that specifies what every ultrafilter converges to (https://ncatlab.org/nlab/show/compactum#algebras). You can weaken this definition to get all topological spaces by only giving a relation that says which ultrafilters converge to which points (https://ncatlab.org/nlab/show/relational+beta-module)
Yeah that’s great. I read up a little bit on Stone duality which sounds relevant.
Jean-Baptiste Vienney said:
[Kuratowski's closure operator]
You can maybe define a single operator of type $\mathcal{P}(X)^n \to \mathcal{P}(X)$ for some nonnegative integer $n$ and rewrite this equation in terms of this operator only.
Kuratowski's closure definition and Pervin's single-equation expression are very cool (in fact Pervin's equation is satisfied also for modular lattices, not just topologies!), but there are good reasons that others like Moore used different axioms.
If you want to think about topology using algebra, Moore closure is what I would focus on more (so, split axiom K4 into K4' and K4''). The reason is very practical: lattices of submonoids, subalgebras, etc. organized by generators do not satisfy K4 ($\overline{A \cup B} = \overline{A} \cup \overline{B}$), but they do satisfy K1–K3 and monotonicity.
Let's say a set of generators $J$ generates a group, usually written $\langle J;\rho_G\rangle$; you can think of the generators as bijective functions, and generation as composing them with each other ad infinitum. (And everything here is a subgroup of some symmetric group, if any readers are wondering where the generating relations went.) $\langle -;\rho_G\rangle$ acts as a closure operator by K1–K3+K4', and yet for generating sets $J_H, J_K$ (that is, $\langle J_H;\rho_G\rangle, \langle J_K;\rho_G\rangle$ are subgroups of $G$):
$$\langle J_H;\rho_G\rangle\cup\langle J_K;\rho_G\rangle \neq \langle J_H\cup J_K;\rho_G\rangle.$$
In fact, the LHS isn't even a group unless $\langle J_H;\rho_G\rangle \subseteq \langle J_K;\rho_G\rangle$ or $\langle J_K;\rho_G\rangle \subseteq \langle J_H;\rho_G\rangle$!
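A tiny Python illustration of this failure (my example, in the cyclic group $\mathbb{Z}/6$ rather than a symmetric group): the subgroup-generated-by operator is a Moore closure, yet the union of two subgroups is generally smaller than the subgroup their union generates:

```python
def gen(generators, n):
    """Moore closure: the subgroup of Z/n generated by a set of elements."""
    S = {0} | set(generators)
    while True:
        new = {(a + b) % n for a in S for b in S} - S
        if not new:
            return S
        S |= new

n = 6
H = gen({2}, n)   # {0, 2, 4}
K = gen({3}, n)   # {0, 3}

union = H | K           # {0, 2, 3, 4}: not a subgroup, since 2 + 3 = 5 is missing
join = gen(H | K, n)    # the closure of the union is all of Z/6

assert (2 + 3) % n not in union
assert union != join
assert join == set(range(6))
```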
Which is very useful, because it allows us to think of a new join operator on the sublattice of fixed points: $H \vee K = \overline{H \cup K}$. We can do this because of a theorem of Knaster and Tarski that the fixed points of monotonic functions on complete lattices form a lattice. And so we can think of non-associative algebraic structures as the lattice points that are not fixed by the closure under composition.
In contrast, $\overline{A \cap B} = \overline{A} \cap \overline{B}$ holds in every case of closure operators in algebra (at least of which I am aware).
And this survives even up into the mountain-tops of abstraction in the [[Lawvere-Tierney topology]].
So if you like algebra I would recommend developing intuition for Moore closures rather than Kuratowski's, and accept that at some point, you may need to discard extensivity. That should require the least re-learning, I think?
Another takeaway is that the shortest, most "elegant" description accessible to beginners at one level of abstraction (such as Pervin's single equation) is rarely the most durable/general at higher levels. The exceptions to this rule are things written by Lawvere. :)
Also @Julius Hamilton very good to be aware that what allows us to construct all of boolean logic from NAND etc. is that NAND is a non-associative operation! Good to be aware of which boolean operators are monoids vs which not, in general.
In general there can be "more information" packed into a single non-associative operation. So as an example, the finite loop formed by the unit octonions (not associative) requires only 3 generators, while its Moore closure, the extraspecial group of order 128 (plus type), has rank 6. The group requires more generators, because a group is a more constrained algebra.
In contrast, $\overline{A \cap B} = \overline{A} \cap \overline{B}$ holds in every case of closure operators in algebra (at least of which I am aware).
That's just not true. In fact, it's rarely true. Quick example: if $V$ is a vector space and $\overline{(-)}$ sends a subset to the vector subspace it spans, and if $v, w$ are distinct nonzero elements that are scalar multiples of each other, then...
In the case of topological closure, the axiom that you add to Moore closure is preservation of finite joins, not finite meets.
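The vector-space example above can be checked concretely. A minimal Python sketch (my choices: the plane over the field $\mathbb{F}_3$, with $w = 2v$) showing that the span closure does not preserve meets:

```python
p = 3  # work over F_3 so that distinct nonzero scalar multiples exist

def span(S):
    """Moore closure: the F_p-subspace of F_p^2 spanned by the vectors in S."""
    closure = {(0, 0)}
    while True:
        new = {((a * u[0] + x[0]) % p, (a * u[1] + x[1]) % p)
               for a in range(p) for u in S for x in closure} - closure
        if not new:
            return frozenset(closure)
        closure |= new

v, w = (1, 0), (2, 0)   # distinct, nonzero, scalar multiples of each other

lhs = span(set())            # closure of {v} ∩ {w} = ∅ is the zero subspace
rhs = span({v}) & span({w})  # both closures are the same line through v

assert lhs == frozenset({(0, 0)})
assert rhs == frozenset({(0, 0), (1, 0), (2, 0)})
assert lhs != rhs            # so closure(A ∩ B) ≠ closure(A) ∩ closure(B)
```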
On a separate note: while questions about minimal presentations of the theory of Boolean algebras (how many operations do you need? how many axioms do you need?) are amusing, I've never derived a whole lot of insight from them. I guess part of their interest might be circuit design?
While I'm on this, I'll quickly register a pet peeve, and then I'll back away. You hear people say things like "you only need one operation, like NAND, to describe Boolean algebras". Um, no. Unless you're living a hundred years ago and think that all sets are nonempty, by fiat.
Thanks a bunch for catching that! It's a great example. I will have to get more specific about which areas of algebra I was thinking of.
Perhaps I should say "you only need one operation to implement Boolean algebra on a circuit board", where the non-emptiness criterion is dealt with by the digital abstraction: either a wire is rounded down to 0 volts or up to 5 volts. (Of course, when it really matters, some analog-digital converters also use non-Boolean truth values as a means of error correction, but that's not so much to do with empty sets.)
As for the second note, I think we're in agreement? I am advocating adding monotonicity to Kuratowski's axioms, and then subadditivity, or alternately, proving that if a closure is homomorphic on joins then it defines a topology.
By the way, that second note with the "pet peeve" -- that's not meant to be picking on anyone in this discussion! It's just something I see all the time, and it bugs me a little every time, but I do recognize that it's not such a big deal. Most pet peeves aren't!
Todd Trimble said:
In contrast, $\overline{A \cap B} = \overline{A} \cap \overline{B}$ holds in every case of closure operators in algebra (at least of which I am aware).
That's just not true. In fact, it's rarely true. Quick example: if $V$ is a vector space and $\overline{(-)}$ sends a subset to the vector subspace it spans, and if $v, w$ are distinct nonzero elements that are scalar multiples of each other, then...
Ah ok, can you give an example where the extensive monotonic closure operator that doesn't preserve meets is a self-map, e.g. $\overline{(-)}: L \to L$? I think that's the statement I intend to make, and I believe counterexamples to be rare.
I certainly don't believe that $\overline{A \cap B} = \overline{A} \cap \overline{B}$ holds generally. Indeed, the operator in the Lawvere-Tierney example I mentioned is a self-map. Having two topoi with compatible closure operators is beyond me, but interesting.
I believe you that they are out there, but I maintain that they are rare, because usually the lattices e.g. in algebraic varieties are formed from generators, and the lattice is residuated by the operation on generators used to define the closure. So generative effects (e.g. non-homomorphic maps over the operation) are covariant with joins, sort of tautologically.
Sorry, I'm not following what you want. I was taking the power set $\mathcal{P}(X)$. (In the language I speak, a [[Moore closure]] operator is a monad on a power set $\mathcal{P}(X)$.) Of course one could work with other suitable posets as well, such as a set of down-sets, or topologies, etc. etc.
Also, I want to touch base on your terminology. By "extensive", do you mean the condition $A \subseteq \overline{A}$ (which I would call "expansive")? Or something else?
Also, in the language I speak, closure operators are self-maps by definition, if by "self-map" you mean an endomorphism. So here too I'm confused by your question.
(I'm going to be heading out in a few minutes, so it may be some hours before I can get back to you.)
All that lines up -- maybe spell out concretely what you wanted $\overline{\{v\}} \cap \overline{\{w\}}$ to mean then (the common subspace?). You described the objects as subsets but also subspaces, which sounds like a map from a lattice of, say, finite-cardinality sets of vectors to a lattice of finite-dimensional subspaces.
So the power set in my example is the set of all subsets of (the underlying set of) the vector space $V$, and $\cap$ is the ordinary intersection of subsets. The closure operator takes a subset $S$ of $V$ to the vector subspace that is spanned by $S$, considered as a subset. For the ellipsis I had, I meant to consider the subsets $\{v\}$ and $\{w\}$, where $v, w$ were described.
ahhhh ok thanks. So there's an adjoint you've hidden inside that, which takes us into Vect then back into Set. Ok, great, thank you!
I'll need to think of how to prescribe such maps... I want to stay in the same category "the whole time", if that even makes sense; absolutely, limits and colimits exchanging is a great way to get inequalities.
Thanks again! I need to work on my communication a lot! Too many unstated assumptions make for poor understanding.
Would be cool if someone could suggest some exercises for me to prove to get up to speed in the above concepts. Thanks.
I am inclined to try to state definitions, as that alone will help to clarify understanding.
For example, a topology is defined in terms of a set. What is a set?
According to the formalism I have studied (Ebbinghaus 2021), a set is an element of a domain which is a model of a “set theory”.
Let me consult with Ebbinghaus on the notation for “A model of a theory.” BRB.
I realize now that Ebbinghaus et al. do not use the term “theory”, and they also do not use the term “language” alone, but only in the phrase “alphabet of a first-order language”; which is definitely noteworthy.
Instead, they use these concepts:
Next, I will be ready to give a concise definition of "a model of a set of formulas $\Phi$", and finally to pose the question, "If a set is an element of a model of a 'set theory', then can we delineate 'a set theory' in general, which encompasses multiple specific set theories?"
I don't know Ebbinghaus, maybe it's a great reference, but learning model theory before basic set theory sounds like playing the game on hard mode. Like, to the extent you might not end up finishing the game?
There are many more senior people here, but FWIW: two alternatives that are free, category-friendly in their treatment, require almost no prerequisites, and are designed for self-study:
Clive Newstead's text. It does talk about the space of formulas (uses the Kleene star) and proves some cool results on logic and set theory throughout, while also giving a solid foundation in everything else. It doesn't directly talk about topology, though, iirc. Still recommend.
I think Schechter's analysis book covers point-set topology, Moore closures, the foundations of set theory, etc., and it's very self-contained, as well as being easily the most categorical analysis book I've ever seen. It's not pretty, it's crammed, and the indentation can be downright chaotic, but it still pays really great attention to the kinds of foundational issues you seem to enjoy.
I'd love to hear what @Todd Trimble recommends, on algebraic closures, monads, etc in particular.
(I learned the algebraic lattice theory stuff by reading category theory texts, and then popping back into 70-year-old papers by Øystein Ore and Hanna Neumann. Clearly I have some blind spots, though!)
Thanks. I’ll read both those texts. But I actually have enough reading material on my plate right now. It’s been really satisfying to stick to Ebbinghaus and try to understand every part fully. I’m really enjoying applying my understanding to this right now. I agree that my question is actually a detour question, but I want to finish formulating the question of what a set can be defined as, before returning to questions about topology. But that said, if anyone can simply hand me any exercises specifically relating to the topology-algebra-Boolean algebra questions in this thread, that would be ideal.
I’ll soon post the next segment on defining a model of a set of formulas.
Chapter 5 of Schechter looks good!
And section C.
Actually, the below was what I was drafting, but now that I have Schechter, I’ll leave off on the below and switch to that.
“””
The next part, I don’t feel as conceptually confident in yet. I have some subtle questions.
First, we define the satisfaction relation.
There are only 2 given rules for the satisfaction relation. The rest are “inductive”, in the sense that they are implications defined in terms of other cases of satisfaction holding.
These two base cases are:
In the above, $\equiv$ is the symbol in the alphabet which was called "the equality symbol". However, I do not think we have yet been given any information on the meaning of this symbol. In other words, this is the first time we are seeing what the semantics of this symbol are.
A preliminary to this was defining the meaning of terms inductively, which I leave out for now, as the meaning seems intuitive.
What (1) says is that if the interpretation maps two terms $t_1$ and $t_2$ to the same element of the domain, we can "write the sentence" $t_1 \equiv t_2$.
Because this is an if-and-only-if statement, it follows that the sentence $t_1 \equiv t_2$ tells us that, under the interpretation, the two terms denote the same element.
(…)
“””
The subject of this thread was "basic topology" but by now it's drifted to several very different subjects. This reminds me of Matteo Capucci's comment:
In Julius' account of his experience I also see something I experienced many times (which might not be what they're talking about but here we go), which I feel like calling 'ADD-mode' based on what I know about that condition (which, ftr, I haven't been diagnosed with). Sometimes my mind just gets incessantly distracted in side matters during problem solving.
It would be possible to learn topology without getting massively sidetracked into axiomatic set theory, monads and other topics. After all, it's not as if most students learn a lot about set theory before studying topology: most only use "naive set theory", and most never bother learning axiomatic set theory. They just take a course on topology, or grab a textbook on topology and read that!
But if for some reason you want to reach topology in a 100% rigorous way from the ground up (which I don't actually recommend), one standard approach would be:
There are also more modern approaches using category theory, but each approach takes roughly a year of work, and it's probably not good to start by studying all possible approaches.
ahhhh ok thanks. So there's an adjoint you've hidden inside that, which takes us into Vect then back into Set. Ok, great, thank you!
Well, yes and no. Yes: for any algebraic theory (which we here in category land often like to think of in terms of Lawvere theories, or alternatively in terms of monads, or even something else altogether), if you have a model $M$ of the theory, then you can define a closure operator that takes any subset $S$ of $M$ to the intersection of all substructures or submodels of $M$ that contain $S$. This intersection is again a submodel, and the operation thus defined is a Moore closure operation, i.e., gives a monad on the poset $\mathcal{P}(M)$. It's a nice exercise to prove this (and it's not very difficult).
But also no, in the sense that you don't have to start with a monad on $\mathbf{Set}$; the construction works much more broadly. For example, if $K$ is a field, then you could define the closure so that $\overline{S}$ is the smallest subfield that contains $S$, i.e., the intersection of all subfields that contain $S$. But fields are not monadic over sets!
Basically, any time you have a collection of subsets of a set $X$ that is closed under arbitrary intersections of subcollections, then you can likewise define a Moore closure operator in the manner described above. In fact, this thread illustrates that: if $X$ is a set equipped with a topology, and you take the collection of subsets of $X$ that are closed relative to the topology, then that collection is closed (different meaning of "closed"!) under arbitrary intersections of subcollections.
These are very pleasant and easy things to prove.
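As a Python sketch of that exercise (the family below is an arbitrary intersection-closed example of mine): intersecting the members of an intersection-closed family containing $X$ always yields an extensive, monotone, idempotent operator, though not necessarily a union-preserving one:

```python
from itertools import chain, combinations

X = frozenset({1, 2, 3, 4})

# An intersection-closed family of subsets of X, containing X itself.
M = [X, frozenset({1, 2}), frozenset({1, 3}), frozenset({1}), frozenset()]
for C in M:
    for D in M:
        assert C & D in M   # verify the family really is intersection-closed

def cl(A):
    """Moore closure: intersect all members of M containing A."""
    return frozenset.intersection(*[C for C in M if A <= C])

def subsets(S):
    return [frozenset(c) for c in chain.from_iterable(
        combinations(sorted(S), r) for r in range(len(S) + 1))]

for A in subsets(X):
    assert A <= cl(A)                      # extensive
    assert cl(cl(A)) == cl(A)              # idempotent
    for B in subsets(X):
        if A <= B:
            assert cl(A) <= cl(B)          # monotone

# Unlike topological closure, this one need not preserve unions (no K4):
assert cl(frozenset({2, 3})) != cl(frozenset({2})) | cl(frozenset({3}))
```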
As a pure aside: you may have heard of the Kuratowski 14 problem. Given a topological space $X$, and starting with a subset $A \subseteq X$, how many different sets can you generate from $A$ by applying the closure operator and the complementation operator? Well, it depends on which $X$ and which $A$, but 14 is the upper limit. This is a famous and classical exercise. What seems less well known is that actually, the same is true for any Moore closure operator whatsoever; topological closure is a bit beside the point. (I don't say you can always hit the upper bound 14, but 14 is an upper bound regardless.) For example, if you start with the theory of monoids, and take for example the free monoid on two letters, and take the Moore closure associated with the collection of submonoids, then for a suitable $A$ you can get 14; this is a nontrivial "exercise" in the theory of formal languages.
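The 14-set bound is fun to verify by brute force. Here is a Python sketch on a small finite space (the particular topology is an arbitrary example of mine, and this tiny space need not attain the bound; the code only confirms that no starting set exceeds 14):

```python
from itertools import chain, combinations

X = frozenset({0, 1, 2, 3})
opens = [frozenset(), frozenset({0}), frozenset({1}), frozenset({0, 1}),
         frozenset({0, 1, 2}), X]   # a topology on X
closed = [X - U for U in opens]

def cl(A):
    """Topological closure: smallest closed set containing A."""
    return frozenset.intersection(*[C for C in closed if A <= C])

def subsets(S):
    return [frozenset(c) for c in chain.from_iterable(
        combinations(sorted(S), r) for r in range(len(S) + 1))]

def orbit(A):
    """All sets reachable from A by closure and complementation."""
    seen, frontier = {A}, [A]
    while frontier:
        B = frontier.pop()
        for C in (cl(B), X - B):
            if C not in seen:
                seen.add(C)
                frontier.append(C)
    return seen

# Kuratowski's theorem: no starting set generates more than 14 sets.
assert max(len(orbit(A)) for A in subsets(X)) <= 14
```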
Eric Downes wrote:
I don't know Ebbinghaus, maybe its a great reference, but learning model theory before basic set theory sounds like playing the game on hard mode. Like to the extent you might not end up finishing the game?
Yes, with emphasis on basic. Maybe you can liken it to learning a foreign language. Most people want to learn a foreign language so that they can speak to other speakers, with facility and fluency, and not to become experts in the grammar -- mostly, all the fine points can come later. Likewise, with set theory, it's best to gain facility with how it works as a concomitant to studying something else, like real analysis or topology or algebra. The first chapter of Munkres's topology book gives an idea of what I mean. Basically, do a ton of exercises, and fluency will eventually be yours.
Eric Downes also wrote:
I think Schechter's analysis book
covers point-set topology, Moore closures, the foundations of set theory, etc. and its very self-contained as well as being easily the most categorical analysis book I've ever seen.
I've never looked at it, but Toby Bartels (once very active at the nLab) used to refer to it all the time, with approval.
Eric again:
I'd love to hear what @Todd Trimble recommends, on algebraic closures, monads, etc in particular.
Eh, I might not be the best one to ask. Regarding algebraic things, I mostly used Lang's Algebra as a graduate student (I've seen all three editions, and the first edition has a kind of rough charm that later editions paved over, to some degree). But later in life, I've found that Jacobson's books have a lot of treasures not found in Lang; they're nicely complementary.
I learned category theory (including monads) the old-fashioned way: reading Mac Lane's book (and here too, I find that the first edition has a kind of freshness and charm that the second edition didn't improve upon). But a lot of people here might find him too old and crusty, or possibly too demanding in the mathematical prerequisites, and point to newer books, with which I am less familiar. The trouble is, I've not experienced the pleasure of being a classroom instructor in category theory [an exception being taking over for Ross for an undergraduate course while he was away; Mark Weber was in that class]; if I were, then I would be on more familiar terms with the books by Awodey, Leinster, Riehl, etc., and know better what to say.
I can't help but think that learning from the nLab would be an oftentimes weird experience. There is richness aplenty, but my god some of it reads forbiddingly.
Todd Trimble said:
As a pure aside: you may have heard of the Kuratowski 14 problem. Given a topological space $X$, and starting with a subset $A \subseteq X$, how many different sets can you generate from $A$ by applying the closure operator and the complementation operator? (...)
What seems less well known is that actually, the same is true for any Moore closure operator whatsoever; topological closure is a bit beside the point. (...)
if you start with the theory of monoids, and take for example the free monoid on two letters, and take the Moore closure associated with the collection of submonoids, then for a suitable $A$ you can get 14; this is a nontrivial "exercise" in the theory of formal languages.
Thanks! Somehow I'd not heard of it, though it's quite fun to play with already.
For the generic poset problem: it appears I'm allowed to assume the poset is a uniquely complemented lattice, otherwise the problem doesn't seem well-defined. This is reasonably WLOG-safe due to a result of Dilworth that every lattice is a sublattice of a uniquely complemented one.
Otherwise I'll try not to assume anything else, such as modularity, etc. If there are other "suitable" restrictions you do mean to require of the lattice, do tell!
(I don't think submonoid lattices are in general uniquely complemented, though, so either I am missing something or the objects in the poset must live in some kind of formal completion of the monoid theory? I wonder if those objects have any meaning.)