In the category of vector spaces over a field $k$, Wedderburn's theorem says a semisimple algebra over $k$ is a finite product of matrix algebras with entries in some [[division algebra]] over $k$.
For example, a finite-dimensional simple algebra over $\mathbb{R}$ is an $n \times n$ matrix algebra with entries in $\mathbb{R}$, $\mathbb{C}$ or $\mathbb{H}$, and a semisimple algebra over $\mathbb{R}$ is a finite product of such matrix algebras.
In the category of abelian groups, the Wedderburn-Artin theorem says a semisimple ring is a finite product of matrix rings with entries in division rings.
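Spelled out, the standard statement is that such a ring $R$ decomposes (up to isomorphism) as
$$R \;\cong\; M_{n_1}(D_1) \times \cdots \times M_{n_k}(D_k),$$
where each $D_i$ is a division ring and $M_{n_i}(D_i)$ is the ring of $n_i \times n_i$ matrices with entries in $D_i$.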
I think I've seen a version of Wedderburn-Artin for the category of $\mathbb{Z}/2$-graded vector spaces over a field. There's a notion of $\mathbb{Z}/2$-graded division algebra which is actually a bit subtle, etc.
So, I'm wondering if someone has tried to generalize Wedderburn-Artin to, say, a sufficiently nice symmetric monoidal abelian category $C$. I think we can define 'ideals' for a monoid object in $C$. And I think we can define a 'division algebra' in $C$ to be a monoid object in $C$ that has no left or right ideals except $0$ and itself.
I think we can define a 'semisimple' monoid in $C$ to be a monoid whose category of actions is a [[semisimple category]].
Then we can try to prove any semisimple monoid in $C$ is a finite product of matrix algebras over division algebras in $C$.
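To make 'ideal' precise, one natural guess (just a sketch, writing $\mu \colon A \otimes A \to A$ for the multiplication of a monoid object $A$ in $C$): a left ideal could be a subobject $\iota \colon L \hookrightarrow A$ such that the composite
$$A \otimes L \xrightarrow{\;1_A \otimes \iota\;} A \otimes A \xrightarrow{\;\mu\;} A$$
factors through $\iota$, and similarly for right and two-sided ideals.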
That would be a monoid for which the only non-trivial congruence is the maximal one, I think.
"That" meaning what I was calling a "division algebra", I guess?
No, the semisimple monoid
Oh, better.
But isn't that more appropriate for simple monoids? Isn't there a nontrivial congruence on the semisimple real algebra $\mathbb{R} \oplus \mathbb{R}$, for example? There's certainly a nontrivial two-sided ideal.
Hmmm... I missed the "abelian" condition and was drawing intuition from the set-monoid case, where the action of a monoid on itself is not decomposable in its category of actions. Is $\mathbb{R} \oplus \mathbb{R}$ decomposable in the way suggested by the notation within its category of actions?
(I think the answer is yes; I had also ignored the abelianness condition on semisimple categories)
Yes, $\mathbb{R} \oplus \mathbb{R}$ is the common notation for the commutative ring that deserves to be called $\mathbb{R} \times \mathbb{R}$ in the category of commutative rings, and it is the direct sum of two $\mathbb{R}$-modules.
(For nonexperts: it's just the ring of ordered pairs of real numbers, with componentwise $+$ and $\times$. A module of this thing is just a pair of real vector spaces.)
When you say "category of actions", you need some finite generation condition, right? is not a finite direct sum of simple -vector spaces, for example.
You're right... let me think about how that usually gets handled.
Oh, I screwed up. This is wrong:
John Baez said:
I think we can define a 'semisimple' monoid in $C$ to be a monoid whose category of actions is a [[semisimple category]].
In the case where $C = \mathsf{Ab}$, for example, I want a semisimple monoid to be a semisimple ring. And this is not defined to be a ring whose category of modules is semisimple; it's a ring that's semisimple as a left module over itself!
The word "semisimple" is massively overloaded in algebra - I just used it in 3 different standard senses.
So let me define the key one here: a module of a ring is semisimple if it's a direct sum of simple modules, which in turn are defined to be nonzero modules with no nonzero proper submodules.
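For instance (a standard example): the ring $M_n(D)$ of $n \times n$ matrices over a division ring $D$ is semisimple as a left module over itself, since the columns give a decomposition
$$M_n(D) \;\cong\; D^n \oplus \cdots \oplus D^n \qquad (n \text{ copies}),$$
and each column space $D^n$ is a simple left $M_n(D)$-module.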
Okay, here's the theorem Morgan probably wanted me to say: "Moreover, a ring R is semi-simple if and only if the category of finitely generated R-modules is semisimple. "
A crux of the usual proof is that if $I$ is a minimal left ideal of $R$ then $\mathrm{End}_R(I)$ is a division ring. To make sense of endomorphism monoids/rings we might need a cartesian-closedness condition on our abelian category?
If we have that, I think the argument should proceed as in the classical case - $\mathrm{End}_R(I)$ will be a division ring in the sense you described, since if it had a non-trivial ideal we could take its annihilator to contradict minimality (minimality meaning simple-ness, I expect), and we can also deduce that the endomorphism ring is a direct sum of these division rings.
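Concretely, in the classical case the Schur-style version of that step goes: if $f \colon I \to I$ is a nonzero $R$-linear map, then $\ker f$ and $\operatorname{im} f$ are left ideals of $R$ contained in $I$, so minimality forces
$$\ker f = 0, \qquad \operatorname{im} f = I,$$
hence $f$ is invertible and $\mathrm{End}_R(I)$ is a division ring.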
The converse might be trickier? It's hard for me to be sure.
Thanks! I was lazily hoping someone had already done this generalization, but you're diving in and starting to do it!
Heh Wedderburn-Artin is the first result I encountered in a "grown-up" representation theory course in my Masters, and it's remained a little mysterious to me so I was keen to look at it more closely. I'm not on top of the literature for this, though.
I'm far from an expert on Wedderburn-Artin, but I seem to need to generalize it for my work on the tenfold way and its generalizations.
By the way, did you really mean cartesian closedness? I've often seen symmetric monoidal closed categories of modules of a commutative ring, with an internal hom right adjoint to the tensor product, but never cartesian closed ones. (In fact I think there should be some obstruction to interesting cartesian closed categories when the product is a biproduct, as it is in abelian categories.)
Hmm, here's a generalization that goes in a different direction:
So [[fusion categories]] are merely monoidal, not symmetric monoidal, which is more general than I need, but also they have other properties that I may not want to assume. Most notably, they force us to work over a field, so Ostrik's theorem doesn't include the classical Artin-Wedderburn theorem:
"A fusion category is a rigid semisimple linear (Vect-enriched) monoidal category (“tensor category”), with only finitely many isomorphism classes of simple objects, such that the endomorphisms of the unit object form just the ground field k."
The author claims Victor Ostrik showed something like this:
Any semisimple algebra object in a fusion category C is isomorphic (as an algebra object) to the internal endomorphisms End(X) for some object X in a semisimple module category M over C.
John Baez said:
By the way, did you really mean cartesian closedness?
I suppose not, it seems like monoidal closed is what I need, since I probably want an internal hom right adjoint to the tensor product? Or at least that's the relationship that holds when we're really working in a category of (left) $R$-modules, although the tensor product isn't globally defined...
The main thing is that I want to be able to talk about the internal monoid of endomorphisms of my object.
Yes, I think you get all those things in the category of modules of a commutative ring, which is a paradigmatic example that I want to be able to handle.
I don't know what you mean by saying the tensor product isn't "globally defined". For (left) modules of a commutative ring, which automatically become bimodules, the tensor product is well-defined.
I'd be delighted if I could prove Wedderburn-Artin in any symmetric monoidal closed abelian category.
Or if anyone could.
A very first step would consist of defining what a matrix monoid over a monoid $A$ in a symmetric monoidal closed category with biproducts is. I would say that the underlying object of the object of $n \times n$ matrices is $\bigoplus_{1 \le i,j \le n} A$, and then you should be able to define the multiplication of matrices as some morphism $\big(\bigoplus_{i,j} A\big) \otimes \big(\bigoplus_{i,j} A\big) \to \bigoplus_{i,j} A$. I thought about that the other day but I didn't see how to define this multiplication.
(In our case, I guess that this special case is sufficient; in general it would give a kind of graded monoid, which should make the appropriate associativity diagram commute and come with a unit.)
That's the kind of thing which seems obvious to generalize to a purely categorical setting but is not, to me at least. I would be glad if you could explain it to me. I'm not very experienced with monoidal closed categories.
So yeah, let's define matrices first :upside_down: , that's a tool I would be glad to be able to use
If you had matrices, you could also try to define their determinants which would be fun
John Baez said:
I don't know what you mean by saying the tensor product isn't "globally defined". For (left) modules of a commutative ring, which automatically become bimodules, the tensor product is well-defined.
I mean in the non-commutative case; the commutative case of Wedderburn-Artin is not quite so exciting.
My progress in defining matrices is currently at this point:
I would now be glad to succeed in defining
One other useful step would be to define
Hopefully, you will be able to define the multiplication of matrices from these two
Jean-Baptiste Vienney said:
A very first step would consist of defining what a matrix monoid over a monoid $A$ in a symmetric monoidal closed category with biproducts is. I would say that the underlying object of the object of $n \times n$ matrices is $\bigoplus_{1 \le i,j \le n} A$, and then you should be able to define the multiplication of matrices as some morphism $\big(\bigoplus_{i,j} A\big) \otimes \big(\bigoplus_{i,j} A\big) \to \bigoplus_{i,j} A$. I thought about that the other day but I didn't see how to define this multiplication.
There should be a way to do it. Here is something simpler, maybe helpful, that doesn't use the monoid structure on $A$ at all.
There's a morphism called internal composition:
$$\circ \colon (Y \multimap Z) \otimes (X \multimap Y) \to (X \multimap Z)$$
in any symmetric monoidal closed category. (In fact a monoidal closed category is sufficient if we tensor things in the correct order, but I'm too lazy to remember the correct order so I'll use a symmetric monoidal closed category.) Using the hom-tensor adjunction we can get the above morphism if we have a morphism
$$(Y \multimap Z) \otimes (X \multimap Y) \otimes X \to Z.$$
But notice that in any symmetric monoidal closed category there are evaluation morphisms
$$\mathrm{ev} \colon (X \multimap Y) \otimes X \to Y.$$
By suitably composing/tensoring these we get the desired morphism $(Y \multimap Z) \otimes (X \multimap Y) \otimes X \to Z$.
By the way, it's also true that internal composition is "associative" and "unital", so for any object $X$ in a symmetric monoidal closed category, the object $X \multimap X$ becomes a monoid object with internal composition as its multiplication.
There's a special case when we take $X = I^{\oplus n}$, the $n$-fold direct sum of the unit object $I$. If our symmetric monoidal category is the category of modules of some commutative ring $k$, $X \multimap X$ consists of $n \times n$ matrices with entries in $k$, and the procedure I'm sketching makes it into a matrix algebra with the usual matrix multiplication.
If we instead want to generalize the algebra of $n \times n$ matrices with entries in some algebra $A$ over $k$, we can do this:
Let $A$ be any monoid in our symmetric monoidal closed category. Then $(X \multimap X) \otimes A$ becomes a monoid since it's a tensor product of monoids.
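For instance, if our category is $\mathsf{Mod}_k$ for a commutative ring $k$ and $X = k^n$, then I believe this recovers the usual matrix algebra over $A$:
$$(X \multimap X) \otimes A \;\cong\; M_n(k) \otimes_k A \;\cong\; M_n(A),$$
with the multiplication coming from internal composition on $X \multimap X$ together with the multiplication of $A$ matching the usual matrix multiplication, entries being multiplied in $A$.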
So, I guess the statement I'd like to prove goes roughly like this:
Suppose $C$ is a symmetric monoidal closed category. We can define left, right, and two-sided ideals in any monoid object $A$. We say $A$ is simple if it has no two-sided ideals except $0$ and $A$, and division if it has no left or right ideals except $0$ and $A$.
Wedderburn for $C$: every simple monoid is isomorphic to one of the form $(I^{\oplus n} \multimap I^{\oplus n}) \otimes D$ where $D$ is some division monoid.
I don't expect this is true without some extra conditions on $C$, like it being an abelian category. I know it's not true unless we add some sort of "finiteness" condition on $A$.
For example, if $C$ is the category of vector spaces, we have:
Wedderburn for $\mathsf{Vect}$: every finite-dimensional simple monoid is isomorphic to one of the form $M_n(D)$ where $D$ is some division monoid in $\mathsf{Vect}$.
Perhaps the only finiteness condition we need is that $A$ has some minimal left ideal. For example Nicholson gives a short proof of this version of Wedderburn's theorem when $C$ is the category of abelian groups.
Wedderburn for $\mathsf{Ab}$: if $A$ is a simple monoid with a minimal left ideal, then it's isomorphic to one of the form $M_n(D)$ where $D$ is some division monoid in $\mathsf{Ab}$.
Morgan pointed out the importance of this "minimal left ideal" business. Paraphrasing him:
A crux of the usual proof is that if $I$ is a minimal left ideal of $R$ then $\mathrm{End}_R(I)$ is a division ring.
Nicholson mentions something called "Brauer's Lemma", which also involves a minimal left ideal and a division ring.
I haven't seen the notation $\multimap$ before. Is this another notation for $[X, Y]$ where $[-,-]$ is an internal hom functor?
Yes, this is the notation used in linear logic for linear implication, i.e. if $X, Y$ are propositions, $X \multimap Y$ is the proposition which means "from one unit of $X$, I get one unit of $Y$", which is not the same as $(X \otimes X) \multimap Y$, which means "from two units of $X$, I get one unit of $Y$".
Categorically, the connectives $\otimes$ and $\multimap$ of linear logic correspond to symmetric monoidal closed categories.
Whereas the connectives $\wedge$ and $\Rightarrow$ of intuitionistic logic correspond to cartesian closed categories.
David Egolf said:
I haven't seen the notation $\multimap$ before. Is this another notation for $[X, Y]$ where $[-,-]$ is an internal hom functor?
Yes, it's an internal hom. This notation emphasizes that we've got an internal hom related to a tensor product that's not necessarily cartesian. For example in Set or any other topos we've got
$$\hom(X \times Y, Z) \;\cong\; \hom(X, Y \Rightarrow Z)$$
but in Vect we've got
$$\hom(X \otimes Y, Z) \;\cong\; \hom(X, Y \multimap Z)$$
where $Y \multimap Z$ is the vector space of linear maps from $Y$ to $Z$. But as Jean-Baptiste noted, it's mainly linear logicians who use the symbol $\multimap$. Also category theorists. $[Y, Z]$ is perhaps more common.
(that symbol is called a lollipop by the way)
Awesome, thanks for clarifying!
By the way, one reason I gave up on this thread is that I realized that this version of Wedderburn's theorem is a bit limited:
Wedderburn for the symmetric monoidal closed category $C$: every simple monoid is isomorphic to one of the form $(I^{\oplus n} \multimap I^{\oplus n}) \otimes D$ where $D$ is some division monoid.
For example, if $C$ is the category of complex vector bundles over a topological space, a monoid in $C$ is an algebra bundle, and I believe there are simple monoids not of this form, coming from algebra bundles that are locally isomorphic to trivial bundles of matrix algebras, but not globally.
So, I think we should expect "Wedderburn for $C$" to hold in the above form only under some conditions, e.g. when all projective objects are free. It would still be interesting (to me) to find a bunch of symmetric monoidal closed categories for which this limited form of Wedderburn holds... or to state Wedderburn in a more general way. But I decided to go in another direction, which took me to separable algebras and Azumaya algebras.
Okay, I finally understand the Wedderburn-Artin theorem well enough to have a chance of generalizing it. As part of my struggles I rewrote these articles:
The Wikipedia article Simple ring had a completely bogus proof of the Wedderburn-Artin theorem which one editor had been complaining about for a year! I deleted it.
So let me try to sketch how Wedderburn-Artin might be generalized. Let $C$ be a closed symmetric monoidal Ab-enriched category with small limits and colimits, though I would like to reduce these hypotheses once I've actually proved what I want to prove!
Every monoid $A$ in $C$ has a category of actions, which I'll call modules and denote as $A\mathsf{Mod}$.
If you want to understand what I'm doing, it's good to think about the example $C = \mathsf{Ab}$, where $A$ will be a ring. This is what I'm trying to generalize. Another example might be a category of sheaves of abelian groups, or sheaves of modules of some fixed commutative ring.
Conjecture A. $A\mathsf{Mod}$ is Ab-enriched with small limits and colimits, and symmetric monoidal if $A$ is commutative.
Conjecture B. $A\mathsf{Mod}$ is $C$-enriched.
The idea here is that given $A$-modules $M$ and $N$ we can first form $M \multimap N$ using the internal hom in $C$, but then pick out a subobject of this which serves as the object of $A$-module homomorphisms. Maybe we can call that ${}_A[M, N]$ if we're using left $A$-modules.
This is different than the abelian group of $A$-module homomorphisms from $M$ to $N$, which also should exist.
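One standard way to build that subobject, I believe, is as an equalizer: writing $a_M \colon A \otimes M \to M$ and $a_N \colon A \otimes N \to N$ for the actions, take
$${}_A[M, N] \;=\; \mathrm{eq}\Big( (M \multimap N) \rightrightarrows (A \otimes M \multimap N) \Big),$$
where one of the two maps is 'precompose with $a_M$' and the other is 'apply $A \otimes -$, then postcompose with $a_N$'.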
We say $A$ is semisimple if $A\mathsf{Mod}$ is a [[semisimple category]] (viewed externally, I guess, as an Ab-enriched category).
$A$ is always a left module over itself in a standard way, giving an object I'll call $\underline{A}$.
When $A$ is semisimple this object will be a coproduct of finitely many simple objects.
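Explicitly, I mean something like
$$\underline{A} \;\cong\; S_1^{\oplus n_1} \oplus \cdots \oplus S_k^{\oplus n_k}$$
with each $S_i$ simple and $S_i \not\cong S_j$ for $i \ne j$.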
Conjecture (External Wedderburn-Artin Theorem). When $A$ is semisimple, the ring of left $A$-module endomorphisms of $\underline{A}$ is a finite direct sum of matrix algebras over division rings.
This should actually be pretty easy, at least if I have enough of the right assumptions on $A\mathsf{Mod}$: for example, I believe the endomorphism ring of any simple object in any abelian category is a division ring by Schur's lemma, and the rest of the argument should mimic this.
(But I didn't actually say $A\mathsf{Mod}$ is abelian - does it follow? - and also we may not need the full force of abelianness to do this argument.)
But what I want is an internal version of Wedderburn-Artin which describes $A$ as a monoid in $C$, not the ring of left $A$-module endomorphisms of $\underline{A}$.
Note that when $C = \mathsf{Ab}$ these are the same! That is, we have an isomorphism of rings between $A$ and the ring of $A$-module endomorphisms of $A$ as a left $A$-module. (Hmm, I guess I need to treat it as a right $A$-module instead, for this to be true.)
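To spell out that parenthetical: for a ring $A$, endomorphisms of $A$ as a left module over itself are exactly the right multiplications, so
$$\mathrm{End}(\,{}_A A\,) \;\cong\; A^{\mathrm{op}}, \qquad \mathrm{End}(\,A_A\,) \;\cong\; A,$$
where ${}_A A$ means $A$ as a left module over itself and $A_A$ means $A$ as a right module, which is why the right-module version gives $A$ itself.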
But in the case of general $C$ I want to "internalize" the external Wedderburn-Artin conjecture to get a description of $A$ itself.
I'd be surprised if your assumptions imply that $A\mathsf{Mod}$ is abelian.
But conjectures A and B should be pretty straightforward enriched category theory, even with Ab replaced by some other Benabou cosmos. E.g. for conjecture B, you can think of $A\mathsf{Mod}$ as a $C$-enriched presheaf category on $A$ regarded as a one-object $C$-enriched category. Then to get the Ab-enrichment, you can use the fact that a monoidal Ab-enriched category is the same as an ordinary monoidal category with a monoidal adjunction $\mathsf{Ab} \rightleftarrows C$, and apply change-of-enrichment along the right adjoint. That right adjoint sends an object $X \in C$ to the Ab-enriched hom $C(I, X)$, where $I$ is the monoidal unit of $C$.
Nice, thanks! That's slick.
I suppose you could get the monoidal structure on $A\mathsf{Mod}$ by noting that when $A$ is commutative, its corresponding one-object $C$-category is a monoidal $C$-category, and so its presheaf category has a Day convolution monoidal structure. But it might be easier to just write down the tensor product explicitly.
The point at which I'm tempted to use the abelianness of $A\mathsf{Mod}$ comes later, in proving Schur's lemma. People define an object to be simple if it has no subobjects other than 0 and itself, but in an abelian category this is equivalent to saying it has no quotients other than 0 or itself. So, in an abelian category you can see that an endomorphism of a simple object is either zero or invertible.
This makes the endomorphism ring a division ring.
Maybe we need less than abelianness for this, or else I could just decree a simple object has no subobjects or quotients other than 0 and itself.
Yeah, that was the first thing that occurred to me too. But why don't you want to assume that $C$ is abelian?
Maybe I'm being silly, but I might want $C$ to be the category of real vector bundles on a topological space, which is not abelian.
Ok. So first of all, that example shows that your assumptions don't imply $C$ is abelian, right? (-:
Right.
Do you have some intuition for what a "simple vector bundle" should be?
A line bundle I guess. Those are the ones whose endomorphism algebra bundles are division algebra bundles.
But I see now that's a kind of "internal" version of being simple: their rings of endomorphisms are huge and not at all division rings.
But btw, also, the place I tend to want abelianness is not $C$ but $A\mathsf{Mod}$, where $A$ is a monoid in $C$.
Anyway, I should think more about this example and see if I even want to handle it.
Note that $C$ is a special case of $A\mathsf{Mod}$ for $A$ a monoid in $C$, namely the monoidal unit.
Anyway, that makes me wonder whether the category of vector bundles is "internally abelian" in some sense.
Luckily I have other motivating examples that don't force me to stretch so far: interesting examples of symmetric monoidal abelian categories $C$ where it seems the 'semisimple algebras' in $C$ are very interesting.
My favorite is the symmetric monoidal category of super-vector spaces over a field, i.e. $\mathbb{Z}/2$-graded vector spaces where we use the symmetry that puts in a minus sign when we switch two odd objects. This is connected to the tenfold way. But people have also generalized a lot of the results to vector spaces with more general gradings!
Okay, here's something a bit different: an attempt to set up a theory of something like semisimple categories and division algebras with very low prerequisites.
Let $C$ be a semiadditive category, i.e. a category with binary biproducts and a zero object (an object that is both initial and terminal).
Such a category is automatically $\mathsf{CMon}$-enriched, i.e. enriched in commutative monoids.
Let's say an object is atomic if all its endomorphisms are zero or invertible.
Let's say two objects are separate if the only morphism between them is the zero morphism.
Let's say our category is molecular if there exists a set of mutually separate atomic objects such that every object is a finite biproduct of copies of these objects.
I don't intend these terms to be permanent terminology, so please don't anyone hassle me about them - I just need some words for now to state a few easy results. But if you know more standard terms please let me know - or even better, useful theorems related to these ideas! (They are much more familiar for additive categories.)
I believe for every atomic object $x$, $\mathrm{End}(x)$ is a division rig, i.e. a [[rig]] such that every nonzero element is invertible.
Thanks to how biproducts work, if $y = x^{\oplus n}$ is the $n$-fold biproduct of an atomic object $x$, then
$$\mathrm{End}(y) \;\cong\; M_n(\mathrm{End}(x)),$$
that is, $\mathrm{End}(y)$ is the rig of $n \times n$ matrices with entries in $\mathrm{End}(x)$.
In a molecular category we then get a kind of Wedderburn-Artin-like result. Every object $y$ is of the form
$$y \;\cong\; x_1^{\oplus n_1} \oplus \cdots \oplus x_k^{\oplus n_k}$$
for mutually separate atomic objects $x_1, \dots, x_k$, and so we get
$$\mathrm{End}(y) \;\cong\; M_{n_1}(\mathrm{End}(x_1)) \times \cdots \times M_{n_k}(\mathrm{End}(x_k)).$$
This is like how every semisimple algebra is a finite product of matrix algebras over division rings!
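For instance, as a sanity check: in the category of finite-dimensional vector spaces over a field $k$, the object $k$ is atomic (every endomorphism is multiplication by a scalar, hence zero or invertible), every object is $k^{\oplus n}$ for some finite $n$, and the formula above gives
$$\mathrm{End}(k^{\oplus n}) \;\cong\; M_n(k),$$
the usual matrix algebra.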
Interesting. How much of that has moved the theorem into the definitions? That is, if you specialize that to the classical Wedderburn-Artin theorem, is the statement "such-and-such category is molecular" closer to the hypotheses of the theorem or to its conclusion?
A few things to say here:
"Schur's lemma", or some abstraction of the usual version, says that in an abelian category an atomic object is the same as a simple object. This has a very short and conceptual proof.
As an immediate consequence, an abelian category is semisimple iff it's molecular.
A semisimple ring can be defined as one whose category of right modules is semisimple, but by the above that's equivalent to saying its category is molecular.
Then by my above generalization of Wedderburn-Artin we see that if $R$ is a semisimple ring, $\mathrm{End}_R(R_R)$ is a finite product of matrix algebras over division rings.
Then we need an extra fact for rings, $R \cong \mathrm{End}_R(R_R)$, to see that if $R$ is semisimple it's a finite product of matrix algebras over division rings. This is the "traditional" Wedderburn-Artin theorem.
So, I think the proof of my generalization of Wedderburn-Artin is pretty much the same as the good proof of the "traditional" version, except that the "traditional" one also needs Schur's lemma to check that simple objects in an abelian category are atomic, i.e. have division rings of endomorphisms.
It's all pretty easy stuff (now that I understand it).
One thing I want to generalize away from the traditional case of rings is the isomorphism $R \cong \mathrm{End}_R(R_R)$, so that my generalized Wedderburn-Artin, which describes endomorphism rigs of objects, can actually describe the objects themselves in some cases.
This is some sort of closed category / enriched presheaf stuff.
Okay, I wrote up a blog post on generalizing the Wedderburn-Artin theorem, and I think I'll declare victory for now, though there still are some issues, like how generally we can pull a trick like $R \cong \mathrm{End}_R(R_R)$.
@Jean-Baptiste Vienney will be happy to see that I've reduced the assumptions to a bare minimum - so for example, my result applies to rigs.
Thank you, I wanted to protest when you were talking about abelian categories but I didn't want to be annoying. I'm glad that it applies to rigs!
Yes, me too! I try to get the basic ideas working before removing assumptions. It was only recently that I understood Wedderburn-Artin well enough to generalize it.