You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For all things related to this archive refer to the same person.
(deleted)
To me this sounds like a semiadditive monoidal category. In the terminology of the nLab it would be a [[distributive monoidal category]] with finite biproducts. Well, except that you don't want a zero object. Once you have tensor products, you don't need to worry about multilinear maps anymore.
(deleted)
(deleted)
Assuming "the category of vector spaces" means "... and linear maps" like usual, then this isn't a map in the category of vector spaces at all.
(deleted)
(deleted)
(deleted)
Jean-Baptiste Vienney said:
More simply, I can ask:
give me a categorical doctrine that allows one to build the map
in the category of vector spaces.
I'll assume we're talking about a vector space and an element of it. Then this isn't a linear map, but it is a polynomial map. (There's a pretty obvious concept of polynomial map between vector spaces, which I can explain if necessary.) So, your map is a morphism in the category of vector spaces and polynomial maps.
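A quick example of a map that is polynomial but not linear (my own illustration, not necessarily the deleted map being discussed):

$$
q \colon V \to V \otimes V, \qquad q(v) \;=\; v \otimes v .
$$

In coordinates, each component of $v \otimes v$ is a product of two coordinates of $v$, so $q$ is homogeneous of degree 2; it is a morphism in the category of vector spaces and polynomial maps, but not in the category of vector spaces and linear maps.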
Jean-Baptiste Vienney said:
I know how to define homogeneous polynomial maps (they are defined in the picture above), but what do you do for general polynomial maps?
A general polynomial is a finite sum of homogeneous polynomials.
Given finite-dimensional vector spaces $V$ and $W$ with chosen linear coordinates $x_1, \dots, x_m$ on $V$ and $y_1, \dots, y_n$ on $W$, a polynomial function $f \colon V \to W$ is one where each coordinate of $f(v)$ is a polynomial in the coordinates of $v$, i.e.

$$
y_j(f(v)) \;=\; P_j\bigl(x_1(v), \dots, x_m(v)\bigr)
$$

for some list of $m$-variable polynomials $P_1, \dots, P_n$.
This definition of polynomial function seems to depend on the choice of coordinates, but in fact it does not.
There are slicker ways to give this definition, but I hope this is clear.
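For instance, here is a toy example in coordinates (my own, not from the discussion): take

$$
f \colon k^2 \to k^2, \qquad f(x, y) \;=\; \bigl(x^2 + 3y,\; xy - 1\bigr).
$$

Each output coordinate is a polynomial in $x$ and $y$, so $f$ is a polynomial function; it is not linear, and it is the sum of its homogeneous pieces of degrees 0, 1 and 2.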
Here's a slicker way, in case anyone cares. Any vector space gives a commutative algebra, namely the free commutative algebra on it, which you can think of as an algebra of polynomial functions. A linear map between vector spaces gives an algebra homomorphism between these algebras, and this gives a faithful embedding of the category of vector spaces into the opposite of the category of commutative algebras. The latter category is called the category of [[affine schemes]]. Taking the full image of this embedding we get the category of vector spaces and polynomial functions between vector spaces.
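In case it helps to see this written out, here is how I would phrase the construction; the use of duals is my own choice of convention, and I'm sticking to finite-dimensional spaces:

$$
V \;\mapsto\; \mathcal{O}(V) := \mathrm{Sym}(V^{*}),
\qquad
\bigl(f \colon V \to W\bigr) \;\mapsto\; \bigl(\mathrm{Sym}(f^{*}) \colon \mathcal{O}(W) \to \mathcal{O}(V)\bigr),
$$

and for finite-dimensional $V$ and $W$ the algebra homomorphisms $\mathcal{O}(W) \to \mathcal{O}(V)$ correspond exactly to polynomial maps $V \to W$, which is why the full image gives the category of vector spaces and polynomial functions.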
I've been working with @Todd Trimble and @Joe Moeller on "polynomial functors", which are discussed in representation theory. These are not functors from the category of vector spaces to itself in the usual sense! They are really functors from the category of vector spaces and linear maps to the category of vector spaces and polynomial functions. (I'm leaving out some conditions that representation theorists often use, like finite-dimensionality, but this is why I'm interested in polynomial functions between vector spaces.)
Thanks, I'm gonna try to understand this and ask you questions. Sorry for having deleted all my messages. I've been quite stressed by my work recently.
I understand. I'm sorry if something I said sounded rude, like telling you some obvious facts you already know. I just like spelling things out methodically, starting from simple stuff and working up.
John Baez said:
Here's a slicker way, in case anyone cares. Any vector space gives a commutative algebra, namely the free commutative algebra on it, which you can think of as an algebra of polynomial functions. A linear map between vector spaces gives an algebra homomorphism between these algebras, and this gives a faithful embedding of the category of vector spaces into the opposite of the category of commutative algebras. The latter category is called the category of [[affine schemes]]. Taking the full image of this embedding we get the category of vector spaces and polynomial functions between vector spaces.
I've been working with Todd Trimble and Joe Moeller on "polynomial functors", which are discussed in representation theory. These are not functors from the category of vector spaces to itself in the usual sense! They are really functors from the category of vector spaces and linear maps to the category of vector spaces and polynomial functions. (I'm leaving out some conditions that representation theorists often use, like finite-dimensionality, but this is why I'm interested in polynomial functions between vector spaces.)
My approach to symmetric algebras (as I talked about in the stream "our work") comes in fact from this question of finite-dimensionality. So let me explain it a little more before we come back to what you were saying.
To speak about the features of vector spaces in a categorical or logical way, we have (differential) linear logic and differential categories. People in categorical quantum mechanics are more interested in speaking about the features of the category of finite-dimensional vector spaces (which they prefer to call the category of finite-dimensional Hilbert spaces), and @JS PL (he/him) was wondering whether we can do the differential stuff in the category of finite-dimensional vector spaces in order to relate the two fields. And he proved that it is impossible, in this paper: Why FHilb is Not an Interesting (Co)Differential Category.

In linear logic, or differential linear logic / differential categories, you can take the exponential of a vector space, which is a kind of space of smooth functions with coordinates in that space, but it always gives something infinite-dimensional unless you start from the zero space, intuitively because you always have all the powers of any nonzero vector in there, and these span something infinite-dimensional. That's why I took the approach of, instead of considering the whole exponential, rather considering the family of its homogeneous pieces, which is a family of finite-dimensional spaces when the original space is finite-dimensional. I found it really interesting to work with the family of all the spaces of formal homogeneous polynomials of degree $n$, rather than with the countable biproduct of all of them. Because it allows you to consider for instance a certain map together with the equation it satisfies (I call this equation "specialty"), which plays a role in the characterization of these families as the only special connected graded bialgebras (which happen to be automatically bicommutative and to have an antipode).

I found that characterizations of the symmetric algebras really similar to mine had been published before (apart from the difference that they have nothing to do with category theory nor logic), and the authors used a much more complicated diagram to express this equality, because they considered the entire symmetric algebra rather than "cutting it into small pieces" as I do (or they used another complicated axiom in another characterization). In both cases, the more complicated axiom that replaces specialty is impossible to express in a kind of "linear logic", whereas all my axioms are expressible as equalities between proofs in a convenient logic.
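To make the finite-dimensionality point concrete, here is a standard count (my own gloss, using the symmetric algebra as a stand-in for the exponential, which is one common interpretation): the whole algebra is infinite-dimensional as soon as the space is nonzero, but each graded piece is finite-dimensional.

$$
\mathrm{Sym}(V) \;=\; \bigoplus_{n \ge 0} \mathrm{Sym}^{n}(V),
\qquad
\dim \mathrm{Sym}^{n}(V) \;=\; \binom{d+n-1}{n} \;<\; \infty
\quad \text{when } \dim V = d < \infty .
$$

So the family $(\mathrm{Sym}^{n}(V))_{n \in \mathbb{N}}$ stays within finite-dimensional spaces even though their biproduct does not.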
John Baez said:
Here's a slicker way, in case anyone cares. Any vector space gives a commutative algebra, namely the free commutative algebra on it, which you can think of as an algebra of polynomial functions. A linear map between vector spaces gives an algebra homomorphism between these algebras, and this gives a faithful embedding of the category of vector spaces into the opposite of the category of commutative algebras. The latter category is called the category of [[affine schemes]]. Taking the full image of this embedding we get the category of vector spaces and polynomial functions between vector spaces.
I've been working with Todd Trimble and Joe Moeller on "polynomial functors", which are discussed in representation theory. These are not functors from the category of vector spaces to itself in the usual sense! They are really functors from the category of vector spaces and linear maps to the category of vector spaces and polynomial functions. (I'm leaving out some conditions that representation theorists often use, like finite-dimensionality, but this is why I'm interested in polynomial functions between vector spaces.)
What is the definition of a polynomial functor?
I guess that from my perspective, a linear map between finite-dimensional vector spaces would give an $\mathbb{N}$-graded-finite-dimensional-algebra homomorphism, and this gives a faithful embedding into a category of finitely-generated (if that makes sense, I'm not sure) $\mathbb{N}$-graded commutative rings.
I don't know if it is that interesting but this has the advantage that everything is finite-dimensional.
I know that it is weird to cut an algebra into pieces like this, but it is something quite natural in homological algebra for instance, where they prefer exterior powers to exterior algebras.
I'm not sure how you can define a morphism of "piecewise" graded rings/algebras like this, but it must be doable. Between two such algebras, it must be given by a family of linear maps which verify some conditions.
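For what it's worth, here is a guess at what such a morphism and its conditions could look like; the names $A_n$, $B_n$, $\varphi_n$ are mine, and this is just a sketch of the obvious candidate, not something fixed in the discussion: a family of linear maps $\varphi_n \colon A_n \to B_n$, one for each degree $n$, such that

$$
\varphi_{m+n}(a \cdot b) \;=\; \varphi_m(a) \cdot \varphi_n(b)
\quad \text{for } a \in A_m,\ b \in A_n,
\qquad
\varphi_0(1) \;=\; 1 .
$$

In other words, exactly the data of a degree-preserving algebra homomorphism, but packaged degree by degree.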
According to Macdonald's book Symmetric Functions and Hall Polynomials, a polynomial functor is a functor from the category of finite-dimensional vector spaces and linear maps to the category of finite-dimensional vector spaces and polynomial functions. Here 'polynomial functions' between finite-dimensional vector spaces are defined in the way I just did. I generalized the definition to infinite-dimensional vector spaces, but if you restrict that to the finite-dimensional case you get his idea.
As you implicitly observed, a lot of interesting constructions on vector spaces are polynomial functors. For example the "$n$th tensor power" functor $V \mapsto V^{\otimes n}$, sending any linear map $f$ to $f^{\otimes n}$, is a polynomial functor.
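To see why this is polynomial in the sense above, here is the $n = 2$ case spelled out in coordinates (my own illustration): a linear map $f \colon V \to W$ has matrix entries $f_{ij}$, and the matrix entries of $f \otimes f$ are

$$
(f \otimes f)_{(i,k),(j,l)} \;=\; f_{ij}\, f_{kl},
$$

so each entry of $f \otimes f$ is a degree-2 monomial, hence a polynomial, in the entries of $f$. The same works for $f^{\otimes n}$ with degree-$n$ monomials, so $f \mapsto f^{\otimes n}$ is a homogeneous polynomial function between hom-spaces.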
In Chapter 1 Appendix A of his book, Macdonald gives a complete classification of polynomial functors.
Thanks that's going to be very interesting to me.
Good! This stuff about polynomial functors deserves to be generalized to some larger class of categories, and it sounds like you may do it.
I don't know all polynomial functors but I know: tensor powers, symmetric powers, exterior powers and divided powers. So it's part of my plan to give them the same categorical/logical approach that I started for symmetric powers. I also know that Schur functors exist, but that will be for after I have first treated the simple powers. So yes, maybe polynomial functors could be approachable too.
But note that the difficulty is to do something not only categorical but logical. You can define these various powers in any symmetric monoidal category enriched over commutative monoids (or abelian groups for the exterior powers) by an equalizer or a coequalizer; that's not difficult. But it is by characterizing the family of, e.g., all symmetric powers of an object as an algebraic structure (instead of a limit), like some kind of graded bialgebra, that you can make it logical. And the proof of this characterization for symmetric powers is kind of difficult.
So I'll give you news about all that! (I mean I'll put everything on Zulip.)
Could you tell me, @John Baez (or somebody else), whether there is an inclusion between Schur functors and polynomial functors, or are they two completely different things? You seem to know both.
Working with vector spaces over any field, every Schur functor gives a polynomial functor.
Over a field of characteristic zero, every polynomial functor that's a finite sum of homogeneous polynomial functors comes from a Schur functor. Indeed we can state an equivalence of categories in this case, between the category of Schur functors and the category of polynomial functors that are finite sums of homogeneous polynomial functors.
This is proved by Macdonald in that appendix I mentioned.
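A small example of the kind of correspondence this equivalence encodes, in characteristic zero (a standard fact, not a quote from Macdonald): the degree-2 homogeneous polynomial functor $V \mapsto V \otimes V$ decomposes into the Schur functors for the two partitions of 2,

$$
V \otimes V \;\cong\; \mathrm{Sym}^2 V \,\oplus\, \Lambda^2 V,
$$

corresponding to $\lambda = (2)$ and $\lambda = (1,1)$. In general, in characteristic zero a homogeneous polynomial functor of degree $n$ is a direct sum of Schur functors $S_\lambda$ with $\lambda$ a partition of $n$, possibly with multiplicities.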
Over fields of nonzero characteristic the situation is more tricky, and I don't really understand it. There's still a functor from the first category to the second, but I think it is not an equivalence.
Actually you have to be a bit careful about how you define these in this case, since two definitions that agree in characteristic zero don't agree in nonzero characteristic!
Thank you very much!
I'm happy to see that these two classes of "regular functors" on vector spaces are related. I'm not surprised that things are different in nonzero characteristic because it is already the case for symmetric powers.
In any characteristic, in the category of vector spaces, the $n$th symmetric power of a vector space $V$ is given by the coequalizer of the permutations $V^{\otimes n} \to V^{\otimes n}$ and the $n$th divided power of $V$ is given by the equalizer of these permutations. In characteristic zero, the equalizer is isomorphic to the coequalizer, which gives a third definition of the symmetric power, as a splitting of the idempotent $\frac{1}{n!}\sum_{\sigma \in S_n} \sigma$. It is this definition that I used, and this is why I worked in symmetric monoidal $\mathbb{Q}$-linear categories. This situation is degenerate from the point of view of linear logic, as the two coincide (if you interpret these powers as two graded exponentials).
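Spelled out, with $\Gamma^n$ as my notation for the divided power (the definitions themselves are the standard ones):

$$
\mathrm{Sym}^n(V) \;=\; \operatorname{coeq}\bigl(\sigma \colon V^{\otimes n} \to V^{\otimes n} \,\big|\, \sigma \in S_n\bigr),
\qquad
\Gamma^n(V) \;=\; \operatorname{eq}\bigl(\sigma \colon V^{\otimes n} \to V^{\otimes n} \,\big|\, \sigma \in S_n\bigr),
$$

and in characteristic zero the idempotent

$$
e_n \;=\; \frac{1}{n!} \sum_{\sigma \in S_n} \sigma \;\colon\; V^{\otimes n} \to V^{\otimes n}
$$

splits, with $\Gamma^n(V) \cong \operatorname{im}(e_n) \cong \mathrm{Sym}^n(V)$, which is the third definition.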
In a symmetric monoidal category enriched over commutative monoids (i.e. without any assumption on characteristic), I believe that I'm gonna obtain a combined characterization of the family of all symmetric powers and divided powers of an object as a kind of combination of two graded bialgebras, but I'm not quite sure. So it would no longer be degenerate from the point of view of linear logic, with two different graded exponentials.
Right, the inability to divide by $n!$ changes everything. In fact if $n < p$ there's an isomorphism between the equalizer of all the permutations $V^{\otimes n} \to V^{\otimes n}$ and their coequalizer, where $V$ is a vector space over a field of characteristic $p$.
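For what it's worth, here is the arithmetic behind this, as I understand it (my own gloss): in a field $k$ of characteristic $p > 0$,

$$
n! \;=\; 1 \cdot 2 \cdots n \ \text{ is invertible in } k
\iff p \nmid n!
\iff n < p,
$$

and when $n!$ is invertible the idempotent $\tfrac{1}{n!}\sum_{\sigma \in S_n} \sigma$ still makes sense and identifies the equalizer with the coequalizer, just as in characteristic zero.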
But it should be very interesting to study the situation carefully for such enriched categories.
John Baez said:
Right, the inability to divide by $n!$ changes everything. In fact if $n < p$ there's an isomorphism between the equalizer of all the permutations $V^{\otimes n} \to V^{\otimes n}$ and their coequalizer, where $V$ is a vector space over a field of characteristic $p$.
Ooh I didn't know that. Very valuable information.
John Baez said:
But it should be very interesting to study the situation carefully for such enriched categories.
Yes, divided powers and symmetric powers in such enriched symmetric monoidal categories are going to be my next series of headaches. It must be very interesting because there are several canonical morphisms between the symmetric power and the divided power of an object.
Maybe I'll do the exterior powers in such enriched symmetric monoidal categories first, because it's really close to the symmetric powers.
I believe that exterior powers are given by the equalizer or the coequalizer of the signed permutations in every characteristic (that's why we don't talk about divided exterior powers), but I'm not sure at all; I haven't started to look at this.
At the very end, I hope I can find an algebraic characterization of the family of all Schur functors of an object; that would be the final fireworks.
Once you can do symmetric and exterior powers you can in theory do all the Schur functors described by Young diagrams, because they are just more complicated idempotents coming from other representations of the symmetric group $S_n$.
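To make that concrete, here is the smallest standard example (not from the thread), in characteristic zero: inside the group algebra $k[S_2]$ the two idempotents

$$
e_{\mathrm{sym}} \;=\; \tfrac{1}{2}\bigl(\mathrm{id} + (1\,2)\bigr),
\qquad
e_{\mathrm{alt}} \;=\; \tfrac{1}{2}\bigl(\mathrm{id} - (1\,2)\bigr)
$$

act on $V \otimes V$ and split off $\mathrm{Sym}^2 V$ and $\Lambda^2 V$ respectively. For a general partition $\lambda$ of $n$, the Young symmetrizer $c_\lambda \in k[S_n]$ is idempotent up to a nonzero scalar, and its image in $V^{\otimes n}$ is the Schur functor $S_\lambda(V)$.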
Oh that sounds very logical. Because, to go from the idempotent description to the graded bialgebra description, the idea for the symmetric powers is this:
To go from idempotents to graded bialgebra, you start with:
and
Then you can build the unit, counit, multiplications and comultiplications like this:
If you start with a graded bialgebra which verifies the right axioms:
Then you can build the idempotents like this:
The constructions must be the same for the exterior powers; you just have to change the axioms suitably on the two sides.
If you have any idempotent, you can do the same. You just have to find the right axioms for the graded bialgebra and show that the set of axioms is equivalent to the two equalities given for each idempotent, for every $n$.
It could work in characteristic $0$, and that's great; I didn't know that all Schur functors are described by an idempotent!
In positive characteristic, one must first try to understand how it works for the pair symmetric/divided powers. And in categories other than vector spaces, like a category of semimodules over a semiring, it could be different again.
In characteristic $0$, the axioms of the graded bialgebra for the symmetric powers can be read in terms of differentiation (in fact, it's rather the notion of Hasse–Schmidt derivative which is useful to differentiate polynomials in positive characteristic); for instance the specialty axiom is a higher version of what is known as the Euler identity (I wrote that on the nLab: homogeneous polynomial). The compatibility between multiplication and comultiplication is a kind of $n$-ary and higher-order Leibniz rule.
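For reference, the classical Euler identity being referred to (standard statement, in my own notation): if $f$ is a homogeneous polynomial of degree $n$ in variables $x_1, \dots, x_k$, then

$$
\sum_{i=1}^{k} x_i \,\frac{\partial f}{\partial x_i} \;=\; n\, f .
$$

And in positive characteristic, the $m$th Hasse–Schmidt (divided) derivative $D^{(m)}$, characterized on monomials by $D^{(m)}(x^n) = \binom{n}{m} x^{\,n-m}$, plays the role of $\tfrac{1}{m!}\,\partial^m$.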
The graded bialgebra side for exterior powers must be understandable in terms of the Koszul complex.
The graded bialgebra side for Schur functors must be related to the notion of Schur complex.
In positive characteristic, the symmetric powers give homogeneous polynomials and the divided powers what's known as divided polynomials. You can differentiate these things and it must give the interpretation for the graded bialgebra side.
The Schur functors and Schur complexes also exist in positive characteristic.
If you have the idempotent for a partition, would you be able to say what you get for the sum of two partitions (i.e. the partition obtained by putting the two together), and what the corresponding composites are equal to, if you adapt the construction from before? (I guess it's possible, but for the moment I haven't checked how to do it with partitions instead of just an integer.)
Logically the first one must give a sum like in the axiom for symmetric powers (in the image in my topic in the stream "our work"), but here it would be a big sum indexed by four partitions that verify the right equalities in order to be decompositions of the given ones, and multiplied by some power; something huge maybe.
The second must give the identity multiplied by some power of something depending on the partitions, and multiplied by a binomial coefficient maybe.
I'm a bit lost but the idea should make sense.
If there is such a characterization, it must work in any symmetric monoidal $\mathbb{Q}$-linear category in which the idempotents giving the Schur functors split (which is normally equivalent to the existence of equalizers for the appropriate diagrams, and equivalent to the existence of coequalizers for the same diagrams).
If you don't have the courage to decipher this, thanks for the very valuable information!
It gives me motivation to try right now to adapt my theorem to Schur functors (in characteristic 0), because it seems that it can really work.
I think I went through several stages in my mind, and it's difficult to re-explain everything from scratch.
And there is intuition from differential categories also, various things...
Your comments are interesting, and I have a few more things to say about all this - but I'm busy traveling around right now so I'll have to wait a bit! In the meantime, if you're curious about my thoughts on Schur functors you can try section 2 of this paper, which is an explanation of classical stuff, and maybe also section 3, which is a category-theoretic reinterpretation of that stuff.
Beware: "polynomial species" are different from "polynomial functors", and our category is about polynomial species, not polynomial functors.