Category Theory
Zulip Server
Archive

You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For all things related to this archive refer to the same person.


Stream: theory: category theory

Topic: Polynomial maps in a categorical setting


view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 16:18):

Is there any categorical setting where this composition of maps between vector spaces (or modules over a semi-ring) makes sense?
$A \rightarrow A \oplus A \rightarrow A \otimes A \rightarrow S^{2}(A)$
The first map is the (linear) copy map which sends $x$ to $(x,x)$. The second map is the universal (bilinear) projection into the tensor product which sends $(x,x)$ to $x \otimes x$. The third map is the universal (linear) map from $A \otimes A$ to the second symmetric tensor power of $A$ (which can be defined as the coequalizer of the identity and the swap from $A \otimes A$ to $A \otimes A$); it sends $x \otimes x$ to $x \otimes_s x$.

Together it gives me an exponentiation map which sends $x$ to $x^{\otimes_s 2}$.
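In coordinates the composite is easy to sketch (a minimal NumPy illustration; the function names are mine, chosen just for this example, and $S^2(A)$ is realized as symmetric matrices):

```python
import numpy as np

def copy(x):
    # the (linear) copy map A -> A ⊕ A, x |-> (x, x)
    return (x, x)

def tensor(pair):
    # the universal bilinear map A ⊕ A -> A ⊗ A; in coordinates,
    # (x, y) |-> x ⊗ y is the outer product
    x, y = pair
    return np.outer(x, y)

def symmetrize(t):
    # the universal map A ⊗ A -> S^2(A): project onto symmetric tensors
    return (t + t.T) / 2

def square(x):
    # the composite A -> S^2(A), sending x to x ⊗_s x
    return symmetrize(tensor(copy(x)))

print(square(np.array([1.0, 2.0])))  # [[1. 2.] [2. 4.]]
```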

It can seem silly, but you can't do this in a monoidal category, and I don't think you can do it in a multicategory or polycategory either, additive, with tensor products, whatever you like... I don't think the definitions would allow such a composition. And in logic it doesn't fall within the framework of linear logic either: linear logic is only about linear maps, not quadratic maps like this. But these power maps are an elementary component of algebra and are useful in the most basic mathematics.

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 16:22):

I feel like category theory isn't even equipped with enough theory to express the identity $(a+b)^2=a^2+2ab+b^2$ in a purely categorical setting, which I would like to write as $(a+b)^{\otimes_s 2}=a^{\otimes_s 2}+2\,a\otimes_s b +b^{\otimes_s 2}$ for any vectors $a,b$ in a common vector space $A$.
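In coordinates, at least, the identity follows from bilinearity of $\otimes_s$ and can be sanity-checked numerically (a NumPy sketch; `sym` is my name for the symmetrized outer product):

```python
import numpy as np

def sym(a, b):
    # a ⊗_s b realized as the symmetrized outer product
    return (np.outer(a, b) + np.outer(b, a)) / 2

a = np.array([1.0, 3.0])
b = np.array([2.0, -1.0])

# (a+b)^{⊗_s 2} = a^{⊗_s 2} + 2 a ⊗_s b + b^{⊗_s 2}
lhs = sym(a + b, a + b)
rhs = sym(a, a) + 2 * sym(a, b) + sym(b, b)
print(np.allclose(lhs, rhs))  # True
```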

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 16:24):

I've already talked about this, and people didn't seem to realize that there is a basic problem to solve here.

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 16:24):

Maybe I'm wrong, and someone knows how to do this. Maybe it's super simple and I'm just being obtuse.

view this post on Zulip James Deikun (Dec 20 2023 at 22:52):

I think the key here is that polynomial maps are the natural maps of affine schemes, not of vector spaces. Commutative $k$-algebras are monadic over $k$-vector spaces, and the coordinate rings of affine spaces are freely generated, so you can still talk about linear maps as free maps between freely generated objects.

Basically: the polynomial rings $k[x_1,\ldots,x_n]$ are the free symmetric algebras on the $k$-vector spaces $k^n$, and if your vector space is not of that form you will still get a "polynomial" ring. The elements of the algebra represent the (polynomial) maps from your free affine scheme to $\mathbb{A}^1$, which are the same as ring homomorphisms from $k[x]$ to your $k$-algebra. More generally, polynomial maps are represented as ring homomorphisms in the opposite direction.

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 22:58):

Could you give more details? When you say that you still get a "polynomial" ring are you talking about the symmetric algebra of a vector space / module or something else?

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 23:01):

Are you saying that you view a polynomial map as a linear map of the form $S(A) \rightarrow B$?

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 23:03):

If this is what you're saying, then I'm not very satisfied by this approach: it doesn't allow me to see concretely a map which takes an element to its square. And it doesn't allow me to encode, for instance, the binomial theorem as a commutative diagram.

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 23:06):

I'm interested in very, very practical stuff. I want to be able to understand the binomial theorem via sequent calculus and cut elimination in some logic, for instance, and as far as I understand them, schemes don't help me do that. But I could very well be wrong.

view this post on Zulip James Deikun (Dec 20 2023 at 23:11):

Looking at the maps you talked about at the top, the first one is represented by $S(\nabla)$, the lift of the codiagonal to symmetric algebras; the second one by the nonlinear map from $S(A^* \otimes A^*)$ to $S(A^* \oplus A^*)$ that takes every generator $x \otimes y$ to $(x \oplus 0) \otimes_s (0 \oplus y)$; the third is the inclusion of $S(S_2(A^*))$ in $S(A^* \otimes A^*)$. The composite takes every symmetric 2-form on $A$ to the corresponding quadratic form on $A$ as a polynomial in 1-forms.
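A coordinate-level illustration of the last point (a SymPy sketch under the usual identification of symmetric 2-forms on $A$ with symmetric matrices; the particular matrix is an arbitrary example):

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
x = sp.Matrix([x1, x2])

# a symmetric 2-form on A, written as a symmetric matrix
M = sp.Matrix([[1, 2], [2, 5]])

# the composite sends it to the quadratic form it defines, a degree-2
# polynomial in the coordinates (1-forms) x1, x2
q = sp.expand((x.T * M * x)[0])
print(q)  # x1**2 + 4*x1*x2 + 5*x2**2
```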

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 23:16):

Ok. I'm glad you understand how to encode these maps in terms of symmetric algebras, tensor products and biproducts. But do you know how to do this in an abstract categorical setting? For instance, monoidal categories can be seen as an abstraction of vector spaces together with their tensor product. You can see multilinear maps between vector spaces as a multicategory. What about these maps? Is there a setting for talking about them purely categorically? This is really what I'm interested in. Because once you have that, you can maybe see your maps as proofs in some logic.

view this post on Zulip James Deikun (Dec 20 2023 at 23:16):

And I'm saying I view a polynomial map $p : A \to B$ as a linear map of the form $B^* \to S(A^*)$.

view this post on Zulip James Deikun (Dec 20 2023 at 23:18):

You can definitely abstract this process considerably, for example you can abstract $S$ to just "a monad", but you will have trouble talking about binomial coefficients at that level of abstraction.

view this post on Zulip John Baez (Dec 20 2023 at 23:23):

James Deikun said:

I think the key here is that polynomial maps are the natural maps of affine schemes, not of vector spaces.

That's the solution that @Todd Trimble, @Joe Moeller and I are using to deal with problems of this sort. There's a symmetric monoidal functor from $\mathsf{FinVect}$ to $\mathsf{AffSch}$ which lets you turn any category enriched over $\mathsf{FinVect}$ into one enriched over $\mathsf{AffSch}$, but the latter allows for more general maps between the hom-spaces: maps that are not just linear but 'polynomial'.

view this post on Zulip John Baez (Dec 20 2023 at 23:24):

@Jean-Baptiste Vienney wrote:

it doesn't allow me to see concretely a map which takes an element to its square.

It does: indeed that's exactly the sort of thing it does.

view this post on Zulip James Deikun (Dec 20 2023 at 23:25):

(The only time I've really used this point of view in anger was when I was thinking about determinants and characteristic polynomials. It worked pretty well then. I didn't completely succeed in what I was trying to do, but I don't think that was down to any of this.)

view this post on Zulip James Deikun (Dec 20 2023 at 23:30):

John Baez said:

it doesn't allow me to see concretely a map which takes an element to its square.

It does: indeed that's exactly the sort of thing it does.

Yeah; if you take a basis for the dual of your vector space, this is the same as choosing coordinates; then the ring underlying your affine scheme is the ring of polynomial functions of the coordinates. You can then read formulas directly off the backward maps by sending the basis elements of $B^*$ through.
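For example (a SymPy sketch with a made-up backward map, just to illustrate reading formulas off it):

```python
import sympy as sp

x1, x2, y1, y2 = sp.symbols('x1 x2 y1 y2')

# a backward ring map: each coordinate (basis 1-form) on B is sent to
# a polynomial in the coordinates on A (this particular map is made up)
backward = {y1: x1**2, y2: x1 * x2}

# reading off the formula for the 1-form y1 + 3*y2 on B:
form = y1 + 3 * y2
print(form.subs(backward))  # x1**2 + 3*x1*x2
```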

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 23:31):

John Baez said:

James Deikun said:

I think the key here is that polynomial maps are the natural maps of affine schemes, not of vector spaces.

That's the solution that Todd Trimble, Joe Moeller and I are using to deal with problems of this sort. There's a symmetric monoidal functor from $\mathsf{FinVect}$ to $\mathsf{AffSch}$ which lets you turn any category enriched over $\mathsf{FinVect}$ into one enriched over $\mathsf{AffSch}$, but the latter allows for more general maps between the hom-spaces: maps that are not just linear but 'polynomial'.

Hmm, it looks interesting but I'm very, very cautious. I want to encode things with the extreme level of formalism of proof theory; that's why I'm not satisfied by most of what I read. How is this functor defined? I know how to talk about symmetric powers with sequent calculus or string diagrams as long as we allow scalars in $\mathbb{Q}_{\geq 0}$. This is what I've been doing in this preprint: String diagrams for symmetric powers I: In symmetric monoidal ℚ≥0-linear categories (with string diagrams, not sequent calculus). And you could apply your functor to my string diagrams, because they form a category enriched over $\mathsf{FinVect}$...

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 23:32):

Then it could give you a practical language to talk about polynomial maps.

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 23:34):

By the way, it took me months to write all these string diagrams. All that because I dream of talking about polynomials in a purely syntactic setting like string diagrams or proofs. If you want, I'm interested in the categorical logic of polynomial maps, in the same way as in the book of our lamented P. Scott and J. Lambek (which doesn't talk about polynomial maps, of course).

view this post on Zulip James Deikun (Dec 20 2023 at 23:34):

(Or just in general, you can compute some 1-form on $B$ explicitly by sending it through the backwards map and you'll get a formula for the result in terms of a polynomial in 1-forms on $A$...)

view this post on Zulip John Baez (Dec 20 2023 at 23:38):

How is this functor defined?

Given a finite-dimensional vector space $V$ over a field $k$, let $S(V^\ast)$ be the symmetric algebra on its dual. This gives a contravariant functor from $\mathsf{FinVect}_k$ to $\mathsf{CommAlg}_k$, the category of commutative algebras over $k$. But by definition $\mathsf{AffSch}_k$ is the opposite of $\mathsf{CommAlg}_k$. So we get a functor from $\mathsf{FinVect}_k$ to $\mathsf{AffSch}_k$.
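On morphisms, this functor acts by precomposition: a linear map $f : V \to W$ induces the algebra map $S(W^\ast) \to S(V^\ast)$ sending each coordinate on $W$ to the corresponding linear form on $V$. A small SymPy illustration (the matrix is an arbitrary example):

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')      # coordinates (basis 1-forms) on V
F = sp.Matrix([[1, 2], [0, 3]])   # matrix of a linear map f : V -> W

# the induced map S(W*) -> S(V*) sends each coordinate y_i on W
# to the linear form sum_j F[i, j] * x_j on V
y_images = F * sp.Matrix([x1, x2])
print(list(y_images))  # [x1 + 2*x2, 3*x2]
```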

(I am now making the dependence on the field explicit, which I wasn't earlier.)

view this post on Zulip John Baez (Dec 20 2023 at 23:43):

It takes a bit of work to show that this functor is symmetric lax monoidal. (I left out the word 'lax' earlier!)

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 23:44):

Brr... That's fine mathematically, but my ultimate question is: can you express the map $x \mapsto x^2$ using string diagrams or anything purely syntactic? That's really what I want to do. Can you write the binomial theorem using only the language of category theory? That's the kind of stuff I'm interested in.

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 23:44):

You can use this functor if it helps toward this goal.

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 23:45):

My questions are much more difficult than they sound, I think. For instance, we only learned how to talk about linear maps purely syntactically with the invention of symmetric monoidal categories and linear logic. The latter came only 40 years ago.

view this post on Zulip John Baez (Dec 20 2023 at 23:47):

Okay, you have a somewhat different goal than I have. I'm not trying to use string diagrams for anything now; I just need a category where I can talk about things like the nth power map from $\mathrm{End}(V)$ (the algebra of endomorphisms of a finite-dimensional vector space) to itself, or the concept of an 'algebraic' representation of this algebra of matrices, etc.

view this post on Zulip Todd Trimble (Dec 20 2023 at 23:48):

I don't think I understand the rules of the game. Are you just allowing yourself one tensor product?

view this post on Zulip James Deikun (Dec 20 2023 at 23:50):

I'm actually pretty interested in how the lax monoidal structure on this functor works. And I think it's relevant to @Jean-Baptiste Vienney's questions too ...

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 23:50):

Todd Trimble said:

I don't think I understand the rules of the game. Are you just allowing yourself one tensor product?

You can use two if you want...

view this post on Zulip Jean-Baptiste Vienney (Dec 20 2023 at 23:50):

John Baez said:

Okay, you have a somewhat different goal than I have. I'm not trying to use string diagrams for anything now; I just need a category where I can talk about things like the nth power map from $\mathrm{End}(V)$ (the algebra of endomorphisms of a finite-dimensional vector space) to itself, or the concept of an 'algebraic' representation of this algebra of matrices, etc.

I would want the $n^{th}$ power map to be a natural transformation if possible.

view this post on Zulip James Deikun (Dec 20 2023 at 23:53):

I think the representation as "converting symmetric n-forms to degree-n forms" as I outlined above is a natural transformation.

view this post on Zulip James Deikun (Dec 20 2023 at 23:59):

The nth power map on $\mathrm{End}(V)$ isn't natural, though, just because $\mathrm{End}(-)$ isn't a functor. Maybe it's dinatural?

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 00:03):

Brr I think I'm going to be infinitely annoying. My original question was about this map:

Jean-Baptiste Vienney said:

$A \rightarrow A \oplus A \rightarrow A \otimes A \rightarrow S^{2}(A)$

You see, it looks very simple. A composition of three maps. And I don't use a symbol for the symmetric algebra, only for the second symmetric power. I don't use a symbol for the dual of a vector space. I want to be able to express this as a proof in some sequent calculus. I already know how to talk about the first and the third maps, but not how to talk about the second map. I'm sure that such a sequent calculus doesn't exist yet, and not a lot of people here are interested in proof theory, so they will probably not invent it. (I would be fine with string diagrams or a purely categorical framework as well; they are all pretty much the same thing.) So I don't think it's useful to talk a lot about that, unless someone really understands what my goal is and knows an answer. It's like this all the time here: people say what they know even if it doesn't answer the question, or they don't understand the spirit of the question. There is a lot of "showing off". But also I'm just being lazy. I should just work on this by myself.

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 00:12):

However, if someone can give me a purely categorical framework where the composition of these three maps makes sense, which is one of the three formalisms I'm interested in together with string diagrams and sequent calculus, please do. If you know it doesn't exist, please say so. But anything other than these two possibilities will probably not be the answer I'm looking for, and it's going to confuse me a lot to try to figure out whether there is something useful for me in the answer.

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 00:26):

Pff, thanks everyone for trying to make progress on this question though.

view this post on Zulip John Baez (Dec 21 2023 at 00:51):

Jean-Baptiste Vienney said:

$A \rightarrow A \oplus A \rightarrow A \otimes A \rightarrow S^{2}(A)$

All these maps are morphisms in the category of finite-dimensional vector spaces and polynomial maps. These are excellent examples of the kind of maps I use a lot in my work with Todd and Joe on Schur functors. We find the $\mathsf{AffSch}$ approach is crucial for proving the kinds of theorems we want to prove. But of course that doesn't imply that you should take this approach!

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 01:01):

Can anyone rewrite these maps without using the symmetric algebra, but only symmetric powers? I think both @James Deikun and @John Baez must understand these maps very well.

James Deikun said:

Looking at the maps you talked about at the top, the first one is represented by $S(\nabla)$, the lift of the codiagonal to symmetric algebras; the second one by the nonlinear map from $S(A^* \otimes A^*)$ to $S(A^* \oplus A^*)$ that takes every generator $x \otimes y$ to $(x \oplus 0) \otimes_s (0 \oplus y)$; the third is the inclusion of $S(S_2(A^*))$ in $S(A^* \otimes A^*)$. The composite takes every symmetric 2-form on $A$ to the corresponding quadratic form on $A$ as a polynomial in 1-forms.

(I prefer symmetric powers because they allow me to introduce the $n$ from $x^n$, which I can't do using only the symmetric algebra. This line of ideas comes from graded linear logic, and I've been extending it to the richer setting of graded differential linear logic (I've invented/discovered this thing). This is the first thing I did in research. After this, I became interested mostly in symmetric powers (the $!_n$ in graded linear logic is not necessarily a symmetric power). And I rediscovered a theorem about an algebraic characterization of symmetric powers, but in a purely categorical setting, which is the subject of the preprint I linked above. I say this so that people understand my background, because I'm always annoyed when people don't realize that there is a lot of thought behind my question.)

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 01:07):

In order to interpret the $(-)^*$ in this answer, I would probably need a compact closed or star-autonomous category. I would prefer it if there were no need to use $(-)^*$.

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 01:08):

The dual is a "no" in logic, and I always prefer not to use the word "no" if we don't need to. I understand this is a very personal point of view.

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 01:11):

Also, the symmetric algebra introduces a notion of infinity by taking a countable coproduct, and I prefer not to use any infinity when it's not required.

view this post on Zulip James Deikun (Dec 21 2023 at 02:27):

Well, your two conditions are hard, but maybe fulfilling them together will be easier than either one alone. If you are only interested in homogeneous maps, then you can take a category-like setting where there are objects and, for each pair of objects $A,B$ and each natural number $n$, a set of arrows $\mathrm{Hom}_n(A,B)$. Each of these can be realized without using the dual, because it is basically $\mathrm{Hom}(B^*,S^n(A^*)) \cong \mathrm{Hom}(S^n(A),B)$ if your vector spaces are finite-dimensional or Banach or whatever.

This presentation secretly represents the squaring map as the identity map on $S^2(A)$, actually. This is kind of the problem with it from a linear logic perspective; the bookkeeping to see a square as two uses doesn't really come naturally.
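As a concrete instance of the $\mathrm{Hom}(S^n(A),B)$ picture, a homogeneous degree-2 map $\mathbb{R}^2 \to \mathbb{R}$ can be packaged as a linear functional on symmetric tensors (a NumPy sketch; the functional `L` and matrix `M` are arbitrary choices of mine):

```python
import numpy as np

# a linear functional L on S^2(R^2): pairing with a fixed symmetric matrix
M = np.array([[1.0, 2.0], [2.0, 5.0]])

def L(t):
    # linear in the symmetric tensor t
    return float(np.sum(M * t))

def q(x):
    # the induced homogeneous degree-2 map: q(x) = L(x ⊗ x)
    return L(np.outer(x, x))

x = np.array([1.0, 1.0])
print(q(x))      # 10.0
print(q(2 * x))  # 40.0, i.e. 2**2 * q(x): homogeneous of degree 2
```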

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 02:35):

The squaring map as the identity on $S^2(A)$ is exactly what I don't like!

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 02:36):

Do you have a real squaring with what you wrote before?

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 02:37):

It took me a lot of time to realize that the linear logic / differential categories perspective talks about non-linear maps, but in a kind of fake way, without the non-linear phenomena.

view this post on Zulip James Deikun (Dec 21 2023 at 02:39):

With what I wrote before, if you feed a coordinate of $S^2(A)$ into the backward map, you will get out a formula, in coordinates, for a quadratic function.

view this post on Zulip James Deikun (Dec 21 2023 at 02:40):

To me this counts as "real squaring", but I don't know if you would count it or not.

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 02:42):

Hmm, is your feeding realized by a map? Is it really part of the categorical setting, or something that you can only do using set theory and elements?

view this post on Zulip James Deikun (Dec 21 2023 at 02:44):

I guess you would probably not consider it part of the categorical setting; it uses the fact that we are working with $k$-algebras in particular.

view this post on Zulip James Deikun (Dec 21 2023 at 02:53):

You can get a setting where you keep track of everything in the categorical part if you use a generalized multicategory where each input comes with a degree/multiplicity. But it will still represent the squaring map as an identity in the underlying category of vector spaces. Vector spaces just don't understand the idea of "using the same variable twice", so polynomial maps have to be encoded in some way or other.

view this post on Zulip James Deikun (Dec 21 2023 at 03:02):

(Whether it's using a linear map from the space of polynomials to the space of polynomials, like above, which sort of moves the nonlinearity below the reach of vector spaces, or whether it's sometimes using an interpretation of $S^k(-)$ as containing only the diagonal, which moves it above the reach of vector spaces.)

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 03:05):

I think a syntactic approach would be simpler: trying to write a kind of linear logic where you can square, and understanding the categorical semantics later. The weird part is clearly the $A \oplus A \rightarrow A \otimes A$. Another solution is to introduce a rule which gives $A \rightarrow A \otimes A$. Or a $!_n A \rightarrow !_{np}A$ with a graded modality. This last one is maybe the best: it would directly encode the $p^{th}$ power map (on homogeneous polynomials of degree $n$, for instance).
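The degree bookkeeping behind a rule like $!_n A \rightarrow !_{np}A$ can at least be checked on examples: the $p$-th power of a homogeneous polynomial of degree $n$ is homogeneous of degree $np$ (a quick SymPy check with arbitrary $f$ and $p$):

```python
import sympy as sp

x, y = sp.symbols('x y')

# a homogeneous polynomial of degree n = 2
f = x**2 + 3 * x * y + y**2

# its p-th power is homogeneous of degree n * p
p = 3
g = sp.expand(f**p)
print(sp.Poly(g, x, y).total_degree())   # 6
print(sp.Poly(g, x, y).is_homogeneous)   # True
```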

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 03:06):

The last one is clearly in the spirit of the structural rules of linear logic, I think, but it introduces a very non-linear element.

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 03:10):

With this you could maybe understand, for instance, the binomial theorem through cut elimination, which would be very cool.

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 03:17):

I'm glad I talked with you, because I had never thought before of these two other ways of encoding the square or power-$p$ operations in a linear logic way.

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 03:25):

I'm so happy about this:
$!_n A \rightarrow !_{np}A$
This is a beautiful line
:) :) :)

view this post on Zulip Jean-Baptiste Vienney (Dec 21 2023 at 03:38):

A few years ago I got the idea of combining differential linear logic and graded linear logic and now here is a new element to add: "exponentiation rules" :) :) :)