Category Theory
Zulip Server
Archive

You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For all things related to this archive refer to the same person.


Stream: deprecated: our papers

Topic: Functorial semantics for partial theories


view this post on Zulip Ivan Di Liberti (Nov 26 2020 at 19:27):

I would like to present to this community our latest paper: :fire: Functorial Semantics for Partial Theories :fire:, j/w Loregian, Nester, Sobocinski. I think (well, I hope) many people here might like it.

https://arxiv.org/abs/2011.06644

The paper is a love story between string diagrammatic reasoning, props, restriction categories, and categories with finite limits. Beyond the technical language in which it is written, it provides a graphical framework to represent equational reasoning for algebraic theories where operations are only partially defined. It also provides a complete description of the semantic range of such theories. On a more historical level, it successfully exports the Lawvere presentation of algebraic theories to partial algebraic theories.

view this post on Zulip Notification Bot (Nov 26 2020 at 22:20):

This topic was moved here from #general > Functorial semantics for partial theories by Matteo Capucci

view this post on Zulip Kenji Maillard (Nov 27 2020 at 18:22):

Hi @Ivan Di Liberti, thanks for sharing that enlightening paper! Besides the amazing string diagrammatic reasoning for partial terms, I find your presentation and reconstruction of "classical results" around Lawvere theories really impressive and easy to follow.

However, reading the article left me with the following question. I am not sure I understand why Freyd's essentially algebraic theories, or the theories induced by Vickers & Palmgren's partial Horn logic (or Makkai's FOLDS, which I think are also related), do not provide a reasonable notion of syntax. At least I thought (in a rather imprecise/handwavy fashion) that all these approaches correspond morally to a simple notion of dependent type theory.

Or maybe another way to formulate my question is: given a partial theory as you define them in the paper, is there a way to turn it into a total theory where you add as many sorts as you need to describe the partial domains of your operations?
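
To illustrate the kind of translation I have in mind with a standard example (not one taken from the paper): in the essentially algebraic theory of categories, composition $c(g,f)$ is only defined when $\mathrm{cod}(f) = \mathrm{dom}(g)$, and one can make it total by adding a sort of composable pairs together with its projections,

$$ \mathrm{Comp} \;=\; \{(g,f) \in \mathrm{Mor} \times \mathrm{Mor} \mid \mathrm{cod}(f) = \mathrm{dom}(g)\}, \qquad c : \mathrm{Comp} \to \mathrm{Mor}. $$

That is the kind of bookkeeping I mean by "as many sorts as you need".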

view this post on Zulip Ivan Di Liberti (Nov 27 2020 at 19:33):

Kenji Maillard said:

Hi Ivan Di Liberti, thanks for sharing that enlightening paper! Besides the amazing string diagrammatic reasoning for partial terms, I find your presentation and reconstruction of "classical results" around Lawvere theories really impressive and easy to follow.

However, reading the article left me with the following question. I am not sure I understand why Freyd's essentially algebraic theories, or the theories induced by Vickers & Palmgren's partial Horn logic (or Makkai's FOLDS, which I think are also related), do not provide a reasonable notion of syntax. At least I thought (in a rather imprecise/handwavy fashion) that all these approaches correspond morally to a simple notion of dependent type theory.

Or maybe another way to formulate my question is: given a partial theory as you define them in the paper, is there a way to turn it into a total theory where you add as many sorts as you need to describe the partial domains of your operations?

I think that's a very good question, and it's a pity that you can have only my answer. The best part of this paper, in fact, is that each author has his own understanding and point of view on this story.

That said, my point of view is the following. When you look at the history of algebraic theories, especially in parallel with the theory of clones, the connection between the categorical specification of syntax and the most traditional understanding of equational theories is neat and even very natural. I would not dare to say the same for finite limit theories.

You can produce some type theory out of a finite limit theory (of course), but then what you lose is the "equationality" of this story. We wanted something that could acknowledge that essentially algebraic theories are truly close to algebraic theories, something talking about equations and identities between terms.

The string calculus that we propose shows that essentially algebraic theories are monoidal theories after all, which is what we truly wanted. The fact that finite limits can capture these very special monoidal theories is a theorem, more than a definition, from my point of view.

Of course, this is very much open to debate, and if one finds finite limit theories good enough, I could not complain. In that case, our contribution would be to enrich the debate about syntactical presentations of finite limit theories. Ain't that cool? :wink:

view this post on Zulip Kenji Maillard (Nov 27 2020 at 20:37):

Am I right to think that by "equationality" you refer to equations between two terms/string diagrams/pieces of "raw" syntax, as opposed to equations between "well-formed" terms subject to side conditions and/or presuppositions? :thinking:
If that's so, it reminds me a bit of the usual debate between intrinsically well-formed/typed syntax vs raw terms.

Anyway, I am mostly asking because I'm interested in the next steps, and how it could impact neighbouring fields :grinning_face_with_smiling_eyes:
In particular, I wonder if other kinds of "algebraic" effects could be treated in a similar fashion (through a reconstruction with a suitable weakening of the comonoid structure); it looks like it should naturally connect with the work of Führmann, but also with various ideas coming from linear logic (that in the presence of "effects" the structural operations of duplication/erasure cease to be natural and might not exist at all on arbitrary computations).

view this post on Zulip Ivan Di Liberti (Nov 27 2020 at 21:44):

Kenji Maillard said:

Am I right to think that by "equationality" you refer to equations between two terms/string diagrams/pieces of "raw" syntax, as opposed to equations between "well-formed" terms subject to side conditions and/or presuppositions? :thinking:
If that's so, it reminds me a bit of the usual debate between intrinsically well-formed/typed syntax vs raw terms.

Anyway, I am mostly asking because I'm interested in the next steps, and how it could impact neighbouring fields :grinning_face_with_smiling_eyes:
In particular, I wonder if other kinds of "algebraic" effects could be treated in a similar fashion (through a reconstruction with a suitable weakening of the comonoid structure); it looks like it should naturally connect with the work of Führmann, but also with various ideas coming from linear logic (that in the presence of "effects" the structural operations of duplication/erasure cease to be natural and might not exist at all on arbitrary computations).

Yes you are right.

Coming to applications to linear logic, don't overestimate my knowledge of maths! I would love to see such applications, but I can't provide them at the moment. If you have something more precise in mind, maybe you could suggest something to read.

view this post on Zulip fosco (Nov 27 2020 at 22:15):

Ivan Di Liberti said:

I think that's a very good question, and it's a pity that you can have only my answer. The best part of this paper, in fact, is that each author has his own understanding and point of view on this story.

To prove that this is true, I will add "my" version of the story, which in the end is the story of a partial failure, because this approach hasn't been developed yet, due to its intrinsically greater difficulty.

To my eye, the Great History of Functorial Semantics is that there are equivalences between

  1. The category of Lawvere theories, i.e. cartesian props
  2. The category of finitary monads on $\mathsf{Set}$
  3. The category of monoidal promonads on $\mathsf{Fin}^{\mathrm{op}}$ (the free category with products over the point)
  4. The category of cocontinuous, convolution-preserving monads on $[\mathsf{Fin},\mathsf{Set}]$
  5. The category of clones (cartesian operads), i.e. functors $\mathsf{Fin}^{\mathrm{op}} \to \mathsf{Set}$ that are monoids w.r.t. the substitution product

Most of these equivalences are shown using the fact that the equivalence of categories $[\mathsf{Fin},\mathsf{Set}] \cong [\mathsf{Set},\mathsf{Set}]_{<\omega}$ (RHS: finitary endofunctors) is monoidal if on the LHS you put the substitution product (the one for which "operads are monoids").
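
To fix notation, one standard way to write the substitution product on $[\mathsf{Fin},\mathsf{Set}]$ (up to conventions on variance) is the coend formula

$$ (F \bullet G)(n) \;\cong\; \int^{m \in \mathsf{Fin}} F(m) \times G(n)^{m}, $$

which under the equivalence above becomes composition of finitary endofunctors; this is the sense in which monoids on the two sides match up.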

This pentacle of equivalences is the highest possible sense in which "a theory is a monoid". Now, partial theories in our paper capture item 1 of the list very well, albeit in a different 2-category: what about items 2, 3, 4, 5?

Too tired to tell you now, wait for tomorrow!

view this post on Zulip Nathanael Arkor (Nov 27 2020 at 22:18):

If you're generous with what you consider separate concepts, you can also add "relative monads on $\mathbb{F} \hookrightarrow \mathsf{Set}$", and distinguish abstract clones and cartesian operads (since they are technically presented via different operations).

view this post on Zulip Nathanael Arkor (Nov 27 2020 at 22:19):

(Both of which are also essentially "monoids" in an appropriate sense.)

view this post on Zulip John Baez (Nov 27 2020 at 22:45):

It's interesting that your pentacle doesn't even include the traditional approach, namely varieties. I suppose one can argue these are so close to Lawvere theories (only more presentation-dependent) that they're not worth a separate entry. But culturally they are very distinct.

view this post on Zulip Chad Nester (Nov 27 2020 at 22:45):

I’m not willing to argue that the other ways to specify an essentially algebraic theory are “not syntax”, but I can offer my perspective on our approach.

I think for me the remarkable thing is that from the string diagrammatic/monoidal theory perspective the move from (cartesian) algebraic theories to essentially algebraic theories feels like a natural thing to do in the syntax, as well as in the semantics (where we move from finite products to finite limits).

view this post on Zulip Chad Nester (Nov 27 2020 at 22:49):

I also find the presentations of these theories in terms of discrete cartesian restriction categories a lot more intuitive than the alternatives, but I don’t really know how many people agree with me here :sweat_smile:

view this post on Zulip Nathanael Arkor (Nov 27 2020 at 22:57):

If I may offer my thoughts on the topic: I think there is an important distinction to be drawn here between concrete syntax (e.g. term calculi or graphical calculi) and abstract syntax (i.e. categorical models). While there are various essentially-equivalent notions of concrete syntax for essentially algebraic theories (partial Horn logic, dependent type theories with dependent sums, etc.), there has previously been no abstract syntax in the style of Lawvere. (I should say that finitely complete categories are farther removed from the syntax than one would like, because you lose the structure of the generating sorts.) However, this work actually provides a categorical presentation-free formulation of the abstract syntax of essentially algebraic theories.

view this post on Zulip Chad Nester (Nov 27 2020 at 22:59):

As for why our approach might be “more syntactic” than others... I think that if we compare the usual method of presenting algebraic theories to most of the ways to present essentially algebraic theories (finite limit sketches, Freyd’s scheme, variations on Freyd’s scheme,...) then the presentations of algebraic theories are a clear winner in terms of clarity. For essentially algebraic theories, you need to have a whole bunch of other crap floating around and the situation is usually pretty confusing. With our approach you can present them in terms of generators and equations and nothing else.

view this post on Zulip Nathanael Arkor (Nov 27 2020 at 22:59):

So, to my eye, there is nothing lacking in the existing concrete syntactic approaches to essentially algebraic theories (though another, in the diagrammatic form of PROPs, is certainly welcome), but none of these approaches are abstract; whereas cartesian restriction categories facilitate an abstract syntactic approach that is much cleaner categorically than term or graphical calculi.

view this post on Zulip Nathanael Arkor (Nov 27 2020 at 23:00):

I would remark that saying this new approach finally gives an equational perspective (i.e. in terms of generators and equations) appears to me misleading, because this is already possible, e.g. through dependent type theories with dependent sums. The PROPs give a new equational perspective, but I do not think that was entirely missing before.

view this post on Zulip Chad Nester (Nov 27 2020 at 23:02):

Nathanael Arkor said:

I would remark that saying this new approach finally gives an equational perspective (i.e. in terms of generators and equations) appears to me misleading, because this is already possible, e.g. through dependent type theories with dependent sums. The PROPs give a new equational perspective, but I do not think that was missing before.

I think this is a good point. I do agree that type theory should count as syntax.

view this post on Zulip Nathanael Arkor (Nov 27 2020 at 23:02):

I also share in @fosco's hope that this perspective might shed light on the monadic perspective :)

view this post on Zulip Nathanael Arkor (Nov 27 2020 at 23:09):

Nathanael Arkor said:

I would remark that saying this new approach finally gives an equational perspective (i.e. in terms of generators and equations) appears to me misleading, because this is already possible, e.g. through dependent type theories with dependent sums. The PROPs give a new equational perspective, but I do not think that was entirely missing before.

@Ivan Di Liberti: please do not take this as a criticism! I don't think this in any way reduces the beauty of the PROP-based approach to the syntax: it is simply that as a type theorist I also find much value in the type theoretic approach :)

view this post on Zulip Ivan Di Liberti (Nov 27 2020 at 23:15):

Nathanael Arkor said:

Nathanael Arkor said:

I would remark that saying this new approach finally gives an equational perspective (i.e. in terms of generators and equations) appears to me misleading, because this is already possible, e.g. through dependent type theories with dependent sums. The PROPs give a new equational perspective, but I do not think that was entirely missing before.

Ivan Di Liberti: please do not take this as a criticism! I don't think this in any way reduces the beauty of the PROP-based approach to the syntax: it is simply that as a type theorist I also find much value in the type theoretic approach :)

I think you would even be entitled to criticise me. A better understanding of what we (all) prove comes from a dialectical process that goes far beyond us and inevitably passes through intellectual debate.

Anyway, I agree with your comment, and I even wrote it in mine. A type-theoretical understanding is available. My point (which could be wrong) was that such an understanding is unjust towards the equational point of view on universal algebra.

view this post on Zulip Nathanael Arkor (Nov 27 2020 at 23:17):

Yes, I completely agree that this perspective is much closer to the tradition of equational logic and universal algebra than the type theoretic perspective.

view this post on Zulip Kenji Maillard (Nov 28 2020 at 08:58):

Thank you everyone for the various insights! It does help me get a firmer point of view on this landscape :).

view this post on Zulip Kenji Maillard (Nov 28 2020 at 09:35):

Ivan Di Liberti said:

Coming to applications to linear logic, don't overestimate my knowledge of maths! I would love to see such applications, but I can't provide them at the moment. If you have something more precise in mind, maybe you could suggest something to read.

It's not so much applications to linear logic, but rather some common "take-away ideas" between your work and linear logic-inspired work. For instance, following Benton's work, a model of linear logic consists of an adjunction between a (symmetric) monoidal category and a cartesian monoidal category (plus some more structure). A recipe to build such models is to consider a monoidal category $\mathcal{C}$ and its category of cocommutative comonoids (thanks to Fox's theorem): if you can construct the free cocommutative comonoid on an object of $\mathcal{C}$, then you get an appropriate adjunction. These ideas inspired this (unpublished) manuscript and, as I understand it, the key idea there is that you have to equip resources (what I would explain as "effect-context-dependent pieces of data"?) with their own notion of duplication and erasure (deallocation).
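
Schematically, and glossing over the extra structure Benton asks for: Fox's theorem makes the category $\mathbf{Comon}(\mathcal{C})$ of cocommutative comonoids in $\mathcal{C}$ cartesian, and if the forgetful functor $U$ has a right adjoint $R$ (the free cocommutative comonoid construction), then

$$ U : \mathbf{Comon}(\mathcal{C}) \rightleftarrows \mathcal{C} : R, \qquad U \dashv R, \qquad {!} \;=\; U \circ R $$

is a monoidal adjunction of the required shape, with $!$ the induced exponential comonad.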

Now that I'm writing it, it does sound related but maybe not in such a close fashion :sweat_smile:

Another question that popped up during the night: do you have an idea if there is some connection to the work of Prakash Panangaden, Radu Mardare and Gordon Plotkin on quantitative algebraic reasoning?
Could we hope for a "Lawverian Theory of Everything" encompassing Lawvere theories, the relational theories that you mention in your paper, the partial theories of your paper, this quantitative algebraic reasoning, as well as other theories yet to be worked out (what happens for state, for instance)?

view this post on Zulip Kenji Maillard (Nov 28 2020 at 11:46):

Yet another question (don't hesitate to say if I'm being a bother): what would change in the results in a constructive setting? Say you consider the category of Bishop sets instead of "classical sets". I have the impression that the prop induced by the monoidal theory of partial Frobenius algebras would then correspond to partial maps with decidable domain of definition.
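
To spell out what I mean by "decidable domain": thinking of a partial map as a span with a monic leg,

$$ A \;\hookleftarrow\; D \;\xrightarrow{\ f\ }\; B, \qquad \text{with } \forall a \in A,\ (a \in D) \vee \neg(a \in D), $$

the second condition (the subset $D$ is complemented) is automatic classically but a genuine restriction for Bishop sets.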

view this post on Zulip Pawel Sobocinski (Nov 28 2020 at 11:50):

Kenji Maillard said:

However, reading the article left me with the following question. I am not sure I understand why Freyd's essentially algebraic theories, or the theories induced by Vickers & Palmgren's partial Horn logic (or Makkai's FOLDS, which I think are also related), do not provide a reasonable notion of syntax. At least I thought (in a rather imprecise/handwavy fashion) that all these approaches correspond morally to a simple notion of dependent type theory.

I can see that I'm coming severely late to the discussion, but let me try to defend why we claim that this is the "right" syntax for partiality.

The word "syntax" is severely overused and means different things for different people. E.g. I still don't really understand what mathematicians mean by syntax-semantics dualities... :)

Let's zoom out a bit and think about what properties of classical linear syntax made it a powerful formal mathematical notation, especially in algebra, from Fibonacci to the 20th century.

Classical linear syntax (as abstract syntax in the programming language sense, i.e. as trees, with variables)

There are so many examples of these properties used in maths and CS that it is almost second nature to us. But I think that it's useful to keep in mind that much of what society considers "maths" is enabled by the above properties.

Our string diagrammatic syntax

I won't go into the details, but I think essentially algebraic theories, finite limit sketches, and type theory all fail one or more of these four points.

view this post on Zulip Ivan Di Liberti (Nov 28 2020 at 12:13):

Kenji Maillard said:

Another question that popped up during the night: do you have an idea if there is some connection to the work of Prakash Panangaden, Radu Mardare and Gordon Plotkin on quantitative algebraic reasoning?
Could we hope for a "Lawverian Theory of Everything" encompassing Lawvere theories, the relational theories that you mention in your paper, the partial theories of your paper, this quantitative algebraic reasoning, as well as other theories yet to be worked out (what happens for state, for instance)?

In a paper with Jiri Rosicky, we introduce nests. Nests are categories with finite limits equipped with a good enough factorization system. We show that Met-enriched nests are perfect for axiomatizing continuous universal algebra (and even more). According to some informal conversations that I had with Chad, nests should correspond to (some) range categories, so some interaction is there, and there is a chance to provide a good approach to quantitative algebraic reasoning, but this is nothing more than an informal conversation at the moment. Also, the category theory around Met-enriched categories is still a bit green and needs to mature.

view this post on Zulip Nathanael Arkor (Nov 28 2020 at 14:34):

@Pawel Sobocinski: I do not want to derail this thread, so let me just say that I would be very interested in hearing why you think type theory does not satisfy these four points, in a separate discussion.

view this post on Zulip Fawzi Hreiki (Nov 28 2020 at 14:51):

Well I imagine the answer is that the internal language of a finitely complete category is not intuitively 'equational'.

view this post on Zulip Nathanael Arkor (Nov 28 2020 at 14:57):

@Fawzi Hreiki: I would prefer not to continue the conversation here, as I imagine the authors would rather reserve this thread for discussion about the paper itself.

view this post on Zulip Pawel Sobocinski (Nov 29 2020 at 08:46):

Nathanael Arkor said:

Pawel Sobocinski: I do not want to derail this thread, so let me just say that I would be very interested in hearing why you think type theory does not satisfy these four points, in a separate discussion.

Sure!