Category Theory
Zulip Server
Archive

You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For all things related to this archive refer to the same person.


Stream: theory: category theory

Topic: Curry Howard Correspondence


view this post on Zulip Patrick Nicodemus (Oct 18 2023 at 14:50):

I would like to ask the community for references to papers on using category theory to model type theory and proof theory in a way which includes the dynamic aspects of type theory and proof theory, that is, term reduction in type theory and cut elimination in proof theory.

I am aware of a few papers on this front, but I would very much like to know if there is a more substantial theory that uses category theory to model term-reduction semantics.
There is this paper - https://dl.acm.org/doi/10.1017/S0960129597002508 - which uses categories enriched over PERs. This seems like a reasonable approach.
There is this paper - https://math.ucr.edu/home/baez/qg-winter2007/seely_2-categorical.pdf
What else is out there?
Intuitively it seems that higher category theory would be suited for this. For example, let T be a monad on the category of sets, and let X be a T-algebra. The bar construction associates to X an augmented simplicial object whose higher simplices are n-nested trees of expressions in a certain formal language, with variables drawn from the underlying set of X. The 0th face map represents evaluation of terms in X; the other face maps represent flattening of certain trees. So here is a sketch of a semantics of computation: one computes by following the face maps down. We might speculate that, using monads for theories with binding operators like the lambda calculus, as developed here: https://drops.dagstuhl.de/opus/volltexte/2019/10513/pdf/LIPIcs-FSCD-2019-6.pdf, we could extend this semantics to constructs that programming language theory cares about, beyond just considering terms in first order logic.
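As a toy illustration of the bottom of this bar construction (a sketch only, taking T to be the list monad, i.e. the free monoid monad, and the algebra to be the integers under sum):

```haskell
-- A minimal sketch: levels of the (augmented) bar construction for the
-- list monad are Int, [Int], [[Int]], ... The two faces out of level 1
-- are "evaluate the inner terms" and "flatten the nesting".
type X = Int

alg :: [X] -> X     -- the algebra structure map: evaluation of terms in X
alg = sum

dEval :: [[X]] -> [X]   -- 0th face: evaluate inner expressions via the algebra
dEval = map alg

dFlat :: [[X]] -> [X]   -- other face: flatten nested expressions (monad multiplication)
dFlat = concat

-- The simplicial identity at the bottom: composing either face with the
-- augmentation (alg) gives the same answer, e.g. both composites send
-- [[1,2],[3]] to 6.
main :: IO ()
main = print (alg (dEval [[1,2],[3]]), alg (dFlat [[1,2],[3]]))
```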

But more generally I would just love to see papers which use either 2-category theory, poset enriched category theory, or infinity category theories to model reduction semantics. If they can say something interesting about strong normalization, cut elimination theorems, and logical consistency, so much the better!

BTW, homotopy type theory gets pretty close to something I'm talking about, with the idea of adjoint equivalences between types rather than isomorphisms, so that the zigzag identities for adjunctions become "up to homotopy" - which is almost, almost, almost what I want. I want zigzag identities for adjunction up to a computational notion of reduction. Basically, the introduction/elimination rule pairs for most constructors should take the form of "lax adjunctions."
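To illustrate what I mean (just a schematic picture, not a worked-out definition): for function types, $\beta$ and $\eta$ would play the role of counit and unit,

$$\beta_{t,u} : (\lambda x.\,t)\,u \Rightarrow t[u/x], \qquad \eta_t : t \Rightarrow \lambda x.\,t\,x,$$

and the zigzag identities would hold not on the nose, and not merely up to homotopy, but up to further reduction.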

view this post on Zulip Patrick Nicodemus (Oct 18 2023 at 14:57):

My impression is that most category theory literature on type theory and proof theory neglects this but I have not yet done a thorough literature search. Just some light skimming on Google Scholar. I found this -
"Categorical reconstruction of a reduction free normalization proof - Thorsten Altenkirch, Martin Hofmann & Thomas Streicher"
https://link.springer.com/chapter/10.1007/3-540-60164-3_27
which looks pretty cool.

view this post on Zulip Patrick Nicodemus (Oct 18 2023 at 15:25):

I sometimes hear people use the phrase "Curry-Howard-Lambek correspondence." I do not like this phrase very much and I will explain why.
The phrase "Curry-Howard correspondence" refers to

  1. a static correspondence between types in a type theory and propositions in a logic, and between terms in a type theory and proofs of propositions in a logic
  2. a dynamic correspondence between term reduction in a type theory and cut elimination in proof theory.

Number 2 is of substantial interest to proof theorists and type theorists. Strongly normalizing type theories correspond to logics with a cut elimination theorem, and cut elimination theorems have very important consequences for the analysis of proofs in the logic: they let you prove things by a fairly straightforward induction on the structure of cut-free proofs, not least of which is often the consistency of the logic itself! I cannot stress enough how much of structural proof theory deals with establishing cut elimination theorems and deriving consequences of them.
Similarly, type theorists care about using type systems to establish important results about programs, such as termination guarantees; this is one of the most fundamental things that distinguishes the simply typed lambda calculus from the untyped lambda calculus.
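To make point 2 concrete, here is a minimal sketch (the syntax and names are illustrative) of one-step reduction on proof terms, where each step removes an introduction immediately followed by the matching elimination - which is exactly eliminating a cut:

```haskell
-- Terms are proofs: Lam/App are implication intro/elim, Pair/Fst/Snd are
-- conjunction intro/elim. A redex is an intro met by its elim, i.e. a cut.
data Term
  = Var String
  | Lam String Term
  | App Term Term
  | Pair Term Term
  | Fst Term
  | Snd Term
  deriving Show

-- One reduction step = one step of cut elimination.
step :: Term -> Maybe Term
step (App (Lam x b) u) = Just (subst x u b)  -- beta: cut on an implication
step (Fst (Pair a _))  = Just a              -- cut on a conjunction
step (Snd (Pair _ b))  = Just b
step _                 = Nothing

-- Naive substitution (not capture-avoiding; fine for closed examples).
subst :: String -> Term -> Term -> Term
subst x u (Var y)    = if x == y then u else Var y
subst x u (Lam y b)  = if x == y then Lam y b else Lam y (subst x u b)
subst x u (App f a)  = App (subst x u f) (subst x u a)
subst x u (Pair a b) = Pair (subst x u a) (subst x u b)
subst x u (Fst t)    = Fst (subst x u t)
subst x u (Snd t)    = Snd (subst x u t)

-- e.g. App (Lam "p" (Fst (Var "p"))) (Pair (Var "a") (Var "b")) reduces
-- in two steps to Var "a", mirroring cut elimination on a conjunction.
```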

Now, if, say, 70% of proof theory deals with cut elimination theorems and consistency results, and, say, 70% of type theory is about using type systems to establish guarantees about program reduction, then 70% of what is important about the Curry-Howard correspondence lies in point 2.

My problem with phrases like "Curry-Howard-Lambek correspondence" and "computational trinitarianism" is that category theory has so far had very little to say about these dynamic aspects of computation and proof. As far as I am concerned, the phrase "Curry-Howard-Lambek" substantially overstates what category theory has contributed here. I have no problem with recognizing the contributions of Lambek in general; I love categorical logic and think it's extremely valuable, and I don't mean to diminish anybody's contributions here, just to point out what is implicitly being promised when you suggest that category theory gives an equivalent perspective on proof theory and type theory. I would either discourage the use of the phrase "Curry-Howard-Lambek" or at least urge anyone who uses it to be aware of the important as-yet-unfulfilled promise contained in that phrase.

view this post on Zulip Jean-Baptiste Vienney (Oct 18 2023 at 15:26):

My feeling is that everybody is speaking about this, but there are not a lot of papers giving a general theory, i.e. a framework to model any kind of type theory 2-categorically (this is what I'm thinking about). Most of the work is about modelling a specific kind of type theory, most of the time the simply typed lambda calculus, as you can see in the papers you link.

I can find papers where such ideas are used, but they are always applied to some rather complicated system, and are not about a general theory for all type theories. For example, Categorifying Non-Idempotent Intersection Types uses a bicategory, but I don't understand what the paper is about. Something using bicategories (though not a bicategory where the 2-arrows model computations), homotopy and proof theory: Template games and differential linear logic.

I think logicians who are more into logic than category theory don't care too much about establishing a big clean framework like in the book Categorical Logic and Type Theory. They are just interested in specific models of specific kinds of computation, and can use 2-categories to model computation along the way.

Defining a super general framework for all the logics at the same time, taking into account the dynamic aspect, would be very cool :) Like something where you allow morphisms between the different logics which preserve computation in some way, etc.

view this post on Zulip Jean-Baptiste Vienney (Oct 18 2023 at 15:28):

By the way, I hadn't read your second message before posting this one. You edited it so it appeared before this message :)

view this post on Zulip Jean-Baptiste Vienney (Oct 18 2023 at 15:32):

I had a feeling recently that you could define such a kind of functor from regular logic to differential linear logic. But in fact, it would maybe be a 3-morphism in a 3-category :) if each of the logics is modeled by a 2-category

view this post on Zulip Jean-Baptiste Vienney (Oct 18 2023 at 15:32):

In the 3-category of all computational/logical systems :)

view this post on Zulip Tom Hirschowitz (Oct 18 2023 at 15:44):

If I understand your question correctly, my paper Cartesian closed 2-categories and permutation equivalence in higher-order rewriting goes in this direction. It exhibits a universal property of something called permutation equivalence in higher-order rewriting (for any simply-typed, higher-order rewriting system), in terms of 2-categories.
@Benedikt Ahrens's thesis might also interest you -- if I remember correctly he uses monads relative to the embedding of sets into posets.
Finally, @Philip Saville's more recent thesis is about bicategorical interpretations of type theories.

view this post on Zulip Philip Saville (Oct 18 2023 at 16:00):

@Tom Hirschowitz I was just writing to add your paper to the list ;) there was also a nice paper at POPL this year showing how bicategorical models of untyped lambda calculus let you prove more than in the classical setting, roughly because you're remembering how terms are derived all the time (link)

I did some work on 2-dimensional modelling of type theories in my thesis and a couple of subsequent papers, mainly focussed on versions of simply-typed lambda calculus (e.g. 1, 2, 3). In this case you get a particularly tight correspondence between the traditional type theory and the 2-dimensional semantics, in that you can prove there's a rewrite $t \Rightarrow t'$ in the 2-dimensional calculus (or equivalently, in the free 2-dimensional model) iff the corresponding terms in the usual lambda calculus are $\beta\eta$-equal. You can also show that rewriting in the calculus with products and exponentials is conservative over the calculus with just products.

I think the route Marcelo and I took does lay the groundwork for doing 2-dimensional work properly: we started from abstract clones (also presented as cartesian multicategories / Lawvere theories / relative monads), which are an excellent model of syntax, and replaced them with a 2-dimensional version called biclones, then extracted the type theory from that. The 2-dimensional rules you end up with then coincide with $\beta\eta$ in quite a pleasing way, but with the caveat that you're introducing a quotient on the rewrites. This quotient is a natural one because it says $\beta$-reduction after $\eta$-expansion is doing nothing (cf. some papers by Ghani & Jay talking about this), but it's perhaps less nice to rewriting theorists, who have spent a long time thinking about which quotients have good properties for proving things -- my impression is that some of the categorically-natural rules are very badly behaved from this perspective. (Another consequence is that the semantic interpretation + free model are easier to construct this way -- otherwise there are some subtleties with how strict things are that get pretty ticklish.)
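Schematically, the quotient identifies "$\eta$-expand, then $\beta$-reduce" with doing nothing:

$$\big(\, t\,u \xRightarrow{\eta} (\lambda x.\,t\,x)\,u \xRightarrow{\beta} t\,u \,\big) \;=\; \mathrm{id}_{t\,u}.$$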

A lot of type-theoretic constructs (products, exponentials, sums, monadic metalanguage a la Moggi,...) can be phrased as structure in the 2-category of clones, so I expect a version of this to work one level up if you play the same game in the tricategory of biclones. Likewise for linear / ordered type theories, where you work with (symmetric) multicategories instead. I'm currently looking to see whether you can get Moggi's monadic metalanguage + its rewrites from this framework, by looking at pseudomonads in 2-Clone / Biclone.

I haven't got round to checking yet (it's on the list, with too much other stuff!), but I imagine a lot of this can be done using the kind of ideas Marcelo has already used (e.g. this paper with Nathanael Arkor or the '99 abstract syntax paper) because one of the big insights there is that type theories can be described using structured clones (see e.g. Prop 3.4 in the '99 paper). So I imagine one could start with a signature described as an endofunctor on Cat and construct a collection of corresponding biclones (or 2-clones if you prefer) that is sound and complete for describing the corresponding type theory, complete with the rewrites.

(If you want type theories that go beyond simple type theory, then at that point I'm stuck.)

view this post on Zulip Morgan Rogers (he/him) (Oct 18 2023 at 21:22):

Pinging @William Troiani because he is interested in using higher categorical structures to model dynamics/computational content of proofs.

view this post on Zulip Damiano Mazza (Oct 19 2023 at 09:53):

As far back as the 2nd edition of LICS (!), in 1987, Seely advocated for using higher category theory to, so to speak, take seriously the computational/cut-elimination aspect of the Curry-Howard correspondence. For years, this viewpoint was picked up only very sporadically, for example by Hilken in his $2\lambda$-calculus, by @Tom Hirschowitz in the paper he already mentioned, or, on the semantic side, by Fiore, Gambino, Hyland and Winskel. It was even discussed, long ago, on the n-Category Café, in a post written by @John Baez. Now it looks like interest in "2-dimensional" semantics and rewriting is really starting to spread, which is great! (For example, at the last LICS there were at least three papers on bicategorical semantics of logic/computation, and then of course there are all the papers mentioned by other people above. I am particularly impressed by Fiore and @Philip Saville's work, which I was not aware of.)

view this post on Zulip Damiano Mazza (Oct 19 2023 at 09:54):

Patrick Nicodemus said:

[...] I would just love to see papers which use either 2-category theory, poset enriched category theory, or infinity category theories to model reduction semantics. If they can say something interesting about strong normalization, cut elimination theorems, and logical consistency, so much the better!

With my former PhD students Luc Pellissier and Pierre Vial, we used $\mathbf{Cat}$-enriched cartesian multicategories and certain kinds of (very "loose") fibrations to model type systems for programming languages. The idea is the obvious one: a $\mathbf{Cat}$-enriched cartesian multicategory models a (possibly typed) programming language, in which objects are types, multimorphisms of arity $n$ are programs with $n$ parameters and 2-arrows are computations. A functor between such multicategories (I mean a morphism of $\mathbf{Cat}$-enriched cartesian multicategories) is a semantics-preserving encoding of one programming language into another. If the target programming language is somehow of a "semantic" nature, then functors may also be seen as semantic interpretations (for instance, if the target is a $\mathbf{Set}$-enriched cartesian multicategory, then you recover traditional denotational semantics).

Picking up on an idea due to Melliès and Zeilberger, we view type systems as special kinds of functors, which we call Niefield functors and which enjoy a version of the Grothendieck construction: a Niefield functor on (small) $L$ is the same as a functor $L\to\mathcal{Rel}$, where $\mathcal{Rel}$ is a certain (large) bicategory of distributors and relational natural transformations. In particular, Niefield functors are stable under pullback, so the pullback of a type system is a type system. Fundamental properties of type systems like subject reduction and subject expansion become "fibrational" properties of Niefield functors: a Niefield fibration/opfibration is a type system satisfying subject expansion/reduction. These too are pullback-stable.

This allowed us to formulate a very general theorem about so-called intersection types. These are known to be able to capture all sorts of termination properties for all sorts of $\lambda$-calculi (call-by-name, call-by-value, with continuations, etc.). We show that, for each "flavor" of intersection types (idempotent or not, etc.) there's a "universal" intersection type system $u$ for linear logic, and basically every intersection type system for some $\lambda$-calculus $L$ introduced in the past 30 years may be obtained by pulling back the suitable $u$ along an encoding of $L$ in linear logic. In fact, the framework also immediately gives you new systems (for example, for the $\lambda\mu$-calculus) and, with Ugo Dal Lago, Marc de Visme and Akira Yoshimizu, we later used it to get an intersection type system for the $\pi$-calculus, thanks to previous work by Honda and Laurent, and Ehrhard and Laurent, showing how this may be mapped to linear logic.

A very interesting aspect of all this is, I think, that it uncovers how intersection types actually build a "bridge" between a "weak termination" property and a "strong termination" property. That is, you have properties $S$ and $W$ of your programming language, such that $S$ trivially implies $W$, and every proof showing that intersection types capture some kind of termination is actually showing that $W$ implies typable and that typable implies $S$, so that $S$, $W$ and typable coincide. We give a categorical abstraction of what $S$ and $W$ may be, as well as sufficient conditions, formulated at the level of the Niefield functors representing the type systems, so that these implications hold. We call it the "fundamental theorem of intersection types".

What happens in practice is that people focus on one of $W$ or $S$, disregarding the other (in the sense that they don't even see that it's there), and present their result as "typable iff $S$" (or $W$). The (higher) categorical perspective allows you to see that usually there is also a $W$ (or an $S$), and it is often very interesting to know that it is equivalent to $S$ (or $W$). For instance, fundamental $\lambda$-calculus results like

and so on, are all instances of the fundamental theorem of intersection types!

All this appeared in a POPL 2018 paper, as well as in Luc's thesis. I also wrote my habilitation thesis with this higher-multicategorical perspective in mind. The higher-categorical approach to Curry-Howard was also briefly discussed here on Zulip, in a thread about explicit substitutions, which are a typical thing that "does not exist" at the 1-categorical level (calculi with explicit substitutions have the same normal forms as calculi without them, so explicit substitutions are invisible in 1-categorical semantics. This is just an observation, at the moment I am not aware of any work trying to give a higher-categorical semantics to explicit substitutions).

view this post on Zulip Jules Hedges (Oct 19 2023 at 11:01):

This is a lot of interesting stuff! It's definitely true in general that operational semantics is under-appreciated on the category theory side. I remember Neil Ghani complaining to me about people using 2-cells to model reduction, which he thought was naive compared to what's possible -- namely "structural operational semantics", which is about having a monad describing your syntax and a comonad describing your dynamics, and a distributive law saying how they interact. I don't know anything else about it, and I suspect it was worked out for things like the $\pi$-calculus, so it might or might not be known how it applies to harder things like cut elimination

view this post on Zulip Damiano Mazza (Oct 19 2023 at 14:36):

I would really like to see this monad/comonad interaction worked out on a simple example, it sounds really interesting but I don't understand how it's supposed to work (especially the comonadic part, how does it describe the dynamics?) and how it relates to higher categories... Has Neil Ghani written something on this approach?

view this post on Zulip Leopold Schlicht (Oct 19 2023 at 15:26):

Damiano Mazza said:

As far back as the 2nd edition of LICS (!), in 1987, Seely advocated for using higher category theory to, so to speak, take seriously the computational/cut-elimination aspect of the Curry-Howard correspondence. For years, this viewpoint was picked up only very sporadically, for example by Hilken in his $2\lambda$-calculus, by Tom Hirschowitz in the paper he already mentioned, or, on the semantic side, by Fiore, Gambino, Hyland and Winskel. It was even discussed, long ago, on the n-Category Café, in a post written by John Baez. Now it looks like interest in "2-dimensional" semantics and rewriting is really starting to spread, which is great! (For example, at the last LICS there were at least three papers on bicategorical semantics of logic/computation, and then of course there are all the papers mentioned by other people above. I am particularly impressed by Fiore and Philip Saville's work, which I was not aware of.)

Since you @ed some of the people: @Robert Seely is also on this Zulip.

view this post on Zulip Jules Hedges (Oct 19 2023 at 15:36):

Damiano Mazza said:

I would really like to see this monad/comonad interaction worked out on a simple example, it sounds really interesting but I don't understand how it's supposed to work (especially the comonadic part, how does it describe the dynamics?) and how it relates to higher categories... Has Neil Ghani written something on this approach?

Apparently this paper is the standard reference on this: https://homepages.inf.ed.ac.uk/gdp/publications/Math_Op_Sem.pdf

(Don't ask me any questions about it, I haven't read it!)

view this post on Zulip Tom Hirschowitz (Oct 19 2023 at 19:59):

Well, recently, @Sergey Goncharov, @Stelios Tsampas, @Henning Urbat, and Stefan Milius, but also @Ambroise and I, have worked on categorical approaches to operational semantics. They use a variant of the distributive law approach, while we model dynamics as edges (in a generalised kind of graph). Both groups have been mainly working on congruence of bisimilarity for now, but I believe both approaches may be used to prove other kinds of results. (This is getting a bit far away from the OP's original question, sorry.)

view this post on Zulip Patrick Nicodemus (Oct 20 2023 at 00:28):

Damiano Mazza said:

I would really like to see this monad/comonad interaction worked out on a simple example, it sounds really interesting but I don't understand how it's supposed to work (especially the comonadic part, how does it describe the dynamics?) and how it relates to higher categories... Has Neil Ghani written something on this approach?

I don't really know anything about this topic but Power and Watanabe have a paper on every possible way to have a monad distribute over a comonad or vice versa, and they cite a paper by Turi and Plotkin on this topic as their motivation.
"D. Turi, G. Plotkin, Towards a mathematical operational semantics"
Proceedings of the 12th Annual IEEE Symposium on Logic in Computer Science (LICS '97), IEEE Press, 1997, pp. 280–291

This appears to be inspired by earlier work of Turi.
"J. Rutten, D. Turi, Initial algebra and final coalgebra semantics for concurrency"
J. de Bakker et al. (Eds.), Proc. REX Workshop "A Decade of Concurrency: Reflections and Perspectives", Lecture Notes in Computer Science, Vol. 803, Springer, Berlin, 1994, pp. 530–582

view this post on Zulip Patrick Nicodemus (Oct 20 2023 at 00:29):

Brookes and Geva have also studied the distributivity of a monad over a comonad in computer science, but their motivations are different.
"S. Brookes, S. Geva, Computational comonads and intensional semantics"

view this post on Zulip Patrick Nicodemus (Oct 20 2023 at 00:34):

Damiano Mazza said:

Patrick Nicodemus said:

[...] I would just love to see papers which use either 2-category theory, poset enriched category theory, or infinity category theories to model reduction semantics. If they can say something interesting about strong normalization, cut elimination theorems, and logical consistency, so much the better!

With my former PhD students Luc Pellissier and Pierre Vial, we used $\mathbf{Cat}$-enriched cartesian multicategories and certain kinds of (very "loose") fibrations to model type systems for programming languages. [...]

Ok, thank you, this is really interesting; I think it will take me a bit to understand it. This has sparked some really good discussion, I am super thankful to all commenters.

view this post on Zulip Stelios Tsampas (Oct 20 2023 at 11:05):

Patrick Nicodemus said:

I don't really know anything about this topic but Power and Watanabe have a paper on every possible way to have a monad distribute over a comonad or vice versa, and they cite a paper by Turi and Plotkin on this topic as their motivation.
"D. Turi, G. Plotkin, Towards a mathematical operational semantics"
Proceedings of the 12th Annual IEEE Symposium on Logic in Computer Science (LICS '97), IEEE Press, 1997, pp. 280–291

Yes, very nice, the Turi and Plotkin framework is the basis of our work (the one that @Tom Hirschowitz, who has himself done super cool work on the foundations of operational semantics, mentioned; see e.g. https://arxiv.org/pdf/2210.13387.pdf and https://arxiv.org/abs/2302.08200).
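To give a flavour of the framework in its very simplest form, here is a toy sketch using a plain distributive law $\Sigma B \Rightarrow B\Sigma$ (the real framework uses the more general GSOS/bialgebraic machinery; all names here are illustrative):

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- Syntax functor: constant streams and pointwise sum of streams.
data Syn x = Const Int | Plus x x deriving Functor

-- Behaviour functor: one observation (the head) and a continuation (the tail).
data Beh x = Beh Int x deriving Functor

-- The distributive law Syn . Beh => Beh . Syn: given the behaviours of the
-- arguments, produce the behaviour of the compound term. This is where the
-- operational rules live.
rho :: Syn (Beh x) -> Beh (Syn x)
rho (Const n)                  = Beh n (Const n)          -- a constant stream
rho (Plus (Beh a s) (Beh b t)) = Beh (a + b) (Plus s t)   -- heads add, tails recurse

-- Closed terms, and the operational semantics they generate.
newtype Term = In (Syn Term)

run :: Term -> Beh Term
run (In s) = let Beh n k = rho (fmap run s) in Beh n (In k)

observe :: Int -> Term -> [Int]
observe 0 _ = []
observe n t = let Beh h k = run t in h : observe (n - 1) k

-- observe 3 (In (Plus (In (Const 1)) (In (Const 2)))) == [3,3,3]
```

The law only mentions one layer of syntax and one layer of behaviour, yet it determines the semantics of all terms; this is the shape from which results like congruence of bisimilarity follow in the general theory.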

I'll try to be brief and I can elaborate if you find this interesting. Lately we've extended said Turi and Plotkin approach to higher-order languages, and some core principles carry over to the higher-order setting. We can also do languages with variable binding such as the $\lambda$-calculus; for that we are making use of Fiore's foundations in "Abstract syntax and variable binding" that @Philip Saville mentioned.

The main goal of this track is not to give a computational aspect to the Curry-Howard-Lambek correspondence... it's more about coming up with a precise, satisfactory definition of operational semantics, generalizing and simplifying reasoning principles, that sort of stuff. However, in our latest (yet unreleased) work we have been playing around with an implementation of STLC in the higher-order Mathematical Operational Semantics style. This got me thinking about whether there is something to be said about the Curry-Howard-Lambek correspondence!

view this post on Zulip Matteo Capucci (he/him) (Oct 23 2023 at 08:29):

Jules Hedges said:

Damiano Mazza said:

I would really like to see this monad/comonad interaction worked out on a simple example, it sounds really interesting but I don't understand how it's supposed to work (especially the comonadic part, how does it describe the dynamics?) and how it relates to higher categories... Has Neil Ghani written something on this approach?

Apparently this paper is the standard reference on this: https://homepages.inf.ed.ac.uk/gdp/publications/Math_Op_Sem.pdf

(Don't ask me any questions about it, I haven't read it!)

It's weird because this paper is very, very famous in the CS crowd doing semantics of programming languages & so on. This and the paper with Fiore have around 500 citations each!
That said, I'm a big big fan of 2-cells-as-reductions and of the paper of Seely's mentioned by @Damiano Mazza . I'd love to see the two approaches coming together though, instead of having CT go off on its own tangent, ignoring the current practice in the field (I will now retreat into my shell... I'm sure people are aware of the current practice in the field)

view this post on Zulip Tom Hirschowitz (Oct 23 2023 at 08:46):

Wait, @Matteo Capucci (he/him), before retreating, could you please explain what you mean? :shell: Maybe something that wasn't clear from the discussion is why there are two different approaches: I think one main reason is that the 2-categorical approach takes “congruence” as built in, hence isn't a priori as expressive as the bialgebraic approach.
What I mean by congruence is that rewrites may occur anywhere in programs, which is only true for a restricted class of languages. Typically, you don't evaluate the body of an effectful function before it is applied.
Does this have to do with what you meant?

view this post on Zulip Matteo Capucci (he/him) (Oct 23 2023 at 08:51):

I'm not very familiar with the details, I only eavesdrop on programming language semantics. Is there a common use case of both approaches where you can compare?

view this post on Zulip Damiano Mazza (Oct 23 2023 at 09:01):

Tom Hirschowitz said:

What I mean by congruence is that rewrites may occur anywhere in programs, which is only true for a restricted class of languages.

But isn't this only accidentally true? I mean, it is true because people (including myself) have only considered approaches where, roughly, 1-cells $A\to B$ are programs of type $B$ with a free variable $x$ of type $A$, and composition is substitution for $x$. But other approaches are conceivable in which rewrites are still 2-cells (for instance, 1-cells could be evaluation contexts, which are usually closed under composition). I mean, I have never seriously done anything like this, so maybe it will fail for trivial reasons as soon as one tries to actually develop it, but at the moment I don't see why the 2-categorical approach must necessarily force rewrites everywhere...
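To make that alternative concrete, here is a toy sketch (names illustrative) where 1-cells are evaluation contexts and composition is plugging one context into the hole of the other:

```haskell
data Term = Var String | App Term Term | Lam String Term

-- Evaluation contexts: no hole goes under a Lam, so the rewrites generated
-- from these contexts cannot occur under binders.
data Ctx
  = Hole            -- the identity 1-cell
  | AppL Ctx Term   -- C t : the hole is in function position
  | AppR Term Ctx   -- t C : the hole is in argument position

plug :: Ctx -> Term -> Term
plug Hole       t = t
plug (AppL c u) t = App (plug c t) u
plug (AppR u c) t = App u (plug c t)

-- Composition of contexts: fill the hole of the first with the second.
-- It satisfies plug (compose c d) t == plug c (plug d t), with Hole as unit.
compose :: Ctx -> Ctx -> Ctx
compose Hole       d = d
compose (AppL c u) d = AppL (compose c d) u
compose (AppR u c) d = AppR u (compose c d)
```

Because such contexts deliberately omit going under binders, the rewrites one generates from them are restricted in just the way an evaluation strategy would dictate.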

view this post on Zulip Tom Hirschowitz (Oct 23 2023 at 09:33):

@Matteo Capucci (he/him) yes, sure, (potentially higher-order) rewrite systems exist on both sides. Did you mean that someone should bridge the gap? Like, define a, say “congruent” subcase of the bialgebraic approach in which one may construct a 2-category related in some suitable sense to the generated bialgebra?

@Damiano Mazza You're right of course, there are ways around this. Another way is to have a special constructor to trigger evaluation. But another problem is that 2-categories hardly model labelled transition systems, so you need to go to double categories. People like Roberto Bruni, Fabio Gadducci, and Ugo Montanari tried that in the 90's. I also spent quite some time trying to make this work, in vain. I'd be a bit jealous if someone succeeded, but also really glad, because it should work.

view this post on Zulip Matteo Capucci (he/him) (Oct 23 2023 at 10:40):

Tom Hirschowitz said:

Matteo Capucci (he/him) yes, sure, (potentially higher-order) rewrite systems exist on both sides. Did you mean that someone should bridge the gap? Like, define a, say “congruent” subcase of the bialgebraic approach in which one may construct a 2-category related in some suitable sense to the generated bialgebra?

I'm saying it'd be weird if there were nothing to learn from the current state of the art of categorical operational semantics, i.e. Turi-Plotkin semantics, and that if higher CT ever wants to be taken seriously in this domain (or, for that matter, any domain) then it should at least show how it relates to it.
Sorry, I sound like a grumpy old man. That's why I wanted to stay in my shell. I don't like being the umarel of the situation.

view this post on Zulip Tom Hirschowitz (Oct 23 2023 at 11:59):

So... this kind of sounds like a “yes”, right?

view this post on Zulip Matteo Capucci (he/him) (Oct 23 2023 at 12:14):

Yeah

view this post on Zulip Chris Barrett (Oct 24 2023 at 11:01):

I think it's certainly known by many who think about Curry-Howard-Lambek that you _could_ extend the Lambek side so that beta-reduction is modelled by 2-morphisms. When people consider beta-reduction/cut-elimination, they are often concerned with complexity-theoretic questions. In the translation to category theory -- even 2-category theory -- this information is lost. So my question is: what is gained from this perspective? One might even suggest that the _point_ of denotational semantics _is_ to reduce (beta-)equivalence to a stricter equality.

view this post on Zulip John Baez (Oct 24 2023 at 11:45):

Why does modelling beta reduction as a 2-morphism lose information about complexity?

view this post on Zulip Tom Hirschowitz (Oct 24 2023 at 12:42):

I guess @Chris Barrett means that general 2-cells don't have an intrinsic length, while reduction sequences do. But probably this could be remedied rather easily.

What is gained, on the other hand, is essentially that

This should be contrasted with the mainstream technology used in the programming languages literature, which consists in doing everything by hand (or sweeping things under the rug for clarity).

view this post on Zulip Damiano Mazza (Oct 24 2023 at 12:57):

Adding to (or, really, rephrasing) Tom's message, being able to have a higher-categorical account of reductions will hopefully give us better answers (than those available today) to the question "What is a programming language?", so there's definitely something to be gained. More in topic with this thread, if we had a good answer to the question "What is the Curry-Howard 'isomorphism' an isomorphism of?" that would be nice too I think. The length of reductions plays no role in these questions.

view this post on Zulip Damiano Mazza (Oct 24 2023 at 13:02):

It is true, though, that the answer to all these questions has an unspecified format (they are koans, as Jan Willem Klop would call them!), so the actual gain will depend on the nature of the answers...

view this post on Zulip Jean-Baptiste Vienney (Oct 24 2023 at 14:18):

Yes, there is no problem with the length of the reductions: for cut elimination in a logic, your 2-morphisms are freely generated by the reduction steps, so the notion of the length of a reduction is defined. Nothing in such a loss of information has to do with category theory; it's exactly like writing $x \rightarrow^* y$ instead of, say, $x \rightarrow^{n} y$ when a term $x$ is rewritten to a term $y$ in a rewriting system in $n$ steps. You choose whether you specify the lengths or not, with a categorical or non-categorical formalism; this is just a choice. But it is true that complexity issues have not been analysed with category theory as far as I know. It doesn't mean that it is impossible. The thing to do is to encode as much as CSists need in category theory.
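Schematically (a toy sketch, with the type of individual steps left abstract):

```haskell
-- If 2-cells are freely generated by single reduction steps, a rewrite
-- x ==> y is just a composable sequence of steps; vertical composition is
-- concatenation, so lengths add by construction.
newtype Rewrite step = Rewrite { steps :: [step] }

vcomp :: Rewrite s -> Rewrite s -> Rewrite s
vcomp (Rewrite ps) (Rewrite qs) = Rewrite (ps ++ qs)

len :: Rewrite s -> Int
len = length . steps
-- len (vcomp r s) == len r + len s: "length" is a functor to (N, +).
```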

view this post on Zulip Damiano Mazza (Oct 24 2023 at 15:09):

I agree. However, let me say that being able to count reduction steps in a programming language is such a marginal "complexity theoretic question" (I am quoting @Chris Barrett's original comment here), I mean relative to what complexity theorists actually do, that I would not see it as a convincing motivation for the development of a higher categorical Curry-Howard. It would be kind of like saying that some super interesting technique for graph theory is also relevant for category theory because, you know, in the end categories are just graphs :wink:

view this post on Zulip Mike Shulman (Oct 24 2023 at 15:41):

Yeah, the connection between $\beta$-reduction steps and the measures of computational complexity that are relevant for actual computation is rather fraught.

view this post on Zulip Tom Hirschowitz (Oct 24 2023 at 15:52):

@Mike Shulman I'm not sure whether @Damiano Mazza means this, or rather that being able to count steps hardly counts as complexity theory. At least, regarding the connection between $\beta$-reduction steps and complexity, work by Beniamino Accattoli -- which I'm sure Damiano knows better than me -- has considerably tightened the connection.

view this post on Zulip Chris Barrett (Oct 24 2023 at 15:59):

The length of a reduction sequence is not really the kind of question I had in mind -- it's not obviously a good measure of the complexity of beta-reduction, after all, since a single beta reduction step is not implemented in constant time. Now, probably one _could_ encode "as much as CSists need in category theory", but it is not obvious to me that the encoding would be a particularly natural one or that it would provide a useful perspective on those questions! Of course, I'd love to be wrong about this!
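For instance, a single $\beta$-step can duplicate its argument:

$$(\lambda x.\,\langle x, x\rangle)\,u \;\to_\beta\; \langle u, u\rangle,$$

so one step can double the size of a subterm, and chaining such redexes makes terms grow exponentially in the number of steps.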

@Tom Hirschowitz @Damiano Mazza I'd really love if you could expand on your answers to my question about what is gained! Tom, if I understand correctly your notion of signature would include some generating 2-reductions also? And then presumably you're imagining constructing some free such-and-such category over it. I'm struggling to really unpack your comments. In what sense is everything more typed?

view this post on Zulip Chris Barrett (Oct 24 2023 at 16:13):

Damiano Mazza said:

Picking up on an idea due to Melliès and Zeilberger, we view type systems as special kinds of functors, which we call Niefield functors and which enjoy a version of the Grothendieck construction [...] We call it the "fundamental theorem of intersection types".

I somehow managed to skip over this in my initial reading of the thread, but this is super interesting and I'm keen to take a closer look at your work on intersection types here!

view this post on Zulip John Baez (Oct 24 2023 at 16:17):

If the mere number of beta-reduction steps is not what one is concerned with, it seems one could use some other way(s) of measuring the complexity of 2-cells in a 2-categorical approach to the lambda calculus. I thought people were saying it's impossible to count the beta-reductions in a 2-cell, but now they seem to be saying they don't want to. That's fine. I'm just struggling to see if there's any real obstacle to studying complexity 2-categorically. Whether it's worth doing is a separate question, which has to come later.

view this post on Zulip Chris Barrett (Oct 24 2023 at 16:25):

John Baez said:

If the mere number of beta-reduction steps is not what one is concerned with, it seems one could use some other way(s) of measuring the complexity of 2-cells in a 2-categorical approach to the lambda calculus. I thought people were saying it's impossible to count the beta-reductions in a 2-cell, but now they seem to be saying they don't want to. That's fine. I'm just struggling to see if there's any real obstacle to studying complexity 2-categorically. Whether it's worth doing is a separate question, which has to come later.

This is my fault -- I first said that the complexity-theoretic information is "lost", but I don't believe this is necessarily the case: of course, one can probably encode almost anything in CT, since CT is very expressive. I do expect it to be lost, though, with the simple approach of considering beta steps as 2-morphisms.

view this post on Zulip Damiano Mazza (Oct 24 2023 at 16:26):

Tom Hirschowitz said:

being able to count steps hardly counts as complexity theory.

Yes, that's the spirit of my comment! Complexity theory is WAY more than counting (deterministic) time up to a polynomial, which is what we are able to do with certain restricted notions of $\beta$-reduction (which, incidentally, do not even compose, so they can't even be the 2-arrows of a 2-category!).

view this post on Zulip Damiano Mazza (Oct 24 2023 at 16:52):

John Baez said:

I'm just struggling to see if there's any real obstacle to studying complexity 2-categorically.

"Studying complexity 2-categorically" is a bit vague, so no, I don't think I can come up with any reason why it would be impossible :upside_down: I mean, for example, I don't think there's a conceptual obstacle to defining time complexity 2-categorically. What I know is that, doing it in the context of Curry-Howard as discussed in this thread (more precisely, via some categorification of CCCs) leads either to something worthless (unrelated to Turing machines) or to a very limited definition which would only capture a handful interesting complexity classes. That's a real obstacle, but of course it doesn't preclude from switching to a different 2-categorical model, or a categorification of CCCs which does not correspond to anything I have in mind now, which will turn out to do wonders for complexity theory.

My main point is that "studying complexity" is like "studying category theory". These are huge fields with countless ramifications and with specific viewpoints on things; the fact that we managed to express one very basic definition (like deterministic time complexity) doesn't mean we will be able to actually do anything relevant...

view this post on Zulip Mike Shulman (Oct 24 2023 at 16:57):

There's some recent work like https://arxiv.org/abs/2107.04663 that I think is intended to give a more flexible and modular way to incorporate notions of complexity into the $\lambda$-calculus.

view this post on Zulip Tom Hirschowitz (Oct 25 2023 at 06:56):

Tom, if I understand correctly your notion of signature would include some generating 2-reductions also? And then presumably you're imagining constructing some free such-and-such category over it. I'm struggling to really unpack your comments. In what sense is everything more typed?

For the first question, yes, that's the idea. It is developed in this paper.

For the last question, it's always been difficult for me to explain, so if others have better ways, please don't hesitate to help!

In a sense, @Damiano Mazza already answered this question: it's about trying to answer the question “what is a programming language”?
And also how and from what do we generate them?

I don't know how familiar you are with programming language research, but, taking a few steps back, what the average paper does is construct a mathematical object by induction (and perhaps some quotienting for $\alpha$-equivalence), without attempting to give it a type. This process is almost the same across the literature, so you quickly recognise and name the objects involved: you first have a grammar, maybe some typing rules, and then some reduction or transition rules, which generate a (maybe indexed) set and a (maybe labelled) binary transition relation. What we're doing is defining these notions formally, and giving a universal property and generic construction of the set-and-binary-relation from the grammar and transition rules. We use this to turn methods into theorems. This is what @Sergey Goncharov, @Stelios Tsampas, Henning and Stefan, as well as @Ambroise and I, did with Howe's method. Well, to be fair, there were theorems about it, but we generalised them.

Does this help?!

view this post on Zulip Chris Barrett (Oct 25 2023 at 14:07):

Thanks, Tom, that helps me understand where you're coming from! I suppose what I was imagining initially was a kind of 2-category for lambda-terms but, taking a broader perspective, I can see how one would be motivated to find a way to unify and study in generality the many extensions of the lambda-calculus (for example) that people consider in the literature. I look forward to taking a closer look at the work you mentioned to understand better.

By the way, I hope my initial comment was not too pointed -- I am very interested in this line of work, since it seems such a natural extension of Curry-Howard-Lambek. I do consider myself a PL theorist btw, albeit a young one!

This is a vague musing, but I have wondered before: perhaps instead of considering higher CT as a way to understand type/proof theory, one could use tools from type/proof theory to understand higher CT -- in particular, I wonder if one could use (some variant of) cut elimination to give proofs of coherence for higher categories.

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 14:40):

I think it is indeed a very good idea. I'm only aware of applications of cut elimination to first-order categorical doctrines, possibly replacing their equality by 2-morphisms

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 14:45):

I think that anyone who knows a bit of proof theory and multiplicative linear logic understands that using its cut elimination is the most natural way to prove the Mac Lane coherence theorem for symmetric monoidal categories. Federico Olimpieri proves coherence for some kinds of multicategories using a "resource calculus": Coherence by Normalization for Linear Multicategorical Structures. And one of my favorite papers, by Hasegawa: A Categorical Reduction System for Linear Logic. He says in the abstract that this "gives the first step towards the mechanization of diagram chasing" and honestly, I'd like to extend it, in particular to differential categories, to obtain automatic proofs of commutative diagrams about differentiation.

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 14:53):

I was thinking about creating a sequent calculus for abelian categories and then applying the same kind of methodology to mechanize diagram chasing, but I didn't succeed in getting a calculus with cut elimination. Indeed, abelian categories, which rely heavily on negative numbers, don't sound very compatible with proof theory. I don't understand precisely why, but negative numbers interact badly with logic, and several people have told me that. But one could maybe build a sequent calculus for regular logic, obtain cut elimination, and apply the same kind of methodology to mechanize diagram chasing in this context.

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 14:55):

There is a variety of logical systems, of which regular logic is one, whose dynamics have apparently never been studied by anybody (or maybe I just don't know the papers), because they come from pure mathematics rather than computer science. I'm thinking of regular logic, coherent logic and geometric logic

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 15:28):

People know about Curry-Howard, but what is important to me would be to apply it to concrete settings in mathematics, in order to apply techniques from proof theory / computer science to the categories mathematicians are interested in. This implies that some people need to know at the same time pretty abstract mathematics and proof theory / lambda calculus etc. to make progress in this wonderful direction. I'm super excited by the paper of Hasegawa, as well as by Bewl, because they are going in this direction of applying computer science to mathematical structures, in order to make mathematics easier to work with.

view this post on Zulip Philip Saville (Oct 25 2023 at 15:33):

This is a vague musing, but I have wondered before: perhaps instead of considering higher CT as a way to understand type/proof theory, one could use tools from type/proof theory to understand higher CT -- in particular, I wonder if one could use (some variant of) cut elimination to give proofs of coherence for higher categories.

I really like Federico's line of thinking on this -- there is a well-developed theory around standardisation etc that seems likely to tell us things about coherence. Marcelo and I did something similar but more semantic than syntactic here, where we used his semantic take on normalisation by evaluation to prove normalisation of a 2-dimensional lambda calculus and hence a strong form of coherence for cartesian closed bicategories. NBE is such a powerful tool I've always expected similar results to work for other structures, but I've not seen it done yet.

Something I'd also like to know is whether this 2-dimensional point of view (1-cells as morphisms / terms, 2-cells as 2-cells / rewrites) has anything useful to say about proof transformations in the way they talk about them in proof theory. I still don't know enough proof theory to be able to say either way

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 15:46):

There is the problem of knowing what kind of natural transformations proofs must be interpreted by when a logical system is modeled 2-categorically; in fact the problem is already there at the 1-categorical level. Interpreting proofs by morphisms and formulas by objects is correct but very far from ideal. In fact it is better to interpret formulas by functors and proofs by natural transformations.

I don't remember exactly in what setting, but often proofs are not ordinary natural transformations but rather extranatural or dinatural transformations. And I think it's still not known when you can compose them (indeed you can't always compose them).

I'm not sure it's related to your question, but it made me think of this. It's just that 2-categories made me think of natural transformations, though this is not the same 2-categorical structure as the one for rewriting

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 15:48):

So, the good way to interpret reduction would be as some kind of $2$-morphisms between extranatural transformations ($3$-morphisms is maybe a better word), or something like this. So Curry-Howard is maybe something 3-categorical in the end

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 15:52):

Thinking about it, I'm not sure a lot of people are aware of this

view this post on Zulip Chris Barrett (Oct 25 2023 at 15:55):

Jean-Baptiste Vienney said:

There is the problem of knowing what kind of natural transformations proofs must be interpreted by when a logical system is modeled 2-categorically; in fact the problem is already there at the 1-categorical level. Interpreting proofs by morphisms and formulas by objects is correct but very far from ideal. In fact it is better to interpret formulas by functors and proofs by natural transformations.

I don't remember exactly in what setting, but often proofs are not ordinary natural transformations but rather extranatural or dinatural transformations. And I think it's still not known when you can compose them (indeed you can't always compose them).

I'm not sure it's related to your question, but it made me think of this. It's just that 2-categories made me think of natural transformations, though this is not the same 2-categorical structure as the one for rewriting

A contemporary of mine during my PhD, Alessio Santamaria, all but settled the question of compositionality of dinaturals in his PhD thesis, I believe.

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 15:57):

Oh, he says it's related to acyclicity of some graphs so it sounds somehow related to the correctness criterion for proof nets

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 16:00):

Interesting, that makes two different but related subjects that talk about acyclicity of graphs

view this post on Zulip Chris Barrett (Oct 25 2023 at 16:01):

Jean-Baptiste Vienney said:

I was thinking about creating a sequent calculus for abelian categories and then applying the same kind of methodology to mechanize diagram chasing, but I didn't succeed in getting a calculus with cut elimination. Indeed, abelian categories, which rely heavily on negative numbers, don't sound very compatible with proof theory. I don't understand precisely why, but negative numbers interact badly with logic, and several people have told me that. But one could maybe build a sequent calculus for regular logic, obtain cut elimination, and apply the same kind of methodology to mechanize diagram chasing in this context.

Thanks for the links to those papers, I'm excited to take a look. This idea of (as I would imagine it) applying proof theoretic techniques to more free-categories-with-structure than are typically considered when one thinks of "logic" is something I've thought a bit about, and I think could definitely be interesting. I'm writing up at the moment a calculus for rig categories with biproducts, for instance -- more focussed on the PL than proof theory side, though.

About negatives: I know they generally don't interact well with infinite dimensions because of diverging sums, and I've noticed this problem too in similar-ish investigations.

view this post on Zulip Chris Barrett (Oct 25 2023 at 16:01):

Jean-Baptiste Vienney said:

Oh, he says it's related to acyclicity of some graphs, so it sounds somehow related to the correctness criterion for proof nets.

Yes, it very much looks that way! The proof has a geometry-of-interaction / token-passing feel, too.

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 16:08):

Chris Barrett said:

Jean-Baptiste Vienney said:

I was thinking about creating a sequent calculus for abelian categories and then applying the same kind of methodology to mechanize diagram chasing, but I didn't succeed in getting a calculus with cut elimination. Indeed, abelian categories, which rely heavily on negative numbers, don't sound very compatible with proof theory. I don't understand precisely why, but negative numbers interact badly with logic, and several people have told me that. But one could maybe build a sequent calculus for regular logic, obtain cut elimination, and apply the same kind of methodology to mechanize diagram chasing in that context.

Thanks for the links to those papers, I'm excited to take a look. This idea of (as I would imagine it) applying proof-theoretic techniques to more free-categories-with-structure than are typically considered when one thinks of "logic" is something I've thought a bit about, and I think it could definitely be interesting. I'm currently writing up a calculus for rig categories with biproducts, for instance -- more focussed on the PL side than the proof theory side, though.

About negatives: I know they generally don't interact well with infinite dimensions because of diverging sums, and I've noticed this problem too in similar-ish investigations.

Oh, and maybe it has something to do with fixed points? I don't understand anything about the role of fixed points in computer science. But Lionel Vaux, in his paper The Algebraic Lambda-Calculus, introduces a $\lambda$-term that he writes $\infty_s$, and says that in the presence of negative numbers big problems happen. That's as far as I understand it. But now that you tell me it's related to infinite sums, it sounds easier to understand: the problem is that $\infty - \infty \neq 0$...
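
(Roughly the shape of the problem, as far as I can reconstruct it; this is my paraphrase, not Vaux's exact statement: with a fixed point one can build, for any term $s$, a term $\infty_s$ with $\infty_s \to s + \infty_s$. If negative coefficients are allowed, then

$$0 \;=\; \infty_s - \infty_s \;=\; (s + \infty_s) - \infty_s \;=\; s,$$

so every term gets identified with $0$ and the calculus collapses.)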

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 16:10):

However, even without infinite sums, in an abelian category I didn't know how to deal with the kernels, cokernels, etc. in a cut elimination. But fortunately it's possible to do math without negative numbers, and there are no negative numbers in regular categories, for instance, which are the kind of categories resembling the category of sets and relations, and are much more general than abelian categories.

view this post on Zulip Chris Barrett (Oct 25 2023 at 16:21):

In the weighted relational models of LL, proofs are interpreted as matrices over some complete semiring (in the sense that every infinite sum converges), with completeness necessary for dealing with the infinities that crop up when interpreting the exponential $!$. IIRC, there are no complete rings because there is no well-defined way to understand the infinite sum $a+(-a)+a+(-a)+\cdots$. However, the coherence space models and similar give another way of resolving this issue. This is getting a bit out of my depth: perhaps I will say something wrong.
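
(Spelling out the standard regrouping argument, for anyone following along: if $s = a + (-a) + a + (-a) + \cdots$ existed and infinite sums could be re-bracketed, then

$$s = (a + (-a)) + (a + (-a)) + \cdots = 0 \qquad\text{and}\qquad s = a + ((-a) + a) + ((-a) + a) + \cdots = a,$$

so $a = 0$ for every $a$: only the trivial ring could be complete.)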

view this post on Zulip Chris Barrett (Oct 25 2023 at 16:22):

I suspect Vaux's comments are related to this issue of completeness. But your issues with Abelian categories may not be related!

view this post on Zulip Chris Barrett (Oct 25 2023 at 16:23):

I'm interested in your investigations on Abelian categories. My thinking on rig categories with biproducts is like a baby version of this (homsets are commutative monoids, not abelian groups; no notion of co/kernels, etc.)
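
(To make the "baby version" concrete, here is a minimal sketch, my own toy illustration rather than the calculus I'm writing up: with biproducts, maps between finite biproducts of a generator behave like matrices over the rig of scalars, composition is matrix multiplication, and hom-sets are commutative monoids under entrywise sum.)

```haskell
import Data.List (transpose)

-- Scalars form a rig (semiring): two monoids, with <.> distributing over <+>.
class Semiring r where
  zero, one    :: r
  (<+>), (<.>) :: r -> r -> r

-- The integers as a stand-in rig of scalars.
instance Semiring Integer where
  zero  = 0
  one   = 1
  (<+>) = (+)
  (<.>) = (*)

-- A map X^n -> X^m between biproducts of a generator X,
-- represented as an m-by-n matrix of scalars (rows index the target).
newtype Mat r = Mat [[r]] deriving Show

-- Composition is matrix multiplication: (f . g)_ij = sum_k f_ik g_kj.
compose :: Semiring r => Mat r -> Mat r -> Mat r
compose (Mat f) (Mat g) =
  Mat [ [ foldr (<+>) zero (zipWith (<.>) row col)
        | col <- transpose g ]
      | row <- f ]

-- The identity on X^n.
identityMat :: Semiring r => Int -> Mat r
identityMat n =
  Mat [ [ if i == j then one else zero | j <- [1..n] ] | i <- [1..n] ]
```

(Rewriting on the syntactic side should then correspond to computing this composite; the actual calculus of course has to handle much more than this.)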

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 16:24):

I didn't do much, honestly; I just wrote down a sequent calculus on a sheet of paper one day and realized that cut elimination didn't seem to work well.

view this post on Zulip Chris Barrett (Oct 25 2023 at 16:26):

It's a delicate thing, indeed!

view this post on Zulip Chris Barrett (Oct 25 2023 at 16:29):

Jean-Baptiste Vienney said:

People know about Curry-Howard, but what is important to me would be to apply it to concrete settings in mathematics, in order to apply techniques of proof theory / computer science to the categories mathematicians are interested in. It implies that some people need to know pretty abstract mathematics and proof theory / lambda calculus etc. at the same time to make progress in this wonderful direction. I'm super excited by the paper of Hasegawa, as well as by Bewl, because they are going in this direction of applying computer science to mathematical structures, in order to make mathematics easier to work with.

Btw what is the work by Bewl you mention here?

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 16:30):

Chris Barrett said:

Jean-Baptiste Vienney said:

People know about Curry-Howard, but what is important to me would be to apply it to concrete settings in mathematics, in order to apply techniques of proof theory / computer science to the categories mathematicians are interested in. It implies that some people need to know pretty abstract mathematics and proof theory / lambda calculus etc. at the same time to make progress in this wonderful direction. I'm super excited by the paper of Hasegawa, as well as by Bewl, because they are going in this direction of applying computer science to mathematical structures, in order to make mathematics easier to work with.

Btw what is the work by Bewl you mention here?

Ahah, it's the work by Felix Dilke, cf. the topic "Introducing me and Bewl..."

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 16:30):

It's a programming language, not a person.

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 16:33):

But about abelian categories: I could try again when I have more time. Right now I must work on courses because I am at the beginning of my PhD, in America. But regular categories are nice too. People working on categorical algebra, like Dominique Bourn, who wrote the nice book From Groups to Categorial Algebra: Introduction to Protomodular and Mal'tsev Categories, are already using more general structures than abelian categories, from a pure math perspective. Abelian categories seem to be insufficient in mathematics.

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 16:35):

And I suspect that every regular category is a model of linear logic: they are the categories resembling the category $\mathbf{Rel}$.
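
(For the record, the structure I have in mind on $\mathbf{Rel}$ itself, which is the standard one: $A \otimes B = A \times B$, the duality is trivial in the sense that $A^\perp = A$, both additives $\&$ and $\oplus$ are given by disjoint union, and $!A$ is the set of finite multisets on $A$. The question would be how much of this survives in a general regular category.)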

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 16:36):

But for sure, you can't do as much in a regular category as in an abelian category.

view this post on Zulip Jean-Baptiste Vienney (Oct 25 2023 at 16:40):

(At least you lose some basic results and techniques related to negative numbers, which sounds like a lot.)

view this post on Zulip Kevin Arlin (Oct 25 2023 at 17:15):

Exact categories (in the sense of Barr) are, much more precisely, the right analogue of abelian categories without the additivity. In particular, a category is abelian if and only if it is additive and exact; regular doesn't suffice.

view this post on Zulip Patrick Nicodemus (Oct 27 2023 at 00:41):

Kevin Arlin said:

Exact categories (in the sense of Barr) are, much more precisely, the right analogue of abelian categories without the additivity. In particular, a category is abelian if and only if it is additive and exact; regular doesn't suffice.

@Kevin Arlin you say additive, can you get away with preadditive?

view this post on Zulip Kevin Arlin (Oct 27 2023 at 01:17):

Yeah, an exact category is in particular regular, so it has finite products, which are biproducts in any preadditive category.
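
(Spelling the standard argument out: in a preadditive category with a product $A \xleftarrow{p_A} A \times B \xrightarrow{p_B} B$, define $i_A = (1_A, 0) : A \to A \times B$ and $i_B = (0, 1_B) : B \to A \times B$. Then $p_A i_A = 1_A$, $p_B i_B = 1_B$, $p_A i_B = 0$, $p_B i_A = 0$, and $i_A p_A + i_B p_B = 1_{A \times B}$, since composing it with either projection gives that projection back. These are exactly the biproduct equations, so the product is automatically a coproduct as well.)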