You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For anything related to this archive, refer to the same person.
Inspired by @Mike Shulman's talk, I have put some work into getting a little linear logic theorem prover I built a few months back into shape for public consumption: https://github.com/akissinger/pypn
It proves a formula by trying to build a proof net satisfying some correctness criterion, which is pluggable by the user. I've implemented the (MLL+MIX version of the) Danos-Regnier conditions, and also some slightly expanded variants, as part of trying to find the "right" notion of a logic for #practice: applied ct > causal inference.
Happy to chat about all the above. Happier if other people decide they want to have a bit of a hack on the theorem prover. :)
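To give a concrete feel for what a pluggable correctness check can look like, here is a minimal standalone sketch of the (MLL+MIX) Danos-Regnier criterion, checked by brute force over switchings. This is not PyPN's actual code: the function and argument names are invented for the example, it assumes the networkx library, and it only handles simple graphs (no parallel edges).

```python
# Minimal sketch of the Danos-Regnier check, assuming networkx.
# NOT PyPN's actual code: all names here are made up for illustration.
from itertools import product
import networkx as nx

def dr_correct(edges, par_premises, mix=True):
    """Brute-force Danos-Regnier check over all switchings.

    edges        -- iterable of (u, v) pairs: the edges of the proof structure
                    (assumed simple, i.e. no parallel edges)
    par_premises -- dict sending each par node to its two premise edges
    mix          -- True: require each switching graph to be acyclic (MLL+MIX);
                    False: additionally require it to be connected (plain MLL)
    """
    base = nx.Graph()
    base.add_edges_from(edges)
    pars = list(par_premises.values())

    # A switching keeps exactly one premise edge of every par node.
    for choice in product((0, 1), repeat=len(pars)):
        g = base.copy()
        for (e0, e1), keep in zip(pars, choice):
            g.remove_edge(*(e1 if keep == 0 else e0))  # drop the other premise
        if not nx.is_forest(g):
            return False
        if not mix and not nx.is_connected(g):
            return False
    return True

# Tiny example: the net proving ⊢ a⊥ ⅋ b⊥, a ⊗ b
# (axiom links a-a*, b-b*; tensor node t over a, b; par node p over a*, b*).
edges = [("a", "a*"), ("b", "b*"), ("a", "t"), ("b", "t"), ("a*", "p"), ("b*", "p")]
pars = {"p": (("a*", "p"), ("b*", "p"))}
assert dr_correct(edges, pars, mix=False)
```

Either way this is exponential in the number of par nodes, which is where the polynomial-time criteria discussed further down come in.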
what's the + there?
is it supposed to be ⅋ ?
yes
that seems a little misleading when there's also a ⊕ connective in linear logic >.>
neat
what kind of proof net is this?
it's a "two-sided" variation of the usual notion of an MLL proof net
hmm
so axiom links are now just identities & cuts are now just composition?
Some notations for linear logic use + and circled +.
:scream:
I forget if they also use × and circled ×, though.
(not at the notation itself necessarily, just at the total incompatibility of the giant array of notations)
yes, this is actually not so uncommon :)
this is all girard's fault for picking ⅋ instead of something that was, like, possible to typeset
my choice for using plus was pretty much pragmatic, in that it has the right precedence in the "host" language (python)
One might hope that the notation would reflect the fact that one of the conjunctions is a categorical product and one of the disjunctions is a coproduct (I think), but it might not follow that convention, either.
it does not
the product is & and the coproduct is ⊕
Yeah.
sarahzrf said:
so axiom links are now just identities & cuts are now just composition?
It has axioms and cuts too, but the distinction between axiom, cut, and identity is pretty much syntactic, i.e. whether you are proving something in one-sided style like ⊢ Γ or two-sided like Γ ⊢ Δ.
...or do something crazy like ""
I feel compelled to say that the notations for the linear logic connectives are far from arbitrary, as Girard himself has stressed several times. They were initially meant as mnemonics for the distributivity rules: ⊗ distributes over ⊕, and ⅋ distributes over &. These rules turned out to be related to a deeper phenomenon, polarity: & and ⅋ are "negative" connectives whose sequent calculus rules are invertible.
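For reference, in the standard presentation the distributivities in question are the provable equivalences

A ⊗ (B ⊕ C) ⊣⊢ (A ⊗ B) ⊕ (A ⊗ C)
A ⅋ (B & C) ⊣⊢ (A ⅋ B) & (A ⅋ C)

mirroring how × distributes over +, together with the De Morgan dualities (A ⊗ B)⊥ ≡ A⊥ ⅋ B⊥ and (A ⊕ B)⊥ ≡ A⊥ & B⊥.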
unfortunately, polarity cuts across additive vs multiplicative, and the latter is more blatantly on display a lot of the time
so it mightve been a better idea to prioritize it in the notation
:shrug:
Yeah, you would like the symbols for dual connectives to remind one of each other… Taking all those constraints into account is not an easy design problem :p
indeed!
i dont hate the standard ⊗ ⅋ & ⊕, and ive gotten used to them, but i do think it wld probably have been better to swap ⊕ and ⅋ or something
although then you have the problem that ⊕ is a multiplicative and not an additive... :cold_sweat:
oy gevalt
BTW, if anybody wants to implement a correctness check for MLL+Mix that isn't exponential-time, the way to do so would be to use Retoré's reduction to alternating elementary cycles for perfect matchings, and then use the linear-time algorithm for the latter. (I point this out in one of my papers.)
maybe we should all switch to mike shulman's notation
In my experience a lot of people don't realize how fundamental Retoré's work on proof nets is (but some people in the deep inference community do get it)
hmm, now you're talking about stuff out of my league :sweat_smile:
Well IMO this stuff is a lot simpler (or at least less abstract) than categories, it's just discrete math / graph theory
i mean in terms of what i currently know—im sure i could learn about it if i put the time in
i havent gone super deep w/ proof nets—cant remember any correctness criteria off the top of my head or anything
My pitch would be: perhaps the most difficult result of Girard's original "Linear Logic" paper was proving that MLL proof nets enjoying the correctness criterion are exactly those that can be obtained from the MLL sequent calculus. MLL+Mix adds further difficulties. What Retoré shows is that the problem for MLL+Mix is basically equivalent to a result from the 1950s in graph theory, so logicians didn't have to do the whole thing from scratch.
ooh nice
yeah lol i remember skimming thru that paper and scrolling past pages and pages of material on long trips
And in turn this result (Kotzig's theorem) is equivalent to a lot of other things that pop up in graph theory, including IIRC a notoriously difficult lemma from one of the Robertson-Seymour graph minor papers; there's a paper by Stefan Szeider that compiles some of those equivalent versions.
So with this, instead of being this isolated thing from proof theory, you have a phenomenon which is actually connected to mainstream graph theory.
every polyhedral graph has an edge whose two endpoints have total degree at most 13
wat
I'm not talking about that
It's the one about perfect matchings: https://arxiv.org/pdf/1402.0949.pdf
And Szeider's paper: https://www.ac.tuwien.ac.at/files/pub/szeider-AC-2004.pdf
(Damn those people who are productive enough to have multiple well-known theorems named after them!)
I guess while I'm at it I can plug my own work: https://lmcs.episciences.org/6172
nice
Oh and I forgot the most important, Retoré's paper: https://www.sciencedirect.com/science/article/pii/S030439750100175X
(What I'm talking about is just the short first section of that paper by Retoré; the rest is dedicated to an equally important idea of representing proof nets using cographs, which is at the heart of a lot of subsequent developments, e.g. Dominic Hughes's combinatorial proofs for propositional classical logic)
Regarding notation, in both category theory and logic it's pretty standard to use rotated/reflected variant notations for pairs of notions that are categorical / De Morgan duals. Thus × and + for cartesian products and coproducts, ∩ and ∪ for intersections and unions, ∧ and ∨ for classical conjunction and disjunction, and ⊗ and... ??
I think the most sensible notation is to use cartesian/lattice notations like ×/+ or ⊓/⊔ for the additive connectives, since they are in fact categorical (co)products / posetal meets/joins. The problem then is to find a notational pair for the multiplicatives that displays their De Morgan duality but doesn't clash with other expectations too much. Cockett-Seely used ⊗ and ⊕, but I just can't handle ⊕ for a tensor product; it has too strong a connotation of a direct sum / biproduct for me. Jeff Egger has suggested \owedge and \ovee, which is not bad except that I have a hard time distinguishing them visually. The notation I used in the talk, with ⊠ and a rotated version thereof, is the best I've found.
on the other hand, ⅋ just looks cool :pensive:
Its main value seems to be as a shibboleth for linear logicians.
you're probably right—altho i mean, would a turned ⊠ not end up playing the same role?
It's much easier to draw. For me at least.
oh true, totally forgot about handwriting :upside_down:
personally i just had a point around when i got into linear logic where i sat down and spent 20 minutes writing ⅋s over and over working out how to do it
Mike Shulman said:
I think the most sensible notation is to use cartesian/lattice notations like ×/+ or ⊓/⊔ for the additive connectives, since they are in fact categorical (co)products / posetal meets/joins. The problem then is to find a notational pair for the multiplicatives that displays their De Morgan duality but doesn't clash with other expectations too much. Cockett-Seely used ⊗ and ⊕, but I just can't handle ⊕ for a tensor product; it has too strong a connotation of a direct sum / biproduct for me. Jeff Egger has suggested \owedge and \ovee, which is not bad except that I have a hard time distinguishing them visually. The notation I used in the talk, with ⊠ and a rotated version thereof, is the best I've found.
I like Jeff's notation because then it matches with the notation for the monoidal units. Also, the duality of both monoidal structures is expressed in this symmetry along the horizontal axis. As long as it is not the upside down ampersand, because I find it impossible to write legibly.
@Nguyễn Lê Thành Dũng thanks for the refs. I find it quite remarkable that so many different-looking correctness criteria, of lots of different complexities, are equivalent. I'm essentially teaching myself them by coding them into PyPN. Maybe I'll have a go at Retoré's next.
PyPN does already have one polynomial correctness checker for MLL+MIX, cut_checker
which repeatedly removes nodes, subject to the condition that removing a "tensor-like" node disconnects its neighbours. This is quadratic I think. I convinced myself this is correct by reading some papers on (ISO)MIX by Cockett & Seely, but probably there's a better ref for this.
It seems that most of the newer correctness criteria are actually easier to implement for MIX, rather than harder, i.e. usually there is something additional you have to check about the net when you don't have MIX. For instance, without MIX, cut_checker
would need to check that, whenever you remove a par-like node, the graph doesn't disconnect.
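For concreteness, here is a standalone sketch (not the actual cut_checker) of the two connectivity tests that description relies on, assuming networkx; the function names are hypothetical.

```python
# Standalone illustration of the two connectivity tests described above,
# assuming networkx. NOT PyPN's actual cut_checker; names are hypothetical.
import networkx as nx

def removal_disconnects_neighbours(g, node):
    """Test for 'tensor-like' nodes: after deleting `node`, do at least two of
    its former neighbours end up in different connected components?"""
    nbrs = list(g.neighbors(node))
    if len(nbrs) < 2:
        return False
    h = g.copy()
    h.remove_node(node)
    comp = {v: i for i, cc in enumerate(nx.connected_components(h)) for v in cc}
    return len({comp[n] for n in nbrs}) > 1

def removal_keeps_connected(g, node):
    """Extra test needed for 'par-like' nodes when MIX is dropped:
    does the graph stay connected after deleting `node`?"""
    h = g.copy()
    h.remove_node(node)
    return h.number_of_nodes() == 0 or nx.is_connected(h)
```

Each test is linear in the size of the graph, so performing one per removal step over all nodes gives roughly the quadratic behaviour mentioned above.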
The exception to this seems to be the "contraction to a point" criterion that appears in Danos' PhD thesis. I couldn't see any particularly obvious way to incorporate MIX into this, which is a bit of a pity because it is perhaps the most conceptually interesting for the application I have in mind (causality and signalling for "black-box" processes).
Aleks Kissinger said:
PyPN does already have one polynomial correctness checker for MLL+MIX,
cut_checker
which repeatedly removes nodes, subject to the condition that removing a "tensor-like" node disconnects its neighbours. This is quadratic I think. I convinced myself this is correct by reading some papers on (ISO)MIX by Cockett & Seely, but probably there's a better ref for this.
IIRC something like this also appears in Danos's PhD.
Aleks Kissinger said:
The exception to this seems to be the "contraction to a point" criterion that appears in Danos' PhD thesis. I couldn't see any particularly obvious way to incorporate MIX into this, which is a bit of a pity because it is perhaps the most conceptually interesting for the application I have in mind (causality and signalling for "black-box" processes).
Indeed, I spent a lot of time thinking about this and couldn't find any way to do so either. My feeling is that the contractibility criterion fundamentally relies on some properties that are there in MLL but not in MLL+Mix. For instance, if a subgraph of an MLL proof net is itself a correct MLL proof net, then it may appear as a subproof of some sequentialization of the bigger net; there is no analogous property for MLL+Mix (you have to distinguish what Bellin calls "normal subnets").
Another exception is the criterion by Virgile Mogbil & Paulin de Naurois that shows that MLL proof net correctness is NL-complete (NL = non-deterministic logarithmic space) – for MLL+Mix the precise complexity is not really known (applying an old result of László Lovász gives you randomized NC complexity, which is believed to be strictly smaller than P).
Cole Comfort said:
I like Jeff's notation because then it matches with the notation for the monoidal units.
Argh, but then you have a clash with the traditional notations: ⊤ and ⊥ are the units for the additive conjunction and the multiplicative disjunction respectively in linear logic. I'm fine with emphasizing duality instead of polarities, but then IMO you should use fresh symbols to avoid confusion.
(Personally I've come to like the traditional notations.)
Nguyễn Lê Thành Dũng said:
(Personally I've come to like the traditional notations.)
Same. Not for any well-considered reasons, but just that there's something satisfying about drawing a well-executed par symbol. :)
The ASCII notations I'm using in PyPN come from the limitations of Python and at least partially from the notations used by llprover, which is a 20+ year old prover for CLL and its fragments, written in Prolog.
+1 for the Girard notation. I like that ⊗ and ⊕ fit with tensor product and direct sum in categories of ring modules, which I see as the prototypical symmetric monoidal closed cats. It feels right that they should go together, and that with/par should be in the same relation in the opposite category.
actually i kinda like that way of thinking about it now that ive heard it :)
I would tend to think of Set as a more "prototypical" symmetric monoidal closed category.
And ring modules are not ∗-autonomous, so their notation doesn't really suggest what should be used for that case.
And the point is that multiplicative conjunction and additive disjunction don't "go together". What "goes" with multiplicative conjunction is multiplicative disjunction.
But probably we should stop before we get any further down the rabbit hole of Wadler's Law.
The 'going together' was having a distributive rule. But of course, that's just one choice of 'going together'.
I think if you connect the notation up with the categorical structure, there are ways of remembering the distribution that are at least as good as the Girard notation, though. The tensors distribute across the relevant (co)product.
so what's the linear logic notation for comments?
the possibility of going further down the rabbit hole of wadler's law necessitates there being one, surely
One problem with emphasizing distribution in order to remember the notation is that frequently we work with fragments like MLL that don't have the additives, hence distribution can't even be discussed.
@Aleks Kissinger I just found out about this implementation of various algorithms on proof nets in Scala: https://staff.aist.go.jp/s-matsuoka/PNCalculator/index.html
cool! Thanks for pointing this out.