Category Theory
Zulip Server
Archive

You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For all things related to this archive refer to the same person.


Stream: deprecated: logic

Topic: treating implication as fundamental in linear logic


Spencer Baugh (Aug 30 2021 at 04:08):

So far in the presentations of linear logic that I've seen, additive and multiplicative conjunction and disjunction are treated as fundamental and implication is defined in terms of those. In particular, so far all the presentations of linear lambda calculus that I've seen have (various linear forms of) lambda as a primitive, but also have primitives for conjunction and disjunction. That's different from other lambda calculi which often prefer to use lambda as a primitive, and define conjunction and disjunction with Church encoding, rather than have data types as a distinct primitive. (e.g. System F)

Is it possible to have linear logic with implication as the foundation? In particular, is it possible to have a linear lambda calculus where the conjunction and disjunction forms are not primitive, but are recreated with (various linear forms of) lambda terms? What would that look like? (it seems easy and standard for multiplicative conjunction and additive disjunction - it's just Church encoding - but I have no idea if it can be done for additive conjunction and multiplicative disjunction)
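
(For reference, the ordinary, non-linear Church encodings mentioned above can be sketched in Haskell with RankNTypes; the names below are just illustrative, and nothing here enforces linearity.)

```haskell
{-# LANGUAGE RankNTypes #-}

-- Church-encoded product and sum, System F style. Plain Haskell arrows,
-- so values can be duplicated or dropped freely (no linearity).
newtype Pair a b = Pair (forall x. (a -> b -> x) -> x)
newtype Sum  a b = Sum  (forall x. (a -> x) -> (b -> x) -> x)

pair :: a -> b -> Pair a b
pair a b = Pair (\k -> k a b)

fstP :: Pair a b -> a
fstP (Pair p) = p (\a _ -> a)        -- silently drops the second component

inl :: a -> Sum a b
inl a = Sum (\l _ -> l a)

inr :: b -> Sum a b
inr b = Sum (\_ r -> r b)

caseSum :: Sum a b -> (a -> x) -> (b -> x) -> x
caseSum (Sum s) = s
```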

Damiano Mazza (Aug 30 2021 at 07:44):

Spencer Baugh said:

That's different from other lambda calculi which often prefer to use lambda as a primitive, and define conjunction and disjunction with Church encoding, rather than have data types as a distinct primitive. (e.g. System F)

Actually, as far as I know, even in intuitionistic logic conjunction and disjunction cannot be fully defined in terms of implication alone; you need second-order quantification. That is, System F (or calculi with higher-order quantification) is the only example of "other lambda calculi" you mention. For example, in the simply-typed $\lambda$-calculus, there's no way of defining $A\times B$ in terms of the arrow alone so that, say, the first projection gets type $A\times B\to A$.

That being said, something similar holds in linear logic: multiplicative conjunction and second order existential quantification are definable in terms of linear implication and second order universal quantification (the definitions are essentially identical to the standard ones of intuitionistic logic, you just trade intuitionistic implication for linear implication):
$A\otimes B\ :=\ \forall X.(A\multimap B\multimap X)\multimap X$
$1\ :=\ \forall X.X\multimap X$
$\exists X.A\ :=\ \forall Y.(\forall X.A\multimap Y)\multimap Y$
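
(In Haskell-with-RankNTypes notation these read roughly as follows, with the plain arrow standing in for ⊸ and `forall` for the second-order ∀, so the shapes match but linearity is not enforced; the names are just illustrative.)

```haskell
{-# LANGUAGE RankNTypes #-}

-- A ⊗ B, 1, and ∃X.A via second-order quantification; plain arrows
-- stand in for ⊸, so this only illustrates the shape of the encodings.
newtype Tensor a b = Tensor (forall x. (a -> b -> x) -> x)
newtype One        = One    (forall x. x -> x)
newtype Exists f   = Exists (forall y. (forall x. f x -> y) -> y)

tensor :: a -> b -> Tensor a b
tensor a b = Tensor (\k -> k a b)

unit :: One
unit = One id

pack :: f x -> Exists f
pack a = Exists (\k -> k a)

unpack :: Exists f -> (forall x. f x -> y) -> y
unpack (Exists e) = e
```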

Then, in order to define additive conjunction and additive disjunction, you need the exponential modality (which, of course, is not itself definable):
$A \mathrel{\&} B\ :=\ \exists X.\mathop !(X\multimap A)\otimes\mathop !(X\multimap B)\otimes X$
$\top\ :=\ \exists X.X$
$A\oplus B\ :=\ \forall X.\mathop !(A\multimap X)\multimap\mathop !(B\multimap X)\multimap X$
$0\ :=\ \forall X.X$
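
(Continuing the same non-linear Haskell sketch: a `Bang` wrapper below merely marks where ! appears in the formulas, the exponential itself is not modelled, and the names are illustrative.)

```haskell
{-# LANGUAGE RankNTypes, GADTs #-}

-- Bang is a plain wrapper marking where ! occurs; in the real encoding,
-- ! is what allows the unused function component to be discarded.
newtype Bang a = Bang a

-- A & B := ∃X. !(X ⊸ A) ⊗ !(X ⊸ B) ⊗ X, with X packaged existentially.
data With a b where
  With :: Bang (x -> a) -> Bang (x -> b) -> x -> With a b

withPair :: a -> b -> With a b
withPair a b = With (Bang fst) (Bang snd) (a, b)   -- choose X := A × B

fstW :: With a b -> a
fstW (With (Bang f) _ x) = f x

sndW :: With a b -> b
sndW (With _ (Bang g) x) = g x

-- A ⊕ B := ∀X. !(A ⊸ X) ⊸ !(B ⊸ X) ⊸ X
newtype Plus a b = Plus (forall x. Bang (a -> x) -> Bang (b -> x) -> x)

inlP :: a -> Plus a b
inlP a = Plus (\(Bang f) _ -> f a)

inrP :: b -> Plus a b
inrP b = Plus (\_ (Bang g) -> g b)

caseP :: Plus a b -> (a -> x) -> (b -> x) -> x
caseP (Plus p) f g = p (Bang f) (Bang g)
```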

Defining multiplicative disjunction in terms of linear implication is either trivial in the classical case (they are the same connective up to a negation), or does not make sense in the intuitionistic case (the par connective needs more than one formula on the right hand side of sequents to be meaningful).

Mike Shulman (Aug 30 2021 at 13:15):

Of course, in the classical case you can also define multiplicative conjunction from implication and negation:
$A\otimes B \coloneqq (A \multimap B^\perp)^\perp$.
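
(Unfolded with a fixed answer type r standing in for ⊥, so A^⊥ becomes A -> r, and with plain arrows for ⊸, this is just the familiar CPS pair; a minimal, non-linear Haskell sketch:)

```haskell
-- A^⊥ is modelled as A -> r for a fixed answer type r playing the role of ⊥.
type Neg r a = a -> r

-- A ⊗ B := (A ⊸ B^⊥)^⊥ unfolds to (a -> b -> r) -> r: a CPS-style pair.
type Tensor r a b = Neg r (a -> Neg r b)

tensor :: a -> b -> Tensor r a b
tensor a b k = k a b
```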

Valeria de Paiva (Aug 30 2021 at 15:22):

But NO! In Intuitionistic Logic, conjunction, disjunction and implication are ALL independent connectives. In classical logic, yes, you can define everything in terms of implication, but not in intuitionistic logic. So if you have propositional quantification as in System F, then the Russell-Prawitz translation works, but not with simple types.

And yes, you can have a linear logic formulation that thinks of lambda as the main operation, in FILL [Full Intuitionistic Linear Logic] (tinyurl.com/59bpj5ec).

Valeria de Paiva (Aug 30 2021 at 15:26):

(the par connective needs more than one formula on the right hand side of sequents to be meaningful).

is not true. You can have multiple conclusion intuitionistic logic (FIL) and multiple conclusion intuitionistic linear logic!

Damiano Mazza (Aug 30 2021 at 17:08):

Valeria de Paiva said:

You can have multiple conclusion intuitionistic logic (FIL) and multiple conclusion intuitionistic linear logic!

Ha, that's something completely new to me! Thank you. The question then would be whether there is a second order definition of the par connective in FILL using only linear implication...

Spencer Baugh (Aug 30 2021 at 19:04):

Damiano Mazza said:

$A \mathrel{\&} B\ :=\ \exists X.\mathop !(X\multimap A)\otimes\mathop !(X\multimap B)\otimes X$

Ah, thanks! That's what I was looking for, that makes sense! Thanks all for all the other comments as well.

(Ultimately the reason I wanted to know this is because my programmer brain is reliant on understanding logic through computation, so an encoding of linear logic into some lambda calculus helps me understand. And, indeed, this makes additive conjunction much clearer to me, since now I can see how you'd write a lambda calculus term for it!)

James Wood (Aug 31 2021 at 15:51):

It may help to notice that Church encodings (or whatever they're called these days) arise as transcriptions of pattern-matching eliminators. For example, in System F, we have $A + B \coloneqq \forall X.~(A \to X) \to (B \to X) \to X$. When using the encoded $A + B$, we give the result type $X$ of the case expression and the two branches/continuations for handling $A$- and $B$-values. As such, it's clearer how to encode type formers with pattern-matching eliminators (called positive when talking about polarity, with other type formers, like $\multimap$ and $\&$, called negative).

By fluke, every propositional intuitionistic type former we care about other than $\to$ is equivalent to a positive type former, so just functions are good enough to make encodings (in particular, positive and negative products are equivalent thanks to arbitrary deletion and duplication). An alternative approach in a linear setting would be to afford yourself all of the standard negative type formers, and encode the positive ones. This gives encodings like $A \oplus B \coloneqq \forall X.~(A \multimap X)\ \&\ (B \multimap X) \multimap X$, which are a tighter fit to the usual eliminators and avoid $!$.
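
(A rough Haskell sketch of that negative-style encoding, with an ordinary pair playing the role of & and plain arrows for ⊸, so only the shape is illustrated; the names are hypothetical.)

```haskell
{-# LANGUAGE RankNTypes #-}

-- A ⊕ B := ∀X. ((A ⊸ X) & (B ⊸ X)) ⊸ X, with & approximated by an
-- ordinary pair and ⊸ by the plain arrow; no ! is needed anywhere.
newtype Plus a b = Plus (forall x. (a -> x, b -> x) -> x)

inl :: a -> Plus a b
inl a = Plus (\(f, _) -> f a)

inr :: b -> Plus a b
inr b = Plus (\(_, g) -> g b)

-- The eliminator is exactly a case expression: supply both branches.
casePlus :: Plus a b -> (a -> x) -> (b -> x) -> x
casePlus (Plus p) f g = p (f, g)
```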

Valeria de Paiva (Aug 31 2021 at 16:10):

James Wood said:

It may help to notice that Church encodings (or whatever they're called these days)

I learned about the encodings in Peter Aczel's "The Russell-Prawitz Modality" (1999) (http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=9FA745910624425FE262012EE70607FD?doi=10.1.1.39.5294&rep=rep1&type=pdf), so maybe my notation is outdated. Just pointing it out to make sure we're talking about the same encodings.