You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For all things related to this archive refer to the same person.
My understanding is that this is the 'most permissive pure type system' as described at https://ncatlab.org/nlab/show/pure+type+system#inconsistent_systems. I realise that this was proved inconsistent by Girard. But before that it was intended to function as a foundation for mathematics. My question is: how would that have worked?
Dependent product types are given to us by the deduction rules. But how can we use these to define other types? Can we define dependent sum types purely in terms of products? How would we get the natural numbers, booleans, etc?
We don't know about the 1971 paper, but in his 1975 paper there is separate provision for Π, Σ, and ℕ in the rules of the system. I've always assumed that the 1971 system is similar.
I guess you could use the same techniques as in System F, Church encodings: booleans would then be encoded as 𝔹 := Π(X : ∗). X → X → X, with true := λX x y. x and false := λX x y. y (where I use that ∗ : ∗, so that 𝔹 : ∗ and X can itself be instantiated at 𝔹).
Edit: Not sure the eliminator returns the right type though (you would need Π(P : 𝔹 → ∗). P true → P false → Π(b : 𝔹). P b)
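For concreteness, the Church booleans above can be sketched in Lean 4 (a sketch only: Lean's universes are predicative, so CBool lands in Type 1 rather than being a small type as in the impredicative PTS under discussion; all names here are mine):

```lean
-- Church-encoded booleans: a boolean is its own case analysis.
def CBool : Type 1 := (X : Type) → X → X → X

def ctrue  : CBool := fun _ t _ => t
def cfalse : CBool := fun _ _ f => f

-- Non-dependent elimination is just application …
def cif {X : Type} (b : CBool) (t f : X) : X := b X t f

#eval cif ctrue 1 0   -- 1
#eval cif cfalse 1 0  -- 0

-- … but the dependent eliminator
--   (P : CBool → Type) → P ctrue → P cfalse → (b : CBool) → P b
-- is not derivable from the encoding alone, which is the worry above.
```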
Though, with the intensional Martin-Löf identity type, equality of Church-encoded types is going to be quite bad.
My guess had been that the relation between the 1971 and 1975 papers was analogous to the relation between Frege's set theory and ZF. For example with unrestricted comprehension you can easily prove that every set has a powerset. You just directly have the set whose elements are precisely the subsets of your original set. But in ZFC you have to explicitly put in an axiom to allow you to construct powersets.
I'd assumed that Martin-Löf's 1975 paper was similar. After weakening the type system to avoid Girard's paradox he then had to add in a bunch of constructors to get back what had been lost.
@James Wood if you add primitive identity types, they won't fare well with Church-encoded datatypes.
But you can also (try to) Church-encode the identity types themselves; that seems to be the most straightforward solution from a PTS pov. Anyway, I'm not really sure what to expect from the resulting theory: it is logically inconsistent, and from a computational pov there isn't much to observe
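The standard Church-style encoding of identity is Leibniz equality ("a equals b when every predicate transports from a to b"); a sketch in Lean 4 (again one universe up because Lean is predicative; identifiers are mine):

```lean
-- Leibniz equality: proofs are transport functions for predicates.
def Leib {A : Type} (a b : A) : Type 1 :=
  (P : A → Type) → P a → P b

-- Reflexivity is the identity transport.
def lrefl {A : Type} (a : A) : Leib a a := fun _ h => h

-- Symmetry is derivable by a clever choice of predicate:
-- instantiate P at (fun x => P x → P a) and feed it the identity.
def lsymm {A : Type} {a b : A} (h : Leib a b) : Leib b a :=
  fun P => h (fun x => P x → P a) (fun pa => pa)
```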
The trick seems to be that you need some kind of parametricity or naturality to ensure that the functions encode only the constructors you expect.
(e.g. if you have LEM you can construct non-parametric functions)
@Mike Shulman wrote a post about adding coherence conditions to functions to obtain non-truncated coproducts, but he mentioned that it required a pre-existing type of natural numbers: https://homotopytypetheory.org/2018/11/26/impredicative-encodings-part-3/ (I don't know if there's been more progress here since then)
In general I'm optimistic that functions actually provide most of the structure needed for type theories, and inductive types are “merely” conveniences to obtain better computation rules and prevent universe explosions (especially without impredicativity), but it's far from obvious because of the risk of logical contradictions …
Funext becomes a much bigger issue, though …
It's worth noting that people had been working with simple type theory and higher-order predicate logic for decades at this point. The inadequacy of the Church encodings can be "fixed" by restricting to the well-formed naturals, i.e. the ones n obeying the induction principle ∀P. P 0 → (∀m. P m → P (S m)) → P n.
Then every statement about naturals needs to be relativized: e.g. ∀n. φ(n) becomes ∀n. N(n) → φ(n), where N is the well-formedness predicate.
Again, people had been doing things this way for a while.
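The relativization described above can be sketched in Lean 4 (using Lean's impredicative Prop for the quantified predicate; the names CNat, WF, etc. are mine):

```lean
-- Church naturals: a natural is its own iterator.
def CNat : Type 1 := (X : Type) → (X → X) → X → X

def czero : CNat := fun _ _ z => z
def csucc (n : CNat) : CNat := fun X s z => s (n X s z)

-- "Well-formed" naturals: those satisfying the induction principle.
def WF (n : CNat) : Prop :=
  ∀ (P : CNat → Prop), P czero → (∀ m, P m → P (csucc m)) → P n

-- The canonical numerals are well-formed …
theorem wf_zero : WF czero := fun _ h0 _ => h0
theorem wf_succ {n : CNat} (h : WF n) : WF (csucc n) :=
  fun P h0 hs => hs n (h P h0 hs)

-- … and a statement ∀ n, φ n is relativized to ∀ n, WF n → φ n.
```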
Nick Scheel said:
Mike Shulman wrote a post about adding coherence conditions to functions to obtain non-truncated coproducts, but he mentioned that it required a pre-existing type of natural numbers
FWIW, instead of the natural numbers, I believe you can use coinductive types as long as they satisfy "coinductive extensionality" (equality is bisimulation). You also don't need the natural numbers if you have UIP; then the Awodey-Frey-Speight construction works directly. So you could restrict to the types that are h-sets and use that construction. Or you could use setoids.
Cody Roux said:
The inadequacy of the church encodings can be "fixed" by restricting to the well formed naturals
What's the meaning of "fixed" in quotes? Does the resulting type satisfy the correct induction principle?
I think you need that the well-formedness predicate N(n) is a subsingleton, which is not the case assuming LEM. You could use the truncation ‖N(n)‖, but then you can only eliminate into mere propositions. So I suspect there needs to also be a condition on the proof of N(n) to say it is the canonical induction principle … maybe there's some nice trick like in the blog post, before we get an infinite tower of well-formednesses.
Mike Shulman said:
Cody Roux said:
The inadequacy of the church encodings can be "fixed" by restricting to the well formed naturals
What's the meaning of "fixed" in quotes? Does the resulting type satisfy the correct induction principle?
It does not, sadly (at least in the 2nd order predicate calculus). Indeed there is no type which satisfies the induction principle, so you have to add the extra qualification/relativization for each proposition about natural numbers.
It's also consistent to assume it as an axiom, of course, but that's pretty unsatisfying.
Thanks. It does seem plausible that that's what Martin-Löf originally had in mind, although nowadays we can imagine better ways to do things. (-:
James Wood said:
We don't know about the 1971 paper, but in his 1975 paper there is separate provision for Π, Σ, and ℕ in the rules of the system. I've always assumed that the 1971 system is similar.
The 1971 paper is now online here: https://raw.githubusercontent.com/michaelt/martin-lof/master/pdfs/martin-loef1971%20-%20A%20Theory%20of%20Types.pdf
And it turns out I was right! His original system uses only Πs. He defines the other type formers via: