For quite a few months now I've had this idea popping up in my head at random: the functor category $\mathsf{Set}^{\mathsf{Set}}$ is conceptually very close to some sort of polymorphic $\lambda$-calculus, in that polymorphic (well-behaved) $\lambda$-terms are morphisms and parametric types are objects. Non-parametric stuff is just plain old terms between "const" functors.
This has the advantage that polymorphic terms are now "internalized" (I'm hoping I'm using the word properly) as morphisms. Thoughts?
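As a rough illustration of this reading (a sketch in Haskell, not something referenced in the thread; the names `Nat`, `Pair`, `Const`, `notTerm` are made up for the example): parametric types as functors, polymorphic terms as natural transformations, and a plain term packaged as a map between constant functors.

```haskell
{-# LANGUAGE RankNTypes #-}

-- A natural transformation between type constructors f and g:
-- the shape of a polymorphic term, i.e. a candidate morphism
-- of the functor category.
type Nat f g = forall a. f a -> g a

-- A parametric type: a candidate object.
data Pair a = Pair a a

-- A polymorphic term as a morphism Pair ==> [].
pairToList :: Nat Pair []
pairToList (Pair x y) = [x, y]

-- "Non-parametric stuff": a plain old term, seen as a map
-- between constant functors.
newtype Const c a = Const c

notTerm :: Nat (Const Bool) (Const Bool)
notTerm (Const b) = Const (not b)
```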
It's similar to schematic types from e.g. Hindley-Milner. But I think there are some issues. One is that there's only one variable, and it's covariant only. Also, that category might not be (Cartesian) closed, due to size issues (not sure).
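For instance (my own example, in Haskell notation rather than anything from the thread): a one-variable schema that uses its variable to the left of an arrow is contravariant rather than covariant, so it would not be an object of $\mathsf{Set}^{\mathsf{Set}}$ on the naive reading.

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- Covariant: 'Box' extends to a functor, so it fits the picture.
data Box a = Box a deriving Functor

-- Not covariant: 'a' occurs to the left of an arrow, so there is no
-- lawful fmap, only a contravariant map.
newtype Pred a = Pred (a -> Bool)

contramapPred :: (b -> a) -> Pred a -> Pred b
contramapPred f (Pred p) = Pred (p . f)
```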
Yeah, I was thinking about Hindley-Milner as well. As for the CCC structure, according to this entry in nlab, sec. 3, 1st subsec., 2nd paragraph, it is indeed a CCC.
$\mathsf{Set}$ isn't small.
Is that a hard no? Is it not possible to restrict to small sets?
Ah, look at "Size" section in the previous nlab link, this can be easily done.
Sorry, for the post right above I meant this link.
Restrict what? The category of small sets isn't small itself.
Hmmmmm :/.
You could have $\mathsf{SET}^{\mathsf{Set}}$ where the capital one is the category of large sets, and the non-capital one is itself large, but then it's not exactly the same situation.
Whatever works really. I struggle to figure out where things might go wrong. Which brings me to why I have an interest in this. I dabble in Turi's mathematical operational semantics, where semantics are represented by distributive laws of various complexity.
If said distributive laws (or semantics) correspond to $\lambda$-terms themselves (as they are morphisms in the functor category we are working in), or any terms of the model of computation of choice, maybe there's some room for experimentation.
Like, in the spirit of classic fixpoint results, attempt to (show that it is not possible to) make a distributive law that is "equivalent" to the computational model one is working in. That would be fun.
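For a toy picture of the kind of distributive law meant here (my own sketch; the real Turi–Plotkin GSOS format is fancier than a plain $\Sigma B \Rightarrow B\Sigma$, and the names `Sigma`, `B`, `distLaw` are invented for the example):

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- Syntax: a single binary operation.
data Sigma x = Plus x x deriving Functor

-- Behaviour: emit one observation and continue.
data B x = Step Int x deriving Functor

-- A distributive law  Sigma . B ==> B . Sigma : the "rule" that to run
-- Plus you run both arguments and add their observations.
-- Naturality in x comes for free from parametricity.
distLaw :: Sigma (B x) -> B (Sigma x)
distLaw (Plus (Step m x) (Step n y)) = Step (m + n) (Plus x y)
```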
Stelios Tsampas said:
Ah, look at "Size" section in the previous nlab link, this can be easily done.
The problem that one encounters when doing category theory formally is that one wants to work with precisely-defined categories, and that means dealing with size issues. In this instance the problem is that while we can choose a category (or "universe") of sets to be closed under typical operations like products, powersets/exponentials, subsets... when we try to build a category like $\mathsf{Set}^{\mathsf{Set}}$, we would like to argue that these operations are still possible in a point-wise way, but what we find is that even if we know how to "construct" them, the resulting functors may be "too big", in that they no longer take values in our original universe.
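To spell out where the size issue bites (the standard pointwise formula, stated here for covariant $\mathsf{Set}$-valued functors on a category $\mathcal{C}$): the only candidate for an exponential is

$$ (G^F)(X) \;\cong\; \mathrm{Nat}\!\left(\mathcal{C}(X,-) \times F,\; G\right), $$

and when $\mathcal{C} = \mathsf{Set}$ the right-hand side is carved out of a product indexed by a proper class, so nothing guarantees it is a small set, i.e. an object of the original universe.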
Yea $\mathsf{Set}^{\mathsf{Set}}$ is definitely not a CCC.
In this case a solution might be to work in $\mathsf{Set}^{\mathsf{FinSet}}$ until you desperately need to have uncountably many objects to work with :wink:
An important lesson learned :).
But I like my identity function to be polymorphic...
I think there are pretty strong constraints about what things can be models of System F. In particular, a boolean topos cannot fit the bill...
Morgan Rogers said:
In this case a solution might be to work in $\mathsf{Set}^{\mathsf{FinSet}}$ until you desperately need to have uncountably many objects to work with :wink:
Thank you for the idea. Maybe there's another (full) subcategory that's CCC?
It doesn't have to be System F by the way. But it looked like some flavor of polymorphic lambda calculus.
Cody Roux said:
I think there are pretty strong constraints about what things can be models of System F. In particular, a boolean topos cannot fit the bill...
@Stelios Tsampas you should look into whether/why this^ is the case, since it likely precludes many otherwise plausible categories, given that the Yoneda embedding preserves any limits and exponentials that exist. Even if you don't need it to be System F, it might be enlightening to see what goes wrong, and you'll learn some topos theory along the way :heart_eyes: Do you have references, @Cody Roux?
@Morgan Rogers What?! You mean to tell me that there isn't a quick and dirty way to get things done? That I can't just cut the Gordian knot??
Just kidding, foundational stuff usually deserves to be done the hard way. This should be worthwhile.
For System F, the classic is "Polymorphism Is Not Set Theoretic". For the fact that it can be done in a non-Boolean topos, there's "Polymorphism is Set Theoretic, Constructively". However, System F may not be relevant, because the question was about schematic types a la Hindley-Milner, which can be predicative, and the (sort of) impredicativity of System F is the real issue.
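A quick way to see the impredicativity obstruction (just the naive size version; Reynolds's theorem rules out cleverer set-theoretic interpretations too): if types are sets and $\forall$ is a product over all types, then

$$ \llbracket \forall X.\; X \to X \rrbracket \;=\; \prod_{A \in \mathsf{Set}} A^{A} $$

is already a proper class. Hindley–Milner schemas avoid this because the quantifier is prenex and only ever instantiated at types, never itself a type.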
Yes, that makes perfect sense.
So maybe I could take a look in the paper you mention and make sure I stay within the boundaries of Set Theory when dealing with polymorphism. And then figure out the best fit for the functor category I'm looking for.
Morgan Rogers said:
In this case a solution might be to work in $\mathsf{Set}^{\mathsf{FinSet}}$ until you desperately need to have uncountably many objects to work with :wink:
Note that $\mathsf{FinSet}$ is a large category. But it's essentially small - equivalent to a small one - so you can replace it with an equivalent small category before taking presheaves on it.
You could do the same replacing finite sets by sets with cardinality bounded by whatever you want. All these give essentially small categories. So, take an inaccessible cardinal...
So a large category being equivalent to a small one :thinking:. This sounds crazy, and evil. Or is it I that's evil?
But don't you run into equivalent problems if you just do the original thing? Like $\mathsf{Set}_\kappa^{\mathsf{Set}_\kappa}$ is probably not a CCC because it's not κ-complete?
The problem with $\mathsf{Set}^{\mathsf{FinSet}}$ is that it doesn't function in the way the original question was posed, because it is type schemas whose variables are finite types, not arbitrary types.
For strong limit cardinals (those for which $\lambda < \kappa$ implies $2^\lambda < \kappa$) it might be okay?
I think @John Baez implies that one can take the category of sets bounded by something like $\kappa$, not necessarily finite.
Oh never mind, the problem is again $\mathsf{Set}_\kappa$ itself is not $\kappa$-small
Oh right. It probably would be κ-complete.
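For reference, the standard definitions behind "take an inaccessible cardinal": for $\mathsf{Set}_\kappa$, the sets of cardinality $< \kappa$, one wants both

$$ \text{strong limit: } \lambda < \kappa \Rightarrow 2^{\lambda} < \kappa, \qquad\quad \text{regular: } \mu < \kappa,\ \lambda_i < \kappa \ (i < \mu) \ \Rightarrow\ \sum_{i<\mu} \lambda_i < \kappa. $$

Strong limit keeps exponentials and power sets below $\kappa$, regularity keeps $< \kappa$-sized unions (and hence, together with strong limit, $< \kappa$-sized products) below $\kappa$; an uncountable cardinal with both properties is exactly an inaccessible.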
It's not crazy at all. Every small category is equivalent to a large one: just throw in a proper class of objects isomorphic to objects you already have. (You have to do this correctly to maintain equivalence.) So when someone hands you a large category you should query whether it's equivalent to a small one. The easiest way is to find a skeleton and see if that's small.
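Concretely, for the example at hand: a skeleton of $\mathsf{FinSet}$ is the small category whose objects are the finite ordinals

$$ [n] \;=\; \{0, 1, \dots, n-1\}, \qquad n \in \mathbb{N}, $$

with all functions between them; every finite set is in bijection with exactly one $[n]$, so the inclusion of this skeleton into $\mathsf{FinSet}$ is an equivalence.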
Dan Doel said:
But don't you run into equivalent problems if you just do the original thing? Like $\mathsf{Set}_\kappa^{\mathsf{Set}_\kappa}$ is probably not a CCC because it's not κ-complete?
First of all, there's no requirement that a cartesian closed category have any limits other than finite products.
Second, I'm not sure what you mean by $\mathsf{Set}_\kappa$ - the category of sets of cardinality $< \kappa$, or some equivalent small category. I prefer working with $\mathsf{Set}^{\mathsf{C}}$ when the category $\mathsf{C}$ is small, because then a bunch more theorems apply. For example: whenever $\mathsf{C}$ is small, $\mathsf{Set}^{\mathsf{C}}$ is complete and cocomplete. That is, it has all small limits and colimits.
So, if $\mathsf{Set}_\kappa$ means some small category equivalent to the category of sets of cardinality $< \kappa$ (or maybe $\leq \kappa$) then $\mathsf{Set}^{\mathsf{Set}_\kappa}$ is complete and cocomplete. And yes, it's a cartesian closed category.
John Baez said:
It's not crazy at all. Every small category is equivalent to a large one: just throw in a proper class of objects isomorphic to objects you already have. (You have to do this correctly to maintain equivalence.) So when someone hands you a large category you should query whether it's equivalent to a small one. The easiest way is to find a skeleton and see if that's small.
I see! Thank you, I hadn't thought of that :).
It's possible that the suggestion of $\mathsf{Set}^{\mathsf{FinSet}}$ was for something else, because it is like extending sets with variables. But it's not obvious to me how that would help.
@Stelios Tsampas: you might be interested in Fiore–Hamana's Multiversal Polymorphic Algebraic Theories (https://www.cl.cam.ac.uk/~mpf23/papers/Algebra/mpat.pdf), which describes a framework for polymorphic type theories (including System F) using presheaf categories
@John Baez $\mathsf{Set}^{\mathsf{Set}_\kappa}$ is not like 'type schemas in variables that range over all types', assuming that the 'types' are sets (which is why the base is $\mathsf{Set}$). It has just expanded what the variables range over to $\kappa$-sized sets, but that is still the same restriction as using $\mathsf{FinSet}$. I would presume we are looking for a category $\mathcal{C}$ of all the types such that $\mathcal{C}^{\mathcal{C}}$ is Cartesian closed.
$\mathcal{C}$ is the types, and $\mathcal{C}^{\mathcal{C}}$ are the schemas in one variable.
I don't know anything about "type schemas in variables that range over all types", so I'm not attempting to join this part of the conversation. I just know which categories are complete, cartesian closed, etc., so I was correcting some apparent mistakes I saw.
The completeness criterion is that I'm assuming that the nlab's sufficient condition for $\mathcal{D}^{\mathcal{C}}$ to be a CCC generalizes to $\mathcal{D}$ being $\kappa$-complete.
$\mathcal{D}$ needs to be complete for things indexed by certain constructions related to $\mathcal{C}$, but im not sure that $\kappa$-completeness is sufficient
at the very least, in a constructive setting i think it may not be—in a classical setting i don't know
consider presheaves on some preordered set; the exponential is basically kripke semantics, so you're using Set's completeness over a down-set (er, maybe an up-set), not just the base category itself
and constructively speaking, it's totally possible to have completeness for some shape but not for arbitrary subshapes or similar things—just consider that any category has all limits of shape 1, but few have all limits of every subsingleton shape!
it's possible i'm missing something that resolves the issue, but in any case i think this shows that it's not cut-and-dry
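Spelling out the Kripke point above (the standard formula, for covariant $\mathsf{Set}$-valued functors $F, G$ on a preorder $P$):

$$ (G^F)(p) \;\cong\; \left\{ \big(\varphi_q : F(q) \to G(q)\big)_{q \geq p} \ \middle|\ \varphi \text{ compatible with the transition maps} \right\}, $$

so the exponential at $p$ quantifies over the whole up-set of $p$, just like the Kripke clause for implication, not over $P$ itself.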
Yeah, you probably need small copowers as well or something.
For the well-known construction.
John Baez said:
It's not crazy at all. Every small category is equivalent to a large one: just throw in a proper class of objects isomorphic to objects you already have. (You have to do this correctly to maintain equivalence.) So when someone hands you a large category you should query whether it's equivalent to a small one. The easiest way is to find a skeleton and see if that's small.
except for the empty category :smirk_cat:
Wow, true. It's so damned small it's not equivalent to any large category!
The empty set strikes again
Dan Doel said:
The problem with $\mathsf{Set}^{\mathsf{FinSet}}$ is that it doesn't function in the way the original question was posed, because it is type schemas whose variables are finite types, not arbitrary types.
If you're trying to tackle "arbitrary types" you'll never escape these size issues! Surely there must be some upper bound on the size of type constructions you reeeeally need?
Alternatively put, is there a reason why you need $\mathcal{C}^{\mathcal{C}}$ and can't use $\mathcal{D}^{\mathcal{C}}$ for some larger category $\mathcal{D}$? The only one actually mentioned so far is "I want the identity to be in there", but there is a natural embedding of eg. FinSet into Set which tastes just like the identity. I remember reading some work a long time ago about Relative Pseudomonads (Hyland et al, I believe) that was built to overcome size issues a lot like this (eg I want to think of the Yoneda embedding as a kind of monad but I can't because it makes things Big)
This is (I think) the original reference about relative 1-monads: https://arxiv.org/abs/1412.7148. This is the paper about relative pseudomonads: https://arxiv.org/abs/1612.03678.
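For orientation, the notion from that line of work (as I recall it): a relative monad on a functor $J : \mathbb{J} \to \mathbb{C}$ drops the endofunctor requirement and consists of

$$ T : |\mathbb{J}| \to |\mathbb{C}|, \qquad \eta_X : J X \to T X, \qquad (-)^{*} : \mathbb{C}(J X,\, T Y) \to \mathbb{C}(T X,\, T Y), $$

subject to unit and associativity laws. The presheaf construction $\mathcal{C} \mapsto [\mathcal{C}^{\mathrm{op}}, \mathsf{Set}]$ is then a relative pseudomonad over the inclusion of small categories into locally small ones, which is how "Yoneda as a unit" survives the size problem.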
@Morgan Rogers Types can have infinite numbers of values in them, like the natural numbers. Type schemas look like `a -> a` (that's the one for the identity function), and `a` can be instantiated to any concrete type (or schema, if you're using it to produce another schema). My understanding is that the original question was about fixing a category that represents all the concrete types, and then representing these schemas as functors from that category to itself. I'm not surprised that it doesn't work, because I was trying to explain ways why it wouldn't, initially.
Or, I'm not surprised it doesn't work with $\mathsf{Set}$.
Unless I'm not understanding the alternate suggestion, then $\mathcal{D}^{\mathcal{C}}$ is going to be a situation where 1) $\mathcal{C}$ does not allow being schematic in all concrete types or 2) $\mathcal{D}$ contains things that aren't concrete types, so it isn't a precise correspondence like the initial question wants.
From the preceding discussion, your option 2 sounds like a concession which might ultimately be necessary to avoid size issues if you want both your category of concrete types and your category of schemas to have moderately strong closure properties. :shrug:
I actually studied Fiore and Hamana's work on algebras with binders a long time ago, and I'd be happy to see discussion on the subject. It's clear there is a stark distinction between viewing polymorphism as a function between universes and as a schema which involves syntax, which is what Dan is getting at. Obviously syntax allows for anything, including inconsistent theories, so one usually wants some semantics at some point, but it's interesting to start with the syntax.
I can kind of understand the hope, because being a functor rather than a function requires you to be coherent in a way that might only allow things we'd understand as parametric. But there are at least technical issues.
You could take the category CESet of computably enumerable sets and partial computable functions and do category theory internal to it. Then CESet is "internal to itself" [1] and CESet^CESet is countable, because only computable functors are allowed.
[1] That's a link to a page quoting a thread on the category theory mailing list; I can't find the thread online anywhere else.
Mike Stay said:
You could take the category CESet of computably enumerable sets and partial computable functions and do category theory internal to it. Then CESet is "internal to itself" [1] and CESet^CESet is countable, because only computable functors are allowed.
[1] That's a link to a page quoting a thread on the category theory mailing list; I can't find the thread online anywhere else.
That indeed sounds very, very promising. Thanks!