Given monads $S$ and $T$ on a category $\mathcal{C}$ and a distributive law $\lambda \colon ST \Rightarrow TS$ of $S$ over $T$, the monad $T$ 'lifts' to a monad $\tilde{T}$ on $S$-algebras, $S$ 'extends' to the Kleisli category of $T$, and the composite functor $TS$ is a monad, whose algebras are (equivalent to) the algebras of the lifted monad $\tilde{T}$ on $S$-algebras.
Thus, we have a commutative triangle of monadic functors
$$\mathcal{C}^{TS} \simeq (\mathcal{C}^S)^{\tilde{T}} \longrightarrow \mathcal{C}^S \longrightarrow \mathcal{C},$$
which is nice, since monadic functors don't compose in general.
However, in the most well-known example --- namely the distributivity of the monoid monad over the abelian group monad on $\mathbf{Set}$ --- we seem to have a commutative square
$$\begin{array}{ccc}\mathbf{Ring} & \longrightarrow & \mathbf{Ab}\\ \downarrow & & \downarrow\\ \mathbf{Mon} & \longrightarrow & \mathbf{Set}\end{array}$$
of monadic functors?
Is this special to this case (maybe because the Ab-monad is commutative), or is $\mathcal{C}^{TS}$ always monadic over $\mathcal{C}^{T}$ (besides being monadic over $\mathcal{C}^{S}$)?
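Not the ring example, but here is a minimal executable sketch of the general shape just described, in Haskell, using the familiar distributive law of `Maybe` over the list monad (so $S = $ `Maybe`, $T = $ `[]`; all the names below are ad hoc):

```haskell
import Control.Monad (ap)

-- A distributive law of S over T is a natural transformation S∘T ⇒ T∘S
-- (satisfying four compatibility laws); it makes T∘S a monad.
-- Here S = Maybe and T = [] (lists), so (T∘S) a = [Maybe a].

dist :: Maybe [a] -> [Maybe a]            -- the distributive law S∘T ⇒ T∘S
dist Nothing   = [Nothing]
dist (Just xs) = map Just xs

newtype TS a = TS { runTS :: [Maybe a] }  -- the composite functor T∘S

instance Functor TS where
  fmap f (TS xs) = TS (map (fmap f) xs)

instance Applicative TS where
  pure  = TS . pure . pure                -- unit: η^T ∘ η^S
  (<*>) = ap

instance Monad TS where
  -- multiplication assembled from μ^T, μ^S and the distributive law
  TS xs >>= k = TS (concatMap step xs)
    where
      step Nothing  = [Nothing]           -- dist applied to Nothing
      step (Just a) = runTS (k a)
```

For instance, `runTS (TS [Just 1, Nothing] >>= \x -> TS [Just x, Just (x+1)])` evaluates to `[Just 1, Just 2, Nothing]`.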
Are you sure that $\mathbf{Ring} \to \mathbf{Ab}$ is monadic? I thought that would imply the free abelian group monad distributes over the free monoid monad (which is not the case, they distribute only the opposite way).
No I'm not sure, but I think it has a left adjoint, given by the tensor algebra formula. Maybe it's only enriched monadic?
Indeed, it should have a left adjoint, by the [[adjoint lifting theorem]].
Since rings are monoids in $\mathbf{Ab}$ and some other nice things hold ($\mathbf{Ab}$ has colimits and $\otimes$ distributes over colimits) we can build the free ring on an abelian group using a standard trick for building the free monoid on an object $A$ in a monoidal category, which works when this monoidal category has colimits and the tensor product distributes over them:
$$T(A) \;=\; \coprod_{n \ge 0} A^{\otimes n}.$$
This standard trick is the "tensor algebra formula" @Jonas Frey mentioned.
I think this gives a left adjoint to the forgetful functor $\mathbf{Ring} \to \mathbf{Ab}$.
But if this forgetful functor really has a left adjoint and is really not monadic, we get an interesting puzzle! What is the Eilenberg-Moore category of this adjunction between $\mathbf{Ring}$ and $\mathbf{Ab}$?
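Spelling that left adjoint out (this is just the standard construction, nothing beyond what was said above): for an abelian group $A$, the tensor algebra is
$$T(A) \;=\; \bigoplus_{n \ge 0} A^{\otimes n} \;=\; \mathbb{Z} \,\oplus\, A \,\oplus\, (A \otimes A) \,\oplus\, \cdots,$$
with unit $\mathbb{Z} = A^{\otimes 0} \hookrightarrow T(A)$ and multiplication induced by the canonical isomorphisms $A^{\otimes m} \otimes A^{\otimes n} \cong A^{\otimes (m+n)}$. A ring homomorphism $T(A) \to R$ is then the same thing as a group homomorphism $A \to U(R)$, which is exactly the adjunction in question.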
It's monadic. For instance, it's a special case of the fact that the category of algebras for an enriched operad is monadic. But I don't think this is true for a general distributive law; note that the monad here on Ab is not really built in any obvious way out of the two original monads on Set.
Mike Shulman said:
[...] note that the monad here on Ab is not really built in any obvious way out of the two original monads on Set.
Ok, so this is why we do not get a distributive law of the other type out of this monadicity.
So we've got a monadic right adjoint that's the composite of monadic right adjoints (first forget the multiplication, then the addition) even though addition doesn't distribute over multiplication? If so, this is a nice counterexample to keep in mind.
(For beginners: of course $\mathbf{Ring} \to \mathbf{Set}$ is also the composite of monadic right adjoints (first forget the addition, then the multiplication) where we do have a distributive law that explains why the composite is monadic.)
It's funny, because to me the composite $\mathbf{Ring} \to \mathbf{Ab} \to \mathbf{Set}$ feels more "natural" than $\mathbf{Ring} \to \mathbf{Mon} \to \mathbf{Set}$. I think more naturally of a ring as "an abelian group with the extra structure of a multiplication" than as "a monoid with the extra structure of an addition". But distributive laws disagree...
John Baez said:
So we've got a monadic right adjoint that's the composite of monadic right adjoints (first forget the multiplication, then the addition) even though addition doesn't distribute over multiplication? If so, this is a nice counterexample to keep in mind.
(For beginners: of course $\mathbf{Ring} \to \mathbf{Set}$ is also the composite of monadic right adjoints (first forget the addition, then the multiplication) where we do have a distributive law that explains why the composite is monadic.)
There are various theorems about when monadic functors compose that have nothing to do with distributive laws. One is that if $U \colon \mathcal{A} \to \mathcal{B}$ and $V \colon \mathcal{B} \to \mathcal{C}$ both have left adjoints, both reflect isomorphisms, and both preserve reflexive coequalizers (their domains having such coequalizers), then of course the same is true of their composite $V U$. But this trio of conditions is sufficient for monadicity; this result is called the "crude monadicity theorem". Hence $V U$ is monadic as well.
As a matter of fact, if $U$ is crudely monadic and $V$ is merely monadic, then $V U$ is also monadic. This, together with some related results, is stated as a theorem in Toposes, Triples and Theories by Barr and Wells. (But alas, with a one-word proof: "Easy".)
In the go-to example where monadic functors do not compose, namely the pair of forgetful functors
$$\mathbf{Cat} \longrightarrow \mathbf{RefGraph} \longrightarrow \mathbf{Set},$$
where the first functor takes a category to its underlying reflexive graph and the second takes a reflexive graph to its set of edges, the first functor does not preserve reflexive coequalizers. Indeed, coequalizers in $\mathbf{Cat}$ can introduce complications due to looping: two finite and rather simple categories can have a pair of parallel functors between them whose coequalizer is infinite (the additive monoid $\mathbb{N}$ as a one-object category).
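A standard instance of this looping phenomenon (not necessarily the exact pair alluded to above): coequalizing the two endpoint inclusions $\mathbf{1} \rightrightarrows \mathbf{2}$ in $\mathbf{Cat}$ identifies the two objects of the walking arrow, so the generating arrow becomes an endomorphism with all of its powers distinct, and the coequalizer is $B\mathbb{N}$. Replacing the pair $f, g \colon \mathbf{1} \rightrightarrows \mathbf{2}$ by the reflexive pair $[f, \mathrm{id}], [g, \mathrm{id}] \colon \mathbf{1} \sqcup \mathbf{2} \rightrightarrows \mathbf{2}$ (which has the same coequalizer) then exhibits a reflexive coequalizer that the underlying-reflexive-graph functor does not preserve: on underlying reflexive graphs the coequalizer is a single vertex with a single non-identity loop, not the infinitely many edges of $B\mathbb{N}$.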
In my head, the "default" is that the composite of two monadic functors should be monadic, and counterexamples like that are "pathological". I don't know whether that's justified though.
Yeah, it could be fun to think up more such counterexamples. I haven't thought about it, though.
Thanks for all your input! Here's how I see the situation now:
where the diagonal and all sides except the left one are monadic.
So to me it seems like we get a square of monadic functors "almost always", and to find a counterexample we'd have to look at categories with very few colimits. Does anybody know a counterexample?
In particular, I think that the cancellation property for monadic functors implies that for every morphism $L \to M$ of Lawvere theories the induced adjunction between L-models and M-models is monadic, which gives a large class of "composable" monadic functors, including everything in the square above.
Jonas Frey said:
So to me it seems like we get a square of monadic functors "almost always", and to find a counterexample we'd have to look at categories with very few colimits.
Nice observation. I can't think of a counterexample offhand, but note that it could still happen that the base category has lots of colimits, as long as the monad doesn't preserve them, so that they don't lift to the category of algebras.
Mike Shulman said:
Jonas Frey said:
So to me it seems like we get a square of monadic functors "almost always", and to find a counterexample we'd have to look at categories with very few colimits.
Nice observation. I can't think of a counterexample offhand, but note that it could still happen that the base category has lots of colimits, as long as the monad doesn't preserve them, so that they don't lift to the category of algebras.
Ahh right, over base categories other than Set, cocompleteness is not automatically inherited by the algebras (see here)
Mike Shulman said:
Jonas Frey said:
So to me it seems like we get a square of monadic functors "almost always", and to find a counterexample we'd have to look at categories with very few colimits.
Nice observation. I can't think of a counterexample offhand, but note that it could still happen that the base category has lots of colimits, as long as the monad doesn't preserve them, so that they don't lift to the category of algebras.
Sorry, not following: where is preservation of colimits involved in @Jonas Frey's proof?
The adjoint lifting theorem requires the category of algebras for the composite monad to have certain coequalizers. Usually the way to show that a category of algebras for a monad has colimits is to show that the base category has colimits and the monad preserves certain colimits.
The simplest version is: if $\mathcal{C}$ has a certain kind of colimit and the monad $T$ preserves that colimit, then $\mathcal{C}^T$ also has that colimit. But you can do better than that, e.g. IIRC as soon as $T$ preserves reflexive coequalizers, if $\mathcal{C}$ is cocomplete then so is $\mathcal{C}^T$. And by a different argument, if $\mathcal{C}$ is not just cocomplete but locally presentable, and $T$ is accessible (preserves sufficiently-highly-filtered colimits), then $\mathcal{C}^T$ is also locally presentable and hence cocomplete.
Ok, so if I understand correctly, you meant to point out that the hypothesis is not so mild, right?
Tom Hirschowitz said:
Ok, so if I understand correctly, you meant to point out that the hypothesis is not so mild, right?
Yes that's how I understood Mike's comment as well. In particular, it's not only a condition on the underlying category, but also on the monads.
Still, I'd like to see an explicit counterexample!
Well, it's still relatively mild. Most of the "naturally occurring" monads I can think of are either accessible or preserve reflexive coequalizers.
In fact I can't right now think of any example of a monad on a cocomplete category whose category of algebras is not also cocomplete. (I feel like this is the sort of thing I should know, but apparently I don't.)
Well, I couldn't have either; all I knew is where I'd look it up. An example where cocompleteness fails is given here; see their III.10. But it's not easy.
Hmm, their monad is on the category of "posets and isotonic maps". Is isotonic the same as monotone?
I believe so.
They also refer to some other counterexample due to Adamek.
That sure isn't easy. So I guess the question is, can one make an example like that arise from a distributive law?
Mike Shulman said:
In my head, the "default" is that the composite of two monadic functors should be monadic, and counterexamples like that are "pathological". I don't know whether that's justified though.
Just ran across this example from Toposes, Triples and Theories: the full inclusion of torsion-free abelian groups in abelian groups is reflective, hence monadic, but the composite
$$\mathbf{Ab}_{\mathrm{tf}} \hookrightarrow \mathbf{Ab} \longrightarrow \mathbf{Set}$$
is not monadic. I'm imagining that there could be plenty of examples that utilize reflective subcategories of categories monadic over $\mathbf{Set}$. For example, I'm guessing that commutative rings with no nonzero nilpotents, as a full subcategory of commutative rings, is not monadic over $\mathbf{Set}$.
Until recently, I also thought of monadic functors not composing as pathological. But ongoing work with @Ambroise and my dad makes me perceive the phenomenon as a side effect of “type dependency”. I know it's probably unclear, but maybe the following example (found by Ambroise) will convey the idea.
It's like models of type theory with a universe and a decoding function, except you don't even need to index over contexts, you merely consider families.
The example is the composite
where
(The reason the “composite” monad does this is: the first left adjoint maps to ; the second left adjoint throws in as many new types as there are elements in the fibre over , i.e., in this case, none!)
Can you explain your notation? What is ?
Oh sorry: I'm thinking of families as presheaves over , so denotes the set of “types” of a family , and denotes its set of terms. Additionally, I'm writing for the fibre of over any . Does this help?
And of course denotes the family with just one type.
Oh, I see -- mixing "indexed" and "fibered" notation.
Huh, you're right, hadn't noticed this, thanks.
So the commonality between Tom's example and the one I gave can be abstracted like this: if in
$$\mathcal{A} \xrightarrow{\;U_1\;} \mathcal{B} \xrightarrow{\;U_2\;} \mathcal{C}$$
both $U_1$ and $U_2$ are monadic, with left adjoints $F_1$ and $F_2$ respectively, and if the unit map $\eta_1 F_2 \colon F_2 \Rightarrow U_1 F_1 F_2$ is an isomorphism, then the monads $U_2 U_1 F_1 F_2$ and $U_2 F_2$ on $\mathcal{C}$ are isomorphic, so that $U_2 U_1$ is monadic only when $U_1$ is an equivalence.
In Tom's example, the map $\eta_1 F_2$ is an isomorphism essentially because the relevant fiber is empty. In the example of torsion-free abelian groups, it's an isomorphism because free abelian groups have the property of being torsion-free. (This sort of example can be multiplied at will, by finding suitable "reflective" properties satisfied by free algebras of whatever ilk.)
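To make the torsion-free instance completely explicit (just unwinding the statement above): $F_2 X = \mathbb{Z}^{(X)}$ is the free abelian group on $X$, which is already torsion-free, so the reflection unit $F_2 X \to U_1 F_1 F_2 X$ is an isomorphism; hence $U_2 U_1 F_1 F_2 \cong U_2 F_2$ is just the free abelian group monad, whose Eilenberg--Moore category is $\mathbf{Ab}$ rather than torsion-free abelian groups.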
Nice examples both of you, thanks. I may have to change my intuition.
Can anyone give an example of a non-monadic composite of monadic functors where the monad induced by the composite adjunction doesn't coincide with the one induced by the second factor?
Just a quick note that the famous pair of monadic functors, one the forgetful functor from categories to reflexive graphs, the other the edge functor from reflexive graphs to sets, doesn't give an example that Mike wants. (The free reflexive graph on a set $X$ is an $X$-indexed copower of copies of the walking-edge reflexive graph, and the free category on that is an $X$-indexed copower of the arrow category $\mathbf{2}$, so we're in the situation of $\eta_1 F_2$ being an iso.)
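A concrete check of that, under the convention that the edge set of a reflexive graph includes its identity edges: the free reflexive graph on a set $X$ has $X + X$ vertices and $X + X + X$ edges (the $X$ generating edges plus an identity at each vertex), and the free category on it is the $X$-indexed copower of $\mathbf{2}$, whose set of morphisms is again $X + X + X$, since none of the generating edges are composable with one another. So both routes give the same underlying functor $X \mapsto X + X + X$ on $\mathbf{Set}$, and in fact (this being the content of the unit map being an iso) the same monad.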
I went ahead and posted Mike's question to MathOverflow.
And got an answer from Sridhar Ramesh, which works: consider torsionfree abelian groups as a reflective subcategory of just groups, followed by the forgetful functor from groups to sets. On the one hand we get the free group monad, and on the other we get the free abelian group monad (where free abelian groups are in fact torsionfree).
This example (and others like it) has the interesting property that it's "just one step beyond trivial": we have monadic functors $U_1 \colon \mathcal{A} \to \mathcal{B}$ and $U_2 \colon \mathcal{B} \to \mathcal{C}$ such that $U_2 U_1$ is not monadic, but $U_1$ factors as $U_1 = W_2 W_1$ such that $W_1$, $W_2$, and $U_2 W_2$ are monadic.
And the non-monadic composite of monadic functors $(U_2 W_2) \circ W_1$ is "trivial" in that it induces the same monad as $U_2 W_2$.
But @Jonas Frey suggested a different counterexample in a comment which doesn't seem to have even this property: categories over quivers over $\mathbf{Set} \times \mathbf{Set}$.
I guess I need to update my intuition!
I didn't take the time to think about it, but in which way are quivers monadic over $\mathbf{Set} \times \mathbf{Set}$?
A quiver is a pair of sets with two functions from the first set to the second, and the forgetful functor sending $(E \rightrightarrows V)$ to $(E, V)$ is monadic.
One simple way to see this is that the forgetful functor from graphs/quivers to pairs of sets is given by the reindexing functor between presheaf categories induced by the inclusion of the discrete category on two objects into the walking parallel pair. Since the inclusion is bijective-on-objects, the reindexing functor is monadic.
A pretty general example: take an essentially algebraic theory that is not monadic over any slice category $\mathbf{Set}/X$. However, its category of models is LFP, so there is some small category $\mathcal{A}$, on some set of objects $O$, so that the category of models is reflective in (hence monadic over) the presheaf category $[\mathcal{A}^{\mathrm{op}}, \mathbf{Set}]$. Moreover, by an argument similar to the above, $[\mathcal{A}^{\mathrm{op}}, \mathbf{Set}]$ is monadic over $\mathbf{Set}/O$.
The composite of the reflection and depresheafification is still a right adjoint and so induces a monad on $\mathbf{Set}/O$. The free presheaves are the small coproducts of representables. There is a set of generating anodyne morphisms for the reflector. We can assume because of how the reflector was constructed that if an anodyne morphism's domain lies over a representable there is already a unique lift within that representable, so if all the generating anodyne morphisms have domains that don't split into coproducts, we have the "trivial" situation. Among other things, this means the theory doesn't have constants, and all totally-defined operations are unary. An example is the theory of categories.
Jonas Frey's example avoids being trivial by refactoring the monads so that the totally-defined, unary operation of identity lives along with the reflector. You can do this because in the essentially algebraic theory of categories the identity operation is not used to define the domain of any operation or equation.
@James Deikun this looks interesting, but I can't follow. Would you please mind breaking it up a bit?
Regarding monadicity of graphs over pairs of sets, for the record: concretely, the monad given by left extension followed by restriction maps any pair $(E, V)$ to $(E, V + E + E)$, and an algebra structure amounts to a pair of maps $E \to V$. The intuitive reason the monad does this is that the left adjoint maps $(1, 0)$ to the smallest graph with one edge, i.e., the walking edge, which has two vertices. Hence the monad maps $(1, 0)$ to $(1, 2)$. The rest should be easy. Didn't know about that one, thanks @Jonas Frey, and @John Baez and @Nathanael Arkor for your help.
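Here is the same computation as a small Haskell toy, with finite sets modelled as lists (all the names below are made up for illustration):

```haskell
-- The forgetful functor from quivers to pairs of sets sends a quiver to
-- (edges, vertices); its left adjoint gives every edge a fresh source and
-- target vertex, so the induced monad sends (E, V) to (E, V + E + E).

data Pair e v = Pair { edges :: [e], vertices :: [v] }
  deriving Show

-- V + E + E: the old vertices, plus a fresh source and target per edge.
data Vert e v = Old v | Src e | Tgt e
  deriving Show

-- The (edges, vertices) pair underlying the free quiver on (E, V).
freeQuiver :: Pair e v -> Pair e (Vert e v)
freeQuiver (Pair es vs) =
  Pair es (map Old vs ++ map Src es ++ map Tgt es)

-- An algebra structure map V + E + E -> V that is the identity on the old
-- vertices amounts to a choice of source and target for every edge,
-- i.e. exactly a quiver structure on (E, V).
algebra :: (e -> v) -> (e -> v) -> Vert e v -> v
algebra _   _   (Old v) = v
algebra src _   (Src e) = src e
algebra _   tgt (Tgt e) = tgt e
```

E.g. the free quiver on one edge and no vertices comes out as one edge and two vertices (`Src` and `Tgt`), matching the $(1,0) \mapsto (1,2)$ computation above.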
In general I think you can, for any presentation of an EAT on sorts $S$, factor the forgetful functor into monadic functors forgetting one equation or operation at a time to $\mathbf{Set}/S$ along any linearization of its dependency graph, and all the compositions where no "domain dependency" is included will be monadic. Being "trivially non-monadic" depends on:
- breaking the EAT into two pieces which have "domain dependencies" on each other but not within themselves, such that
- every operation and equation of theory 2 has as its "guard" a conjunction of equations at least one of which is independent of theory 1's equations.
James, that's a really cool class of examples -- I'll have to remember that. Of course, $\mathbf{Cat}$ is a paradigmatic example.
Oooooh, I think I more or less get the point, very nice, thanks. Am I right that this is fleshing out the vague intuition I mentioned before?
For example, the EAT of categories on one sort (arrows) has the operations of source/target (no conditions) and the partial operation of composition $g \circ f$, guarded by $s(g) = t(f)$, and the usual equations relating them. If you break these into things involving composition and things not involving it, you get basically the kind of division I mentioned, although there's something I guess I missed: if you can prove an instance of a "guard" for a new operation you can rectify the situation by equating the result to something that doesn't involve the new operations, c.f. $t(f) \circ f = f$, which rectifies the equation $s(t(f)) = t(f)$, an instance of the guard that enables you to write $t(f) \circ f$ in the first place.
And yeah, this can be considered a more formal rendering of the intuition that it is "type dependency" that breaks the composition of monadic functors.
Basically: a forgetful functor can be monadic if structure or properties it forgets depend on structure or properties that it remembers, but not if it forgets something dependent on something else it also forgot!
Torsion-free Abelian groups also fall into this framework. The theory of torsion-free Abelian groups includes three operations: $0$, $-$, and $+$, with no conditions, various equations with no conditions, and an infinite series of equations with conditions: $x = 0$ under the condition $p \cdot x = 0$, for all primes $p$. If you divide the conditional equations separate from everything else, you have the standard "trivial" version of this example; if you divide it in other ways, like putting negation and its equations, or the commutative law, or both, together with the conditional equations, you can get non-trivial examples. You can even put together associativity with the conditional equations and you should still get something monadic, but you need Theory 1 to be at least the theory of a pointed magma or the full theory will not be monadic over Theory 1.
And I would say the intuition should be that monadic functors failing to compose is not "pathological" at all; when the composite of two monadic functors is monadic, it is because the first is somehow ignoring everything about its target category that the second forgets.
My intuition is also that Cartesian and weakly Cartesian monadic functors compose much better, because they come from kinds of operads and my intuition is that conditional operations and equations are generalized contraction, but as of yet I have less to back up this part.
Let's make the idea of "ignoring everything" more precise: Say you have a monad $T$ on a category $\mathcal{C}$ and another monad $S$ on $\mathcal{C}^T$. I maintain there is a proper "composite monad" on $\mathcal{C}$ whose monadic adjunction is the composition of $S$'s and $T$'s iff $S$ has arities in the free $T$-algebras.
Say does have such arities. Then for in consider the large canonical diagram for over . It is shaped like the comma category which is isomorphic to . The diagram itself is the composition of the forgetful functor from this slice and . If we further compose it with and , we get a diagram that looks like or in other words and its colimit must be .
James Deikun said:
Let's make the idea of "ignoring everything" more precise: Say you have a monad $T$ on a category $\mathcal{C}$ and another monad $S$ on $\mathcal{C}^T$. I maintain there is a proper "composite monad" on $\mathcal{C}$ whose monadic adjunction is the composition of $S$'s and $T$'s iff $S$ has arities in the free $T$-algebras.
I believe this is essentially (a consequence of) Theorem 5.5 in Relative monadicity (though phrased, more generally, in terms of relative monads rather than monads with arities).
I hope so, since it saves me finishing this proof!
James Deikun said:
In general I think you can, for any presentation of an EAT on sorts $S$, factor the forgetful functor into monadic functors forgetting one equation or operation at a time to $\mathbf{Set}/S$ along any linearization of its dependency graph, and all the compositions where no "domain dependency" is included will be monadic. Being "trivially non-monadic" depends on:
- breaking the EAT into two pieces which have "domain dependencies" on each other but not within themselves, such that
- every operation and equation of theory 2 has as its "guard" a conjunction of equations at least one of which is independent of theory 1's equations.
This intuition is somewhat present in the result that the category of models for an essentially algebraic theory is "essentially monadic", i.e. there exists a (possibly infinite) chain of monadic functors from the category of models to (a power of) the category of sets, e.g. as proven in the one-sorted case in Essentially equational categories. However, I think the perspective via type dependency, and using this as an illustration of the failure of monadic functors to compose is very nice (and probably more naturally suited to generalised algebraic theories than essentially algebraic theories).
To be honest, I think this would make for a nice expository note since, as is clear from this discussion, it is a topic that many people do not have a good intuition for.
The tight connection between monads with arities and relative monads also deserves to be more widely known.
This all sounds really interesting and worthy of some more detailed explanation. Maybe someone could write a blog post? (-:
James Deikun said:
The tight connection between monads with arities and relative monads also deserves to be more widely known.
@Dylan McDermott and I will explicate this connection in a forthcoming paper.
But it is at least alluded to in Monads need not be endofunctors that the notion of monad with arities corresponds precisely to the notion of $J$-relative monad for a well-behaved $J$ (conversely, relative monads are more general, as they require no assumptions on $J$).
Ah, under "Related Work":
Berger et al. [10, 16] have introduced a generalization of finitary monads, called monads with arities. Monads with arities constitute a special case of relative monads on well-behaved functors.
I guess I simply learned these concepts in the wrong order! :big_smile:
Connecting this back to the crude monadicity theorem: if a right adjoint functor preserves reflexive coequalizers, then the monad induced by it preserves the standard presentations of algebras for any other monad whose category of algebras it lives on. Now, you can view the elements of $S X$ as the operations of arity $X$ up to the equations of arity $X$. If $S$ preserves the standard presentations, then the operations of any non-free arity are completely determined by the ones on free arities up to the direct consequences of the arity's defining equations. So no matter what monad you use to gin up the base category of $S$, $S$ will still have arities in free objects.
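For reference, the standard presentation in question: every $T$-algebra $A$ is the reflexive coequalizer
$$F_T U_T F_T U_T A \;\rightrightarrows\; F_T U_T A \;\longrightarrow\; A,$$
where the two parallel maps are $F_T U_T \varepsilon_A$ and $\varepsilon_{F_T U_T A}$, the coequalizing map is the counit $\varepsilon_A$, and the common section is $F_T \eta_{U_T A}$.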
And just throwing in more ingredients: Using the [[adjoint lifting theorem]] previously mentioned, say we have reflexive coequalizers in $(\mathcal{C}^T)^S$, and we have the wannabe-composite monad $R = U_T U_S F_S F_T$ on $\mathcal{C}$; then the comparison functor from $(\mathcal{C}^T)^S$ to $\mathcal{C}^R$ is monadic. We can further show using the cancellation property that as soon as the forgetful functor $\mathcal{C}^R \to \mathcal{C}$ factors through $\mathcal{C}^T$ the factor is monadic. (Does it always factor? I think so, but I can't immediately think of an argument.) So now we have a situation:
where each arrow shown is monadic.
In the cases where this works, it gives a factorization so that the composition of the leftmost arrow above with the (monadic) composite of the two others is non-monadic in the "trivial" way. Perhaps this is why we get a lot of real-world examples that are "trivial" even though generically speaking it doesn't happen.
James Deikun said:
In general I think you can, for any presentation of an EAT on sorts $S$, factor the forgetful functor into monadic functors forgetting one equation or operation at a time to $\mathbf{Set}/S$ along any linearization of its dependency graph, and all the compositions where no "domain dependency" is included will be monadic.
In the setting of algebraic theories, one can formalise the notion of "forgetting operations/equations" by morphisms of algebraic theories (one could restrict specifically to the monomorphisms, but I think this is unnecessary for the problem at hand). Every morphism induces a monadic functor and, in particular, the monadicity of each category of algebras is exhibited by the unique morphism from the initial algebraic theory, whose category of algebras is the category of sets.
I wonder whether one can formalise this intuition about the (non-)composition of monadic functors in a similar way. For instance, we can consider a category of generalised algebraic theories, in which the objects comprise pairs $(\mathcal{S}, P)$ where $\mathcal{S}$ denotes a category of sorts, encoding the dependency structure, and $P$ denotes an $\mathcal{S}$-sorted presentation of a generalised algebraic theory. The morphisms are the usual translations of generalised algebraic theories. Now, it seems natural to imagine that this category is equipped with a factorisation system, in which each morphism factors as one that is trivial on the presentation component, followed by one that is trivial on the sort component. Then, while it is not true that each morphism of GATs induces a monadic functor between categories of algebras, perhaps it is true that each morphism in these two subcategories induces a monadic functor (furthermore functorially). In particular, if we factorise the unique morphism from the initial GAT to any given GAT, this would give a factorisation of the forgetful functor from the category of algebras to a power/slice of Set, into two monadic functors, which aligns with the factorisation induced by the presentation of a locally presentable category as a localisation of a presheaf category. (I should say that I'm thinking aloud, without checking anything, so this may not be at all accurate.)
James Deikun said:
We can further show using the cancellation property that as soon as the forgetful functor $\mathcal{C}^R \to \mathcal{C}$ factors through $\mathcal{C}^T$ the factor is monadic. (Does it always factor? I think so, but I can't immediately think of an argument.)
We have a monad morphism from $U_T F_T$ to $R = U_T U_S F_S F_T$, given by the whiskering $U_T \eta_S F_T$ (where $\eta_S$ is the unit of $F_S \dashv U_S$).
This monad morphism induces accordingly a functor from $\mathcal{C}^R$ to $\mathcal{C}^T$, whose action on objects is given by precomposing any $R$-algebra with the corresponding map (and similarly acting on morphisms by precomposing naturality squares, so to speak). This is as in Remark 3.4 at https://ncatlab.org/nlab/show/monad.
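A hedged Haskell rendering of just that precomposition step (not of the particular monads above; the algebra laws are of course not enforced by the types):

```haskell
{-# LANGUAGE RankNTypes #-}
import Data.Maybe (maybeToList)

-- An Eilenberg-Moore algebra for a monad m, packaged as its structure map.
newtype Alg m a = Alg { runAlg :: m a -> a }

-- A monad morphism phi : s ~> t (a natural transformation commuting with
-- the units and multiplications) induces a functor from t-algebras to
-- s-algebras: precompose the structure map and leave the carrier alone.
restrictAlong :: (forall x. s x -> t x) -> Alg t a -> Alg s a
restrictAlong phi (Alg act) = Alg (act . phi)

-- Example: maybeToList is a monad morphism Maybe ~> [], so every
-- []-algebra (roughly, a monoid) restricts to a Maybe-algebra
-- (roughly, a pointed set).
example :: Alg [] a -> Alg Maybe a
example = restrictAlong maybeToList
```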
This should provide the appropriate factorization to make the following commutative diagram (where the bottom triangle commutes by the functor we just described leaving underlying carriers of algebras and of algebra morphisms unchanged):
(I don't seem to know how to get Quiver/tikz-cd diagrams to render here on Zulip)
Keep in mind, nothing in this post depends on the presumption of reflexive coequalizers, merely the setup of having two composable monadic functors.
Indeed, we get these same commutative triangles even if we take to be any right adjoint, not necessarily monadic.
(I don't seem to know how to get Quiver/tikz-cd diagrams to render here on Zulip)
(I don't think it's possible: I usually just take a screenshot of the diagram in quiver, and paste it, with a link to the diagram.)
Sridhar Ramesh said:
James Deikun said:
We can further show using the cancellation property that as soon as the forgetful functor $\mathcal{C}^R \to \mathcal{C}$ factors through $\mathcal{C}^T$ the factor is monadic. (Does it always factor? I think so, but I can't immediately think of an argument.)
We have a monad morphism from $U_T F_T$ to $R = U_T U_S F_S F_T$, given by the whiskering $U_T \eta_S F_T$ (where $\eta_S$ is the unit of $F_S \dashv U_S$).
This monad morphism induces accordingly a functor from $\mathcal{C}^R$ to $\mathcal{C}^T$, whose action on objects is given by precomposing any $R$-algebra with the corresponding map (and similarly acting on morphisms by precomposing naturality squares, so to speak). This is as in Remark 3.4 at https://ncatlab.org/nlab/show/monad.
Thanks! I figured out later that the map existed, but not that it came from a monad morphism, so I'm glad I didn't update right away.
Circling back to the original question, say you have two monads $S$ and $T$ and $S$ distributes over $T$. You get a monad on $\mathcal{C}^S$ that is a lift of $T$ through $U_S$, and through which the forgetful functor from $(TS)$-algebras factors monadically; that's the normal thing. But you also get, by duality, a monad on the Kleisli category of $T$ that is an extension of $S$ through $F_T$.
In and sufficiently similar equipments, there is an inclusion functor from to which is fully faithful and dense and factors through , call this one . Treating as a relative monad, you can extend it through this to get a relative monad over . Because it has a dense root, this relative monad extends to a monad on with arities in , and with a little manipulation you can see it also has arities in .
Thanks to that, and the facts unearthed earlier in the thread, the forgetful functor of the new monad on $\mathcal{C}^T$ composes with $U_T$ to give a monadic functor. So let's figure out what the monad of the composition is. The new monad arose as a left Kan extension along the dense inclusion, which is a genuine extension. So the full monad is:
So you actually do have a monadic square as above with no extra assumptions, and the monad on the "wrong side" actually is built from the two given monads in a (fairly) straightforward way!
There's something I'm not quite following in the above. If I'm understanding correctly, you're saying that there are two monads on $\mathcal{C}^S$: the lifting of $T$ and a monad with arities in the free algebras. We can consider the forgetful functors from the categories of algebras for both monads, and compose each with $U_S$. The induced monad in both cases is $TS$. However, for the "monadic square" as in the original question, we want to consider a monad on $\mathcal{C}^T$ to obtain the left/bottom path, rather than two monads on $\mathcal{C}^S$. Where does $\mathcal{C}^T$ come into the picture?
Oh, maybe it's just a typo:
James Deikun said:
But you also get, by duality, a monad on that is an extension of through .
Presumably this should be a monad on $\mathcal{C}^T$ (and then the relative monad in the following paragraph has codomain $\mathcal{C}^T$ and so on).
James Deikun said:
Because it has a dense root, this relative monad extends to a monad on with arities in
Usually, for relative monads to extend to monads, you need the root to be dense and for (sufficient) left extensions along the root to exist. I would have expected this to require some mild cocompleteness assumptions on $\mathcal{C}^T$, but maybe you're deducing the existence of the monad differently to how I'm imagining?
I should have recalled this much earlier, but §3 of Beck's paper on distributive laws is very relevant to this discussion.
Nathanael Arkor said:
Oh, maybe it's just a typo:
James Deikun said: But you also get, by duality, a monad on that is an extension of through .
Presumably this should be a monad on $\mathcal{C}^T$ (and then the relative monad in the following paragraph has codomain $\mathcal{C}^T$ and so on).
Oh, actually it's the first monad that is on $\mathcal{C}^S$. Edited.
Nathanael Arkor said:
Usually, for relative monads to extend to monads, you need the root to be dense and for (sufficient) left extensions along the root to exist. I would have expected this to require some mild cocompleteness assumptions on $\mathcal{C}^T$, but maybe you're deducing the existence of the monad differently to how I'm imagining?
Ugh, you're right. Let's use Beck's formulation of the existence result because it's at least clearer about what colimits need to exist. You need the coequalizer of and to exist in for every -algebra .
It's a reflexive coequalizer (equip it with ).
(Also: they are respectively the Kleisli extensions of and ...)
Nathanael Arkor said:
James Deikun said:
In general I think you can, for any presentation of an EAT on sorts $S$, factor the forgetful functor into monadic functors forgetting one equation or operation at a time to $\mathbf{Set}/S$ along any linearization of its dependency graph, and all the compositions where no "domain dependency" is included will be monadic.
In the setting of algebraic theories, one can formalise the notion of "forgetting operations/equations" by morphisms of algebraic theories (one could restrict specifically to the monomorphisms, but I think this is unnecessary for the problem at hand). Every morphism induces a monadic functor and, in particular, the monadicity of each category of algebras is exhibited by the unique morphism from the initial algebraic theory, whose category of algebras is the category of sets.
I wonder whether one can formalise this intuition about the (non-)composition of monadic functors in a similar way. For instance, we can consider a category of generalised algebraic theories, in which the objects comprise pairs $(\mathcal{S}, P)$ where $\mathcal{S}$ denotes a category of sorts, encoding the dependency structure, and $P$ denotes an $\mathcal{S}$-sorted presentation of a generalised algebraic theory. The morphisms are the usual translations of generalised algebraic theories. Now, it seems natural to imagine that this category is equipped with a factorisation system, in which each morphism factors as one that is trivial on the presentation component, followed by one that is trivial on the sort component. Then, while it is not true that each morphism of GATs induces a monadic functor between categories of algebras, perhaps it is true that each morphism in these two subcategories induces a monadic functor (furthermore functorially). In particular, if we factorise the unique morphism from the initial GAT to any given GAT, this would give a factorisation of the forgetful functor from the category of algebras to a power/slice of Set, into two monadic functors, which aligns with the factorisation induced by the presentation of a locally presentable category as a localisation of a presheaf category. (I should say that I'm thinking aloud, without checking anything, so this may not be at all accurate.)
I've been exploring similar ideas with Mathieu Anel, representing GATs by [[clans]] (see also this), and trying to find a factorization system on the 2-category of clans, one of whose classes induces monadic adjunctions between the associated categories of algebras.
The 2-category contains both the 2-category of finite-product categories, and the 2-category of finite-limit categories as full sub-2-categories, and in , the class of essentially surjective (finite-product preserving) functors induces monadic adjunctions and is both the left class of a (pseudo) OFS (with fully ff functors on the right), and the right class of a (pseudo) WFS (generated by ). In , essentially surjective functors do not induce monadic functors anymore (a counterexample is the inclusion of the FL-theory of Sets into the FL-theory of torsion-free abelian groups), but the free FL-functor on the end-point inclusion
of the interval category generates a WFS whose right class is monadic. Finally, on general clans, we haven't been able to find a non-trivial factorization with monadic maps on one side, but we have at least found a criterion on clan-maps that ensures monadicity of the induced forgetful functor between algebras, and is closed under composition: the condition is that the clan map is Cauchy-surjective, and the induced functor between functor categories reflects algebras (algebras are functors preserving the terminal object and pullbacks of display maps).
That does sound related. I think the crucial difference between the two perspectives is the distinction between essential surjectivity on objects (say, of a finitely complete category), and bijectivity between the sort-functions of a morphism (say, between essentially algebraic theories). In the setting of algebraic theories, essential surjectivity coincides with bijectivity of sort-functions, because the morphisms in the category do not affect the objects of the category. However, since this is no longer true for finitely complete categories, when one is generalising from algebraic theories to essentially algebraic theories (or similar), one must decide which notion is appropriate. (It could well be that both are interesting for different reasons.)
(In fact, the reason I described an approach using presentations, rather than a presentation-free approach like categories with display maps, is that it is much simpler to define what one means by "sort-preserving morphism" in that setting, which seemed crucial.)