You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For all things related to this archive refer to the same person.
I would love to get my hands on this book, but it seems to be out of print. Is there an electronic version floating around somewhere? Any ideas how to get it?
Edit: I miss the UIUC library :blush:
PS: Btw, I am reading this
https://projecteuclid.org/euclid.pjm/1102650385
and could really use some help if this stuff is of interest to anyone here :blush:
In case it helps someone be interested: there's a forgetful functor from differential graded algebras to algebras, and the paper is studying things similar to the left adjoint of this functor.
I'm looking for more information about what they refer to as "Karoubi differential envelope"
For example, at the top of page 249 they're looking at the left adjoint of the forgetful functor from -graded differential algebras to -graded algebras. They don't say "left adjoint", but buzzwords like "universal property" and the commutative triangle give it away.
John, that sounds cool :blush:
Yeah, one way to say it is that I am trying to understand the functor from algebras to differential graded algebras.
The "Karoubi differential envelope", if I remember it correctly, is the left adjoint of the forgetful functor from differential graded algebras (and I mean ordinary -graded algebras) to algebras.
I used to be really into this stuff in grad school, when noncommutative geometry was sorta new.
There is also (I think) a related functor (or something) from directed graphs to DGAs.
There are well-known functors from
to
to
to
so we can compose those.
A universal first-order differential calculus corresponds to a complete graph.
(I easily get myself in trouble throwing around words like "universal" especially here :sweat_smile: )
I'm also looking at:
Differential Calculi on Commutative Algebras
I call a differential calculus arising from a directed graph a "discrete calculus".
So I'd say something like, "A discrete calculus is obtained from the universal discrete calculus by removing edges from the complete graph."
This is explained in Introduction to Noncommutative Geometry of Commutative Algebras and Applications in Physics
So I am basically trying to link up Müller-Hoissen's paper to the Coquereaux and Kastler paper, but there seems to be some inconsistency between the two that I'm trying to sort out, and I'm hoping this reference will help :pray:
John Baez said:
The "Karoubi differential envelope", if I remember it correctly, is the left adjoint of the forgetful functor from differential graded algebras (and I mean ordinary -graded algebras) to algebras.
Any idea where I can read about U: DGAlg -> Alg or is it trivial enough to state here? I hope it isn't just "forgetting the grade" :blush:
Would it be simply projecting to grade 0?
I guess there are a couple functors from DGAlg to Alg, as you mention: forgetting the grading and differential, or taking the degree-zero part.
Let me think about which one is right adjoint to the following functor F: Alg -> DGAlg:
the functor that takes an algebra A, decrees everything in it is of degree 0, and freely throws in a differential d that increases the degree by one and obeys all the differential graded algebra rules.
So say we want a functor U: DGAlg -> Alg that's right adjoint to this F.
I.e. given an algebra A and a differential graded algebra B we want dga homomorphisms
f: FA -> B
to correspond naturally to algebra homomorphisms
g: A -> UB
Note that f: FA -> B is completely determined by what it does to guys of degree zero, since we can write everything in FA as a linear combination of guys of this form:
a da' da'' da''' ...
i.e. a product of a guy in A and a finite number of differentials of guys in A,
and f of such a product must be
f(a) d(f(a')) d(f(a'')) d(f(a''')) ...
Since f preserves the degree, it's determined by some algebra homomorphism g: A -> UB where UB is the degree zero part of B.
Furthermore I claim any algebra homomorphism g: A -> UB extends to some dga homomorphism f: FA -> B by the formula above, or more precisely
f( a da' da'' ... ) = g(a) d(g(a')) d(g(a'')) ...
So, if all this is right, the right adjoint U: DGAlg -> Alg takes the degree-zero part of a dga, and the left adjoint F: Alg -> DGAlg freely creates a dga whose degree-zero part is the given algebra.
All this is standard stuff... easier to figure out than to look up!
The discipline of adjoint functors is a way to make sure one isn't just randomly making stuff up.
:heart: :raised_hands:
That is so beautiful :heart_eyes:
I wish I could just see things so clearly like that.
The discipline of adjoint functors is a way to make sure one isn't just randomly making stuff up.
Yes. Totally. That is one reason I "try" to bring CT into the picture: I can build models and even write code to verify that it works, but it sometimes feels shaky without some nice maths to bring it together.
Thank you :raised_hands:
If I understand correctly, the stuff in Kastler's papers is explicitly constructing the functor , but the details include actually constructing from as well.
Referring back to the article that references this book (that I'd like to get my hands on), I think I understand how is constructed (when is unital), but then it says
Since uniquely extends to a differential of (i.e. a -graded derivation of vanishing square), moreover of -grade 1, becomes a -graded (in fact a bigraded) differential algebra.
Unfortunately, it isn't obvious to me how to explicitly construct
and I need that explicit construction if I am going to turn that into some code and generate numerical results.
Since is so simple, is there a way to make 100% explicit (as in I can write code and compute stuff) at least when is finite dimensional and commutative?
Now, I "think" this construction produces a universal DGA, i.e.
where is universal so I'd still need some explicit way to obtain other DGAs once we have the universal DGA, but I'd be super happy to get to the universal DGA explicitly as a start.
Btw, if is finite-dimensional with basis elements and product , then the unit is just
and
The map
is given explicitly by C&K as
However, this can be written as a graded commutator
where
Now, if we interpret as (dual to) vertices and as (dual to) directed edges from to of a directed graph, then is related to the adjacency matrix of a complete directed graph.
So we should have some natural way to go from
that is closely related to the above.
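Since the formulas above lost their symbols in the archive, here is a small self-contained sketch of the graded-commutator picture as I read it. All names are my own toy model, not from C&K: A is functions on n points (degree-0 elements act diagonally), a first-order form is an n × n matrix weighting directed edges, and Delta is the adjacency matrix of the complete directed graph. The check is that the commutator [Delta, f] reproduces the universal differential (df)_{ij} = f(j) − f(i).

```python
# Toy model (my own names, not from the thread's references):
# A = functions on n points, stored as length-n lists with pointwise
# product.  A first-order form is an n x n matrix whose (i, j) entry
# weights the directed edge i -> j.  Delta is the adjacency matrix of
# the complete directed graph (ones off the diagonal).

n = 4
Delta = [[0 if i == j else 1 for j in range(n)] for i in range(n)]

def commutator_with_function(f):
    """[Delta, f] with f acting diagonally: entry (i, j) is Delta_ij * (f_j - f_i)."""
    return [[Delta[i][j] * (f[j] - f[i]) for j in range(n)] for i in range(n)]

def universal_d(f):
    """Universal differential: (df)_ij = f_j - f_i off the diagonal, 0 on it."""
    return [[0 if i == j else f[j] - f[i] for j in range(n)] for i in range(n)]

f = [5.0, 2.0, -1.0, 7.0]
assert commutator_with_function(f) == universal_d(f)
```

The zero diagonal of Delta matches the "no loops" reading of the off-diagonal bimodule that comes up later in the thread.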
Eric Forgy said:
Referring back to the article that references this book (that I'd like to get my hands on), I think I understand how is constructed (when is unital), but then it says
Since uniquely extends to a differential of (i.e. a -graded derivation of vanishing square), moreover of -grade 1, becomes a -graded (in fact a bigraded) differential algebra.
Unfortunately, it isn't obvious to me how to explicitly construct
and I need that explicit construction if I am going to turn that into some code and generate numerical results.
Is your notation for the things of grade in ?
I don't want to worry about the bigraded stuff now; I'd rather talk about an associative algebra and the free dga on that algebra, which I'd probably call - but in case Kastler means something really different by that I'll call it for now.
Since is so simple, is there a way to make 100% explicit (as in I can write code and compute stuff) at least when is finite dimensional and commutative?
is pretty explicit. First, note that every grade- element of is a linear combination of guys like
It takes a bit of work to see this because if you freely start multiplying things you get more complicated expressions like
So how can we get rid of all these b's? The trick is to remember that we're requiring
so
gives a recipe for taking multiplied on the right by and turning it into a difference of two terms of the form multiplied by something on the left:
Repeatedly using this rule we can take
and show it's equal to a linear combination of terms of the form
where the x's are elements of depending on the a's and b's in some way.
Anyway, once you know this it's easy to say what does:
But this is not magic, or cleverness! It just follows from the rules of a dga: repeatedly use and .
Indeed, everything about the free dga follows from the rules of a dga, the rules for adding and multiplying elements of , and nothing more. That's what "free" means, in practice.
So if you know how to compute in , you know - after some thought - how to compute in .
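John's recipe can be turned into code directly. The sketch below is my own toy implementation under the assumption A = functions on n points with pointwise product: an element of the free dga is stored as a linear combination of words x0 dx1 ... dxk, and a right factor b is pushed into the slots using (dx) b = d(xb) − x db. A word with an all-zero slot is dropped, since each word is multilinear in its slots.

```python
# My own toy implementation of John's rewriting recipe.
# A word (x0, x1, ..., xk) stands for x0 dx1 ... dxk, each slot a
# tuple of n numbers; an element is a dict {word: coefficient}.

def pw(x, y):  # pointwise product in A
    return tuple(xi * yi for xi, yi in zip(x, y))

def add_into(acc, word, coeff):
    if any(all(v == 0.0 for v in slot) for slot in word):
        return  # a zero slot kills the word (multilinearity)
    acc[word] = acc.get(word, 0.0) + coeff
    if acc[word] == 0.0:
        del acc[word]

def word_times(word, b):
    """Normal form of (x0 dx1 ... dxk) . b as {word: coefficient}."""
    out = {}
    if len(word) == 1:            # degree 0: just multiply in A
        add_into(out, (pw(word[0], b),), 1.0)
        return out
    head, last = word[:-1], word[-1]
    add_into(out, head + (pw(last, b),), 1.0)    # ... d(x_k b)
    for w, c in word_times(head, last).items():  # minus (head . x_k) db
        add_into(out, w + (b,), -c)
    return out

e0, e1 = (1.0, 0.0), (0.0, 1.0)   # basis of A for n = 2
# e0 de1 . e1 = e0 d(e1 e1) - (e0 e1) de1 = e0 de1   (e1 e1 = e1, e0 e1 = 0)
assert word_times((e0, e1), e1) == {(e0, e1): 1.0}
# e0 de1 . e0 = e0 d(e1 e0) - (e0 e1) de0 = 0
assert word_times((e0, e1), e0) == {}
```

With each slot restricted to basis vectors this matches John's basis of the free dga; general slots would need a further multilinear expansion, which I haven't implemented.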
Thank you John :heart: :raised_hands:
I totally get what you mean about moving everything to the left (or right) etc using the product rule :+1:
It is not a mathematical problem, but I prefer to not leave a differential exposed directly on either the left or right side like that though (but I get what you mean) so I'd say write linear combinations of elements like
To see why, consider a finite (commutative) algebra with basis , product and unit ; we have
and
so unless you restrict to for some reason, then is a combination of . However, if you "close off" the dangling differential on the right, you get
We can read off a bunch of info from this, e.g. if , we have
so all three indices cannot be the same. Similarly, if none of the indices are the same, i.e. , and , then
vanishes again. However, if two of the three indices are the same we have
and
so any degree 1 element is a linear combination of where , i.e. linear combinations of . In fact
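The index bookkeeping here can be sanity-checked in a matrix picture (my own toy setup, with the same assumptions as before: A = functions on n points, e_i the i-th indicator, a first-order form an n × n matrix):

```python
# Claim being checked: e_i (de_j) e_k vanishes when all three indices
# agree and when all three are distinct, and is a single (signed)
# dual edge when exactly two agree.  Bimodule actions:
# (a.M)_{ab} = a_a M_{ab} and (M.b)_{ab} = M_{ab} b_b.

n = 3

def basis(i):
    return [1.0 if a == i else 0.0 for a in range(n)]

def d(f):  # universal differential: (df)_{ab} = f_b - f_a
    return [[f[b] - f[a] for b in range(n)] for a in range(n)]

def sandwich(i, j, k):  # the matrix of e_i . (de_j) . e_k
    ei, ej, ek = basis(i), basis(j), basis(k)
    m = d(ej)
    return [[ei[a] * m[a][b] * ek[b] for b in range(n)] for a in range(n)]

def dual_edge(i, k):  # the matrix of e_i ⊗ e_k
    return [[1.0 if (a, b) == (i, k) else 0.0 for b in range(n)] for a in range(n)]

zero = [[0.0] * n for _ in range(n)]
assert sandwich(0, 0, 0) == zero               # all indices equal
assert sandwich(0, 1, 2) == zero               # all indices distinct
assert sandwich(0, 1, 1) == dual_edge(0, 1)    # j = k: +e_0 ⊗ e_1
assert sandwich(0, 0, 1)[0][1] == -1.0         # i = j: -e_0 ⊗ e_1
```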
(I have more to say - including actual code - but this comment is already too long :sweat_smile:)
Eric Forgy said:
It is not a mathematical problem, but I prefer to not leave a differential exposed directly on either the left or right side like that though (but I get what you mean) so I'd say write linear combinations of elements like
[typo fixed]
That's okay if you want it; just beware that two linear combinations of elements like this can be equal in highly nonobvious ways!
The great thing about linear combinations of elements like
is that it's really easy to tell when they're equal. The only relations that hold are 'obvious' ones saying this expression is linear in each variable. Namely:
where are numbers (elements of our field), and similarly
and of course the consequences of these. But these are quite manageable.
So: if you have a basis of your algebra , you get a basis of the free dga on by taking elements like
where each of the is chosen to be a basis element!
John Baez said:
That's okay if you want it; just beware that two linear combinations of elements like this can be equal in nonobvious ways!
Yes :blush:
When I am talking about generating the space, I think it is a little more intuitive to have that element on the right, but yeah, totally, when I compute stuff, I write all coefficients on one side or the other (and the choice can be interesting because degree 0 and degree 1 elements do not commute) :blush:
Btw, with , and , we have
which explains why we think of (which I usually write ) as (dual to) a directed edge.
Btw, I don't know what you're doing with these tensor products here, since you didn't explain it. One trick people commonly do is write
as
but it doesn't look like you're doing that.
since you didn't explain it
Yeah :sweat_smile: The tensor product is just the usual tensor product of the respective vectors. 1 is the vector with all "components" equal to 1, i.e.
So if , then is a vector
so
is just the usual tensor product.
We do need to define and , but those are straightforward
and
Okay - but you're not telling me the thing I really need to know, which is the differential in terms of these tensor products. Like if I write , what's that for you?
There's one standard convention where it's called . You'll see this in lots of work on Hochschild and cyclic homology.
Here are elements of our algebra , not numbers.
The differential is da = 1 ⊗ a − a ⊗ 1.
See for example (1.24) of C&K.
Okay, great. That's a different convention. I guess that's pretty standard too!
I saw you write that but it just confused me.
So with this convention turns into some big mess, but that's okay.
So C&K give you an explicit way to construct and it says this. Unfortunately, the extension is not obvious to me :sweat_smile:
You extend it using the dga law .
This holds for any elements in your dga.
I'd like to find a natural / minimal nontrivial definition of such that
so that
You can just calculate it.
I have a notebook with pages of scribbles. No luck so far :sweat_smile:
But you inspired me. Let me try some things.
It's a calculation - no inspiration required, just follow the rules.
Yes sensei. I will try :blush:
Take anybody in and write it as a linear combination of products of guys like and .
Thank you :heart: :raised_hands:
Yes. It seems so obvious now that you say it :sweat_smile:
Then hit it with and use dga rules like
Yes
In fact, every reference on this stuff says it, but for some reason it didn't sink in until YOU said it, so thank you so much :pray:
I think when you're done you may have reinvented "the differential in the Hochschild cochain complex", but don't let that intimidate you... it's not really a matter of creativity, I think there's basically no free choice involved.
Yes
I don't understand something until I've reinvented it.
Good.
Thank you again so much :pray:
Sure! It's funny, I was working on this stuff all the time when I was a postdoc right out of grad school. I was convinced that noncommutative geometry held the keys to quantum gravity, and I was working on it in my own slow way when Connes started putting out papers on it, and then I had to understand those.
I eventually gave up trying to do quantum gravity this way, when loop quantum gravity came along.
I remember :blush: :+1:
To be honest, I'm still a little surprised there is not more work along these lines. Urs made the link between our paper and NCG pretty clear and the cool thing is that it is not only cool maths, it can be used directly to generate code for engineering applications.
So my homework is to try to tell what is
Or better
I got it. I knew the answer already, but was never able to reinvent it until now :+1:
So after 20+ years, I am still making slow but sure progress :nerd:
:check:
So I think it is safe to say I finally understand the functor :tada:
Or, at least, if you give me a finite dimensional commutative algebra, I can construct a DGA, BUT, if I understand correctly, that DGA is "universal" for . My next task is to understand an explicit way to construct a DGA that is not universal, i.e. the above gives
so now I need to explicitly construct a map
The answer should morally be something like "forget some edges".
I think I got it!
I spent some more time on this, including working through
It is very pretty :heart_eyes:
The idea is:
Given an algebra and a product , let denote .
The derivation given by
generates as a left -module (despite the fact that is an -bimodule).
Now, if , then
since .
John Baez said:
So with this convention turns into some big mess, but that's okay.
Actually, it isn't a big mess. It turns out
if . The point is that we don't take just any and form ; needs to be in
Bourbaki helped me understand why
is a "universal derivation".
If is an -bimodule morphism, then
is also a derivation.
Conversely, Bourbaki shows that given any derivation , you get a unique -bimodule morphism
such that given by
In this way, we see
It was painful to work through that, but worth it :blush: :muscle:
Now, obviously (even to me), all of that :point_of_information: involves CT with some nice diagrams.
I was able to work it out to first order
Since all the morphisms are -bimodule morphisms I think we should be sitting in a category of -bimodules, but I am not 100% sure about that.
Now, I am struggling to extend this to higher order, i.e.
According to this, we should have
and I think the proof is in Kastler's book (this topic). I'm having trouble understanding this and how is universal and whether there are any new morphisms we need to consider for it to work.
Any help would be appreciated :pray:
What if you try running the machinery you described with ? In particular, I guess your multiplication is going to be something like . But since is the kernel of , the kernel of is the whole . Therefore you get .
I'm not very sure about the expression for ... it's probably wrong as I wrote it, but I'm fairly confident that whatever the right expression is, it will involve multiplying (with aka 's product) and thus trivializing to , as above.
Another candidate for is :
Thanks Matteo :pray:
I think you have the right idea, except the right side of your product is not in . However, we have a similar product
(which is what I think you meant) given by
This is zero if . I think that is what they mean by , so
The undecorated is actually where is a -algebra ( a field).
For sure, I think this is an important piece to the puzzle, but I'm not sure it is the full story :thinking:
My first guess here is along those lines:
This requires that we have a map . I have an idea what such a map might need to look like from some other calculations, but it smells a little fishy.
My second guess was here:
This requires a map , which feels a little better dimensionally, but it means we need a new third space
My most recent guess looks like:
This says that
I think I am getting warm, but not quite there yet :sweat_smile:
Of the 3 diagrams above, I think my favorite is the second one. However, we might augment it with a morphism (like in the third) so it is clear that
Eric Forgy said:
The derivation given by
generates as a left -module (despite the fact that is an -bimodule).
Now, if , then
since .
Why is ? It sounds like you're saying the product of any two elements of is zero! That would be a very boring algebra.
if . That is the definition of :thinking:
for arbitrary though.
Btw, there is a projection
given by
If , then
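The projection formula above lost its symbols in the archive, so the following is only a guess at one natural candidate: a map π from A⊗A to the degree-1 part sending a⊗b to a db = a⊗b − (ab)⊗1. In the toy matrix model used earlier (A = functions on n points, (a⊗b)_{ij} = a_i b_j, multiplication reading off the diagonal) this becomes "subtract each row's diagonal entry", which is idempotent and lands in the kernel of multiplication.

```python
# Guessed projection pi : A⊗A -> Omega^1, pi(M)_{ij} = M_{ij} - M_{ii}.
# This is the matrix form of a⊗b  |->  a db = a⊗b - (ab)⊗1.

n = 3

def pi(M):
    return [[M[i][j] - M[i][i] for j in range(n)] for i in range(n)]

M = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0],
     [7.0, 8.0, 9.0]]
P = pi(M)

assert all(P[i][i] == 0.0 for i in range(n))  # lands in ker m = Omega^1
assert pi(P) == P                             # idempotent
```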
Eric Forgy said:
if . That is the definition of
Okay, sorry. You mean more precisely that is the kernel of the map
coming from multiplication.
So contains everything of the form
and indeed everything of the form
Is everything in of the form
?
It's nice to know exactly what it contains.
I meant to say: is everything in a linear combination of terms of the form
?
Here is a (poorly edited :sweat_smile: ) snippet from Bourbaki III 10.10:
Lemma 1 demonstrates that all elements of are of that form :+1:
Note: Bourbaki has a sign difference relative to more recent articles and I am using this other convention.
Okay, I won't look at it... I used to know this stuff, I think I could prove that.
One part of the idea is this: if , why is it of the form ? Easy: just take .
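In the same toy matrix model (mine, not Bourbaki's notation), this spanning claim is easy to check: the kernel of multiplication is exactly the zero-diagonal matrices, e_i de_j is the single dual edge e_i⊗e_j when i ≠ j, and any zero-diagonal matrix is a linear combination of these.

```python
# A = functions on n points, (a⊗b)_{ij} = a_i b_j, multiplication m
# reading off the diagonal, so ker m = zero-diagonal matrices.

n = 3

def e(i):
    return [1.0 if a == i else 0.0 for a in range(n)]

def tensor(a, b):
    return [[a[i] * b[j] for j in range(n)] for i in range(n)]

def a_db(a, b):  # a db = a⊗b - (ab)⊗1, with 1 = (1, ..., 1)
    ab = [a[i] * b[i] for i in range(n)]
    one = [1.0] * n
    t1, t2 = tensor(a, b), tensor(ab, one)
    return [[t1[i][j] - t2[i][j] for j in range(n)] for i in range(n)]

# e_i de_j is the single dual edge e_i ⊗ e_j when i != j:
assert a_db(e(0), e(1)) == tensor(e(0), e(1))

# Any zero-diagonal matrix M is the sum over i != j of M_ij * (e_i de_j):
M = [[0.0, 2.0, -1.0], [3.0, 0.0, 5.0], [0.5, -2.0, 0.0]]
S = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(n):
        if i != j:
            B = a_db(e(i), e(j))
            S = [[S[a][b] + M[i][j] * B[a][b] for b in range(n)] for a in range(n)]
assert S == M
```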
Btw, I just found this (can't believe it isn't bookmarked :face_palm: )
https://ncatlab.org/nlab/show/algebraic+approaches+to+differential+calculus
I got there from here:
https://ncatlab.org/nlab/show/differential+monad
and chasing references. In particular, I'm reading this one (about Grothendieck differential calculus):
http://www.mpim-bonn.mpg.de/preblob/3894
I'm starting to feel not too bad that I'm struggling to reinvent this stuff. It is not easy! :sweat_smile:
Btw, I've said this before, but I think I can say it a little more clearly now that I'm slowly gaining some mental muscle mass :muscle:
If has basis elements with product and unit element , then
which looks like the finite difference along a directed edge (think "fundamental theorem of calculus"). So we take this seriously and interpret as "dual" edges
Also, any can be written in terms of basis elements as
and we have
so is the subbimodule with all "diagonal" elements set to zero. This can be interpreted as a directed graph with no "loops".
Then let
i.e. the sum of all dual edges of a complete graph, and we have
If we combine this with the results above from Bourbaki, then we can define
where is a subset of , i.e. is a directed graph with and , and we have a new derivation given by
and a unique bimodule morphism given by
satisfying
The graph operator can be obtained from the "universal" (or "complete") graph operator by setting some of the dual edges to zero.
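"Setting some of the dual edges to zero" can be tested concretely (again in my own toy model, not the notation of the references): pick an edge set E, define (d_E f)_{ij} = f_j − f_i for (i, j) in E and zero elsewhere, and check that the bimodule Leibniz rule survives the edge removal, because it holds entrywise: (fg)_j − (fg)_i = (f_j − f_i) g_j + f_i (g_j − g_i).

```python
# Restricting the universal calculus on A = functions on n points to
# an edge set E, then checking d_E(fg) = (d_E f) g + f (d_E g).

n = 4
E = {(0, 1), (1, 2), (2, 3), (3, 0)}  # a directed 4-cycle, not complete

def d_E(f):
    return [[(f[j] - f[i]) if (i, j) in E else 0.0
             for j in range(n)] for i in range(n)]

def left(f, M):   # f . M  (degree 0 acting on the left)
    return [[f[i] * M[i][j] for j in range(n)] for i in range(n)]

def right(M, g):  # M . g  (degree 0 acting on the right)
    return [[M[i][j] * g[j] for j in range(n)] for i in range(n)]

f = [1.0, 2.0, 3.0, 4.0]
g = [2.0, -1.0, 0.5, 3.0]
fg = [f[i] * g[i] for i in range(n)]

lhs = d_E(fg)
r1, r2 = right(d_E(f), g), left(f, d_E(g))
rhs = [[r1[i][j] + r2[i][j] for j in range(n)] for i in range(n)]
assert lhs == rhs  # Leibniz holds for any subgraph
```

Since the Leibniz identity holds entry by entry, it keeps holding no matter which edges are deleted, which is the "remove edges from the complete graph" picture from earlier in the thread.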
This is still all "first order" though so I think I have a pretty good handle on how things work to first order. So I still need to get a similar grip on
It is interesting to see that Grothendieck also defined his derivation as an adjoint, i.e. commutator, but it should be the graded commutator for higher orders.
So now we have a way to tie a directed graph to a first-order differential calculus.
In particular, there is a "universal" graph with
giving the "complete" graph on the set
This is the "universal endospan in ", or better, the "universal quiver in ". This extends (via free construction - I believe) to a "universal quiver in ", where
so I think the category I need to work in (at least up to this point) is actually a bicategory