OK, good morning. Perhaps we can start the morning session today. So Mukund is going to give his third lecture. So please. — Morning, everybody. So I want to quickly remind you where we were yesterday and continue on from there. There'll be some new elements: I'll add some special facts that are specific to thermal states of field theories, and then take a detour and spend some time reviewing what we know about hydrodynamics. That will set the stage for the fourth lecture, where we'll put all these things together and construct something for the low-energy theory of hydrodynamics. OK, so what I convinced you of yesterday was basically the fact that in quantum field theory, asking general questions with all possible time orderings requires the consideration of an interesting class of contours for the path integral, contours that wind back and forth in time. You can call them timefolds, out-of-time-order contours, et cetera. And I gave you the general lesson that these contours, while they encode lots of information, also come with an inherent amount of redundancy built in. And in the specific case of the Schwinger-Keldysh contour, which computes time-ordered and slightly out-of-time-ordered correlators, I argued that there's a useful way to encode this redundancy in terms of some kind of topological symmetry. I'll come back to that point in a while, but let me first specialize to a special case of just talking about states of a quantum system which are thermal. Now, depending on where your intuition comes from, you think of thermal states either as some quantum system coupled to some reservoir, if you're doing stat mech, or you could just say, well, I have a system which is in a density matrix which is simply given by a function of the Hamiltonian, so that the states are distributed according to some Boltzmann weighting. And if you want to normalize this density matrix, you can normalize it by the partition function.
Usually, I prefer not to normalize this density matrix, so that its trace actually computes the partition function. So let me not do this normalization, which is what we were doing yesterday; here I would just say Tr ρ_β = Z. And if you want to add in lots of bells and whistles — you have conserved charges, you want to put in chemical potentials — you can feel free to do so. But for the sake of discussion, to keep it simple and keep the formulae compact, I'll just talk about thermal field theory with no chemical potentials. Now, we are also used to the fact that when you talk about thermal field theory, this is talking about equilibrium of some quantum system. In statistical mechanics, it would just be equilibrium of some classical system. But there's one interesting fact that I want to draw your attention to. Usually you think of this problem as follows: because the density matrix is given in terms of the Hamiltonian, you could just ask for the Schwinger-Keldysh contour, which computes response — let me call it 1-OTO correlators, staying true to our notation from yesterday. You can think of that contour as starting out at some fiducial time, exploring the real-time segments, and coming back, as it was doing yesterday. But now, to account for this e^{−βH}, I just draw a vertical segment of length roughly iβ. If the two real-time legs are separated by ε, then this segment is really i(β − ε), but I'm imagining that ε is infinitesimal, so this excursion in imaginary time is iβ. And this fact is special to this density matrix, because the thing that does the evolution, the Hamiltonian, is the same thing that appears in the density matrix. So this is a stationary density matrix: its Heisenberg evolution doesn't change it; it's time-independent by construction. Now, usually what one tends to do is view the temperature as constant throughout space.
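In symbols, the conventions just described look as follows (a transcription sketch, using the lecturer's choice of leaving the density matrix unnormalized):

```latex
\rho_\beta \;=\; e^{-\beta H}, \qquad
\operatorname{Tr}\rho_\beta \;=\; Z(\beta), \qquad
\langle \mathcal{O}\rangle_\beta \;=\; \frac{1}{Z(\beta)}\,
\operatorname{Tr}\!\left(e^{-\beta H}\,\mathcal{O}\right).
```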
There is no time dependence, because this density matrix is stationary: dρ_β/dt = 0 by the Heisenberg equation of motion. But you could do something slightly non-trivial, and I want to exploit this in what I'm going to say later. So let me tell you this useful fact. I can ask the question: what is the most general thermal equilibrium configuration? And by this I mean that I'm going to allow myself to turn on background sources, which will allow me to change the temperature, if you like, from place to place in space, but which keep the density matrix stationary. So usually we consider homogeneous configurations where the temperature is constant in space. With no loss of generality, you can choose the temperature to vary as a function of x. But more importantly, you can do something more. When I say you have a temperature in a Lorentzian-signature spacetime — we're doing relativistic quantum field theory today, after all — I mean that you go to a point, go to a local inertial frame, and ask: what is the temperature? And in which local inertial frame are you measuring the temperature? It basically comes down to the question of what you mean by your direction of time at that point. But I could make my local inertial frame vary as I go along in space. There's nothing preventing me from doing that: as long as there's no explicit time dependence, the density matrix is still stationary. So the most general stationary configuration is one where I allow the temperature to vary as a function of x and also rotate the inertial frame in which I'm measuring this temperature. And this is very easy to encode: it simply amounts to turning on a background gravitational field, a spatial gravitational field. And we all know this, but we don't tend to use it. The atmosphere is in equilibrium, but the atmosphere is not in equilibrium at a constant temperature: the temperature varies as we go up through the atmosphere.
You've worked this out in your standard undergraduate thermodynamics course, but you can do this not just in the altitude, but across space. And that's the most general configuration. So I could just declare that this is quantum field theory in some background geometry with a timelike Killing field — and I need this timelike Killing field to ensure that it's in equilibrium. So I'll call this timelike Killing field K^μ. And without loss of generality, you can just parameterize this by some time coordinate t if you like. And then the geometry of the spacetime on which your quantum field theory lives simply takes the Kaluza-Klein-like form, with an index m here to indicate a spatial index. I have written down a Lorentzian-signature metric here. The main difference is that this time coordinate twists around as you go along in space: it's a stationary Killing field, not a static Killing field. And the spatial dependence sits in the a's, σ's, and γ's. The local temperature is set by this prefactor σ, but the local inertial-frame choice cares about this a_m. And the reason to do this is that later, when we want to talk about various configurations, you'll see that it's very useful to use this as a tool to extract how the quantum field theory responds to these kinds of geometries. If I put a quantum field theory on a curved space or curved spacetime, the response is measured by asking: what is the energy-momentum tensor of the field theory when it's subjected to this background? Of course, all the field configurations will sort of align themselves to be in equilibrium, but they'll have to align themselves subject to this nonlinear source. And that will allow us to extract the nonlinear response — the energy-momentum tensor as a functional of the source. We'll later generalize by allowing ourselves to consider source backgrounds which are not just space-dependent but also time-dependent.
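A hedged way to write the stationary metric being described — the names σ, a_m, γ_mn follow the lecture's usage, but the precise sign and factor conventions here are my reconstruction:

```latex
ds^2 \;=\; -\,e^{2\sigma(x)}\bigl(dt + a_m(x)\,dx^m\bigr)^2
\;+\; \gamma_{mn}(x)\,dx^m\,dx^n,
\qquad K^\mu\partial_\mu = \partial_t,\qquad \mathcal{L}_K\,g_{\mu\nu}=0,
```

with the local (Tolman-type) temperature set by the prefactor, roughly T(x) ∼ e^{−σ(x)}/β, and the local choice of inertial frame encoded in the twist a_m.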
But for now, we're just talking about equilibrium, so I'm going to keep only space dependence. For the Euclidean geometry of this, because ∂_t is a Killing field, you can imagine looking at the spatial sections. On the spatial sections, you can think of the Euclidean analog as follows. You have the spatial geometry Σ, which is the restriction of this manifold — let's call the manifold M — to a constant-time slice. You can imagine that there is a circle fibration, a Euclidean thermal circle of local period β(x^m), which comes with a local twist given by a_m(x^m). In pictures, one usually talks about thermal equilibrium as if you have some space and a cylinder over it, with a circle of period β that is homogeneous everywhere in space. But I'm advocating a more complicated setup, where you have a fibration structure with the thermal circle changing its shape and size as you go along. The statement of equilibrium is simply that if you take this Euclidean geometry, which is the analytic continuation of this guy, and slice it on any section, nothing changes. You can pick any section of this geometry, and the physics is invariant, because it was time-independent to begin with. So that's the background, and I'll come back to this in a second. But for now, let's ask: given this density matrix, what does it imply for correlation functions? Because that's all we were talking about yesterday. So I have this contour — and you can generalize much of what I'm saying to the k-OTO contours; for a k-OTO contour, you again draw this back and forth, and then eventually, once you've gone down all the way, you go down another iβ. So there's a useful fact about quantum field theory correlators in the thermal state. I have to say one thing: they are analytic in imaginary time in a domain of width β — a strip of width β in the lower half-plane. And it's actually easy to see this.
And again, we can play the games we were playing yesterday to see this, but I'll write one identity and tell you how to see the remaining identities by just playing those same games. So the usual statement you would see is that this implies what's called the Kubo-Martin-Schwinger condition, or the KMS condition, as it's commonly known: the correlator with one operator inserted at the shifted imaginary time t − iβ and the other at 0 equals the correlator with the operators in the opposite order — and to be consistent, I have to put hats here, per my notation from yesterday. Here is a very easy way to intuit this. Let's say t is positive. So the time-ordered correlator has A here, let's say, and B here. Or say we're talking about the other guy, the anti-time-ordered correlator: so A is here, and B should be here. You can imagine playing a simple trick of sliding A through ρ_β, which conjugates it to t − iβ. It's just cyclically rearranging these guys, and you can keep going by seeing how it works on this contour — moving operators around as long as they're unobstructed — to get these kinds of relations. Because Â(t − iβ) is simply e^{βH} Â(t) e^{−βH}, which of course is nothing but ρ_β^{-1} Â(t) ρ_β. So the statement is that, given a correlator in thermal equilibrium, because the thermal density matrix does evolution in imaginary time, you can conjugate operators through — and keeping true to this statement of analyticity requires you to conjugate them this way, with ρ_β^{-1} in front. Conjugating up in imaginary time would have been ρ_β Â ρ_β^{-1}, but that runs into trouble with this domain of analyticity. So I'll consistently only move my operators down in imaginary time when I apply the KMS rules; I won't move them up in imaginary time. This is not necessarily what you would find, for example, in the literature.
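The KMS relation just stated can be checked directly in a finite-dimensional toy model (a sketch, not the field-theory statement): with Â(t) = e^{iHt} Â e^{−iHt} continued to complex time, one verifies Tr(ρ_β Â(t − iβ) B̂) = Tr(ρ_β B̂ Â(t)) for a random Hermitian Hamiltonian.

```python
import numpy as np

def evolve(H, A, z):
    """Heisenberg evolution A(z) = e^{iHz} A e^{-iHz} at complex time z,
    computed in the eigenbasis of the Hermitian matrix H."""
    w, V = np.linalg.eigh(H)
    U = V @ np.diag(np.exp(1j * w * z)) @ V.conj().T
    Uinv = V @ np.diag(np.exp(-1j * w * z)) @ V.conj().T
    return U @ A @ Uinv

rng = np.random.default_rng(0)
dim, beta, t = 4, 0.7, 1.3
M = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
H = (M + M.conj().T) / 2                                # random Hermitian Hamiltonian
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
B = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))

w, V = np.linalg.eigh(H)
rho = V @ np.diag(np.exp(-beta * w)) @ V.conj().T       # unnormalized rho_beta = e^{-beta H}

lhs = np.trace(rho @ evolve(H, A, t - 1j * beta) @ B)   # Tr(rho A(t - i beta) B)
rhs = np.trace(rho @ B @ evolve(H, A, t))               # Tr(rho B A(t))
assert abs(lhs - rhs) < 1e-8 * (1 + abs(rhs))
```

The check works for the reason given in the lecture: A(t − iβ) = ρ_β^{-1} A(t) ρ_β, so cycling the trace moves A past the density matrix.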
In much of the stat-mech literature, you also see conjugation by ρ_β^{1/2}, which moves Â(t) to Â(t + iβ/2). But I won't go through that; I'll tell you something about it in a second. Now let me use what we had yesterday and, again motivated by trying to go down to the low-energy theory, try to recast this in a language that's amenable to a general treatment. Again, these statements are true if you know what your density matrix is, if you know what your microscopic description is, but eventually I want to unanchor myself from this microscopic description and ask what's happening in the low-energy theory. So I start here and tell you that there is the following set of statements. This KMS condition — I've only given it to you for the two-point function, but you can derive it for higher-point functions — implies a set of thermal sum rules, among which is the following: the Schwinger-Keldysh time-ordered correlator of the following set of operators also vanishes, where I insert into the Schwinger-Keldysh contour some right operators, and I insert a left operator — put a tilde on it — not at real time but at a shifted imaginary time. Yesterday, I gave you a very similar identity for difference operators; the key was that both operators, the right and the left, sat at the same time location. In the thermal state, there's another identity, where the left operator can be conjugated down to t − iβ. Oh, m labels these operators, sorry — it's a product of operators of this kind, thank you. Let me come up with a notation which is useful. Let me call Õ_L just the operator O_L at t − iβ, which of course is obtained by this conjugation, and let me denote, for future reference, this guy as the action of some differential operator — which I have to define for you — on O_L(t). So δ_β is just a mnemonic for doing this conjugation.
The conjugation, of course, if you write it out, involves some Baker-Campbell-Hausdorff: you expand out the exponentials and you get nested commutators. You can think of those nested commutators as some kind of derivative action, and I just call that derivative action minus δ_β. I want to write it this way because later on I'll make some simplifications with these kinds of operators. There's also one more fact. Just as yesterday we saw, in the Schwinger-Keldysh theory, a largest-time equation — a statement which said that you can't put difference operators out here — there's a corresponding smallest-time equation, which says that you can't put such operators, the difference of a right minus a left tilde operator, closest to the density matrix. So these are just correlation identities generalizing what we had yesterday, and these, I assert, simply follow as a consequence of the simplicity of this density matrix. So, a couple of comments, especially in regard to this, and an operation that I'm going to define that will be helpful for me in a second. OK, so in the literature you find that these identities are also derived by demanding that you map O_R(t) and O_L(t), under this KMS conjugation, to O_L(t − iβ/2) and O_R(t + iβ/2). This is discussed in the literature — most often in the cond-mat literature, although some of the other works trying to understand hydrodynamics, especially Hong Liu and friends, have also been using a conjugation very similar to this. This has the advantage that it's an involution: if you do it twice, you get back the same thing. O_R goes to O_L(t − iβ/2), and O_L goes to O_R(t + iβ/2), so doing it twice squares to one. So this is a Z₂ involution. But it suffers from this problem of wanting to go up in imaginary time.
So if you know something about the analytic structure of your correlators, that's fine. But I want to be agnostic, and I want to generalize to situations where I have genuine time dependence. So again, I don't want to do this. The mapping I'm interested in — the one which allows me to derive these identities trivially from what I had yesterday — is the mapping which involves doing this plus an imaginary-time shift. So you can think of O_R going to O_L(t − iβ), and O_L going to O_R(t). And now, given what I told you yesterday about the difference-operator correlators vanishing, this operation acting on that identity will give me these identities up to a sign — but it's zero, so I don't care about the sign. A related point, which again I want to emphasize, is that our correlators are computed in the Schwinger-Keldysh formalism, where the generating functional is this guy, and not in the thermofield-double configuration that is also commonly used in the literature — which, in fact, inspires the top map, the involution mapping. There, you take the generating functional of correlators by splitting the density matrix in two and sticking the halves across these guys. The contour would be different: it starts out with a ρ_β^{1/2} here, then goes back, and then has the other ρ_β^{1/2} here. And now you see what the problem is with this formalism: unitarity is no longer manifest unless you tell me something more. Operators inserted here are blocked by this density matrix. So unless you give me some information about analytic features of the density matrix and about passing operators through it, I can't immediately conclude those Ward identities. So working with this structure is much more amenable to seeing quickly what unitarity implies. It implies constraints of that kind, but they don't come out so obviously in the other formalism.
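Side by side, the two conjugations being contrasted are, schematically (a sketch; the completion O_L(t) ↦ O_R(t) in the second map follows from composing the half-shift involution with an overall downward shift by iβ/2, and conventions vary between references):

```latex
\text{involution (cond-mat):}\quad
\mathcal{O}_R(t)\mapsto \mathcal{O}_L\!\bigl(t-\tfrac{i\beta}{2}\bigr),\quad
\mathcal{O}_L(t)\mapsto \mathcal{O}_R\!\bigl(t+\tfrac{i\beta}{2}\bigr);
\qquad
\text{full shift (used here):}\quad
\mathcal{O}_R(t)\mapsto \mathcal{O}_L(t-i\beta),\quad
\mathcal{O}_L(t)\mapsto \mathcal{O}_R(t).
```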
So this is very commonly used because it's sort of what motivates various discussions in the context of black holes and so on and so forth. So those of you who are familiar with holography — or even with pre-holographic work by Israel and so on — would talk about this contour and not this contour. The one we want to use is this guy. — Yeah, from this transformation? — Oh yeah, well, there isn't very much of an assumption; it's basically this conjugation. That's right: the way it's derived, and the way it's derived in the literature, is just using the KMS condition and the fact that you can conjugate operators through. Yeah, I mean, just as I was saying yesterday: if you draw the contour here, you can slide operators around and get the identities. Now you slide them around — OK, the difference from yesterday is that our contour was open; now it's closed. Think of this as a closed contour, but this leg is really an imaginary-time leg: you come out in real time, you go down, and then you close it in the imaginary part of the time plane. And now it's a closed abacus, so you just slide beads around, and every time you go around — sliding in one particular direction, which is counterclockwise in my orientation — you pick up this factor. That's it. — Oh, sorry, the dagger is here. Well, it's on the whole operator. — OK, so now I want to do the same thing I did yesterday: try to encode these relations in a useful form that will let me do the low-energy physics. So let me do that. Again, I won't go through the derivation in any great detail; let me make some assertions, tell you that they are useful, and then explain why I need them for what I'm going to do next. All right, so we'd like to understand the KMS statement as some operator-algebra statement.
To this end, let me define — this will be useful in a second — the operator which is just one minus this conjugation, such that δ_β acting on, say, O_L is basically O_L − Õ_L in my notation. So I'm not doing very much, but basically what I want to say is that in exact thermal equilibrium there should be some moral sense in which, if I put O_L − Õ_L into a correlator, nothing happens: the correlator with O_L and the correlator with Õ_L are equivalent. So I want to use this as a proxy to measure deviations from equilibrium. There's a useful geometric way to think about it in this picture — which is why I brought this picture up, rather than just sticking to the simple one. Now, the way I've been writing operators in quantum field theory, I've been agnostic about whether they carry any indices. I've just been writing O, but you could have not just scalar operators: you could have tensor operators, you could have spinors, you could have various kinds of things. If I have spinors, one important thing is that I have to put in a (−1)^F here, to account for the fact that spinor operators are anti-periodic, not periodic, when they go around the thermal circle. Bosonic operators are all periodic — but how I take them around the thermal circle, when the thermal circle is not a direct product like here but a fibration, cares about the index structure of the operator. So if I have an operator on this slice and I ask you what the operator is up here, on some other section of this geometry, you have to, roughly speaking, parallel transport it up that fiber, OK? So in some sense, this operation, which compares operators after a whole thermal translation around this fibration, is best thought of as a Lie-derivation operation.
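Collecting the definitions around δ_β in one place (a sketch in the lecture's notation; the sum is the Baker-Campbell-Hausdorff expansion of the conjugation into nested commutators mentioned earlier):

```latex
\tilde{\mathcal{O}}_L(t) \;\equiv\; \mathcal{O}_L(t-i\beta)
\;=\; \rho_\beta^{-1}\,\mathcal{O}_L(t)\,\rho_\beta
\;=\; \sum_{n=0}^{\infty}\frac{\beta^n}{n!}\,
[\,H,[\,H,\cdots[\,H,\mathcal{O}_L(t)\,]\cdots]\,]\quad (n\ \text{commutators}),
\qquad
\delta_\beta\,\mathcal{O}_L \;\equiv\; \mathcal{O}_L - \tilde{\mathcal{O}}_L .
```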
So, think of all your quantum fields as tensors living on this geometry. They live on this constant-time slice. And given fields on this constant-time slice, you can Lie drag them around the circle and bring them back after one period; then you can compare them, and this operation does exactly that. So think of this as some kind of Lie-drag operation associated with the KMS conjugation — and in quantum language, it's a commutator action, because that's how operators acting on operators work. OK, now I could give you two independent arguments for what I'm going to say next, but let me give you the simpler one and then show you where it comes from. For a simple bosonic operator, we have this Lie derivation that allows me to compare a quantum operator and its thermal conjugate after taking it once around the thermal circle. But yesterday, in the Schwinger-Keldysh formalism, I convinced you that the Schwinger-Keldysh formalism comes with — is best viewed as having — a redundancy, which I encoded in this BRST symmetry. So there were Q_SK and Q̄_SK, these nilpotent generators which ensured that the Schwinger-Keldysh identities were true, and this put all operators into a single superfield, which yesterday I wrote as follows. The key point here was that every physical operator had a fourfold realization: there was the operator, there was its ghost conjugate, there was its anti-ghost conjugate, and then there was a difference operator. The single-copy theory got embedded into a quartet of operators. So there was a mapping from Ô, which acts on the Hilbert space — a canonical embedding of this into this structure. You can think of saying that the operator algebra of your theory has been upgraded to an operator superalgebra, where the superspace is just indexed by two Grassmann directions. OK, why do I say all these things? The reason I say them is that I now have a bosonic operation that acts on operators.
I can certainly ask: how does it know about, and how does it work with, Q_SK and Q̄_SK? And the first thing to realize is that it cannot live by itself. L_KMS is Grassmann-even, bosonic, while these guys, Q_SK and Q̄_SK, are Grassmann-odd — let's say fermionic — so putting these together will give me new fermionic operators. A better way to say it: you can't have an isolated L_KMS; it also has to be part of such a quartet, otherwise the structure of the algebra doesn't work. So what you learn from this simple algebraic way of thinking is that you must have a quartet of operations that let me ask how things move around the thermal circle — a decomposition of L_KMS into pieces, which I'm going to call Q_KMS (because there'll be a Grassmann-odd guy), Q̄_KMS (its conjugate), and another Grassmann-even guy, Q⁰_KMS. And you can think of these guys as part of a structure of the form I was writing yesterday. I put some signs here, but the rest of the structure is exactly the same. In particular, the direction of the arrows given by Q_SK and Q̄_SK is isomorphic to yesterday's. The point is that L_KMS is like the top component, the bottom component is some other bosonic guy, which I've called Q⁰ here, and the middle components are the two fermionic guys, Q_KMS and Q̄_KMS. So that's the easiest way to argue for this structure. I'll write the commutation relations in a second, but there's another way to argue for it, by asking the same question we asked yesterday. We said: look, we have these identities involving difference operators, so they gave us some kind of BRST symmetry. Now we have identities involving differences with the tilde operators. There must be new BRST generators that guarantee that this happens — and those generators are really this Q_KMS and Q̄_KMS.
So these operators, O_R minus Õ_L, are not Q_SK- and Q̄_SK-exact but Q_KMS- and Q̄_KMS-exact. So if you put those two things together, you would again recover this algebra. But let me write down that statement, and then write down the commutation relations for it. I put this in quotes because you'll see why in a second. I think I'm missing some factors of i for consistency. So that's the structure that guarantees these identities — and you can either use that to argue for this Q_KMS and Q̄_KMS, or you can use the fact that there is this bosonic generator and it has to fill out this quartet. Completely equivalent arguments. So the statement I want you to take away from this exercise is that, at the level of the microscopic theory, just thinking of thermal density matrices implementing the Schwinger-Keldysh path integral, you have a sequence of identities which are no longer accidents, but rather consequences of the existence of a large algebra that acts on the theory. It acts on any theory: it only requires that you have some set of operators and that you are in a thermal state. And you can guarantee the whole set of relations you want by asking that you work not with the doubled theory of left and right, but with a quadrupled theory involving these ghost operators, and on this structure there's the action of a bunch of generators: six generators, four of which are Grassmann-odd and two of which are Grassmann-even. So this algebra — I'm going to give it a name, because it comes from Schwinger-Keldysh and the KMS condition — I'll call it the SK-KMS superalgebra. I'm going to try to use it later to constrain the dynamics of the theory. So: all these supercharges are nilpotent, and the only non-trivial commutators are these, which should remind almost all of you of very basic supersymmetric quantum mechanics, where you have a Q and a Q̄ which commute to the Hamiltonian.
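A schematic rendering of the SK-KMS superalgebra being described — treat this as a mnemonic rather than the definitive algebra, since the signs, factors of i, and the precise pairing of charges depend on conventions not fixed in the lecture:

```latex
Q_{SK}^2=\bar{Q}_{SK}^2=Q_{KMS}^2=\bar{Q}_{KMS}^2=0,
\qquad
\{Q_{SK},\bar{Q}_{KMS}\}\;\sim\;\{\bar{Q}_{SK},Q_{KMS}\}\;\sim\;\mathcal{L}_{KMS},
```

with the remaining (anti)commutators vanishing. This mirrors the supersymmetric quantum mechanics relation {Q, Q̄} = H, except that the right-hand side generates translation around the Euclidean thermal circle rather than real-time translation.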
The main difference is that the thing that appears on the right-hand side is not the Hamiltonian that does time translations in real time; it's a Hamiltonian that does imaginary-time translations along a very particular Euclidean thermal circle. This has consequences; I'll come back to them later. Questions? OK, so I'm sure this algebra is not familiar to most people. I'm going to simplify it and tell you some more consequences next time, but I want to give you some intuition for it. So let me remind you of a simple fact that everybody knows, and interpret it in a language reminiscent of that structure, which will also allow me next time to quickly generalize. How many people know some differential geometry? So say you have a manifold — a Riemannian space, for now. On this manifold, I can talk about the tangent space, but actually what I want to talk about are structures that are topological. So let me not talk about vectors; let me talk about differential forms on this manifold — antisymmetric tensor fields. So there's a space of differential forms. Let's say this is an n-dimensional manifold; then the differential forms range from scalars up to the top form, which has degree n. So that gives me a structure where I can put together differential forms of varying degrees, and the differential forms of each degree form a vector space. Let me call the vector space of degree-k forms Ω^k. So this is a linear tensor product of vector spaces — sorry, it's a direct sum. On this direct sum, I have one natural operation: the exterior derivative, a very basic object, which takes a k-form to a (k+1)-form. It takes the vector potential in electrodynamics to its field strength. Now imagine that you also equip yourself with vectors on this manifold. Associated with a vector, I can do two operations.
I can take the vector — it has one index — and contract its index with the first index of my differential form. That's an operation I can do, and it's natural if you think of the vector as giving you some direction of motion, and you want to look at the projection of your differential form onto that vector. So this takes a differential form of degree k and maps it to a differential form of degree k − 1: it just contracts away one index, making the form one rank lower. But then, since I have a vector, without putting in any metric, I can also take my form and drag it along. I can parallel transport it — sorry, I can just Lie drag it; I don't parallel transport it, because parallel transport requires a metric structure. For differential forms, the Lie drag is the natural transport, and it needs no metric. If you're not sure about that, ask me later. This Lie drag takes a differential form here and moves it somewhere there; it doesn't change its degree. Three very simple operations. Here are some interesting identities: d squared is zero; i_ξ squared is zero, because you can't contract the same vector with two antisymmetric indices of a differential form. But if you anticommute d and i_ξ — lo and behold — you get the Lie derivative. That's also a statement that everybody knows; we use it, but we don't tend to think about it in this language. And you can work out what the remaining relations are. They look like a baby version of this structure. These three operators can be interpreted as super-operators: in fact, d is Grassmann-odd, i_ξ is Grassmann-odd, and the Lie derivative is Grassmann-even. And the structure up there is exactly the same thing with two derivations, a d and a d̄ — just a generalization of this to something slightly bigger. And the Lie derivative there is L_KMS.
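Cartan's magic formula, d i_ξ + i_ξ d = L_ξ, can be checked symbolically for a 1-form in two dimensions (a sketch; the component formula (L_ξ ω)_i = ξ^j ∂_j ω_i + ω_j ∂_i ξ^j is the standard coordinate expression):

```python
import sympy as sp

x, y = sp.symbols('x y')
coords = (x, y)

# An arbitrary smooth 1-form omega = f dx + g dy, and a vector field xi
f, g = x**2 * y, sp.sin(x) + y**3
omega = [f, g]
xi = [y**2, x * y]

# Lie derivative components: (L_xi omega)_i = xi^j d_j omega_i + omega_j d_i xi^j
lie = [sum(xi[j] * sp.diff(omega[i], coords[j]) +
           omega[j] * sp.diff(xi[j], coords[i]) for j in range(2))
       for i in range(2)]

# i_xi omega is a scalar; d(i_xi omega) has components d_i (xi^j omega_j)
contraction = sum(xi[j] * omega[j] for j in range(2))
d_contraction = [sp.diff(contraction, c) for c in coords]

# d omega = (d_x g - d_y f) dx^dy; interior contraction (i_xi d omega)_i = xi^j (d omega)_{ji}
curl = sp.diff(g, x) - sp.diff(f, y)          # coefficient of dx ^ dy
i_xi_domega = [-xi[1] * curl, xi[0] * curl]   # uses (d omega)_{yx} = -curl

# Cartan's magic formula: L_xi = d i_xi + i_xi d, component by component
for i in range(2):
    assert sp.simplify(lie[i] - d_contraction[i] - i_xi_domega[i]) == 0
```

The identity holds for any choice of ω and ξ; the particular f, g, ξ above are arbitrary test functions.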
The point is that once you have two differential operators, you no longer have one interior contraction; you have three. So Q_KMS, Q̄_KMS, and Q⁰_KMS are the interior contractions — they are the analogs of i_ξ. The physical picture here is that we had this thermal-circle fibration on our space. Lie dragging things along the fibers was this operation L_KMS. The interior contractions are just projections of various objects — in Grassmann-odd and Grassmann-even fashion — onto those fiber directions. Thermal field theory secretly knows about supergeometry. OK, so I have five minutes; let me pause for questions and then tell you a few facts about where I'm going next. — Yeah, what is the vector field ξ in this case? — So the vector field ξ in this case is going to be — let me generalize this slightly. Here, you can think of it as follows: suppose there was a fibration structure on M; then ξ points along those fibers. The usual context in which this is discussed is that we have a manifold with some group action on it. So I take this manifold and act on it with some group. All the group does is permute points in the manifold: it acts by diffeomorphisms, taking a point and moving it along. That's the context in which this is usually discussed. In our picture, ξ is going to be an infinitesimal vector field along the fibers of our thermal fibration. Because we basically said that we want some kind of invariance under taking the thermal fibration and slicing it in different directions. So the analog of M is going to be Σ, the spatial sections — but I'm going to upgrade a little bit, because I'm not going to do just equilibrium; I'm going to do hydrodynamics, and I'll describe what hydrodynamics is later. You can think of the analog of M as the Lorentzian spacetime where the quantum field theory lives, and the fibers are basically thermal fibers. So ξ is going to correspond to the thermal directions.
Yes — so ξ is basically going to be a measure of the local temperature: a vector field that tells you both what the local inertial direction is and what the local temperature is. Yeah, so it's related to the same physical picture we drew before, but it's going to show up in slightly different language. Any other questions? — Thank you, yes. — So, I was going to postpone that discussion till next time, but let me say what the question was. I motivated everything by the KMS condition, where I was doing discrete translations along a fixed-size circle of period β. So this Lie derivation is actually like a winding operation; it's not an infinitesimal operation. It will be infinitesimal in the context I will discuss, where the circle is going to be infinitesimally small. Yeah, so I'm going to simplify to that context. The full story, when the circle is finite-size, is, I think, a deformation of this algebra, which I can tell you about, but I'm going to simplify this algebra precisely in that context tomorrow. — So the question was: what information does the cohomology of Q_KMS give you? It doesn't give you anything, which is why I put this thing in quotes. See, here the cohomology operation is on d, not on i_ξ, right? The Q_KMS guys are like the interior contraction, so you don't ask for the cohomology there. OK, I should say the following. What you usually want to do here is imagine some group action on M, and you want what's usually called the equivariant cohomology for that action. One easy way to describe equivariant cohomology is to talk about the principal bundle for the group, and to think of differential forms on that abstract space which are annihilated by i_ξ. OK — but that action on the manifold is simply interior contraction.
Same thing here: the Q's are used to define what are, in that language, called basic forms, but the things whose cohomology we are interested in are still Q_SK and Q̄_SK, because they are the ones that give you all the relations, not Q_KMS and Q̄_KMS. I wasn't going to do this, but the right way to think of this is to do another redefinition of this algebra and talk about the Cartan charges associated with linear combinations of Q_SK and Q̄_KMS, and then think of this as a gauge theory — which is essentially what I'm going to do next time. Sorry, that was too fast for other people, but it's not going to be relevant for what I'm going to do, so — any other questions? I think I'm done for the day.