I still have to explain properly what a phase is, and I hope to get to that today. I should also say that it is not a completely established thing: what I will give is my preferred definition, not a generally accepted one. Let me first remind you of a brief summary of the second lecture. We started with a fairly standard setup: an algebra of observables of a lattice system, which is a uniformly hyperfinite C*-algebra. The slightly non-trivial point was that inside this C*-algebra we singled out a special dense subalgebra, in fact a Fréchet algebra, of almost local observables, and that subalgebra carries all the information about the metric structure of the space you are working on. The C*-algebra by itself is defined on the lattice, and the Euclidean metric on the lattice makes no difference to it; but it does make a difference for the almost local subalgebra.
From this data one can construct a natural algebraic object. At the lowest level it is a differential graded vector space: there is a grading by n from 0 to infinity, and a differential of degree minus one, so it lowers the degree; there is also a bracket, which makes it into a differential graded Lie algebra. In fact the whole thing is a Fréchet vector space with a nice topology. I will just write it as C with a lower dot, suppressing the differential and the bracket in the notation. So it is a natural algebraic object associated with this pair of algebras. Its degree-0 part is simply a Lie algebra — that is true for the degree-0 part of any differential graded Lie algebra — but here it has a physical meaning: it consists of infinitesimal generators, Hamiltonians, which act on almost local observables by derivations; these are the nice short-range Hamiltonians. The component C_1 has the meaning of a density of a Hamiltonian, and so forth. A question from the audience: what is this notation — and how does it depend on G? For now it is just notation; I did not put any decoration on the C, because I also want the freedom to talk about other, similar objects later.
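In symbols, the object I have in mind can be sketched as follows (my own notation; the precise decay conditions and sign conventions were spelled out in the previous lecture and are schematic here):

```latex
% A sketch of the differential graded Lie algebra C_\bullet.
\[
  C_\bullet \;=\; \bigoplus_{n \ge 0} C_n, \qquad
  \partial : C_n \to C_{n-1}, \quad \partial^2 = 0,
\]
\[
  [\,\cdot\,,\,\cdot\,] : C_m \otimes C_n \to C_{m+n}, \qquad
  \partial [a, b] \;=\; [\partial a, b] \pm [a, \partial b].
\]
% An element of C_1 is a formal sum over lattice sites,
\[
  F \;=\; \sum_{p \in \Lambda} F_p ,
\]
% with each F_p almost local near the site p; C_0 consists of the
% corresponding derivations, i.e. short-range Hamiltonians.
```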
I just meant it as notation. C_0, say, stands for the anti-self-adjoint, traceless subspace of the derivations, and the way I think about it is that an element of C_1 — call it F — is a formal sum of such elements over my lattice, with each term of that almost local sort. Later there will be similar objects sitting in other vector spaces, and I will use the same kind of notation for them; so that is the motivation. Anyway, so far this is a pretty straightforward kinematic thing. But suppose now you fix a state. For any state ω you can define a sub-differential-graded Lie algebra: in degree zero it consists of all the symmetries of the state — all the derivations preserving ω — and in degree one, all the densities which preserve the state pointwise on the lattice. The key point is this: you can do it for any state, but if ω is gapped, then the homology of this differential graded Lie algebra vanishes — that is where the gap condition enters. And this is the key object which allows you to construct various numerical invariants of states. Here "gapped" means there is some Hamiltonian with rapidly decaying interactions — one of those elements of degree zero — such that ω is its gapped ground state. Let me now state the construction slightly differently; there are really two versions of the same construction. Version one is when you have some
homomorphism, a symmetry of the state: some Lie group G acts on the almost local observables while preserving ω, and the infinitesimal generator of this action is a homomorphism from the Lie algebra of G into those nice derivations — the charge is one of those elements. That is how an infinitesimal action is specified. Then you can write a Maurer–Cartan-type equation, if you wish, which in components becomes a system you can solve step by step. So here is the little machine. Introduce q, a sum of components q^(n); I already discussed the U(1) case, and one can also do it in general, but let me stick with the special case G = U(1). Then q^(n) is an element of C_n, and only the odd n appear. We get equations for the components of q which can be solved one by one because of the exactness: first you find a charge density for your charge, then you pass to the commutator and solve for the next component, and so on. In addition we impose the invariance condition on q: not only is the subcomplex of state-preserving elements acyclic, but its version consisting of symmetry-invariant elements is acyclic too. So you solve the first equation while making sure your q^(1) lives in this subcomplex, then solve for q^(3) from the next equation — you can easily check that its left-hand side is closed, because of the previous one — and so forth, on to infinity. So the net
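Schematically, the system of equations for the U(1) case looks something like this (this is my reconstruction of the blackboard, not a verbatim record; signs and normalizations should not be trusted):

```latex
% Descent equations for G = U(1), solved order by order using the
% acyclicity of the state-preserving, symmetry-invariant subcomplex.
\[
  \partial q^{(1)} = Q
  \quad\text{(} q^{(1)} \text{ is a charge density for the charge } Q\text{)},
\]
\[
  \partial q^{(3)} \;=\; \big[q^{(1)}, q^{(1)}\big], \qquad
  \partial q^{(5)} \;=\; \big[q^{(1)}, q^{(3)}\big], \qquad \dots
\]
% At each step the right-hand side is \partial-closed, thanks to the
% previous equation, the Jacobi identity, and the invariance of the
% q's under the symmetry; acyclicity then provides a solution.
```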
result is that the only thing you actually care about is the class in degree d+1. It is operator-valued, but you can take its average in the state. Where does it live? The component lives in C_{d+1} — a thing with many indices — but after taking the average it is no longer operator-valued, just numbers. It is what I call a numerical chain. The definition: a numerical n-chain is a function which sends a collection of n sites to, say, a real number, is skew-symmetric, and decays faster than any power of the distance between the sites — a rapidly decaying function. It is called a chain because summation over the first index defines a map from n-chains to (n−1)-chains. The point is that the averaged component is not just a chain but a cycle, and you can show that the homology class of this cycle is independent of the various choices made when solving the equations. So this machine assigns, to any action of a Lie group on your lattice system which preserves the state, an element of the homology of this numerical complex.
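To make the numerical chain complex concrete, here is a minimal illustration of my own (not from the lecture): I drop the infinite lattice and the rapid-decay condition and just work with skew-symmetric chains on a finite patch of sites, where the boundary map is literally summation over the first argument, and I check that the boundary of a boundary vanishes.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 16  # number of sites in a small finite patch of the lattice

# Build a skew-symmetric numerical 3-chain f(p, q, r) by
# antisymmetrizing a random tensor over all permutations of its
# three site arguments (signs are the permutation parities).
A = rng.standard_normal((N, N, N))
f = (A
     - np.transpose(A, (0, 2, 1))
     - np.transpose(A, (1, 0, 2))
     + np.transpose(A, (1, 2, 0))
     + np.transpose(A, (2, 0, 1))
     - np.transpose(A, (2, 1, 0))) / 6.0

def boundary(chain):
    """(∂f)(p_2, ..., p_n) = sum over p_1 of f(p_1, p_2, ..., p_n)."""
    return chain.sum(axis=0)

df = boundary(f)     # a numerical 2-chain
ddf = boundary(df)   # a numerical 1-chain, identically zero

assert np.allclose(df, -df.T)   # ∂f is still skew-symmetric
assert np.allclose(ddf, 0.0)    # ∂∂ = 0, by skew-symmetry
print("max |∂∂f| =", np.abs(ddf).max())
```

The point of the toy model is just that skew-symmetry in the first two arguments makes the double sum cancel, which is all that ∂² = 0 amounts to here.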
Now, I do not know what this homology is in general, but there is one very natural class to pair it with, which I explained before, related to partitioning d-dimensional Euclidean space into cones. As far as I can tell it is the only interesting class, though I am not sure what that means. To get a number out, I need to contract this cycle with some cochain — integrate the homology class against a cochain — and the only way I know how to do that is the following. Say in dimension two: the averaged object is a three-chain, so decompose the plane into three regions (it does not matter exactly where the boundaries are) and evaluate ω(q^(3)) by summing over triples of sites, one in each of the three regions. That is the only cochain I know which you can contract with this cycle to get a number. And in two dimensions the result turns out to be precisely the Hall conductance at zero temperature, up to some numerical factor. In dimension four — this only works in even dimensions — you get a higher-dimensional version of the Hall conductance, meaning the coefficient of the corresponding term in the response of the system to an external field. One can do a non-abelian version too; I will not do it here, but basically you put some extra indices on these q's. So that was the summary, and we are going to do the same thing in the second version of the construction. Just to finish with this: what it shows is that the Hall conductance is an invariant of a state which is locally computable, yet does not depend on where you compute it.
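In d = 2 the pairing just described can be written schematically like this (my notation; the numerical normalization is whatever makes it exactly the Hall conductance):

```latex
% Pairing the degree-3 numerical cycle with the "three regions"
% cochain in dimension d = 2.  A, B, C are the three regions of the
% plane; the sum converges thanks to the rapid decay of the chain.
\[
  \sigma_H \;\propto\; \sum_{p \in A}\ \sum_{q \in B}\ \sum_{r \in C}
  \omega\!\big(q^{(3)}\big)(p, q, r).
\]
% Moving the boundaries of A, B, C changes the cochain by something
% exact, so (roughly speaking) the number does not change.
```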
Something I have not discussed yet: you can use this to show that the invariant is sometimes quantized, and also that it is invariant under deformations of the state — it does not change if you change the state while staying in the same phase. So now let me describe version two of this construction, the one to do with the higher Berry class. Here, instead of a group action, you imagine that you have a family of states: ψ is now a function on a parameter space M, assigning a state to each point. You require, moreover, that it is in some sense a smooth family. The more precise meaning — I will comment on it later — is that you are given a one-form G on M with values in this Lie algebra, which parallel-transports ψ between different points, in the sense that a certain equation holds, where A is any observable. What it basically means is this: take two points of M and some smooth path γ connecting them; then ψ at the endpoint γ(1) equals ψ at γ(0) composed with an automorphism, and this automorphism is, roughly speaking — I will say more precisely later what I mean — the path-ordered exponential of the pullback of this one-form to the interval. So it is a unitary evolution with some time-dependent Hamiltonian, which is also local. That is what "smooth" means in this case, and all the families I consider are like that. Of course, if you choose a different curve, you get a different automorphism. From this notion of a family you can again build a natural differential graded Lie algebra: consider the tensor product of that Lie algebra with the forms on M. There is again a Maurer–Cartan equation, which in components looks as follows.
First look at the curvature of this connection, which is a two-form: the covariant derivative d + G is not flat — F = dG + G² is not zero — but F happens to preserve the state, so it lives in the state-preserving subcomplex. It is the analog of the charge Q from before. (Sorry, I forgot to say earlier: Q too should really take values not in all derivations but in the derivations preserving the state; it is similar to this F.) Then you write similar equations: you define a collection of objects g^(1), g^(2), g^(3), ..., all of total degree one in this tensor-product complex. The equations are much like the ones above, with the differential now the covariant one, d + G — except that there the components sat only in odd degrees, so there were half as many equations, while here every single degree appears. So, for example, one of the equations here has no analog there; the first equations do match, the second does not have an analog, the third one basically does, and so forth. You can solve these equations because, again, the complex is acyclic — one can show this complex here is acyclic. Then you extract the (d+1)-component and take the average, which again lands in numerical chains of degree d+1; more precisely, you get (d+2)-forms on M with values in numerical (d+1)-chains. And that object is closed with respect to the total differential.
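Schematically, the family version of the descent equations looks something like this (again my reconstruction, not the blackboard verbatim; "total degree one" and the precise coefficients are convention-dependent):

```latex
% Descent equations for a family over M, inside C_\bullet \otimes \Omega^\bullet(M).
\[
  F \;=\; dG + \tfrac12 [G, G] \;\in\; C_0(\psi) \otimes \Omega^2(M)
  \qquad\text{(the curvature; it preserves the state)},
\]
\[
  \partial g^{(1)} = F, \qquad
  \partial g^{(2)} = d g^{(1)} + [G, g^{(1)}], \qquad
  \partial g^{(3)} = d g^{(2)} + [G, g^{(2)}] + \tfrac12 [g^{(1)}, g^{(1)}],
  \;\;\dots
\]
% with g^{(n)} \in C_n(\psi) \otimes \Omega^{n+1}(M), so every
% component has total degree one.  Acyclicity of C_\bullet(\psi) lets
% you solve order by order; the averaged (d+1)-component is then a
% (d+2)-form valued in numerical (d+1)-chains, closed with respect to
% the total differential.
```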
If you want something closed on M itself, you contract this object with the cone cochain, in the same way as before, and you get something closed. So that is the machine which produces these invariants. And one can combine the two machines into one common construction: in general, when you have a group acting on a parameter space as well as on the system, you get an element in degree d+2 of the G-equivariant cohomology of the parameter space. A question: what do you do here — integrate over some cycle in the parameter space? I am saying that if you contract in that sense, using those conical regions, you get a form — let me call it Ψ — which lives in Ω^{d+2}(M). And it is actually closed: d of it equals the pairing with something exact, which gives zero. So this is how you construct cohomology classes on the parameter space. The next questions are: first, what examples are there, apart from those I already mentioned; and second, what do they measure — what is the physical significance of these things? The most straightforward application is that they provide constraints on the phase diagram of your many-body system. There is an old story here, going back to a 1929 paper by von Neumann and Wigner, who looked at the following problem. We know that energy levels of a quantum mechanical system with a finite-dimensional Hilbert space repel each other as you vary parameters. So how many parameters in the Hamiltonian do you need to tune to actually achieve a single level crossing?
In other words, what is the generic codimension of level crossings in the parameter space? Their answer: generic level crossings occur in codimension three, so you need to tune three parameters. Which means the following. Suppose you have some phase diagram — let me draw the three-dimensional picture — and take a two-sphere in a three-dimensional parameter space, and suppose you know that somewhere in its interior there is a level crossing: two levels cross because you tuned three parameters. Now you can ask: what happens if I deform the sphere a bit, or embed this R³ into some even bigger parameter space? Well, nothing much happens: you cannot remove the crossing. The crossing locus still has codimension three, and the two-sphere still surrounds it in the bigger space. Von Neumann and Wigner did not realize it, but the reason is topological. Some fifty years later Berry understood that these level crossings have to do with the fact that the curvature of the Berry connection, integrated over the sphere, is non-zero — that is why you cannot remove the crossing. So if you measure the Berry curvature integrated over a sphere and it is non-zero, you can be sure there is a level crossing inside. In the many-body context, we have a phase diagram with d+3 parameters, so now there is an S^{d+2} sitting in R^{d+3}; and suppose your higher Berry class Ψ integrates to something non-zero over this sphere. Then there must be some point inside where the gap closes — because if it did not close anywhere, you would run into a contradiction: the sphere would be contractible within the gapped region.
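The two-level story can actually be checked numerically. Here is a minimal sketch of my own (not from the lecture): for the two-state Hamiltonian H = d·σ, with d the unit vector on a sphere surrounding the crossing at d = 0, a lattice-gauge-theory-style evaluation of the Berry flux through the sphere (in the spirit of the Fukui–Hatsugai–Suzuki method) gives a Chern number of magnitude one — the topological obstruction to removing the crossing.

```python
import numpy as np

# Pauli matrices.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def ground_state(theta, phi):
    """Lower eigenvector of H = d(theta, phi) . sigma, with |d| = 1."""
    H = (np.sin(theta) * np.cos(phi) * sx
         + np.sin(theta) * np.sin(phi) * sy
         + np.cos(theta) * sz)
    _, v = np.linalg.eigh(H)  # eigenvalues in ascending order
    return v[:, 0]

# Grid on the parameter sphere S^2 surrounding the crossing at d = 0.
N = 24
thetas = np.linspace(0.0, np.pi, N + 1)
phis = np.linspace(0.0, 2 * np.pi, N + 1)  # phis[-1] identified with phis[0]
psi = np.array([[ground_state(t, p) for p in phis] for t in thetas])

def link(a, b):
    """U(1) link variable between neighbouring ground states."""
    z = np.vdot(a, b)
    return z / abs(z)

# Berry flux = sum of plaquette field strengths (gauge invariant).
flux = 0.0
for i in range(N):
    for j in range(N):
        w = (link(psi[i, j], psi[i + 1, j])
             * link(psi[i + 1, j], psi[i + 1, j + 1])
             * link(psi[i + 1, j + 1], psi[i, j + 1])
             * link(psi[i, j + 1], psi[i, j]))
        flux += np.angle(w)

chern = flux / (2 * np.pi)
print("Chern number of the lower band:", round(chern))
```

The total flux comes out as ±2π depending on orientation conventions, i.e. |C| = 1; deforming the sphere cannot change this integer, which is exactly the statement that the crossing is protected.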
That class would then be defined on the whole ball, but the ball has no cohomology, so the integral would have to be zero. So in this case you can detect the crossing of an excited state with the ground state via the higher Berry curvature; that is the most straightforward interpretation of what it is for. There are some interesting examples. You can look at various things which are usually explained using anomalies — various protected gapless points, say free-fermion ones. These anomaly explanations are somewhat unnatural, because the systems come from lattice systems where some of the symmetries simply do not exist. Say you have a vector symmetry and an axial symmetry with a mutual anomaly — but you cannot realize both on the lattice. So what really explains the stability of such a point — the stability of a massless fermion point with respect to arbitrary perturbations, when the axial symmetry is not even a symmetry, just an accidental one, if you wish? The better statement is that what explains it is a version of this higher Berry curvature. In that case it is not the plain higher Berry class over the parameter space; it is important to also take the symmetry into account — it is one of those equivariant higher Berry classes which explains the stability of the fermionic point. There are bosonic examples too: I have a paper with Ryan Thorngren and Po-Shen Hsin which gives examples of bosonic systems where the stability of certain points can be explained by these considerations, and there are examples with fermions, with symmetries as well as without. But those works preceded the theory I am explaining now. There is also a more pedestrian approach: you can just write out formulas for these Berry classes.
Once you believe the classes are there, you can simply guess the formulas, and there is a paper with Spodyneiko which does exactly that — it just writes a formula down. But in that approach you do not see that it is an invariant of a state; you just see some expression you can write using many-body Green's functions, and it is hard to see why, for example, it does not change when you deform the Hamiltonian or deform the state. This formalism is cleaner: you do not have to guess — basically the same machine gives you all the invariants — and moreover it allows you to prove quantization, which is something we will turn to as well. Okay, so that is one interpretation of these higher Berry classes. A question: I am confused — is the level crossing the same kind of thing as the gap condition? Yes: in the von Neumann–Wigner case you could take any two levels and ask whether there is a protected crossing. In the many-body case you typically have only one special level, the ground state, and the rest of the spectrum is just a continuum; so the only thing you can study is whether the continuum ever collides with the ground state. A follow-up: but in the first case the level crossing happens as you vary parameters, while the gap is a property of a fixed Hamiltonian — how do you match them? It is the same thing: I have three parameters on which the Hamiltonian depends; here there are two levels, and as you vary the parameters I imagine all the other levels staying far from these two, so we are just studying these two levels — it is basically a two-state system. "So I thought the gap condition involved just one Hamiltonian." Here, I have two levels which collide as the parameters vary.
In the many-body case you have, if you wish, one level like that — here is my ground state — and then some continuum of excited states, which, as you vary parameters, may collide with it; say at some parameter value you get a gapless phase. The question is: can I detect this while staying away from the point where it actually happens? Everywhere on this (d+2)-dimensional sphere there is a gap; by measuring the higher Berry class on this sphere, you can detect that the gap cannot stay open everywhere inside — somewhere inside it must close. Similarly, in the two-level case, just by measuring the Berry curvature on the sphere you can detect that there must be a collision of two levels inside. Actually, von Neumann and Wigner remarked in their paper that the codimension of the gap closures depends on symmetry: they discussed anti-unitary as well as unitary symmetries, and with anti-unitary symmetries the codimension is different. So symmetries are important, and it is similar here: these higher Berry classes give you one codimension if you have no symmetries and another codimension when you have symmetries. One more thing which is different in higher dimensions: nobody said that the invariants I discuss are the only ones — they are just the ones we know how to construct. If you believe that field theory describes everything about lattice systems, you can predict what other things can happen: there are other kinds of protected level crossings, or protected collisions of the continuum with the ground state, which have no analog in the von Neumann–Wigner situation. I will talk later about these other things, which we do not really understand.
Somehow, the classes we discovered are the ones which occur in the highest possible codimension; there are supposedly other protected level crossings and collisions which occur in lower codimension. So ours are, so to speak, the most generic of the non-generic crossings, because they sit in the highest codimension. Okay, now let me discuss another interpretation of these higher Berry classes — which will also address how to interpret the equivariant ones, since the group is not really a parameter space — and to explain it, let me finally talk about phases. So let G, say, be a smooth function from an interval to the derivations. This makes sense because that space is actually a Fréchet space, so it makes sense to talk about derivatives of a function on an interval with values in a Fréchet vector space; one can ask that the derivative be continuous, differentiate again, and so on — it all makes sense. Now, what can you do with such a G? It turns out you can actually exponentiate it: there is a unique solution to the following ODE, defining a family of automorphisms α_t, depending on the parameter t, which I let act on any observable. This is the mathematician's way of writing that α_t is the time evolution with the time-dependent Hamiltonian G: if I solve the equation, I have solved for the time evolution. This presupposes that the right-hand side is well defined, and that the automorphism preserves the property of being almost local. It turns out there is always a unique solution, with, of course, α_0 equal to the identity automorphism.
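Written out, the ODE is something like this (my reconstruction; the sign and any factor of i are a matter of convention, since the elements of degree zero are taken anti-self-adjoint):

```latex
% Time evolution generated by a time-dependent local Hamiltonian G(t),
% acting on an arbitrary almost local observable A.
\[
  \frac{d}{dt}\,\alpha_t(A) \;=\; \alpha_t\big([\,G(t),\, A\,]\big),
  \qquad \alpha_0 = \mathrm{id}.
\]
% The non-trivial claims are that the solution exists, is unique, and
% that each \alpha_t maps almost local observables to almost local
% observables.
```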
The existence of a unique solution is a consequence of the Lieb–Robinson bounds — it follows pretty straightforwardly from them, or rather from a variant of them. The Lieb–Robinson bound tells you that if you take a local lattice Hamiltonian, its exponential is also local in a weakened sense: evolution in time with a local Hamiltonian preserves some vestige of locality. Specifically, if the interactions decay faster than any power, then the statement is that any finite-time evolution maps a local observable to something with tails decaying faster than any power. As a result, if you start with something which merely has such tails to begin with, the outcome still has the same kind of tails. This is true in any dimension, and there are various other versions of the statement. I already said that in the situation with a parameter space, the states at different points are related by such automorphisms; so let me give them a name. I call α a locally generated automorphism — an LGA — if it is given by one of those solutions: if an automorphism is obtained by exponentiating some local, time-dependent Hamiltonian, it is an LGA. These actually form a group — not entirely obvious, but they do — in fact a sort of infinite-dimensional Lie group, whose Lie algebra is precisely those nice derivations. So I have this notion of a nice automorphism, a locally generated automorphism: something obtained by evolving the system with a self-adjoint (in our conventions, anti-self-adjoint), possibly time-dependent Hamiltonian whose interactions decay faster than any power.
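For reference, the textbook form of the bound, stated for exponentially decaying interactions (the faster-than-any-power case needs a slightly different formulation), is something like:

```latex
% Lieb--Robinson bound: commutators outside the "light cone" are small.
\[
  \big\|\,[\,\alpha_t(A),\, B\,]\,\big\|
  \;\le\;
  C\,\|A\|\,\|B\|\;
  e^{-\mu\,\big( d(X, Y)\; -\; v\,|t| \big)},
\]
% where A and B are supported on regions X and Y, d(X, Y) is the
% distance between the supports, and the constants C, mu and the
% velocity v depend only on the interaction, not on A, B, or t.
```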
This is a very important notion because, if you stare at the picture of a smooth family, the states at different points are different, but definitely they should be regarded as states in the same phase: if you can continuously deform one state into another using these automorphisms, you must believe the two are in the same phase. And that is almost the right definition of what a gapped phase is. So here, finally, is a definition of a phase of quantum matter, for a fixed algebra — for a fixed lattice. First: let ψ and ψ′ be pure states. We say that ψ is LGA-equivalent to ψ′ if ψ′ equals the composition of ψ with some automorphism α which is an LGA. In principle it does not matter here whether ψ and ψ′ are ground states of gapped Hamiltonians, but you can easily check that if ψ is a ground state of a gapped Hamiltonian, then so is ψ′ — of some other, perhaps very complicated, Hamiltonian. So the notion at least makes sense: it plays well with the condition of being gapped. That is one notion. Second: I will say that two states are in the same phase if there exist pure factorized states — and here ψ and ψ′ do not even have to live on the same algebra: ψ on A and ψ′ on A′, where all my algebras are of the same form, infinite tensor products of matrix algebras over the sites of the same lattice, but possibly with different matrix dimensions at each site. By ψ_0 with the subscript zero I mean a state which is pure and factorized — just a product of pure states over the sites. The condition is that ψ stacked with ψ_0 is LGA-equivalent to ψ′ stacked with ψ_0′.
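Written out, the two definitions are (my transcription of the blackboard):

```latex
% LGA equivalence and the definition of a phase.
\[
  \psi \ \sim_{\mathrm{LGA}}\ \psi'
  \;\iff\;
  \psi' = \psi \circ \alpha
  \ \text{ for some locally generated automorphism } \alpha,
\]
\[
  \psi \ \text{and}\ \psi' \ \text{are in the same phase}
  \;\iff\;
  \exists\ \text{pure factorized}\ \psi_0,\ \psi_0' :\quad
  \psi \otimes \psi_0 \ \sim_{\mathrm{LGA}}\ \psi' \otimes \psi_0' .
\]
% Here psi and psi' may live on different algebras of the same kind
% (infinite tensor products of matrix algebras over the sites), and
% the ancillary states psi_0, psi_0' are products of pure states
% over the sites.
```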
So basically these ψ_0's are ancillas which I stack with my system of interest, and I am saying that two states are in the same phase if they can be deformed into each other by these LGAs after stacking with some trivial, unentangled ancillary states. That is my definition of a phase. In particular, suppose ψ is in the same phase as some factorized state, possibly on a different algebra: that just means ψ is in the trivial phase. What it means concretely is the following: if I tensor ψ with another pure factorized state on some different algebra, then I can reduce the result to a pure factorized state by one of those LGAs. So for me it means ψ can be disentangled after tensoring with some ancillas. Perhaps it cannot be disentangled without ancillas — ψ being LGA-equivalent to a ψ_0 would mean it can be disentangled by an LGA directly — but being in the trivial phase means you can do it after stacking with some unentangled ancillas. That is how I want to think about phases. A question: when you described finite-dimensional families of states, you had this connection, and there you did not use the notion of LGAs. Implicitly I did, because of that connection — it is the same thing: along any continuous path in the parameter manifold there is an LGA connecting the corresponding states. "So this notion of equivalence is stronger than the usual one?" All the invariants I introduced are actually invariant under stacking with trivial states, so they really only depend on these equivalence classes — on the phase. By the way, a state in the trivial phase is usually called short-range entangled.
So ψ is short-range entangled if it can be disentangled using this local evolution, possibly after stacking. And while I am here, I might as well define invertible states, which played a role in the first talk: ψ is invertible if there exists a ψ′ on some other algebra such that ψ stacked with ψ′ is short-range entangled — that is, the pair can be disentangled using an LGA on the tensor product algebra. Such a ψ′ is called an inverse of ψ. These are very special states with many special properties. Obviously, if you just stack different systems to form a new system, phases form a semigroup — a monoid, rather, with the trivial phase as the identity element. But invertible states actually form an abelian group, because, again, there is an inverse. And Kitaev's conjecture is that if you focus on invertible states, then the space of invertible states, properly defined, should form a spectrum — an Ω-spectrum. But for that to make sense you need to take some sort of limit. Here I am fixing the algebra, while to get the Ω-spectrum property it looks like you need to take a limit over all algebras: make the lattice finer and finer, put in extra degrees of freedom, something like that. That property will not hold unless you have some limiting procedure on your lattice and your algebra. For me it does not matter, because I am not trying to define that mysterious infinite-dimensional space of all invertible states; I am just trying to extract some of the invariants. You can imagine that this mysterious space has subspaces, and I am just looking at the restrictions of, say, cohomology classes from the big mysterious space to those subspaces.
But I'm not trying to construct the actual space and the cohomology classes on that mystical space. Okay. So, before the break, let me make the following couple of remarks. First, there is another way to say what the higher Berry classes are obstructions to. So what is this manifold M? Between any two points m and m prime of this (let's suppose connected) manifold you can draw a curve, so the states at m and m prime are related by an LGA, by definition. Now I can ask: does there exist some G on the product M x [0,1], with components only along the [0,1] direction, such that if I define alpha(m) as the path-ordered exponential of G along the interval over the point m, then psi(m) = psi_0 composed with alpha(m)? In other words: can you globally produce every single state in the family by acting with an LGA which depends smoothly on the point m? (If I just integrate G along a single curve, I get something which still depends on the choices made; the question is whether the choices can be made globally consistent.) Locally it is always possible: you can cover your manifold with charts and do it chart by chart; if the chart is contractible, that's more or less obvious. But it turns out that if the higher Berry class is nonzero, you cannot do it globally. This interpretation is actually very similar to the way you discuss the usual Berry class. For example, look at a spin in a magnetic field: the parameter space is a two-sphere, the direction of the magnetic field at fixed magnitude. If you look at the ground state, it is a single vector.
In that case, is there some unitary, smoothly depending on the point of the two-sphere, which produces that ground state from a fixed vector? The Berry class is exactly the obstruction to finding such a unitary. You can get the projector onto the ground state in such a way: you can find a projective family of unitaries which gives you the right projector, but you cannot lift it to an actual family of unitaries; the Berry class is the obstruction. And this is similar: I have the obstruction to finding this G, the analog of the unitary operator depending on a point. Here it is not just a unitary, it's an automorphism of the algebra, and there is a locality constraint; it's not just a random unitary. So that's another way to think about the higher Berry class. And that actually has an analog for the case of symmetry. Suppose we have a symmetry group; that's another case for which this machinery works. In that case, when the symmetry group of the state, call it H, is nontrivial, we get a sort of higher Berry class which takes values in the invariant polynomials on the Lie algebra of H, of degree (D+2)/2. Okay, so what is the meaning of this thing? Well, you can ask: given a state with a symmetry, can you disentangle it using only evolution which preserves the symmetry? This class is the obstruction to that: if it is nonzero, you cannot disentangle the state in a symmetry-preserving way, even after stacking with ancillas which are symmetric too.
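To make the spin-in-a-field example concrete (my own numerical toy, not from the lecture): the obstruction to choosing the ground state of H = -n·sigma smoothly and globally over the sphere of field directions is measured by its Chern number, which can be computed with the standard gauge-invariant link-variable (Fukui–Hatsugai) method.

```python
import numpy as np

SX = np.array([[0, 1], [1, 0]], dtype=complex)
SY = np.array([[0, -1j], [1j, 0]])
SZ = np.array([[1, 0], [0, -1]], dtype=complex)

def ground_state(theta, phi):
    """Ground state of H = -n.sigma for the unit vector n(theta, phi)."""
    n = (np.sin(theta) * np.cos(phi), np.sin(theta) * np.sin(phi), np.cos(theta))
    H = -(n[0] * SX + n[1] * SY + n[2] * SZ)
    _, v = np.linalg.eigh(H)        # eigenvalues in ascending order
    return v[:, 0]                  # lowest band

def chern_number(N=80):
    """Total Berry flux / 2*pi over the sphere via plaquette phases."""
    th = np.linspace(1e-4, np.pi - 1e-4, N)     # avoid the exact poles
    ph = np.linspace(0.0, 2 * np.pi, N, endpoint=False)
    psi = np.array([[ground_state(t, p) for p in ph] for t in th])
    flux = 0.0
    for i in range(N - 1):
        for j in range(N):
            jp = (j + 1) % N
            # Phase of the product of overlaps around one plaquette is the
            # Berry flux through it, independent of the arbitrary eigh gauge.
            loop = (np.vdot(psi[i, j], psi[i, jp])
                    * np.vdot(psi[i, jp], psi[i + 1, jp])
                    * np.vdot(psi[i + 1, jp], psi[i + 1, j])
                    * np.vdot(psi[i + 1, j], psi[i, j]))
            flux += np.angle(loop)
    return flux / (2 * np.pi)

print(abs(chern_number()))   # close to 1: nonzero, so no global smooth choice exists
```

The nonzero answer is exactly the statement in the text: the projector family exists, but no global smooth family of unitaries producing the ground state does.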
Okay, the last remark I want to make before the break is that all the constructions we are doing here depend, mostly, on the fact that we are working on some discrete metric space, which in our case is just a lattice embedded in Euclidean space, in which the volume of a ball grows no faster than some power of the radius of the ball. All the constructions depend only on that. That's interesting, because such spaces are considered in other contexts and are special in many regards: for example, these are spaces of finite Gromov asymptotic dimension. So I wonder whether one can, as a purely mathematical exercise, set up this machinery also for other spaces which have finite Gromov asymptotic dimension but are not just Euclidean space, and what we are going to get out of that. The simplest examples: take some Riemannian manifold, call it X rather than M because it is not the parameter space, with some metric, and form the infinite cone over X, that is, the space X x R_+ with the metric r^2 g_X + dr^2. Then choose some discrete set of points which uniformly fills the whole thing. That's a simple example of such a space which is not Euclidean space. For example, take X to be just a finite number of points; then you get a space which looks like a collection of one-dimensional rays meeting at a point. Take a lattice in this collection of one-dimensional lines and consider lattice systems on it.
So it gives you something; I don't know if it's interesting physically, but the question is: what are the invariants in this situation? They are different in principle from the invariants for a single line, and I don't know what they are, but it's an interesting thing to think about. Okay, let's take a break, and after the break I'll discuss quantization: when the higher Berry class can be shown to be quantized. [Break.] So, in the first lecture I sketched Kitaev's conjecture in its refined version, which says there should be spaces of invertible states, some infinite-dimensional spaces X_d in every dimension d, and there is even a prediction for what they should be, or at least what their homotopy types should be, because the conjecture is only about homotopy type. Let me get back to that: now that I have given a definition of an invertible phase, can we construct something? Let me make a table; the bosonic and fermionic cases are a bit different, so I'll just list the bosonic case. In dimension zero the homotopy type is known: it's basically CP^infinity, and since everything is only up to homotopy I'll say it is K(Z,2); the only nonvanishing homotopy group is in degree 2 and equals Z. In general, the homotopy type K(A,n) is defined by the condition that it is any space Y such that pi_n(Y) = A and pi_m(Y) = 0 for m not equal to n; it's called an Eilenberg–MacLane space. So in dimension zero the statement is almost trivial: this is just the space of all lines in an infinite-dimensional Hilbert space. In dimension one it is supposed to be K(Z,3), so the first nontrivial homotopy group is in degree 3; in dimension two it should be Z x K(Z,4); and in dimension three, K(Z,5) x S^1, where S^1 is really K(Z,1).
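Collecting the conjectured homotopy types just listed into a table (bosonic case, as stated in the lecture; the d = 3 entry is my reading of the spoken text):

```latex
\begin{tabular}{c|l}
$d$ & conjectured homotopy type of $X_d$ \\
\hline
$0$ & $K(\mathbb{Z},2)\simeq \mathbb{CP}^\infty$ \\
$1$ & $K(\mathbb{Z},3)$ \\
$2$ & $\mathbb{Z}\times K(\mathbb{Z},4)$ \\
$3$ & $K(\mathbb{Z},5)\times S^1$, \quad $S^1\simeq K(\mathbb{Z},1)$ \\
\end{tabular}
```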
So those are the conjectured homotopy types of the spaces of invertible states. Now, in particular, let's just do dimension one. For dimension one we expect some special degree-3 class in cohomology: the first nontrivial homotopy group of X_1 is in degree 3, and by the Hurewicz theorem the first nontrivial homology group equals the first nontrivial homotopy group. So the first nonzero homology of X_1 occurs in degree 3 and equals Z; there are other, complicated homology groups in higher degrees, but they are all torsion, by the way. Correspondingly there is a cohomology group H^3(X_1; Z) which is also Z. And it seems like we have already constructed something like this: a family gives a map f from a finite-dimensional manifold M to X_1, so there should be a canonical class omega in H^3(X_1; Z), and its pullback f*omega should live in H^3(M; Z). That's our first nontrivial higher Berry class. Now, we already constructed something like this, but only with coefficients in R. So: can we construct a class with integral coefficients which lifts the real one? It would be even more interesting to construct some element associated with H^0: in dimension 2 the space has an H^0, the set of connected components of the space. That would be really interesting; I'll get back to that. We cannot do it yet, but it should be some locally constant function on the space of two-dimensional systems which tells you which component you are in. We only know how to make something which lives there, the pullback of the degree-4 class, but again so far
I only did it with real coefficients; we really expect to have something with integral coefficients, but only in the case of invertible systems, I stress again. For general gapped systems you don't expect any quantization; okay, maybe some sort of rational quantization, but the denominator is not under control, you can presumably get almost any denominator. The example we are familiar with is the Hall conductance; that's the case with symmetry, so maybe I should say: for the symmetric case there is a different table of the same kind. In particular, in the U(1)-invariant case in dimension two there are actually two Z factors, one for the Hall conductance and one for the mysterious one. And the Hall conductance is of course quantized in that case, because we are dealing with invertible systems; but for general gapped systems the Hall conductance is not quantized, it is a rational number. Okay, nobody ever proved it is a rational number, but it is expected to be rational because of the fractional quantum Hall effect. Therefore we don't expect the cohomology class we constructed to be quantized in general, only for invertible systems. Question: so you assume that integer quantum Hall states are invertible and fractional quantum Hall states are not? Answer: well, you can actually prove that any invertible state has integer Hall conductance, so fractional states are definitely not invertible. But I won't do the Hall conductance here; I'll just make a few remarks about it later, and basically the same methods work for it. Let me sketch how it is done for this first nontrivial case. No symmetries are assumed for now. So how do we do this? Well, it helps to think of the higher Berry class as the obstruction to constructing a globally defined family of LGAs which produces all the states in
the family. So, first step: here is our M; we cover M with charts U_A, an open cover, and I'll assume each U_A is contractible, and also that all double and triple overlaps are contractible; you can always choose such a cover. On each U_A you can produce psi(m) by acting on some fixed state psi_0 by some locally generated automorphism, which I'll call alpha-tilde_A. Now, my goal is to prove quantization for invertible states, but I'm going to simplify my life even further by assuming that this fixed state is actually a pure factorized state. As a consequence, all the states in the family are in the trivial phase, by definition of a phase; they are all short-range entangled. It doesn't actually make much of a difference, because once you prove quantization for a family of short-range entangled states, you get it basically for free for invertible states just by tensoring: if you have a family which is invertible at every point, you can tensor with a fixed ancillary state and then disentangle. That simple trick lets you replace your original family of invertible states with a family of short-range entangled states, and it doesn't change the Berry class, so in particular if one is quantized, so is the other. Question: I understand the trick, but the conclusion seems to be... is the Berry class of a family of short-range entangled states not going to be trivial?
Answer: no, because it depends on M. The Berry class has to do with the variation as a function of parameters. I'm saying that if psi_0 has an inverse, I can stack the whole family with this fixed inverse, and then I'm dealing with states in the trivial phase; but stacking with a fixed state doesn't change the Berry class, which has to do with the nontrivial connection or whatever, so it's the same thing. So suppose, therefore, that psi_0 is pure factorized. Okay, what do we do next? Well, this is a one-dimensional system, so I can just split it arbitrarily into a left and a right half; it doesn't matter where. And I can replace the automorphism alpha-tilde_A with some other automorphism, call it alpha_A, the restriction of alpha-tilde_A to, say, the left half. What I mean is the following: alpha-tilde_A is obtained by exponentiating some Hamiltonian; I just take that Hamiltonian and keep only the terms supported on the left half. Those terms have tails, so alpha_A is not strictly supported on the left half either, but it approaches the identity very rapidly as you go to the right. So that's my truncated evolution: instead of evolving psi_0 with the whole alpha-tilde_A, I evolve it only on the left. The resulting state psi_0 composed with alpha_A is not psi(m), but it is very close to psi(m) far to the left, and very close to psi_0 on the right. Why? On the left, because there alpha_A is hardly different from alpha-tilde_A; on the right it's even more trivial, because alpha_A is basically the identity when you go far to the right. Say the cut is at the origin, the point 0 here.
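A small numerical illustration of this truncation (my own toy, with a strictly local transverse-field Ising generator, so the truncated evolution has no tails at all, unlike the almost local case in the lecture): evolving a product state with the full generator versus the generator truncated to the left half changes local observables near the cut, but far to the left the two evolutions agree.

```python
import numpy as np

N = 8                                   # sites 0..7; cut between sites 3 and 4
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def embed(ops, start):
    """Tensor product acting with `ops` on consecutive sites from `start`."""
    mats = [I2] * N
    for k, o in enumerate(ops):
        mats[start + k] = o
    out = np.eye(1, dtype=complex)
    for m in mats:
        out = np.kron(out, m)
    return out

def hamiltonian(sites):
    """Transverse-field Ising H, keeping only terms fully supported in `sites`."""
    sites = set(sites)
    H = np.zeros((2**N, 2**N), dtype=complex)
    for i in range(N - 1):
        if i in sites and i + 1 in sites:
            H += embed([Z, Z], i)       # bond term
    for i in sites:
        H += embed([X], i)              # field term
    return H

def evolve(H, t, psi):
    """exp(-iHt) psi via diagonalization (H is Hermitian)."""
    w, U = np.linalg.eigh(H)
    return U @ (np.exp(-1j * t * w) * (U.conj().T @ psi))

psi0 = np.zeros(2**N, dtype=complex)
psi0[0] = 1.0                           # product state |00...0>

t = 0.6
psi_full = evolve(hamiltonian(range(N)), t, psi0)    # the full evolution
psi_trunc = evolve(hamiltonian(range(4)), t, psi0)   # truncated to the left half

def ev(psi, O):
    return np.vdot(psi, O @ psi).real

# Far to the left of the cut the two evolutions give the same local physics;
# right next to the cut they visibly differ.
d_far = abs(ev(psi_full, embed([Z], 0)) - ev(psi_trunc, embed([Z], 0)))
d_cut = abs(ev(psi_full, embed([Z], 3)) - ev(psi_trunc, embed([Z], 3)))
print(d_far, d_cut)                     # d_far << d_cut
```

On the right half the truncated state is exactly the initial product state here, because the toy generator is strictly local; with almost local terms there would be rapidly decaying tails, as the lecture says.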
So let me set this up properly. Let me introduce alpha_AB to be alpha_A composed with alpha_B inverse; this is defined only on the double overlap U_AB. It has the property that psi_AB, meaning psi_0 composed with alpha_AB, is very close to psi_0 far from the origin: on the left because alpha_A and alpha_B both become the same as the corresponding alpha-tildes, each of which maps psi_0 to psi(m), so they just cancel; and on the right because they both become the identity. Okay, so that's the situation on a double overlap: I have two one-dimensional states, psi_0 and psi_AB, which agree at infinity, and they are both pure. Very naively, you might expect that one is obtained from the other by acting with some local unitary observable. In a Hilbert space you would say: there are two vectors, you can connect them by a unitary. But that's actually much more subtle. First of all, we have two different states, and they don't necessarily live in the same Hilbert space: for two different pure states the corresponding GNS representations might be inequivalent. So the first question is whether they are equivalent. This non-uniqueness of the representation has to do with the behavior at infinity; here the behavior at infinity is the same, so we would expect that that's not a problem. And indeed one can
show that two states which agree at infinity (it doesn't much matter how fast the error decays at infinity) live in the same Hilbert space: one of them is a vector state in its own GNS representation, and the other must also be a vector state in the same representation. The second question is whether you can obtain one from the other by acting not just with any unitary, but with an almost local unitary; a general unitary is kind of useless. Again, since the states agree at infinity, you expect that this unitary can be chosen almost local. This again is not a trivial statement; it turns out, first of all, that it only works when the state is in the trivial phase, so it's easiest to prove in the case when one of the two states is actually a factorized state and the other one maybe not. And second, it turns out dimensionality does not make any difference: in general dimensions, if you have two pure states on some lattice, one of them a factorized state and the other maybe not, and they agree at infinity, you can show that one is obtained from the other by applying some almost local unitary observable. The proof is not very complicated, but it uses some basic quantum information inequalities, so I won't try to give a sense of it. Question: the construction itself doesn't immediately generalize to higher dimensions, because what do you mean, you have to split... Answer: that fact is also used in a similar proof in higher dimensions. Question: what do you mean? In higher dimensions the cut is a plane, not a point. Answer: there you have to use higher overlaps; on double
overlaps you cannot do it, but on higher overlaps you can. Okay, so that's a very useful fact, but I'm not going to explain the proof. So suppose that's true. Then psi_0 composed with alpha_AB equals psi_0 composed with Ad_{v_AB}, where Ad_v is the inner automorphism given by conjugation by v, and v_AB is a unitary element of this almost local operator algebra. So we have this, and now it's basically smooth sailing. Once we believe that such almost local unitaries exist, we can consider the following combination on triple overlaps: v_AB v_BC v_CA. (Everything here, by the way, depends on a point in the parameter space; there is an extra level of argument which ensures that you can actually choose v_AB to depend smoothly on the parameters, because that's not a trivial statement either; it requires some proof.) So you can form this guy, which lives on triple overlaps, and it's obviously also unitary. Then you can check several things. First of all, you can check that conjugation by this combination preserves the state. Now, a general fact: if you have a pure state and a unitary observable whose conjugation preserves the state, you would guess that in the GNS representation the corresponding operator just multiplies the GNS vacuum vector by a phase. And that can be shown to be true by a little argument, basically an application of the famous theorem of von Neumann, the bicommutant theorem. So therefore, if you evaluate the state on this unitary combination, you get exactly this phase, a smooth U(1)-valued function. Okay, so finally we have smooth U(1)-valued functions on triple overlaps, and then an easy computation shows
that it's actually a cocycle: if you call it h_ABC, it satisfies the Čech cocycle condition on quadruple overlaps. Well, the computation is not very short, actually; if you look at the paper, a separate paper which came out about a month ago with Sopenko and Artymowicz, it's quite a few lines, but anyway, there is essentially only one way to do it. So you have this cocycle condition on quadruple overlaps, and it is well known that any such thing defines a class in H^2 of M with values in the sheaf of smooth U(1)-valued functions. Okay, so that's the construction, and this cohomology group is actually isomorphic to H^3(M; Z), so it's the construction of an integral class. Then it takes some extra work to show that its image under the map to de Rham cohomology is the same class we constructed by different means from the Maurer–Cartan-type equation; so this class is a refinement of the higher Berry class to an integral class. So for invertible phases there is no need to suffer: this is a much easier construction in a sense. The other approach, though, works in higher dimensions; this one I don't know how to generalize to general dimensions. Question: you said you had some identity with continuous coefficients and then lifted it to Z; what about torsion? Answer: no, these are not U(1) coefficients, these are smooth U(1)-valued functions: H^2 with values in that sheaf is H^3(M; Z), torsion included. The previous approach gave just a class in de Rham cohomology; it missed the torsion. And it only works... rather: that first approach works for general gapped states in any dimension, while this approach we only know how to carry out in very low dimensions. It's quite laborious because there is no machine there; we cannot do without indices. For example, we never wrote up
the two-dimensional case fully: there is a similar definition in dimension two, but for example if you want to show that the class you construct in degree four is the right one, there are lots more indices; the Čech cocycle condition is quite involved, I don't even remember it, and showing that it maps to the same de Rham class we didn't even work out; it's just a horrible, horrible mess. So we don't have any systematic way of showing it's the same class. Question: so, with this integral data, could you construct a theory with torsion? Answer: yes. We get an actual class in integral cohomology, so we can get torsion. In fact there are several papers from the last month which construct examples of families where you only have torsion, and they get a torsion element here; so torsion does appear naturally. All I'm saying is that this only works for invertible systems, at least in low dimensions. Question: another question. In the previous parts of the talk you constructed elements of de Rham cohomology which you said are not expected to be integral, but now you're saying that for invertible phases they are actually integral, at least in dimension one. Is there some rescaling factor that you have to multiply them by? Answer: no, it just naturally comes out right in one dimension. If something unnatural were going on, I would get some factor of three or something. Also, to be fair, in our previous paper we had a construction of the same classes using explicit formulas rather than the Hamiltonian approach, and we showed that the class can be related to quantized quantities in the free, non-interacting case, so we knew the normalization; this is not a surprise. We also knew the normalization from the special case of the Hall conductance, because there we know what to get.
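For completeness, the isomorphism used above, between H^2 with coefficients in smooth U(1)-valued functions and H^3 with integer coefficients, is the standard consequence of the exponential sheaf sequence:

```latex
% 0 \to \mathbb{Z} \to \underline{C^\infty(\mathbb{R})}
%   \to \underline{C^\infty(U(1))} \to 0.
% The sheaf of smooth real-valued functions is soft, hence acyclic, so the
% long exact sequence in sheaf cohomology collapses to
\[
H^2\bigl(M,\ \underline{C^\infty(U(1))}\bigr)\;\cong\;H^3(M;\mathbb{Z}),
\]
% which is why a Cech 2-cocycle of smooth U(1)-valued functions on triple
% overlaps determines an integral degree-3 class, torsion included.
```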
So let me just say a few words about the other cases where we know how to prove quantization; not very many cases, unfortunately. Okay, so in dimension two we can in addition construct the degree-4 class, because we have this K(Z,4) factor, and we can show it is quantized; we didn't check that it's the same class as before, because again it got lost in the formulas with lots of indices. Another interesting case is when there is a symmetry group G. Case one: suppose G is compact semisimple, SU(2) for example; actually it's enough to just think about SU(2), as we'll see in a second. And suppose d equals two; that's the case where there is a Hall conductance, or more generally something like a Chern–Simons coupling. So what can we do in this case? Well, in that case any system, any 2D system, gives rise to a family of quasi-one-dimensional systems parameterized by G. How do we do that?
Well, we have a two-dimensional lattice; cut it into two halves and act with the constant transformation by a group element g, but only on one half of the system. The state is invariant: if I did the constant transformation on the whole plane, the state wouldn't change at all. So when I do it only on the half plane, then far from the cut line the state just doesn't know that I did something different on the other half; far from the line nothing changes at all, and the only change occurs in some neighborhood of the line. So this is essentially a quasi-one-dimensional version of the previous discussion of higher Berry classes: we have a family of quasi-one-dimensional systems parameterized by the group G. And for any compact semisimple G there is a natural degree-3 cycle: I can just take any SU(2) subgroup, and that defines a 3-cycle; they are all homologous, whichever one you take. So I can ask: compute the higher Berry class of this family of quasi-one-dimensional systems and integrate it over the degree-3 cycle, the 3-sphere inside G. What am I going to get? Some number, and that number is quantized by the previous considerations, generalized to the quasi-one-dimensional situation. On the other hand, you can show that this is an equivariant Berry class, or something like it: this number lives in H^3(G), which for compact simple G is just Z, so we get an element of Z, and it actually turns out to be the non-abelian version of the Hall conductance. So that's how we know quantization of that.
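Schematically (the notation here is mine, not from the lecture), the quantized number produced by this construction is the pairing of the quasi-one-dimensional higher Berry class with the 3-cycle:

```latex
\[
n \;=\; \int_{S^3\subset G} \Omega^{(3)} \;\in\; \mathbb{Z},
\qquad
H^3(G;\mathbb{Z})\;\cong\;\mathbb{Z}\ \ \text{for $G$ compact and simple},
\]
% where \Omega^{(3)} denotes the degree-3 higher Berry class of the family of
% quasi-1D states parameterized by G, and S^3 is (the image of) any SU(2)
% subgroup. The integer n plays the role of a non-abelian Hall conductance.
```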
Another case is interesting; the semisimple story doesn't apply to U(1), but for U(1) you can do something else. So for U(1), let's consider dimension one, with symmetry group U(1). In that case, what invariants are there? Well, there is of course the usual higher Berry class, in degree 3, and then there is another piece; that piece is known as the Thouless pump. What is this thing? If you have a one-dimensional system, you can imagine going around some cycle in M and ask how much charge got pumped from the left side to the right side. In the gapped situation you expect this to be an integer, and one can actually show that's the case, using a similar approach, with some Čech cocycle or whatever. So this guy is actually integral, and we know how to show it's integral. And now we can apply it to the case where d equals 2, G is U(1), and M is a point: we do the same trick as before, split our two-dimensional system into two half-planes and apply the U(1) transformation only on half of the space, to get a family of U(1)-invariant quasi-1D states, which are all parameterized by what?
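As an aside, the integrality of the Thouless pump just invoked can be checked numerically on the standard Rice–Mele model (a textbook example of my choosing, not the lecture's construction): the charge pumped per cycle is the Chern number of the occupied Bloch band over the (k, s) torus, and on a closed torus the link-variable computation returns an exact integer.

```python
import numpy as np

def bloch(k, s):
    """Rice-Mele Bloch Hamiltonian; the pumping parameter s runs over one cycle."""
    v = 1.0 + 0.5 * np.cos(2 * np.pi * s)     # intracell hopping
    w = 1.0                                   # intercell hopping
    d = 0.8 * np.sin(2 * np.pi * s)           # staggered on-site potential
    hx = v + w * np.cos(k)
    hy = w * np.sin(k)
    return np.array([[d, hx - 1j * hy], [hx + 1j * hy, -d]])

def pumped_charge(N=40):
    """Chern number of the lower band over the (k, s) torus = charge per cycle."""
    ks = np.linspace(0, 2 * np.pi, N, endpoint=False)
    ss = np.linspace(0, 1, N, endpoint=False)
    u = np.empty((N, N, 2), dtype=complex)
    for i, k in enumerate(ks):
        for j, s in enumerate(ss):
            _, vec = np.linalg.eigh(bloch(k, s))
            u[i, j] = vec[:, 0]               # occupied (lower) band
    C = 0.0
    for i in range(N):
        for j in range(N):
            ip, jp = (i + 1) % N, (j + 1) % N
            # gauge-invariant Berry flux through one plaquette of the torus
            loop = (np.vdot(u[i, j], u[ip, j]) * np.vdot(u[ip, j], u[ip, jp])
                    * np.vdot(u[ip, jp], u[i, jp]) * np.vdot(u[i, jp], u[i, j]))
            C += np.angle(loop)
    return C / (2 * np.pi)

print(pumped_charge())   # an exact integer; one unit of charge per cycle here
```

Because every link overlap appears twice with opposite orientation, the total flux is an exact multiple of 2*pi, so the quantization is exact rather than approximate; that is the discrete shadow of the integrality statement in the text.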
By this group, which is a circle. Now, in the non-abelian case we would not keep the symmetry G, because G is non-abelian: when you do a symmetry transformation on only half of the system, the resulting family doesn't have the symmetry anymore. But there we had a nontrivial 3-cycle. Here we don't have a nontrivial 3-cycle, because our group is U(1); on the other hand, we have something else: the group is abelian, so after applying the U(1) transformation on half of the space, you still have the U(1) symmetry. So you can say: okay, this is a quasi-1D situation similar to the previous one; there should be some integral element in H^1 of the circle, which is just Z; some integral invariant. That invariant is the Hall conductance. So you can show that, and by the way, this is a rigorous version of Laughlin's flux-insertion argument. What's going on here: what happens when you apply a U(1) gauge transformation to half of the system? There is one parameter, going around the circle; as the parameter varies from zero to two pi, you effectively turn on an electric field, because a time-dependent gauge transformation is the same as having an electric field. And if there is a Hall conductance, that electric field drives a charge across the line. So the Thouless pump measures how much charge flows through this line: this is our quasi-one-dimensional system, we turn on an electric field in one direction, and we measure how much charge flows through a cross-section. That's Laughlin's argument. The difference with Laughlin is that Laughlin considered insertion of flux at a point; with a time-varying flux, the electric field circles around that point, and the charge flows toward that point.
Here we have, in effect, mapped that point to infinity, so we don't see it, but it's the same type of argument. Incidentally, this version of Laughlin's argument is actually better; we have another paper where we do it this way, because when your point is not at infinity you can get extra mileage. For example, you can do it at a couple of different points and ask: what are the statistics of these flux insertions when you exchange them? That's something you cannot do with the point at infinity, so you can get extra information from this. Okay, so these are all the situations where we know something is quantized; to go further and get some general arguments, you need better machinery for invertible states specifically. Okay, and in the last ten minutes, or five minutes, let me just say what the big open problems are that we don't know how to solve. Actually, let me just state one, the most vexing one, and the most physical one. The refined Kitaev conjecture says that in dimension two the space of states has this form, Z times K(Z,4): there should be some integral invariant labeling different invertible phases in two dimensions. What is this thing? Well, put a question mark, because we don't really know, but it is something called the chiral central charge. So what is the chiral central charge? There is no rigorous way to define it, so let's put a question mark here, but, description number one: given a two-dimensional system, how do you know it is in a nontrivial phase? Well, you put an actual physical boundary there, that is, you modify the state so that on the lower half it is trivial. If the bulk is in a nontrivial phase, you necessarily get a gapless system at the boundary. And not just any gapless system: we don't know exactly what you get, but since the bulk is hardly changed near the boundary, the edge should essentially be described by some field theory.
We don't know what this field theory is, but, for example, it could be a (1+1)-dimensional conformal field theory. The belief is that the invariant is simply the difference between the central charges of the left-moving and right-moving modes: the edge theory is supposed to carry a left-moving Virasoro algebra and a right-moving Virasoro algebra, and the difference of their central charges must be the invariant. There are a lot of problems with this. The first problem is: why on earth would a field theory describe this edge at all? It's a lattice system. And if a field theory does describe it, why should it be Lorentz invariant? And if it's Lorentz invariant, why conformal? Too many assumptions.

Way number two is the following; it's more physical, maybe, but still rather mysterious. Imagine placing the system at positive temperature, heating it up a little, but in the same geometry with the boundary. In a conformal field theory with unequal left and right central charges there is a net energy flux along the boundary, which one can show is always proportional to the square of the temperature; the coefficient here (this involves the stress-energy tensor, this is the temperature) is proportional to the chiral central charge, though I don't remember the exact proportionality constant. This makes sense even without Lorentz invariance or conformal symmetry: we can simply ask, if I put the system on a half-space at positive temperature, will there be an energy flux along the boundary? Even though this requires choosing a particular termination, and the answer would seem to depend on the physics of that edge, one can show that it doesn't matter. The reason is that if you take a strip of the system, the net flux must be zero: you cannot have a net energy flux when there is no temperature gradient, so the edge contributions must cancel.
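For reference, the exact constant the lecturer does not recall offhand is the standard CFT value for the energy flux carried by a chiral edge; as a hedged reminder, in natural units it reads:

```latex
% Energy flux along the edge at temperature T, assuming the edge is
% described by a CFT with central charges (c_L, c_R); units k_B = \hbar = 1:
J \;=\; \frac{\pi}{12}\,\bigl(c_L - c_R\bigr)\,T^2 ,
\qquad
c_- \;\equiv\; c_L - c_R \quad \text{(chiral central charge)} .
```

Only the combination $c_- = c_L - c_R$ enters, which is why a single number can survive as a bulk invariant even when the details of the edge termination do not.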
Question: shouldn't the flux depend on the speed of light at the boundary? No; if you look at the units of energy flux, it doesn't. In units where the temperature is measured in energy, the energy flux has dimensions of energy squared, which the factor of T squared supplies exactly, so the coefficient kappa is dimensionless. Compare: electric flux is dimensionless because electric charge is dimensionless; here the extra temperature cancels the energy, so kappa is a dimensionless quantity. This is known, by the way, as the thermal Hall conductance. Why this name? Imagine a 2D system whose two edges are held at two different temperatures. Then there is an energy flux kappa times T1 squared along one edge and, because of the change of orientation, minus kappa times T2 squared along the other. (I initially said the flux is linear in T; it should be T squared, which is exactly what makes kappa dimensionless.) For a small temperature difference the net flux is then simply proportional to the temperature difference, times the temperature, times this number, the chiral central charge. That's usually called the thermal Hall effect: the temperature gradient is in the y direction, but the heat flows in the x direction. It's very hard to measure; people do measure it, but the experiments are difficult. It is believed anyway that this coefficient is quantized, to a rational number, but nobody knows why, because we have no analogue of this machinery to apply to the thermal Hall effect. First of all, this definition is not very convenient.
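The strip argument above is just a two-edge bookkeeping exercise; here is a minimal numerical sketch (in assumed natural units k_B = hbar = 1, and taking the standard CFT flux formula J = (pi/12) c T^2 as input) showing that the net heat current linearizes to a thermal Hall conductance proportional to c and to T:

```python
import math

def edge_flux(c_minus: float, T: float) -> float:
    """Energy flux carried by one chiral edge at temperature T:
    J = (pi/12) * c_- * T^2, in units k_B = hbar = 1."""
    return math.pi / 12 * c_minus * T**2

def net_flux(c_minus: float, T1: float, T2: float) -> float:
    """Net transverse heat current across a strip whose two edges sit at
    temperatures T1 and T2 (opposite orientations, hence the difference)."""
    return edge_flux(c_minus, T1) - edge_flux(c_minus, T2)

def thermal_hall_conductance(c_minus: float, T: float) -> float:
    """kappa_H = dJ/dT = (pi/6) * c_- * T: linear in T, with a
    dimensionless coefficient fixed by the chiral central charge."""
    return math.pi / 6 * c_minus * T

# For a small temperature difference dT, the net flux reduces to kappa_H * dT:
c, T, dT = 1.0, 0.5, 1e-6
assert abs(net_flux(c, T + dT, T) - thermal_hall_conductance(c, T) * dT) < 1e-9
```

Note that with no temperature gradient (T1 = T2) the net flux vanishes identically, which is the cancellation used to argue that kappa does not depend on the choice of edge termination.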
It deals with positive-temperature systems, and we don't have any way to deal with positive-temperature systems; we only work with gapped systems at zero temperature. So anyway, that's the outstanding problem. Moreover, it's believed that this quantity makes sense for general gapped systems, where it is still called the chiral central charge, just not necessarily an integer; it can be fractional. Forget about quantization: we don't even have a means to construct this invariant rigorously. There is some indication that this invariant has to do with a hidden rotational symmetry in the problem. That is, we're dealing with a lattice system. When there is a U(1) symmetry, we can define the Hall conductance; when there is no U(1) symmetry, there is still the thermal Hall conductance. You can ask: this looks very similar to the electric situation, so what is the analogous symmetry, the analogue of U(1)? The answer seems to be rotational symmetry, but that is rotational symmetry in field theory. On a lattice there is no rotational symmetry, and that's the main problem: how do you see a hidden rotational symmetry in a lattice setup?

Question: can you mention physical systems without U(1) symmetry which are supposed to show this effect? Well, first of all, the thermal Hall effect is a very general thing: take any insulator with no interesting charge excitations at all, put it in a magnetic field, and you will have a thermal Hall effect. But of course that's not quantized; that's not the case where you expect quantization. Right, we're interested in the situation where there is some quantized contribution, from the electrons themselves.
So, fractional quantum Hall states exhibit the thermal Hall effect, as do integer quantum Hall states alongside the usual Hall effect. For integer quantum Hall states the two are proportional; for fractional ones there is no a priori reason why they should be. In practice, as far as I know, in all cases where it has been measured, which are very few, they turn out to be proportional with the same coefficient as for the integer quantum Hall effect. I don't know why; these are interacting systems, and there is no reason for them to behave like that. But it hasn't been measured in very many systems anyway. Okay, so that's a very mysterious thing. We've been thinking about this for four years and made zero progress. Yeah, that's it.

Question: are there things like the spin Hall effect which could show zero thermal Hall conductance? Formally you can of course consider this, but since spin conservation is rarely an exact symmetry in real systems, and we're interested in systems with actual symmetries, then no. What you can ask about is the situation with, say, time-reversal symmetry, but that is much trickier to deal with, because you have anti-unitary symmetries. Any other questions? Well, there's one more thing we could talk about, a proof of a theorem using all of this machinery, but that's going to be a long discussion. So that's