So, the one problem from problem set four is due today, but there won't be another problem set assigned today or next week, because we have an exam coming up. That exam, again, will be in the evening, from 5:30 to 7:00 p.m., and it will cover the material through lecture seven, which we'll complete today. I'll be available to discuss your questions at our usual problem session for at least 45 minutes, and we'll have a little review, but I won't come with anything prepared; I'll just be addressing the questions that you have. Our grader has problem set two graded; I'll look it over and put it in the front office for you to pick up, and I'll let you know when it's available. And we'll have this problem set, which is just the one problem, graded as soon as possible. By the way, the exam is going to be closed book, and there aren't a lot of formulas; remember, it's more conceptual, so just keep that in mind. All right, very good. So last time we established the fundamental properties of states in quantum mechanics. The most general state in quantum mechanics is described by a density operator, which we typically call rho. This operator has some basic properties: it's a Hermitian operator, it's also a positive operator, and by convention we normalize it so that it's easy to calculate probabilities; we could always re-normalize it if that weren't the case. So the normalization isn't a fundamental property; it's a property we impose in order to calculate probabilities easily. Because it's a Hermitian operator, we can diagonalize it, and it has a decomposition in terms of its eigenvectors and eigenvalues, and those eigenvalues are all non-negative numbers because it's a positive operator. So we can always think about the density operator as a statistical mixture of its eigenvectors, weighted by its eigenvalues. But as we noted, we should be a little bit careful: we can think about the density operator in an infinite number of ways. For a general mixed state there are an uncountably infinite number of different statistical mixtures, all of which give the same density operator. The spectral decomposition is unique, up to degeneracies, but as we saw in the homework and last time, we can mix together different pure states and get the same state. Which is to say, if we do measurements, our predictions are exactly the same; we couldn't tell the difference by any measurement we could make, and that is what it means to say the state is the same. Of course, if it's a pure state, then the decomposition is unique, because there's only one term in it. Now, we talked about a way of determining whether the state is pure or mixed by looking at something called the purity of the state. The purity, assuming we have a normalized state, is given by the trace of rho squared. If it's a pure state, the purity is one; if it's a completely mixed state, the purity has its minimum value of one over the dimension of the Hilbert space, and in that case the density operator is just proportional to the identity and looks the same in every basis. For a pure state, the density operator is just a single projector.
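[In symbols, the spectral decomposition and the purity being described are presumably the standard ones; the notation here is mine, not a transcription of the board:]
\[ \rho = \sum_i p_i\,|i\rangle\langle i|,\qquad p_i \ge 0,\qquad \operatorname{Tr}\rho = \sum_i p_i = 1, \]
\[ \mathcal{P} \equiv \operatorname{Tr}(\rho^2) = \sum_i p_i^2,\qquad \frac{1}{d} \le \mathcal{P} \le 1, \]
with \(\mathcal{P} = 1\) exactly when \(\rho = |\psi\rangle\langle\psi|\) is a single projector, and \(\mathcal{P} = 1/d\) exactly when \(\rho = I/d\), the completely mixed state in a d-dimensional Hilbert space.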
One concept we didn't introduce, but that I just want to mention, is another way to think about the degree of mixedness of a state: entropy. Now, you've learned about entropy in the context of statistical physics or thermodynamics, but entropy is a somewhat more general concept from information theory. You can think about entropy as the information I'm lacking about the state of the system. So when something has zero entropy, I'm not missing any information whatsoever; I have a perfect description, there's no randomness, no missing information. A state with zero entropy is a pure state. A state that has more entropy is mixed, because I'm missing information. Yes — what base is the logarithm? That's a good question. I purposely didn't put the base here, because it's your choice. If you write it in base two, then you're measuring the entropy in bits. If you write it as a natural logarithm, base e, then the units are called nats. So it's up to you which base you work in; it's just a convention you decide when you measure the entropy. If this were a two-state system, that is to say dimension two, and we measure in base two, then the maximum entropy is one bit, which means I don't know whether it's up or down. Yeah, you had a question? Do you need to multiply it by some sort of constant? Right — if we want to measure it in thermodynamic units, Boltzmann's constant is the way of relating this entropy to energy and temperature. That's a convention, but we don't need to do that; we can leave it dimensionless, and in some sense it's then a counting argument. Okay? So this is another way of thinking about it that I want you to keep in mind: pure states, again, are states where we have the maximum possible information, and those states have zero entropy. A state that's mixed, where we're missing information about the state of the system, has finite entropy.
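[The entropy being referred to is what is usually called the von Neumann entropy; writing it out, with the \(p_i\) the eigenvalues of \(\rho\):]
\[ S(\rho) = -\operatorname{Tr}(\rho\log\rho) = -\sum_i p_i \log p_i, \]
which is zero for a pure state and takes its maximum value \(\log d\) for the completely mixed state \(\rho = I/d\). With the logarithm in base 2 the entropy is measured in bits, with the natural logarithm in nats; for a two-state system the maximum is one bit.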
Now, at the end of the lecture last time, we talked a little bit about dynamics, and in particular about closed-system dynamics. If we have a closed quantum system, with all degrees of freedom accounted for, then, as we've argued before, the evolution must be unitary. If I know what the state of the system is — or at least, this is the state I assign based on my information at time t equals zero — then the state at a later time is given by this unitary transformation acting on the state, as we showed last lecture. That kind of evolution preserves the purity of the system. That is to say, as we calculated last time, the trace of rho squared at any later time is the same as it was at the initial time; it's conserved in time. Equivalently, we could say the entropy is preserved, because the eigenvalues are the same at all times. This is a point of linear algebra we didn't state before: unitary transformations preserve the eigenvalues of a matrix. Is that obvious? What I mean is, say I have an operator A and I do a unitary transformation on it — I can put the U on the left and the U dagger on the right, or the other way around, it doesn't matter; let's say I do it like that — and let's call the result B. If A is a Hermitian operator, then so is B. And what I claim is that the eigenvalues of B are the same as the eigenvalues of A. How would you prove that? Take the determinant of B minus lambda times the identity, rewrite the identity as U U dagger, factor out the U and the U dagger, and you're left with the determinant of A minus lambda times the identity. You could do that; computing the determinant would definitely work. Let me show you a very quick and dirty way to do it. Just write A in its spectral decomposition. Then what is B? It's the same sum, with the same eigenvalues, but over a new set of vectors. So that's a quick and dirty way to prove it: if you do a unitary transformation, you don't change the eigenvalues. And if you don't change the eigenvalues, you don't change the entropy, and you don't change the purity. The degree of mixedness is the same.
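[The quick-and-dirty argument, spelled out in symbols of my choosing:]
\[ A = \sum_i \lambda_i\,|a_i\rangle\langle a_i| \;\Longrightarrow\; B = U A U^\dagger = \sum_i \lambda_i\,\big(U|a_i\rangle\big)\big(\langle a_i|U^\dagger\big) = \sum_i \lambda_i\,|b_i\rangle\langle b_i|, \]
where \(|b_i\rangle = U|a_i\rangle\) is again an orthonormal set, since \(\langle b_i|b_j\rangle = \langle a_i|U^\dagger U|a_j\rangle = \delta_{ij}\). So B has the same eigenvalues, just rotated eigenvectors, and in particular \(\operatorname{Tr}(\rho^2)\) and \(S(\rho)\) are unchanged under \(\rho \mapsto U\rho\,U^\dagger\).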
All right, let me scoot by here — thank you. So far, in discussing dynamics, we restricted our attention to closed systems. In a closed system, what we're imagining is that we have a description of every degree of freedom in the system and can track it: closed-system dynamics, every degree of freedom. And as we discussed, in quantum mechanics that has to be described by a unitary transformation. But in certain circumstances we don't, or it's essentially impossible to, account for every single degree of freedom in the system. That's particularly true as systems become more and more macroscopic; it becomes more and more difficult to do. In fact, we're familiar with this in classical physics. When we talk about thermodynamics, for example, which is typically applied to physical systems in the macroscopic world, we imagine the system of interest might be in contact with, say, a thermal reservoir. We call it the bath, because we used to think of it as some liquid that you immerse things in. This thermal reservoir might be at some temperature T, and the system exchanges, say, energy with the reservoir, and typically we imagine that at some later point this comes to thermal equilibrium. Now, the approach to thermal equilibrium is typically an irreversible process, and that seems at odds with the microscopic laws of classical dynamics, which are time-reversible. But when I shatter the chalk, at least once a lecture, that's a pretty irreversible thing. We say it's only effectively irreversible — irreversible for all intents and purposes — because of the macroscopic nature of the environment, which is another word for this reservoir. The reason things are irreversible, in some sense — at least one way of having irreversibility emerge from fundamentally reversible dynamics — is that we coarse-grain: there are fine-grained microscopic degrees of freedom, but within some coarse-graining of the degrees of freedom things are lost, and they're gone so long they never come back. This is, of course, a deep and complicated problem. Where does the arrow of time really come from, fundamentally? Are there things that are fundamentally irreversible, like black holes? Is information lost behind a black hole? These are, of course, very deep and fundamental questions. However, putting those aside, we can say, even without getting that deep about it, that we can get effectively irreversible dynamics out of fundamentally reversible dynamics at the microscopic level. For example, one of the things that can happen in this kind of thermodynamics is that the system comes to equilibrium: once it comes into contact with the heat bath, its entropy increases. That's a problem you've solved before. So the entropy of the system isn't conserved in this case. It would be if I had perfectly reversible dynamics: I wouldn't change in any way what information I had about the system if I could track every molecule and follow every one of those trajectories. But if I lose track of them, I lose information, and the entropy increases. So that's classical. And the same thing is true for quantum open-system dynamics. Whereas if we take account of every degree of freedom, the dynamics is described by unitary evolution, if my quantum system is instead in contact with an environment — call it the reservoir or the bath — things change. My quantum system, whatever it is — it could be an atom, a molecule, a superconducting circuit — is in contact in some way with something I call the environment, a quantum environment: all the degrees of freedom of the environment are themselves also described by quantum mechanics. Then I can get effectively irreversible dynamics here too, in the same way I had it in classical thermodynamics. And this is not unitary, because information about my quantum system is being lost to the environment, and it isn't coming back, because it got lost in this macroscopic number of degrees of freedom that I call the quantum environment. So I could get a situation where my quantum system comes to thermal equilibrium. Think of the spins of the silver atoms coming out of the oven in the original Stern-Gerlach gedanken experiment that we discussed: what's going on is that the silver atoms are bouncing around, colliding with other atoms, and that depolarizes the spins. It's totally random, and I lose information about whether a spin is up or down to the collisions — information that is now stored in the other atoms, which provide an environment for any one atom, or in the walls of the oven itself, which interact through collisions, for example. So I can get an increase in entropy. I can also get a decrease in entropy — that's called cooling. I can change the entropy of the system. So non-unitary dynamics does not conserve purity or entropy: I can heat, and the entropy increases; I can cool, and the entropy decreases — and of course the entropy of the environment increases. This is one of the ways you prepare a pure state, or try to: by taking some state and cooling it; a pure state would be the ultimate entropy decrease. Now, you can't do that perfectly — that would violate the third law of thermodynamics — but you can get as close as you possibly can by improving the refrigeration. Question: is it possible to do that in a truly closed system? Because if you decrease the entropy of the system, the entropy outside the system must increase to keep the balance. Right — if it's completely closed, then entropy is conserved, because then the evolution is unitary. So it's always relative to other degrees of freedom, and that's where the entropy is hiding.
Couldn't you, in theory, include all the degrees of freedom of the universe, and then the entropy would just not be increasing? That's right. If you were able to track everything, both the system and the environment, then for the whole system the entropy would remain exactly the same. It's the same thing with the chalk: when I crush the chalk, I increase the entropy of the chalk, but the information about how it fractured is stored in all the vibrations of the floor, which ultimately hit the walls and whatnot, and I'd have to keep track of all of them in order to maintain complete knowledge. But if I say I'm only going to keep track of these degrees of freedom, and all the rest are effectively lost to me — because I either don't want to bother or don't have the ability to keep track of them — then the entropy of this system has increased; information about the system has been lost. Now, there's more to this in the quantum world than just energy exchange and heating and cooling, because we know that when we gain or lose information about a quantum system, our state of knowledge about it changes in a very fundamental way. So coupling a quantum system to an environment affects our state of knowledge of that quantum system in a profound way. For example, suppose that at time t equals 0 I have a pure state, and through the coupling to the environment the later state of the system is mixed, because I've lost information to the environment. If that's the case, then I have lost coherence. For example, let's say we have a spin one-half system, and at t equals 0 we had a pure state, which we write generally with this matrix representation in the basis of spin up and spin down along some axis; that's my pure state. Through the interaction with the environment, I can end up with this state, where I've lost the coherence of the superposition. This process, whereby I lose the coherences, is called decoherence. Decoherence is a dynamical process completely described within quantum mechanics; I don't need any magic collapse postulate. It's the analog, within quantum mechanics, of the approach to equilibrium in classical mechanics. I can derive it in non-equilibrium statistical physics, given some model of the environment, some model of the coupling of the system to the environment, and some effective coarse-graining. That last part is important: if I don't coarse-grain, then I keep track of every last detail and I never lose anything; I have to somehow say that that information effectively went away. So decoherence is describable completely and consistently within quantum mechanics, in the same way that the approach to thermal equilibrium is described within the context of classical mechanics. This kind of evolution is non-unitary.
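[For concreteness, the before-and-after matrices being described are presumably of this form, for a pure state \(|\psi\rangle = \alpha|\!\uparrow_z\rangle + \beta|\!\downarrow_z\rangle\); the specific notation is my illustration:]
\[ \rho(0) = \begin{pmatrix} |\alpha|^2 & \alpha\beta^* \\ \alpha^*\beta & |\beta|^2 \end{pmatrix} \;\longrightarrow\; \rho(t) = \begin{pmatrix} |\alpha|^2 & 0 \\ 0 & |\beta|^2 \end{pmatrix}, \]
where the off-diagonal elements (the coherences) have been lost to the environment while the populations stay the same.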
Now, there are a few other things I want to say about this kind of evolution. There are lots of ways of thinking about it. One way is that the environment measures the system but doesn't tell us the result. The environment contains information about whether this thing was up or down, but we don't know it. If we could dig it out of the environment, then we'd know which one it is, but we can't, because it's buried in all these molecules flying around in all kinds of directions. And since we don't know it, it's as if Alice prepared spin up with this probability, or prepared spin down with that probability, but didn't tell us. For all intents and purposes, it's the same thing. Another, related way to think about decoherence is that the environment does a random unitary on the system, and we don't know which one. So how would we describe that evolution? Well, we would describe it as follows. If the environment did a particular unitary, say U sub i, then the evolution of the state is that unitary conjugating the density operator. But we don't know which one it was, so we have to average over all the possibilities, weighted by the probability that it did that particular unitary. Okay? So this kind of evolution is not unitary; it's a sum of unitaries weighted by these probabilities. In fact, we typically write it in the following way: the operator A sub i is the square root of the probability times the unitary. This is what's called a Kraus operator. This kind of evolution is a map on the density operator: it takes the initial state to the final state, but it's not a unitary map. It's a non-unitary map, what we call a completely positive map, for rather technical reasons, but basically it takes a positive operator to a positive operator.
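[Written out, the random-unitary map and its Kraus form are presumably the following; the symbols are my choice:]
\[ \rho \;\longmapsto\; \mathcal{E}(\rho) = \sum_i p_i\, U_i\,\rho\,U_i^\dagger = \sum_i A_i\,\rho\,A_i^\dagger,\qquad A_i = \sqrt{p_i}\,U_i,\qquad \sum_i A_i^\dagger A_i = I, \]
so each Kraus operator \(A_i\) is the square root of the probability times the corresponding unitary, and the normalization condition on the \(A_i\) guarantees that \(\operatorname{Tr}\mathcal{E}(\rho) = 1\).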
For example, let's suppose I have a spin one-half particle, and say it was initially polarized along the x direction: at time t equals 0 it is prepared spin up along x. That state, we know, is a superposition of spin up and spin down along z. Now suppose the environment gives the spin a little random kick, rotating its direction slightly about the z axis by some small random angle. We will study rotations in detail later in the semester, but let me just write this one down. The rotation about the z axis is described by a unitary operator. It leaves up along z and down along z alone — those are its eigenvectors — and the eigenvalues of a unitary operator are phases, remember? If I rotate about the z axis by an angle phi, the eigenvalue on spin up is e to the minus i phi over 2, and on spin down it's e to the plus i phi over 2. That's the rotation operator about the z axis. So let's apply this to our state; I'll do it on the density operator. Writing the state out, it's one half up-up plus one half down-down plus one half up-down plus one half down-up: it has diagonal matrix elements and off-diagonal matrix elements. So what happens when I conjugate this with the rotation? You tell me. Look at a diagonal term first: with the U you pick up a phase, and with the U dagger you pick up the opposite phase, so it does nothing. That's because this is an eigenvector of the rotation about z; it doesn't change. So the diagonal terms stay exactly the same. But what about the off-diagonal terms? This one picks up e to the minus i phi, and that one picks up the conjugate, e to the plus i phi. Those do change. So this term becomes one half e to the minus i phi, and that one becomes one half e to the plus i phi; the other terms don't change. Written as a matrix, what does this little random kick do? It leaves the diagonal terms the same and puts a phase on the off-diagonal terms, because we've rotated by that angle. Now suppose the phase is random. Then my actual state at a later time is the average over all phases: I average by integrating over all the random rotation angles, and the off-diagonal terms integrate to zero. This became a completely mixed state. I started with a completely pure state and ended up with a maximally mixed state, because my final state is a statistical mixture of states rotated by every possible angle, and all of those states have definite phase relationships with respect to one another, but when I statistically average over them it's as if they have no phase relationship at all. I've randomized the spin. This is what we call a dephasing channel, which is another word for decoherence: I've lost the coherence, because I've lost the phase information about my superposition.
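[To collect the example in symbols — my notation, assuming the standard spin-1/2 conventions:] the rotation about z acts as
\[ U_z(\phi)\,|\!\uparrow_z\rangle = e^{-i\phi/2}|\!\uparrow_z\rangle,\qquad U_z(\phi)\,|\!\downarrow_z\rangle = e^{+i\phi/2}|\!\downarrow_z\rangle, \]
so starting from \(\rho(0) = |\!\uparrow_x\rangle\langle\uparrow_x| = \tfrac12\begin{pmatrix}1 & 1\\ 1 & 1\end{pmatrix}\) in the z basis,
\[ U_z(\phi)\,\rho(0)\,U_z^\dagger(\phi) = \tfrac12\begin{pmatrix}1 & e^{-i\phi}\\ e^{+i\phi} & 1\end{pmatrix},\qquad \frac{1}{2\pi}\int_0^{2\pi}\! d\phi\; U_z(\phi)\,\rho(0)\,U_z^\dagger(\phi) = \tfrac12\begin{pmatrix}1 & 0\\ 0 & 1\end{pmatrix} = \tfrac{I}{2}, \]
since the off-diagonal phases \(e^{\pm i\phi}\) average to zero: a pure state in, the maximally mixed state out.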
All right. So decoherence is a phenomenon we must take into account whenever we have a quantum system coupled to an environment, and quantum systems are always coupled to an environment. Even the vacuum is an environment, because the vacuum itself is a quantum thing: an atom in an excited state will decay irreversibly. That is open quantum system dynamics, a non-unitary evolution, and it happens because the vacuum itself is the environment. So, the last bit about this: does decoherence solve the measurement problem? Some people would say it does. For example, Zurek, at Los Alamos, is one of the key figures who developed this whole idea of decoherence and how important it is in thinking about quantum mechanics, and there's no doubt it is important. Some say decoherence is therefore the answer to the measurement problem, because I went from a quantum superposition state to a state that's a statistical mixture, and that's what we seem to see. So, what do you think — is decoherence the end of the story? What if we hit the new density operator with the rotation again? Well, it wouldn't do anything anymore after I've averaged, because the off-diagonal elements are now zero. Before you average? Then you just keep getting random kicks. Really, the averaging here is all about lost information: if we kept that information, it would still just be a pure state that got rotated. We're saying the kicks are random and we don't know them; the information is in the environment. And if we did know, we wouldn't sum over all of them — you'd still have a many-body quantum state with coherence between the thing that did the kicking and the thing that got kicked. So yes, the rabbit went into the hat right where I said we lost that information; I didn't derive that from anything more fundamental. I would say — and this is editorial opinion now, we're on the editorial page, this is not a textbook — that decoherence is an important piece of understanding measurement. One thing it does is answer the question: what is a measurement device? It answers the question that Bohr, as we discussed, and the Copenhagen interpretation addressed by invoking the classical world. They said that a device performs a measurement if it's a classical measuring device, but they didn't quantify which devices should count as classical and which are just other many-body quantum systems talking to this little quantum system. What we can say now is that a device performs a measurement when the pointer of my meter is effectively classical — when the different quantum states of the pointer have decohered. In other words, I can, for all intents and purposes, never see quantum interference between the different states of the meter, because that information has been lost to the amplifier, to the power supply, out onto the grid. If, from an operational point of view, I can never see interference between those different alternatives, they are classical. So in my mind that answers that piece of the question: which devices are classical measuring devices? They are the ones whose meter states decohere through the measuring process, because they are effectively macroscopic. I find that more intuitive. What decoherence doesn't answer is the question: why do I see only one reading on the meter? There's no collapse here. There's still no collapse of the wave function onto one alternative or the other; the system is simply in this mixed state. And there is no way to get randomness out of systems that are fundamentally reversible and deterministic without putting it in by hand. The quantum Bayesians say it's the observer; the Copenhagen interpretation says that somehow something random did happen, we just don't know which outcome it was, and the pointer randomly became this one. That's still a mystery in my mind. It's still something for which we don't have a self-consistent picture; we just proceed in a certain manner, saying something above and beyond quantum mechanics. Adrian, you had a question about the density operator and the notation that we didn't get a chance to address? Yes — so the notation there is a sum of each ket with its own bra? Yes. So it's a sum of projectors onto the same kets? Yep. Can we write it as a sum of outer products of different kets? Not as the fundamental form. It's true that, looking at it from back here, you can write it as a sum over different kets and bras, but it must ultimately reduce to something which is a projector or a sum of projectors. We can always write rho in a basis: if I insert a complete set of states, that's a perfectly good representation, and it is, as you say, a sum of a bra and a ket of different states, but it must ultimately collapse to the spectral form. There is no state of a system which is just one ket times a different bra — that is not a state, because it's not a Hermitian operator, right? As long as the sum ultimately reduces to this, then yes. So it would reduce to that only if you put it in the right basis, correct?
Yes, exactly — you wouldn't always be able to see it directly. As we said, this state doesn't look like a sum of projectors in the z basis, because there it has off-diagonal elements, while in the x basis it's manifestly a projector. So it's better to think about it with basis-independent indicators: is it Hermitian, is it a positive operator — do the expectation values always come out non-negative? You can't always just look at the sum and see it. All right. So, to conclude this part: we have talked for weeks now about the postulates of quantum mechanics. We started with the Copenhagen interpretation and we've been modernizing Copenhagen, and I want to state for you what I would consider to be the modern postulates of quantum mechanics, which unfortunately are not written in your standard textbooks. So I'll write them in two columns: the Copenhagen way and the modern way. Copenhagen says states are vectors in Hilbert space; I'd say the modern statement is that a state is a positive operator. Copenhagen says outcome a comes with probability P of a given by the square of the amplitude, where a is an eigenvector of the observable; the modern statement is that a measurement is described by a POVM, a set of positive operators which sum to the identity, such that the probability of outcome a is the trace of E sub a times rho. These are generalizations of those. Then we have one final thing, about dynamics — the general dynamics of the state of the system. Copenhagen: for a closed system, the state evolves according to a unitary transformation. Modern: generally, a CP map, in which rho of t is given by a sum of Kraus operators acting on the state and conjugating it, and they are not unitary. And the post-measurement state: Copenhagen says that if we find outcome a, the post-measurement state is the corresponding eigenvector; the modern statement is that the post-measurement state is given by the particular Kraus operator associated with that outcome acting on the state. So you now know all of this pretty well. Now, note that the modern column I wrote down is devoid of physics. It says nothing about the physical world — nothing about Hermitian operators, nothing about observables. Where the heck is the physics? Did physics forget about states? Well, what it's saying is that this framework is a kind of information-theoretic framework. It's a framework that has been developed in quantum information science, which is my subject of interest and research, and it tells me that there are parts of quantum mechanics that, in some sense, are about information theory rather than physics: about how we know things about the world and how we make predictions about the world, not about the nature of the physical world as it is. And we want now — finally, one month and a day in — to say how this stuff connects to physics, and that's what we're going to begin now. So how do we hook this information-theoretic foundation onto the physical world? The way we're going to do that is through something called Noether's theorem. Noether was working about a hundred years ago, studying, at that time, classical mechanics, but Noether's theorem is very profound and important. What Noether's theorem tells us is the following: with every physical symmetry there is associated a conserved quantity. So there is an intimate relationship between symmetries and conservation laws. This is something that can be derived; you may have derived it in your classical mechanics course, or seen it somewhere. And then there's a connection here to mathematics, which is that symmetry transformations form a group.
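[Collecting the two columns just dictated, with symbols of my choosing:]
States — Copenhagen: a vector \(|\psi\rangle\) in Hilbert space; modern: a positive operator \(\rho\) with \(\operatorname{Tr}\rho = 1\).
Measurement — Copenhagen: outcome \(a\) with probability \(P(a) = |\langle a|\psi\rangle|^2\), where \(|a\rangle\) is an eigenvector of the observable; modern: a POVM \(\{E_a\}\), \(E_a \ge 0\), \(\sum_a E_a = I\), with \(P(a) = \operatorname{Tr}(E_a\rho)\).
Dynamics — Copenhagen: closed-system unitary evolution, \(|\psi(t)\rangle = U(t)|\psi(0)\rangle\); modern: a completely positive map, \(\rho(t) = \sum_i A_i\,\rho(0)\,A_i^\dagger\).
Post-measurement state — Copenhagen: the eigenvector \(|a\rangle\); modern: \(\rho \mapsto A_a\,\rho\,A_a^\dagger / \operatorname{Tr}(A_a\,\rho\,A_a^\dagger)\), with \(A_a\) the Kraus operator associated with outcome \(a\).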
Now, in particular, what I really should have said a moment ago is physical continuous symmetries, which form a Lie group — a group whose elements are continuous — and the conserved quantities are the generators of the group. So what are some examples of conserved quantities? Energy, momentum, angular momentum. These are conserved in nature because the universe has certain symmetries. Energy is conserved because we believe that, fundamentally, the properties of the universe are the same at any point in time. We can ask whether the fundamental constants are actually changing in time, but to the degree that we believe they aren't, this is related to time-translation invariance: of course we can have local variations, but overall, the laws of physics, we assume, are the same no matter what point in time we're at. Momentum is conserved because of spatial-translation invariance — the laws of the universe are the same at any place in the universe. And angular momentum is about rotational invariance — the universe, at some level, is isotropic. What Noether's theorem says is that each one of these symmetries is associated with a conserved quantity. The symmetries are themselves groups of transformations on space and time, and the generators that generate those transformations — what I mean by a generator I'll explain in a moment — are the energy, the momentum, the angular momentum. So the connection to quantum theory is through the generators of symmetries and the symmetries themselves. So let's talk about groups; let's remind ourselves what a group is. A group is a mathematical construct: a set with a composition law. We have some set of objects — let me just call them g — and there is a composition law such that, if g one and g two are in the group, then their composition is also in the group. There is an identity element, such that composing any element with the identity, in either direction, gives that element back. Every element has an inverse. And — that's right — the composition is also associative. Okay, so that's a group. In particular, it could be a Lie group, if the set is a continuous, differentiable set; I'll just say loosely that the elements are parameterized like points on a line. So how are we going to connect this to quantum mechanics? Excuse me if I squeeze by you here. What we're going to do in a moment is the following — let me explain. Wigner worked out the theory of symmetries in quantum mechanics, and we're going to talk about this in great detail next semester; let me just begin it here. In quantum mechanics, we represent symmetries by unitary or anti-unitary operators — we haven't discussed anti-unitary operators, we'll get to them; this result is due to Wigner. What do I mean by that? For every element of the group I associate a unitary operator, and this is a representation of the group if composing the operators matches composing the group elements. This is what we call a group representation: the elements of the group are now unitary operators, and everything that was a g is now a U depending on that g. So, for example, the U for the identity element — no transformation at all — is just the identity operator, and the U of the inverse element is the inverse unitary, which is the adjoint. This is a representation of the group. Can you always make a unitary representation, or does that only apply to some groups? You can always represent a group by matrices, though not always by unitary ones.
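[In symbols — again my notation — the representation condition just described is:]
\[ U(g_1)\,U(g_2) = U(g_1 \circ g_2),\qquad U(e) = I,\qquad U(g^{-1}) = U(g)^{-1} = U(g)^\dagger, \]
for all group elements \(g_1, g_2, g\), with \(e\) the identity element of the group.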
Now, if it's a Lie group, there's a notion of differentiability, of continuity, and we can talk about elements near the identity. So here's the identity, and I'm going to consider something close to it, in the sense of being a differential distance away — a small distance away. Let's say I have the unitary for the identity composed with some small epsilon, where epsilon is in some sense small, so the element is close to the identity. I claim this is equal to the identity operator plus epsilon times an anti-Hermitian operator. How do I prove that? Let's say the group composition law is addition of the parameters. What do we have to check? That the composition law works, and that the inverse is represented by the adjoint. So compose two of these: you add the parameters, and, to first order, you get the operator close to the identity with the summed parameter — that's fine, the composition law obviously works, since this was addition. The inverse, according to addition, corresponds to minus epsilon, so it's one minus epsilon times A. But A is anti-Hermitian, so minus A is A dagger, and therefore this equals the adjoint of the original operator — which is exactly what the inverse of a unitary must be. So this is a way of representing the elements of the group near the identity, and A is known as the generator of the group. Now, if I don't have a one-parameter group but a multi-parameter group, then I'll have more than one generator — this would generally be a vector of generators — but for the moment let's talk about a one-parameter group, like translation in time, or translation along one direction in space. By convention, we write A as minus i times a Hermitian operator, so that the near-identity unitary is the identity minus i epsilon times that Hermitian generator. Now let me say something about how this connects to physics; let's talk, in the last two minutes, about time translation. We want to translate the system in time from some initial time to some later time, and those time translations are parameterized by all times along the real line. So the group here is represented by a unitary operator parameterized by the time, where I fix the initial time by some agreed convention; this is my time-translation operator. Let's look at the near-identity unitary: say we go just a differential time dt away from t zero. If I didn't translate by any amount, this would be the identity. According to what we just did, translating by dt gives the identity operator minus i times dt times something I'm going to call Omega. Why am I calling it Omega? Because it has the units of one over time, a frequency, and it's a Hermitian operator — it has to be. Now, what is that operator? Well, according to Noether's theorem, the generator of time translation is the thing that's conserved, and what is that? Energy. So this operator has to be proportional to the energy operator, the operator we call the Hamiltonian: Omega equals some constant alpha times the Hamiltonian. Omega has units of one over time and the Hamiltonian has units of energy, so the dimensions of alpha are one over energy times time — one over action; energy times time is the physical quantity we call action. Now, what that constant is, is not derivable; it's just given by nature.
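[A compact restatement of this construction, in my notation:] near the identity,
\[ U(\epsilon) = I + \epsilon A = I - i\,\epsilon\,G,\qquad A^\dagger = -A,\quad G^\dagger = G, \]
\[ U(\epsilon_1)\,U(\epsilon_2) \simeq U(\epsilon_1 + \epsilon_2),\qquad U(\epsilon)^\dagger = I + \epsilon A^\dagger = I - \epsilon A = U(-\epsilon), \]
to first order in \(\epsilon\), with \(G\) the Hermitian generator. For time translation by a differential amount \(dt\),
\[ U(t_0 + dt,\,t_0) = I - i\,\Omega\,dt,\qquad \Omega = \alpha H, \]
with \(\Omega\) Hermitian with units of one over time and \(H\) the Hamiltonian with units of energy, so \(\alpha\) has units of one over (energy \(\times\) time), one over action.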
Can anyone guess what that constant of nature, with units of action, is? Planck's constant — hooray, we finally get h-bar. It's h-bar; that's the constant that appears here, the constant which allows us to translate the physical quantities of energy and time into Hilbert space units. And so we have that the time-translation operator for a differential time step is the identity minus i over h-bar times the Hamiltonian times dt. We have finally written that down, and we will continue with this — symmetries, conservation laws. That's the hope.
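[The final formula, written out:]
\[ U(t_0 + dt,\ t_0) = I - \frac{i}{\hbar}\,H\,dt, \]
with \(H\) the Hamiltonian and \(\hbar\) the constant of nature with units of action that converts energy and time into Hilbert-space units, as just described.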