Will we have homework this week? We don't; the problem set due today is the last one, because next week you'll be studying for your exam. The exam will be, by popular demand, or at least by absence of popular protest, a take-home exam. I'll distribute it after the problem session. We'll do a review, or at least time for more review, on Friday, and then we'll deal with the exam in class the next Tuesday. I'll discuss the rules of the game, what's allowed and what's not allowed relative to the take-home, when the time comes; but they won't be flexible. The exam covers the material in lecture notes 8 through 16, the associated podcasts, and the problem sets through the one due today, the one from October 30th. Note that not every lecture date has its own set of notes, because the lecture notes extend over multiple days; sometimes a date just has a podcast. All the notes are posted at this stage. So the material includes the discussion of quantum dynamics: the Heisenberg picture, the Schrödinger picture, adiabatic evolution, sudden evolution, all that kind of stuff. It includes the wave-mechanical representation of quantum mechanics: position- and momentum-space wave functions, spreading of a wave packet, solutions for bound states, 1D scattering states, all that kind of stuff. And it includes what we've been doing in recent weeks on the harmonic oscillator. So that's a fair bit of material; we'll discuss it more as the time approaches. All right. To sum up what we were talking about last time: we talked about this idea of quasi-classical states in phase space, also known as coherent states.
And we obtained those states by a phase-space displacement of the ground state of the harmonic oscillator. The phase-space displacement operator is a unitary operator which takes this form, or equivalently, in terms of the dimensionless X and P variables, that form. The thing that's special about this state is that it is an eigenstate of the annihilation operator, with a complex eigenvalue alpha. It's the closest approximation we have to a simultaneous eigenstate of X and P. Obviously, it's not a simultaneous eigenstate of X and P; it's an eigenstate of the annihilation operator. But the annihilation operator is a linear combination of X and P, so it's an eigenstate of that non-Hermitian operator. And it has some very nice properties. Its expectation values of X and P are related to the real and imaginary parts of alpha: its mean value in the X operator is centered at x, and its mean value in the P operator is centered at p. Moreover, it is a minimum-uncertainty wave packet, with equal amounts of uncertainty in X and P, just like the ground state of the harmonic oscillator. And, as a function of time... let me step back first. These states are a set of states that exist in Hilbert space whether we're talking about a harmonic oscillator or not. They're just a set of states in Hilbert space, and we can talk about coherent states regardless of whether the system is a harmonic oscillator. However, if the system is a harmonic oscillator, say a particle in a harmonic well, then if at time t equals zero we started in one of these coherent states, at a later time we stay in a coherent state.
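The defining eigenvalue property just described can be checked directly. This is a numerical sketch, not from the lecture: the Fock-space truncation N = 60 and the value of alpha are my own arbitrary choices. In a truncated number basis, the coherent-state coefficients e^{-|α|²/2} α^n/√(n!) satisfy a|α⟩ ≈ α|α⟩ up to truncation error:

```python
import numpy as np
from math import factorial

N = 60                          # Fock-space truncation (assumed large enough for |alpha| ~ 1.6)
alpha = 1.5 + 0.5j              # arbitrary complex eigenvalue

# Annihilation operator in the number basis: (a c)[n] = sqrt(n+1) c[n+1]
a = np.diag(np.sqrt(np.arange(1, N)), k=1)

# Coherent-state coefficients  <n|alpha> = e^{-|alpha|^2/2} alpha^n / sqrt(n!)
n = np.arange(N)
fact = np.array([factorial(k) for k in n], dtype=float)
coeffs = np.exp(-abs(alpha)**2 / 2) * alpha**n / np.sqrt(fact)

# Check  a|alpha> = alpha |alpha>  (exact up to the truncated tail)
residual = np.linalg.norm(a @ coeffs - alpha * coeffs)
print(residual)
```

The residual comes out at the level of the discarded tail of the distribution, i.e. numerically zero for this alpha.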
And the mean value, or as I said, the eigenvalue of the annihilation operator, just evolves according to the classical trajectory. So x(t) and p(t) are just the classical harmonic motion of the particle. And as a wave packet, this is a Gaussian state, and the Gaussian wave packet just oscillates back and forth, exactly like the classical harmonic motion, completely undistorted. So the wave packet is completely undistorted. It doesn't spread. It doesn't breathe. It doesn't do anything. It's just that wave packet going back and forth in the harmonic well, as a classical particle would do. The other thing we discussed about these coherent states is that, of course, if something is going to have a well-defined phase of oscillation, so that you kind of know where it is in the cycle, then it can't have a well-defined number of excitations. Right? It doesn't have a well-defined amplitude; it has to have some uncertainty in its excitation number, because there's a number-phase uncertainty relation. And we saw that explicitly when we decomposed this state in terms of the number states. It is a superposition of number states. Its mean number is the square of the amplitude, just like you would have classically. But it also has some fluctuations in the number of excitations, and those fluctuations, we calculated, are given by the square root of the mean. That's characteristic of a Poisson distribution. And we see that explicitly if we look at the probability distribution, the square of the amplitude in the number basis: it's given by this, which is a Poisson distribution. And in phase space, we can look at this and say: what this state looks like is some kind of packet that's localized in phase space around this value of x and this value of p, with an uncertainty bubble around it that's completely isotropic. Classically, it would just be a point in phase space.
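The Poisson number statistics just described can be verified numerically. This is a small sketch with an assumed amplitude alpha = 2: the distribution P(n) = e^{-|α|²}|α|^{2n}/n! should have mean |α|² and variance |α|², so Δn = √(n̄):

```python
import numpy as np
from math import factorial

alpha = 2.0                      # coherent-state amplitude (assumed real, for simplicity)
nbar = abs(alpha)**2             # mean excitation number |alpha|^2 = 4

n = np.arange(80)                # truncation; the tail beyond n = 79 is negligible for nbar = 4
# P(n) = e^{-|alpha|^2} |alpha|^{2n} / n!  -- a Poisson distribution with mean |alpha|^2
P = np.exp(-nbar) * nbar**n / np.array([factorial(k) for k in n], dtype=float)

mean = (n * P).sum()
var = ((n - mean)**2 * P).sum()
print(mean, var)   # both = 4.0: variance equals the mean, so Delta n = sqrt(nbar)
```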
Classically, it's a pure point: we know exactly what it is. Quantum mechanically, we can't define states as having definite x and p, but we can define this kind of minimum-uncertainty packet that surrounds that point in phase space, with uncertainty one over root two in each direction. And this state is also at minimum uncertainty in number and phase: just looking at it geometrically, the number-phase relation is at minimum uncertainty. Yes, a question: how did time get added to the phase of alpha on the previous board? Adding the time evolution is straightforward: U(t) is a linear operator, so we can bring it inside the sum over number states. And what is U(t) for this Hamiltonian? It's e to the minus i H t over h bar, acting on each number state, which for this Hamiltonian becomes e to the minus i omega t times n plus one half. The overall phase from the one half I can factor out; I don't really care about that. So, putting it all back together: there's e to the minus i omega t n, times alpha to the n, and some overall phase out front which I'll suppress. But look at what that is: the e to the minus i omega t n can be combined with the alpha to the n. So this becomes alpha e to the minus i omega t, all to the power n, over the square root of n factorial, and I'll just put the same thing into the normalization factor too, since the modulus doesn't change anything. And so that's exactly what we said: U(t) acting on the coherent state alpha gives, up to an overall phase, the coherent state alpha e to the minus i omega t, which is alpha of t classically. Classically, the phase just rotates clockwise in phase space. So that's why this is called the quasi-classical state. The center of the wave packet moves exactly along the classical trajectory.
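The phase-rotation derivation above can be checked numerically. A sketch, assuming ħ = 1 and arbitrary sample values of alpha, omega and t: applying the phases e^{-iωt(n+1/2)} to the number-basis coefficients of |α⟩ reproduces the coefficients of |α e^{-iωt}⟩ times the global phase e^{-iωt/2}:

```python
import numpy as np
from math import factorial

def coherent(alpha, N=60):
    # Number-basis coefficients <n|alpha>, truncated at N levels
    n = np.arange(N)
    fact = np.array([factorial(k) for k in n], dtype=float)
    return np.exp(-abs(alpha)**2 / 2) * alpha**n / np.sqrt(fact)

alpha, omega, t = 1.2, 1.0, 0.7          # arbitrary sample values
n = np.arange(60)

# Evolve each number-state amplitude by e^{-i omega t (n + 1/2)}
evolved = np.exp(-1j * omega * t * (n + 0.5)) * coherent(alpha)

# Up to the global phase e^{-i omega t / 2}, this is the coherent state alpha e^{-i omega t}
target = np.exp(-1j * omega * t / 2) * coherent(alpha * np.exp(-1j * omega * t))
err = np.linalg.norm(evolved - target)
print(err)   # 0: the eigenvalue just rotates, alpha(t) = alpha e^{-i omega t}
```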
And moreover, its uncertainty about that classical trajectory is the minimum possible. And that is what makes these the quasi-classical states. But there's a little bit more to the story. I want to conclude our discussion with one more point; I want to say a little bit more about the classical limit. What does it mean? In what sense is classical physics different from quantum physics? There are lots of ways, and so there are lots of notions of the classical limit, depending on which of those ways we're looking to match. One of them is classical dynamics. Let's just talk about dissipation-free classical physics for a moment: we have Hamilton's equations of motion, Newton's laws, right? One thing we might want to see in the classical limit is that the dynamical evolution of the system is described by Hamilton's equations of motion. Now, we talked about this a little bit before. We know that we recover this in the short-wavelength limit, in WKB. We saw that in the eigenstates, and we also saw that if we have a wave packet made up of a superposition of energy eigenstates whose wavelengths are short relative to the scale over which the potential is changing, then we have geometric optics, and geometric optics is Hamilton's equations. So that's one way we see it. The harmonic oscillator, though, is kind of special. What's special about the simple harmonic oscillator is that the frequency of oscillation is independent of energy. If I take my oscillator and displace it over here, just a little bit, giving it that much energy, the period of oscillation is the same as if I started it way up here.
The period is exactly the same; the frequency is independent of the amplitude. That's special: only the harmonic oscillator does that. If I had any other potential, this guy would still roll down and back, but the period would depend on where I started it. And we can see part of this in the fact that the Heisenberg equations of motion are the same as the classical ones: dx/dt is p over m, and dp/dt is minus kx, minus the spring constant times x, which is the force. That's just like the classical equations, and what it means is that no matter what wave packet I choose, the center of the wave packet will follow the classical trajectory: if I take the expectation values of these equations, the expectation values follow the classical trajectory. Of course, if at time t equals zero the mean values are zero, then we never see any evolution at all, as in the stationary states. But if I took any wave packet, it doesn't have to be the displaced ground state, no matter what I chose, its center will just follow the classical trajectory. So if I had some other funky thing, maybe something that looks like this, it would follow the classical trajectory, sweeping around an ellipse in phase space, whatever its shape, because every point on this curve goes around at exactly the same frequency. That's what I just said: it doesn't matter where it is, it goes around undistorted. So there's more to classicality than the mean values following the classical trajectories; we want more than that to say something is classical. There's another aspect that distinguishes classical from quantum, and it has to do with the prediction of measurement outcomes. In classical physics, we can assign a joint probability to both x and p: there's some probability to measure given values of x and p. Classical physics would give us a joint probability distribution on x and p.
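Before going on, the claim that the center of any wave packet follows the classical trajectory in a harmonic well can be checked numerically. This is a sketch in dimensionless units (ħ = m = ω = 1, all assumed); the random wave packet and the truncation are my own choices. Evolving in the number basis is exact, and ⟨x⟩(t) should equal x₀ cos t + p₀ sin t:

```python
import numpy as np

N = 20
rng = np.random.default_rng(0)

a = np.diag(np.sqrt(np.arange(1, N)), k=1)   # annihilation operator
x = (a + a.conj().T) / np.sqrt(2)            # dimensionless position
p = -1j * (a - a.conj().T) / np.sqrt(2)      # dimensionless momentum

# An arbitrary "funky" wave packet: random support on the lowest 10 levels
c0 = np.zeros(N, complex)
c0[:10] = rng.normal(size=10) + 1j * rng.normal(size=10)
c0 /= np.linalg.norm(c0)

x0 = np.vdot(c0, x @ c0).real
p0 = np.vdot(c0, p @ c0).real

t = 1.3                                       # any time (omega = 1)
ct = np.exp(-1j * t * (np.arange(N) + 0.5)) * c0   # exact evolution in the number basis

xt = np.vdot(ct, x @ ct).real
print(xt, x0 * np.cos(t) + p0 * np.sin(t))    # agree: <x>(t) follows the classical trajectory
```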
In classical statistical physics, that's what you do, right? That might be the Boltzmann distribution if it's in thermal equilibrium, or some non-equilibrium distribution, whatever it is. And the fact that we can do that means that if we were to measure x and p, or x or p, we know how to calculate the statistics from this probability distribution. And this probability distribution has some properties: these are numbers that are always positive, or non-negative I should say, and we typically normalize the distribution to one. In quantum physics, we generally can't make this assignment, because x and p don't commute. Now, it's a little bit subtle why those two statements are the same, but they are. We know, for one thing, that if x and p commuted, then we could write down simultaneous eigenstates of x and p, right? And then we could define some joint probability distribution on x and p as the squared amplitude on a simultaneous eigenvector of x and p. We could try to do that. But these states don't exist. There are no states that are simultaneously eigenstates of x and p: if I'd like to find a state of this form, I cannot do it, because x and p don't commute. Yes, Steven? The question is whether this applies to the Wigner function as well; we'll come to that in a moment. So, because this is not true, I can't generally make this assignment. Now, as you've been studying, there is this thing called the Wigner function, which is called a quasi-probability distribution. The property it has is the correct marginals. If I have a joint probability distribution on two random variables, then the marginal probability distribution is the probability distribution on one of those random variables, irrespective of the other.
So we said the marginal probability density to have a certain value of x is obtained by integrating over all possible p; that's the marginal of the distribution. And we demand that this equal, for a pure state, the square of the position-space wave function, and likewise that the other marginal give the probability density in momentum space. Why is the Wigner function only a quasi-probability? Because it can be negative, which means it cannot represent the joint probability to find x and p, because probabilities can't be negative. So one of the things we would demand of a classical limit is that in the classical limit the Wigner function be positive. And what distinguishes, in some sense, a classical state from a non-classical state is whether its Wigner function is everywhere positive or goes negative. And this comes back to Stephen's question: there can be states where the Wigner function is everywhere positive. In fact, you're computing one for homework right now: the coherent state. The coherent state is a state whose Wigner function is everywhere positive. Yes, as someone said, for the coherent state it should just be a nice little hill. Exactly. So let me show you; I'm giving away the homework answer before you derive it, but let me show you some pictures. Let me get the computer going. So, this is a figure taken from a review article on quantum optics. What am I showing you here? These are Wigner functions, plotted as functions of position and momentum; position is called q here, and momentum p. And we're looking at this for four different states. Guess what this state is. Ground state? Could be the ground state, indeed.
Or any coherent state, which is just a displaced ground state, displaced to some mean value of q and p. Notice, as you suggested, it's just a little hill centered at that point, and the Wigner function is everywhere positive. So I can think about this as if there were a classical distribution of possible complex amplitudes and I just don't know which one it is. This is another example of a state with a positive Wigner function. You know what that state is? That's a squeezed state. A squeezed state is a state which is also a minimum-uncertainty wave packet, but does not have equal uncertainty in x and p. And the marginals are being shown here: these projections onto the screens are obtained by integrating over all p to get the distribution in x, and integrating over all x to get the distribution in p. Right? Now, what is this state? An excited number state, yes. Which one is it? The second? Another guess. Count the nodes. Excellent: count the nodes. This has one node, so it's the first excited state. And remember, this is the square of the wave function, not the wave function itself: this is the squared wave function in p, and this is the squared wave function in x, or q, whatever you want to call it. Notice that in phase space it's exactly what you would expect. If here's the ground state at the origin of x and p, or q and p, the first excited state is localized around a circle, with that amplitude. The Wigner function must be rotationally symmetric around the origin of phase space; it must be, and that's what is being shown here. And notice that this Wigner function goes negative. It has to be, because there's no other way I'm going to get a node in the marginals when I do the projection onto the x or p axis; it's got to be negative. So this kind of state is highly non-classical.
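These Wigner-function values can be computed directly from the definition. A sketch in units with ħ = 1 (the grid and its range are my own numerical choices): W(x,p) = (1/π) ∫ dy ψ*(x+y) ψ(x−y) e^{2ipy}. At the origin, the ground state gives +1/π (a positive hill), while the first excited state gives −1/π, the negativity just discussed:

```python
import numpy as np

def wigner(psi, x, p):
    # W(x,p) = (1/pi) * Integral dy  psi*(x+y) psi(x-y) e^{2 i p y}   (hbar = 1)
    y, dy = np.linspace(-8, 8, 4001, retstep=True)   # integration grid (assumed wide enough)
    f = np.conj(psi(x + y)) * psi(x - y) * np.exp(2j * p * y)
    return (f.sum() * dy).real / np.pi

psi0 = lambda u: np.pi**-0.25 * np.exp(-u**2 / 2)                      # ground state
psi1 = lambda u: np.pi**-0.25 * np.sqrt(2.0) * u * np.exp(-u**2 / 2)   # first excited state

W0 = wigner(psi0, 0.0, 0.0)   # +1/pi: positive, the coherent-state-like hill
W1 = wigner(psi1, 0.0, 0.0)   # -1/pi: negative at the origin, highly non-classical
print(W0, W1)
```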
And in fact, if I had a very highly excited number state, even though its WKB probability distribution would be peaked near the turning points in its shadows in x and p, the Wigner function would still be really negative; it's only when I make a superposition of number states that I can get rid of the negativity. What about this guy? What's this? Something you might have seen in your homework. Close. Kind of like that. It's not an eigenstate of the harmonic oscillator, is it? It is not. Could it be two Gaussians, someone asks? Two Gaussians! That's exactly right. This is a Schrödinger cat state. Look at it: this is a state which is a superposition of being here, with this position and this momentum, or being here, with this position and this momentum. If it were a classical distribution of amplitudes, I would have some probability to be here and some probability to be here. But because I'm in a quantum superposition of being here and being there, I have quantum interference between those alternatives, and I have all this quantum weirdness in the middle. Notice: this particular guy is in a superposition of two positions. Right? So when I look at the shadow on the position screen, it has some probability to be here or there; that's the square of the wave function. What does it look like in momentum space? It looks like that. You know what that is? That's the double-slit interference pattern: what you see if I have a double slit and the particle is in a superposition of being here or there, so that in the far field I see the Fourier transform.
The Fourier transform is the momentum-space wave function. So the negativity here is an indication of the non-classicality of the state. If we go to the classical limit, what we want is for these negative pieces to somehow go away, so that I'm left with just this or that, rather than this in superposition with that. And there's no way to get that dynamically out of closed-system dynamics alone. To get rid of this, and we'll talk about this, we need decoherence. Decoherence is what gets rid of these negative fringes and leaves us with the two classical alternatives. We mentioned decoherence at the beginning of the semester, and hopefully we'll talk more about it as the semester progresses. But one of the messages I want you to take away from all of this is that one of the modern perspectives on quantum mechanics is to think about classicality versus quantumness in terms of the statistics, the probability distributions, of what you can measure, and moreover how those statistics can be used as a resource for processing information. So a classical state, even though it is, you know, a state of quantum mechanics, for all intents and purposes produces measurement statistics that just look like classical Gaussian fluctuations, whereas the statistics I would see from these other guys would be nothing like that. And that distinction, between what statistics and probability distributions a state can generate under measurement, is one of the ways quantum mechanics is distinguished from classical mechanics. It's not just about Hamilton's equations versus the Schrödinger equation; it's about the intrinsic statistics of measurement. So a quantum harmonic oscillator, in some sense, is always classical: because it's linear, if you start in the ground state and apply a harmonic force, you're just going to move this classical bubble around. To make something non-classical, you need something non-linear. Very good.
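The double-slit analogy above can be made quantitative. A sketch with assumed parameters (unit-width Gaussians separated by ±x₀, my own choice of x₀ = 3): the Fourier transform of e^{-(x−x₀)²/2} + e^{-(x+x₀)²/2} is proportional to e^{-p²/2} cos(p x₀), so the momentum density shows fringes with zeros at p = π/(2x₀), exactly like a double slit:

```python
import numpy as np

x, dx = np.linspace(-15, 15, 6001, retstep=True)
x0 = 3.0                                   # separation of the two packets (assumed value)

# "Cat" state: superposition of two displaced ground-state Gaussians
psi = np.exp(-(x - x0)**2 / 2) + np.exp(-(x + x0)**2 / 2)
psi /= np.sqrt((abs(psi)**2).sum() * dx)   # normalize

def mom_density(p):
    # |psi-tilde(p)|^2 with psi-tilde(p) = (2*pi)^{-1/2} Integral dx psi(x) e^{-ipx}
    tilde = (psi * np.exp(-1j * p * x)).sum() * dx / np.sqrt(2 * np.pi)
    return abs(tilde)**2

# Interference fringes: zeros where cos(p * x0) = 0, i.e. at p = pi / (2 * x0)
d_peak = mom_density(0.0)
d_zero = mom_density(np.pi / (2 * x0))
print(d_peak, d_zero)   # large central peak, fringe zero
```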
So that concludes, for now, what we're going to talk about relative to the harmonic oscillator. Let's continue on the blackboard. So far we've been talking about quantum mechanics with one degree of freedom. We've talked about a spin one-half, in which case our Hilbert space was a two-dimensional complex vector space. Right? Or we've talked about a particle moving in one dimension, where the Hilbert space was the space of square-normalizable functions on the real line. Right? That's what we've been talking about. How do we talk about more than one degree of freedom? Let's first review how we do that in classical physics. Well, let's look at some examples. We might have two particles moving along a line, each with a position and a momentum. Or we might have one particle moving in a plane. Both are examples of two degrees of freedom. And the way we handle it classically is quite simple: the phase space is what we call the Cartesian product of the phase spaces for each degree of freedom. So if I have X1, P1 for particle 1 and X2, P2 for particle 2, I take the Cartesian product of those two phase spaces, and I have coordinates X1, X2, P1, P2. That's the Cartesian product. And the total dimension of the phase space is two times the number of degrees of freedom. Now, of course, which coordinates we use is not unique; we have lots of choices. For example, here I can talk about the positions of particle 1 and particle 2. Or I might talk about the relative position between particle 1 and particle 2, and the center-of-mass position of the two particles. That's a perfectly good set of coordinates. And then there are conjugate momenta to these, P relative and P center-of-mass, and I could use the relative and center-of-mass positions and momenta as my description. That's perfectly good.
Or, for the particle in the plane, instead of the x and y coordinates I could use the cylindrical radius, script rho, and the angle phi. That's a perfectly good set of coordinates, with a conjugate momentum P rho, the radial momentum, and a conjugate momentum P phi, which is of course the angular momentum conjugate to the angle. So that's another possible set of coordinates. In every case, the classical phase space is the Cartesian product of the phase spaces of the individual degrees of freedom. Okay? All right, so that's classical physics. What about quantum mechanics, the quantum description? Let's just consider two degrees of freedom: two particles on a line, or one particle in a plane. Call the coordinates and momenta X A, P A and X B, P B, where A and B represent the two degrees of freedom; they could represent two particles, or the two coordinates associated with the plane. Now, of course, in the quantum description these are operators. Moreover, X A and P A satisfy the canonical commutation relation for degree of freedom A, and likewise X B and P B for degree of freedom B. But what about commutators between A operators and B operators? We know what they have to be if we think about, for example, the fact that P is the generator of translations in position. Let's think about the 2D case, with an x component and a y component. What we expect is that if I do a translation by some amount delta x, I translate the X operator by delta x and leave Y alone: the translation operator, you recall, is e to the minus i over h bar P hat dot delta x, which is approximately 1 minus i over h bar P hat dot delta x when delta x is small. So if I look at that infinitesimal change, the i-th component gives X hat i minus i over h bar, sum over j, of the commutator of X hat i with P hat j, times delta x j, and this is supposed to equal X hat i plus delta x i. And so, for this
to work, the commutator of X hat i with P hat j has to equal i h bar times the Kronecker delta, delta i j. So what we see here is a general rule, as we'll understand more in a moment: observables associated with different degrees of freedom commute. So I can simultaneously talk about the position along x and the momentum along y; quantum mechanics allows that. All right, what about the Hilbert space in this case? For this particle moving in a plane, in the defining representation, since I can simultaneously specify the two degrees of freedom, I can specify a joint amplitude psi of x a and x b, such that its square is the joint probability density to be near x a and x b at the same time. And likewise I can talk about the joint momentum-space wave function. Now, if these wave functions are going to give normalized probability densities, they have to be in the set of square-normalizable functions on the plane. So this is the Hilbert space we're talking about: functions of two real variables which are square-normalizable. Of course, the position- and momentum-space wave functions are related to one another by the Fourier transform; no reason they shouldn't be. And in this case, because x and p for different degrees of freedom commute, I can even write down something mixed: a joint amplitude psi of x a and p b, with half a tilde on it, say; we don't have a name for it per se, but I can do it. That is to say, I Fourier transform with respect to the b variable only, and by squaring this I get a joint probability distribution for the position of particle a and the momentum of particle b. There's no reason I can't do that. It's not the same thing as the Wigner function, which had to do with x and p for a single degree of freedom; these are two different degrees of freedom. Now, there is a special set of states, which are product states. Suppose that my wave function, and hence my joint probability,
distribution separates: suppose psi of x a, x b is a product of two functions, chi of x a times phi of x b. Then of course the joint probability distribution factorizes into a product of the two single-particle probability densities, where each factor is the square of the corresponding wave function. The product states are in some sense uncorrelated: that is to say, the probability of finding particle a somewhere has nothing to do with where b is; the two probabilities factorize. But not every state is of this form, and the states that aren't have a special name we'll come to in detail later: they're called entangled states. Suppose I have a state of the joint system which is some amount of the product chi times phi plus some other amount of a different product, eta times xi, where eta is not proportional to chi. Then the probability distribution for this state does not factorize. But this state is certainly in the Hilbert space: if chi times phi is in L2 of R2, and eta times xi is in L2 of R2, then by definition any superposition of them is too. Now moreover, every state in our Hilbert space, not a two-dimensional Hilbert space, but our Hilbert space on the two-dimensional plane, can be made up of superpositions of product states. How do we know that? Well, let's look at a basis. Yes, exactly as the question says: every wave function in this Hilbert space can be expressed as some superposition of products; not every state is a product state, but every state is a superposition of product states, because there exist basis vectors that are product vectors. That's what I'm about to show. Take the simple harmonic oscillator number states: they are a basis for the Hilbert space on the line. Before I state my claim, recall what it means to be a basis: the set is complete. In position representation, completeness says that the sum over all the number states of u n of x times u n of x prime
is delta of x minus x prime. That's what it means to be a basis. Now, I claim the following: to get a basis for the joint system, take the products of all the basis vectors for degree of freedom A with all the basis vectors for degree of freedom B. How would I prove that? Well, what I want to be true is that when I sum over all these products, both indices running to infinity, of u m of x a, u n of x b, u m of x a prime, u n of x b prime (these should have stars, of course, but they're real, so I leave them off), I get a two-dimensional delta function. That's the completeness relation on the plane. Is that true? Sure, it's obviously true, because it just factorizes: one sum gives the delta function for A, and the other sum gives the delta function for B. So what this says is that I can form the joint Hilbert space from superpositions of product states of the individual degrees of freedom, and that has a technical name: it's called the tensor product, not to be confused with the Cartesian product. So what this says is that the Hilbert space L2 on R2 is equal to the tensor product of the two Hilbert spaces for each degree of freedom: this is H A, this is H B, and this is the joint Hilbert space of the two degrees of freedom. We can define this formally in Dirac notation. So let's consider some Hilbert space which I've formed as the tensor product of two Hilbert spaces. Suppose I have a ket psi A in Hilbert space H A and a ket phi B in Hilbert space H B. Then I define a product state just like I did with the wave functions, but I put this tensor product symbol in the middle: a state in the joint Hilbert space obtained by taking the tensor product, which is just like multiplying the wave functions with one another. Often we are sloppy and leave out the tensor product symbol, just putting the kets next to one another, but the tensor product is understood. Yes, there's a typo on the board there, the B subscript.
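The distinction between product states and entangled superpositions of products can be made concrete numerically. A sketch with assumed 4-level truncations of each degree of freedom and random amplitudes: writing a joint state's coefficients in a product basis as a matrix c[m, n], the state is a product state exactly when that matrix has rank one; a generic superposition of two products has rank two (it's entangled):

```python
import numpy as np

rng = np.random.default_rng(1)
chi, eta = rng.normal(size=4), rng.normal(size=4)   # amplitudes for degree of freedom A
phi, xi = rng.normal(size=4), rng.normal(size=4)    # amplitudes for degree of freedom B

# Coefficient matrix c[m, n] = <m, n | Psi> in a product basis
product = np.outer(chi, phi)                        # chi (x) phi : a product state
entangled = np.outer(chi, phi) + np.outer(eta, xi)  # superposition of two products

# Schmidt rank = number of nonzero singular values: 1 for a product state, >1 if entangled
r1 = np.linalg.matrix_rank(product)
r2 = np.linalg.matrix_rank(entangled)
print(r1, r2)   # 1 2
```

The rank of the coefficient matrix is the Schmidt rank; this is the standard diagnostic for entanglement of a pure bipartite state, though the lecture hasn't introduced that name yet.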
Yes, that should be fixed, thank you. So what are the rules of the game? Well, the tensor product has the following kinds of properties. It's linear: if I take a scalar, a complex number, and multiply the product state, that's equivalent to multiplying either factor; it's the same thing. Another part of linearity, of course, is that if I take the tensor product of a superposition with a ket, that's the sum of the tensor products. Then there's the inner product. Suppose I have two product states: the inner product between this product state and that product state is the inner product on the first degree of freedom times the inner product on the second degree of freedom. That's exactly what you would have with two wave functions. I mean, this is very fancy notation, but if I had a product wave function, psi one of x a times phi one of x b, and I look at the inner product of this with another product wave function, how would I do it? I'd conjugate one, multiply it by the other, and integrate over the two variables, and that integral factorizes into a product of two integrals. That's what this is saying in a more abstract form. Okay. And finally, the last rule: suppose A is an operator that acts on Hilbert space H A, mapping vectors in H A to other vectors in H A, and I have an operator B that acts on Hilbert space H B. Then I can define an operator that acts on the joint Hilbert space of the two degrees of freedom: for example, the product operator A tensor B. How does this operator act? It acts such that, on a product state, the first operator acts on the first degree of freedom and the second operator acts on the second. And yes, an operator can only act on an object in the right space: an operator has a domain, and the domain is defined by the Hilbert space upon which it acts.
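The inner-product rule just stated can be checked with coefficient vectors, where the tensor product of kets becomes a Kronecker product. A sketch with assumed dimensions 3 and 5 and random complex amplitudes: ⟨psi1 ⊗ phi1, psi2 ⊗ phi2⟩ equals ⟨psi1, psi2⟩ ⟨phi1, phi2⟩:

```python
import numpy as np

rng = np.random.default_rng(2)
psi1 = rng.normal(size=3) + 1j * rng.normal(size=3)   # kets in H_A (dim 3, assumed)
psi2 = rng.normal(size=3) + 1j * rng.normal(size=3)
phi1 = rng.normal(size=5) + 1j * rng.normal(size=5)   # kets in H_B (dim 5, assumed)
phi2 = rng.normal(size=5) + 1j * rng.normal(size=5)

# Tensor product of kets = Kronecker product of coefficient vectors
lhs = np.vdot(np.kron(psi1, phi1), np.kron(psi2, phi2))
rhs = np.vdot(psi1, psi2) * np.vdot(phi1, phi2)
diff = abs(lhs - rhs)
print(diff)   # 0: the inner product factorizes
```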
we can make this concrete. Here's an example, just to finish this last thought. Consider a particle moving in the x, y plane. When I write the x operator, the operator that acts on the x degree of freedom, that's really a shorthand for x tensor the identity on y. Because it really is a shorthand, we rarely write it this way; it would be incredibly pedantic. And p y, for example, really means the identity on the x coordinate tensor the momentum operator on the y coordinate. That's what it really means. Now, why do these guys commute? Why do operators on different degrees of freedom commute? Because of this structure. The commutator of x with p y really is the commutator of x tensor the identity on y with the identity on x tensor the momentum on y. So let's write that out: it equals x tensor identity-on-y, times identity-on-x tensor momentum-on-y, minus identity-on-x tensor momentum-on-y, times x tensor identity-on-y. That's just writing out the commutator in absolutely gory detail. Now, how do we evaluate these products? I didn't tell you this rule yet: A tensor B times C tensor D is AC tensor BD; that follows just from how they act on product states. So now you tell me: what's this times this? The first product is x times identity, tensor identity times momentum, which is x tensor p y. What about the other one?
1 times this, this times 1: it's again x tensor p y, so the commutator vanishes. Operators acting on different degrees of freedom commute because they will always have this structure: if an operator acts on this degree of freedom, it has the identity on all the others, and the identity commutes with everything. So it's always the case that observables associated with different degrees of freedom commute. For example, with spin: we know spin along x and spin along y do not commute, but the spin along x of particle 1 commutes with the spin along y of particle 2. It's only when they belong to the same particle, the same degree of freedom, that they fail to commute. We'll continue this next time.
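The spin example above can be sketched with Pauli matrices and Kronecker products (the two-spin construction is mine, but it is exactly the σx ⊗ I, I ⊗ σy structure from the lecture): on one particle σx and σy do not commute, but σx of particle 1 commutes with σy of particle 2:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], complex)     # Pauli sigma_x
sy = np.array([[0, -1j], [1j, 0]], complex)  # Pauli sigma_y
I2 = np.eye(2)

def comm(A, B):
    return A @ B - B @ A

# Same particle: sigma_x and sigma_y do not commute (the commutator is 2i sigma_z)
n_same = np.linalg.norm(comm(sx, sy))

# Different particles: sigma_x (x) I  commutes with  I (x) sigma_y
Sx1 = np.kron(sx, I2)    # sigma_x of particle 1
Sy2 = np.kron(I2, sy)    # sigma_y of particle 2
n_diff = np.linalg.norm(comm(Sx1, Sy2))

print(n_same, n_diff)    # nonzero, 0
```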