Right, so last time we talked about the properties of a particle diffusing in the presence of a magnetic field. We saw that the different components of the velocity of this particle were correlated with each other, and we also saw that the diffusion coefficient in the direction transverse to the magnetic field was reduced from the normal free diffusion coefficient. Now let us turn to the problem of the analysis of noise in general. I would like to introduce various kinds of noise, but we need some tools to understand this, so for a while let us look at some general formalism. I would like to introduce concepts like the power spectral density and the Wiener-Khinchin theorem, which help us analyze noise in general without any specific reference to whether the process is Markovian or anything like that. One of the key tools in this kind of analysis is the exploitation of the fact that noise, under suitable conditions, is stationary: you have a stationary random process, and then a great deal of simplification occurs. If the process is non-stationary, what one does is to look at it in windows of time over which it is essentially stationary; then, approximately, you can treat the process as stationary and go through the formalism I am going to develop now. So let us first look at what is meant by the power spectrum of a noise.
We have in mind, in the simplest instance, some random process as a function of time which is noisy, so it has a very irregular time dependence. To be specific, let us call this noise X(t): a random process, which we will assume to be stationary. If you plot X(t) in a typical realization as a function of time, you are going to get some highly irregular curve. In the case of Brownian motion we saw it was so irregular that it was not differentiable anywhere; in other cases the process could be differentiable, but in any case it is unpredictable in some specific sense, because it is noisy. Now, one would like to understand an irregular curve like this. Based on our experience with very complicated curves, such as those which occur in sound, the natural thing is to Fourier analyze the whole thing and ask what the Fourier components look like: what is the frequency content of this noise? But this is not always trivial, because a function needs to be absolutely integrable before it has a Fourier transform, so it is possible that the Fourier transform of X(t) does not exist in that sense. On the other hand, we have a much more powerful tool, which is the autocorrelation of this random process, and that is a much smoother function, as we have seen. What you do is take the product X(t_0) X(t_0 + t) and average it over all realizations; if the process is stationary, this is a function of t alone, and in general it is expected to die down as t becomes very large.
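As a quick numerical illustration of estimating such an autocorrelation by averaging over the time origin, here is a sketch; the AR(1) test process and all parameter values are illustrative choices, not from the lecture:

```python
import numpy as np

# Sketch: estimate the autocorrelation <X(t0) X(t0 + k)> of a stationary,
# zero-mean process by averaging over the time origin t0.
rng = np.random.default_rng(0)

# Discrete Ornstein-Uhlenbeck-like (AR(1)) series: stationary, zero mean,
# with correlation a**k at lag k after normalization.
a, n = 0.9, 200_000
x = np.empty(n)
x[0] = rng.normal() / np.sqrt(1 - a ** 2)   # start in the stationary state
noise = rng.normal(size=n)
for i in range(1, n):
    x[i] = a * x[i - 1] + noise[i]
x *= np.sqrt(1 - a ** 2)                    # normalize so that <x^2> = 1

def autocorr(x, max_lag):
    """Time-average estimate of <x(t0) x(t0 + k)> for k = 0..max_lag."""
    return np.array([np.mean(x[:len(x) - k] * x[k:]) for k in range(max_lag + 1)])

phi = autocorr(x, 10)   # phi[k] should be close to a**k and die down with k
```

The estimate `phi[k]` decays like a^k, showing the correlation dying down with lag, as described above.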
We will assume that the process has zero mean, since this simplifies the writing of the formulas; otherwise I would have to subtract out the mean each time when writing correlation functions. So we would like to look at a correlation function like ⟨X(t_0) X(t_0 + t)⟩, and if the process is stationary this is of course equal to ⟨X(0) X(t)⟩, and we could ask what its Fourier transform tells us, in principle. It turns out that there is a very deep connection between what the process itself does and what the Fourier transform of the correlation function does. This is the content of the so-called Wiener-Khinchin theorem, which I will write down; we are not going to prove it rigorously, but I will motivate it and we will go through some of the steps to see what is entailed. What one does is look at the process over a long interval of time, say from 0 to T. You sample it at various instants t_1, t_2, ..., t_n, compute e^{iω t_i} X(t_i) at each of those instants, multiply by the small interval of time around each one, sum over i, and let n become very large; then take the modulus squared of this quantity, so that it becomes real, average it, and multiply by 1/(2πT), in the limit as T tends to infinity. So I am trying to do a kind of Fourier transform: I am weighting the samples with e^{iω t_i}, multiplying by the time interval, and summing all these pieces together. In the limit in which the sampling intervals become infinitesimal, this quantity is

(1/(2πT)) |∫_0^T dt e^{iωt} X(t)|².
This is a function of ω. Let us try to see what this function gives us. It will turn out, and this is the Wiener-Khinchin theorem, that in the limit T → ∞ this quantity is equal to the Fourier transform with respect to time of the correlation function. That is our target: we would like to establish that this limit is the Fourier transform of the autocorrelation. This limit, whatever it is, is by definition the power spectrum of the random variable, S_X(ω), and we will see what information it contains. (Pardon me? Ah yes, T tends to infinity, of course.) So let us see how this arises. The proof is subtle; it is not a trivial theorem. We are going to slur over the important part of it, the part that really requires justification, and go through just the algebraic manipulation, but it will motivate how the result arises. To start with we look at that integral, but before that, a couple of properties of the correlation function.
Let me call this correlation function Φ_X(t), with the subscript to show that it is for the variable X; we are soon going to have different random processes, different components of a vector random process for instance, so I need slightly better notation, and when I come to that I will be careful. So Φ_X(t) = ⟨X(0) X(t)⟩. But notice that, because of stationarity, I can shift both time arguments by any amount without changing anything. Subtracting t from each argument, this is also equal to ⟨X(−t) X(0)⟩, which is ⟨X(0) X(−t)⟩ if these are classical variables. (If they are quantum variables, operators, then we have to be very careful: there is a formalism which tells you what the correct answer is, and you cannot commute these things freely. But at this level they are classical variables.) Therefore this is equal to Φ_X(−t). So the first piece of information we have is that, in the simplest instance of a single-component (scalar) stationary process, the autocorrelation function is a symmetric function of the time. This is why, when we computed it for one component of the velocity of a Langevin particle, we found e^{−γ|t|} for the correlation: a symmetric function. We are going to exploit this as we go along. If you have more than one component, the symmetry property becomes a little more complicated, and we will come to that in its time.
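The symmetry Φ_X(t) = Φ_X(−t) can be seen directly in a time-series estimate; in this sketch (the AR(1) process and the lag are illustrative choices) the two estimates coincide term by term, because shifting the time origin turns one into the other:

```python
import numpy as np

# Demonstration that the stationary time-average estimates of
# <X(t0) X(t0 + k)> and <X(t0) X(t0 - k)> are the same sum.
rng = np.random.default_rng(1)
a, n_steps = 0.8, 100_000
x = np.zeros(n_steps)
noise = rng.normal(size=n_steps)
for i in range(1, n_steps):
    x[i] = a * x[i - 1] + noise[i]

k = 5                                  # lag in steps
phi_plus = np.mean(x[:-k] * x[k:])     # estimate of <X(t0) X(t0 + k)>
phi_minus = np.mean(x[k:] * x[:-k])    # estimate of <X(t0) X(t0 - k)>, origin shifted by k
# The two averages contain identical products, so Phi(k) = Phi(-k) exactly.
```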
So let us look at what S_X(ω) becomes. I will omit the limit for the moment and just look at the integral:

|∫_0^T dt e^{iωt} X(t)|² = ∫_0^T dt_1 e^{iω t_1} ∫_0^T dt_2 e^{−iω t_2} X(t_1) X(t_2),

the minus sign appearing because I want the complex conjugate of one factor. Now, a whole sequence of manipulations. First, I can write this as ∫_0^T dt_1 ∫_0^T dt_2 X(t_1) X(t_2) e^{iω(t_1 − t_2)}; no averaging is being done yet, I am just sampling the time series. But X(t_1) X(t_2) is a symmetric function of t_1 and t_2, and the whole quantity is real, so the imaginary part must vanish identically. Indeed it does: the imaginary part involves sin ω(t_1 − t_2), which is odd under the interchange of t_1 and t_2, so it vanishes. The exponential can therefore be replaced by cos ω(t_1 − t_2). Moreover, because the integrand is symmetric under t_1 ↔ t_2 and the range of integration (0 to T in each variable) is symmetric, this can be written as twice the integral over the region t_2 < t_1:

2 ∫_0^T dt_1 ∫_0^{t_1} dt_2 X(t_1) X(t_2) cos ω(t_1 − t_2).

The next step is obvious: change variables from t_2 to t_1 − t_2. Set t_1 − t_2 = t, so dt_2 = −dt; when t_2 = t_1 the new variable is 0, and when t_2 = 0 it is t_1. So this is

2 ∫_0^T dt_1 ∫_0^{t_1} dt X(t_1) X(t_1 − t) cos ωt.

Now let us interchange the order of integration, and see what this is going to be. If I interchange the order of integration, this is
equal to the following: t_1 runs from 0 to T and t runs from 0 to t_1, so after the interchange t runs from 0 to T and t_1 runs from t to T:

2 ∫_0^T dt ∫_t^T dt_1 X(t_1) X(t_1 − t) cos ωt.

The obvious thing to do now is to change variables to t_1 − t = t′, so dt_1 = dt′; when t_1 = t we get t′ = 0, and when t_1 = T we get t′ = T − t. Pulling out cos ωt, which does not involve t′, this equals

2 ∫_0^T dt cos ωt ∫_0^{T−t} dt′ X(t′) X(t′ + t).

Now look at what is emerging: you have precisely the structure you need for the correlation function of a stationary process, because it says: take any instant of time t′, take X at that time and X at the staggered time t′ + t, multiply the two, and keep doing this, summing over all t′. And this is the step which requires rigorous justification: if the random process has the property of ergodicity, namely that given enough time it takes on all the values in its available sample space over and over again, then the time average represented by that integral is equal to the ensemble average over some prescribed distribution for the stationary variable, which we have not specified.
This property is known as ergodicity: the time average over a very long time, in the limit T tending to infinity, tends to the ensemble average. This is at the root of equilibrium statistical mechanics, if you think about it, because it says that, given enough time, all the accessible microstates are accessed by the system, and the long-time average is equal to an ensemble average over some prescribed distribution which you have to find. What you actually measure in experiments are time averages; what you compute using the rules of statistics or statistical mechanics are ensemble averages; and the article of faith is that one is equal to the other. This requires rigorous proof, and it is the property of ergodicity. In the context of random processes you have to specifically check that a given random process is indeed ergodic; this is possible to do once you know a little about the statistics of the process. We are not going to prove it; we are going to assume it is true. Then

lim_{T→∞} (1/T) ∫_0^{T−t} dt′ X(t′) X(t′ + t)

is indeed equal to the ensemble average ⟨X(t′) X(t′ + t)⟩. That is the property of ergodicity we are using.
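The ergodicity statement can be illustrated numerically; in this sketch (the AR(1) process and all numbers are illustrative choices, not from the lecture) a long time average over one realization is compared with an ensemble average over many realizations:

```python
import numpy as np

# Ergodicity sketch: time average over ONE long realization vs ensemble
# average over MANY realizations, for x_{i+1} = a x_i + noise.
rng = np.random.default_rng(3)
a, lag = 0.9, 3
phi_exact = a ** lag / (1 - a ** 2)   # exact <x(t') x(t'+lag)> for this process

# Time average over a single long realization.
n = 300_000
x = np.zeros(n)
noise = rng.normal(size=n)
for i in range(1, n):
    x[i] = a * x[i - 1] + noise[i]
time_avg = np.mean(x[:-lag] * x[lag:])

# Ensemble average at one fixed pair of times, over many realizations.
n_real, burn = 100_000, 60            # burn-in much longer than the relaxation time
xc = np.zeros(n_real)
for i in range(1, burn + lag + 1):
    xc = a * xc + rng.normal(size=n_real)
    if i == burn:
        x_t0 = xc.copy()
ens_avg = np.mean(x_t0 * xc)
# time_avg, ens_avg, and phi_exact all agree within statistical error.
```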
Then of course the power spectrum reduces accordingly. We have done a little sleight of hand here: I have interchanged limits, taken the T → ∞ limit inside, and then argued that the resulting quantity is the ensemble average. This is the part I am slurring over, but it can be made rigorous; physically it is clear that, if ergodicity is valid, this integral, which is a time average, is equal to the correlation function. And of course, by stationarity, ⟨X(t′) X(t′ + t)⟩ = ⟨X(0) X(t)⟩ = Φ_X(t); that is how we defined this correlation. So finally:

S_X(ω) = (1/π) ∫_0^∞ dt ⟨X(0) X(t)⟩ cos ωt,

the factor 2/(2π) giving the 1/π. But we already saw that Φ_X(t) is a symmetric function, so you could also write this as

S_X(ω) = (1/2π) ∫_{−∞}^{∞} dt ⟨X(0) X(t)⟩ e^{iωt}.

This is the Wiener-Khinchin theorem. Sometimes there is a wrong impression that the Wiener-Khinchin theorem simply says that the power spectrum is defined as the Fourier transform of the correlation function. Not true: there is a non-trivial theorem here. It requires proof, which again I emphasize we have not given — we have only done some of the manipulations — but it is possible to show that if the process is ergodic and stationary, then the power spectrum, defined as that sampling integral squared, is equal to the Fourier transform of the autocorrelation function.
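The theorem can be checked numerically end to end; this sketch (all parameters illustrative) simulates the Langevin velocity process, whose autocorrelation is (kT/m) e^{−γ|t|}, forms the sampling-integral estimate of the spectrum, and compares it with the Fourier transform of the autocorrelation in the convention above:

```python
import numpy as np

# Wiener-Khinchin check: periodogram of simulated Ornstein-Uhlenbeck
# velocities vs the analytic spectrum (gamma kT / (m pi)) / (gamma^2 + w^2).
rng = np.random.default_rng(2)
gamma, kT_over_m = 1.0, 1.0
dt, n, n_real = 0.01, 2 ** 14, 200     # time step, samples, realizations
T = n * dt

# Vectorized Euler-Maruyama over n_real realizations, started stationary.
v = np.zeros((n_real, n))
v[:, 0] = rng.normal(size=n_real) * np.sqrt(kT_over_m)
xi = rng.normal(size=(n_real, n)) * np.sqrt(2 * gamma * kT_over_m * dt)
for i in range(1, n):
    v[:, i] = v[:, i - 1] * (1 - gamma * dt) + xi[:, i]

# Sampling integral: (1 / 2 pi T) |integral_0^T e^{i w t} v(t) dt|^2,
# averaged over realizations in place of the strict T -> infinity limit.
V = np.fft.fft(v, axis=1) * dt
S_est = np.mean(np.abs(V) ** 2, axis=0) / (2 * np.pi * T)

w = 2 * np.pi * np.fft.fftfreq(n, d=dt)
S_theory = (gamma * kT_over_m / np.pi) / (gamma ** 2 + w ** 2)
band = (w > 0.5) & (w < 1.5)           # compare in a band around w ~ gamma
ratio = np.mean(S_est[band]) / np.mean(S_theory[band])   # close to 1
```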
Of course, we have assumed stationarity and ergodicity here, but it is an exceedingly useful theorem, in either of the two forms above; you can write it whichever way you like. Let us see what it tells us in some specific instances. Consider the Langevin particle we talked about, driven by Gaussian white noise. If you recall, our equation in the one-component case was m v̇ + mγv = ζ(t), where ζ(t) is a delta-correlated Gaussian white noise with the strength absorbed into it: a zero-mean process satisfying ⟨ζ(0) ζ(t)⟩ = Γ δ(t). So what is the power spectrum S_ζ(ω)? All we have to do is use the Wiener-Khinchin theorem, namely substitute this correlation in, and that is the end of the story. The delta function at t = 0 just brings out the Γ, nothing else, so S_ζ(ω) = Γ/2π; and remember Γ was 2mγk_B T, so the 2 cancels:

S_ζ(ω) = mγ k_B T / π.

Now we could ask: what is the power spectrum of the output variable, the velocity itself? That is

S_v(ω) = (1/2π) ∫_{−∞}^{∞} dt e^{iωt} ⟨v(0) v(t)⟩.

But we know what ⟨v(0) v(t)⟩ is in equilibrium; we computed it: (k_B T/m) e^{−γ|t|}.
So let us write that down. Using the symmetric form, this is

S_v(ω) = (k_B T/mπ) ∫_0^∞ dt e^{−γt} cos ωt;

once you have used the symmetry and written the integral from 0 to ∞, the e^{iωt} is replaced by cos ωt. The integral gives

S_v(ω) = (γ k_B T/mπ) · 1/(γ² + ω²).

That is not white noise. White noise is something whose power spectrum is constant: because it is delta-correlated, you immediately get a constant, independent of ω. This, instead, has a Lorentzian shape in ω; it drops off. Essentially, what the power spectrum measures is the intensity of the noise in a small window Δω about any given frequency ω: it tells you how much of the intensity is sitting there. White noise is unrealistic, because it says the same power is present at all frequencies from 0 to ∞, which is obviously unphysical; the moment you put in a finite correlation time, the spectrum drops off, like the Lorentzian here. Is there a connection between this spectrum and that of the noise? There should be, because there is a connection between the two variables. If you do this heuristically, closing your eyes, and take Fourier transforms on both sides of the Langevin equation, look at what happens. By the way, our Fourier transform convention is

f̃(ω) = (1/2π) ∫_{−∞}^{∞} dt e^{iωt} f(t), so that f(t) = ∫_{−∞}^{∞} dω e^{−iωt} f̃(ω).

We will stick to this convention, so that the factors are kept track of carefully. Now let us look at
this equation and take Fourier transforms on both sides. If I write v(t) in terms of ṽ(ω) and differentiate, I pull down a factor −iω; so the effect of the Fourier transform is to replace d/dt by −iω. Then m(−iω + γ) ṽ(ω) = ζ̃(ω), or

ṽ(ω) = [1/(m(−iω + γ))] ζ̃(ω).

The moment you have a relation of this kind — the Fourier transform of the output variable related to that of the input variable through a susceptibility, called the dynamic mobility in this particular case — then the power spectra are related by simply taking the modulus squared of this susceptibility. That is a relation which can be proved in some generality, so let me state it here without going through the details: it automatically implies

S_v(ω) = |1/(m(−iω + γ))|² S_ζ(ω).

All we have to do now is work backwards and check whether this is true. We already had S_v(ω); and the modulus squared here is 1/(m²(γ² + ω²)). If I take S_ζ(ω) = mγ k_B T/π and divide by m²(γ² + ω²), I get precisely the Lorentzian above, so it checks out. In this case we already knew the velocity correlation, but if I did not know it, I could now find it using this relation. Earlier we obtained it by a long procedure: what we actually did was solve the Langevin equation and compute the autocorrelation explicitly.
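The bookkeeping just described can be verified in a few lines; this sketch uses illustrative values of m, γ, and k_B T:

```python
import numpy as np

# Transfer-function check: S_v(w) = |chi(w)|^2 S_zeta(w) with
# chi(w) = 1 / (m(-i w + gamma)) reproduces the Lorentzian spectrum.
m, gamma, kT = 2.0, 3.0, 1.5
w = np.linspace(-20.0, 20.0, 2001)

S_zeta = (m * gamma * kT / np.pi) * np.ones_like(w)   # white noise: Gamma/(2 pi), Gamma = 2 m gamma kT
chi = 1.0 / (m * (-1j * w + gamma))                   # dynamic mobility
S_v_from_transfer = np.abs(chi) ** 2 * S_zeta

S_v_direct = gamma * kT / (m * np.pi * (gamma ** 2 + w ** 2))   # from phi_v = (kT/m) e^{-gamma|t|}
# The two expressions agree identically at every frequency.
```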
But you do not need to do that: all you need is the corresponding relation between spectra. So in more complicated instances, where you may not be able to write the explicit solution down so easily, you can still write down how the power spectra are related to each other. This is a useful trick: write down the power spectrum of the output variable given that of the input variable. There is a name for this quantity: in engineering parlance it is the transfer function, and that is exactly what it is — the modulus squared of what one would call the generalized susceptibility, in this case the mobility. What it tells you is that if you apply a unit force of frequency ω, the steady-state response will also be at frequency ω, and it will be multiplied by a complex number called the generalized susceptibility (here, the generalized mobility). What happens if you have more than one component? Now things get a little more tricky; we have to be careful. Let us look at a physical example; we actually went through one with more than one component. If you have several components of some vector process, I would define S_ij(ω). But first, incidentally, one small property which is easy to understand: we found, getting all the π factors right, that

S_X(ω) = (1/π) ∫_0^∞ dt ⟨X(0) X(t)⟩ cos ωt = S_X(−ω),

so the power spectrum is formally a symmetric (even) function of the frequency. That is a useful piece of information, one of the symmetries in the problem. Now let us look at the more complicated case, when you have more than one component.
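The cosine-transform form of the theorem, and the evenness S_X(ω) = S_X(−ω), can be checked by direct numerical integration; in this sketch the values of γ and k_B T/m are illustrative:

```python
import numpy as np

# Numerical check that (1/pi) Integral_0^inf (kT/m) e^{-gamma t} cos(w t) dt
# equals the Lorentzian gamma kT / (m pi (gamma^2 + w^2)), and is even in w.
gamma, kT_over_m = 2.0, 1.5
t = np.linspace(0.0, 40.0, 400_001)      # e^{-gamma t} is negligible well before t = 40
dt = t[1] - t[0]
phi = kT_over_m * np.exp(-gamma * t)     # velocity autocorrelation for t >= 0

def trapezoid(y, h):
    """Composite trapezoidal rule on a uniform grid."""
    return h * (y.sum() - 0.5 * (y[0] + y[-1]))

ws = np.array([-3.0, 0.0, 1.0, 3.0, 10.0])
S_num = np.array([trapezoid(phi * np.cos(w * t), dt) / np.pi for w in ws])
S_exact = gamma * kT_over_m / (np.pi * (gamma ** 2 + ws ** 2))
# S_num matches S_exact, and S_num at w = -3 equals S_num at w = +3.
```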
So define

S_ij(ω) = (1/2π) ∫_{−∞}^{∞} dt e^{iωt} ⟨X_i(0) X_j(t)⟩.

The power spectrum now becomes a matrix: if i and j run from 1 to n, it is an n × n matrix with these elements, and all the cross-correlations are sitting in it. We still assume the process is stationary, so every one of these averages is a function of the time difference alone. The question is: what is the corresponding symmetry property here? By stationarity, the following is true. Let Φ_ij(t) be defined by ⟨X_i(0) X_j(t)⟩. This must be equal to ⟨X_i(−t) X_j(0)⟩ by stationarity, because I can stagger both time arguments by −t; but that is ⟨X_j(0) X_i(−t)⟩, which is Φ_ji(−t). So Φ_ij(t) = Φ_ji(−t), and therefore the symmetric and antisymmetric parts of Φ_ij(t) are respectively even and odd in time: Φ_ij(t) + Φ_ji(t), the symmetric part of the tensor, is even in time, as you can see, and the antisymmetric part Φ_ij(t) − Φ_ji(t) is odd in time. That follows in a straightforward way. What we need in this place is the following. Look at

S_ji(ω) = (1/2π) ∫_{−∞}^{∞} dt e^{iωt} Φ_ji(t),

and recall that Φ_ji(t) = Φ_ij(−t). Take the complex conjugate on both sides: Φ is real, because X is a real-valued random variable, so conjugation only flips the sign in the exponent; now change t to −t in the integration, and this gives exactly S_ij(ω). So what is the higher-dimensional counterpart of the symmetry property of S? The power spectrum is an even function of the
frequency for a single random variable, a scalar random process; the moment you have a multi-component process, it says that S_ij(ω) = S_ji(ω)*. So what does that say about the matrix S whose elements are S_ij(ω)? It is a Hermitian matrix, and we can now write a generalization of the Wiener-Khinchin theorem for this case. We have done this already in one example, so let us take a look at that instance: the particle in a magnetic field. Remember that for a particle in a magnetic field B = B n̂, where n̂ is a unit vector, we computed the correlation function Φ_ij(t) = ⟨v_i(0) v_j(t)⟩ explicitly. It was

Φ_ij(t) = (k_B T/m) e^{−γ|t|} [ n_i n_j + (δ_ij − n_i n_j) cos ω_c t − ε_ijk n_k sin ω_c t ],

with ω_c the cyclotron frequency: there was a portion that depended on n_i n_j, a portion that depended on the Kronecker delta, and an antisymmetric portion. The first two pieces are symmetric under the interchange of i and j, and the last is the antisymmetric part. Now, what about the time-reversal properties of these quantities? We already saw what must happen: the symmetric part of this tensor must be an even function of time, and the antisymmetric part must be an odd function of time. But that is exactly what is happening: the symmetric part goes with cos ω_c t and e^{−γ|t|}, both even functions of t, while the antisymmetric part goes with sin ω_c t, an odd function. In the diffusion tensor, as I stated, this portion did not make it:
the odd portion did not appear at all. We did not actually derive that formula for the diffusion tensor, but I stated that this portion does not contribute to it: it is only the even combination Φ_ij(t) + Φ_ij(−t), integrated from 0 to ∞, that is proportional to the diffusion coefficient. But the odd part remains; it is sitting there, and it will contribute to the power spectrum and to the mobility. I have not talked about this, but maybe I will: it contributes to the so-called Hall mobility. So there is a contribution, not to the usual current but to the Hall current, and that portion makes a contribution to it. We have not looked at the linear-response aspect of the particle in a magnetic field in any detail, but this also has physical significance; it is not useless. So if you give me a general process, I can write down its power spectrum using the Wiener-Khinchin theorem, and it gives a great deal of physical information. In particular, the correlation function whose Fourier transform appears here is in fact the response function of linear response theory: when you apply an external stimulus and ask what the response of the system is, the response is proportional to this quantity. Linear response theory, in the context of statistical mechanics, is essentially first-order time-dependent perturbation theory together with the statistical mechanics, classical or quantum, that you need. The autocorrelation or cross-correlation function here is computed in the absence of the external perturbation, and it determines the response under the external perturbation to leading order in the external force.
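The symmetry properties of this correlation tensor, and the Hermiticity of the resulting spectral matrix, can be checked numerically; this sketch uses illustrative values of γ, ω_c, and k_B T/m, with the field along ẑ:

```python
import numpy as np

# Velocity correlation tensor for a charged Brownian particle in B = B n:
#   Phi_ij(t) = (kT/m) e^{-gamma|t|} [ n_i n_j
#               + (delta_ij - n_i n_j) cos(w_c t) - eps_ijk n_k sin(w_c t) ].
kT_over_m, gamma, wc = 1.0, 1.0, 2.5
n_hat = np.array([0.0, 0.0, 1.0])          # field direction

eps = np.zeros((3, 3, 3))                  # Levi-Civita symbol
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[j, i, k] = 1.0, -1.0

def phi(t):
    """Phi_ij(t) as a 3x3 matrix."""
    nn = np.outer(n_hat, n_hat)
    sym = nn + (np.eye(3) - nn) * np.cos(wc * t)
    anti = -np.einsum('ijk,k->ij', eps, n_hat) * np.sin(wc * t)
    return kT_over_m * np.exp(-gamma * abs(t)) * (sym + anti)

# Stationarity symmetry: Phi_ij(t) = Phi_ji(-t).
t0 = 0.7
sym_ok = np.allclose(phi(t0), phi(-t0).T)

# Spectral matrix S_ij(w) = (1/2pi) Integral e^{iwt} Phi_ij(t) dt on a grid;
# it should come out Hermitian, with a real Lorentzian on the zz element.
ts = np.linspace(-30.0, 30.0, 20001)
dts = ts[1] - ts[0]
w = 1.3
vals = np.array([phi(t) for t in ts])                        # shape (nt, 3, 3)
S = np.tensordot(np.exp(1j * w * ts), vals, axes=1) * dts / (2 * np.pi)
herm_err = np.abs(S - S.conj().T).max()                      # ~ 0 (Hermitian)
```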
So that is the gist of linear response theory, in some sense. It is something we have not specifically talked about; if time permits, we will come back and make a few comments about it. But one should know this, because what we are going to do next is use the concept of the power spectrum to look at the kinds of power spectra generated by different kinds of noise. We already have one statement: white noise corresponds to a flat power spectrum, while the response of the Langevin particle, for example, has a Lorentzian power spectrum, falling off at high frequencies like 1/ω². By the way, you are used to this in another language, so let me write it down and point out that it is exactly the same thing we have been talking about. If you look at a resistor at some finite temperature, there is Brownian motion of the electrons, and that leads to a fluctuating instantaneous voltage across the ends of the resistor, and hence a fluctuating current.
So one could ask what the power spectrum of this voltage or of this current looks like. This is called Johnson noise, and there is a relation called the Nyquist relation which tells you what the voltage power spectrum is: it is essentially proportional to the resistance and to the absolute temperature, which is why you lower the temperature to reduce this noise. It comes about very easily, because the resistor always has a self-inductance, so effectively you have an inductance L together with the resistance R, in which case

L dI/dt + R I(t) = V(t),

the voltage, applied or spontaneous, we do not care which. But this is the same as our earlier problem, m dv/dt + mγ v(t) = ζ(t), for which we found S_ζ(ω) = mγ k_B T/π. The correspondence in this electromechanical analogy is that L plays the role of m and R plays the role of mγ. So this immediately implies

S_V(ω) = R k_B T/π.

That is the form of Johnson noise in our convention. (No? Almost? What form are you familiar with? 4kTR — yes, 4Rk_B T, there is a factor of 4.) What happens is the following: our Fourier transform convention was f̃(ω) = (1/2π) ∫_{−∞}^{∞} dt e^{iωt} f(t), with the inverse f(t) = ∫_{−∞}^{∞} dω e^{−iωt} f̃(ω). But the
electrical engineers use a convention in which the 1/2π factor sits in the inverse transform and not in the forward one, so there is an extra factor of 2π; they also define the power spectrum as twice the Fourier transform (a one-sided spectrum). So there is a factor of 4π multiplying the whole thing, and for them the result reads S_V = 4Rk_B T, which is surely the form familiar from textbooks. The factor of 4π is really there; it has to do with the convention, plus the fact that the spectrum is defined as twice the Fourier transform. I chose the simplest convention, purely as a matter of convenience: it is the one most convenient in the usual formalism of linear response theory, where you have a one-sided Fourier transform with a plus sign in the exponent for the generalized mobility or susceptibility. With that convention, the generalized susceptibility has no singularities in the upper half of the complex ω plane, only in the lower half plane; it was to ensure that that I needed the plus sign, and the 1/2π was a matter of convention, chosen to correspond with what is used when you go from spatial Fourier transforms from x to k. Whatever you do, you have to stick to one convention. So this is the usual Johnson noise. There are also other kinds of noise, like shot noise and semiconductor noise of various kinds, which we will talk a little about subsequently. Let me stop here today.
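As a back-of-envelope illustration of the engineers' one-sided form S_V = 4k_B T R (the resistor value, temperature, and bandwidth below are illustrative choices):

```python
import numpy as np

# Johnson noise estimate in the engineers' convention: S_V = 4 k_B T R
# per Hz (one-sided), so V_rms = sqrt(4 k_B T R * bandwidth).
k_B = 1.380649e-23        # Boltzmann constant, J/K
R = 1.0e6                 # 1 Mohm resistor
T = 300.0                 # room temperature, K
bandwidth = 1.0e4         # 10 kHz measurement bandwidth

S_V = 4 * k_B * T * R     # V^2 per Hz
V_rms = np.sqrt(S_V * bandwidth)
# V_rms comes out around 13 microvolts for these numbers
```

This also makes the physical point of the Nyquist relation explicit: the noise grows with both R and T, which is why low-noise measurements are done cold.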