So, let me redo some of this; I am going to respond in my notes here to give you a better answer. In the quick video we talked about a stochastic process as a family of random elements indexed by a set; here the index set is the set of integers, and typically, in the discrete-time context, the positive or non-negative integers. So a stochastic process is a collection of random variables: the collection could be finite or infinite, and each member is a different random variable. That is what we are going to understand. We will make some simplifications, and in fact I am going to introduce the simplest specific process first, so that we gradually understand the concepts. But in its most general form, we are considering a set of random variables each of which can have a different distribution: a different mean, a different variance, different higher-order properties. We are going to work with a very, very simplified form, not the most complex case. Nevertheless, as I said, this whole theory sits at a level where it is not so easy to understand, so I showed you some realizations of the process: one abstract plot, and then some realizations from a real system. The basic idea was that associated with each time point there is a probability density function; that is a model. We are modeling a system whose behavior is probably much more complex, and we are saying that the observations can be looked upon as a sequence of random variables, each with some pdf, say a Gaussian density function; the pdf is again a model that we have developed. I
showed you this earlier, last time: temperature measurements collected from a bath during an experiment I conducted. It is just a sump of cold water with some recirculation, because liquid is pumped out of it and part of what comes from the pump is diverted back to the sump; this recirculation causes some heating of the water, and you can see the temperature fluctuate by about 2 degrees. The data were collected over some hours, sampled at roughly 5-second intervals. I said that you come across this kind of data in many settings: stock-market and exchange-rate fluctuations, speech, and so on. Let me show you some examples. This one I have looked up, with the reference given here, from the NASA website: a line plot of the global mean land-ocean temperature index from 1880 to the present year; k here indexes the year, so 1880, 1890, and so on. It is a convention to connect these points; actually they are discrete points, one per year, so you can think of the connecting line as interpolation if you want, but the underlying data are separate points. Now notice that this is a deviation variable: it is not in terms of absolute temperature. A reference temperature is taken as the average over 1951 to 1980, that average is subtracted from the global mean temperature, and the deviation data is what is recorded. This data is of importance in global warming studies. The dotted black line is the annual mean of temperature, and you would like to know whether there is any relationship: to ask what inputs are relevant here, what is causing these changes. Developing a model
from first principles is very difficult, as you can appreciate. It is not that it is impossible; of course one can develop predictions from physics, but explaining this completely from physics is a tall order. The red line shown here is a moving average, a moving average over the last several years; you know what a moving average is, and we will come back to it. What is important here is the data, the temperature deviations. The other data set is speech data: this is a recording of a syllable, sampled at a fixed rate; what is shown is just the first thousand samples, and analysis of such signals is very important. This is a time series, and I can view the value at each instant as a random variable: to say what is going to happen at some later time, or at a particular instant, we need a model of these random variables. Where is this data of interest? When you do speech recognition, for example. So the first thing I want to stress is that what we are going to learn in these lectures is not limited to process monitoring or control: it applies to modeling any system where measurements are coming in, and where you may happen to know some inputs. In the weather case you are probably not manipulating the inputs, but there might be inputs, such as CO2 additions, that affect the temperature data. I am now going to introduce the simplest stochastic process that I can think of. It is called discrete-time white noise; why it is called white we will come to, hopefully in this lecture or, if not, in the next.
So what is it? Let us go over it. It is a collection of random variables {e(k)}, where k is typically time, discrete time; we are in a world where we get data at discrete time instants, and whether the index starts at k = 0 or k = 1 does not matter. The e(k) are independent: e(1) and e(4), e(1) and e(n), any pair in the collection are independent of each other. And they are identically distributed: all of them have the same distribution. That is the simplest random process: a collection of random variables, all with the same distribution and all independent of each other. Which distribution? If you want to go even simpler, let all of them have a Gaussian (normal) distribution. That is the simplest case, and in fact the one we will use most: among the many models, the normal, Gaussian white-noise process is the most idealized process, and the one that we need.
Now, strictly there is a second argument, coming from the probability space, which we have suppressed here: I should write e(k, ω), where ω is the random outcome. If you fix ω = ω0, you get a particular realization; if instead you fix yourself to a time k, then the second argument gives you the random variable at that time. So there are always two arguments, but we cannot keep writing both all the time, so by convention we suppress ω and write simply e(k). Now I am going to assume even more: I am going to assume that this is a zero-mean process. That means the mean of e(k) at any time is zero; and its variance is σ², the same for each of them, since all are identically distributed. You can see it is the simplest way of defining a stochastic process: a collection of random variables, all identically distributed, all zero-mean, each with variance σ², and all independent. This is called discrete-time white noise, and I promise to talk about what is so "white" about it; but one can appreciate that this is the simplest stochastic process there can be, nothing can be simpler. We often call such a collection i.i.d.: independent and identically distributed.
So this is the simplest process, and I am showing you here different realizations of it. How did I do it? In MATLAB I generated a sequence of normally distributed random numbers with variance equal to 1, three separate times, giving three different realizations of this stochastic process. Every time I generate such a sequence, the numbers are going to be different. If I plot them as a collection, then what happens at instant 1, at instant 2, and so on is perfectly random. Can you see it yourself? If I fix myself to, say, time 4, I get these three points: there is randomness across realizations at a fixed time, and each point is coming from a Gaussian (normal) distribution. So these are the realizations, and that is the way you will develop an understanding of the distinction between the process and its realizations, even though we sometimes use the two words loosely.
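The white-noise demonstration just described can be sketched as follows. This is a minimal illustration assuming NumPy, with an arbitrary seed and sample count, not the exact MATLAB session from the lecture:

```python
import numpy as np

# Three independent realizations of a zero-mean, unit-variance
# Gaussian white-noise process, N samples each.
rng = np.random.default_rng(0)   # seed chosen arbitrarily for repeatability
N = 10                           # samples per realization
realizations = rng.standard_normal((3, N))

# Fixing ourselves to one time instant, say k = 4, gives one draw per
# realization: the randomness across realizations at a fixed time.
at_k4 = realizations[:, 4]
```

Plotting the three rows against k reproduces the kind of picture shown in the lecture: three jagged, unrelated traces that agree only in their statistical description.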
So, the same stochastic process: Gaussian white noise, with standard deviation equal to 1 and mean equal to 0, and these are three different realizations of it. The next thing we are going to consider is something called the moving-average process. I am going to fabricate another stochastic process starting from white noise. My e(k) is Gaussian white noise; we have just talked about what white noise is. Now define a new random variable v(k). Can you see here that this v(k) is itself a random process? It is, because it is a summation of three random variables. Think of it this way: in MATLAB I can do the simulation and construct it. [In response to a question:] I am not fitting a distribution; I am saying that each of these random variables has an assumed distribution, and I got three realizations. I am not making any attempt to fit anything; I am saying that the true process is one where, if I observe three different instances of it, I will get these three realizations. You are thinking of the reverse, of fitting; please get that distinction. What I am doing for this process is the simplest, most idealized thing: all of the variables are alike. In many situations we construct idealized models to help us build more difficult ones. For example, take a model of a resistance: you have an idealized model for the resistance, and an idealized model for some other element of your
circuit, and then we say that the real thing is some combination of these two effects. In the same way, I now have a model called white noise, and it is going to be one of the useful building blocks for developing other signals, for instance the moving average, which here is a three-point average. The other kind of stochastic process that we will use is the autoregressive process; you can see here that x is an autoregressive process. What is the meaning of this? The meaning is that the current value of x is some function of the previous value of x plus a random variable. We have seen this kind of equation appear when modeling dynamic systems: the new value of x is some function of the old value of x together with a random disturbance. Why "auto-regressive"? Because the current value regresses on the past values of the same signal; the current depends upon the past. So these two, the moving-average and autoregressive processes, are again idealized stochastic processes, but look at their properties. What is the property of white noise? e(k) and e(k-1), e(k) and e(k-4), e(k) and e(k+4) are not related in any way. Can you say this for v(k) here, or for x(k)? No: v(k) is a stochastic process where v(k) and v(k+1) are likely to have some relationship. The same thing is true of x(k): x(k) and x(k+1), x(k) and x(k+5), x(k) and x(k-10) will have some relationship. And this dependence is the feature that we will exploit in building models. I am just showing you a realization of the moving average, using the same data: I took one realization of the white-noise process, e(k), shown in blue, and the red here is the moving-average process.
What I have done is taken three points and averaged them. How do I use this moving average? It can be used in many, many ways. For example, you are reading noisy temperature measurements; we use an average, and the simplest is a moving average: take some of the past values and average them. So this is how the new process, the moving-average process, looks. And this is an example of an autoregressive process: I took the same white-noise data and created an autoregressive signal, with an equation of the form x(k) = a1 x(k-1) + a2 x(k-2) + e(k); e(k) is this blue signal and x(k) is what is plotted here. So I am able to fabricate new stochastic processes using this basic unit, white noise, and some operations, and this basic idea I am going to use in modeling later: white noise, the moving-average process, and the autoregressive process are going to be building blocks when I develop stochastic models. That is worth a video by itself. Now, let us look at the stochastic process at n distinct time points. I have a process with k running over some index set, and I can take any finite subset of time points: n could be 2, 5, 10, anything finite. Then one can talk of joint probability: each of the n values is a random variable, so you can talk of their joint probability. So what is this probability? I can pick some numbers ζ1, ζ2, ..., ζn and define a probability distribution function. What is it a function of? It is a function of the n arguments ζ1 to ζn, and also of the n time points.
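The fabrication of new processes from white noise can be sketched as follows. The three-point average matches the lecture; the autoregressive coefficients below are illustrative assumptions, not the exact values used in the lecture plots:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000
e = rng.standard_normal(N)          # Gaussian white noise e[k]

# Moving-average process: v[k] = (e[k] + e[k-1] + e[k-2]) / 3
v = np.convolve(e, np.ones(3) / 3, mode="valid")

# Autoregressive process: x[k] = a1*x[k-1] + a2*x[k-2] + e[k]
a1, a2 = 0.5, -0.2                  # assumed stable coefficients
x = np.zeros(N)
for k in range(2, N):
    x[k] = a1 * x[k - 1] + a2 * x[k - 2] + e[k]
```

Plotting e against v shows the smoothing effect of the average, and x shows the serial dependence discussed above: successive values of v and x are related, while successive values of e are not.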
See, when I pose this question, I am not insisting that the time points be next to each other: they could be time 1, time 17, time 19, time 200, and I can still ask whether there is a joint distribution. When such a distribution exists for every finite subset, it is called a finite-dimensional distribution function. Most of the time, when people work with these concepts, they work with Gaussian variables, because multi-dimensional Gaussian distributions can be defined very easily. So even though this definition is very, very general, it is practically tractable if you confine yourself to the Gaussian (normal) distribution. The simplest process, as I said, is the Gaussian process, where all these joint distributions, over all possible combinations of time points, are jointly Gaussian. I am defining all these things for pedagogical reasons; each of these concepts may not be directly used when you start working, but they make the development of the subject consistent. Note that all of the variables being jointly Gaussian is a stronger requirement than all of them being individually Gaussian. What is nice about the Gaussian distribution? How many moments are required to describe it? Only two: mean and variance. So throughout the course, whether the process is scalar or multivariate, the nice thing about the Gaussian case is that one just needs to know the mean and the variance, and then we set out to find those moments. The first quantity of interest is the mean of the process; the two moments of interest are the mean and the variance. If you go to a book, you will get this definition: F here is the probability distribution function, and the definition, which looks complicated, integrates over the sample space, that is, over the collection of all possible outcomes.
μ here is the true mean of the stochastic process, and note that in general it can be time-varying. What is nice about the white-noise process? Its mean is 0 at all times, so it is not time-varying; that is why it feels very simple. The fully general definition I am writing here only for the sake of completeness; for what we use, simple arguments suffice. Let us take one example. This is a discrete-time process: y(k) = cos(2πk/25) + e(k). The first term is a deterministic signal, cos(2πk/25), where k is the time index; e(k) is zero-mean white noise. Where do you get such a signal? Say you have a sinusoidal current or voltage in an AC circuit and you are measuring it. The true current varies according to cos(2πk/25), but the measurement is corrupted, and we impose the model that the corruption is zero-mean white noise; then you get this kind of signal. So the question I can ask is: what is the expected value of y(k)? To take this expectation, I do not have to compute any integrals: the integrals were written above only for the sake of defining things, and here we can use simple arguments without substituting into any integrals. Expectation is linear, so the expected value of y(k) equals the expected value of cos(2πk/25) plus the expected value of e(k); if I wrote the two integrals, I could separate them in exactly this way. Now, what is the expected value of the deterministic term? The term itself. And the expected value of e(k) is 0, by assumption on the white-noise process. So the expected value of y(k) is cos(2πk/25). That is a time-varying mean.
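The claim E[y(k)] = cos(2πk/25) can be checked numerically by an ensemble average: simulate many realizations of y and average across realizations at each time point. A minimal sketch, assuming NumPy and an arbitrary ensemble size:

```python
import numpy as np

rng = np.random.default_rng(2)
R, N = 20000, 50                        # realizations, time points
k = np.arange(N)
signal = np.cos(2 * np.pi * k / 25)     # deterministic part, the true mean

# y[k] = cos(2*pi*k/25) + e[k], simulated R times
y = signal + rng.standard_normal((R, N))

# Averaging across realizations at each fixed k approximates E[y(k)]
ensemble_mean = y.mean(axis=0)
```

With R realizations, the ensemble mean at each k deviates from cos(2πk/25) by roughly 1/√R, so the time-varying mean becomes clearly visible as R grows.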
So y(k) is a stochastic signal which has a time-varying mean. You see this? That is the second outcome. Next, the moving-average process, the one we fabricated earlier as a simple stochastic process: let us decide the mean of this. The mean of v(k) is nothing but the mean of e(k) plus the mean of e(k-1) plus the mean of e(k-2), suitably scaled, and each of these is a zero-mean random variable, so the mean of v(k) is 0. One can show in the same manner that the mean of the autoregressive process is also zero; I ask you to just look at its derivation. And please remember, in these arguments I am never computing any integrals, so do not be worried about the definitions written earlier. What these results say is that for such processes the mean is invariant to translation in time.
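The zero-mean argument for the moving-average process follows from linearity of expectation; assuming the three-point average v(k) = (e(k) + e(k-1) + e(k-2))/3 used above, with e(k) zero-mean white noise:

```latex
E[v(k)] = \frac{1}{3}\Bigl(E[e(k)] + E[e(k-1)] + E[e(k-2)]\Bigr)
        = \frac{1}{3}(0 + 0 + 0) = 0 \quad \text{for all } k.
```

The same one-line argument applies at every k, which is exactly the translation-invariance of the mean stated above.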
And what appears in the last line here is just a simplification: if I have a stochastic process v(k) and I have samples of it, I can take the sample mean; that is the estimate of the mean. The estimate of the mean is very, very important: it is an estimate, it is not the true mean. When I write μv, that is the true mean; from data you can only construct an estimate of it. How do you construct it? Take the samples and average. And what about the autocovariance? If I give you the equation of a signal, you can compute the mean and autocovariance analytically; if instead I give you, say, a thousand values of some particular signal, say the speech signal, you can estimate them from data. The procedure is: first estimate the mean; the estimated mean is used to form the differences between v(k) and the mean; then the summation of products of these differences gives the covariance estimate. That is why the mean is subtracted first. In the same way, if I have two stochastic processes, I can
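The two estimates just described, the sample mean and the sample autocovariance at lag l, can be sketched as follows. This assumes the common biased (divide-by-N) form of the estimator; texts differ on the normalization:

```python
import numpy as np

def sample_mean(v):
    """Estimate of the true mean mu_v from one data record."""
    return np.mean(v)

def sample_autocov(v, l):
    """Estimate of the autocovariance at lag l:
    (1/N) * sum_{k=l}^{N-1} (v[k] - mean)*(v[k-l] - mean)."""
    v = np.asarray(v, dtype=float)
    N = len(v)
    d = v - v.mean()                # subtract the estimated mean first
    return np.sum(d[l:] * d[:N - l]) / N

# Example record: 1000 samples of unit-variance white noise
rng = np.random.default_rng(3)
v = rng.standard_normal(1000)
```

At lag 0 this estimate reduces to the sample variance, and for white noise the estimates at nonzero lags come out small but not exactly zero, which is the issue taken up next.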
estimate the cross-covariance between them. These are the well-known characteristics, and this is what the abstract picture of a density attached to each time instant translates into in practice. One remark: because I am assuming the signals are sampled at exactly the same time instants, we are in an idealized setting; in practice there are many more complications, and the signals may not be perfectly synchronized. We assume an idealized sampler, where it is possible to pick up samples at exactly the same time; whether you can really do that is another matter, strictly speaking it is not possible, but that is the working assumption. So when you go to a computer, you talk of an idealized sampler where you can pick up measurements at the same instants, and then, once you have the statistics, you can compute autocorrelations, cross-correlations and so on. Now there is a problem. I said that a white noise process has zero correlation at all non-zero lags. But if I do an experiment, generate a sequence of, say, a thousand random numbers, and try to find the autocorrelation using the formula I just constructed, the theory says it should give me zero correlation; when I actually compute it, I do not get zero. So what is "zero"? I have to arrive at a criterion for deciding that the autocorrelation is negligible. I want to be able to say: from this data I obtained an estimate of the autocorrelation; this estimate is not equal to zero, but it is close to zero. What is "close to zero"?
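This experiment is easy to reproduce. Below is a minimal sketch in Python/NumPy (my own code, not from the lecture; the signal name v and the 1/N-normalized estimator are my choices) of the sample autocorrelation computed exactly as described: estimate the mean first, then take the normalized summation of lagged products of the deviations.

```python
import numpy as np

def sample_acf(v, max_lag):
    """Sample autocorrelation: subtract the *estimated* mean, then take the
    normalized summation of lagged products of the deviations."""
    v = np.asarray(v, dtype=float)
    N = len(v)
    d = v - v.mean()                     # deviations from the sample mean
    c0 = np.dot(d, d) / N                # lag-0 autocovariance (sample variance)
    return np.array([np.dot(d[:N - l], d[l:]) / (N * c0)
                     for l in range(max_lag + 1)])

rng = np.random.default_rng(0)
v = rng.normal(size=1000)                # a thousand "white" random numbers
rho = sample_acf(v, 5)
# rho[0] is 1 by construction; rho[1:] come out small but NOT exactly zero
```

Even though the generator is white, the estimates at non-zero lags are small but non-zero, which is exactly the "what is close to zero?" question raised above.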
You cannot simply go to a system and declare that something is close to zero. If I do this experiment I will never get an autocorrelation of exactly zero; in fact, I will never recover the theoretical autocorrelation function from real data. Earlier we derived those functions analytically using expectation integrals and wrote them down; here you are working with data, and you must judge whether the autocorrelation is small enough to be neglected before you can accept a property such as whiteness. This is the big difference. Using the large-sample distribution of the autocorrelation estimate of a stationary process, one can actually derive certain theoretical properties of that estimate. One can show that the estimate of the autocorrelation function is itself random: I have started from data and constructed an estimate, and the estimate can change with the realization and with the sample size, so if I generate 20 different estimates, each will be slightly different. What you can show is that these estimates have zero mean, provided the underlying process is a white noise process, and that they are approximately normally distributed with standard deviation equal to 1 by root N, where N is the data length. So if we collected 1000 points, the estimates follow a distribution with standard deviation 1 by root 1000.
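This claim about the sampling distribution can be checked numerically. Here is a sketch (my own Monte Carlo setup; lag 5 and 2000 realizations are arbitrary choices) showing that the white-noise autocorrelation estimate has mean close to 0 and standard deviation close to 1 by root N:

```python
import numpy as np

def acf_at_lag(v, lag):
    """Sample autocorrelation rho_hat(lag) from one realization."""
    d = v - v.mean()
    return np.dot(d[:-lag], d[lag:]) / np.dot(d, d)

rng = np.random.default_rng(1)
N = 1000
# 2000 independent realizations of an N-point white noise
# give 2000 different estimates of rho(5)
estimates = np.array([acf_at_lag(rng.normal(size=N), 5) for _ in range(2000)])

print(estimates.mean())   # close to 0
print(estimates.std())    # close to 1/sqrt(N) = 0.0316...
```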
So what is the sample space? If you take rho, the autocorrelation estimate at a given lag, as a random variable, then the sample space is all possible estimates that you can derive. What do I mean? If I take a different realization of the process, I will get a different estimate. That is what you saw earlier: different realizations of the same process. If you use one realization of the white noise process, you get one estimate; if you take another realization, you get another estimate. What we are saying is that each one of these estimates is actually the outcome of a random variable: there is a true value, and these are random values scattered around it. This is very important. If the estimator is good, the estimates will not be very far apart; every estimate is a realization of a random variable. That an estimator is such a random variable is something we will come back to and take some time to establish; for now, just think about it. So what can I do? I can develop a confidence interval, and use the confidence interval to judge whether the estimated correlation is close to 0 or not; a confidence interval is always stated at some significance level. Let me show you the actual exercise. I took 200 samples from the random number generator in MATLAB; that is all we do, generate 200 random numbers with mean 0 and variance 1. The randn function is the one to use: you just say randn(200,1). Then I used the formula for the autocorrelation. I know these are uncorrelated random variables, so I should get an autocorrelation equal to 0; practically, when I compute it, I do not get 0. You see those bars I have drawn there.
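A sketch of this exercise in Python rather than MATLAB (same idea as randn(200,1); the 95% band of plus or minus 1.96 by root N is the standard large-sample approximation for a white-noise autocorrelation estimate):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200
v = rng.normal(size=N)          # 200 samples, mean 0, variance 1

d = v - v.mean()
# Autocorrelation estimates at lags 1..20
rho = np.array([np.dot(d[:N - l], d[l:]) / np.dot(d, d) for l in range(1, 21)])

bound = 1.96 / np.sqrt(N)       # 95% confidence band, about 0.139 for N = 200
inside = np.abs(rho) <= bound
print(f"{inside.sum()} of {len(rho)} lags inside +/-{bound:.3f}")
```

None of the estimates is exactly zero, but nearly all of them fall inside the band, so we do not reject whiteness.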
Each one of these bars is a random quantity: this one is the estimate of the autocorrelation at lag 5. If you take another 200 random numbers, you will not get the same estimate; you will get a different one. It could be negative here and positive there. So what we are saying is that these estimates themselves are random variables; each one of them has an approximately normal distribution with mean zero and variance equal to 1 by N. Yes, exactly: with infinitely many points I would get zero autocorrelation, but I cannot take infinitely many points; that is why I have to make a call on what is close to 0. You will never have infinite data, so I have to decide: what is close to 0, what is 0? Can I call this signal white noise? I can, because all these correlation values fall within the confidence interval, so they can be neglected. How did I construct this confidence interval? Go back to the normal distribution: you find two points such that the estimate falls between them with the chosen probability; for a 95 percent interval these are approximately minus and plus 1.96 by root N. I have decided to neglect all estimated values that come between these two points: if I get an estimate of the correlation between these two values, I am going to call it 0; if I get a value outside them, I am going to call it non-zero. Go back and brush up confidence intervals if you need to. This is what I get for a discrete-time white noise: all the estimates at non-zero lags come within the confidence interval, which is consistent with white noise, whose theoretical autocorrelation function is 1 at lag zero and 0 everywhere else. Now contrast this with a real example in which I have
to look at actual measurements. I kept water in a bath at a roughly constant temperature and recorded the temperature in my computer. I do not know the true temperature, whatever it is, because all I have is data. The measurements vary between 26 and 29. I estimate the mean to be about 27, and the variance of this data is 0.663. If I compute the errors between the data and the mean and put them in bins, I get a histogram, and the errors look like they are distributed roughly as a Gaussian: there are very few large errors, and many errors close to the mean value. That is what the data shows. Then I compute the autocorrelation to find out whether this is white noise or not. This is the autocorrelation function that matters, and these are the confidence interval bounds. What do they mean? I can neglect everything that falls inside them, so this may well be white noise. Now let us look at the global average temperature data. I assume that I can look at it as a stationary process; whether this is correct or not, I am imposing a model: it is a stationary process. Then I computed the autocorrelation function with just the formula I have given you, the summation; you can write a small program in MATLAB for it. What can you say here? All these values are well above the confidence interval, which means there is a strong autocorrelation even after 20 years. What is the lag? The lag is the gap: a gap of 5 years, 10 years, 20 years. What happened 20 years back is strongly correlated with what is happening now.
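For contrast with the white-noise case, a correlated series pushes its autocorrelation estimates far outside the band. This sketch uses a synthetic first-order autoregressive signal as a stand-in for a slowly varying record like the temperature data (the AR(1) model and the coefficient 0.9 are my illustration, not the lecture's model of the data):

```python
import numpy as np

rng = np.random.default_rng(4)
N = 500
e = rng.normal(size=N)
v = np.empty(N)
v[0] = e[0]
for k in range(1, N):
    v[k] = 0.9 * v[k - 1] + e[k]   # strong dependence on the past

d = v - v.mean()
rho = np.array([np.dot(d[:N - l], d[l:]) / np.dot(d, d) for l in range(1, 11)])
bound = 1.96 / np.sqrt(N)

# Count how many of the first 10 lags exceed the whiteness band
print(np.sum(np.abs(rho) > bound))
```

Here, unlike the white-noise exercise, most (typically all) of the lags fall well outside the confidence interval, so the hypothesis of whiteness is rejected.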
From what has happened in the past we are going to be able to anticipate what happens next; that is what it says. This is just the autocorrelation function computed from the one realization of the data that we have, and already we are able to uncover something happening inside. What is causing this we do not know, but we know that what happened in the past tells us about what is happening now. For the same data I have also shown the autocorrelation over the last 30 years; there are models for this, which you will see later, but what you can see is that there is correlation in the temperature over the last 30 years as well. So this simple statistical measure tells you whether what is happening now is related to what happened in the past, purely from the data, just using the autocorrelation. You also know that there are periodic repetitions: the periodicity is visible in the real data, both short-period behavior and long-period behavior, and it is evident in the autocorrelation as well. Now, when you want to ask how one signal relates to another, you should go ahead and compute the cross-correlation. I will go over this again in the next lecture, but here are two signals, the Southern Oscillation Index and a fish population. These are the autocorrelations and these are the cross-correlations, and from the cross-correlation you can say, for example, that if the index goes down, then after about 6 months the fish population will go down. That is the kind of connection you will be able to make.
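Finally, a sketch of what the cross-correlation computation looks like. The data here are synthetic (I am not using the actual index and population series): y is constructed to respond to x with a 6-step delay, and the sample cross-correlation recovers that delay.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 500
x = rng.normal(size=N)                   # stand-in for a driving signal (an index)
y = np.empty(N)
y[:6] = rng.normal(size=6)
y[6:] = 0.8 * x[:-6] + 0.3 * rng.normal(size=N - 6)  # y reacts 6 steps later

def cross_corr(x, y, lag):
    """Sample cross-correlation between x[k] and y[k + lag]."""
    dx, dy = x - x.mean(), y - y.mean()
    return np.dot(dx[:len(x) - lag], dy[lag:]) / (len(x) * dx.std() * dy.std())

ccf = np.array([cross_corr(x, y, l) for l in range(0, 13)])
peak = int(np.argmax(np.abs(ccf)))
print(peak)   # lag of the strongest cross-correlation
```

The peak of the cross-correlation function sits at the delay built into the data, which is exactly the kind of lead-lag relationship described above.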