In this class, we will revise and wrap up what we have been doing in random vibrations. We will go through what we did, at least some of it, and then move to using these results of random vibrations to determine the tolerance of a human being to the vibrations that exist in a car. Maybe you have already done a course on ergonomics, so I am sure you know about human factors and what levels of vibration can be tolerated. We will not go into those details, as they are covered in another course, but we will understand random vibration a bit more and see what calculations we actually do in order to determine the levels that human beings can tolerate. So we will go through all of this in one go.

We were looking at what is called an observation. In the probability sense, an observation is an experiment, and the result of this observation is the outcome. In our case we are looking at the roughness of the road, as we had seen. The set of all possible outcomes is what we called the sample space, and a subset of the sample space is an event; we explained this with a simple example. A function, or map, from an event to the real line is what we call a random variable. So there is an event, and there is a mapping to the real line, which is the random variable; the probability measure can then be assigned through this random variable, or we can assign the probability measure directly to the event. These, in a nutshell, are the definitions that we need to know.

We already saw that a random process is a family of random variables. Remember that we did this with the roughness of the road as the random variable: we said that we can take a number of measurements, and this family of measurements is what we call the ensemble.
Remember that the random variables carry two indices; I have put one in brackets. In other words, r varies over 1, 2, 3, 4 and so on, and s is the distance. The functions ζ₁(s), ζ₂(s) and so on are the independent realizations, or sample functions, as they are called. So we have a number of such sample functions, and all these roughness records together form what we call the ensemble of roughness values.

Now there are two ways in which we are going to look at this. We can look at the ensemble at a particular value of s; of course, s can be replaced by time and the whole thing defined in terms of time. We can look at the values at s₁, then at another station s₂, and so on. So at s₁ we have a number of random variables, the measurements made at s₁, and we make similar measurements at s₂ and so on. These values ζ₁, ζ₂, ζ₃ and so on have a probability density function, the most common among them being the Gaussian, or normal, distribution, which all of you know. Remember the difference between the probability density function and the probability distribution function: the distribution function comes from integrating the density function, and the density function comes from differentiating the distribution function. So, for example, if I have to calculate the probability that ζ lies between a and b, what I need to do is just integrate the density p(ζ) between the limits a and b; that integral gives P(a ≤ ζ ≤ b). We are also interested in the joint probability of two variables, for example the values at s₁ and s₂.
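That integration step can be sketched numerically. The following is a minimal Python illustration (the helper names are mine, not from the lecture), assuming a standard normal density: the probability that ζ lies between a and b comes out the same whether we integrate the density directly or difference the distribution function built from the error function.

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Gaussian probability density function p(zeta)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Gaussian probability distribution function P(zeta <= x)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def prob_between(a, b, mu=0.0, sigma=1.0, n=10000):
    """P(a <= zeta <= b) by trapezoidal integration of the density."""
    h = (b - a) / n
    s = 0.5 * (normal_pdf(a, mu, sigma) + normal_pdf(b, mu, sigma))
    s += sum(normal_pdf(a + i * h, mu, sigma) for i in range(1, n))
    return s * h

# Integrating the density between a and b matches CDF(b) - CDF(a):
p_int = prob_between(-1.0, 1.0)
p_cdf = normal_cdf(1.0) - normal_cdf(-1.0)
```

For the standard normal, both routes give about 0.683 for the interval from −1 to +1, the familiar one-sigma probability.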
The joint probability of two variables is given by the joint Gaussian (normal) distribution. Look at it closely: it is the probability that x(k) lies between x and x + δx while, simultaneously, y lies in a corresponding small range. In other words, the joint density is the second derivative of the joint distribution function, p(x, y) = ∂²P/∂x∂y, where P is the probability distribution function. The joint density can be written as shown here, and ρ is what is called the correlation coefficient; we will see that in a minute, when we come to the example.

Next we talk about what are called expectations; the ensemble average is written in terms of the expectation operator. Note that, from a sheer definition point of view, there is a difference between time averaging and averaging across the whole ensemble: time averaging means I average along the record, while the expected value is the ensemble mean, the average across the ensemble. As we saw a couple of classes back, the mean is given by m = E[ζ] = ∫ ζ p(ζ) dζ.

The key equation here is the one giving the expected value of any function of the random variable: E[g] = ∫ g(α) p(α) dα. Now I can replace g(α) by any function I like. For example, to get the variance I substitute the squared deviation from the mean, g = (ζ(s₁) − m)², and calculate σ² = E[(ζ(s₁) − m)²]. The expected value of the square of that bracketed term is called the variance.
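The substitution rule E[g] = ∫ g(α) p(α) dα can also be sketched numerically. In this illustrative Python fragment (the `expect` helper is a made-up name), feeding g(α) = α into the rule recovers the mean, and g(α) = (α − m)² recovers the variance, exactly as described above.

```python
import math

def normal_pdf(x, mu, sigma):
    """Gaussian density with mean mu and standard deviation sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def expect(g, mu, sigma, lo=-10.0, hi=10.0, n=20000):
    """E[g(zeta)] = integral of g(alpha) p(alpha) d alpha (trapezoidal rule)."""
    h = (hi - lo) / n
    f = lambda a: g(a) * normal_pdf(a, mu, sigma)
    s = 0.5 * (f(lo) + f(hi)) + sum(f(lo + i * h) for i in range(1, n))
    return s * h

mu, sigma = 2.0, 0.5
mean = expect(lambda a: a, mu, sigma)                # g(alpha) = alpha       -> mean
var = expect(lambda a: (a - mean) ** 2, mu, sigma)   # g = (alpha - m)^2      -> variance
```

With the Gaussian parameters chosen here, the rule returns the mean 2.0 and the variance σ² = 0.25, as it must.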
You can operate on the integral as you know it: substitute, expand, and the right-hand side is what you get. So the first two important parameters are the mean and the variance. They belong to one station, calculated at s₁, or at s₂, and so on. When I say mean or expected value, the average is taken across the ensemble, along the vertical line. If instead I average along the horizontal line, that is, along the distance, then, loosely speaking, I am taking a time average. So the average can be taken along the vertical line or along the horizontal line; those are the two possibilities.

The mean and variance are defined with respect to one station, s₁. You can then look at how two values separated by a distance, or a time, say τ, are related. Two definitions characterize this connection, not between two random processes, but between the random variable at s₁ and the random variable at s₂; we have to be very careful in using these terms. Please note that this whole thing is one random process; the values at s₁ and s₂ are not different random processes but random variables belonging to the same random process. If, for example, I look at the acceleration levels at the seat as we go along these roads, then I can call that another random process, with another set of values; we will combine the two a bit later. Here we are looking at the same random process.

So the autocorrelation is the connection between the measurements made at s₁ and those made at s₂: it is the expected value E[ζ₁ζ₂].
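The vertical versus horizontal averaging can be pictured with a small synthetic ensemble, sketched below in Python under assumed toy data (nothing here comes from actual road measurements): each row is one sample function ζ_r(s), the "vertical" average runs across realizations at a fixed station, and the "horizontal" average runs along one record; for this stationary toy process both estimate the same mean.

```python
import random

random.seed(0)
true_mean = 1.5
n_records, n_stations = 400, 400

# Ensemble: each row is one sample function zeta_r(s), r = 1..n_records
ensemble = [[true_mean + random.gauss(0.0, 0.3) for _ in range(n_stations)]
            for _ in range(n_records)]

# Vertical average: across the ensemble at one fixed station s_1
ensemble_mean = sum(rec[0] for rec in ensemble) / n_records

# Horizontal average: along a single record (the "time" average)
time_mean = sum(ensemble[0]) / n_stations
```

Both estimates come out close to 1.5; that agreement between vertical and horizontal averages is exactly the property the ergodicity discussion below will formalize.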
You can see the definition there: we replace g by ζ₁ζ₂, so the weighting is now the joint probability density of ζ₁ and ζ₂, and the autocorrelation is the double integral of ζ₁ζ₂ p(ζ₁, ζ₂) dζ₁ dζ₂. The second quantity, note, is the autocovariance; there is a small mistake on the slide, it is not the cross correlation but the autocovariance between the two, obtained by subtracting the means. That is the second line. What is the difference between the two? It is the same as the difference between the mean and the variance: one tells you how the variable itself is connected, the other how the deviations from the mean are connected. In most of these processes we would make sure, by adjusting the measurement being taken, that the mean value is zero; in that case m becomes 0 and the autocorrelation and the autocovariance coincide. So the autocorrelation and the autocovariance are given by these two equations. We then define what is called the correlation coefficient; why are we doing this, and what is the relationship between the two? Once we finish a few more definitions, we will look at what this actually means.
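The link between the two definitions is easy to verify numerically. In the sketch below (synthetic data, assumed purely for illustration), the ensemble estimates satisfy the identity autocovariance = autocorrelation − m₁m₂, which is exactly why the two coincide once the means are adjusted to zero.

```python
import random

random.seed(1)
n = 2000
# Correlated pairs (zeta_1, zeta_2): values of one random process at s1 and s2
z1 = [random.gauss(2.0, 1.0) for _ in range(n)]
z2 = [0.6 * a + random.gauss(1.0, 0.5) for a in z1]  # zeta_2 depends on zeta_1

m1 = sum(z1) / n
m2 = sum(z2) / n
autocorr = sum(a * b for a, b in zip(z1, z2)) / n               # E[z1 z2]
autocov = sum((a - m1) * (b - m2) for a, b in zip(z1, z2)) / n  # E[(z1-m1)(z2-m2)]
```

The identity autocov = autocorr − m₁m₂ holds exactly for these sample averages, whatever the data, so it is a good sanity check on any estimator code.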
So I define one more quantity: I normalize the autocovariance by the variance and get what is called the correlation coefficient; we will come back to it in a minute. Let us now narrow down this process. Look at this: I am defining things at s₁ and s₂. If it so happens that the statistics we are defining do not depend on the particular values of s₁ and s₂ but only on, for example, the difference in distance, or in other words the time difference, then we call the process a stationary process.

Stationary processes can in fact be classified as weakly stationary or strongly stationary, though we will not go into much detail. It depends on whether all the statistical measures, in other words all the moments including the higher-order ones, are independent of where I put my vertical line and depend only on the separating distance or separating time: if all the moments behave this way, the process is strongly stationary; if only the first two moments depend solely on the separation τ, we call it a weakly stationary process. We will get to ergodicity shortly, and then this whole thing will become clear. Please note again how I am looking at it: if I take one vertical segment and determine all my expected values, then take another vertical segment, or two vertical segments separated by the same distance τ, all the statistical parameters I calculate come out the same. The absolute position does not come into the picture, and that is what we call a stationary process. As far as we are
concerned, we are interested only in the first two moments, and that gives us the first definition of a stationary process. A more useful process in practice, because of measurement limitations and other things, is what is called an ergodic process. It is a special case of a stationary process in which one sample represents the whole of the random process. Otherwise I would have to go to different roads and measure a whole ensemble; here I take one sample, and if that sample determines all the statistics, we call the process ergodic. More importantly, for an ergodic process, whatever regular time averaging I do along the record reduces to the ensemble expected values; the time average gives me the complete statistical parameters of the whole random process. One of the conditions, of course, is that the length of the sample has to be large; we will come to what the length should be in a minute.

So, to recap: we said that a process is stationary if the mean and the covariance function are independent of time, and we also defined what is called a strongly stationary process; and we said that a process is ergodic if the ensemble averages can be determined from the time averages of a single sample. Let us write down the equations corresponding to that. What am I doing? I am only defining the time averages; you can also write these as expected values, it does not matter, but written in terms of time the calculation becomes quite simple. The first is the definition of the mean: look at it, T tends to infinity, so I integrate over a very large length; it is very important that the lengths are large. That is the mean taken along the horizontal direction. The second is the variance; look at how it reduces, assuming the probability density function is Gaussian. The third is what we
call the autocorrelation, the fourth is the mean square value, and lastly we look at the autocovariance. So what has essentially happened is that the weighting p(ζ₁, ζ₂) dζ₁ dζ₂ that we had put in earlier has now been replaced by time averaging, and so everything becomes very simple. It is now possible to get estimates of all these quantities, but we will not cover what are called estimators in this course; estimators work on the discrete values that are sampled and sum over them. We will leave these as continuous expressions and maybe cover estimation in a later course. So these are a bunch of definitions which I am sure you understand; if there are any questions, we will take them.

Now, what do these things mean? What does the autocorrelation mean, what does the autocovariance mean, and so on? The correlation, as the term indicates, simply tells you how far the signal at s₁, the road roughness there, affects what happens at s₂; in other words, it is the memory of the signal. This applies not only to roughness but to any signal. If the signal is white noise, which means its spectrum, its power spectral density, is distributed over all frequencies, then it is a signal such that what happens at time t is unrelated even to what happens at t + δt; for white noise, the autocorrelation is just a spike standing at τ = 0. And if you go and substitute τ = 0 in the third expression, you would notice that this is
nothing but the fourth quantity: R(0) is simply the mean square value of the variable. So for white noise there is just a spike at τ = 0 and no connection elsewhere; of course, in general the shape varies depending on the frequency content and so on.

What actually is ergodicity? That is a very big question which cannot be covered in this course; there are what are called ergodic theorems and so on. We will just follow a very simple argument to understand what ergodicity is. We have a time average, or a space average; let us stick to the word time, because that is what is used in most textbooks. We are saying that the ensemble average, that is, the expected value, is equal to the time average. How does this come about? Look at it carefully: take the ensemble average, given by the expected value, substitute the expression for the time history, and switch the order of the expectation; after all, expectation is an integration, so you can shift it inside. Ultimately you would see that the ensemble average reduces to what is called the line average, or the time average. What is important to understand is that an ergodic process is also a stationary process. So this very simple manipulation, where I just took the expected value inside, shows that the ensemble average can be calculated in terms of the time average, or the average over distance. So these are the five quantities important to us; that is what we had defined.

Now let us define the Fourier transform; we already know what a Fourier transform is. Note that the 1/2π factor can be placed differently, or can be a square root, and so on, depending on the textbook;
it is only a scaling quantity. So let us look at the two expressions, the Fourier transform and the inverse Fourier transform. These are well known and come from your Fourier series; we will spend more time on all this in a later course.

We now define what is called the power spectral density. This is the sixth important quantity: we have had the mean, the variance, the autocorrelation, the autocovariance, and the correlation coefficient, which is nothing but the autocovariance normalized by the variance. The power spectral density is defined as the Fourier transform of the autocorrelation function. In other words, what we are doing is moving from the time domain to the frequency domain: the autocorrelation function lives in the time domain, and by taking its Fourier transform we move to the frequency domain. So these are the two expressions, the Fourier transform and its inverse: do an inverse Fourier transform of the power spectral density and you get back the autocorrelation function.

Substitute τ = 0 in that second expression and you get the meaning of the power spectral density, or spectral density, or spectrum, as it is loosely called. The expressions written there are very straightforward: the mean square value, which is the autocorrelation at τ = 0, is what is defined in the third line; we had seen this before. If I now substitute τ = 0 here, it becomes: the mean square value is nothing
but ∫ S(ω) dω. In other words, the mean square value, or the power, is now determined in terms of the power spectral density.

Let us understand this more carefully. Take that second expression for granted for a moment, and let us say that I have a system where I send in an input and get an output. Let the frequency response function of this system be H(jω) and its impulse response h(t). Consider this system to be a band-pass filter with a narrow band: a filter whose frequency response function is such that it allows only a small band of frequencies, centred around some ω₀, to pass through, filtering off the rest. This narrow band of frequencies that is passed is what I would call the pass band, if you are not familiar with such filters.

Let us now calculate the expected value of the square of the output, in other words the mean square value of y. The square of any such quantity is what electrical engineers are accustomed to calling power; it comes from I²R. So the mean square value, which we saw as the last equation on the previous slide, is the integral of the output spectral density. Since only the narrow band, say of width δω, is passed, the integral from minus infinity to plus infinity reduces to the integral over the pass band, because everywhere else the integrand is 0. So the power in the pass band is the power spectral density multiplied by δω, and, as the band becomes smaller, you can understand
that the power passed in a very small range is nothing but the power spectral density times the bandwidth. Sometimes the software used in mechanical engineering calls S_y(ω)·δω the auto power. The word density appears in power spectral density precisely because multiplying by δω gives you the power; that is why the density term is added. In many NVH software packages you will see the term auto power; what they simply mean is that the density is multiplied by a band δω, so that what sits inside the integral is what they call auto power. In other words, the spectral density, or spectrum, gives you the energy content of the signal, now viewed in the frequency domain; the energy content at the various frequencies is what the spectrum determines.

One of the key results, which we will not derive completely in this course and you may take for granted, is the relationship between what comes out and what goes in. It is given by the first equation, a very key equation: the power spectral density of the output of a linear time-invariant system, characterized by a frequency response function, is the square of the magnitude of that function multiplied by the input power spectral density, S_out(ω) = |H(jω)|² S_in(ω). In other words, take a vehicle: we looked at a quarter car model and had derived frequency response functions at various places. Now, if I want to find the power spectral density at, say, the sprung mass, the floor of the car, we already know the frequency response
function; we had written it down. Take that frequency response function, note that it can be complex, so we work with its magnitude; the square of the magnitude multiplied by the input power spectral density gives the output power spectral density. If I want it at the seat, I put in one more step: I write the frequency response function between the road and the seat, and then I can find the power spectral density at the seat location.

That brings us to some very important properties of these functions. The autocorrelation function, by its sheer definition, remember there was an x(t) and an x(t − τ), is an even function: R(−τ) = R(τ). That is the first property. The second equation again comes from the definition: go back to it, and in e^(−jωτ) substitute −ω for ω; you would notice that S(−ω) is the complex conjugate of S(ω), in other words the power spectral density is a real-valued function. That is what is important here: the output power spectral density and the input power spectral density are related through |H(jω)|², and both spectral densities are real-valued.

We can also look at cross correlation functions; these are the correlation functions between two different random processes, and it is very important to understand the distinction. What we have been looking at so far is one random process: though we speak of one random variable and another random variable, they belong to the same process.
That is why, when we calculate, we use the term auto: because both variables belong to the same random process. But we can also connect two different random processes; the cross correlation between them tells us how an input given here is correlated with another random process at some other time. So suppose I have two random processes; let me call the other one η. It can be anything, maybe what happens at the seat location, or something else. The cross correlation is the statistical connection between, for example, the road input at s₁ and what the other process does at s₁ plus τ. And since we are now looking at something beyond distances alone, though we define the road in terms of distance, it is proper that we come back to time rather than stick only to distances; please note that though I have used them interchangeably, it is always better to work in terms of time.

So we define what are called the cross correlation, the cross covariance, and the cross spectral density. Note the difference between the power spectral density S_ηη and the cross spectral density S_ζη: in the first we look at one signal with itself, while in the second there is a delay factor τ connecting the two different signals. Again from the definition we can see that the cross spectral density is a complex-valued function; obviously it must be, because we also have to account for the phase between the two different points, and this is the frequency domain. Also, the cross correlation function is neither odd nor even; again, go
back, put that into the definition, and you will see the expression. Now, we saw the correlation coefficient, a quantity in the time domain; the corresponding quantity of interest to us in the frequency domain is called the coherence. All of you know the correlation coefficient: it varies from −1 to +1, since we normalized the covariance by the variance; this is simple probability. If there is perfect correlation, then when you plot the two variables all the points fall on a straight line; and a negative correlation means that when one is positive the other is negative, that is, as one quantity increases the other factor decreases, and so on.
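A small Python sketch of the correlation coefficient (data invented for illustration): the covariance normalized by the product of the two standard deviations gives exactly +1 for points on a rising straight line and −1 for a falling one.

```python
import math

def corr_coeff(x, y):
    """rho = cov(x, y) / (sigma_x * sigma_y); always lies in [-1, +1]."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x) / n)
    sy = math.sqrt(sum((b - my) ** 2 for b in y) / n)
    return cov / (sx * sy)

x = [0.5, 1.0, 2.0, 3.5, 4.0]
rho_pos = corr_coeff(x, [2 * a + 1 for a in x])   # perfect straight line -> +1
rho_neg = corr_coeff(x, [-3 * a + 4 for a in x])  # negative slope        -> -1
```

Any scatter of the points off the straight line pulls the coefficient toward 0, which is the time-domain picture to keep in mind when we move to coherence in the frequency domain.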
So that is correlation as we defined it in the time domain. What is the corresponding notion in the frequency domain? You should by now be able to imagine time and frequency in a very similar fashion: the x axis was time, and now the x axis simply becomes frequency, the domain we reach after the Fourier transform. In fact, today there are techniques available for looking at, say, fatigue life in the frequency domain; you can work either in the time domain or in the frequency domain.

The frequency-domain counterpart is the coherence, which is used extensively to examine the connection between input and output in experiments. For example, in the vehicle dynamics lab you would look at the coherence all the time to check whether the output you measure is pure noise or is correlated with the input. If the coherence is near 1, say between 0.9 and 1, then you know there is good coherence: what you are measuring is due to what you gave as the input. If there is no coherence, then what you are measuring is noise and has nothing to do with the input. So coherence is an important quantity, especially in experimental work.

One of the quantities of interest is the mean square value of the acceleration. Why are we doing all this? Ultimately I measure, or calculate, the power spectral density at the seat. We now have all the quantities defined, and we know the connection between the power spectral density at the road and the power spectral density at the seat; everything is in place. So what is it that
I am going to get out of it? One of the simplest things you would notice is that acceleration levels, what you feel, for example, when you do drilling or any work where you are subjected to accelerations, are a very important vibration input to our body. In other words, our tolerance levels depend on the mean square value of the acceleration. Now, what is the mean square value of an acceleration? It is the same E[y²] as before, except that y is now the acceleration, and we know that this is nothing but the integral of the power spectral density; this is exactly what we did in the band-pass problem: the power spectral density integrated over a band ω₁ to ω₂.

In other words, we take a central frequency ω₀ and a band from ω₁ to ω₂ around it. Such a band is called a one-third octave band (there are also one-sixth octave bands and so on). For a one-third octave band, the upper frequency is ω₀ multiplied by the sixth root of 2 and the lower frequency is ω₀ divided by the sixth root of 2, so the ratio of upper to lower bound is the cube root of 2; numerically the band runs from about 0.89 ω₀ to 1.12 ω₀. The octave band is a very important concept, and you will see it again in courses on noise and vibration.

So the RMS acceleration is calculated from the power spectral density, and this RMS acceleration is related to our ability to withstand vibrations. I can calculate the RMS acceleration from that equation for various centre frequencies, so in other words I can have a plot of frequency versus RMS acceleration. The expression is general, so you can take the power spectral density anywhere; here we are talking about the power spectral density at the seat.
this expression here is now substituted in terms of the seat. Suppose this is the road input and the seat response is the output; call the output at the seat eta. If I calculate the power spectral density at the seat and call that S eta eta, remember that it is nothing but the h squared value, the squared magnitude of the frequency response function, times the road power spectral density, and that is what goes inside the integral. Let me explain the process: take one central frequency omega, put a one-third octave band around it, and integrate the power spectral density over that band. That integral gives one value for that mid frequency, and its square root is the RMS acceleration at that frequency. Doing this at various central frequencies, I get the RMS acceleration as a function of frequency. Note that what I am calculating, or measuring, as the output is the power spectral density of the acceleration. I could measure anything I want, but the simplest and best method is to put an accelerometer and measure with it; what I get out is then the power spectral density of the acceleration measured from the accelerometer, and that is what I am going to use. So this is called the RMS acceleration. Usually you can put an accelerometer at the seat, another at the backrest, and another at the floor, and so on.
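The process just described, from the transfer function to a one-third octave RMS value, can be sketched roughly as follows. This is a minimal illustration, not the lecture's actual model: the single-degree-of-freedom seat model, its parameter values, and the example road PSD are all assumptions of mine.

```python
import numpy as np

# illustrative seat/occupant parameters (assumed, not from the lecture)
m, k, c = 80.0, 5.0e4, 1.5e3

def H(omega):
    # displacement transmissibility of a base-excited spring-mass-damper:
    # seat displacement / road displacement
    return (k + 1j*c*omega) / (k - m*omega**2 + 1j*c*omega)

def seat_acc_psd(s_road, omega):
    # output PSD = |H|^2 * input PSD; the omega**4 factor turns the
    # displacement PSD into an acceleration PSD
    return omega**4 * np.abs(H(omega))**2 * s_road(omega)

def third_octave_rms(psd, omega_c, n=2000):
    # integrate the PSD over the band omega_c/2**(1/6) .. omega_c*2**(1/6)
    # (upper/lower ratio = cube root of 2) and take the square root
    w = np.linspace(omega_c / 2**(1/6), omega_c * 2**(1/6), n)
    s = psd(w)
    mean_square = np.sum(0.5 * (s[1:] + s[:-1]) * np.diff(w))
    return np.sqrt(mean_square)

# RMS seat acceleration in the one-third octave band centred at 10 rad/s,
# for an illustrative road displacement PSD
rms_10 = third_octave_rms(lambda w: seat_acc_psd(lambda x: 1e-4 / x**2, w), 10.0)
```

In a real measurement you would not compute seat_acc_psd from a model at all; you would use the PSD estimated from the accelerometer signal at the seat, exactly as described above.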
Now a lot of tests have been done from the point of view of human vibration tolerance, and it has been found that the tolerance level of a human being differs depending on whether we are subjected to, say, a vertical acceleration at the seat or a horizontal acceleration. I am sure this has been covered, along with the very famous standard ISO 2631, in your earlier course on ergonomics, because that standard is essentially what the whole of vibration ergonomics is built on. It is a huge standard and we do not have time to cover it here, but nevertheless, once I know how to calculate the RMS acceleration, there are what are called weighting factors, which are different depending on whether it is a vertical vibration or a horizontal vibration and so on, and with these weighting factors we determine the tolerable limit, whether we are going to be comfortable or not. We will talk about that in the next class. The last of the topics we need to cover, which we will do in the next ten minutes, is road roughness. You have already seen what power spectral density is and that it is a very important input to this whole problem; road roughness is actually that input. Remember that you can solve many of the equations either in the frequency domain or in the time domain, but if I have to give the road as an input in the form of a power spectral density, it is in the spatial domain: when you go and measure what is called a road profile, it is in terms of distance x, whereas the power spectral density that goes into our calculations, from which you can get the correlation and so on, is actually in terms of time. So we have to have a relationship between the spatial frequency and the temporal frequency. In other words, when the velocity of the
vehicle is V, obviously the frequency with which the vehicle gets excited will be a function of V. So if Omega is what we will call the spatial frequency, which means it is expressed in radians per meter, then for the problems that you do you require the same thing in radians per second. I have to convert radians per meter, which is a characteristic of the road, into a characteristic that goes as an input into the system, into the vehicle, and that conversion obviously comes from V, the velocity of the vehicle, which is expressed in meters per second. So omega in radians per second is given by omega = Omega times V: radians per meter times meters per second gives radians per second. This is the first thing we need to understand when we convert the spatial frequency, or the spatial power spectral density, into the power spectral density that goes as an input. I do not want to call this temporal and confuse you; just remember that when the power spectral density goes inside all our calculations, the frequency should be the temporal frequency, in radians per second. So this is a nice way of distinguishing between the road and the vehicle. That brings us to the topic of how you specify road roughness, in other words how you characterize the road. There are a number of ways in which you can write down an equation for the power spectral density of the road. Written in terms of Omega, which means in terms of radians per meter, the power spectral density of the road is given by a characteristic power spectral density phi 0 multiplied by (Omega 0 / Omega) to the power w, where w indicates the waviness.
Of course, the wavelength is 2 pi by Omega, as you know, and w actually varies from 1.75 to 2.25; the usual practice is to put w = 2. Omega 0 is a characteristic, standardized spatial frequency; we will see in the next equation how it is used. This was one of the first equations that were written; the equations were further modified later, and there is an ISO-specified power spectral density of the road. If you want the power spectral density in terms of the temporal frequency, in radians per second, we will do a small manipulation for that shortly. Now, the ISO came up with a small modification. If you plot frequency versus power spectral density in a log-log plot, which is the natural way to plot an equation of this type, there are actually two different slopes. So for Omega up to Omega 0 the equation is phi 0 times (Omega 0 / Omega) to the power w1, and for Omega greater than Omega 0 the equation is of a similar type, only with the power w2. In other words, there are two slopes, characterized by two equations, or two w's, w1 and w2, and the good roads and the bad roads are then characterized by the value of phi 0, the power spectral density at Omega 0. For example, for a very good road the standards are like this: phi 0 is less than 8, that is, the range 0 to 8 with a geometric mean of 4, and this is called an A class road. For a B class road, which is a good road, the range is 8 to 32. For C, which is an average road, the values vary from 32 to 128, and there is a poor road, D, for which the range is 128 to 512, and so on.
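The two-slope characterization and the spatial-to-temporal conversion above can be put together in a short sketch (the function names are mine; the 1/V factor in the conversion comes from keeping the variance, the area under the PSD, unchanged when the axis is rescaled from Omega to omega = Omega*V):

```python
import numpy as np

def road_psd(Omega, phi0, Omega0=1.0, w1=2.0, w2=1.5):
    """Two-slope road PSD in the spatial domain (Omega in rad/m).

    phi0 is the value at Omega0 and fixes the road class; the slope is
    w1 below Omega0 and the shallower w2 above it, giving two straight
    lines on a log-log plot.
    """
    Omega = np.asarray(Omega, dtype=float)
    w = np.where(Omega <= Omega0, w1, w2)
    return phi0 * (Omega0 / Omega)**w

def temporal_psd(phi_spatial, V):
    """Convert a spatial PSD to the temporal domain for speed V in m/s.

    omega = Omega*V, and keeping the area under the PSD (the variance)
    unchanged requires dividing by V.
    """
    return lambda omega: phi_spatial(omega / V) / V

# for a single-slope road (w1 = w2 = 2) this reproduces
# Phi(omega) = phi0 * Omega0**2 * V / omega**2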
The usual practice is to have w1 = 2 and w2 = 1.5. So for the lower bound for an average road, if you want to do an analysis, you replace phi 0 by 32; that gives you the lower bound. Of course, Omega 0 here is 2 pi. For the upper bound you replace phi 0 by 128 and put that in. So for an average road, to give you one figure, the lower bound is 32 times (Omega 0 / Omega) to the power 2 below Omega 0, and then the same expression with the power 1.5 once Omega crosses Omega 0. This is very standard, but people in recent times have issues with it. If you really look at the graph of phi versus Omega, the issue with this equation is that as Omega tends to 0, the power spectral density shoots up and goes to infinity; the variance reaches infinity at Omega = 0. So it gives a very unrealistic value as you move Omega closer to 0; in this part the error is very large, because of the fact that as Omega tends to 0 the power spectral density tends to infinity. One statement here is very important: essentially what you do is put down an equation. If you want to model your vehicle on our road, for example, you have to do a profile measurement and then fit a curve given by an equation which you write down. So it is very important that you identify a road with an equation of this form or, because of this difficulty, with an equation of the kind we are going to put down. There are a number of other issues; for example, this form effectively gives what is called
white noise, but real roads are colored noise, and there are a number of parameters involved. We are not going to cover all that in this course, but there will be a course on NVH, if any of you want to take it, in the January session, where the first part of the course will cover a lot more on random vibration as well as on signal processing. There we will cover a more detailed analysis of the road, as well as what white noise is, what colored noise is, and what a shape filter is, the filter which converts white noise into colored noise; all those things will be covered there. The only thing I am going to write down before I close this topic is that there have been changes in the way the power spectral density is written, because of this difficulty that the power spectral density goes to infinity as Omega tends to 0. This kind of unrealistic behavior is avoided by using other types of equations, which give you a much better, more realistic picture of the actual road measurements, especially as Omega becomes small. And of course, when you want to convert into omega, substituting Omega = omega / V and so on, you will notice that for w = 2, for example, the power spectral density becomes V times phi 0 times Omega 0 squared divided by omega squared; that is the equation. There are a lot more issues in random vibration, but this being the last class, for lack of time I do not want to go into further details.
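As one example of the kind of modified equation meant here, a road whose correlation decays exponentially with distance has a rational PSD that stays finite at Omega = 0. This particular form is a standard colored-noise (shape-filter) model that I am using purely as an illustration, with illustrative parameter values; it is not necessarily the specific equation referred to in the lecture.

```python
import numpy as np

def colored_road_psd(Omega, sigma2=1.0e-4, alpha=0.2):
    """Two-sided spatial PSD of a road whose correlation decays as
    R(d) = sigma2 * exp(-alpha * |d|)  (a simple colored-noise model).

    Unlike phi0 * (Omega0/Omega)**w, this stays finite as Omega -> 0
    (value sigma2 / (pi * alpha)), and its integral over all Omega is
    the finite variance sigma2; alpha [1/m] sets how quickly the
    roughness decorrelates with distance.
    """
    Omega = np.asarray(Omega, dtype=float)
    return sigma2 * alpha / (np.pi * (alpha**2 + Omega**2))
```

The point of the design is exactly the one made above: the variance, the area under this curve, is finite, so nothing blows up as Omega moves toward 0.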
People who are interested in slightly more information can again look at Gillespie and others; if you want more details on shape filters, white noise and colored noise, you can look at the paper by Shilan in the journal Sadhana. We will stop here; this course ends with this, and we will continue in the next course. Thank you.