Welcome back. Towards the end of the last class we completed our discussion of mechanical measurement techniques that are applicable to measuring external spray characteristics. We will spend some time today talking about non-intrusive measurement techniques for measuring external spray characteristics, both macroscopic and microscopic. So we will learn a little bit about non-intrusive measurement techniques. We will take the simple example of videography first, which includes photography, and see what it tells us and what it does not tell us. We are going to start by playing a very short video. As you can see, this is the classic perfume spray; I am going to pause the video here. The classic perfume spray tells us a little bit about the nature of the spray coming out of a standard perfume bottle. You can see that far downstream you can actually see individual drops, but close to the spray nozzle itself you really do not see individual drops; close to the nozzle, however, you can discern the idea of a spray angle. So macroscopic measurements such as spray angle, and maybe even spray pattern, can be understood from a simple videograph like this, or even from an individual photograph. I will play this for a short while longer. Not only that: as you can see, this is a case of a steady spray. In other words, I had the perfume plunger depressed, and that caused a steady squirt of perfume coming out of the nozzle, at least seemingly steady to my naked eye as I was spraying. But this video is a high-speed video taken at about 3600 frames per second.
So at that time resolution you can see that there are structures. If you look at this little white patch I am going to show, you can see that the white patch diffuses further downstream and another white patch forms; so it is not a continuous spray, there is some semblance of unsteadiness. Another example is right here: you can see a little inverted-V-like structure with the tip of the V pointing away from the nozzle. These are the classical, so-called inverted chevron structures that you will see in most sprays, and we will come back to the origin of these structures later on. The point to note here is that from a simple video we are able to make sense of a lot of these parameters. Now, what other information can I get? Let us talk about spray pattern for a moment. This spray is presumably axisymmetric, so what I am seeing is just the periphery of the spray: in a video or a photograph of an axisymmetric spray, I am only going to see the outside, not really the inside part of the spray. Suppose I want to see the inside part of the spray, that is, to take a cross-sectional view of the spray, which is what an inline patternator does. If this is my nozzle back here and this is the spray, and I want to take a cross-sectional view and look at uniformity just like a sector patternator would, I still need to figure out a way to make a cross section. That is why this kind of illumination, which is essentially a set of lamps surrounding the spray that were used to illuminate this particular spray, is not particularly useful if you want to do that. So let us make some notes on what works and what does not work. One could make spray angle measurements, one could look at unsteadiness in the spray, and one could also look at pattern, if there was a way to take a cross-sectional view. So how do I take a cross-sectional view? By using so-called plane illumination. I can take, let us say, a torch light that puts out a cylindrical beam of light of some diameter D and, by a set of lenses, create a light sheet of some height H; the thickness would still be equal to the diameter of the beam, but if I start with a beam of very small diameter, I can create what is essentially a sheet of light of rectangular cross section. Now, if I use this kind of illumination in a spray, I am only going to see the drops that are resident inside the sheet. So if my camera in this particular view is normal to the sheet, where our eyes are located currently, then we would only see the drops as they propagate through this sheet of light. This plane illumination, or light sheet illumination, is a very nice technique to look at what is happening in a cross section. I could also have the spray nozzle be where my eye is, in which case I essentially create a cross-sectional view: either a meridional cross section or an axial cross section, both are possible. The only problem with this kind of approach is that the brightness of the light sheet is quite limited; you are unable to pack enough energy into the sheet, so its brightness becomes a limiting factor. Therefore most of these techniques, while feasible, were not really practical until the advent of the laser. The laser provided a very nice way to pack a lot of energy into a beam. So fundamentally, what is different between a laser that puts out a cylindrical beam, let us say a 1 mm diameter beam, and a torch light that also presumably puts out a 1 mm diameter beam? Let us quickly note the differences with respect to normal collimated light; collimated is where
the light rays are all parallel. So the first difference is that lasers are typically monochromatic; that is, all the photons that make up the light are of the same wavelength, unlike, say, normal white light illumination. Now, you could also get monochromatic torch lights, so to say, with a very narrow wavelength spread, so you can essentially have a monochromatic conventional light source. But the biggest difference between a laser and a normal light source is the idea of coherence: the fact that all the photons being produced by the laser are in phase. The simplest way to imagine this is to look at light as an electromagnetic wave: if I have a light beam and I know the phase at one end of the beam, I know the phase at every other point, because they are perfectly correlated. This is not true of normal monochromatic light. So if this is a laser, the electromagnetic wave that makes up this light is entirely coherent in phase, because of which all of the photons that are produced only add to your light intensity; there is no destructive interference resulting in loss of light intensity. Therefore the laser is a very nice source of illumination for photography and videography applied to sprays. So if I create a light sheet just like I showed before, with a laser instead of a torch light, I can create a beam of diameter D. Now, the divergence of this beam is usually very small, but it is not zero; most laser beams do have a small but positive divergence, fractions of a milliradian usually. But the advantage is that it is a coherent beam, and you can do a lot with coherent beams that you cannot do with incoherent light sources. Another feature of lasers is that the beam will typically have a Gaussian distribution of intensity: the beam is brighter in the middle than at the edges. So if I take a laser that has a Gaussian distribution of intensity and pass it through a pair of concave and convex lenses, just like shown there, that intensity distribution is still retained; the laser light, when spread into a sheet, is still brighter in the middle than at the edges. So now let us go back to our video and look at how this can be of use to us. If I start from some point over there, at the instant where I have paused the video, you can see that there are these white specks. Those are presumably large drops that at this magnification look like white specks. Now, if I take another frame a short instant later, I can look at those same white specks, which have moved a distance downstream. So by knowing the distance that these specks have traveled and the time spacing between the two frames, I can extract a velocity vector. However, the problem with doing this in a video like this is that I do not know if the drops were all moving in a plane; they may have moved towards me or away from me. So when I do it with just diffuse illumination, like I have used to make this video, I do not know the direction in which they have moved. But instead, if I had a plane light sheet illuminating the spray, then if a speck was visible in two successive frames, that means it stayed inside that plane; that becomes a precondition to extracting the velocity vectors. Without it, you do not know which way the drops moved: you are getting a component of velocity, but you do not know the plane in which they moved, especially because with the lens that I used to make this video, the depth of focus was fairly large. I do need that; otherwise most of the spray would look blurred, only a very small part of the edge of the spray would look like it is in focus, and that is not a very useful video to have. So if I want to
illustrate the macroscopic measurement variables, I do need a fairly large depth of focus on my video, but if I do that, I do not know which way the drops have moved. The solution to this is to use a plane light sheet illumination plus, let us say, a high-speed camera (I put high-speed in parentheses because high speed is not strictly necessary); what is necessary is a double-pulse camera, that is, a way to take two successive snapshots separated in time such that I know the time spacing between the two snapshots. So if I did that: you can see these white specks some distance from the nozzle; a short instant later, all the white specks have moved, and presumably, if this was plane illumination, then all of the distances moved by the specks are in the same plane as the illumination sheet itself, and from these two frames and the distance traveled we are able to extract velocity. This is called particle tracking velocimetry. It is the simplest of drop velocity measurement techniques, where you follow individual particles and record the displacement vector over a fixed interval of time. So let us say I have a spray on which I have used plane illumination: this is my light sheet, and my camera is where my eye is currently located. In one frame, let us say a drop was situated over here; in the next frame, if this is the position of the drop, then what I know is the displacement vector, and the velocity is given by the displacement vector divided by delta t, the time spacing between the two successive frames. The only problem with this is the following. When I take a video or an image, for example the one we just looked at, if you did not know what a normal perfume bottle looks like, do you have a way of extracting how big this spray is? Let us take a simple example. I take this frame and I can see the white specks close to the nozzle; a short time later I know the new position of the specks. What I need is distance in millimetres or metres, some length unit, and a picture like this does not really convey length units. All I know is displacement in pixels, because the camera works in pixels (and in 3D, in voxels). So with pixels, all I know is the displacement in the x direction in pixel units and the displacement in the y direction in pixel units. Therefore I have to, a priori, before I do this experiment, have a way of figuring out how many pixels correspond to a length unit. That is called magnification calibration: I need to calibrate this system for its magnification. So delta s in pixels times the magnification m in millimetres per pixel gives the displacement in millimetres; in fact m is typically given in units of pixels per millimetre, but we can obviously invert this and get a length per pixel. This is required before one can extract quantitative, length-based velocity information; otherwise all I know is some pixels per second. Now, how would we do this? We typically use a feature of known length, like, in this case, a plunger diameter or a height feature on a bottle like this that is known beforehand, and by counting the number of pixels that make up that feature we are able to extract the pixels per millimetre. Now, a lot depends on the kind of lens that is used in this kind of photography and videography. There is a class of lenses where the magnification is relatively uniform across the image, so if I do this pixels-per-millimetre calibration in one of the corners of the image versus the middle of the image, I will get values that are relatively the same. But there are other lenses, typically as I go to smaller focal lengths, where the magnification varies across the image. So you are always better off with a very narrow angle-of-view lens, where the magnification across the frame is, relatively speaking, constant. If you are otherwise constrained to use a lens that has a very short focal length, then you have to make a pixels-per-millimetre measurement at different points in the image and use the actual pixels-per-millimetre value at each location to extract the velocity measurement. In any case, I do need this magnification in pixels per mm to convert displacement in pixels to velocity in length units, let us say mm per second. This is the simplest of microscopic parameter measurements: if I want to measure droplet velocity, all I have to do is create a sheet of light and look at the displacement of the speck in the plane of the light sheet, and that gives me the velocity vector in the plane of the light sheet. If the drop had a velocity component perpendicular to the plane, I would not be able to measure that velocity component from this kind of measurement. Now, if the drop did have a velocity component perpendicular to the light sheet, another thing could happen: I could see the drop in the first frame and not see it in the second frame, because it simply left the sheet, and then I would not be able to measure its displacement. So to measure the displacement, the drop has to be present in both of the individual frames, which are some time apart, for me to be able to extract velocity vector information.
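To make the arithmetic concrete, here is a minimal Python sketch of the two steps just described: a magnification calibration from a feature of known size, and the conversion of a tracked speck's pixel displacement into a velocity vector. The function names, the 10 mm feature, and the 1/3600 s frame spacing are illustrative assumptions, not part of any particular measurement package.

```python
def magnification_mm_per_px(feature_length_mm, feature_length_px):
    """Magnification calibration: a feature of known physical size
    (say, the plunger diameter) spans a measured number of pixels."""
    return feature_length_mm / feature_length_px

def ptv_velocity(pos1_px, pos2_px, dt_s, mag_mm_per_px):
    """Velocity vector in mm/s of one tracked speck from its (x, y)
    pixel positions in two frames separated by dt_s seconds."""
    vx = (pos2_px[0] - pos1_px[0]) * mag_mm_per_px / dt_s
    vy = (pos2_px[1] - pos1_px[1]) * mag_mm_per_px / dt_s
    return vx, vy

# a 10 mm feature spans 200 px, so m = 0.05 mm/px
m = magnification_mm_per_px(10.0, 200.0)
# a speck moves from (120, 80) px to (138, 86) px between frames 1/3600 s apart
v = ptv_velocity((120, 80), (138, 86), 1 / 3600, m)
print(v)  # in-plane velocity components in mm/s
```

Note that this gives only the two in-plane components; as discussed above, any motion perpendicular to the light sheet is invisible to this measurement.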
So this is the simplest of droplet velocity measurement techniques; we will extend it now. The only problem is this: if you look at what is happening very close to the nozzle, I just have a grayscale image. In this particular region here I can see these specks of white light, or down here I can see specks of white light, but for most of the spray, especially close to the nozzle, I do not see individual drops; I see a smeared image in which I can only make sense of a group of drops. For example, if you take a region down here, I can see that there are drops, but I am not able to clearly follow one bright speck and measure its displacement. In other words, if I focus my attention on a small rectangle where I am pointing right now, and look at one image which is frozen and another a short time later, I cannot follow individual drops: they all look alike. This is going to be the case in most of the spray, except at the very periphery, where I see flying drops, or in the middle of the spray, where I occasionally see bright specks coming out; those are large drops that I can mark. Apart from those cases, most of the spray is going to look like a grayscale image with little dots, and I need to get velocity information from that. In order to do that, we have to move to a technique called particle image velocimetry (PIV).
The hardware for particle image velocimetry is still the same: I have a spray, I have a light sheet illuminating this spray, and I have one camera where my eye is located right now, and from that we get an image. So let us say we are going to image this part; I am going to draw it out in a bigger rectangle on the side, and this is my field of view. Typically, on a camera that is used for these kinds of applications, we want the depth of field to be very small; in fact there is no need to see drops that are outside the light sheet. All I want is for the depth of focus of this imaging system to be approximately equal to the thickness of the light sheet; I do not need much more depth of focus from the lens and camera system. Now, in this field of view I have all these specks of light, presumably coming from the drops. So let us say the drops are in the pattern shown, and some time later the drops are all like this; you can look at the arrangement of the red specks as I have shown. Let us say red is a later instant of time, some t plus delta t, and the blue dots are the current instant, which we will call t. So the blue dots were obtained at a time t and the red dots were obtained at a time t plus delta t. If I look at the relative arrangement of these dots (I am going to draw this kind of like a star constellation just to illustrate the point) and do the same thing with the other set, you see that there is a discernible arrangement to these dots. So I take a cluster of drops rather than following an individual drop, because the drops all look alike: when I take two snapshots, I do not know which dot in the second snapshot corresponds to which dot in the first snapshot, but I can look at the relative arrangement of the dots. For example, down here, if I take these three dots, there is a cluster arrangement like that; in the second image the same three dots are arranged like this. So there were three drops in the first image arranged at a certain angle, and in the second image I can recognize this feature, even though I was not able to relate one individual drop to another; I can now look at the displacement of this feature, not necessarily the displacement of a drop. This is essentially what particle image velocimetry is. If I look at this feature here, I can draw a velocity vector in that direction; if I look at this feature here, I can draw a velocity vector in this direction. That is the basic technique that is utilized. Now, how is it applied practically? Practically, what is done is to first segment the image into smaller regions, and we will take one segment of this field of view and do the same blue and red. Let us say that in this segment the drops were towards the left edge, the way I have drawn them; this is the image at time t, and this is the image at time t plus some small time delta t. So the drops were all in one part of the segment, towards the left edge, and all of them have essentially moved to the right edge, and I want to determine the displacement. The way this is done is that I take the segment at time t plus delta t and move it over the image at time t, and look at the amount of displacement, both delta x and delta y, that I have to apply. I will use my five fingers to illustrate the idea of the relative positions being the same: if this is my first image and this is my second image, I know I have to move it up about that much and move it to the left about that much to get good coincidence. Now, it is possible that some of the relative orientations have changed, so the drops, while they were arranged like that in the first image, may have shifted slightly in the second image, but when I do this, on average I am looking for a displacement in the y direction and x direction that gives me a good fit of the second image on top of the first. This process is mathematically a correlation (strictly, a cross-correlation between the two frames; it becomes an autocorrelation when both exposures are recorded on the same frame, as we will see shortly). So for a shift delta x, delta y: if inside this segment I have an intensity distribution I(x, y), which you can imagine as just a grayscale distribution, I compute the sum over the segment of I(x + delta x, y + delta y) times I(x, y); this is the correlation function of I over the shift delta x, delta y. As I bring the two images closer and closer to coincidence, the fit parameter increases, and as I go past the point where they match, it decreases again. So the correlation coefficient initially increases and then decreases, and we choose the values of delta x and delta y that give the maximum value of the correlation parameter. Even though the odd drop may have moved, so that the general arrangement of drops has been displaced in a slightly stretched fashion, we will still be able to pick out the point where the correlation coefficient takes on a maximum. That maximum gives us the displacement in the x direction and the displacement in the y direction that produce the best fit of the first image over the second, or the second image over the first.
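The correlation search just described can be sketched in a few lines of Python. This is a brute-force scan over shifts (real PIV software computes the same correlation via FFTs and adds sub-pixel peak fitting); the 32 x 32 segment size, the drop positions, and the shift range are illustrative assumptions.

```python
def correlation(win1, win2, dx, dy):
    """The correlation sum described above: the sum over the segment of
    I1(x, y) * I2(x + dx, y + dy). Indices wrap at the segment edges
    for simplicity; practical codes zero-pad instead."""
    n = len(win1)
    return sum(win1[y][x] * win2[(y + dy) % n][(x + dx) % n]
               for y in range(n) for x in range(n))

def correlation_peak(win1, win2, max_shift):
    """Scan all shifts within +/- max_shift pixels and return the
    (dx, dy) at which the correlation is maximum."""
    best_score, best_shift = float("-inf"), (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            s = correlation(win1, win2, dx, dy)
            if s > best_score:
                best_score, best_shift = s, (dx, dy)
    return best_shift

# four bright drops as a small constellation in a 32 x 32 segment
N = 32
frame1 = [[0.0] * N for _ in range(N)]
for r, c in [(5, 5), (10, 17), (20, 8), (25, 25)]:
    frame1[r][c] = 1.0
# second frame: the whole constellation moved 3 px in x and 2 px in y
frame2 = [[frame1[(r - 2) % N][(c - 3) % N] for c in range(N)] for r in range(N)]
print(correlation_peak(frame1, frame2, 5))  # -> (3, 2)
```

Multiplying the recovered pixel shift by the magnification and dividing by delta t then gives the velocity vector for that segment, exactly as in the particle tracking case.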
And so from here, once I know delta x and delta y, and knowing the magnification (this delta x and delta y will still be only in pixel terms), we can extract a velocity component in both the x and y directions. That is the basic principle of operation of PIV. In particle image velocimetry there are essentially two exposures that we are trying to correlate with each other, and the displacement comes from the peak in this correlation function. We will call the correlation function alpha; alpha in this case is a function of delta x and delta y, and, the way I have defined it, we are going to choose the peak in alpha over both delta x and delta y. That is essentially the principle of operation. Practically, the way a PIV system is usually implemented, we do not take two images with the camera flashing twice or the camera acquiring two separate frames; it is usually done by using a pulsed light sheet. You can imagine that the camera is on all the time, but we have a light sheet illuminating a cross section of the spray, and that light sheet is turned on and off, and on again and off again. So the light sheet intensity, if you will, is a pulse train, and by knowing the time spacing between the rising edges of the two pulses we are able to estimate the velocity. We essentially take two images, two exposures of a moving object, but in the same frame. So if I take the field of view, I will have the blue dots and the red dots all in the same image. Now, you have to imagine that there is really no blue and red, there is just grayscale, but they are all in the same image. So you take the image, take each segment, and shift each segment over itself until you get an autocorrelation peak, and you do this with each little segment.
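For this double-exposed, single-frame case, the segment is correlated with itself: the autocorrelation has a large trivial peak at zero shift and two symmetric side peaks at plus and minus the particle displacement. A minimal sketch, using the same brute-force correlation sum as before and a hypothetical four-drop pattern, could look like this:

```python
def correlation(win1, win2, dx, dy):
    # sum over the segment of I1(x, y) * I2(x + dx, y + dy), wrapping
    # at the edges for simplicity
    n = len(win1)
    return sum(win1[y][x] * win2[(y + dy) % n][(x + dx) % n]
               for y in range(n) for x in range(n))

def autocorrelation_displacement(frame, max_shift):
    """Correlate a double-exposed segment with itself; skip the trivial
    zero-shift peak and return the shift of the largest side peak."""
    best_score, best_shift = float("-inf"), (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            if (dx, dy) == (0, 0):
                continue  # the self-correlation peak carries no displacement
            s = correlation(frame, frame, dx, dy)
            if s > best_score:
                best_score, best_shift = s, (dx, dy)
    return best_shift

N = 32
frame = [[0.0] * N for _ in range(N)]
for r, c in [(5, 5), (10, 17), (20, 8), (25, 25)]:
    frame[r][c] = 1.0                        # first laser pulse
    frame[(r + 2) % N][(c + 3) % N] += 1.0   # second pulse, moved (3, 2)
dx, dy = autocorrelation_displacement(frame, 5)
print(abs(dx), abs(dy))  # magnitude of the displacement: 3 2
```

One known caveat of the single-frame approach is visible here: because the two side peaks are symmetric, the autocorrelation gives the magnitude and line of the displacement but not its sign, which is one reason two-frame cross-correlation is preferred when the camera timing allows it.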
So it is essentially the same segment; it is as if I have taken two frames and added them together, and I am going to see a duplicate of each particle pattern, just like I have drawn here with the blue and the red dots, except that the blue and red dots are all on the same frame. So if, for example, those are the blue dots and these are the red dots, and this is one of the frames, we are going to shift this segment over itself, moving in both the x and y directions, until we get an autocorrelation peak, and that gives me the displacement delta x. Now, it is advantageous to pulse the laser sheet and not pulse the camera, because you can get much shorter pulse spacings with a pulsed laser than you can with a camera. The primary bottleneck is the camera sensor: the old cameras had what is called a CCD (charge-coupled device) sensor array, and the newer ones have CMOS (complementary metal-oxide-semiconductor) sensor arrays. Once an image is acquired, that data has to be dumped into a memory location and the sensor emptied before a new image can be acquired, and this time between when the image is dumped and the sensor is ready for the next frame is too long for most PIV applications. So you leave the camera exposure on but pulse the laser, so that you have two successive snapshots, although you cannot tell one exposure from the other because they are part of the same frame. I can then shift individual segments of this frame over themselves until I get a high correlation peak. This is how PIV is practically implemented in a real system. We will continue this discussion in the next class, where we will talk about other non-intrusive techniques, starting with LDV, which is laser Doppler velocimetry, and then PDPA, which is the phase Doppler particle analyzer.
So that is now going to get us into the realm of being able to measure drop sizes. We will stop here and continue this in the next class.