So last time we met, we talked about iterative reconvolution and how to fit data. Today we will try to learn two things: first, what do we fit the data to, and second, how do we know whether the fit is good or not? If we look at the data, we might be able to say "this is a good fit" or "this is not a good fit", but the computer does not have eyes, so how does the computer know whether the fit is good or otherwise? Before we go there, let us discuss the data fitting models.

If you remember, we had said that in the simplest case we have a single exponential decay, and that in more complicated scenarios the most popular way of fitting data, though not necessarily always the correct way, is a multiexponential function. I think we are more or less familiar with these kinds of functions. The first one, I(t) = I(0) e^(−t/τ), is the simplest function you can fit the data to: I(t) is the fluorescence intensity at time t, I(0) is the fluorescence intensity at the time of excitation (time zero), and τ is the lifetime. This is essentially the integrated rate law for a first-order process; it cannot get any simpler than that. But then we said that life will not always be so simple, and we can have more complicated data.

The first kind of complication we can think of is multiexponential decay. Say you have several independent decay pathways; then I(t) is the same I(0), but multiplied not by one exponential term but by a weighted sum of several exponential terms: I(t) = I(0) Σi a_i e^(−t/τ_i). The number of components can be 2, 3, 4, in principle any number, and then you will have as many exponential terms and as many amplitudes a_i, which, as we will see, determine the contributions of the components.
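As a concrete sketch of these two fitting functions, here is a minimal Python version; the function name, the amplitudes, and the lifetimes are made up purely for illustration:

```python
import math

def multi_exp(t, i0, amps, taus):
    """I(t) = I(0) * sum_i a_i * exp(-t / tau_i).
    A single-exponential decay is just the special case of one
    amplitude/lifetime pair."""
    return i0 * sum(a * math.exp(-t / tau) for a, tau in zip(amps, taus))

# illustrative parameters: I(0) = 5000 counts, lifetimes in nanoseconds
print(multi_exp(0.0, 5000, [1.0], [2.5]))             # single exponential at t = 0
print(multi_exp(1.0, 5000, [0.7, 0.3], [1.0, 10.0]))  # biexponential at t = 1 ns
```

At t = 0 the exponentials are all 1, so the function returns I(0) times the sum of the amplitudes, which is why amplitudes are often normalized to sum to 1.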
Now, if you increase the number of exponential terms, you will generally get a better fit, because of something called overparametrization. So the first question to ask is: is my decay single exponential or not? The best way to check is to make a semilog plot, with counts on a logarithmic y axis and time on a linear x axis, and have a look at it. What will the shape of the curve be? If it is a single exponential decay, it is going to be a straight line; if it is not, it will not be a straight line. If it is a straight line and you nevertheless fit it to 2 or 3 exponential terms, it will still fit very well, but it will not make any sense. So the first thing one needs to do is a visual inspection, and the first question to ask is whether the decay is single exponential at all or something more complicated. Of course, looking at the decay you will never be able to tell whether it is biexponential or triexponential; you can only tell whether it is single exponential or not.
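This visual straight-line test can also be turned into a quick numerical check: on a semilog plot, a single-exponential decay has a constant slope of ln(counts) versus time. A minimal sketch, where the function name, the tolerance, and the synthetic decays are my own illustrative assumptions:

```python
import math

def is_single_exponential(times, counts, tol=0.02):
    """True if the local slope of ln(counts) vs t is (nearly) constant,
    i.e. the decay is a straight line on a semilog plot."""
    logs = [math.log(c) for c in counts]
    slopes = [(logs[i + 1] - logs[i]) / (times[i + 1] - times[i])
              for i in range(len(times) - 1)]
    mean = sum(slopes) / len(slopes)
    return all(abs(s - mean) <= tol * abs(mean) for s in slopes)

times = [0.5 * k for k in range(10)]
mono = [5000 * math.exp(-t / 2.0) for t in times]                         # single exponential
bi = [3000 * math.exp(-t / 0.5) + 2000 * math.exp(-t / 5.0) for t in times]  # biexponential
print(is_single_exponential(times, mono), is_single_exponential(times, bi))  # True False
```

With real photon-counting data the slopes are noisy, so the tolerance would have to be set from the counting statistics rather than a fixed percentage.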
Now, what are the implications of these terms? Let us start with the single exponential decay. What is the meaning of τ? τ is called the lifetime, and it is called that because it is the average time spent by the molecule in its excited state. Working that out was given to you as homework; it is also worked out in standard textbooks like Principles of Fluorescence Spectroscopy by Lakowicz. This lifetime τ is related to another quantity that we discussed very early in this course. Let us put it this way: if the lifetime is longer, do you expect the fluorescence to be more intense or less intense? More intense, because of the simple relationship Φf = kr × τ, where Φf is the fluorescence quantum yield. What is kr here? It is the radiative rate constant. Very often in the literature you will see people call it the radiative "rate", but let us not forget that it is a rate constant and not a rate; please be careful about that.

Now, this radiative rate constant is related to some fundamental quantities that we might have studied in spectroscopy courses during our MSc: Einstein's coefficients. Which one? The A coefficient, the coefficient for spontaneous emission; but A is linearly related to B, Einstein's coefficient for stimulated transitions between two states, upward or downward. Is there a B associated with the upward transition as well? Yes, and they are actually equal: B12 = B21, where 1 is the lower level and 2 is the higher level. In terms of experiment, which experimental quantity is associated with B12, the coefficient for absorption? Of course it is related to the transition moment integral, but that is something you get from quantum mechanics. What is it that I can measure using an instrument, without knowing any quantum mechanics? The molar extinction coefficient, or molar absorption coefficient; that is what is related to the radiative rate constant. There is a relationship between the two, which once again you can study on your own; we are not going to go into the details right now.

The good thing about knowing the fluorescence quantum yield from a steady-state measurement and the lifetime from a time-resolved measurement is that you can work out the radiative rate constant. More importantly, since 1 − Φf = knr × τ, you can work out the non-radiative rate constant as well. As we go further in our discussions, we will see that we will more and more want to know the rate constant associated with some non-radiative process that takes place in the excited state of a molecule, and this is how we will get the answer. Now, the problem is that we get the answer very nicely only if the decay is single exponential; the moment it is multiexponential, the situation becomes complicated. So when it is multiexponential, what is the implication of a_i and of τ_i? Let us ask that question now. The answer is that a_i × τ_i gives the contribution of the i-th component to the fluorescence intensity.
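These relations are easy to put into code. A small sketch; the function names and the numerical values are mine, chosen only for illustration:

```python
def rate_constants(phi_f, tau):
    """Phi_f = k_r * tau  and  1 - Phi_f = k_nr * tau.
    If tau is in ns, the rate constants come out in ns^-1."""
    return phi_f / tau, (1.0 - phi_f) / tau

def fractional_intensities(amps, taus):
    """a_i * tau_i is the contribution of component i; normalize to fractions."""
    products = [a * t for a, t in zip(amps, taus)]
    total = sum(products)
    return [p / total for p in products]

# single-exponential case: quantum yield 0.2, lifetime 4 ns
k_r, k_nr = rate_constants(0.2, 4.0)
print(k_r, k_nr)  # 0.05 0.2

# multiexponential case: 80% free (tau = 1 ns), 20% bound (tau = 10 ns)
print(fractional_intensities([0.8, 0.2], [1.0, 10.0]))
# the 20% bound population carries ~71% of the intensity
```

The second example is exactly the bound-versus-free situation discussed below: a small amplitude with a long lifetime can dominate the measured intensity.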
Now, this point needs to be understood very clearly in order to go further in the discussion of time-resolved fluorescence spectroscopy. Let us use an example that we will return to a little later. Think of two components, τ1 and τ2: τ1 is due to a fluorophore that is free, and τ2 is due to the same fluorophore bound to, say, cyclodextrin or a protein, with τ2 longer than τ1. Will the intensity be more or less? That depends not only on τ1 and τ2 but also on how much is bound and how much is free. Say the lifetime of the free form is 1 nanosecond and the lifetime of the bound form is 10 nanoseconds. What will the intensity be if 20% is bound, and what if 80% is bound? Naturally the intensity will be much more when 80% is bound. Where does that come from? From the fact that the contribution of the i-th component to the fluorescence intensity is a_i × τ_i. Remember, τ_i is an intrinsic, characteristic quantity, the lifetime, while a_i is the amplitude, and together they set the contribution.

This can have severe implications. Think of a nanoparticle we have made that is almost completely non-fluorescent; the only photoluminescence it has is due to some trap states. Say the time for recombination of electron and hole in the nanoparticle is something like 1 picosecond. 1 picosecond is a short time, so the fluorescence intensity should be low. But say there is some trap state, and the concentration of trap states is really very low, but the lifetime of the trap state is 100 nanoseconds. What will the photoluminescence of this nanoparticle be mainly due to: the trap states, which are very few in number, or the intrinsic band-edge recombination of electron and hole, which is taking place all the time? In the photoluminescence you will actually see a much greater contribution of the trap state, because its lifetime is 100 nanoseconds. This is an example in which a small a_i is overcome by a large τ_i; there are also cases in which a small τ_i is overcome by a large a_i.

Think of another case. Warfarin is a very common fluorophore used in fluorescence studies of proteins. The lifetime of free warfarin is something like 100 picoseconds; the lifetime of bound warfarin is about 2 nanoseconds. Now say I have very little protein, so almost all the warfarin is free. Will the intensity be high or low? It will be low, because the 100-picosecond component will have almost 100% contribution: its a_i is large, but its τ_i is small. When warfarin is bound to the protein, even if only 10% of it is bound, the fluorescence intensity will be much more, because the lifetime has increased from 100 picoseconds to 2 nanoseconds, a 20-fold increase. So the composite intensity is governed by the relative values of the amplitudes as well as the lifetimes; a_i × τ_i, remember, is the contribution of the i-th component to the fluorescence intensity.

So what is the steady-state intensity? When we talked about a single exponential decay, we could easily correlate the quantum yield, which is a measure of steady-state intensity, with the lifetime. Can we make some such correlation for a multiexponential decay? For any decay, actually, I hope you agree with me when I say that the steady-state intensity is the integral of the intensity over time from 0 to infinity after excitation. Of course, when I write infinity, it is only to make the statement general; it is not really infinity. For all practical purposes, infinity is the point where the decay has become almost zero. An exponential or multiexponential decay goes to zero only asymptotically, but suppose I(0) is 5000 counts and at 10 nanoseconds the intensity has become 5 counts; 5 is much, much less than 5000, so you can set it to almost zero. So what we are saying is

Iss = ∫0∞ I(t) dt,

or in other words the area under the decay; of course, we are talking about a particular emission wavelength. Is it understood that the steady-state intensity at any particular emission wavelength is the area under the decay? Then let us substitute the multiexponential expression. Since I(0) is a constant it comes out of the integral, and I can take the summation outside as well:

Iss = I(0) Σi a_i ∫0∞ e^(−t/τ_i) dt.

An advantage of setting the limits from 0 to infinity is that this becomes a standard integral whose solution is known, and when we put in the solution we get

Iss = I(0) Σi a_i τ_i, or equivalently I(0) = Iss / Σi a_i τ_i.

So here is a correlation between the steady-state intensity and the lifetimes. The take-home message is that it is not enough to look at the lifetimes alone; you have to look at the amplitudes, the contributions, as well.

But it is better to stop here and not get overenthusiastic and take it further, as, say, 80% of people in fluorescence spectroscopy do. What you see in the literature is that almost all decays are fitted to multiexponential functions, and everywhere people happily work with what they call the average lifetime: ⟨τ⟩ = Σi a_i τ_i / Σi a_i. Dimensionally this is fine, because it has the dimension of time, but this amplitude-weighted average lifetime, to be honest, has no meaning other than as a measure of the steady-state intensity. And if you are only going to talk about steady-state intensity, what was the point of doing a time-resolved measurement in the first place? So, as far as possible, it is better to avoid using average lifetimes. Also, this is not really the average lifetime. The real average lifetime is the intensity-weighted average, ⟨τ⟩ = Σi a_i τ_i² / Σi a_i τ_i. You see, the denominator here is the total intensity, so this average may have some meaning, being related to the area under the curve. But trying to work out radiative and non-radiative rate constants from an average lifetime is not a sensible thing to do: after all, you are saying that different lifetimes are associated with different processes, which would have different radiative and non-radiative rate constants, and if you take an average lifetime, all that individual information is lost. So if we have to work with a multiexponential decay, and if we have to use an average lifetime for some reason, let us at least not take it too far and work out rate constants from it. These amplitude-weighted lifetimes are not completely useless: they are used, for example, in Förster resonance energy transfer, where the amplitude-weighted lifetime has some application. But generally it is not correct to call it the average lifetime; the intensity-weighted one is the average lifetime, if at all, and even that is not very useful.

That being said, let us move on to something more complicated and therefore, many times, closer to reality. The next model we want to discuss is a distribution of lifetimes, and this is often a much better model than a sum of exponentials. The problem is this: when you fit a sum of exponentials, you imply that you have that many discrete lifetimes, but sometimes that is not the case. Suppose you have a range of microenvironments: not a zero-one situation, but some kind of micro-heterogeneous medium with graded polarity or graded viscosity. Say there is a polymer, very dense at the core and not dense at all on the outside, and your fluorophore is distributed over many places from the core outward. Now a multiexponential model is not valid. If it is simply bound versus free, the model is valid; but even in that bound-versus-free picture, think of a fluorophore bound to a protein. It is not always bound specifically to one site, experiencing one kind of environment; more often than not you can have non-specific binding, and then even the bound fluorophores experience different kinds of environments, in other words a distribution of environments. When we say environment, it is convenient to talk in terms of, say, polarity; we are all familiar with the dielectric constant, and even though the dielectric constant is not a good parameter of polarity in micro-heterogeneous media, for the sake of simplicity let us say dielectric constant.
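Before going on, the steady-state relation Iss = I(0) Σ a_i τ_i and the two "average lifetimes" discussed above can be checked numerically. A sketch with made-up amplitudes and lifetimes; the midpoint-rule integration grid is my own choice for illustration:

```python
import math

amps, taus, i0 = [0.6, 0.4], [1.0, 5.0], 1000.0  # made-up amplitudes/lifetimes (ns)

# area under the decay: midpoint-rule integral of I(t) from 0 to a practical "infinity"
dt, t_max = 0.001, 100.0
area = dt * sum(i0 * sum(a * math.exp(-((k + 0.5) * dt) / tau) for a, tau in zip(amps, taus))
                for k in range(int(t_max / dt)))
analytic = i0 * sum(a * t for a, t in zip(amps, taus))  # I(0) * sum_i a_i * tau_i

# amplitude-weighted vs intensity-weighted "average lifetime"
amp_w = sum(a * t for a, t in zip(amps, taus)) / sum(amps)
int_w = sum(a * t * t for a, t in zip(amps, taus)) / sum(a * t for a, t in zip(amps, taus))

print(round(area), round(analytic))      # both 2600: area under decay = I(0)*sum(a_i*tau_i)
print(round(amp_w, 2), round(int_w, 2))  # 2.6 vs 4.08: the two "averages" disagree
```

The gap between the two averages (2.6 ns versus about 4.1 ns here) is exactly why reporting "the" average lifetime without saying which weighting was used is misleading.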
Let us say our fluorophore is bound to a protein non-specifically and experiences a range of dielectric constants: the modal dielectric constant is, say, 20, and there is a distribution, say 20 ± 5. A distribution is going to have some kind of shape: it can be a Gaussian, a Lorentzian, a two-sided exponential, whatever, but some distribution function is there. For such a case, a better fitting function than the mundane multiexponential model is a distribution of lifetimes. Here you need to look at the function a little carefully, because to the untrained eye it might look like a multiexponential function:

I(t) = ∫0∞ α(τ) e^(−t/τ) dτ.

Please note it is dτ, not dt. Of course an integration is a summation, but here α(τ) is the distribution function of the lifetime, and we are integrating over the lifetime. I have not written the distribution function explicitly, because you might have to use different distribution functions depending on what kind of system it is; but this is more often than not a much better fitting model than a multiexponential one. A multiexponential function might well fit your decay; I am not saying it would not. As I might have said earlier in this course, I have actually seen the shape of an elephant drawn by a clever combination of 30 exponential functions; using a sufficient number of exponentials, and if you play with the amplitudes and lifetimes correctly, perhaps you can draw a self-portrait. But that would not mean anything. Is an elephant made up of 30 exponential functions? It is laughable. Similarly, just because your decay fits a multiexponential function does not mean that it is the correct model to use. If you are going to do a quantitative study, if you are going to extract as much juice as you can from your lifetime data, then it is important to go beyond the convenient multiexponential model and think about what your system is like and what kind of fitting model would be appropriate for it.

Fortunately, distribution-of-lifetimes fitting comes with commercial data fitting packages. In our lab we have two programs, one from PicoQuant and the other from IBH, which is now Horiba Jobin Yvon; both programs, I believe, have this option of fitting to a distribution of lifetimes. It is more difficult, it takes more time, it requires more playing around, but it is doable; with a better algorithm it is easier, but maybe we will postpone that discussion until we talk about actual data fitting and goodness of fit.

Let us move on. A distribution of lifetimes is often a better model, depending on what kind of system you are looking at, but as we discussed it is also a more complicated model; multiexponential fitting is easier. So often what we do, and these programs usually have a provision for letting you do it, is to avoid the trouble of explicitly using a distribution function (Gaussian, Lorentzian, etc.) by instead using a large number of exponential functions. At this point it might be a little confusing, because ten minutes ago I was saying not very kind things about multiexponential functions and now I am saying you can fit the data to a large number of them; bear with me for a while, it will start making sense. What you do is fit to a large number of exponential functions, but you tell the program what the lifetimes are. So you fit not to a 2- or 3-exponential function; if your computer and your program are good enough, fit to 100 exponentials, or 1000, over a wide range of fixed lifetimes. You say the lifetimes are 0.1 nanosecond, 1 nanosecond, 10 nanoseconds, 100 nanoseconds, and so on; usually they are spaced logarithmically, not linearly, so that you can look at small lifetimes as well as large lifetimes, and you fit your data to this function with all the lifetime values forcibly preset. So what is the only play you have? The only parameters that change are the amplitudes a_i: if you are using 100 lifetimes, you will get 100 amplitudes. Then you plot amplitude against lifetime, and you get plots like these.

This is actual data taken from a 2000 paper in Cellular and Molecular Biology. They looked at different emission wavelengths: 300 nanometers, 320 nanometers, 380 nanometers. Note the y axis, amplitude, and the x axis, lifetime; here, of course, they are not going up to 100 nanoseconds, rather they have gone from less than 0.1 nanosecond (do not ask me how they did that using time-correlated single photon counting) up to about 10 nanoseconds. If you look carefully at the x axis, you can see it is logarithmic, because you want to look at 0.1-nanosecond kinds of lifetimes as well as 5-6-nanosecond kinds; if it is not logarithmic, you are going to miss the short ones. FKBP59 is some kind of protein; let us not worry about what is what. What you see is that at 380 nanometers there are two kinds of lifetimes: something very small, 0.1 nanosecond or so, and something quite large, say 6 nanoseconds, and there is a distribution about 6 nanoseconds and a distribution about 0.1 nanosecond as well. That means, first of all, there are two broad kinds of environments; moreover, within each kind of environment there are subdomains, and that is why you get these distributions.

One reason why this approach, fitting your data to 100 exponentials with fixed lifetimes, can be better than using an explicit distribution is this: how do you know what the distribution is? How do you know it is Gaussian? Look at what we see here: it is actually a log-normal kind of distribution, but there is no way I could know beforehand whether it would be Gaussian, Lorentzian, log-normal or something else. The good thing about fitting to a many-exponential model where the lifetimes are fixed and the amplitudes are varied, and then making a plot of amplitude versus lifetime, is that you do not have to care what kind of distribution it is; it comes out automatically in your result. Of course, this may not really be log-normal either, because do not forget the x axis is logarithmic, not linear; I do not know what it is, but the point is that I am not assuming any particular kind of distribution, and whatever the distribution is, it is expected to show up in the result.

Now, when you go from 380 nanometers to 320 nanometers, what do you see? The 0.1-nanosecond kind of component is completely gone; instead you have quite a good distribution around, say, 0.3 nanoseconds. (If you work out the areas under this peak and that one, I do not know which will be more, especially because the scale is logarithmic.) So the 0.1-nanosecond component is gone, you get a 0.3-nanosecond component with its distribution, there is something new between 1 and 2 nanoseconds, also with a broad distribution, and whatever you had earlier is still there; but it now appears that the earlier sharp edge has given way to a completely new distribution. I do not know what the system is, and at the moment I do not care; I am just trying to show you some data and discuss what it would mean. When you go to 300 nanometers, you see that the 0.3-nanosecond component that had appeared is now the major one; it has some distribution, but not so much. You can see that here the full width at half maximum is several nanoseconds, while there it is hundreds of picoseconds; of course, you have to work out percentages. So now the 200-300-picosecond component is the major one, the long component has become very small, and the middle one has also gone down compared to what it was.

So here I hope we have been able to convey that by doing this kind of data fitting we actually get a wealth of information that we do not get if we mindlessly fit our data to a double or triple exponential model and resort to an average lifetime that means nothing; this actually tells you what your system is like. What we have learned so far is that, more often than not, you might have to work with a system where you have a distribution of lifetimes. One way of handling a distribution of lifetimes is to use a specific distribution function; the danger is that that function may not be the right one. The other way is to go back to the good old multiexponential function, but this time plot amplitude versus lifetime, and do not stop at 2 or 3: since you are doing multiexponential fitting, go all the way and fit to 100 exponentials. For that you have to have a good computer and a robust algorithm; we will talk a little bit about algorithms towards the end. Here we take a break; in the next module we continue with more data fitting models, and there we also learn about goodness of fit.
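To close, here is a sketch of the fixed-lifetime approach just described: the lifetimes sit on a logarithmic grid and only the non-negative amplitudes are varied. The solver below is a naive projected coordinate descent written purely for illustration (a commercial package would use a proper non-negative least-squares or maximum-entropy routine), the grid is kept coarse for speed, and the synthetic decay is made up:

```python
import math

def fit_fixed_lifetimes(times, counts, tau_grid, sweeps=150):
    """Fit counts to sum_j a_j * exp(-t / tau_j) with the tau_j FIXED
    and a_j >= 0. Naive projected coordinate descent -- an illustrative
    sketch, not a production solver."""
    cols = [[math.exp(-t / tau) for t in times] for tau in tau_grid]
    amps = [0.0] * len(tau_grid)
    for _ in range(sweeps):
        for j, col in enumerate(cols):
            # current residual, then an exact line search along column j,
            # projected onto a_j >= 0
            resid = [y - sum(a * c[k] for a, c in zip(amps, cols))
                     for k, y in enumerate(counts)]
            step = sum(r * c for r, c in zip(resid, col)) / sum(c * c for c in col)
            amps[j] = max(0.0, amps[j] + step)
    return amps

# synthetic biexponential decay: true lifetimes 1 ns and 10 ns
times = [0.2 * k for k in range(100)]
counts = [3.0 * math.exp(-t / 1.0) + 1.0 * math.exp(-t / 10.0) for t in times]

tau_grid = [10 ** (e / 2.0) for e in range(-2, 5)]  # 0.1 .. 100 ns, log-spaced
amps = fit_fixed_lifetimes(times, counts, tau_grid)

# plotting amps against tau_grid would show the weight piling up
# near 1 ns and 10 ns; print the appreciable components
for tau, a in zip(tau_grid, amps):
    if a > 0.05:
        print(round(tau, 2), round(a, 2))
```

A real analysis would use many more grid points (the 100 exponentials of the lecture) and weight the residuals by the Poisson counting statistics, but the structure, fixed lifetimes with only non-negative amplitudes free, is the same.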