So today I'm going to start by introducing the thermodynamic functions. I only need to introduce them once, because they are fairly straightforward, although I hope to make their physical meaning clear. If you look at this picture, it's just an ordinary fire. This is a photograph I took in Graz in Austria, but you can find the same scene in many places. Back in the early 19th century people were wondering what heat actually is. Is it a substance that you pass from one material to another? What exactly constitutes heat? There were various theories that it was a fluid, a substance of some kind, and it wasn't until about 1820 to 1830 that people started to think much harder about what heat actually means.

The sketch you see here is an apparatus constructed by Joule. There is a flask inside a container of water; the container is made of tinned steel plate, and this is a thermometer that he constructed himself. Remember, these were days when you couldn't simply go and buy a thermometer, but if you read his original paper he calibrated it so accurately that he could resolve a hundredth of a degree, and he checked it against another thermometer calibrated at the boiling point and the freezing point of pure water. His goal was this: a mechanical device pumps air into the chamber, so he could calculate the amount of work done in pumping, then work out how much heat is absorbed by the act of compression, and see whether the two are equal. Is the mechanical energy put in equal to the heat, which is the heat capacity times the temperature rise in the material? Lo and behold, after taking account of the errors, he could derive a mechanical equivalent of heat: if I lift a weight through some height, what is that equivalent to in terms of heat? This was the first real proof that heat is not a fluid of some sort passed from one material to another, but a quantity that can be converted into mechanical energy and vice versa. If you read his original paper, from 1845, it reads beautifully, and the amount of thought that went into designing this and the other critical experiments in that paper is amazing.

Now how can you explain converting mechanical energy into something that is locked inside the gas in the flask? The argument was fairly straightforward: we should think of heat as the vibration of particles, and those particles vibrate more as the temperature increases. Then it becomes clear how applying a mechanical force and compressing the gas makes the particles move more violently, and therefore you have heat. So Joule not only had a mechanical equivalent of heat, he could also explain the mechanism by which compressing the flask heats the gas inside it. It isn't surprising, therefore, that Joule used the word "thermodynamic": dynamic refers to the vibrations of all the atoms, which constitute the temperature or heat of the material, and thermal for obvious reasons.
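Just to make that book-keeping concrete, here is a minimal Python sketch of the sort of balance Joule was checking: mechanical work on one side, heat capacity times temperature rise on the other. The falling mass, drop height and amount of water are invented for illustration; they are not Joule's actual values.

```python
# Rough book-keeping of Joule's idea: mechanical work done on a system
# should reappear as heat (heat capacity times temperature rise).
# All numbers below are illustrative assumptions, not Joule's data.

g = 9.81             # gravitational acceleration, m/s^2
falling_mass = 10.0  # kg, mass driving the mechanism
drop_height = 2.0    # m

work_done = falling_mass * g * drop_height    # mechanical work, in joules

water_mass = 0.5     # kg of water in the calorimeter
c_water = 4184.0     # J/(kg K), specific heat capacity of water

delta_T = work_done / (water_mass * c_water)  # expected temperature rise
print(f"work done        = {work_done:.1f} J")
print(f"temperature rise = {delta_T*1000:.1f} mK")  # small, hence the 1/100 degree thermometer
```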
I believe that the paper shown here, from 1858, is the first that actually uses the term "thermodynamic", and that is what our course is about. Some really remarkable discoveries were made in those early days of trying to explain what heat actually is, because if you go back a long way the four elements used to be earth, water, fire and air; it was a very ethereal subject with no particular proof of what was happening. So Joule made an enormous contribution to the understanding of heat, and he also laid the foundations of the principle of the conservation of energy, because he found that the amount of work done mechanically was equal to the amount of heat generated inside the flask. The principle of conservation of energy has never been violated.

Now let's define some functions that we need in order to work with thermodynamics. The first is the internal energy, for which we use the symbol U. Imagine we have a system at a particular temperature, say T2, and we transfer some heat into it; by sign convention, heat transferred into the material is positive. As a consequence of that transfer, and because we are maintaining the temperature constant, the material expands and does work against the environment. That work, done against a particular pressure P, we label as negative. So the change in the internal energy of our sample is the amount of heat we put in minus the work done, and in differential form dU = dq - dw, where dw is the work done against the pressure surrounding the object. That is the definition of internal energy.

Now consider heat capacity. I mentioned it in the context of Joule's experiments, but let's define it formally: the heat capacity is the heat absorbed per unit change in temperature, so if dq is the heat transferred into the material and dT is the temperature change, the heat capacity is simply dq/dT. Since dU = dq - PdV, we can write dq = dU + PdV, the change in internal energy plus the work done against the environment. At constant volume, therefore, we define the heat capacity as Cv = dU/dT with the volume held constant. That is the meaning of heat capacity.

Now, how do we measure heat capacity? It is a really important parameter: it helps us calculate thermodynamic properties, and you will see later that we can derive quantities like entropy and free energies from measured heat capacities. A very simple way of measuring it is with a calorimeter of some sort. The instrument shown here could be a differential thermal analyser or a differential scanning calorimeter; they both work on the same principle. We have two identical cans, one of which contains the sample, while the other is kept empty or contains an inert reference which does not react at all. These are enclosed in a furnace whose internal temperature is controlled extremely accurately, with an inert gas to protect the samples. You then ramp up the furnace temperature at a certain rate, and because the sample has a certain heat capacity whereas the reference is an empty can, they reach different temperatures.
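A rough sketch of the principle, not of any particular instrument's data handling: the differential heat flow divided by the heating rate and the sample mass gives the specific heat capacity. In the Python snippet below the heat-flow signal and the Cp(T) behaviour are synthetic assumptions, and a real measurement also needs baseline and calibration runs, which are omitted here.

```python
import numpy as np

# Idealised DSC-style estimate of heat capacity:
#   Cp ~ (differential heat flow into the sample) / (heating rate * sample mass)
# The "measured" signal here is generated from an assumed Cp(T).

heating_rate = 10.0 / 60.0   # K/s (a 10 K per minute ramp)
sample_mass = 20e-6          # kg (a 20 mg sample)

temperature = np.linspace(300.0, 800.0, 501)           # K
true_cp = 450.0 + 0.15 * (temperature - 300.0)          # J/(kg K), assumed behaviour
heat_flow = true_cp * sample_mass * heating_rate         # W, the idealised signal

cp_measured = heat_flow / (sample_mass * heating_rate)   # back out Cp from the signal
print(cp_measured[0], cp_measured[-1])                   # ~450 and ~525 J/(kg K)
```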
The sample will in general end up at a different temperature from the reference, depending on how much heat it absorbs per unit change in the furnace temperature, so by looking at the temperature difference between the reference and the sample you can work out the heat capacity. If the sample undergoes a reaction, you can also measure the enthalpy change accompanying the reaction, although we haven't defined enthalpy yet. Equipment like this is now quite routine and can be used to measure thermodynamic properties such as the heat capacity.

Now, normally we are not working at constant volume. If we take a lump of iron and heat it up, it is likely to expand; in other words we are not measuring a property at constant volume. So instead of the internal energy we define a separate quantity known as the enthalpy: the enthalpy is the internal energy plus the product of the pressure and the volume of the system, H = U + PV. That gives us another definition of heat capacity, the change in enthalpy per unit temperature at constant pressure, Cp = dH/dT at constant P, and this is the more common heat capacity used in most applications. Supposing you heat a sample from a temperature T1 to T2, you can obtain the change in enthalpy simply by integrating the heat capacity with respect to temperature, so we get direct information on the enthalpy changes inside the material using the calorimeter I showed you earlier.

Given that Cp is the heat capacity at constant pressure and Cv is the heat capacity at constant volume, it is quite reasonable that the difference between the two depends on the bulk modulus of the material: if your material is very stiff, it is not going to expand much against the pressure of the environment. There is also a term that depends on the thermal expansion coefficient. I don't want to go into the detail of that relationship, but for solids the difference between Cp and Cv is really quite small. Here, for example, are data for copper and tungsten, which are often used as references; the heat capacities are of course a function of temperature, and I am giving you the values at 300 K. You can see there is not much difference between the heat capacity at constant pressure and the heat capacity at constant volume. That is of course not the case if you are working with something like a gas.

Now we ask a more fundamental question. Consider any reaction, say A + B going to C. Is it the case that when the enthalpy is reduced in the reaction, meaning energy is liberated because the enthalpy of C is less than the combined enthalpies of A and B, this decides whether or not the reaction happens spontaneously? Can we say that if the heat of reaction is negative, that is, heat is released, then the reaction should go forward? How do we decide whether the reaction actually occurs? It was Clausius, and before him Carnot, who started to address this question many years ago. Clausius noted, for example, that reactions can happen even when there is no enthalpy change, so something was missing, and he defined that missing quantity as entropy; the Greek root of the word carries the sense of a transformation, of giving a direction, which is exactly the question I was asking earlier: will this reaction happen, and how can I define a quantity which tells me it will happen spontaneously in a particular direction? That brings us to another piece of classical work, done by Carnot in 1824.
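Before moving on to Carnot, and just to make the enthalpy integration above concrete, here is a minimal Python sketch that integrates an assumed Cp(T) polynomial between two temperatures; the coefficients are invented for illustration and are not data for any real material.

```python
import numpy as np

# Enthalpy change from the measured heat capacity: dH = integral of Cp dT.
# The Cp(T) form below is an assumed polynomial, purely for illustration.

def cp(T):                                   # J/(mol K)
    return 22.6 + 6.3e-3 * T - 1.2e5 / T**2

T = np.linspace(300.0, 1000.0, 2001)         # K
cp_vals = cp(T)
# trapezoidal rule for the integral of Cp dT between 300 K and 1000 K
delta_H = float(np.sum(0.5 * (cp_vals[1:] + cp_vals[:-1]) * np.diff(T)))
print(f"dH(300 K -> 1000 K) = {delta_H/1000:.1f} kJ/mol")   # roughly 18 kJ/mol here
```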
Carnot's 1824 memoir is basically about heat and engines; the whole article is about heat and engines, and what he did was absolutely brilliant. He imagined an experiment in his mind, because at that time people were building steam engines, and the basic question was: what efficiency can you get out of a steam engine? So he did a virtual experiment, and I am going to explain it to you.

Imagine we have a system at a temperature T2 and we put a quantity of heat q2 into it. I have kept the colours consistent, so all the red regions are at the same temperature and all the blue regions are at the same temperature. As I explained before, when we transfer heat into this sample it expands in order to maintain its temperature: we have added heat and we want to keep the temperature the same, so it expands isothermally and does a certain amount of work. That is the isothermal expansion. If we then insulate the system completely and expand it adiabatically, its temperature drops to another temperature T1 and a further quantity of work is done. There is nothing complicated here: first we add heat isothermally, so it is natural that the gas expands, and then we do an adiabatic expansion with no heat added or removed, so the temperature drops to T1. Now I want to get back to the original state. First I do an isothermal compression, which lets out a quantity of heat q1: if I compress the gas while maintaining the temperature T1, I have clearly put energy in, and I must get rid of that energy to keep the temperature at T1. Then I compress adiabatically so that I recover my original system; the temperature rises back to T2 in this adiabatic compression. This is what we call a reversible cycle: you return exactly to the original point, and nothing has been lost to, say, friction or other irreversible processes.

Now, the total work done by the system is -w; remember our sign convention, that work done by the object is negative and work done on the object is positive. So -w is equal to the heat we put in plus the heat we take out, the latter carrying a minus sign; in other words the work done is the difference between those two quantities of heat. We can therefore define the efficiency as the work done divided by the heat put in. We compare with q2 because that is the energy we actually supply to the fluid, so -w/q2 defines the efficiency of the cycle, and -w/q2 = (q2 + q1)/q2. This is a really important equation: it is the thermodynamic efficiency of the Carnot cycle. What Kelvin did was to define temperature using this equation. He said, in effect, that we cannot have an efficiency greater than one, and therefore defined an absolute zero of temperature at which the efficiency would be exactly one: the efficiency can be written as (T2 - T1)/T2, so if T1, the lowest temperature of the cycle, is zero kelvin, the efficiency is exactly one. That is why we talk about an absolute zero. In the Carnot cycle you cannot get an efficiency greater than one, which would mean that all the energy put into the material is converted into work, so the absolute zero of temperature is defined in terms of the Carnot cycle: when T1 = 0 we have a thermodynamic efficiency of one.
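As a small numerical illustration of that limit, with arbitrary temperatures, here is the Carnot efficiency written as a one-line Python function.

```python
# Carnot efficiency from the cycle analysis: eta = -w/q2 = (T2 - T1)/T2.
# The temperatures below are illustrative, not tied to any particular engine.

def carnot_efficiency(T_hot, T_cold):
    return 1.0 - T_cold / T_hot

print(carnot_efficiency(900.0, 300.0))  # ~0.67: the ceiling for heat taken in at 900 K, rejected at 300 K
print(carnot_efficiency(900.0, 0.0))    # 1.0 when the cold sink is at absolute zero, as in Kelvin's definition
```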
And this equation is the maximum efficiency that you can get in a cycle like this. If any irreversible processes happen, for example the friction that I mentioned, or noise being generated and dissipated, then the efficiency will be less than this. So when we talk about making aircraft engines more efficient by increasing the highest operating temperature, we are not going to get anywhere near the thermodynamic efficiency, simply because there are other dissipative processes happening which mean that we cannot achieve the maximum.

So this equation is extremely powerful. It is the guiding principle for making better steam-to-electricity generators in power plant, for better efficiency in aircraft engines, and it is also a measure of how much energy we are wasting, energy that we are not recovering as work. Now, if I take this equation and rearrange it, and you can try this for yourself, you get q2/T2 + q1/T1 = 0, and in differential form dq2/T2 + dq1/T1 = 0. Since we have recovered our original state by going round the cycle, the integral of dq/T over that cycle is zero for a reversible process, and this quantity of course is the entropy, the entropy that Clausius defined from his analysis of the Carnot cycle.

That is for a reversible process, where dq/T integrated over the cycle is zero. If we have an irreversible process, for example the one shown here, then what I have is a body at a temperature T2 and another body at a lower temperature T1, connected by something that allows heat to flow through, for example copper, so heat flows from the hot body into the cold one. This process is irreversible, because we are dissipating energy in going from T2 to T1, and in such a process the entropy change, according to the definition we have just derived, is q/T1 - q/T2, the final state minus the initial state. That clearly has to be greater than zero, because T2 is greater than T1, so q/T2 is the smaller quantity. So this is the second really important conclusion: in any irreversible process, the change in entropy is greater than zero.

Entropy is a difficult concept to grasp, but in terms of the Carnot cycle it is straightforward: we managed to define the thermodynamic efficiency, and also the fact that an irreversible process gives an increase in entropy. But what does this diagram actually show? It shows that we started with organised energy, a sample which is hot and a sample which is cold, and by allowing the transfer of heat we have spread that energy out. That is a greater degree of disorder than having the heat organised. When you have a charged battery in your phone, that is organised energy, and as you use the phone you dissipate that energy and spread it out, so the entropy is increasing all the time. So entropy really does correspond to disorder being created.
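Here is a one-line numerical check of that conclusion, with purely illustrative numbers: the entropy gained by the cold body exceeds the entropy given up by the hot body, so the total change is positive.

```python
# Entropy change when heat q leaks irreversibly from a hot body at T2
# to a cold body at T1: delta_S = q/T1 - q/T2 > 0. Numbers are illustrative.

q  = 1000.0   # J transferred
T2 = 500.0    # K, hot body
T1 = 300.0    # K, cold body

delta_S = q / T1 - q / T2
print(f"delta_S = {delta_S:.2f} J/K  (> 0, as expected for an irreversible process)")
```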
Now I will show you another example of this. It is sometimes difficult to link the entropy description I am going to give you on the next slide with the Carnot and Clausius description, but you will see that they are both dealing with an increase in disorder. Imagine that we have a crystal with red atoms on one side and black atoms on the other: there are n atoms of type A, say the red ones, and N - n of type B, where capital N is the total number of atoms. This is highly organised, with all the red atoms on one side and all the black atoms on the other, but suppose we allow them to mix in some way. I have drawn three different arrangements that I can make without changing the composition or the size of the crystal; they are simply three different arrangements. The probability of getting a disorganised array of atoms is much greater than that of getting an organised array, and we can work that out.

If I place the first atom onto this lattice, imagining that all the sites are initially unoccupied, I can place it at N different locations, and the second atom then has access to only N - 1 locations. We have to make a correction for that second atom: there are two orders in which I could have placed the pair, and because the two configurations cannot be distinguished we divide by a factor of two. If I choose this atom first, the second goes to the other location, and if I choose the other atom of the pair first I get the same arrangement, so we divide by two. Continuing this process, you end up with the total number of distinguishable arrangements being the factorial expression N! divided by n!(N - n)!. That gives us the total number of arrangements we can have in this system of atoms.

Boltzmann was working on this at a time when people were not absolutely certain whether matter was particulate, that is, consisting of atoms; there were huge arguments about it, and they were often not very pleasant. He wrote the entropy as being proportional to the logarithm of the number of configurations, the number of arrangements of these atoms. The reason for choosing a logarithm is that entropy is an additive property: if I take two bodies with two different entropies and put them together, the total is the sum of the two, and by taking the log of the number of configurations the terms add up to give the total entropy. The constant k, which we call the Boltzmann constant, is simply the proportionality constant between the entropy and the log of the number of configurations. If I substitute the expression for the number of configurations into the Boltzmann equation, I end up with the result that the change in entropy when you mix the atoms is minus R times the sum, over the species present, of the concentration of each species multiplied by the logarithm of that concentration, where R, the Boltzmann constant times Avogadro's number, is the gas constant. I am going to show you how to derive this, and in order to do so I need to change the screen that I am sharing to another device.
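As a quick sanity check on a small, arbitrary lattice rather than 10^23 atoms, the Python sketch below evaluates S = k ln W exactly from the binomial coefficient and compares it with the Stirling-approximation form that we derive next; the lattice size and composition are invented.

```python
import math

# S = k ln W, with W = N! / (n! (N - n)!) for n A-atoms on N sites.
# For a small lattice the factorials can be evaluated exactly and compared
# with the Stirling form  -k N [x ln x + (1 - x) ln(1 - x)],  x = n/N.

k_B = 1.380649e-23   # J/K

def entropy_exact(N, n):
    W = math.comb(N, n)              # number of distinguishable arrangements
    return k_B * math.log(W)

def entropy_stirling(N, n):
    x = n / N
    return -k_B * N * (x * math.log(x) + (1 - x) * math.log(1 - x))

N, n = 1000, 400                      # arbitrary small example
print(entropy_exact(N, n), entropy_stirling(N, n))   # already agree to well under 1 %
```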
Let us derive the logarithm of the number of configurations using Stirling's approximation, which says that ln y! is approximately y ln y - y. Applying it to ln W = ln N! - ln n! - ln(N - n)! gives ln W approximately equal to (N ln N - N) - (n ln n - n) - [(N - n) ln(N - n) - (N - n)]. The terms -N + n + (N - n) cancel, so the equation becomes ln W = N ln N - n ln n - (N - n) ln(N - n). If I now rewrite N ln N as [n + (N - n)] ln N and bring all the terms together, I get ln W = -n ln(n/N) - (N - n) ln[(N - n)/N]. Dividing through by the number of atoms and writing x = n/N, the molar entropy of mixing becomes -R[x ln x + (1 - x) ln(1 - x)], which is exactly the equation that we were trying to derive, and that is a representation of the configurational entropy. Again it represents a degree of disorder, because we started with a highly organised crystal, all the black atoms on one side and all the red atoms on the other, and then we allowed all the other possibilities; this is the change in entropy in going from a highly organised system, with only one possible arrangement, to one with very many possibilities. You can imagine that with something like 10^23 atoms in the system the number of arrangements is extremely large. So that was basically a substitution of the number of configurations into Stirling's approximation to derive the mixing entropy.

Now, in that particular example of the disordering of the crystal, let us assume there is no difference in bond energy between A-A, A-B and B-B bonds, in other words that we form an ideal solution. What then would drive the disordering process? The enthalpy change is zero, since we have said there is no change in bond energy when we mix up the atoms, but the reaction nevertheless happens when we mix the A and B atoms together and allow them to move: the entropy change is greater than zero, and therefore the reaction proceeds in the direction of disordering. So the enthalpy change by itself is not sufficient to decide the direction of a reaction; we will bear that in mind.

Before going further, recall that we have the equation dS = dq/T for the differential change in entropy, so we can work out the entropy change as we heat a sample from T1 to T2 simply as the integral of Cp/T dT, which again is a quantity you can measure directly using calorimetry. We therefore need to define another function which tells us the direction in which a reaction can proceed spontaneously, and that is the Gibbs free energy, G = H - TS, the enthalpy minus the temperature times the entropy. It is this quantity which must be reduced for a reaction to be able to occur spontaneously. That is at constant pressure; if we were working not at constant pressure but at constant volume, G would be replaced by the Helmholtz free energy, which has the symbol F, and the enthalpy would be replaced by the internal energy U, so F = U - TS. So you can see that the heat capacity is an extremely important parameter from an experimental point of view: you can do measurements, you can even measure the enthalpy change on the calorimeter that I described to you earlier, and although we now routinely use phase diagram calculations, all those calculations depend on thermodynamic data which at some stage were measured.
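Tying the mixing entropy to the free energy we have just defined, here is a minimal Python sketch under the ideal-solution assumption, where the enthalpy of mixing is zero and so the free energy change is purely entropic; the temperature is arbitrary.

```python
import numpy as np

# For an ideal solution dH_mix = 0, so the driving force for mixing is entropic:
#   dG_mix = -T dS_mix = R T [x ln x + (1 - x) ln(1 - x)] < 0   for 0 < x < 1,
# which is why the ordered crystal disorders spontaneously. Values illustrative.

R = 8.314                          # J/(mol K)
T = 1000.0                         # K, arbitrary
x = np.linspace(0.01, 0.99, 99)    # mole fraction of A

dS_mix = -R * (x * np.log(x) + (1 - x) * np.log(1 - x))   # J/(mol K)
dG_mix = -T * dS_mix                                       # J/mol, with dH_mix = 0 assumed
print(dS_mix.max(), dG_mix.min())  # entropy largest and dG most negative at x = 0.5
```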
Now, what is the mechanism by which a material absorbs heat? For metals you can factorise the heat capacity into three essential terms. One is due to the lattice vibrations: the vibrations become more energetic as the temperature is increased, and that is one mechanism for absorbing heat. Secondly, we have an electronic contribution, because in metals some of the electrons are not bound to individual atoms, they are able to move like an electron gas, and therefore they provide another means of absorbing energy. And thirdly a magnetic term: if you have magnetic ordering, and as you raise the temperature the magnetic spins tend to become disordered, then that too is a mechanism for absorbing energy. I will repeat this many times in this series of lectures, but in the case of iron, body-centred cubic iron, if we did not have that magnetic term, the paramagnetic disordering as we go beyond the Curie temperature, then we would not have body-centred cubic iron stable at ambient temperature and pressure. Just imagine your life without body-centred cubic iron: it would be absolutely miserable. So you owe a good deal of civilisation to magnetism.

Now let me just comment on the magnitudes of the lattice and electronic terms. Lattice vibrations follow what is known as the Boltzmann distribution. Suppose you take a cylinder, fill it with ping-pong balls that are all at rest, and then pump gas in from underneath: the balls will not all rise to the same energy level. There will be a distribution of heights to which the balls jump, and that is the kind of distribution we call the Boltzmann distribution; there is no limitation on how many balls can occupy a particular energy state. That is not the case for electrons: each energy state can only be occupied by two electrons with opposite spins, and that is what is illustrated here. On one side is the Boltzmann distribution I talked about, with really no limit on how many particles can occupy a particular energy level; for electrons, only two can occupy any particular state. So it is only the electrons close to what is known as the Fermi energy that matter: at zero kelvin the states are filled up to a maximum energy, which is called the Fermi energy, and as we raise the temperature only those electrons in the vicinity of the Fermi energy can jump to higher energy levels. It is therefore a very small fraction of the electrons that can participate in the process of absorbing heat. The Fermi temperature for iron, for example, is something like 14,000 kelvin; at that sort of temperature all the electrons would participate in the absorption of energy, but we are not going to get to 14,000 kelvin, so in general there is very little heat capacity from the electronic term.

For the lattice vibrations, the heat capacity follows a curve like this. The classical limit, reached when every atom vibrates essentially independently with its three degrees of vibrational freedom, is a heat capacity of 3Nk. It is at the Debye temperature TD that a metal begins to behave as if all the atoms are vibrating independently, no longer constrained, in a sense, by the lattice; at lower temperatures an atom vibrating one way will obviously suffer interference from the atoms around it, so the vibrations are a little more collective and the heat capacity decreases as you go down in temperature. For iron, the Debye temperature is of the order of 400 kelvin.
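As a rough sketch of those two contributions, deliberately leaving out the magnetic term, the Python snippet below evaluates the Debye lattice heat capacity plus a linear electronic term. The Debye temperature and the electronic coefficient are assumed, order-of-magnitude values rather than reference data for iron.

```python
import numpy as np
from scipy.integrate import quad

# Lattice (Debye) plus electronic contributions to the molar heat capacity.
# theta_D and gamma_el are assumed, illustrative values; the magnetic term
# discussed in the lecture is not included here.

R = 8.314          # J/(mol K)
theta_D = 470.0    # K, assumed Debye temperature
gamma_el = 5.0e-3  # J/(mol K^2), assumed electronic coefficient

def cv_debye(T):
    """Debye lattice heat capacity in J/(mol K)."""
    integrand = lambda x: x**4 * np.exp(x) / (np.exp(x) - 1.0)**2
    integral, _ = quad(integrand, 0.0, theta_D / T)
    return 9.0 * R * (T / theta_D)**3 * integral

for T in (100.0, 300.0, 600.0):
    print(T, round(cv_debye(T), 2), round(gamma_el * T, 2))  # lattice term dominates here
```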
This illustrates the point I was making earlier. The dashed curve here represents the heat capacity due to the lattice vibrations plus the electronic term for alpha iron, the body-centred cubic form of iron, and the other curve contains all three terms, the magnetic, electronic and lattice vibrations, so you can see there is a very large contribution from the magnetic term in the case of body-centred cubic iron. The same actually applies to austenite: there is a magnetic contribution to its heat capacity which is quite significant. So the heat capacity is a really important parameter that we can measure directly and use in calculations. These particular data come from work published in the early 1950s showing bcc iron at ambient conditions, then fcc iron, and then a reversion back to bcc iron at high temperature, which is very strange behaviour, and it is entirely due to the magnetic properties of iron.

So this is what I wanted to teach you today, but learning thermodynamics can be a bit abstract, so I am now going to show you how you can immediately apply the knowledge from today to some of the latest research going on in the world. The first example is the so-called high-entropy alloys. This was an idea originating, I think, from Brian Cantor, who wanted to work on alloys containing equal concentrations of all the elements, so that there is no solvent and no solute: every element in the alloy is at the same concentration. If you do that, then, assuming all the atoms are mixed at random, you get the maximum configurational entropy, and there is a sense of confusion inside the alloy which tends to stabilise a single phase. Even with a combination of 20 atomic per cent each of cobalt, manganese, iron, chromium and nickel, you end up with a single-phase alloy. They are called high-entropy alloys because, when all the elements are at exactly the same concentration, the entropy of mixing is at a maximum.

It is very interesting that this complex alloy, with five different elements, is a single phase even though the crystal structures of the elements are different: only nickel has the fcc structure at ambient temperature, cobalt is hexagonal close-packed, manganese has a unit cell with 58 atoms in it, and we have body-centred cubic iron and body-centred cubic chromium. Even though the crystal structures are so different, when you mix the elements in equal quantities you end up with a face-centred cubic structure, exactly like nickel. That is good: as we know, there are many slip systems and so forth, so the material should have good properties, and indeed the properties are very good. If I had not told you this was a high-entropy alloy you might say it was copper or just pure nickel, yet it has a strength of one gigapascal, a fracture toughness of 200 MPa m^1/2 and an elongation of 70 per cent.

What is the next stage required in the development of this subject? My opinion is, first of all, that the high-entropy story is overplayed: the maximum-entropy equation only applies to an ideal solution, where there is no change in enthalpy when you mix the components, and that is absolutely not the case here. You will not get a random distribution of these atoms inside the alloy; you will in fact get clustering and other phenomena. But suppose we forget about the high-entropy label and simply focus on the fact that we have equal concentrations of everything and we have obtained a single phase.
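For reference, the ideal configurational entropy that gives these alloys their name is minus R times the sum of x ln x over the components, which for an equimolar n-component random solution is R ln n. The sketch below evaluates it for the five-component case, with the caveat just made that a real alloy is not an ideal random solution.

```python
import math

# Ideal configurational entropy of mixing for an equimolar five-component alloy:
#   dS_mix = -R * sum(x_i * ln x_i) = R ln 5
# This assumes a perfectly random solid solution, which real alloys only approximate.

R = 8.314                                       # J/(mol K)
x = [0.2] * 5                                   # Co, Mn, Fe, Cr, Ni at 20 at.% each
dS_mix = -R * sum(xi * math.log(xi) for xi in x)
print(dS_mix, R * math.log(5))                  # both ~13.4 J/(mol K)
```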
What is to stop us from exploiting these properties? That, I think, is where the subject is stuck and where much more research is needed: how do you scale up an alloy like this, so that when you actually make it solidify you do not get segregation and the many other complications which lead to precipitation? And what application do I have that needs such an expensive alloy? Right now I would argue that there is not a single application of high-entropy alloys. The science has essentially been done; we can always dot the i's and cross the t's, but if you want to be serious about this you need to be able to scale it up and to find an application which genuinely needs an alloy at this level of expense.

The second example I am going to give you uses only the information we have learned today, just after one lecture on thermodynamics. Suppose that we have a defect, for example a vacancy, inside a lattice. Obviously the number of ways in which we can arrange the atoms on the lattice increases dramatically, because the vacancy can be located at any of the sites. Let us say that the energy required to create one vacancy is Δg, and that n is the number of such defects. The term nΔg is a cost, and that cost means the defect does not really want to form; however, the configurational entropy term favours the formation of the vacancy, because it favours disorder: the more vacancies we put in, up to a limit, the more configurations we have. So these two terms oppose each other. To find the equilibrium value you differentiate the free energy with respect to the number of defects and set it to zero, and you get that the equilibrium number of defects, n divided by the total number of sites N, is equal to the exponential of minus the energy of formation of a single defect divided by kT. What this shows is that you will always have defects at equilibrium: there is no way you can create a defect-free material if you are at equilibrium. And this of course is the reason why we get diffusion in the solid state, because it is the jumping of atoms into these vacancies which gives us the transport of atoms.
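As a quick illustration of how strongly temperature enters that result, here is a Python sketch evaluating n/N = exp(-Δg/kT); the vacancy formation energy used is an assumed, order-of-magnitude value rather than a measured one for any particular metal.

```python
import math

# Equilibrium vacancy fraction from minimising  G = n*dg - T*S_config:
#   n/N = exp(-dg / (k T))
# The formation energy dg below is an assumed, order-of-magnitude value.

k_B = 8.617e-5   # eV/K, Boltzmann constant
dg = 1.4         # eV per vacancy, assumed formation energy

for T in (300.0, 1000.0, 1800.0):
    fraction = math.exp(-dg / (k_B * T))
    print(f"T = {T:6.0f} K   n/N = {fraction:.3e}")   # never zero: defects always exist at equilibrium
```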
But much more importantly, if you look at some work done back in 1956 on the strength of single crystals of iron, you can see that strengths of the order of 15 gigapascals can be achieved in single crystals of absolutely pure iron, yet the strength collapses as the size increases. The equation we have just derived gives you the clue: N, the total number of sites, is effectively a measure of the size of the sample, and as we increase the number of sites the equilibrium number of defects increases with it, so the strength collapses to what we normally get for iron. If you try to get strength by making something perfect, and you can get an essentially perfect crystal if you make it small enough, then you have to rip the atoms apart to break it, and that is why it is so strong; but as you increase the size, the probability of finding a defect also increases, and therefore the strength collapses. That is a lesson that was learned in 1956, and we could have saved literally hundreds of millions of pounds if people had studied this paper and some elementary thermodynamics, because there was a lot of noise made when carbon nanotubes came about and the strength of carbon nanotubes was said to be 130 gigapascals, which is basically the strength of the carbon-carbon bond. A huge amount of money was wasted in trying to make carbon nanotubes into structural materials, but of course you only get a strength of 130 gigapascals when the tube is very, very small; its strength collapses as you increase the size, for exactly the same reason contained in that very fundamental configurational entropy equation that we derived. Precisely the same applies to graphene: as soon as you scale up the size of graphene, its strength is lost. So it is complete nonsense to talk about these materials as structural materials on the size scales that we are normally used to.

So thermodynamics is a very powerful tool with which you can influence your research and your thinking, and with which you can interpret news items, because graphene and the nanotubes generated enormous news stories with ridiculous statements about how they are 200 times stronger than steel, whereas an elementary thermodynamic equation would have told you that this is not a reasonable thing to say.

In the next lecture, having defined all these quantities, I will deal with equilibrium: if I put two different compositions, or two different lattices and compositions, together, how can we decide whether that initial configuration will change or not, no matter how long I observe it? In the case of an allotropic transition in a pure substance we can plot the Gibbs free energy of each phase as a function of temperature, and where the curves intersect the phases have equal free energy, so there is no tendency whatsoever to change from one to the other, and that clearly defines an equilibrium temperature. But we very rarely deal with pure substances, so we need to think a little bit more about the nature of a solid solution in a multi-component system. I'll end the talk now.