So, let us begin. We had started looking at the notion of differentiability for functions of several variables. We saw examples of a function f defined on a domain D in R² which is not continuous at a point even though both of its partial derivatives exist there. So the existence of partial derivatives is not good enough: we need a stronger definition of what we should call differentiability for functions of several variables.

Let us go back to one variable. For a function of one variable, we said f is differentiable at a point x = c if we can write, for every h ≠ 0,

f(c + h) = f(c) + h f′(c) + h ε(h),

where the error ε(h) → 0 as h → 0. This is the analytical way of stating differentiability of a function of one variable, and it is equivalent to the existence of the derivative. We will take this as the model for the definition in two variables.

Definition. Let f be a function of two variables defined on a domain D, a nice domain so that everything is okay, say containing an open ball around the point. We say f is differentiable at a point (x₀, y₀) ∈ D if the following two conditions hold.

First, both partial derivatives f_x(x₀, y₀) and f_y(x₀, y₀) exist.

Second, something like that error decomposition happens in both variables. What does it mean? The value of the function at a nearby point, f(x₀ + h, y₀ + k), should equal the value at the point plus a contribution from each variable. In one variable we had h f′(c); here the increment h in the x variable contributes h f_x(x₀, y₀), the increment k in the y variable contributes k f_y(x₀, y₀), and there are two error functions:

f(x₀ + h, y₀ + k) = f(x₀, y₀) + h f_x(x₀, y₀) + k f_y(x₀, y₀) + h ε₁(h, k) + k ε₂(h, k),

where ε₁(h, k) and ε₂(h, k) are functions of two variables that go to 0 as (h, k) → (0, 0). So this is what differentiability of a function of two variables means: we take care of each variable in a way similar to the one-variable definition. The value at a nearby point is the value at the point, plus h times the partial derivative in the x direction, plus k times the partial derivative in the y direction, plus errors attached to both increments, which errors go to 0 as h and k go to 0. It looks slightly complicated, but it is very much amenable, in the sense that it closely parallels the one-variable definition.

So, let us have some observations. Note, first, that if f is differentiable at (x₀, y₀), then the definition itself requires that the partial derivatives f_x(x₀, y₀) and f_y(x₀, y₀) exist; this is a necessary requirement. Hence, if either of these partial derivatives does not exist, then f is not differentiable there. That is remark one. So, let us look at an example, the simplest one.
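As a concrete numerical sketch of the decomposition in the definition (this example is ours, not from the lecture): for f(x, y) = x² + y² one can read off the error functions explicitly, since (x₀+h)² + (y₀+k)² = f(x₀, y₀) + h·2x₀ + k·2y₀ + h·h + k·k, so ε₁(h, k) = h and ε₂(h, k) = k, and both go to 0.

```python
# Sketch (our own example, not from the lecture): for f(x, y) = x^2 + y^2
# the decomposition f(x0+h, y0+k) = f(x0,y0) + h*fx + k*fy + h*eps1 + k*eps2
# holds exactly with fx = 2*x0, fy = 2*y0, eps1(h,k) = h, eps2(h,k) = k.
def f(x, y):
    return x * x + y * y

def decomposition(x0, y0, h, k):
    fx, fy = 2.0 * x0, 2.0 * y0
    eps1, eps2 = h, k          # both error functions -> 0 as (h, k) -> (0, 0)
    return f(x0, y0) + h * fx + k * fy + h * eps1 + k * eps2

# The decomposition reproduces the value at the nearby point.
exact = f(1.0 + 0.25, 2.0 - 0.5)
recomposed = decomposition(1.0, 2.0, 0.25, -0.5)
```

Here the error terms happen to reproduce f exactly; in general they need only vanish in the limit.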
This is essentially an extension of the one-variable example. Consider the function f(x, y) = |x| + |y| for all (x, y), a function of two variables. It is clear that f is continuous everywhere, in particular at (0, 0). Is that okay for everybody? As a function of two variables, |x| + |y| is continuous. Let us look at the partial derivatives at the point (0, 0). I want to calculate f_x(0, 0). What will that be? By the definition of the partial derivative,

f_x(0, 0) = lim_{h → 0} [f(0 + h, 0) − f(0, 0)] / h,

the increment in the direction of x, minus the value of the function at the point, divided by h. So what is this? f(0 + h, 0) = |h| and f(0, 0) = 0, so the quotient is |h| / h, and that limit does not exist. Why does it not exist? Depending on whether h is positive or negative, the quotient is +1 or −1, so the left limit is not the same as the right limit; basically we are looking at |x| as a function of one variable. Similarly, f_y(0, 0) does not exist. So both partial derivatives fail to exist, and hence f(x, y) = |x| + |y| is not differentiable at (0, 0), because, as per the definition, for a function to be differentiable the partial derivatives must first of all exist.

Let us make another observation. Suppose f is differentiable; then at a nearby point (x₀ + h, y₀ + k) we have the equation from the definition. Let me bring all the terms to one side except the error terms, as we did for a function of one variable: f(x₀ + h, y₀ + k) − f(x₀, y₀) − h f_x(x₀, y₀) − k f_y(x₀, y₀). So, that is looking back.
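The failure of the partial derivative of |x| + |y| at the origin can be seen numerically (a sketch of our own, not from the lecture): the difference quotient takes the value +1 for h > 0 and −1 for h < 0, so the limit defining f_x(0, 0) does not exist.

```python
# Numerical sketch (our own, not from the lecture): the difference quotient
# (f(h, 0) - f(0, 0)) / h for f(x, y) = |x| + |y| at the origin.
def f(x, y):
    return abs(x) + abs(y)

def quotient(h):
    return (f(h, 0.0) - f(0.0, 0.0)) / h

right = quotient(1e-6)    # h -> 0 from the right: quotient is +1
left = quotient(-1e-6)    # h -> 0 from the left: quotient is -1
```

Since the one-sided values disagree, f_x(0, 0) does not exist, and by symmetry neither does f_y(0, 0).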
So, having brought those terms to the left-hand side, the right-hand side is h ε₁(h, k) + k ε₂(h, k), the error terms; I have just moved everything except the errors. Now, in one variable, when we look at the derivative, it is f at the nearby point minus f(x), divided by the increment; the limit of that is the limit of the secant slope. So what is the increment here? In the x direction it is h, in the y direction it is k, so the total increment is √(h² + k²), taken as the distance in R². So let us divide by that. Consider the absolute value

| f(x₀ + h, y₀ + k) − [ f(x₀, y₀) + h f_x(x₀, y₀) + k f_y(x₀, y₀) ] | / √(h² + k²),

where I have put the bracket so that it looks very much like the one-variable expression. On the right-hand side this equals

| h ε₁(h, k) + k ε₂(h, k) | / √(h² + k²).

By the triangle inequality this is at most

(|h| / √(h² + k²)) |ε₁(h, k)| + (|k| / √(h² + k²)) |ε₂(h, k)|.

Now observe (it does not actually matter much, but) |h| / √(h² + k²) ≤ 1, because the denominator is always at least the numerator, and similarly |k| / √(h² + k²) ≤ 1. So the whole expression is ≤ |ε₁(h, k)| + |ε₂(h, k)|, and this goes to 0 as (h, k) → (0, 0).
So, what we are saying is that differentiability implies the quantity on the left-hand side goes to 0 as (h, k) → (0, 0). This is very much like the one-variable definition, and it does not require knowledge of ε₁ and ε₂. It says: f differentiable implies the partial derivatives exist, and once they exist, you form this quotient and it goes to 0. The interesting thing is that one can also prove the converse: if this condition is satisfied, then the function is differentiable. We will not prove that, but let us record it. In fact, f is differentiable at (x₀, y₀) if and only if the partial derivatives exist and

| f(x₀ + h, y₀ + k) − f(x₀, y₀) − h f_x(x₀, y₀) − k f_y(x₀, y₀) | / √(h² + k²) → 0 as (h, k) → (0, 0).

I am writing it again so that it becomes clear. We have just now shown one direction, which is quite easy; the other direction is also not difficult, one just has to manipulate a few things. We will leave that and assume it. So whenever we want to check whether a function is differentiable, we will check whether these two conditions are satisfied: the existence of the partial derivatives, and this quotient going to 0.

Let us look at a further consequence of this definition. Let f be differentiable at (x₀, y₀). Once again, let us go back to the definition, to the original equation in condition 2, and rewrite it. What I want to do is bring one term to the left-hand side and everything else to the right-hand side. So, let me do that.
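The two-condition check can be sketched numerically. The following is our own illustration (the function x²y and the helper names are ours, not from the lecture): at (x₀, y₀) = (1, 2) the partial derivatives of f(x, y) = x²y are f_x = 2x₀y₀ = 4 and f_y = x₀² = 1, and the criterion quotient shrinks as (h, k) → (0, 0).

```python
import math

# Sketch (our own example and names, not from the lecture): the criterion
# |f(x0+h, y0+k) - f(x0,y0) - h*fx - k*fy| / sqrt(h^2 + k^2) -> 0
# for f(x, y) = x^2 * y at (x0, y0) = (1, 2), with fx = 4 and fy = 1.
def f(x, y):
    return x * x * y

def criterion_quotient(x0, y0, fx, fy, h, k):
    num = abs(f(x0 + h, y0 + k) - f(x0, y0) - h * fx - k * fy)
    return num / math.hypot(h, k)

# The quotient shrinks as the increment shrinks, consistent with
# differentiability of x^2 * y at (1, 2).
vals = [criterion_quotient(1.0, 2.0, 4.0, 1.0, t, t) for t in (1e-1, 1e-2, 1e-3)]
```

Here the quotient works out to (4t + t²)/√2 along the diagonal h = k = t, so it decays linearly in t.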
Copying the equation and shifting terms without much effort, we get

f(x₀ + h, y₀ + k) − f(x₀, y₀) = h f_x(x₀, y₀) + k f_y(x₀, y₀) + h ε₁(h, k) + k ε₂(h, k).

Now let us take the limit of both sides as (h, k) → (0, 0). What happens? On the right-hand side, the first term is h times something; h goes to 0, so the first term goes to 0. The second term is k times the partial derivative; that goes to 0. The error terms h ε₁ and k ε₂ also go to 0. So

lim_{(h, k) → (0, 0)} [ f(x₀ + h, y₀ + k) − f(x₀, y₀) ] = 0,

because all the terms on the right-hand side go to 0. What does this mean? We are saying that f at a nearby point minus f at the point goes to 0; that means the function is continuous at (x₀, y₀). So differentiability implies f is continuous at (x₀, y₀). This notion of differentiability does imply continuity, so it seems to be a good enough definition of differentiability.

There is another way of checking whether something is differentiable, which is only a sufficient condition; we will come to it. So, let me look at the slides and show you. This is the one-variable definition, and this is the two-variable definition that we just stated. Here the point taken is (a, b): at a nearby point, there should exist error functions ε₁ and ε₂ such that the value at the nearby point equals the value at the point, plus the increment h in the x direction times the partial derivative, plus k times the partial derivative, plus h ε₁ plus k ε₂. And we said this is equivalent, by taking everything to one side, dividing, and taking the limit, to the quotient going to 0. So differentiability implies that this goes to 0.
And conversely it is also true; we will be assuming that fact. Here is another example. Look at the function f(x, y) = √(x² + y²). This function is continuous at (0, 0): as (x, y) → (0, 0), it goes to 0, so obviously it is a continuous function; if you want to write the ε–δ definition, you can take δ = ε and that will be okay. But it is not differentiable at (0, 0), once again because the partial derivatives do not exist. When you want to calculate the partial derivative with respect to x at (0, 0), you put y = 0 and get √(x²). What will that be? That will be |x|, which we have seen is not differentiable at 0; similarly for y. So this function is not differentiable at (0, 0) because the partial derivatives do not exist. We already had one such example.

Now let us look at this example:

f(x, y) = (x² + y²) sin(1 / (x² + y²)) for (x, y) ≠ (0, 0), and f(0, 0) = 0,

the value at the origin being defined separately because we are dividing by x² + y². Is this function continuous at (0, 0)? The value there is 0. If I take the absolute value, sine is bounded by 1, so |f(x, y)| ≤ x² + y², which goes to 0 as (x, y) → (0, 0). So the limit at (0, 0) is 0, which is the value of the function, and the function is continuous.

Let us find out whether this is differentiable or not. So, what about the partial derivatives? How do I find the partial derivative with respect to x? Put y = 0: we get x² sin(1/x²). Is that differentiable with respect to x at 0? Form the quotient [x² sin(1/x²) − 0] / x and take the limit as x → 0; that will be the partial derivative at the point 0 with respect to x. One x cancels, and still one x is left, so the quotient is x sin(1/x²); in absolute value it is at most |x|. So x dominates: the power is 2 here and sine is bounded by 1, so the quotient goes to 0.
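The bound |x sin(1/x²)| ≤ |x| on the difference quotient can be checked numerically (a sketch of our own, not from the lecture):

```python
import math

# Numerical sketch (our own check, not from the lecture): the difference
# quotient for g(x) = x^2 * sin(1/x^2), g(0) = 0, at x = 0 is
# x * sin(1/x^2), which is bounded by |x| and hence tends to 0.
def g(x):
    return x * x * math.sin(1.0 / (x * x)) if x != 0.0 else 0.0

def diff_quotient(x):
    return (g(x) - g(0.0)) / x      # equals x * sin(1/x^2)

# Each quotient is bounded by |x|, so the derivative at 0 exists and is 0.
vals = [abs(diff_quotient(t)) for t in (1e-1, 1e-3, 1e-5)]
```

So the one-variable derivative of x² sin(1/x²) at 0 exists and equals 0, which is exactly the partial derivative f_x(0, 0) discussed above.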
So, even as a function of one variable, x² sin(1/x²), extended by 0 at 0, is differentiable with respect to x at the point 0. Hence the partial derivative with respect to x exists at (0, 0) and is equal to 0; similarly, the partial derivative with respect to y also exists and is equal to 0 at (0, 0).

Now we want to check whether, as a function of two variables, f is differentiable at (0, 0). So, let us try to apply the criterion that we developed just now. I look at f at a nearby point (h, k) near (0, 0) and form

| f(h, k) − f(0, 0) − h f_x(0, 0) − k f_y(0, 0) | / √(h² + k²).

(When I am writing the partial derivative I should not also put the dash; one symbol is enough.) If that goes to 0, then the function is differentiable at (0, 0) as a function of two variables. So what is this equal to? The value at the point (h, k) is f(h, k) = (h² + k²) sin(1 / (h² + k²)); f(0, 0) is 0; the partial derivative with respect to x at (0, 0) is 0, and the partial derivative with respect to y is 0. So the quantity equals

(h² + k²) |sin(1 / (h² + k²))| / √(h² + k²) = √(h² + k²) |sin(1 / (h² + k²))|,

since (h² + k²) divided by its square root is just the square root; the square root does not cause any trouble here. And that is less than or equal to √(h² + k²), because sine is bounded, and this goes to 0 as (h, k) → (0, 0). So the limit is 0, and hence f is differentiable at (0, 0) as a function of two variables. We have checked it by checking that both partial derivatives exist and that this quantity goes to 0 as (h, k) → (0, 0).
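The final bound, that the criterion quotient is at most √(h² + k²), can also be checked numerically (our own check, not from the lecture); since f(0, 0) = 0 and both partial derivatives at the origin are 0, the quotient reduces to |f(h, k)| / √(h² + k²).

```python
import math

# Numerical sketch (our own check, not from the lecture): for
# f(x, y) = (x^2 + y^2) * sin(1/(x^2 + y^2)), f(0, 0) = 0, with
# fx(0, 0) = fy(0, 0) = 0, the criterion quotient reduces to
# sqrt(h^2 + k^2) * |sin(1/(h^2 + k^2))| <= sqrt(h^2 + k^2).
def f(x, y):
    r2 = x * x + y * y
    return r2 * math.sin(1.0 / r2) if r2 != 0.0 else 0.0

def quotient(h, k):
    return abs(f(h, k)) / math.hypot(h, k)

# Each quotient is bounded by sqrt(h^2 + k^2), which tends to 0.
vals = [quotient(t, t) for t in (1e-1, 1e-2, 1e-3)]
bounds = [math.hypot(t, t) for t in (1e-1, 1e-2, 1e-3)]
```

The quotient oscillates because of the sine factor, but the bound √(h² + k²) squeezes it to 0, which is what differentiability at (0, 0) requires.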
So this is a differentiable function: the function we just now checked is differentiable at (0, 0), as you can verify.