So, as a consequence of this corollary, if I write d₂(x, y) = ‖x − y‖ for x, y in R², what is that? If x has components (x₁, x₂) and y has components (y₁, y₂), then this is nothing but (Σᵢ₌₁² (xᵢ − yᵢ)²)^(1/2) — that came from the dot product; the dot product is related to the norm. So, the corollary gives us that this is a metric. I am defining d₂(x, y) to be the norm of x − y. Just as on the real line we define the distance between two points a and b as |a − b|, I am copying the same idea. I am trying to show how one extends various concepts from R to R², R³, and so on. What we have done is generate a notion of norm and, using that norm like the absolute value, we have obtained a notion of distance. This is normally called the Euclidean distance.

But here is the interesting thing. Let us go back and look at what we have done in R². How does it change if I move from R² to Rⁿ? What is Rⁿ? A vector in R² has two components, so let us think of Rⁿ as the set of vectors with n components. How do we define the dot product? Instead of summing from 1 to 2, we sum from 1 to n: x·y = Σᵢ₌₁ⁿ xᵢyᵢ. Again, the dot product is linear in both variables, because it is just Σ aᵢbᵢ, so all those properties remain valid. I can define the norm of a vector in Rⁿ by the same relation. And the proof of the Cauchy–Schwarz inequality does not use the fact that you have two components or three components; it only uses the properties of the dot product — that it is linear in both variables and symmetric. That is the only fact used.
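The point above — that the same formula works for any number of components — can be sketched in a few lines. This is only an illustration of the Euclidean distance, not part of the lecture; the function name `d2` is my own choice.

```python
import math

def d2(x, y):
    """Euclidean distance d2(x, y) = ||x - y||, for vectors of any length n."""
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

# The same formula works in R^2 ...
print(d2((0.0, 0.0), (3.0, 4.0)))            # 5.0
# ... and in R^3, R^4, ... without any change:
print(d2((1.0, 2.0, 3.0), (1.0, 2.0, 3.0)))  # 0.0  (distance of a point to itself)
```

Nothing in the code depends on the dimension; only the length of the tuples changes, exactly as in the lecture's passage from Σ over 1..2 to Σ over 1..n.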
So, the Cauchy–Schwarz inequality remains true, and that means you also have the triangle inequality. All the same proofs work; everything is the same, with summation from 1 to n instead of 1 to 2. That gives you the notion of distance in Rⁿ.

Now, here is something interesting that is happening; let us observe it. I want you to keep track: here is the real line, and here is Rⁿ. On the real line we had the notion of the absolute value |x| for every x in R, and on Rⁿ we have defined the norm — let me put a subscript 2 on it; later on we will see why — ‖x‖₂ = (Σᵢ₌₁ⁿ xᵢ²)^(1/2), if x has components x₁, x₂, …, xₙ. Both have the same properties: the norm behaves like the absolute value, the absolute value gives the notion of a metric on R — the ordinary metric on the real line — and the norm gives the notion of a metric on Rⁿ.

Now, let us look at something more. You can think of xᵢ² as |xᵢ|²; that does not matter, since the square of a number is the same as the square of its absolute value. So, here is something interesting. Let us define — let me put a subscript ∞ below it — ‖x‖∞ = max₁≤ᵢ≤ₙ |xᵢ|. What is that? x is a vector with components x₁, x₂, …, xₙ; look at the absolute values of the components and take the largest of them. These are finitely many numbers, so the maximum exists. This also is similar to the absolute value. Meaning what? Let us write what that similarity means. First, ‖x‖∞ ≥ 0, and ‖x‖∞ = 0 if and only if x is the zero vector. Is that okay? Yes, because if the maximum is 0, every component is 0. So, this property holds. Second, ‖αx‖∞ = |α| ‖x‖∞.
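The max-norm just defined is equally easy to compute; here is a minimal sketch (the name `norm_inf` is mine), checking the homogeneity property ‖αx‖∞ = |α|‖x‖∞ on one example.

```python
def norm_inf(x):
    """The max norm: ||x||_inf = max |x_i| over the components of x."""
    return max(abs(xi) for xi in x)

x = (1.0, -3.0, 2.0)
print(norm_inf(x))  # 3.0 — the largest absolute value among the components

# Homogeneity: ||alpha * x||_inf = |alpha| * ||x||_inf
alpha = -2.0
scaled = tuple(alpha * xi for xi in x)
print(norm_inf(scaled))  # 6.0 = |-2| * 3.0
```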
That is also okay, because the components of αx are αx₁, αx₂, …, αxₙ, so the largest of |αxᵢ| is the same as |α| times the largest of |xᵢ|. No problem. I claim that this also satisfies the triangle inequality. The left-hand side is maxᵢ |xᵢ + yᵢ|, and for every i we have |xᵢ + yᵢ| ≤ |xᵢ| + |yᵢ| ≤ maxᵢ |xᵢ| + maxᵢ |yᵢ|, so taking the maximum over i gives ‖x + y‖∞ ≤ ‖x‖∞ + ‖y‖∞. That is again straightforward. So, these properties are okay, and hence d∞(x, y) = ‖x − y‖∞ is a metric on Rⁿ. So, on the same set Rⁿ we have got two different metrics, d₂ and d∞.

Let us look at one more. That was one metric, and that was the second; here is a third. How is it defined? You see how the absolute values of the components are used to define something: in d₂ we take the squares, add, and take the square root; in d∞ we take the maximum of the components. Another one: for x = (x₁, x₂, …, xₙ) in Rⁿ, let us define ‖x‖₁ = Σᵢ₌₁ⁿ |xᵢ|. Instead of taking squares and then the square root, just add the absolute values, nothing more than that. Then let us check once again. Is ‖x‖₁ ≥ 0? Yes, we are summing non-negative quantities, and a sum of non-negative numbers equals 0 if and only if each term is 0, that is, each |xᵢ| = 0 — if and only if x = 0. An obvious property, by the definition itself. Second, ‖αx‖₁ = Σ |αxᵢ|; the α comes out, so it is |α| ‖x‖₁ — again obvious. Third property: what about ‖x + y‖₁?
This will be Σ |xᵢ + yᵢ|, but |xᵢ + yᵢ| ≤ |xᵢ| + |yᵢ| by the triangle inequality on the real line. So, it is less than or equal to ‖x‖₁ + ‖y‖₁. Hence d₁(x, y) = ‖x − y‖₁ is also a metric. So, we have got three different metrics on Rⁿ. Let me just write it here: on R we have the absolute value, and on Rⁿ we have ‖·‖₁, ‖·‖₂, and ‖·‖∞.

See how one generalizes things? We are just copying. What do you think should come after Rⁿ? If I want to generalize the exponent n, the natural thing is infinity: R^∞. What does R^∞ mean? As a set, I have to say what it is. One can say it is the set of all vectors with infinitely many components. And when you say infinitely many components — there is a first component, a second component, a third component — that means it is the space of all sequences. Instead of writing components with dots, it is the set of all sequences (xₙ), each xₙ belonging to R: the space of all real sequences.

Now, let us try to extend ‖·‖₁, ‖·‖₂ and ‖·‖∞ to it. What will we try to do? For x, which is a sequence, we would like to define — what was the definition here? For ‖·‖₁ we took the sum of the absolute values of the components, Σᵢ₌₁ⁿ |xᵢ|. So, sum them up from n = 1 to ∞; that should be the natural generalization. But as soon as one does that, you run into a problem. What is the problem? The sum from 1 to ∞ — what does it mean? It may not exist. If you have done series of numbers, you should understand that this is a series which may not be convergent. So, one cannot define this for every x in R^∞; one has to look at a subset: look at all sequences such that Σₙ₌₁^∞ |xₙ| is finite.
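Before moving to sequences, the three metrics on Rⁿ above can be compared side by side. A minimal sketch (function names are my own) — note that on the same vector the three norms generally give three different values, and the d₁ triangle inequality checked at the end is exactly the component-wise argument from the lecture:

```python
def norm1(x):
    """||x||_1: sum of absolute values of the components."""
    return sum(abs(xi) for xi in x)

def norm2(x):
    """||x||_2: Euclidean norm, square root of the sum of squares."""
    return sum(xi ** 2 for xi in x) ** 0.5

def norm_inf(x):
    """||x||_inf: largest absolute value among the components."""
    return max(abs(xi) for xi in x)

x = (3.0, -4.0)
print(norm1(x), norm2(x), norm_inf(x))  # 7.0 5.0 4.0 — three norms, three values

# Triangle inequality for ||.||_1, componentwise |x_i + y_i| <= |x_i| + |y_i|:
y = (1.0, 2.0)
s = tuple(a + b for a, b in zip(x, y))
print(norm1(s) <= norm1(x) + norm1(y))  # True
```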
Let us call that subset ℓ¹; it is a subset of R^∞. (I wrote ℓ^∞ here at first — yes, you are right: since we are taking the sum of all the components, we should call it ℓ¹. Thanks for pointing that out.) So, we cannot just extend by taking the sum of all the components; that does not make sense in general. We have to restrict to those sequences (xₙ) such that Σ |xₙ| is finite. For every x in ℓ¹, one can define ‖x‖₁ = Σₙ₌₁^∞ |xₙ|. It will have the same properties: it is ≥ 0, and it equals 0 if and only if each term of this non-negative series is 0; the triangle inequality follows from the absolute value, as before. So, d₁(x, y) = ‖x − y‖₁ is a metric — not on R^∞, but on the subset ℓ¹, the set of all sequences for which the sum of the absolute values of the terms is finite.

Now you can guess what ℓ^∞ should be. Define ℓ^∞ to be the set of all sequences such that supₙ≥₁ |xₙ| is finite — it is a supremum now, not a maximum, since the collection is infinite. On this set you can define ‖x‖∞ = supₙ≥₁ |xₙ|, and d∞(x, y) = ‖x − y‖∞ is a metric on ℓ^∞. You see how smoothly things go, but you have to put the appropriate conditions. No change in the proofs, other than writing 1 to ∞ instead of 1 to n; essentially the same things work.

Now comes ℓ²: we should also have an ℓ² corresponding to the Euclidean distance. What should it be? ℓ² is the set of all sequences (xₙ) such that Σₙ₌₁^∞ |xₙ|² is finite.
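The restriction to ℓ¹ is genuinely needed, and one can see it numerically. A small illustration (my own example, not from the lecture): the sequence xₙ = 1/n² is in ℓ¹ (its series sums to π²/6), while xₙ = 1/n is not, even though both sequences are in ℓ^∞, being bounded by 1. Membership is a limit statement, so partial sums can only suggest it:

```python
import math

def partial_l1(x, N):
    """Partial sum of |x(n)| for n = 1..N — a truncation of the l^1 norm."""
    return sum(abs(x(n)) for n in range(1, N + 1))

for N in (10, 1000, 100000):
    print(N, partial_l1(lambda n: 1 / n**2, N), partial_l1(lambda n: 1 / n, N))
# First column of sums stabilizes near pi^2/6 ≈ 1.6449 (so 1/n^2 is in l^1);
# second column keeps growing like ln N (so 1/n is not in l^1).
```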
And whenever that is the case, for x in ℓ² define ‖x‖₂ = (Σₙ₌₁^∞ xₙ²)^(1/2) — the sum now running to infinity instead of to n. Of course, this is ≥ 0, and it equals 0 if and only if each xₙ is 0. For αx, the α² comes out of the sum and the square root turns it into |α|. The problem arises when you want to prove the triangle inequality: you want to prove the Cauchy–Schwarz inequality, now with the sum from 1 to ∞, and then use it to prove the triangle inequality. See, to prove this property for R², what was our route? We proved the Cauchy–Schwarz inequality and, using the Cauchy–Schwarz inequality, we proved the triangle inequality. The same route works for ℓ², the sequences whose squares are summable.

Actually, much more general results are possible, and I think it is a good idea to prove them, because they will be useful for you later on as well. In ℓ¹, what are we doing? We look at the absolute value of each component, |xᵢ|, and sum them up. In ℓ², we look at their squares. And it was realized that one need not do this only for 1 and 2: you can do it for any real number p with 1 ≤ p < ∞. So, let me state the problem, define the set, and then prove it in one go for everything.
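The general p-norm the lecture is heading toward, together with the Cauchy–Schwarz and triangle inequalities, can be sketched numerically. This is only an illustration on finite vectors under my own naming; the fact that the triangle inequality holds for every p ≥ 1 is Minkowski's inequality, which the lecture has not yet proved:

```python
def norm_p(x, p):
    """||x||_p = (sum |x_i|^p)^(1/p), for 1 <= p < infinity."""
    return sum(abs(xi) ** p for xi in x) ** (1.0 / p)

x, y = (1.0, 2.0, -2.0), (3.0, 0.0, 4.0)

# Cauchy-Schwarz: |sum x_i y_i| <= ||x||_2 * ||y||_2  (here 5 <= 3 * 5)
dot = sum(a * b for a, b in zip(x, y))
print(abs(dot) <= norm_p(x, 2) * norm_p(y, 2))  # True

# Triangle inequality for several values of p (Minkowski's inequality):
s = tuple(a + b for a, b in zip(x, y))
for p in (1, 1.5, 2, 3):
    print(p, norm_p(s, p) <= norm_p(x, p) + norm_p(y, p))  # True for each p
```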