You have seen the definition of norms. I already said that for norms to exist, the space has to have certain properties; norms do not exist in all spaces. You cannot define a notion of distance in all sorts of spaces, but where you can is where we will be working. I am assuming that all of you have seen vector spaces in some form or the other. Even if you have not, more often than not we are just working with R^n. Folks in physics tend to work with the complex space C^n, but we primarily work with R^n. Sometimes we work with more general non-Euclidean spaces which are not vector spaces anymore, but that is not part of this course; there are of course courses running in those areas also in SISCON. So, a normed linear space is essentially a linear vector space which has a norm. That is it. Just like we saw R^n with the infinity norm, the two norm, or the one norm: these are all normed linear spaces. Why "linear"? Linear space and vector space are used almost interchangeably, because a vector space has linear structure. Whenever we talk about linear structures in mathematics, we are invariably talking about vector spaces of some type. So we say normed linear space, or you can also call it a normed vector space. Great. Now, we just want to give quick proofs of why some of these norms satisfy the norm properties. Very quick; we are not going to spend too much time on it. Let us look at the infinity norm first. The infinity norm is essentially the maximum of the absolute values of the elements. It is obviously non-negative, because you took absolute values and absolute values are non-negative. Done. And if the infinity norm is 0, it means that the maximum absolute value is 0, and because we are talking about absolute values, every other element also has to have absolute value 0. Therefore, an infinity norm of 0 implies that the vector itself is the zero vector. Great. This is obvious.
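Written out (a sketch, using the standard definition of the infinity norm on R^n), the properties just argued are:

```latex
% Definition and the easy properties of the infinity norm on R^n
\|x\|_\infty \;:=\; \max_{1 \le i \le n} |x_i| \;\ge\; 0,
\qquad
\|x\|_\infty = 0
\;\Longleftrightarrow\; |x_i| = 0 \text{ for all } i
\;\Longleftrightarrow\; x = 0,
\qquad
\|\alpha x\|_\infty = \max_{i} |\alpha|\,|x_i| = |\alpha|\,\|x\|_\infty .
```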
For scalar multiplication you do not have to do much. In fact, in most of these proofs the only thing you really have to prove is the triangle inequality. So how do we claim the triangle inequality here? You look at ‖x + y‖_∞, which by definition is exactly max over i = 1 to n of |x_i + y_i|. It is a finite-length vector, so for some particular index k the maximum is attained: it equals |x_k + y_k|, and the max goes away. This is less than or equal to |x_k| + |y_k|, because breaking up the absolute value uses the triangle inequality for absolute values. And of course |x_k| + |y_k| is less than or equal to max_i |x_i| + max_i |y_i|, which is ‖x‖_∞ + ‖y‖_∞. So I am done. A simple, nifty, very standard proof. Again, it works because the space is finite dimensional, which let me pass from the max to a particular index. If we were talking about infinite-dimensional spaces, it is a whole different story, not easy to prove. All right. What about the two norm? I cannot look at all the p norms, so I am just going to look at the two norm, which is the most popular norm. The first three properties are very easy. Non-negativity is obvious: it is not just absolute values, I am even taking squares here. And if the two norm is 0, then because there is nothing negative in the sum, each term has to be 0. No two ways about it; it is the zero vector. Scalar multiplication is too simple to talk about. What about the triangle inequality? Here we need a little bit of work. We start from the two norm: ‖x + y‖_2 is, by definition, the square root of the sum over i of (x_i + y_i)^2.
This is exactly the definition of the two norm. Squaring it, ‖x + y‖_2^2 equals the sum over i of (x_i + y_i)^2, and I expand inside the bracket: sum of x_i^2, plus twice the sum of x_i y_i, plus sum of y_i^2. This is fine because the quantity inside is non-negative; it is (x + y) squared, term by term, so I can take the square root back at the end. Now, breaking up the summation, no problem: the sum of x_i^2 I can simply write as ‖x‖_2^2, and the sum of y_i^2 as ‖y‖_2^2. The cross term, the sum of x_i y_i, is exactly the inner product of x and y, which by definition for R^n is the scalar dot product x · y. And this I know is less than or equal to ‖x‖_2 ‖y‖_2, because x · y = ‖x‖_2 ‖y‖_2 cos θ and cos θ is at most 1. So I have not actually used the Cauchy-Schwarz inequality itself; I have arrived at it using this somewhat contrived method. Why am I doing it this way? Because I have not actually proved the Cauchy-Schwarz inequality. It is a more general inequality for general normed linear spaces, so I am arriving at it in this contorted way. All right. So what do I have? The cross term is less than or equal to 2 ‖x‖_2 ‖y‖_2, and so the whole expression is less than or equal to ‖x‖_2^2 + 2 ‖x‖_2 ‖y‖_2 + ‖y‖_2^2.
That is just (‖x‖_2 + ‖y‖_2) whole squared. And then if I take square roots on both sides, I have my triangle inequality. By the way, this proof is only for R^n. If you want to do this for general vector spaces, the proof is a little more complicated; you can look it up, it is available in a lot of places. But this we have done just for R^n, because otherwise this cos θ does not make any sense. What is cos θ? What is θ even? For more general normed linear spaces, for example a vector space consisting of matrices, talking about this kind of cos θ is not very clear. In those cases you need a more generic argument. That is also available; I have just not talked about it here. All right. We also talk a little bit about convergence, because convergence is what we are trying to achieve in this course. We talk about asymptotic convergence in continuous time and so on, but eventually the notions are very parallel to the sequence and series convergence that you have seen in your typical undergrad mathematics course. It is just an extension of that to continuous time, so the ideas remain similar. We look at convergence, and one important thing is that we also want vector spaces where convergence is well characterized. If convergence is not well behaved, those spaces are not nice enough for you to work with. You can always create such funny spaces; we will talk about it. All right. What is convergence of a sequence? Do all of you understand what a sequence is? This notation essentially denotes a sequence. You can think of it as a function from the integers to real numbers or vectors. Just a function from integers to vectors. So this i denotes that integer dependence.
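For reference, the two-norm triangle inequality argument just completed, written out for R^n (with x · y the scalar dot product), runs:

```latex
\|x+y\|_2^2 = \sum_{i=1}^n (x_i + y_i)^2
            = \sum_i x_i^2 + 2\sum_i x_i y_i + \sum_i y_i^2
            = \|x\|_2^2 + 2\,(x \cdot y) + \|y\|_2^2 .
% Using  x \cdot y = \|x\|_2 \|y\|_2 \cos\theta \le \|x\|_2 \|y\|_2  (Cauchy--Schwarz in R^n):
\|x+y\|_2^2 \le \|x\|_2^2 + 2\|x\|_2\|y\|_2 + \|y\|_2^2
            = \left( \|x\|_2 + \|y\|_2 \right)^2 ,
% and taking square roots on both sides gives the triangle inequality:
\|x+y\|_2 \le \|x\|_2 + \|y\|_2 .
```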
So there is a first term, a second term, a third term, a fourth term: in a sense it is ordered, but only in the integers, not in the vectors themselves. The vectors could be anything. And the most important thing: a sequence is always of infinite length. Anything of finite length is not a sequence. By definition, a sequence has to have infinitely many terms; all the positive integers 1, 2, 3, and so on have to be mapped. Okay. So, a sequence x_i, i = 1 to infinity, in a normed linear space is said to converge to some point x_0 in that normed linear space if for all ε > 0 there exists a number N among the positive integers such that ‖x_i − x_0‖ < ε, that is, x_i is ε-close to x_0, for all i ≥ N. Okay. So, in this course, this is your first introduction to epsilon-delta type definitions. I am not sure if all of you, or some of you, have seen this before, but this is how typical epsilon-delta and epsilon-N definitions are made. The idea is very simple, and because mathematicians want to capture it in a mathematical language which is precise, this machinery is required. Otherwise, the idea is simply this: how do you say that you are converging to some point x_0? If you go far enough in the sequence, you will be very close to this point x_0. Maybe not at the start of the sequence, but if you go further and further down, you will start to get close to x_0. And it is consistent. It is not like the stock markets: once you get close to x_0, you are going to remain close to x_0.
It is not like you are going to start bouncing back and go very far away from it again. No. Once you get to a certain point in the sequence, everything beyond it will be very close to x_0. So this is exactly what is characterized using this epsilon. The epsilon is user-given; that is how the statement reads. You give me an epsilon, and I give you an integer N beyond which every term is epsilon-close. And how do we measure epsilon-closeness? Norms. That is exactly why we introduced norms; that is the whole point of introducing them. So this is how we make epsilon-delta and epsilon-N definitions: it is really capturing ordinary intuition in mathematical language, as simple as that. Okay. Now, what is an example of a convergent sequence? x_i = 1/i. This is a convergent sequence. Remember, we are talking about sequence convergence, not series convergence. In a sequence, each term stands on its own; when I say the sequence converges, it means the terms themselves start going toward some point. A series is completely different: there we are talking about summations, and series convergence means the partial sums converge to some value. That is much harder to achieve, as you can imagine. Here I am just saying that the term goes to some value. Okay. So, sequence convergence: with x_i = 1/i, it is very obvious that it converges to x_0 = 0. I can even find the N for a given epsilon. There is a computation here, but it is not difficult. How do I compute it? You give me an epsilon, and I want my terms to be epsilon-close to the equilibrium.
In this case, I know the equilibrium is 0. Then I want each x_i to be less than epsilon. And when will each x_i be less than epsilon? I can compute that: 1/i < ε, therefore i has to be greater than 1/ε. But 1/ε is not necessarily an integer, so I apply the ceiling function. If 1/ε is, say, 100.37, then the ceiling of 1/ε is 101. (The floor function would give 100; the ceiling just takes the next higher integer.) That is it. This is how you compute the N if I ask you for one. All right. And you can see that it is very natural that N has to depend on epsilon; it is almost impossible to have an N that does not depend on epsilon. Okay. Now, one of the key problems with this kind of definition is that you need to know where you converge. You need to know this x_0; without knowledge of x_0, I cannot do any computation. And that may not always be the case. So there is the notion of a Cauchy sequence. What is a Cauchy sequence? A sequence is said to be Cauchy if for all ε > 0 there exists an integer N such that ‖x_i − x_j‖ < ε for all i, j ≥ N. So here we are no longer comparing with any particular point of convergence in the vector space; we are just comparing the individual elements with each other. What am I saying? You give me an epsilon, and I give you an integer N such that if your i's and j's are greater than or equal to this N and you compare the corresponding terms, they are epsilon-close. That is exactly a Cauchy sequence. In this example also everything works out fine: x_i = 1/i is also a Cauchy sequence.
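The epsilon-N computation just described for x_i = 1/i can be sketched in a few lines of Python (a toy check; the function name is mine, not from the lecture):

```python
import math

def n_for_epsilon(eps):
    """N via the ceiling function, as in the lecture: for all i >= N,
    |1/i - 0| < eps (assuming 1/eps is not exactly an integer)."""
    return math.ceil(1 / eps)

eps = 0.003                  # 1/eps ~ 333.33, so N should be 334
N = n_for_epsilon(eps)
# Every term from index N onward is epsilon-close to the limit 0:
assert all(1 / i < eps for i in range(N, N + 10_000))
print(N)  # 334
```

If 1/ε happens to be exactly an integer, one would take N = 1/ε + 1 to keep the inequality strict; the lecture's 100.37 example gives ceil(100.37) = 101.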
But there is a problem, in the sense that convergence implies Cauchy, while a Cauchy sequence does not necessarily imply convergence. I can always cheat and create problem examples. How? Suppose my space is the open interval (0, 1), and I take my sequence to be x_n = 1 − 1/n. Where does it converge to as n tends to infinity? 1. But I cheated: 1 is not in my space. So this sequence is not convergent as per my definition, because the point of convergence has to be in X. But it is a Cauchy sequence. That is not difficult to see, because as you increase n, all your terms start bunching up near the right-hand end. I am not verifying it formally, but you can easily do so: if you give me any epsilon, I can give you an N beyond which the terms are epsilon-close to each other, because everything gets very dense near that corner. So the sequence is Cauchy but not convergent, because I created a funny space. And you might encounter many such funny spaces; it is not difficult. Give me a doughnut, or a disc with the origin removed, and you can create all sorts of weird spaces where the Cauchy and convergence notions will not coincide. We do not like those. So we define a complete normed linear space, or what is called a Banach space. What is a complete normed linear space? It is basically a normed linear space where every Cauchy sequence converges, so the two notions become identical. Then it is a Banach space, or a complete normed linear space. We usually work with Banach spaces.
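The Cauchy-but-not-convergent counterexample can be probed numerically (a toy illustration; the variable names are mine):

```python
def x(n):
    # n-th term of x_n = 1 - 1/n, which lies in the open interval (0, 1) for n >= 2
    return 1.0 - 1.0 / n

# Far along the sequence, terms bunch together: pairwise epsilon-closeness,
# i.e. the Cauchy-style behaviour described above.
eps = 1e-6
N = 2_000_000
assert all(abs(x(i) - x(j)) < eps
           for i in range(N, N + 5) for j in range(N, N + 5))

# Yet every term stays strictly below 1, and the would-be limit 1 is not a
# point of (0, 1): Cauchy, but not convergent inside that space.
assert all(0.0 < x(n) < 1.0 for n in range(2, 10_000))
print("terms bunch up near 1, but 1 itself lies outside (0, 1)")
```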
Of course, you will see notions like Banach spaces more prominently in PDE analysis and so on, but the notion itself is pretty straightforward: the idea is that the Cauchy sequence and convergent sequence notions have to be identical. That is a complete normed linear space, or Banach space. All right. As you can imagine, R^n with any of these norms is a Banach space, because I excluded nothing: whatever sequence you create, if it is convergent it is definitely Cauchy, and if it is Cauchy it is definitely convergent. All right. Similar to normed spaces, you also have inner product spaces. Again, for R^n these things more or less become the same. An inner product space is essentially a linear space with some inner product, as simple as that. Just as a normed linear space is a linear space with a norm, here you have a linear space with an inner product, and similarly there are criteria for a function to qualify as an inner product. So what is an inner product? It is a function which takes two elements of the vector space and gives you an element of the field F. Simply put, the field is what each component of the vector space is made of. In our simple case it maps R^n × R^n to the real numbers. For typical vectors you already know the inner product: it is just the scalar dot product x · y. For matrices and such, you have to think more carefully about what it is, but it is still well defined even for matrix vector spaces. Okay. The properties are basically symmetry, distributivity, the scalar multiplication property again, and the fact that the inner product of a vector with itself is non-negative, and zero only for the zero vector.
In fact, that last property should remind you very much of the norms; it very much looks like a norm property. And this is why every inner product space is actually a normed linear space: you can always use an inner product to create a norm. You can see this in R^n. What is the inner product there? x^T y, or x · y, whatever you call it. And it immediately gives me a norm when I replace y with x: taking the inner product of the vector with itself and then a square root, that is sqrt(x · x), which is exactly the two norm. So the norm you get from an inner product is one particular norm; it does not cover all the norms. Obviously, you cannot get the infinity norm from this inner product. It gives you one specific norm; in this particular case, the two norm. That is not something you can control; it depends on the inner product you had. Okay. So, just as you have normed linear spaces and Banach spaces, if you have a complete inner product space, it is called a Hilbert space. Just terminology: a complete normed linear space is a Banach space; a complete inner product space is a Hilbert space. We can assess completeness here also. Why? Because the inner product gives me a norm, and therefore I can check convergence and the Cauchy property in that norm. As simple as that. So you have Banach spaces and Hilbert spaces. These are the very nice spaces that we always work in, not just in this course but also in PDE control and so on. If you do any course with Vivek, you will see that there also you require these assumptions. It is very difficult to work without them.
PDE means partial differential equations: they come in if you are talking about distributed parameter systems, and if you are looking to do something like boundary control on those systems. This is something my colleague Vivek specializes in and teaches a few courses on, and there also you make these assumptions. Of course, there the vector spaces are significantly more complicated: they are function spaces, hence infinite dimensional, so the notions are more involved. But what I am saying is that the same assumptions have to go through there as well. All right.
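To close, a quick numerical check of the earlier claim that the dot product on R^n induces the two-norm but not, say, the infinity norm (a numpy sketch; the example vector is mine):

```python
import numpy as np

x = np.array([3.0, -4.0, 12.0])

# Norm induced by the inner product: ||x|| = sqrt(<x, x>)
induced = float(np.sqrt(np.dot(x, x)))

# It coincides with the Euclidean two-norm...
assert np.isclose(induced, np.linalg.norm(x, 2))
print(induced)  # 13.0

# ...but it does not reproduce the infinity norm (max |x_i| = 12 here):
assert not np.isclose(induced, np.linalg.norm(x, np.inf))
```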