So with this we'll move on to the next topic, which is matrix norms. Matrices are basically rectangular arrays of numbers, and we want to define the size of a matrix, in the form of what I'll call a norm, and compute that norm for a matrix. One thing to note is that we will restrict ourselves to square matrices here.

Excuse me, sir. If we start with a pre-norm, can we directly get what would be the dual of that norm?

Yeah. So the dual of the dual of a pre-norm will be a norm. It may or may not coincide with the pre-norm: it will coincide with the pre-norm if the pre-norm was actually a vector norm; if not, it may be a more restricted version or a different version of that pre-norm. But once you find the dual of the dual norm, that norm and its dual will alternate with each other going forward from there. Does that answer your question?

Yes, sir. But we can't get it directly from the pre-norm, like by adding something to it?

That is not clear; we'd have to look at it. So you're asking whether the dual norm of the dual norm of a pre-norm can be written in terms of the pre-norm itself, by adding something or modifying something. I don't know that there is a simple modification to a pre-norm that will make it satisfy the triangle inequality and therefore become a vector norm, which is the same as the dual of the dual norm of that pre-norm. So as you can see, the wonderful thing about matrix theory is that it's very easy to ask a question for which it's in fact very difficult to find an answer. We will discuss a lot of very deep results in this course, but always keep in mind that if you're lucky, and later in life or in your research you're faced with a problem which coincides with one of these deep results, then you can use those results and say very interesting things.
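As a small numerical sketch of the dual-norm construction being discussed (not from the lecture itself — the function name and the grid-scan approach are my own illustration): the dual of a norm f is f*(y) = sup of y·x over the set f(x) <= 1. Scanning directions in R^2, the dual of the L1 norm should come out as the L-infinity norm, max over i of |y_i|, and dualizing again would return the L1 norm.

```python
import numpy as np

def approx_dual_of_l1(y, n_angles=100_000):
    """Approximate f*(y) = sup_{||x||_1 <= 1} y.x by scanning directions."""
    theta = np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False)
    x = np.stack([np.cos(theta), np.sin(theta)], axis=1)  # unit directions
    l1 = np.abs(x).sum(axis=1)                            # f(x) = ||x||_1
    # Scale each direction to the boundary f(x) = 1, then take the sup of y.x:
    return np.max((x @ y) / l1)

y = np.array([3.0, -1.5])
print(approx_dual_of_l1(y))   # 3.0 == max(|3.0|, |-1.5|), i.e. the L-infinity norm
```

The grid contains the coordinate direction (1, 0), where the supremum is attained, so the scan recovers the exact value here.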
But if it doesn't coincide with the results we are discussing, it's oftentimes quite difficult to say what happens. That's one of the very interesting things about matrix theory: if you go slightly off the beaten path, you may quickly find yourself in uncharted territory, where you'll actually have to derive new results in order to understand what's going on.

So what I was saying about matrix norms is this: if I take a matrix of size n by n, I can take all its entries and vectorize them, constructing one big, long vector of size n squared by 1, and then apply some vector norm to it. This is one immediate way of measuring the length of a matrix. Let me say length rather than size, because size can also be taken to mean the dimension of the matrix. However, matrices have multiplication defined on them, and so we really look for matrix norms that will help us relate the length of AB to the individual lengths of A and B. That is an extra property that is desirable when you're looking for matrix norms, and therefore the definition of a matrix norm in fact involves a condition on products of matrices.

So here we go. As I told you in a previous class, I'm going to use three bars to denote a matrix norm. A function mapping R^(n x n) to R is a matrix norm if for every A and B belonging to R^(n x n) we have: (1) |||A||| >= 0; (1a) |||A||| = 0 if and only if A is the all-zero matrix; (2) the homogeneity property, |||cA||| = |c| |||A||| for every c in R; and (3) the triangle inequality, |||A + B||| <= |||A||| + |||B|||. So far we have not deviated from the definition of a vector norm.
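The "vectorize and apply a vector norm" idea can be sketched in a few lines of NumPy (the particular matrix is my own example; `reshape(-1)` is just one way to flatten):

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])
v = A.reshape(-1)   # the n^2-by-1 vector of entries (n = 2 here)

print(np.linalg.norm(v, 1))       # entrywise L1: 1 + 2 + 3 + 4 = 10
print(np.linalg.norm(v, 2))       # entrywise L2: sqrt(30), the Frobenius norm
print(np.linalg.norm(v, np.inf))  # entrywise max: 4
```

Each of these measures the "length" of A as a vector; whether it also behaves well under matrix multiplication is exactly the question the extra condition addresses.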
In fact, if I vectorize a matrix like this and then compute any vector norm on the resulting vector, it will satisfy all four of these properties. So it is the last property that distinguishes a matrix norm from a vector norm: (4) the submultiplicativity property, |||AB||| <= |||A||| |||B|||. Why we need this will become clear when we discuss matrix norms going forward, where this property turns out to be very useful, showing that a matrix norm defined like this does turn out to be quite useful. And again, just for the sake of completeness: (1) is what we call the non-negativity property, (1a) the positivity property, (2) the homogeneity property, (3) the triangle inequality, and (4) submultiplicativity. If only the first four properties are satisfied, but not submultiplicativity, then this is called a generalized matrix norm. And if (1a) is not satisfied, then as usual we call it a seminorm.

Okay, so this is the definition of a matrix norm; it has one extra property beyond what we used for vector norms. So I can make a few immediate observations. The first is that if I take the norm of A squared: A squared is A times A, and by the submultiplicativity property this is less than or equal to |||A||| times |||A|||, which I can write as |||A||| squared. So |||A^2||| <= |||A|||^2, and this is true for any matrix norm: the length of the square of a matrix is at most the square of the length of the matrix. And in fact, if A is such that A squared equals A — what do we call such matrices? Idempotent. Idempotent matrices, exactly — then on the left side I also get |||A|||. So I can cancel a factor of |||A||| on both sides, and I can then say that |||A||| should be greater than or equal to one. So for any A such that A squared equals A, |||A||| >= 1.
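These two observations can be checked numerically. This sketch (my own, not from the lecture) uses the Frobenius norm, which is submultiplicative, and an oblique projection as the idempotent example:

```python
import numpy as np

fro = np.linalg.norm   # default matrix norm in NumPy is Frobenius
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# Observation 1: |||A^2||| <= |||A|||^2 for any submultiplicative norm.
assert fro(A @ A) <= fro(A) ** 2

# Observation 2: an idempotent matrix (here an oblique projection,
# P @ P == P) must satisfy |||P||| >= 1.
P = np.array([[1.0, 1.0],
              [0.0, 0.0]])
assert np.allclose(P @ P, P)
print(fro(P))   # sqrt(2), which is indeed >= 1
```

Note that for this P the inequality is strict; an orthogonal projection would give Frobenius norm exactly 1 per unit of rank.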
And in particular, the identity matrix is one such matrix, since I squared equals I, and therefore the norm of the identity matrix is greater than or equal to one for any matrix norm. The second point is that if A is invertible, we have I = A A^(-1), and once again, if I use my submultiplicativity property, |||I||| = |||A A^(-1)||| <= |||A||| |||A^(-1)|||. Which means that if I ask what the length of A inverse is, it is at least equal to the norm of the identity matrix divided by the norm of A.

Okay. Now, there are many more properties; we'll discuss them as we go along. Basically, as I mentioned, we can always vectorize a matrix and then compute a vector norm on it. And some of the vector norms that we've looked at are in fact matrix norms when you apply them to R^(n x n), but others are not. So I'll just make a remark here. If I take, for example, the L1 norm — and I'll define it with two bars here, since at this point it is just a function — ||A||_1 = sum over i, j from 1 to n of |a_ij|, and I ask, is it a matrix norm? The answer is yes, this is a matrix norm. Now clearly, because it is the L1 norm of the vectorized matrix, it already satisfies the first four properties: non-negativity, positivity, homogeneity, and the triangle inequality. The only thing you need to show is that it also satisfies the submultiplicativity property. We'll show that next time and continue discussing other norms. On the other hand, if I define ||A||_infinity = max over 1 <= i, j <= n of |a_ij| — so instead of taking the sum, I'm taking the max of the vectorized version — this quantity is not a matrix norm. So we'll stop here for today and continue in the next class.
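Both claims here can be illustrated numerically. The counterexample below for the entrywise max is a standard one (the all-ones matrix) rather than anything given in the lecture, and the particular invertible matrix is my own choice:

```python
import numpy as np

# The entrywise max fails submultiplicativity: with A = B = all-ones,
# max|(AB)_ij| = 2 but max|A_ij| * max|B_ij| = 1.
A = np.ones((2, 2))
max_norm = lambda M: np.abs(M).max()
print(max_norm(A @ A), max_norm(A) * max_norm(A))   # 2.0 1.0 -> violated

# The entrywise sum (the L1 of the remark) does satisfy it on this example:
sum_norm = lambda M: np.abs(M).sum()
assert sum_norm(A @ A) <= sum_norm(A) * sum_norm(A)   # 8 <= 16

# The inverse bound |||A^{-1}||| >= |||I||| / |||A|||, with the Frobenius norm:
B = np.array([[2.0, 1.0],
              [0.0, 3.0]])
fro = np.linalg.norm
assert fro(np.linalg.inv(B)) >= fro(np.eye(2)) / fro(B)
```

One failing product is enough to show the max norm is not submultiplicative, even though it satisfies all four vector-norm properties.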
Sir, I had one question: does this matrix norm idea translate to tensors as well? So we would have a 3D array instead of a matrix.

A lot of it does. Just like what we're discussing here, some parts translate and some parts don't. But the analysis of tensor norms is beyond the scope of this course, so I won't be discussing that here; I'll discuss matrix norms. If you're interested in tensor norms, I can point you to some references. Actually, one of the reference textbooks does cover quite a lot about tensor norms; you can look at the textbook and learn about tensor norms for yourself.