So, the next topic is linear transforms. Suppose we have U and V, two vector spaces over a field F. Then a mapping f from U to V is said to be a linear map, or linear transform, if two conditions hold: (1) f(u1 + u2) = f(u1) + f(u2) for all u1, u2 in the first vector space U; and (2) f(λu) = λ f(u) for all λ belonging to the field F and all u belonging to the vector space U. So, these are the two conditions that define a linear transform. Often this is written compactly as f(a1 u1 + a2 u2) = a1 f(u1) + a2 f(u2) for all a1, a2 in the field F and all u1, u2 in the vector space U. But this is just another way of writing it; you may or may not agree that it is more compact. To me, this is a slightly more intuitive way of writing it. Okay. So, some properties of a linear transform. First, the values of f on some basis of U completely determine f. That is because every u in the vector space U is a linear combination of the basis vectors, and f is a linear transform. So, if you can write u as a linear combination, and you know what values f takes on the basis, then you can find its value for any other u that belongs to the vector space U. Second, 0 always maps to 0. One way to see it is to substitute λ = 0 in the second condition: f(0) = f(0·u) = 0·f(u), and 0 times the vector f(u), whatever it is, is the zero vector. And one very important property is that any A in R^{m×n} is a linear transform from R^n to R^m, and vice versa. Another property, which I mentioned in the previous class also, is that matrix multiplication corresponds to composition of linear transforms. Sir, one doubt: in that previous property, you said a matrix is a linear transform from R^n to R^m and vice versa. What is meant by that "vice versa" part? I don't understand that.
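As a quick sanity check of the two defining conditions and the zero-maps-to-zero property, here is a small sketch in Python with NumPy (the lecture itself uses MATLAB-style notation later; the random matrix here is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))   # A in R^{3x4}: a linear map from R^4 to R^3
u1 = rng.standard_normal(4)
u2 = rng.standard_normal(4)
lam = 2.5

# condition (1), additivity: f(u1 + u2) = f(u1) + f(u2)
assert np.allclose(A @ (u1 + u2), A @ u1 + A @ u2)
# condition (2), homogeneity: f(lam * u) = lam * f(u)
assert np.allclose(A @ (lam * u1), lam * (A @ u1))
# setting lam = 0 shows the zero vector maps to the zero vector
assert np.allclose(A @ np.zeros(4), np.zeros(3))
print("all linearity checks passed")
```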
Yeah, so that's a good question. "Any matrix A is a linear transform from R^n to R^m, and vice versa" means that any m-by-n matrix A defines a linear transform from R^n to R^m, and conversely, any linear transform from R^n to R^m can be represented as a matrix A. Okay, thank you. Sir, can you explain that part again, that the values taken by f on some basis of U completely determine f? Okay. So, suppose U has a basis u1 through un, and let v1 = f(u1), ..., vn = f(un). Suppose I know these n vectors. Then any u in U can be written as u = α1 u1 + ... + αn un, and we've seen that this representation is unique: there is a unique linear combination which gets you to u when u1 through un is a basis for the space U. So, by linearity, f(u) = f(α1 u1 + ... + αn un) = α1 f(u1) + ... + αn f(un) = α1 v1 + ... + αn vn. So, this is what I mean by saying that the values taken by f on the basis completely determine f on all vectors in U. Is it clear? Yes, sir, thank you. Sir, is there any relation between the basis of U, the basis of V, and the function, say if U and V are of the same dimension? Yeah, so that's what I'm going to talk about now: vector spaces associated with a linear transform. You might have heard of the rank-nullity theorem; this is closely related to that, and I'm going to talk about it now. Okay, before that, maybe I'll make a couple of remarks. One thing is that the way I defined this linear transform, it's a transform from U to V, and it satisfies these properties. Now, it's possible that there are some points in V which are not reachable as f(u) for any value of u. What I'm trying to say is that, the way it's defined here, U and V are two arbitrary vector spaces, and not every small v in the vector space V is required to be reachable by taking f(u) for some u in U. So, that is one point.
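The reconstruction argument above can be sketched numerically. This is a NumPy illustration (not from the lecture): the map f and the choice of basis are made up for the example, and f(u) is recovered purely from the vectors v_i = f(u_i) and the coordinates of u in the basis.

```python
import numpy as np

# a basis of U = R^3 (the standard basis, chosen just for illustration)
basis = np.eye(3)                                     # columns u1, u2, u3
M = np.array([[1., 2., 0.],
              [0., 1., 1.]])
f = lambda u: M @ u                                   # some linear map R^3 -> R^2

# the n vectors v_i = f(u_i): all we are allowed to know about f
v = np.column_stack([f(b) for b in basis.T])

u = np.array([2.0, -1.0, 4.0])
alphas = np.linalg.solve(basis, u)    # unique coordinates: u = sum_i alpha_i u_i
f_u_reconstructed = v @ alphas        # alpha1*v1 + alpha2*v2 + alpha3*v3
assert np.allclose(f_u_reconstructed, f(u))
print("f(u) reconstructed from its values on the basis")
```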
Another point is that, as you can see from here, if U has this basis u1 through un, and I take v1 = f(u1) through vn = f(un), then any u in U can be written as a linear combination of these basis vectors. Now, as I mentioned, not every point in V needs to have an inverse image, some small u. But if I consider only the points in V that do have an inverse image in U, you can show that that set is also a vector space. And you can see already that any point in that vector space can be written as a linear combination of v1 through vn. Now, of course, u1 through un are linearly independent; they are a basis for U. But it's not clear that v1 through vn are going to be linearly independent. What is clear is that any v which is reachable from the vector space U can always be written as a linear combination of v1 through vn. So, v1 through vn certainly span the set of vectors reachable from U under the linear map f, but this could be an overcomplete set. In other words, it's possible that a subset of these vectors spans that space. But it cannot take more than n vectors. So, the dimension of the space spanned by the image of U under this linear map f can be at most n, if n is the dimension of U. You cannot increase dimension by using a linear map, but it's possible that you end up decreasing dimension. So, does this mean that v1 through vn need not be a basis for the space spanned by the image of U under this transformation f? That's correct. Thank you. So, the first space associated with a linear transform is what is called the range space, or the column space. From now on, I'm going to refer to these linear transforms as matrices, because this particular thing is more easily stated using matrices, and because any matrix can be viewed as a linear transform and any linear transform can be viewed as a matrix.
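Here is a small NumPy sketch (the matrix is made up for illustration) of the point that the images of a basis always span the reachable set but need not be linearly independent, so a linear map can decrease dimension:

```python
import numpy as np

# A maps R^3 to R^2; the images of the standard basis e1, e2, e3 are its columns
A = np.array([[1., 0., 1.],
              [0., 1., 1.]])
v1, v2, v3 = A[:, 0], A[:, 1], A[:, 2]

# v1, v2, v3 span the image of R^3 under A, but they are not independent:
# v3 = v1 + v2, so {v1, v2, v3} is an overcomplete spanning set
assert np.allclose(v3, v1 + v2)

# the dimension of the image is 2, strictly less than dim(U) = 3
print(np.linalg.matrix_rank(A))   # 2
```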
So, the range space, which is also known as the column space: that's defined as R(A) = { y in R^m : y = Ax for some x in R^n }, where A is in R^{m×n}. It is the span of the columns of A. It is a subset of R^m; every vector here belongs to R^m. In fact, it's a vector space. Now, the dimension of the range space of A is less than or equal to min(m, n). Why is that? Because it is a subspace of R^m, it can have dimension at most m; and this matrix A has only n columns, so if you write y = Ax, you can have at most n linearly independent vectors in the span of the columns of A. The dimension of the range space of A is also called what? The rank. Exactly, thank you. So, for example, if I have

A = [ 1 2 3
      5 4 3
      3 3 3 ],

then the range space or column space of A is the set of all linear combinations of any two columns of A. Notice that if I add the first and third columns and divide by 2, I get the second column. So, these three columns are not linearly independent; only two of them are, and any two of them are linearly independent. So, you can take the first and second, second and third, or first and third, any two columns, and the set of all linear combinations of those two columns is the range space of A. Its dimension is 2, which in this case is actually strictly smaller than min(m, n): m is 3 and n is also 3 here. This is a map from R^3 to R^3, and the dimension of the range space, the rank of this matrix A, is 2. So, here is a small thing you should do on your own: show that R(A) is actually a subspace of R^m. In other words, it satisfies the two properties we discussed in the previous class. It is Hotel California: you can never leave. Take any two vectors in R(A); their sum also belongs to R(A).
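The column dependence and the rank of the example matrix above can be checked directly in NumPy (a quick verification of the lecture's example, nothing more):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [5., 4., 3.],
              [3., 3., 3.]])

# averaging the first and third columns gives the second column,
# so the three columns are linearly dependent
assert np.allclose((A[:, 0] + A[:, 2]) / 2, A[:, 1])

# dim R(A) = rank(A) = 2, strictly less than min(m, n) = 3
rank = np.linalg.matrix_rank(A)
print(rank)   # 2
```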
And if you scale a vector in R(A), that also belongs to R(A). Yes, go ahead. Sir, I have a small confusion. Given a pair of vectors, possibly of different dimensions, I can always find a matrix A which will transform, let us say, vector v1 to vector v2. Does that mean that transformations are always linear? So, you are saying that given v1 in R^n and v2 in R^m, you can always find A such that v2 = A v1. Yes, sir. Yeah, so that is correct; in fact, you can find many such A's. But what is your confusion? My confusion is: are transformations always linear in nature? No. What this means is that you can find a linear transform that maps v1 to v2. It doesn't mean that every transform mapping v1 to v2 is linear; there can be some non-linear transformation as well. It may not be easy to come up with one on the spot, but okay, here is an example. Suppose I had v1 = (1, 2, 3) and v2 = (1, 4, 9). Now, I can always define a transform f(v) = v.^2, using MATLAB's notation here: what this is doing is taking the square of every element in v. This is a non-linear transform which clearly maps v1 to v2. Okay, so it need not be a linear map. But at the same time, we can find a matrix A as well to do this transformation, right? Yes. In fact, you can find many such matrices, because you only have one constraint: you are giving me one point in R^n and one point in R^m, and you are asking me to find a matrix A that will map v1 to v2. Of course, this A is going to map other vectors to some other points. But in fact, I can ask a harder question: if you give me a set of vectors in R^n and another set of vectors in R^m, is it possible to find an A that will map the first set of vectors one by one to the second set of vectors?
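The elementwise-square example can be written out and checked for (non-)linearity. This is a NumPy translation of the lecture's MATLAB-style v.^2 (the test vector w is an extra choice made for the demonstration):

```python
import numpy as np

def f(v):
    return v ** 2            # elementwise square: MATLAB's v.^2

v1 = np.array([1., 2., 3.])
v2 = f(v1)
print(v2)                    # [1. 4. 9.] -- f indeed maps v1 to v2

# but f is not linear: additivity fails for a generic second vector w
w = np.array([1., 1., 1.])
print(np.allclose(f(v1 + w), f(v1) + f(w)))   # False
```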
Okay. So, the answer to that question is not immediate from what we've discussed so far, so you should keep it in mind and revisit it later. I'll write it here so you can think about it: given, say, u1 through uk in R^n and v1 through vk in R^m, can we find A in R^{m×n} such that vi = A ui for i = 1, ..., k? This is a very basic question, but I don't want to answer it right now; it's something to keep in mind and revisit later. Okay, sir. There are conditions under which this is possible and conditions under which it's not. Okay. Sir, may I ask a question? Yes. Sir, regarding the elementwise-square operation that you showed as an example of a non-linear transformation: if we want to take any v1 to any v2 while maintaining that elementwise-square relationship, it is not possible to find any such A, right? Because it is a non-linear transformation. Of course. This kind of non-linear transform, first of all, you cannot express as multiplying by a matrix A; it can never be written as Av for some matrix A. Yes, in general we can't find any such A.
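For the single-pair case (k = 1), one explicit construction is easy to sketch. This rank-one formula is one choice among the many matrices the lecture says exist; it is an illustration, not the only answer, and it assumes v1 is nonzero:

```python
import numpy as np

v1 = np.array([1., 2., 3.])
v2 = np.array([1., 4., 9.])

# one of many matrices mapping v1 to v2: A = v2 v1^T / (v1^T v1),
# so that A v1 = v2 (v1^T v1) / (v1^T v1) = v2
A = np.outer(v2, v1) / (v1 @ v1)
assert np.allclose(A @ v1, v2)

# this particular A has rank one; adding any matrix B with B v1 = 0
# would give another valid choice, which is why A is far from unique
print(np.linalg.matrix_rank(A))   # 1
```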