This video is part of my series on deep learning, aimed specifically at people interested in health-care-related problems, but really at anyone who is interested in deep learning and struggling to get to grips with what it is all about and what the mathematics is all about. I'm going to make another video as well, but let's just call this one a video about tensors.

We are going to build our deep learning models on a framework from Google called TensorFlow. So what is a tensor? Tensors come in ranks, and you actually already know quite a few tensors.

Let's start with a rank-zero tensor. A rank-zero tensor is nothing other than what we call a scalar, s-c-a-l-a-r, and a scalar is nothing other than a number like four, or seven, or minus one. Those are all numbers, and each one is a rank-zero tensor.

Now we move on to a rank-one tensor, and a rank-one tensor is nothing other than a vector. Let's think about vectors for a minute; vectors can represent many things. One way to visualize a vector is to think of a plane. Most importantly, we have two axes, and they are orthogonal: in other words, they are at a right angle to each other, and they can represent two dimensions. I can take a point in two dimensions and represent it by an arrow with its base at the origin and its head at the point. That arrow is a vector, and in these two dimensions it has a very specific address, so to speak: its x-axis value and its y-axis value. Let's call this vector v (I put a little line under the letter to show that it is a vector); it has these values for x and y. There are different ways we can write this. For instance, I can write the vector with the x value stacked above the y value, and that is called a column vector; you know columns and rows from spreadsheets.
This column vector has a size: two rows by one column. It is always rows first, then columns, so it is a two-by-one column vector (we will call such an arrangement a matrix, and we will get to that). We can also transpose it: if we transpose it, the x and y sit side by side, and that we call a row vector, which is one row by two columns wide. These are terms for you to become slightly familiar with. When we transpose, we just turn the rows into columns and the columns into rows; the vector still represents the same point in two-dimensional space.

So a rank-one tensor is a vector, and depending on how many elements a vector has, it can represent a point in a space with that many dimensions. In one-dimensional space it is just a point on a line; with two elements it is a point in two-dimensional space; and in three-dimensional space, which we usually draw with three orthogonal axes (by convention the x-axis and y-axis on the floor, so to speak, and the z-axis standing up), any point will have an x value, a y value, and a z value. Remember, such a point can lie in the y-z plane, but it can also stand out into the space.
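Picking up the column vector from above: here is a small sketch, using NumPy rather than TensorFlow (the shape conventions are the same), of a two-by-one column vector and its transpose. The values 3 and 5 are just illustrative.

```python
import numpy as np

# A column vector: 2 rows, 1 column -- size (2, 1), rows first, then columns.
v = np.array([[3],
              [5]])
print(v.shape)    # (2, 1)

# Transposing turns rows into columns and columns into rows:
# the column vector becomes a one-row, two-column row vector.
row = v.T
print(row.shape)  # (1, 2)
```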
So a point in three-dimensional space has a z component too, making it a three-by-one column vector. And by adding more elements, we can represent multi-dimensional spaces with many more dimensions than the three we can perceive in our everyday world.

Then we get a rank-two tensor, and a rank-two tensor is actually just a matrix, which is a very interesting thing. We have seen matrices before; just think of all the rows and columns in a spreadsheet. I can take the values three, two, minus four; one, six, seven; two, minus three, four, and arrange them in a grid, and that is a matrix with three columns. I can see it as three column vectors combined, and in many ways that is what makes matrices so useful: they can represent so many things. Again I count the rows and then the columns, which makes this a three-by-three matrix.

Then we get a rank-three tensor, and now it becomes really interesting, because all I do is add layers behind this one: one matrix here, another behind it, and another behind that one. If each layer is three by three as well, that makes a three-by-three-by-three rank-three tensor; a rank-three tensor is where I have multiple layers of matrices behind each other. I can go on like this to rank-n tensors. It becomes very complicated to picture, but you just keep adding more dimensions, more spaces. What I don't want you to get confused by, though, is that the number of elements in a vector tells you how many dimensions of space it can live in; that is different from the rank of the tensor. A rank-one tensor, a vector, can live in multi-dimensional space.
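The ranks described above can be checked directly in code. A sketch in NumPy (TensorFlow tensors report the same information through their shape): a tensor's rank is just its number of axes, which NumPy calls `ndim`, and is not the length of any one axis.

```python
import numpy as np

scalar = np.array(4)              # rank 0: just a number
vector = np.array([3, 4, -1])     # rank 1: a vector with 3 elements
matrix = np.array([[3,  2, -4],   # rank 2: the 3x3 matrix from the example
                   [1,  6,  7],
                   [2, -3,  4]])
stack = np.stack([matrix, matrix, matrix])  # rank 3: matrices layered behind each other

for t in (scalar, vector, matrix, stack):
    print(t.ndim, t.shape)
# 0 ()
# 1 (3,)
# 2 (3, 3)
# 3 (3, 3, 3)
```

Note that `vector` has three elements, so it can describe a point in three-dimensional space, yet it is still only a rank-one tensor: exactly the rank-versus-dimension distinction made above.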
That doesn't mean the space is one-dimensional; rank is just the rank of the tensor. And that is what we are going to work with: our data, the values we manipulate, the parameters that a deep learning network learns, the ones we are going to look at, are all going to live inside a tensor of some rank.

I'm going to clean the board and we are going to do a few of what we call linear algebra operations, specifically on rank-one and rank-two tensors, so that you become familiar with them.

The first thing we are going to look at is systems of linear equations. Let's put that up here: systems of linear equations. This is not strictly necessary for deep learning, but it helps us to understand some of the concepts. A linear equation, remember, is one of those simple algebraic things. Let's construct our own system. Let's make the coefficients one and minus two in the first equation, and four and one in the second, and let's say the unknowns have the values two and three for now. Then the first equation gives two minus six, which equals negative four, and the second gives eight plus three, which is eleven. So I'm just cheating here: I'm creating my own system of linear equations whose answer I already know, namely two and three. Now let's change those known values into two unknowns; we'll call them x and y.
So what we have is: one x minus two y equals minus four, and four x plus y equals eleven. That's what I mean by a system of linear equations: a system, because there is more than one equation, and I can find a solution for both of them, a solution that satisfies both at the same time.

Now, how can we solve this so that we get values for x and y that satisfy both equations? By the way, this is linear because I don't have x squared or x times y; each term is some constant multiple of x or some constant multiple of y, in this instance four times x, and in that instance negative two times y. And I know that if I plug two in for x and three in for y, both equations come out correct.

So let's try to solve it. What are things we could do? Well, one of the first things we could do is simply swap the two equations around: four x plus y equals eleven, then x minus two y equals minus four. That's one thing I could do. I could also multiply an equation throughout by a constant. So let's keep four x plus y equals eleven, and multiply the second equation throughout by negative four. If I multiply throughout, I'm not changing anything: I multiply this side by negative four and that side by negative four, so nothing has changed. That gives me negative four x, positive eight y, and sixteen. So here I swapped the equations around,
and here I have a constant multiple of one of the equations. And now what can we do? Remember, these are equations, meaning this side equals that side, so if I do something to one side I must do exactly the same thing to the other side. So let's do something. I'll keep four x plus y equals eleven as my first row, but to the second equation I'm going to add the first. The two sides of the first equation are equal to each other, so adding its left-hand side to the left and its right-hand side to the right still means I did the same thing to both sides. If I do that, the x terms cancel out: I get zero x plus nine y, and on the other side eleven plus sixteen, which is twenty-seven. Let's carry that on. I still have four x plus y equals eleven, and I can simplify the second equation by dividing throughout by nine: zero divided by nine is still zero x, nine divided by nine gives me y, and twenty-seven divided by nine gives me three (nine times three is twenty-seven). And lo and behold, I have a solution for y, and we know it is right because we cheated in the beginning and constructed the system that way.

That gives us a solution for y, which I can now plug into the first equation. I have four x, and I know y equals three, so four x plus three equals eleven, so four x equals eleven minus three, which is eight, and x equals two. Beautiful.

But look at what we've done. There are three very elementary things, and we call them elementary row operations: we swapped rows around, we multiplied throughout by constants (here we multiplied throughout by one over nine), and we added a constant multiple of one row to another. Remember, the two sides of an equation are exactly the same.
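The same system can be handed to a linear-algebra routine instead of being eliminated by hand. A minimal sketch in NumPy (TensorFlow has an analogous `linalg` module); the coefficients and right-hand side come from the example above:

```python
import numpy as np

# Coefficient matrix and right-hand side for:
#   1x - 2y = -4
#   4x + 1y = 11
A = np.array([[1.0, -2.0],
              [4.0,  1.0]])
b = np.array([-4.0, 11.0])

solution = np.linalg.solve(A, b)
print(solution)  # [2. 3.]  ->  x = 2, y = 3
```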
That's why there is an equals sign. So if I take this side and add something to it, I can take that side and add the same thing to it, and then I have done the same thing on both sides of the equals sign, because the two sides are equal to each other.

So let's clean the board and let me show you how we can do the same thing using rank-two tensors. Let's write the system as what we call an augmented matrix, an augmented rank-two tensor. Remember the first equation was x minus two y; let's write just the coefficients. So I have a one and a minus two, and on that side a minus four; and below it a four, a one, and an eleven. That is an augmented matrix: it is just shorthand, leaving out the x's and the y's. Now, what can we do with the elementary row operations? We just swap the two rows around, giving four, one, eleven on top (technically I didn't have to do this, but I wanted to show you all three operations), and I write the other row down below. I'm also going to multiply that second row throughout by a constant, negative four, which gives negative four, eight, sixteen. It's just shorthand for what we did before.
Those are the elementary row operations. I can stick with the first row, and to the second row I can add the elements of the first: four plus negative four is zero, one plus eight is nine, and eleven plus sixteen is twenty-seven. I can multiply that row throughout by one over nine, which gives me four, one, eleven on top and zero, one, three below. And that is how we read off the y value: remember, the first column is x and the second is y, so one y equals three, so y equals three.

Now I can multiply the second row throughout by negative one, which gives zero, negative one, negative three, and add it to the first row: four plus zero is four, one plus minus one is zero, and eleven minus three is eight. Then I multiply the second row throughout by negative one again, so I am back to zero, one, three. And then I can multiply the first row throughout by a quarter, which gives me one, zero, two, with zero, one, three below it. There is my solution. This is what we call reduced row echelon form: in other words, there are just leading ones, and below and above every leading one there is a zero, so that I can finally read off my solution, one x equals two and one y equals three, exactly what we had before. So those are the elementary row operations, and that was a system of linear equations.

Let's get back to some simpler things. Let's add rank-one tensors. To add rank-one tensors, they must have the same dimension. So I can take a three-dimensional rank-one tensor, three, four, minus one, and add to it a second one, say zero, one, four. Addition is very simple: it is an element-wise operation, so three plus zero is three, four plus one is five, and negative one plus four is three. And there I have my solution. Adding rank-one tensors, which are vectors, is very easy to do; they must just have the same dimensions.

Now let's multiply a matrix by a vector. There are various reasons why you would do that; in physics, for instance, we can call it an operation on a vector, but it doesn't matter what the application is here.
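Before we move on to multiplication, the whole row reduction above can be reproduced step by step in code. A sketch in NumPy, mirroring the three elementary row operations exactly as performed on the board:

```python
import numpy as np

# Augmented matrix for:  1x - 2y = -4
#                        4x + 1y = 11
M = np.array([[1.0, -2.0, -4.0],
              [4.0,  1.0, 11.0]])

M[[0, 1]] = M[[1, 0]]   # swap the two rows
M[1] *= -4.0            # multiply row 2 throughout by -4  -> [-4, 8, 16]
M[1] += M[0]            # add row 1 to row 2               -> [0, 9, 27]
M[1] /= 9.0             # divide row 2 by 9                -> [0, 1, 3]
M[0] -= M[1]            # subtract row 2 from row 1        -> [4, 0, 8]
M[0] /= 4.0             # divide row 1 by 4                -> [1, 0, 2]

print(M)
# [[1. 0. 2.]   -> x = 2
#  [0. 1. 3.]]  -> y = 3
```

(The board used "multiply by minus one, add, multiply back"; subtracting row 2 from row 1 in one step is the same operation.)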
I just want to show you something that must be kept in mind, and it is very important when we get to TensorFlow and to thinking about and designing our deep neural networks. Suppose I have a matrix: three, one in the first row and two, two in the second. What size is that? It's a two-by-two matrix, a two-by-two rank-two tensor. I'm going to multiply it by a vector, two, one, and I also write that vector down there, because it makes actually doing the multiplication on paper very easy. What I'm doing here is multiplying a matrix by a vector, and it is very important that the second value of the matrix's size, its column number, and the row number of the vector are equal; if they are not equal, you cannot do the multiplication. The other two values don't matter. Let's call this matrix A and the vector v. If I am computing A times v, the matrix comes first (the vector can technically also come first, but that is something else entirely). The second number of this size must equal the first number of that size: the matrix's column count and the vector's row count must be equal, otherwise you cannot multiply, and the solution will have the sizes that are left over, here a two-by-one tensor. And remember, whatever the rank, from a single number to these multi-dimensional objects, at bottom they are all tensors; everything here is called a tensor. So the result is going to be a two-by-one tensor, and what we do is very simple. I've written it like this because it makes it easy: for each of the two values in that two-by-one result, I look along a row of the matrix and down the vector.
So the first value is three times two, which is six, plus one times one, which is one; six plus one is seven. The second is two times two, which is four, plus two times one, which is two; four plus two is six. There is my solution: seven, six. That is a matrix times a vector. It is very important that you understand that those two numbers, the matrix's column count and the vector's row count, must be exactly the same, otherwise you cannot do the multiplication, and that is going to be a very important first step in a neural network.

(By the way, if it sounds like the world is coming down right outside my office, they're building a new neuroscience center and it is very, very noisy, and really driving me nuts.)

I can also multiply two matrices with each other. As long as, say, the first one is of size four by five, the second one must be of size five by whatever, let's say seven. Those two inner numbers must be the same, in that order, and the result will be a four-by-seven tensor. As I say, those are very important things.

So that's a brief recap of some very basic concepts in what we call linear algebra, and we know this is very important for us in designing neural networks. Now, you don't have to know more than that. When we write the lines of code using TensorFlow or Keras, as long as you have these basic concepts in the back of your head, you really don't need a full, deep understanding of all of this; this was just in case you have not done it before, or did it a long time ago, a little bit of a refresher of what these things are all about.

We're going to make another short video just on partial derivatives, because linear algebra is one important part of deep learning, and the other part is derivatives. You've got to be able to do derivatives. Once again, we are just going to write a line of code and the derivatives are going to happen automatically, but I think it's still important that you
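Both operations above, element-wise addition and matrix multiplication with its shape rule, fit in a few lines. A sketch in NumPy (TensorFlow's `tf.add` and `tf.matmul` follow the same shape rules), using the numbers from the examples:

```python
import numpy as np

# Element-wise addition of two rank-1 tensors of the same dimension.
a = np.array([3, 4, -1])
b = np.array([0, 1, 4])
print(a + b)              # [3 5 3]

# Matrix times vector: the matrix's column count (2) must equal
# the vector's row count (2); the result is a 2x1 tensor.
A = np.array([[3, 1],
              [2, 2]])
v = np.array([[2],
              [1]])
print(A @ v)              # [[7]
                          #  [6]]

# Matrix times matrix: the inner sizes must match,
# and the outer sizes are what is left over: (4, 5) @ (5, 7) -> (4, 7).
B = np.ones((4, 5))
C = np.ones((5, 7))
print((B @ C).shape)      # (4, 7)
```

Trying `v @ A` or `C @ B` with mismatched inner sizes raises an error, which is exactly the "you cannot do this multiplication" rule above.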
have some basic concept of what happens in linear algebra, specifically these operations, and of what derivatives are.