Welcome back, everyone. Today we're going to continue our lecture series based upon the textbook Linear Algebra Done Openly. We're in Section 4.4 today, and we're going to talk about affine transformations. As always, I will be your professor, Dr. Andrew Misseldine, at Southern Utah University. We've talked in the past about linear transformations; those transformations preserve vector addition and scalar multiplication. Today we're going to talk about a generalization of linear transformations known as affine transformations. Before we do that, I want to make a connection between the inner products we've talked about in the previous three lectures and angles and geometry. We're going to talk about the so-called law of cosines in the context of vector geometry. Whenever you have two vectors, let's say u and v here, there's always a triangle associated to these vectors, and we've seen this before. Take the vector u and the vector v, and place the arrows tail to tail. Once you have these two vectors, we can form a triangle by connecting the heads of the two vectors; the side pointing from v to u gives us the vector u minus v. We always get this triangle right here, and I'm interested in the angle formed between the two vectors. We'll call the measure of that angle theta. Now, in a standard trigonometry class, one often talks about the law of cosines, which can be thought of as a generalization of the Pythagorean theorem.
So if we have an oblique triangle, one where the angles are not necessarily right angles, we get something like the Pythagorean theorem. If we take the side u minus v, that is, the side opposite the angle theta, then the length of u minus v squared will equal the length of u squared plus the length of v squared, but then we have to subtract two times the length of u times the length of v times cosine of theta. So there's this correcting factor we have to apply. Now, of course, if our angle is a right angle, notice that if theta equals 90 degrees, then cosine of 90 degrees is equal to zero. So if you have a right triangle, that zero cancels off the correcting factor, and the law of cosines degenerates to the usual Pythagorean equation. Now, if you remember the proof we saw earlier of the Pythagorean theorem in the context of vector spaces, this same term crept up inside that proof. If you carry through a similar argument, you can actually show that these two things are equal to each other: that u dot v is equal to the length of u times the length of v times cosine of theta. So this is the correcting factor that comes from the parallelogram rule. And if we try to make that connection to right angles here: if theta equals 90 degrees, then cosine of theta equals zero, so the right-hand side equals zero, which forces the inner product u dot v to equal zero as well. That is to say, u is perpendicular to v. So this idea of orthogonality of vectors is equivalent to the perpendicular angles you might know from a previous geometry class, and this identity really is a generalization of the law of cosines one might have seen previously.
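The identity above can be checked numerically: since u dot v equals the length of u times the length of v times cosine of theta, the law of cosines for vectors reduces to ||u - v||^2 = ||u||^2 + ||v||^2 - 2(u . v). Here is a minimal Python sketch verifying that reduction for a pair of example vectors chosen for illustration (the vectors themselves are not from the lecture):

```python
import math

def dot(u, v):
    # Standard dot product: sum of componentwise products.
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    # Length of a vector: square root of its dot product with itself.
    return math.sqrt(dot(u, u))

u = (3.0, 2.0)    # arbitrary example vectors in R^2
v = (-1.0, 4.0)
diff = tuple(a - b for a, b in zip(u, v))  # the side u - v of the triangle

lhs = norm(diff) ** 2                                   # ||u - v||^2
rhs = norm(u) ** 2 + norm(v) ** 2 - 2 * dot(u, v)       # law of cosines form
print(lhs, rhs)  # the two sides agree (both 20.0 here)
```

Expanding ||u - v||^2 as (u - v) . (u - v) is exactly where the correcting term -2(u . v) comes from in the proof.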
This equation can also help us compute angles between vectors, because if we solve for cosine theta, you get that cosine theta equals u dot v over the length of u times the length of v. And if you take arc cosine, you can actually calculate the angle between the vectors. I want to do that in a two-dimensional example. Take the vector u to be (6, -1) and the other vector v to be (1, 4); these live inside R^2. By the previous formula, theta, the angle between the two vectors, will be arc cosine of u dot v over the length of u times the length of v. If we calculate these things, the dot product of the vectors gives six minus four on top, which is two. The length of u is the square root of 36 plus 1, and the length of v is the square root of 1 plus 16. Simplifying a little, you get arc cosine of two over the square root of 37 times the square root of 17. Those aren't perfect squares, so we can't simplify any further, and we'll want to use a calculator to evaluate this expression. Using an ordinary scientific calculator in degree mode, this gives us approximately 85.43 degrees. If your calculator is in radian mode, your answer will look a little different. Cosine inverse will always output an angle, and it accepts a ratio between negative one and one. So these two vectors are almost at a right angle, about five degrees off, and this simple calculation can give us the angle between any two vectors.
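The worked example above can be sketched in Python; this assumes the same arc-cosine formula from the lecture, and the clamping step is a small practical addition to guard against floating-point error pushing the ratio just outside the domain of acos:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

def angle_degrees(u, v):
    # cos(theta) = (u . v) / (||u|| ||v||); clamp into [-1, 1] because
    # math.acos raises an error on inputs outside that interval.
    ratio = dot(u, v) / (norm(u) * norm(v))
    ratio = max(-1.0, min(1.0, ratio))
    return math.degrees(math.acos(ratio))

u = (6.0, -1.0)
v = (1.0, 4.0)
print(round(angle_degrees(u, v), 2))  # 85.43, matching the lecture's calculator result
```

Dropping the call to math.degrees would give the same angle in radians, which is what a calculator in radian mode reports.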