Suppose $A$ is a transformation we apply to a real object, for example a rectangle with sides made of actual materials. If we stretch along one axis, there will be a compression along the perpendicular axes. Could we use this insight to represent our transformation in another way?

So suppose we want to find a matrix $A$ representing a stretch by a factor of $p$ in the horizontal direction and a stretch by a factor of $q$ in the vertical direction. We note that our transformation will take points $(x, y)$ and output points $(px, qy)$. Consequently, our transformation matrix will be $A = \begin{pmatrix} p & 0 \\ 0 & q \end{pmatrix}$.

Now let's consider our transformation of a real object again. We can construct the transformation by rotating so the axes of stretching and compression are aligned with our standard axes, stretching along our standard axes, and then rotating back. This sequence of transformations can be expressed as $A = R_{-\theta} \Lambda R_\theta$, where $R_\theta$ is the matrix corresponding to a rotation by $\theta$, and $\Lambda$ is a diagonal matrix. It will be convenient to treat $R_\theta$ as the inverse rotation: if $R_\theta = R^{-1}$, then $R_{-\theta}$ is just $R$, so we can write $A = R \Lambda R^{-1}$.

So the question we want to ask is: can we write all matrices this way? And the answer is no. Well, let's see what we can do about that. For matrices we can write as $R \Lambda R^{-1}$, we have one matrix giving a rotation, one diagonal matrix, and one matrix giving the inverse rotation. If we want to try and generalize this, we can loosen those requirements. If $R$ corresponds to a rotation, then $R^{-1}$ is the transpose of $R$. So to generalize, we'll let the last factor be the transpose of some matrix corresponding to a rotation, which we'll call $V$, making that factor $V^T$. And $R$ itself is a matrix corresponding to a rotation, so in generalizing we'll allow it to be a different rotation, which we'll call $U$. And $\Lambda$ is still a diagonal matrix, because we like diagonal matrices; remember, diagonal matrices are very easy to work with. But since $\Lambda$ makes us think about eigenvalues, we'll actually call this diagonal matrix $\Sigma$.

And so the question you've got to ask is: can we write all matrices in the form $A = U \Sigma V^T$, where $U$ and $V$ correspond to rotations, and $\Sigma$ is a diagonal matrix? And the answer is yes. We define the singular value decomposition of a matrix $A$ as the factorization $A = U \Sigma V^T$, where $U$ and $V$ are matrices corresponding to rotations, and $\Sigma$ is a diagonal matrix. This always exists; we won't prove it here.

How can we find the factorization? If $A$ is symmetric, we can find $U$, $\Sigma$, and $V^T$ directly, because if $M$ is a symmetric matrix, then we can write $M = R \Lambda R^T$. If $A$ is not symmetric, well, we know that $A^T A$ is, and $A^T A$ will be $(U \Sigma V^T)^T (U \Sigma V^T)$. Now remember, the transpose of a product is the product of the transposes taken in reverse order, so we can rearrange this to $V \Sigma^T U^T U \Sigma V^T$. Next, remember that $U$ and $V$ are supposed to correspond to rotations, and in that case the transpose is the inverse, which means the product $U^T U$ is just the identity. And so our product simplifies to $A^T A = V \Sigma^T \Sigma V^T$. And because we now have a symmetric matrix, we know that one solution is $V = R$, a rotation matrix whose columns form an eigenbasis of $A^T A$, where $\Sigma^T \Sigma = \Lambda$, the diagonal matrix of the eigenvalues.
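Going back to the rotate-stretch-rotate construction we started with, here is a quick numerical sketch of it (a sketch only, assuming NumPy; the helper name `rotation` and the particular values of $p$, $q$, and $\theta$ are illustrative choices, not from the lecture):

```python
import numpy as np

def rotation(theta):
    """2x2 matrix for a rotation by theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

# The stretch itself: factor p horizontally, q vertically.
p, q = 3.0, 0.5
Lam = np.diag([p, q])                      # sends (x, y) to (px, qy)

# Stretch along axes tilted by theta: rotate the stretch axes onto the
# standard axes, stretch, then rotate back, i.e. A = R Lam R^{-1}.
theta = np.pi / 6
R = rotation(-theta)                       # R = R_{-theta}, so R^{-1} = R_theta
A = R @ Lam @ np.linalg.inv(R)

assert np.allclose(np.linalg.inv(R), R.T)  # for a rotation, inverse = transpose
assert np.allclose(A, A.T)                 # so R Lam R^T is always symmetric
```

The last assertion is exactly why the answer to "can we write all matrices as $R \Lambda R^{-1}$?" was no: this form only ever produces symmetric matrices.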
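For the symmetric case just mentioned, $M = R \Lambda R^T$ falls straight out of an eigendecomposition. A minimal check, again assuming NumPy, with an example matrix $M$ of my own choosing:

```python
import numpy as np

# Symmetric case: M = R Lam R^T comes straight from the eigendecomposition.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, R = np.linalg.eigh(M)              # orthonormal eigenbasis of M

assert np.allclose(M, R @ np.diag(lam) @ R.T)
assert np.allclose(R.T @ R, np.eye(2))  # R is orthogonal: a rotation, up to reflection
```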
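And here is the $A^T A$ recipe carried out numerically: eigendecompose $A^T A$ to get $V$ and $\Sigma^T \Sigma$, then recover $U$ from $A = U \Sigma V^T$. This is a sketch assuming NumPy and a randomly generated $A$ with no zero singular values; it cross-checks the result against NumPy's own `np.linalg.svd`.

```python
import numpy as np

rng = np.random.default_rng(42)
A = rng.standard_normal((2, 2))       # a generic, non-symmetric matrix

# A^T A is symmetric, so eigh gives an orthonormal eigenbasis: its
# eigenvectors are the columns of V, and its eigenvalues are Sigma^T Sigma.
evals, V = np.linalg.eigh(A.T @ A)

# eigh sorts eigenvalues in ascending order; singular values are
# conventionally listed in descending order, so reorder.
order = np.argsort(evals)[::-1]
evals, V = evals[order], V[:, order]
sigma = np.sqrt(evals)

# From A = U Sigma V^T we get U = A V Sigma^{-1} (assuming no zero
# singular values, true for almost every random matrix).
U = (A @ V) / sigma                   # divides column i by sigma[i]

assert np.allclose(A, U @ np.diag(sigma) @ V.T)
assert np.allclose(sigma, np.linalg.svd(A, compute_uv=False))
```

Using `eigh` rather than the general `eig` is the natural choice here, since it exploits the symmetry of $A^T A$ and guarantees real eigenvalues and an orthonormal eigenbasis.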
We also have $A A^T = U \Sigma \Sigma^T U^T$, which gives us $U = R'$, a rotation matrix whose columns form an eigenbasis of $A A^T$. Now, since $A A^T$ and $A^T A$ are non-defective, an eigenbasis always exists, and so we can always find $U$, $V$, and $\Sigma$, regardless of the dimensions of $A$. And what this means is that $A$ does not have to be a square matrix in order to find its SVD factorization; this can be done for any matrix of any size. The decomposition is universal, and as we'll see, it's an incredibly powerful way of representing a matrix.
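To make the point about dimensions concrete, here's a short sketch (assuming NumPy, with an arbitrary $4 \times 3$ example of my choosing) showing that the SVD exists for a non-square matrix, that the eigenvalues of $A A^T$ and $A^T A$ both recover the squared singular values, and that $\Sigma$ becomes rectangular:

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((4, 3))          # A need not be square

U, sigma, Vt = np.linalg.svd(A)          # U is 4x4, sigma has 3 entries, V is 3x3

# The eigenvalues of A A^T (4x4) and A^T A (3x3) both recover the
# squared singular values; A A^T just has an extra zero eigenvalue.
left = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]
right = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
assert np.allclose(left[:3], sigma**2)
assert np.allclose(right, sigma**2)

# Sigma is rectangular here: pad the singular values into a 4x3 matrix.
Sigma = np.zeros((4, 3))
Sigma[:3, :3] = np.diag(sigma)
assert np.allclose(A, U @ Sigma @ Vt)
```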