One of the distinguishing features of linear algebra as an advanced math course is that we're not only interested in finding a solution; we're also interested in whether or not a solution exists. This viewpoint is expressed in a certain joke which begins, "A mathematician, a physicist, and an engineer are staying in a hotel..." — but I won't tell that joke here.

So let's suppose I have a 2 by 2 matrix M and I want to find a right inverse of M, which we'll call D. First of all, let's see if we can sit down and solve the equations necessary to find D. And since it's possible that this right inverse might not exist, let's see if we can identify any conditions that are required for it to exist.

If we want D to be a right inverse of M, we need to be able to multiply M by D on the right. Since M has two rows, the product MD will also have two rows. And since the identity matrix must be square, this means the product we have to get is the 2 by 2 identity matrix. That in turn means the matrix D we're multiplying by has to have two columns. So D must itself be a 2 by 2 matrix. It has four entries, which we don't know, so we'll call them x1 through x4.

Now, since we do know how to multiply two matrices, we multiply row by column to get each entry of the product MD. Comparing the entries of MD with the entries of the identity matrix gives us a system of equations. So if D is a right inverse, then its entries must satisfy that system.

Now we can row reduce the augmented coefficient matrix of that system. As long as our leading coefficients are not equal to zero, we'll be able to use back substitution. Writing the entries of M as a, b, c, and d, that condition works out to ad − bc ≠ 0: as long as ad − bc is not equal to zero, back substitution gives us the values x1 through x4. And so our system will have a solution, and M will have a right inverse, provided ad − bc is not equal to zero.
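To make the system concrete, here is one way the equations sketched above can be written out, assuming M = (a, b; c, d) as the transcript's ad − bc condition suggests (the specific entry names are an illustration, not taken from the lecture):

\[
MD = \begin{pmatrix} a & b \\ c & d \end{pmatrix}
\begin{pmatrix} x_1 & x_2 \\ x_3 & x_4 \end{pmatrix}
= \begin{pmatrix} a x_1 + b x_3 & a x_2 + b x_4 \\ c x_1 + d x_3 & c x_2 + d x_4 \end{pmatrix}
= \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}.
\]

Matching entries gives four equations: \(a x_1 + b x_3 = 1\), \(a x_2 + b x_4 = 0\), \(c x_1 + d x_3 = 0\), and \(c x_2 + d x_4 = 1\). Eliminating as in the row reduction described above, and assuming \(ad - bc \neq 0\), one finds

\[
x_1 = \frac{d}{ad-bc}, \quad x_2 = \frac{-b}{ad-bc}, \quad x_3 = \frac{-c}{ad-bc}, \quad x_4 = \frac{a}{ad-bc}.
\]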
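The existence condition can also be checked computationally. Below is a minimal sketch in Python: the helper name `right_inverse_2x2` is hypothetical (not from the lecture), and it assumes the back-substitution solution for a 2 by 2 matrix, returning `None` exactly when ad − bc = 0 and the system has no solution.

```python
from fractions import Fraction

def right_inverse_2x2(a, b, c, d):
    """Return a right inverse D of M = [[a, b], [c, d]], or None
    when ad - bc == 0 and no right inverse exists.
    (Hypothetical helper illustrating the lecture's condition.)"""
    det = a * d - b * c  # the ad - bc quantity from the lecture
    if det == 0:
        return None      # back substitution breaks down: no solution
    # Solving the system M D = I by back substitution gives x1..x4:
    return [[Fraction(d, det), Fraction(-b, det)],
            [Fraction(-c, det), Fraction(a, det)]]

# Example: M = [[2, 1], [5, 3]] has ad - bc = 2*3 - 1*5 = 1, so D exists.
M = [[2, 1], [5, 3]]
D = right_inverse_2x2(2, 1, 5, 3)

# Verify M times D really is the 2x2 identity matrix.
product = [[sum(M[i][k] * D[k][j] for k in range(2)) for j in range(2)]
           for i in range(2)]
assert product == [[1, 0], [0, 1]]
```

Using exact `Fraction` arithmetic (rather than floats) keeps the determinant test `det == 0` meaningful; with floating point one would instead have to compare against a tolerance.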