Simultaneous equations models can be estimated in several different ways. While understanding the technical details of these estimation techniques is not required of every researcher, understanding the basics and the big picture can be useful.

Let's take a couple of examples of common misconceptions. One is from Barry, and the other is from Lotialla and co-authors. These articles claim that they apply the generalized method of moments (GMM) estimator to deal with endogeneity. The problem here is that GMM itself does not do anything about endogeneity; it is the model that GMM estimates that takes endogeneity into account. This is a very common misconception. The same model can of course be estimated using other techniques, such as maximum likelihood or, in some cases, two-stage least squares. Another common misconception is that dealing with endogeneity requires the use of two-stage least squares and could not, for some reason, be done with maximum likelihood estimation. So we need to understand that models deal with endogeneity, not estimators; we then need certain estimators for certain models, and different estimators make different assumptions.

The estimation techniques that I cover in this set of videos can be used for estimating all of these models. In the simplest possible case we have normal least squares regression analysis. So normal regression analysis, ordinary least squares (OLS), can be used to estimate model A and model C, no problem. Doing something more complicated is largely useless here: it can confuse readers, and if you do something unnecessarily complicated, you run the risk of making an error yourself. If we want a small improvement over OLS, we can go for seemingly unrelated regressions (SUR), which belongs to the family of feasible generalized least squares estimators. That too can be used for estimating model A and model B consistently.
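As a concrete illustration of the simplest case, here is a minimal sketch (the simulated data and coefficient values are my own, not from the videos) of estimating a single equation without endogeneity by ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Simulate a simple linear model with an exogenous regressor:
# y = 1 + 2*x + e, where x and e are independent
x = rng.normal(size=n)
e = rng.normal(size=n)
y = 1.0 + 2.0 * x + e

# OLS: solve the least-squares problem for the design matrix [1, x]
X = np.column_stack([np.ones(n), x])
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta_ols)  # close to [1, 2]
```

Because there is no endogeneity in this data-generating process, plain OLS is consistent and nothing more elaborate is needed.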
SUR is slightly more efficient than OLS, but it makes somewhat stronger assumptions. Then we have different kinds of instrumental variable techniques that can be used for estimating all of these models. The simplest such estimation technique is two-stage least squares, and then we have three-stage least squares, which basically combines two-stage least squares with seemingly unrelated regressions. And then we have the generalized method of moments, which I also mentioned, and which generalizes all of these estimation techniques into a more general framework while being more efficient than the alternatives. So GMM in a way makes at least three-stage least squares obsolete, and maybe seemingly unrelated regressions as well.

Then we could also use maximum likelihood estimation. It adds more assumptions compared to GMM, but it allows you to model certain things that you cannot model with GMM. For example, some models are not identified if you try to estimate them with GMM, but they can be estimated with ML, because you can add additional constraints to identify them that are not available with GMM. As a downside, maximum likelihood adds assumptions about distributions, but you gain some efficiency. So there are trade-offs between the different estimation techniques.

What is important is that none of these techniques is specifically required by these models. We cannot say that, for example, this model must be estimated with GMM, because all of the instrumental variable techniques can estimate this model consistently. Another thing we need to understand is that some of these techniques are obsolete, in the sense that, for example, GMM was not yet available when three-stage least squares was introduced. Three-stage least squares is still covered in econometrics books because it is still being used, and also for historical reasons.
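To make concrete the point that the model, not the estimator, handles endogeneity, here is a small simulation sketch (the variable names, coefficients, and sample size are my own assumptions for illustration): OLS applied to an equation with an endogenous regressor is inconsistent, while two-stage least squares with a valid instrument recovers the true coefficient.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20000

# z is a valid instrument: it moves x but is independent of the error u
z = rng.normal(size=n)
v = rng.normal(size=n)
u = 0.8 * v + rng.normal(size=n)  # structural error, correlated with v
x = z + v                         # x is endogenous: it shares v with u
y = 1.0 + 2.0 * x + u             # true slope is 2

X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])

# OLS is inconsistent here because cov(x, u) != 0
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Two-stage least squares: first project X on the instruments Z,
# then regress y on the fitted values
X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
beta_2sls = np.linalg.lstsq(X_hat, y, rcond=None)[0]

print(beta_ols[1])   # biased upward, roughly 2.4 in this design
print(beta_2sls[1])  # close to the true value 2
```

The same structural model could equally well be estimated by GMM or ML; 2SLS is just one consistent estimator for it, which is the point made above.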
So whether you want to use three-stage least squares or not is up to you (I probably wouldn't; I would go for GMM), but it's useful to know how these techniques are related. Then there is the fact that maximum likelihood estimation, while it has been around for a very long time, has not always been computationally feasible. With the other estimation techniques, you just take the data, do some calculations, some matrix algebra, and you get your results. In maximum likelihood estimation there is generally no closed-form solution, so the computer has to find the maximum of the likelihood by searching iteratively, and that takes a lot of computational effort. Until fairly recently, some of these computations took so long that maximum likelihood estimation was not practical. Nowadays, ML estimation of these models converges in a fraction of a second, so this is no longer a relevant concern. That is another reason why some of these techniques are obsolete: they were originally introduced because, for example, ML was computationally infeasible at the time, so a simpler approximation was used instead.
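The iterative nature of ML estimation can be illustrated with a short sketch (my own toy example, using SciPy's general-purpose optimizer rather than any econometrics package): numerically maximizing a Gaussian likelihood for a linear regression, and checking that the iterative search arrives at the same point estimates as the closed-form OLS solution.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)
X = np.column_stack([np.ones(n), x])

# Negative Gaussian log-likelihood of the linear regression;
# params = (intercept, slope, log_sigma), log parameterization keeps sigma > 0
def negloglik(params):
    beta, log_sigma = params[:2], params[2]
    sigma = np.exp(log_sigma)
    resid = y - X @ beta
    return (0.5 * n * np.log(2 * np.pi) + n * log_sigma
            + 0.5 * np.sum(resid**2) / sigma**2)

# No closed-form maximizer is assumed, so the optimizer searches iteratively
res = minimize(negloglik, x0=np.zeros(3), method="BFGS")
beta_ml = res.x[:2]

# For this simple model the ML point estimates coincide with OLS
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta_ml, beta_ols)
```

For a model this simple the iterative search is instant, which reflects the point above: the computational cost of ML is no longer a practical obstacle.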