Instrumental variable techniques can also be used to estimate a simultaneous equation model, and the simplest of these is system-wide two-stage least squares. The important point is that these are two-stage techniques, and in the system-wide version all exogenous variables are chosen as instruments. Basically, what we do is estimate each equation, each y, separately, using x1 and x2 as instruments in every equation. Now, one question that may come to mind is why we are not using y1 as an instrument for y3, because when we established model identification using the block recursive rule, y1 served as an instrument for y3, and therefore the model is identified. The reason we are not using y1 as an instrument is that these instrumental variable estimators, when applied to a system, also make the assumption that the errors of one equation are uncorrelated with the errors of the other equations. So we would be making this hidden assumption. Of course, if we know that the errors are uncorrelated, then things are simple, but that is not something we normally want to assume. That is why we cannot use y1 as an instrument for y3: we do not want the estimates to depend on that assumption. In this model, system-wide two-stage least squares is, as I said, equivalent to running two-stage least squares for each equation separately. And here is a small demonstration using Stata. We generate random variables: we generate Q first in order to generate correlated random variables, then we generate x1, x2, y1, y2, y3, and y4 that have some correlations. Then we estimate with the systems estimator and store the results, estimate the single-equation two-stage least squares, and compare the estimates. We can see that they are the same. So running systems two-stage least squares is the same as estimating each equation separately. So two-stage least squares is one option.
It is a simple option, but there are also other alternatives, and one alternative is three-stage least squares. The idea of three-stage least squares is that we take two-stage least squares and seemingly unrelated regressions and combine them. So three-stage least squares is basically two-stage least squares plus an additional step that estimates a feasible generalized least squares equation using the two-stage least squares results. Instead of using OLS, as we normally would in seemingly unrelated regressions, to calculate the initial covariance matrix, here we use two-stage least squares. So we just add a third step. But adding this third step, making this a systems estimator, comes with a caveat, and the Stata user manual is pretty good at pointing out these kinds of things. They point out that while this provides a small efficiency advantage, it comes with a cost. The cost is that because three-stage least squares estimates the full system at a time, and we can no longer estimate one equation at a time, if one of the equations is mis-specified, that mis-specification can affect the estimates of all the other equations as well. This is one advantage of two-stage least squares, and generally of any limited information technique that estimates the model one piece at a time: mis-specifications in other parts of the model will not affect the quality of the estimates in one particular part of the model. So when should you use these techniques? Three-stage least squares may be more efficient, although the difference may be small; that is the argument for that estimator. Two-stage least squares, in turn, should be used because it is simpler to apply. So if your sample is large enough that your estimates are precise enough anyway, and you have strong instruments, then using two-stage least squares is probably a better idea, because the added value of the extra efficiency is pretty small.
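The third step described above can also be sketched in numpy: estimate each equation by 2SLS, estimate the cross-equation residual covariance matrix from those results, and then run a GLS step on the stacked system weighted by the inverse of that covariance. The two-equation model and its coefficients below are illustrative assumptions, not Stata's implementation of `reg3`.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x1, x2 = rng.normal(size=(2, n))
e = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=n)

# Hypothetical system: y1 = 0.5*y2 + x1 + e1,  y2 = 0.3*y1 + x2 + e2.
A = np.array([[1.0, -0.5], [-0.3, 1.0]])
Y = np.linalg.solve(A, np.column_stack([x1 + e[:, 0], x2 + e[:, 1]]).T).T
y1, y2 = Y[:, 0], Y[:, 1]

const = np.ones(n)
Z = np.column_stack([const, x1, x2])
P = Z @ np.linalg.solve(Z.T @ Z, Z.T)               # projection onto instruments

X1 = np.column_stack([const, y2, x1])
X2 = np.column_stack([const, y1, x2])

def tsls(y, X):
    Xh = P @ X
    return np.linalg.solve(Xh.T @ X, Xh.T @ y)

# Steps 1-2: equation-by-equation 2SLS, then residual covariance.
b1, b2 = tsls(y1, X1), tsls(y2, X2)
E = np.column_stack([y1 - X1 @ b1, y2 - X2 @ b2])
Sigma = E.T @ E / n                                  # cross-equation covariance

# Step 3: feasible GLS on the stacked system, weight = inv(Sigma) (x) P.
Xs = np.block([[X1, np.zeros_like(X2)], [np.zeros_like(X1), X2]])
ys = np.concatenate([y1, y2])
W = np.kron(np.linalg.inv(Sigma), P)
b_3sls = np.linalg.solve(Xs.T @ W @ Xs, Xs.T @ W @ ys)
print(b_3sls.round(2))
```

Note how the weight matrix couples the equations: unlike in the 2SLS case, the matrix `Xs.T @ W @ Xs` is no longer block diagonal, which is exactly why a mis-specified equation can contaminate the estimates of the others.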
It makes fewer assumptions, so it is more robust. With three-stage least squares we have to assume that the exclusion restrictions on every variable hold even across equations. Also, readers of your article are more likely to understand this method; I think I have seen more incorrect explanations of three-stage least squares in the applied literature than I have seen correct explanations. You also need to consider that these two are not the only options. We also have maximum likelihood estimation and the generalized method of moments, which are more modern techniques than, for example, three-stage least squares. I personally consider three-stage least squares to be a largely obsolete technique, and I would use GMM or maximum likelihood instead. Two-stage least squares, while old, still has a use, and that use is in diagnostics. Because two-stage least squares estimates one equation at a time, if you have a problem with your full system, for example you cannot get your maximum likelihood estimates to converge, then you can try estimating one equation at a time. Turn one big problem into a series of smaller problems, and you can probably figure out what is wrong, instead of staring at one big problem where it is difficult to see what happens. And, again, this approach is more robust: two-stage least squares, as I said before, does not rely on assumptions about other parts of the model. It only relies on the assumption that the part we are currently estimating is correctly specified. So if there is a mis-specification in the model, its effects are local instead of spreading throughout the system.