Students, this is part 2 of multivariate multiple regression. In the previous lecture we set up the multivariate multiple regression model; in this lecture we will study its properties: first its least squares estimation, and then the decomposition of the total sum of squares and cross products.

First, recall the model. We have Y = XB + E, where Y is the matrix whose m columns are the response variables, X is the design matrix, B is the matrix of model parameters, and E is the matrix of errors. This is the multivariate multiple regression model, and our task is to estimate the parameter matrix B.

In this subsection we apply the least squares method to estimate B. The algebra of the method is the same as in univariate multiple regression, which we studied in the previous lecture; there is no new mathematics here, so rather than rederiving everything we will simply use those results. As before, the least squares estimate of B is the value that minimizes the residual sum of squares and cross products.

Start with the estimated model. Take the expectation of the model: since the expected value of the error term is zero, the estimated model is Ŷ = XB̂.

Now write the matrix of residuals, E = Y − Ŷ. The matrix of residual sums of squares and cross products, of order m × m, is

E′E = (Y − Ŷ)′(Y − Ŷ).

Substituting Ŷ = XB̂ and multiplying out term by term — first term times first term, and so on, using the rule that the transpose of a product reverses the order, so (XB̂)′ = B̂′X′, and minus times minus gives plus in the last term — we get

E′E = (Y − XB̂)′(Y − XB̂) = Y′Y − Y′XB̂ − B̂′X′Y + B̂′X′XB̂.    (1)

Call this equation (1). To obtain the least squares estimator of B, we partially differentiate equation (1) with respect to B̂ and set the result equal to zero.

Differentiating each term of (1): the first term Y′Y does not involve B̂, so it drops out; the second and third terms each contribute −X′Y; and the last term is quadratic in B̂ — B̂ appears twice, like a B̂-squared — so differentiating it with respect to B̂ gives 2X′XB̂. Setting the first derivative equal to zero:

−2X′Y + 2X′XB̂ = 0.

The factors of −2 and 2 cancel. Moving the X′Y term across the equality and premultiplying by the inverse (X′X)⁻¹ — the factor that would sit in the denominator in the scalar case — we obtain

B̂ = (X′X)⁻¹X′Y.

This is the least squares estimate of B. Note also that B̂ is unbiased: the expected value of B̂ equals B.

Further, it is interesting to note that the least squares estimate B̂ of B can also be obtained by the following technique: collecting the univariate least squares estimates. From the single-response regression model, as explained earlier, we have for each response variable yᵢ the least squares estimator b̂ᵢ = (X′X)⁻¹X′yᵢ — first b̂₁ for the first response, then b̂₂, and so on up to b̂ₘ.
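The closed form above is easy to check numerically. Here is a minimal sketch with made-up data (the dimensions n = 50, r = 2, m = 3 and the noise level are arbitrary choices for illustration); it computes B̂ = (X′X)⁻¹X′Y via the normal equations and confirms that collecting the column-by-column univariate estimates gives the same matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative made-up data: n observations, r predictors, m responses.
n, r, m = 50, 2, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, r))])  # n x (r+1) design matrix
B_true = rng.normal(size=(r + 1, m))                        # true (r+1) x m parameter matrix
Y = X @ B_true + 0.1 * rng.normal(size=(n, m))              # Y = XB + E

# Least squares estimate B_hat = (X'X)^{-1} X'Y, computed by solving the
# normal equations (preferred over forming the inverse explicitly).
B_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# The same estimate column by column: regressing each response y_i on X
# separately reproduces the i-th column of B_hat.
B_cols = np.column_stack(
    [np.linalg.solve(X.T @ X, X.T @ Y[:, i]) for i in range(m)]
)
print(np.allclose(B_hat, B_cols))  # True
```

Solving all m responses at once and stacking the m univariate solutions agree because both use the same (X′X)⁻¹X′ factor.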
As we saw in the previous lecture, Y is the matrix [y₁, y₂, …, yₘ] whose columns are the individual response vectors. Collecting the univariate estimates column by column therefore gives exactly the same result:

B̂ = [b̂₁, b̂₂, …, b̂ₘ] = (X′X)⁻¹X′[y₁, y₂, …, yₘ] = (X′X)⁻¹X′Y.

Now the third property: the decomposition of the total sum of squares and cross products matrix. We give the matrix analogue of the univariate decomposition of the sum of squares. Consider the multivariate multiple regression model Y = XB + E. As before, taking expectations with E(E) = 0 gives the estimated model Ŷ = XB̂, and — just as in the second property, on least squares — the estimated matrix of residuals is E = Y − Ŷ. Moving Ŷ to the other side of the equality, with a positive sign, we can write

Y = Ŷ + E.

The total sum of squares and cross products is Y′Y. Substituting Y = Ŷ + E and multiplying out — first term by first term, first by second, second by first, and second by second — gives

Y′Y = (Ŷ + E)′(Ŷ + E) = Ŷ′Ŷ + Ŷ′E + E′Ŷ + E′E.

Now, we know that Ŷ′E = E′Ŷ = 0: the residuals are orthogonal to the fitted values. This is the same result we proved in multiple regression, so here we simply use it, and later we will also check numerically that E′Ŷ = 0.
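The orthogonality E′Ŷ = 0 that the decomposition relies on can itself be checked numerically. A small sketch with arbitrary made-up data (any response matrix Y and full-rank design X will do, since the property follows from the normal equations, not from the data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up data: any Y and any full-rank X give the same conclusion.
n, m = 40, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
Y = rng.normal(size=(n, m))

B_hat = np.linalg.solve(X.T @ X, X.T @ Y)  # (X'X)^{-1} X'Y
Y_hat = X @ B_hat                          # fitted values
E = Y - Y_hat                              # residual matrix

# Residuals are orthogonal to the fitted values: E'Y_hat = 0 up to rounding,
# because the normal equations force X'E = 0 and Y_hat lies in the span of X.
print(np.allclose(E.T @ Y_hat, 0))  # True
```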
These multiplications are all the same as those we worked through in multiple regression, so here we just use the result. With the cross terms gone, the total sum of squares and cross products splits into the regression (predicted) sum of squares and cross products plus the residual (error) sum of squares and cross products:

Y′Y = Ŷ′Ŷ + E′E.

This is the decomposition: the first term on the right is the regression part and the second is the residual, or error, part. Finally, the error sum of squares and cross products can be written by moving Ŷ′Ŷ to the other side of the equality with a minus sign:

E′E = Y′Y − Ŷ′Ŷ.

And since the estimated value is Ŷ = XB̂, taking its transpose, Ŷ′ = B̂′X′, and substituting gives the final form of the error sum of squares and cross products:

E′E = Y′Y − B̂′X′XB̂.

So the total sum of squares decomposes as: total sum of squares = regression sum of squares + error sum of squares. Further on, we will also check this decomposition numerically.
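The promised numerical check of the decomposition can be sketched as follows, again with arbitrary made-up data; it verifies both Y′Y = Ŷ′Ŷ + E′E and the substituted form E′E = Y′Y − B̂′X′XB̂:

```python
import numpy as np

rng = np.random.default_rng(2)

# Made-up data to verify the SSCP decomposition Y'Y = Y_hat'Y_hat + E'E.
n, m = 60, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
Y = rng.normal(size=(n, m))

B_hat = np.linalg.solve(X.T @ X, X.T @ Y)
Y_hat = X @ B_hat
E = Y - Y_hat

total = Y.T @ Y               # total SSCP matrix, m x m
regression = Y_hat.T @ Y_hat  # regression (predicted) SSCP
residual = E.T @ E            # error SSCP

print(np.allclose(total, regression + residual))                 # True
print(np.allclose(residual, total - B_hat.T @ X.T @ X @ B_hat))  # True
```

Both checks hold to floating-point precision because Ŷ′Ŷ = B̂′X′XB̂ identically and the cross terms E′Ŷ and Ŷ′E vanish by the normal equations.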