Students, in the previous module we performed the multiple linear regression (MLR) analysis in SPSS. Now we are going to check the assumptions that must be fulfilled before interpreting the results. Let's check the assumptions one by one. The first assumption we are going to talk about is multicollinearity. Multicollinearity is the possibility of a strong linear relationship between the independent variables. If two IVs are strongly correlated, it becomes difficult to tell how much unique variance each one explains in the dependent variable, so it is very important to check this before interpreting the results of MLR.

There are various ways to check it. The first way is through the correlation table, which SPSS produced because you ticked Descriptives in the Options dialog. The table you are looking at shows the Pearson correlations among the dependent variable (DV) and the IVs: making new social ties, maintaining existing social ties, seeking and sharing information, recreation or entertainment, self-documentation, self-expression, and personal information disclosure. These are all motivations to use social media, and the DV is online bridging social capital; this is what our multiple linear regression model has to explain. The first row of the table shows the relationships of the IVs with the dependent variable, so we will not look at that; we will look at the correlations among the IVs below it: 0.208, 0.436, 0.348, 0.434, 0.413, 0.236, and so on, all in the range of about 0.2 to 0.5.
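The correlation check described above can also be sketched outside SPSS. Here is a minimal Python example on made-up data; the variable names are stand-ins for the motivation scales, not the lecture's actual dataset:

```python
import numpy as np

# Hypothetical data: rows = respondents, columns = independent variables
# (stand-ins for four of the social media motivation scales).
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 4))
iv_names = ["new_ties", "existing_ties", "info_seeking", "recreation"]

# Pearson correlation matrix among the IVs.
r = np.corrcoef(X, rowvar=False)

# Flag any IV pair whose |r| reaches the 0.80 rule-of-thumb cutoff.
flagged = [(iv_names[i], iv_names[j], round(r[i, j], 3))
           for i in range(len(iv_names))
           for j in range(i + 1, len(iv_names))
           if abs(r[i, j]) >= 0.80]

print(np.round(r, 3))
print("Pairs suggesting multicollinearity:", flagged)
```

With independently generated data like this, no pair should be flagged; in a real dataset you would pass your own IV columns in place of `X`.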
So here, no correlation between IVs reaches 0.80, or even 0.70 or 0.60. This means there is no sign of multicollinearity, because the rule of thumb says that if the correlation coefficient between your IVs is 0.80 or above, multicollinearity is likely. In our dataset, this assumption is fulfilled: there is no multicollinearity here.

The second way of checking collinearity in multiple linear regression is to look at the tolerance value of every IV in the coefficients table of the MLR output (we will discuss the unstandardized and standardized coefficients later). All of the tolerance values here are greater than 0.50, and the rule of thumb says that a tolerance below 0.10 indicates a strong possibility of multicollinearity. Since our values are well above that cutoff, this test also tells us that the independent variables in our dataset are not multicollinear.

Next to tolerance there is another value, labelled VIF. This is the variance inflation factor for each independent variable, and tolerance is simply 1/VIF. In our output the VIF values lie between 1 and 2, and the rule of thumb says that a VIF of 10 or greater suggests collinearity. So by this parameter too, there is no multicollinearity in our dataset.

After this, our last check is the Durbin-Watson test, which checks for autocorrelation in the residuals (error terms) of the regression. Its value here is 1.92, which is very close to 2. A value of 2, or close to it, is the ideal value and indicates no autocorrelation; a value greater than 3 or smaller than 1 indicates autocorrelation. Since our model's value is 1.92, there is no autocorrelation in our dataset.
So by reviewing all these statistical parameters, we conclude that in our analysis there is no indication of multicollinearity between the independent variables.