So as we showed in the previous video, the line of best fit, or the regression line (the orange line going through the predicted values), shows the predicted relationship between the independent and dependent variables. You can see that some of the actual results, the blue dots, deviate from the values predicted by our line of best fit. In this video, we will introduce a measure to evaluate how well our model can predict the actual results of the experiment. This measure of how accurately our model predicts reality is known as the R squared value.

R squared is the proportion of the total variation in the dependent variable, which is caffeine content in our example, that can be explained by the linear regression model, or the line of best fit. Because it is a proportion, it takes on a value between 0 and 1. An R squared value of 1 means that our model explains all of the variability around the sample mean, meaning that we know exactly how much the independent variable affects the dependent variable, and there are no error terms. An R squared of 0 means that our model can do no better than the sample mean in predicting the actual results.

You can use the R squared value when analyzing your own experimental data. It's important to remember, though, that it is just a statistic based on your own data, and the R squared value won't tell you whether your experimental results are correct or not. It only gives you a rough idea of how well your model, or in this case the line of best fit, can predict the actual results. If R squared is low, our line of best fit might not live up to its name, and we can probably find a better line or model that will become the line of best fit. You also need to use your own analytical skills to draw your conclusions.
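The idea above can be sketched in code: fit a line by least squares, then compare the residual variation around that line with the total variation around the sample mean. This is a minimal sketch in plain Python; the function name `r_squared` and any data you pass in are illustrative, not from the video.

```python
# Compute R squared for a simple linear regression by hand.
# x is the independent variable, y the dependent variable
# (caffeine content in the video's example).

def r_squared(x, y):
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n

    # Least-squares slope and intercept for the line of best fit.
    slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
             / sum((xi - mean_x) ** 2 for xi in x))
    intercept = mean_y - slope * mean_x

    # Residual sum of squares: variation the line fails to explain.
    ss_res = sum((yi - (intercept + slope * xi)) ** 2
                 for xi, yi in zip(x, y))
    # Total sum of squares: variation around the sample mean.
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)

    # R squared: the proportion of total variation explained by the model.
    # 1 means the line predicts every point exactly; 0 means it does
    # no better than always predicting the sample mean.
    return 1 - ss_res / ss_tot
```

If the points lie exactly on a line, `r_squared` returns 1; as the scatter around the line grows, the value drops toward 0.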