In this video, we will talk about logistic regression. There are other ways to handle non-linear relationships, such as polynomial regression with terms like x squared and x cubed, but let us focus on logistic regression here. Despite its name, it is actually not a regression for continuous outputs; it is used for classification, such as deciding whether a student will pass the exam or not pass the exam. In linear regression, if you substitute the marks and attendance, you might be able to predict the student's score in the finance course, say 75.5 or 79.6, something like that. But logistic regression is not like that; it actually says the student will pass the exam or will not pass the exam. So it is classification, and we saw the difference between classification and regression in our previous lectures. Basically, logistic regression is built on the sigmoid function. What is the sigmoid function? It is the exponential of t divided by 1 plus the exponential of t, that is, sigma(t) = e^t / (1 + e^t). What is the exponential function? I am only sketching it here to show its shape; it is never exactly 0 (it approaches 0 only as t goes to negative infinity), and then it rises very steeply. We have seen this kind of exponential curve in recent cases; you know what I am talking about. When you plug the exponential function into this particular formula, that is called the sigmoid function. Those of you who know log odds, or the logit, might be able to connect this: the sigmoid is actually the inverse of the logit (log-odds) function. In the logit, or log odds, you consider not only the probability of getting the true value but also the non-true value, which is 1 minus the probability of true.
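The sigmoid function described above can be sketched in a few lines of Python. This is a minimal illustration, not part of any library; the function name is my own choice:

```python
import math

def sigmoid(t: float) -> float:
    """Sigmoid (inverse-logit) function: e^t / (1 + e^t)."""
    return math.exp(t) / (1.0 + math.exp(t))

# The output always stays strictly between 0 and 1:
print(sigmoid(-5))  # close to 0
print(sigmoid(0))   # exactly 0.5
print(sigmoid(5))   # close to 1
```

Notice that no matter how large or small t gets, the output never actually reaches 0 or 1, which is exactly the S-shaped curve drawn in the lecture.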
We do not want to go into detail about the logit or log odds, so do not worry about that. Let us just try to understand what logistic regression is, what the function looks like, and how classification differs from regression; that is enough for this video. Let me compute the sigmoid at t = 0 as an example. Sigmoid of 0 equals e^0 divided by 1 plus e^0, which is 1 divided by (1 + 1), that is 1/2, so 0.5. So at t = 0 the curve sits at 0.5, somewhere in the middle; the curve approaches 0 on the left and a maximum of 1 on the right, though it never actually reaches 1. Now, if I want to put that into the regression setting, the probability of y being yes or no, or the probability of y being pass or fail, is given as P(y) = e^t / (1 + e^t), where t is exactly what we computed in the regression formula: t = b0 + b1*x1 + b2*x2 + ... + bn*xn. You see, this is like the regression y = mx + c: b1 through bn are the weights, x1 through xn are the independent variables, and b0 is the intercept. Given a lot of training data, the model tries to fit this particular equation, and after fitting we can use the equation to predict future data. It is as simple as that. Do not worry about how exactly the fitting happens; let us not go into the training details of logistic regression here. There is a model which fits the given data into this particular equation and produces the answers, namely the weights for x1, x2, and so on, and the intercept b0. That is it. So it is similar to linear regression, where the trained model is the weights (weight 1, weight 2, ..., weight n) and the intercept value. What is the trained model?
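To make the formula above concrete, here is a small worked sketch. The coefficient values b0, b1, b2 and the marks/attendance numbers are purely hypothetical, invented only to show how t and the probability are computed:

```python
import math

# Hypothetical trained coefficients (for illustration only):
b0 = -6.0   # intercept
b1 = 0.05   # weight for marks
b2 = 0.03   # weight for attendance percentage

marks, attendance = 70, 85

# t is the same linear combination used in linear regression:
# t = b0 + b1*x1 + b2*x2
t = b0 + b1 * marks + b2 * attendance

# Passing t through the sigmoid gives a probability of passing
p_pass = math.exp(t) / (1.0 + math.exp(t))
print(round(p_pass, 3))
```

The linear part computes a raw score t; the sigmoid then squashes it into a value between 0 and 1 that can be read as a probability.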
The predicted y equals the intercept plus weight 1 times x1, and so on; the weights and the intercept are our trained model. Similarly, in the logistic equation, you will get weights b0, b1, b2, ..., bn, and you can use those values to predict future cases. For example, compute the weights (coefficients) using the training data and use them to predict future values. If the predicted probability is greater than 0.5, you classify it as yes; if it is less, it will be no. Why 0.5? We saw that the sigmoid function equals 0.5 at t = 0. That boundary can be shifted; that is where the intercept comes into the picture. So cases with probability above 0.5 can be grouped into one class, and those below 0.5 into the other class. Remember when we saw how to compute metrics in machine learning: probabilistic classifiers are widely used, and they predict the probability of an outcome. We can adjust the probability threshold to improve the recall and precision values, and plot the area under the curve (AUC). If you remember those things, that is how this probability is used. So now you know what logistic regression and linear regression are. In linear regression, the plot is linear. In logistic regression, the line is not linear; it is a non-linear (sigmoidal) function. In both models, you give input data and you get a trained model in the form of an intercept and weights (weight 1, weight 2, and so on). But the linear model simply fits a linear equation, while logistic regression fits a sigmoid function. That is one difference. But what are the other differences between linear regression and logistic regression? Can you list them down? After listing them down, resume the video to continue. It is simple: continuous versus categorical data. Linear regression predicts a continuous value; logistic regression predicts a categorical one, yes or no, pass or fail, or even multiple classes, not just two (binomial or more). Do not think logistic regression can only handle two classes.
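The whole train-then-threshold workflow described above can be sketched with scikit-learn. The marks/attendance training data below is invented purely for illustration; the point is that the trained model is nothing more than `coef_` and `intercept_`, and that the 0.5 threshold on the predicted probability is adjustable:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy training data (assumed for illustration): [marks, attendance]
X = np.array([[35, 40], [45, 50], [55, 60], [65, 70],
              [75, 80], [85, 90], [50, 45], [90, 95]])
y = np.array([0, 0, 0, 1, 1, 1, 0, 1])  # 0 = fail, 1 = pass

model = LogisticRegression()
model.fit(X, y)

# The trained model is just the weights and the intercept:
print("weights:", model.coef_, "intercept:", model.intercept_)

# Probability for a new student; column 1 is P(pass)
p = model.predict_proba([[60, 65]])[0, 1]

# The default threshold is 0.5, but it can be moved up or down
# to trade off precision against recall:
threshold = 0.5
print("pass" if p >= threshold else "fail")
```

Raising the threshold makes the classifier more conservative about predicting "pass" (higher precision, lower recall); lowering it does the opposite, which is exactly the knob behind the precision/recall and AUC discussion in the earlier lecture.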
There are multiple ways to do multi-class classification. And linear regression is easy to interpret: it is just a line, and you can say that the weight of a particular variable means that, keeping all other variables constant, y depends on x1 at that particular scale. Logistic regression is also quite easy to interpret, because it likewise gives you the weights, though the interpretation varies based on the function the trained model uses. Still, it is not tough. As for the error-minimisation method: in linear regression we saw least squares (least mean squares), where the difference between the predicted value and the actual value is used to find the best objective function. Logistic regression instead uses the maximum likelihood (ML) method. That is the difference. We are not dealing with the training of logistic regression in this video; the purpose of this course is only to introduce what logistic regression is, and we do not want to go into its details. Again, if you want to know more about logistic regression, please go and watch the Machine Learning lectures by Professor Andrew Ng; the videos are freely available on YouTube and explain it very well. Also remember that much of the data science and data prediction done today uses logistic regression. Although we have very sophisticated software, sophisticated classifiers, and deep learning for better image classification or voice recognition, for basic data classification such as fraud detection, spam and anti-spam filtering, and a lot of other tasks, logistic regression is used. Thanks for watching this video. Thank you.