In this video, we will discuss the Decision Tree Classifier. It is also a supervised classifier. A decision tree can also be used for regression, which is why the family of methods is called CART, C-A-R-T: Classification and Regression Trees. So a decision tree can be used as a classifier, and if you take multiple trees and average their outputs you get a regression value, but it works very naturally as a classifier. It is very popular and widely used; many researchers report decision trees because they give good results and are easy to work with. It is one tree, and we will see what the tree looks like: a simple upside-down tree with a root and branches. That single tree is the Decision Tree Classifier. If instead of one tree you construct, say, 10 trees and pick the class by a voting mechanism, an ensemble classifier method, then it is called a forest: the random forest algorithm. So it is good to try random forest as well, but let us first understand the Decision Tree Classifier in this video. Here is a sample decision tree for deciding whether a student attends a class or skips it. So there are two possible decisions: skip the class or attend the class. If you have an exam in that particular class today, you will attend the class with high probability. If there is no exam, you check whether attendance is compulsory in this course. If it is compulsory, the next question is whether you already have the required percentage of attendance, say greater than 70 percent; if so, you can skip the class. The student might think: there is no exam today, and attendance is compulsory in this course, but I already have more than 70 percent, so I can skip the class. If attendance is not compulsory, it is very simple.
So, in this particular course there is no exam today and no attendance requirement. Now the decision depends on whether you are feeling sleepy or not, right? If I feel sleepy, I will skip the class; if I am not feeling sleepy, I can go and attend the class. It is all about the mood of that particular student. The decision tree classifier is famous precisely because this is how humans make decisions. Before you go to some particular place, you decide how you want to go; before buying something, you decide whether it is important, what the cost is, and you compare all these things, right? That decision process can be written as a tree, and that is why it is called a decision tree. Since it is intuitive and matches how we actually decide, it became very popular and it is very easy to explain to others as well. The tree is drawn upside down: the root is at the top and the leaf nodes are at the bottom. A node is the parent of its child nodes, the children of the same parent are siblings, and those children can branch again, until we reach the leaves. Leaves appear only in the last layer, and the root sits at the top. The splits can be binary, like yes or no (skip the class, attend the class), or they can be categorical with more than two branches, for example the percentage of marks divided into 4 or 5 bins. So a branch can be binary, or it can split on a categorical value or a threshold. In summary, a decision tree is an upside-down tree; its branches can be binary or non-binary, its leaf labels can be binary or categorical, each internal node is a parent of its children, the topmost node is the root, and a leaf node has no children and holds the final prediction.
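The attend-or-skip tree described above can be written directly as nested conditions. A minimal Python sketch; the function name and the boolean inputs are my own labels for the lecture's features, not from the lecture itself:

```python
def attend_or_skip(has_exam, attendance_compulsory, attendance_above_70, feeling_sleepy):
    """Walk the sample decision tree from the root down to a leaf."""
    if has_exam:                          # root node: is there an exam today?
        return "attend"
    if attendance_compulsory:             # no exam: is attendance compulsory?
        # already above the required 70 percent, so it is safe to skip
        return "skip" if attendance_above_70 else "attend"
    # no exam and no attendance requirement: mood decides
    return "skip" if feeling_sleepy else "attend"
```

For example, `attend_or_skip(False, True, True, False)` follows the path no exam, attendance compulsory, above 70 percent, and returns `"skip"`, exactly as the student reasons in the lecture.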
So, how do you create a decision tree from given data, for example data collected from students' interactions with a system? The foundational decision tree algorithm, ID3, was developed by Quinlan in 1986, and most later algorithms, such as C4.5, were developed from ID3 or are variants of it. The tree is built top-down using a greedy search: at each node the algorithm picks the locally best split rather than searching every possible tree. Two key questions arise in the decision tree algorithm. First, which feature to choose as the root node: there were four features in the previous example, so how do we decide that "having an exam" should be the root, and more generally which feature to split on at each node? Second, when to stop: what makes a node a final leaf, and should the tree continue with more conditions and more logic, or stop there? These are the two key questions, and we will see them in the next video. But before that, can you list one or two advantages of decision trees compared to the other algorithms you have seen in this class? First, a decision tree is simple and easy to explain to others because, as I said, this is how humans make decisions, so people can relate it to their daily decision making. Second, it can handle both numerical and categorical variables, for example continuous values like marks as well as categorical values like attendance levels (low, mid, high), which means less data preprocessing and no need to scale or normalize the values. It is also a non-linear classifier, unlike a linear regression classifier. And the hierarchical structure of the tree is easy to interpret because the path of the decision making can be traced.
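To preview the first question, which feature becomes the root, ID3 scores each candidate feature by information gain: the entropy of the class labels minus the weighted entropy after splitting on that feature, and picks the feature with the highest score. A minimal stdlib-only sketch; the tiny six-row dataset is invented for illustration:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, feature):
    """Entropy reduction obtained by splitting the data on one feature."""
    gain = entropy(labels)
    n = len(labels)
    for value in set(r[feature] for r in rows):
        subset = [lab for r, lab in zip(rows, labels) if r[feature] == value]
        gain -= len(subset) / n * entropy(subset)
    return gain

# invented toy data: two features, label = attend or skip
rows = [{"exam": "yes", "sleepy": "yes"},
        {"exam": "yes", "sleepy": "no"},
        {"exam": "yes", "sleepy": "yes"},
        {"exam": "no",  "sleepy": "yes"},
        {"exam": "no",  "sleepy": "no"},
        {"exam": "no",  "sleepy": "yes"}]
labels = ["attend", "attend", "attend", "skip", "attend", "skip"]

gains = {f: information_gain(rows, labels, f) for f in ("exam", "sleepy")}
```

On this toy data `gains["exam"]` exceeds `gains["sleepy"]`, so ID3 would place "exam" at the root, matching the intuition in the lecture's example.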
So, how a particular decision has been made can be traced from the root to the leaf, and that reduces ambiguity for the user: when you look at the decision tree visually, you can see how each decision was made. With that, you can provide informed adaptation, informed hints, or informed recommendations to the students. Also, each feature is considered when making a decision, which is both an advantage and a disadvantage: only one feature is considered at a time at each split and the other features are ignored there, but every feature does get considered somewhere in the tree, and that is the advantage. So, in this video we described what a decision tree is, showed one picture of it, and listed the advantages of decision trees. In the next video we will talk about how to create a decision tree. Thank you.
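The traceability advantage can be made concrete: if the tree is stored as a data structure, the root-to-leaf path for any sample can be recorded and shown to the user. A small sketch using a hypothetical nested-tuple encoding of the lecture's sample tree (the encoding and names are mine, not a standard representation):

```python
# internal node = (feature, {answer: subtree}); leaf = class label
TREE = ("exam", {
    "yes": "attend",
    "no": ("compulsory", {
        "yes": ("above_70", {"yes": "skip", "no": "attend"}),
        "no": ("sleepy", {"yes": "skip", "no": "attend"}),
    }),
})

def trace(tree, sample):
    """Return (prediction, root-to-leaf path) for one sample."""
    path = []
    while isinstance(tree, tuple):        # descend until we reach a leaf label
        feature, branches = tree
        answer = sample[feature]
        path.append(f"{feature}={answer}")
        tree = branches[answer]
    return tree, path
```

For instance, `trace(TREE, {"exam": "no", "compulsory": "yes", "above_70": "yes"})` returns the prediction `"skip"` together with the path `["exam=no", "compulsory=yes", "above_70=yes"]`, which is exactly the kind of traced explanation that supports informed hints and recommendations.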