Hello everyone, today we are going to discuss a machine learning algorithm called the Bayesian algorithm. Let us first see the learning outcome of this session: at the end of it, students will be able to apply Bayesian algorithms to solve real-world problems.

First, an introduction to the Bayesian algorithm. It is a supervised learning algorithm in machine learning, a part of artificial intelligence, and it is based entirely on Bayes' theorem. It is a simple algorithm: we take the whole data set and build a model based on Bayes' theorem. The Bayes classifier used here is one of the simplest and most effective classification algorithms, and it helps in building fast machine learning models that can make quick predictions. Bayesian algorithms can be used to solve classification problems in real-world applications, and as the algorithm is exposed to more and more data, it is able to make increasingly accurate predictions.

Who uses these Bayesian algorithms? Data scientists, data analysts, and data science engineers. They use Bayesian algorithms because these algorithms allow them to encode prior beliefs about what the model should look like, independent of what the data states. In a given data set, we express a prior belief as a hypothesis, and based on the observations and the hypothesis we build an accurate model. These algorithms are especially useful when you do not have a massive amount of data to confidently train a model: a Bayesian model requires less data to reach good accuracy, whereas with a massive amount of data the complexity increases.
Especially with this algorithm, we can use a small amount of data to build a confident, accurate model. Bayes' theorem is also known as Bayes' law. It states how to find the probability of a hypothesis given prior knowledge, and it depends on conditional probability, which we will now see with the help of an example.

The formula for Bayes' theorem, for two events A and B, is:

P(A|B) = P(B|A) * P(A) / P(B)

Here P(A|B) is the posterior probability, P(B|A) is the likelihood, P(A) is the prior probability, and P(B) is the marginal probability. These are the terminologies we use in Bayesian algorithms, because they connect the observation and the hypothesis through conditional probability. P(A|B), the posterior, is the probability of hypothesis A given the observed event B. P(B|A), the likelihood, is the probability of the evidence given that the hypothesis is true. P(A), the prior, is the probability of the hypothesis before observing the evidence. P(B), the marginal, is the probability of the evidence occurring. These are the parameters we are going to apply to a real-world problem.

Let us see a real-world problem based on the Bayesian algorithm. Here we have a data set of weather conditions, and we have an outcome variable called Play. Based on the weather conditions, the algorithm suggests whether play should happen or not. For this we have designed the problem: if the weather is sunny, should the player play or not?
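Bayes' theorem as stated above can be sketched directly in Python. This is a minimal illustration; the function name and the sample numbers here are hypothetical, not from the lecture's data set:

```python
def posterior(likelihood, prior, marginal):
    # Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
    return likelihood * prior / marginal

# Hypothetical values: P(B|A) = 0.5, P(A) = 0.2, P(B) = 0.25.
print(posterior(0.5, 0.2, 0.25))  # 0.4
```

The same three-quantity pattern (likelihood, prior, marginal) is exactly what we will plug the weather-table values into below.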
To solve this problem we follow two steps. First, convert the given data set into a frequency table. Second, generate a likelihood table by finding the probabilities of the given features. Here we create two tables: one for the probability of the occurring events and one for the probability of the hypothesis, before and after observation.

Let us see the data set first. It contains 14 events, numbered 0 to 13. For each event we have a weather Outlook, which is Rainy, Sunny, or Overcast, and a Play outcome, which is Yes or No. The observations are given in the Play attribute: if the weather is Rainy, what is the probability that Play is Yes? If the weather is Sunny, what is the probability that Play is Yes? The data is given in this way in the original data set.

From this data set we generate the first table, the frequency table for the weather conditions. For each value of the Weather attribute we count the number of Yes outcomes and the number of No outcomes:

Weather   | Yes | No
Overcast  |  5  |  0
Rainy     |  2  |  2
Sunny     |  3  |  2
Total     | 10  |  4

The second table is the likelihood table for the weather conditions. Here we take the same counts and divide, both horizontally and vertically, by the number of events. Vertically, the count of No is 4, and 4/14 = 0.29; the count of Yes is 10, and 10/14 = 0.71. What about the horizontal values?
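The two table-building steps above can be sketched in Python. The data set here is a hypothetical reconstruction that matches only the counts described in the lecture (5/0 Overcast, 2/2 Rainy, 3/2 Sunny), not the original 14 rows:

```python
from collections import Counter

# Hypothetical 14-event data set matching the lecture's counts.
data = (
    [("Overcast", "Yes")] * 5
    + [("Rainy", "Yes")] * 2 + [("Rainy", "No")] * 2
    + [("Sunny", "Yes")] * 3 + [("Sunny", "No")] * 2
)

# Step 1: frequency table of (weather, play) pairs.
freq = Counter(data)

# Step 2: likelihood-table entries, dividing counts by the total.
total = len(data)  # 14 events
p_yes = sum(v for (w, p), v in freq.items() if p == "Yes") / total    # 10/14
p_sunny = sum(v for (w, p), v in freq.items() if w == "Sunny") / total  # 5/14

print(freq[("Sunny", "Yes")], total)  # 3 14
```

Every number in the lecture's likelihood table is just one of these counts divided by 14.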
Horizontally, if you add the counts: Overcast is 5/14 = 0.35, Rainy is 4/14 = 0.29, and Sunny is 5/14 = 0.35. These are the values we get from the frequency and likelihood tables.

Now, after getting these values, how do we apply the Bayesian algorithm to them? We use the formula:

P(Yes|Sunny) = P(Sunny|Yes) * P(Yes) / P(Sunny)

That is, the probability of the hypothesis (Yes) given the observation (Sunny) equals the probability of the observation given the hypothesis, times the prior probability of the hypothesis, divided by the marginal probability of the observation. From the tables in the previous slide, P(Sunny|Yes) = 3/10 = 0.3, P(Sunny) = 0.35, and P(Yes) = 0.71. So P(Yes|Sunny) = (0.3 * 0.71) / 0.35 = 0.60.

Second, we calculate the probability of No given Sunny, that is, the probability of the observation given the No hypothesis, times the prior of No, divided by the marginal of Sunny:

P(No|Sunny) = P(Sunny|No) * P(No) / P(Sunny)

Here P(Sunny|No) = 2/4 = 0.5, P(No) = 0.29, and P(Sunny) = 0.35. So P(No|Sunny) = (0.5 * 0.29) / 0.35 = 0.41.

For the decision, we check whether P(Yes|Sunny) is greater than P(No|Sunny). Here we calculated P(Yes|Sunny) = 0.60 and P(No|Sunny) = 0.41, and of course 0.60 is greater than 0.41, so we have an adequate solution: on a sunny day, the player can play the game. This is the solution we got for our problem statement. After discussing all these things, there is a small question based on the Bayesian algorithm.
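The whole Sunny-day calculation above can be sketched end to end in Python using exact fractions from the frequency table (note that with exact fractions the two posteriors come out to 0.6 and 0.4 and sum to 1; the lecture's 0.41 comes from using the rounded intermediates 0.5, 0.29, and 0.35, and the decision is the same either way):

```python
# Counts from the frequency table: 10 Yes, 4 No, 14 events, 5 Sunny days.
n_yes, n_no, n_total = 10, 4, 14
sunny_yes, sunny_no, sunny_total = 3, 2, 5

p_yes = n_yes / n_total            # prior P(Yes)      = 10/14
p_no = n_no / n_total              # prior P(No)       = 4/14
p_sunny = sunny_total / n_total    # marginal P(Sunny) = 5/14

p_sunny_given_yes = sunny_yes / n_yes  # likelihood P(Sunny|Yes) = 0.3
p_sunny_given_no = sunny_no / n_no     # likelihood P(Sunny|No)  = 0.5

# Bayes' theorem for each hypothesis.
p_yes_given_sunny = p_sunny_given_yes * p_yes / p_sunny
p_no_given_sunny = p_sunny_given_no * p_no / p_sunny

# Decision: pick the hypothesis with the larger posterior.
decision = "play" if p_yes_given_sunny > p_no_given_sunny else "not play"
print(round(p_yes_given_sunny, 2), round(p_no_given_sunny, 2), decision)
# 0.6 0.4 play
```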
The question is: the prior probabilities in Bayes' theorem that are changed with the help of newly available information are classified as which of the following? There are four options: (a) independent probabilities, (b) posterior probabilities, (c) interior probabilities, (d) dependent probabilities. Think about this question and give an answer. Yes, the answer is (b), posterior probabilities, because a posterior probability is exactly a prior probability updated with the help of newly available information; this is not the case for independent probabilities. So posterior probabilities is the answer. This is the reference from which I have taken the information regarding this Bayesian algorithm. Thank you.