Hello everyone, this is Alice Gao. In this video, I'm going to discuss the clicker question in lecture 8 on slide 17, which I'm showing you right here. We have a perceptron, and the activation function is the step function. The question asks what the three weights should be so that the perceptron represents an AND function. In the main lecture video, I told you the answer I came up with: -1.5, 1, 1, and there are also other possible answers. Now, let me show you how I derived this answer.

For this question, we are essentially trying to learn a perceptron using a tiny data set. We have a data set of four data points: this is our truth table. The key idea is to realize what the data set looks like and what the perceptron is trying to do. The perceptron is essentially a linear classifier: a linear function that we are trying to come up with to separate the positive and negative examples.

Let me first produce a graph to help you visualize the data set. We have only one positive example, which is at the point (1, 1), so let me use a shaded circle to represent it. The other three examples, (0, 1), (0, 0), and (1, 0), are all negative. So I have three negative examples and one positive example. The perceptron is essentially a line that we are trying to come up with to separate the positive example from the negative examples. As you can see, there is a fairly large gap between the positive and negative examples, so we could come up with many different lines in this gap to separate the two.

To make the derivation easier and the numbers nicer, I'm going to choose one of the nicer lines: the one that goes exactly through the middle of the gap, roughly like this. Now we just need to derive an expression for this line. If you do a little calculation, you will realize that the line crosses the x1-axis at 1.5 and the x2-axis at 1.5, and it is exactly a 45-degree line, so its slope is -1. If you are more used to thinking in terms of y and x rather than x1 and x2, the equation of this line is y = -x + 1.5: the slope is -1 and the y-intercept is 1.5. Converting this to x1 and x2, it becomes x2 = -x1 + 1.5. Let's move everything to the left-hand side, which will make it easier to convert to our perceptron later on: x1 + x2 - 1.5 = 0.

So far, this equation only represents the line. But remember that the perceptron is performing a classification, so it has to choose one side of the line. We need to make sure that the weights we end up with choose the correct side: the side with the positive example, which is the top side right here.
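To see concretely which side of the line each data point falls on, here is a minimal Python sketch (not part of the original lecture) that simply evaluates the left-hand side x1 + x2 - 1.5 at each point of the truth table:

```python
# Evaluate the left-hand side x1 + x2 - 1.5 at each point of the
# AND truth table to see which side of the line each example is on.
points = [(0, 0), (0, 1), (1, 0), (1, 1)]

for x1, x2 in points:
    value = x1 + x2 - 1.5
    side = "positive side" if value > 0 else "negative side"
    print(f"({x1}, {x2}): x1 + x2 - 1.5 = {value:+.1f} -> {side}")

# Only (1, 1) gives a positive value (+0.5); the three negative
# examples give -1.5, -0.5, and -0.5.
```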
So how do I usually figure this out? I pick a positive example and plug it in to see whether the correct value comes out of the left-hand side. Let's try it: the positive example is (1, 1), so plug in x1 = 1 and x2 = 1, which gives 1 + 1 - 1.5 = 0.5. If we apply the step function, 0.5 is a positive number, so the output value would be 1, and that is the correct output value. This means the correct inequality we want is x1 + x2 - 1.5 > 0. There is also the other case: if you plug a positive example into the line and get a negative value, you have to flip the signs of all the coefficients. Hopefully you will encounter an example like that in your own practice so you can try it. For this example, though, this inequality is the correct one, and it correctly selects the side with the positive example.

Finally, we can use this inequality to figure out the weights. W11, the weight for x1, is the coefficient of x1, which is 1. W21, the weight for x2, is 1. W01, the weight for the constant input, is the constant coefficient, which is -1.5.

So this is the entire process I used to derive the weights. By drawing the picture, hopefully you can see that there are many possible solutions; I just chose one of the nicer lines to make the numbers a little nicer.

To summarize, how do we derive the weights for a perceptron to represent a logical function? We start by visualizing the data set. Our goal is to come up with a linear function that separates the positive and negative examples. Once we have such a line, the expression for that line gives us the weights of the perceptron.

That's everything for this video. I will see you in the next one. Bye for now.
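If you want to double-check the derived weights yourself, here is a minimal Python sketch (not from the lecture; the names `step` and `perceptron_and` are just for illustration) that applies the weights W01 = -1.5, W11 = 1, W21 = 1 with a step activation and verifies the AND truth table:

```python
def step(z):
    """Step activation: output 1 if z is positive, otherwise 0."""
    return 1 if z > 0 else 0

def perceptron_and(x1, x2, w0=-1.5, w1=1.0, w2=1.0):
    """Perceptron output for inputs x1, x2; w0 multiplies a fixed input of 1."""
    return step(w0 * 1 + w1 * x1 + w2 * x2)

# Verify against the AND truth table: only (1, 1) should produce 1.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(f"AND({x1}, {x2}) = {perceptron_and(x1, x2)}")
```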