Hello everyone, this is Alice Gao. This is a short video discussing the question in Lecture 8 on slide 23: how can we represent the XOR function using a three-layer neural network? In the main lecture videos, I already discussed how we can use a perceptron, or a single-layer neural network, to represent simple logical functions like AND, OR, and NOT. It turns out this question is not too difficult: all you need to do is take the XOR function, break it down into an expression that involves only the logical functions we already know how to represent, and then convert that expression into a neural network.

So let's channel our inner logician and try to remember what we learned in a previous logic course, like CS245: how can we represent XOR as a combination of AND, OR, NOT, and other simpler logical functions? We can do the following. XOR of x1 and x2 is equivalent to (x1 OR x2), which covers three of the four cases.
But we need to exclude the case when x1 and x2 are both true, so we AND it with the negation of (x1 AND x2). In other words, XOR(x1, x2) is equivalent to (x1 OR x2) AND NOT (x1 AND x2). By writing this expression, we have broken exclusive or down into an expression involving only AND, OR, and NOT, and we already know how to represent each of those with a perceptron. In my opinion, having this expression is enough for you to figure out the weights in the three-layer neural network. But just so that you have all the information, in case you want to review or verify your understanding later, I'm going to create a new page, draw the entire neural network with all the weights, and also write down a procedure to verify that the network represents XOR.

If you remember from the main lecture videos, you will realize that the solution I'm presenting is not unique: there are many possible weight settings that can represent simple functions like AND and OR, and these are just the ones I came up with. By the way, I forgot to mention that we are still using the step function as the activation function, because it is the simplest choice for demonstrating these concepts.

Next, I'm going to write down the procedure to verify that this network indeed represents the XOR function. Here is an expression for h1, and if we write down the truth table, you will see that h1 represents x1 OR x2. Here is the expression for h2, which turns out to be the negation of the AND function, and then o1, which is a function of h1 and h2 and turns out to be h1 AND h2. Putting everything together, we see that o1 is indeed the exclusive-or function. This procedure shows that we can derive the weights mathematically and verify that they are correct. You can come up with other possible weights that represent the XOR function as well.
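As a quick sanity check, the decomposition into AND, OR, and NOT can be verified with a short truth-table script. This is my own sketch, not part of the lecture materials:

```python
# Verify that XOR(x1, x2) == (x1 OR x2) AND NOT (x1 AND x2)
# by enumerating all four Boolean input combinations.
for x1 in (False, True):
    for x2 in (False, True):
        xor_value = x1 != x2  # exclusive or: true when inputs differ
        decomposed = (x1 or x2) and not (x1 and x2)
        assert xor_value == decomposed
        print(x1, x2, "->", decomposed)
```

The two expressions agree on every row of the truth table, which is exactly the argument made above: OR covers three cases, and the NOT (x1 AND x2) conjunct removes the one case where both inputs are true.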
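The verification procedure for the three-layer network can also be sketched in code. This is a minimal sketch assuming step activations; the specific weights and biases below are one workable choice of mine, not necessarily the ones on the slide, since as noted above the solution is not unique:

```python
def step(z):
    # Step activation: output 1 if the weighted sum reaches 0, else 0.
    return 1 if z >= 0 else 0

def xor_network(x1, x2):
    # Hidden unit h1 computes x1 OR x2 (weights 1, 1; bias -0.5).
    h1 = step(1 * x1 + 1 * x2 - 0.5)
    # Hidden unit h2 computes NOT (x1 AND x2) (weights -1, -1; bias 1.5).
    h2 = step(-1 * x1 - 1 * x2 + 1.5)
    # Output unit o1 computes h1 AND h2 (weights 1, 1; bias -1.5).
    o1 = step(1 * h1 + 1 * h2 - 1.5)
    return o1

# Truth table: the network reproduces XOR on all four inputs.
for x1 in (0, 1):
    for x2 in (0, 1):
        assert xor_network(x1, x2) == (x1 ^ x2)
        print(x1, x2, "->", xor_network(x1, x2))
```

Each unit is a single perceptron, and the output layer simply ANDs the two hidden units, mirroring the expression (x1 OR x2) AND NOT (x1 AND x2).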
This is everything for this video. I will see you in the next one. Bye for now!