Hello everyone, this is Alice Gao. In this video, I will discuss the forward-backward algorithm. Now that we understand forward recursion and backward recursion, let's put them together in the forward-backward algorithm. I'll explain the algorithm using a hidden Markov model with four time steps, but the algorithm works for a hidden Markov model with any finite number of time steps.

The purpose of the forward-backward algorithm is to calculate the smoothed probability at each time step. For our model, we want to calculate these four probabilities. If we treated this model as a generic Bayesian network, we would have to calculate each of the four probabilities separately, for example by using the variable elimination algorithm. This approach is inefficient because there is no way for us to reuse any intermediate calculation results: for every probability, we would have to start the process from scratch. The forward-backward algorithm lets us calculate these probabilities more efficiently, using only two passes through the network. Let's take a look.

First, we perform a forward pass using forward recursion. Start at time zero and go forward in time. At each time step k, calculate the message f sub 0:k and store its value. After the forward pass, we have four stored values, from f sub 0:0 to f sub 0:3. Next, we perform a backward pass using backward recursion. Start at the last time step, time step three, and go backward in time. At each time step k, calculate the message b sub k+1:t-1. Then we can combine the stored message f with the message b to derive the smoothed probability at each time step.

That's everything on the forward-backward algorithm. Let me summarize. After watching this video, you should be able to describe how we can calculate the smoothed probabilities efficiently using the forward-backward algorithm. Thank you very much for watching. I will see you in the next video. Bye for now.
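The two passes described in the video can be sketched in code. Below is a minimal NumPy sketch for a two-state hidden Markov model with four time steps; the transition matrix, emission probabilities, prior, and observation sequence are hypothetical values chosen only for illustration, not taken from the lecture.

```python
import numpy as np

# Hypothetical model parameters (for illustration only).
T = np.array([[0.7, 0.3],        # T[i, j] = P(S_{k+1} = j | S_k = i)
              [0.4, 0.6]])
E = np.array([[0.9, 0.1],        # E[i, o] = P(e = o | S = i)
              [0.2, 0.8]])
prior = np.array([0.5, 0.5])     # P(S_0)
obs = [0, 0, 1, 0]               # one observation per time step, t = 4

t = len(obs)

# Forward pass: f[k] is proportional to P(S_k | e_{0:k}).
# Computed from time 0 forward, and every message is stored.
f = np.zeros((t, 2))
f[0] = prior * E[:, obs[0]]
f[0] /= f[0].sum()
for k in range(1, t):
    f[k] = (f[k - 1] @ T) * E[:, obs[k]]
    f[k] /= f[k].sum()

# Backward pass: b[k] is proportional to P(e_{k+1:t-1} | S_k).
# Computed from the last time step backward; the final message is all ones.
b = np.ones((t, 2))
for k in range(t - 2, -1, -1):
    b[k] = T @ (E[:, obs[k + 1]] * b[k + 1])
    b[k] /= b[k].sum()

# Combine: the smoothed probability P(S_k | e_{0:t-1}) is
# proportional to f[k] * b[k], elementwise.
smoothed = f * b
smoothed /= smoothed.sum(axis=1, keepdims=True)
print(smoothed)
```

Note how each forward message is computed once and stored, so combining it with the backward message at each step reuses work instead of starting from scratch, which is exactly the efficiency gain the video describes.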