Hello everyone, this is Alice Gao. In this video, I will discuss some common inference tasks for a hidden Markov model. There are four common inference tasks for a hidden Markov model: filtering, prediction, smoothing, and the most likely explanation.

Filtering cares about what's happening today. Given observations until today, what is the probability that I am in a particular state today? Mathematically, given observations from day 0 to day t, we want to calculate the posterior distribution over the state on day t. I'm using the notation o_{0:t} (small o, sub 0 colon t) to represent the sequence of observations from day 0 to day t. This is equivalent to o_0 and o_1 and dot dot dot o_{t-1} and o_t. As an example, we may want to estimate the probability of being in a state on day 9 given the observations from days 0 to 9.

Prediction cares about a future state. Given the observations until today, what is the probability that I am in a particular state on a day in the future? Mathematically, given the observations from day 0 to day t, we want to calculate the posterior distribution over the state on day k, where k is greater than t. For example, we may want to estimate the probability of being in a state on day 15 given the observations from days 0 to 9.

Smoothing cares about a day in the past. Given the observations until today, what is the probability that I was in a particular state on a day in the past? Mathematically, given the observations from day 0 to day t, we want to calculate the posterior distribution over the state on day k, where k is at least 0 and less than t. For example, we may want to estimate the probability of being in a state on day 5 given the observations from days 0 to 9.

Finally, for the most likely explanation, we ask the following question: which sequence of states is the most likely one given our observations until today?

Among the four tasks, smoothing might be the most unintuitive one.
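As a concrete illustration of the filtering query P(S_t | o_{0:t}), here is a minimal sketch of the standard forward recursion. The two-state model below (the prior, the transition matrix T, the emission matrix E, and the observation sequence) is invented for this example and is not taken from the video:

```python
import numpy as np

# Hypothetical two-state, two-observation model for illustration only.
prior = np.array([0.5, 0.5])        # P(S_0)
T = np.array([[0.7, 0.3],
              [0.3, 0.7]])          # T[i, j] = P(S_{t+1} = j | S_t = i)
E = np.array([[0.9, 0.1],
              [0.2, 0.8]])          # E[i, k] = P(O_t = k | S_t = i)

def filter_states(observations):
    """Return the filtering posterior P(S_t | o_{0:t}) for the last day t."""
    # Condition the prior on the first observation.
    f = prior * E[:, observations[0]]
    f /= f.sum()
    # Forward recursion: predict one step ahead, then condition on the
    # new observation, renormalizing to keep a proper distribution.
    for o in observations[1:]:
        f = (T.T @ f) * E[:, o]
        f /= f.sum()
    return f

posterior = filter_states([0, 0, 1])
print(posterior)  # a distribution over the two hidden states
```

After two observations of symbol 0 followed by one of symbol 1, the posterior shifts toward the state that emits symbol 1 with high probability.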
You might be wondering, why do we want to perform smoothing at all? As we progress in time, we could perform filtering at every step and derive an estimate for the state at every time step. Isn't this sufficient? Pause the video and think about this for a few seconds. Then keep watching.

The main reason for performing smoothing is that making new observations can give us more information about past states. Although we performed filtering at every past time step, we have made new observations since then, so those estimates are no longer accurate given the new information. We should update our estimates for all the past states given our new observations.

Next, let me discuss the algorithms for performing inference. First, remember that a hidden Markov model is still a Bayesian network. Therefore, we can perform all four inference tasks using the variable elimination algorithm. However, a hidden Markov model is a special type of Bayesian network, and its particular structure allows us to come up with specialized algorithms for performing inference. For hidden Markov models, these specialized algorithms are more efficient than the variable elimination algorithm. There are two main algorithms: we can use the forward-backward algorithm to perform filtering and smoothing, and we can use the Viterbi algorithm to derive the most likely explanation.

That's everything on the inference tasks. Let me summarize. After watching this video, you should be able to do the following: describe the four common inference tasks for a hidden Markov model, explain why we need to perform smoothing in addition to filtering, and name the two algorithms for performing inference in a hidden Markov model.

Thank you very much for watching. I will see you in the next video. Bye for now.
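To make the most likely explanation task concrete, here is a minimal sketch of the Viterbi algorithm. The two-state model (prior, transition matrix T, emission matrix E) is made up for illustration; only the recursion itself is the standard algorithm named in the video:

```python
import numpy as np

# Hypothetical two-state, two-observation model for illustration only.
prior = np.array([0.5, 0.5])        # P(S_0)
T = np.array([[0.7, 0.3],
              [0.3, 0.7]])          # T[i, j] = P(S_{t+1} = j | S_t = i)
E = np.array([[0.9, 0.1],
              [0.2, 0.8]])          # E[i, k] = P(O_t = k | S_t = i)

def viterbi(observations):
    """Return the most likely hidden state sequence for the observations."""
    steps, n = len(observations), len(prior)
    # m[t, j]: probability of the best state sequence ending in state j at time t
    m = np.zeros((steps, n))
    back = np.zeros((steps, n), dtype=int)  # backpointers to best predecessors
    m[0] = prior * E[:, observations[0]]
    for t in range(1, steps):
        # scores[i, j]: best path to i at t-1, extended by the transition i -> j
        scores = m[t - 1][:, None] * T
        back[t] = scores.argmax(axis=0)
        m[t] = scores.max(axis=0) * E[:, observations[t]]
    # Recover the sequence by following backpointers from the best final state.
    path = [int(m[-1].argmax())]
    for t in range(steps - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

print(viterbi([0, 0, 1, 1]))  # → [0, 0, 1, 1]
```

Note that the most likely sequence is found jointly; it is not, in general, the same as picking the individually most likely state at each step from the smoothing posteriors.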