Hello everyone, this is Alice Gao. In this video, I will continue constructing a hidden Markov model for the umbrella story. In particular, I will discuss how to construct the transition model.

Let's construct the transition model for the umbrella story. We need to ask the following question: how does the state change from one day to the next? How does the state today depend on the states in the past? We're reasoning about events over time, and there's one state for every time step. In general, the current state may depend on all the past states. Mathematically, we can express this as a conditional probability distribution, P(S_t | S_0, ..., S_{t-1}).

Unfortunately, this model has a significant problem. As we advance in time, the size of this conditional distribution grows. If we're modeling an arbitrary time step in the future, the conditional distribution will be unboundedly large. An unbounded distribution is problematic because we have to store this table somewhere to perform inference.

Let's solve this problem by changing our assumption. Instead of assuming that each state depends on all the past states, we will assume that each state depends on a fixed number of past states. Using this new assumption, we can define a k-order Markov chain, in which each state depends on the k previous states.

The simplest case is the first-order Markov process, where each state depends on the previous state only. Mathematically, the transition probability simplifies to the conditional probability of the current state given the previous state: P(S_t | S_0, ..., S_{t-1}) = P(S_t | S_{t-1}). Graphically, this model is a single chain.

In some cases, we may not be happy that each state depends only on the previous state. Perhaps the two previous states both carry useful information for determining the current state. In that case, we can define a second-order Markov process, in which each state depends on the two previous states. For example, S_t depends on S_{t-1} and S_{t-2}.
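To make the first-order transition model concrete, here is a minimal sketch of a conditional probability table P(S_t | S_{t-1}) for a rain/no-rain state, together with a function that samples the next state from the previous one alone. The 0.7/0.3 probabilities are illustrative assumptions, not values from the lecture.

```python
import random

# First-order transition model for the umbrella story, written as a
# conditional probability table P(S_t | S_{t-1}) over a rain/no-rain state.
# The 0.7/0.3 numbers are illustrative assumptions, not values from the video.
TRANSITION = {
    True:  {True: 0.7, False: 0.3},   # row P(S_t | S_{t-1} = rain)
    False: {True: 0.3, False: 0.7},   # row P(S_t | S_{t-1} = no rain)
}

def sample_next_state(prev_state, rng=random):
    """Sample s_t using only s_{t-1} -- the first-order Markov assumption."""
    return rng.random() < TRANSITION[prev_state][True]

# Each row of the table must sum to 1 to be a valid distribution.
for row in TRANSITION.values():
    assert abs(sum(row.values()) - 1.0) < 1e-9
```

Note that the function's signature takes only the previous state; a second-order model would need both `s_{t-1}` and `s_{t-2}` as inputs, and its table would have one row per pair of past states.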
We can generalize this to any fixed value of k: in a k-order Markov chain, each state depends on the previous k states.

Let's model our umbrella story as a first-order Markov process. This model makes a key assumption called the Markov assumption. The Markov assumption says that the current state has sufficient information to determine the next state; we do not have to look at older states in the past. I've always remembered this assumption using this sentence: the future is independent of the past given the present. Would we want to live in a world with the Markov assumption? I certainly would. Living in a Markovian world is wonderful because our slate gets wiped clean every day. Every day is a new beginning and we can start fresh. Forget about the past; we need to seize the moment and do the best we can for today. What happens today determines what will happen tomorrow.

Given the Markov assumption, how many conditional probability tables do we need to specify the transition model? In general, the transition probabilities at each time step may be different, so we potentially need a separate table for each time step. To simplify our model, we can choose to make it stationary. A stationary process doesn't mean the world does not change over time; the world still changes from one time step to the next. The word "stationary" means that how the world changes remains fixed. In other words, the transition probabilities are the same for every time step.

There are several advantages to using a stationary model. First, it's simple to specify: we can write down one conditional probability table and use it for every time step. In other words, a stationary model allows us to use a finite number of parameters to define an infinite network. Our umbrella story can run for an unlimited number of time steps, but we can model it using a finite number of transition probabilities. Second, a stationary model is a natural choice. When we're modeling real things, the dynamics often do not change.
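The "finite parameters, infinite network" point can be sketched in a few lines: the same transition probabilities are reused at every time step, so a chain of any length is generated from just two numbers. The 0.7/0.3 values are illustrative assumptions.

```python
import random

# Stationarity: the SAME transition probabilities are reused at every time
# step, so a chain of any length is defined by finitely many parameters.
# The 0.7/0.3 values are illustrative assumptions.
P_RAIN_GIVEN = {True: 0.7, False: 0.3}   # P(Rain_t = true | Rain_{t-1})

def simulate(initial_rain, n_steps, rng=None):
    """Run the umbrella chain for n_steps, reusing one table throughout."""
    rng = rng or random.Random()
    states = [initial_rain]
    for _ in range(n_steps):
        # The same two parameters govern every transition -- stationarity.
        states.append(rng.random() < P_RAIN_GIVEN[states[-1]])
    return states

path = simulate(initial_rain=True, n_steps=1000, rng=random.Random(0))
print(len(path))  # 1001 states generated from just two parameters
```

A non-stationary model would instead need a separate `P_RAIN_GIVEN` table for every one of the 1000 steps.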
You might be wondering: what if I encounter a situation where the dynamics change? In that case, there's probably another feature causing the dynamics to change. If we model that feature explicitly, then the dynamics in the new model become fixed again.

Here is a partial model for the umbrella story. The states form a first-order Markov chain. The transition model uses a single conditional probability table for every time step. We also have a prior distribution for the state at time zero. That's everything on the transition model for the umbrella story.

Let me summarize. After watching this video, you should be able to do the following: define a k-order Markov chain; state the Markov assumption and describe some intuitions about it; define a stationary process; and describe some advantages of choosing a stationary model.

Thank you very much for watching. I will see you in the next video. Bye for now.