So the dynamics of learning clearly matter. So let us think a little about these dynamics. We have a linear system, so the dynamics will be linear, which is exactly the domain where we can work out good iterations. So let's develop the learning iterations for a simple linear system. Say what we want to learn is y hat equals x, in one dimension. We have two layers: z is w1 x, and y hat, the estimate of y, is w2 z. And we start by initializing the weights very close to zero. What's going to happen? Well, the gradient is also going to start very close to zero, because each weight's gradient is proportional to the other weight. As the weights grow, the gradient grows with them, and learning gets faster and faster. And you can easily see that in a deep system like this, the growth will be exponential. Now, this is well understood in linear systems. So let's try it. Take a simple one-dimensional system, see how it learns, and explore this initial exponential, or exponential-like, growth that we'll see.
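Here is a minimal sketch of the experiment suggested above. It assumes plain gradient descent on squared error, input x = 1, a learning rate of 0.5, and weights initialized at 0.01; these specific values are illustrative choices, not from the lecture.

```python
# Two-layer linear network in 1D: z = w1*x, y_hat = w2*z, target y = x.
# Sketch of the learning dynamics: tiny initial weights -> tiny initial
# gradients -> slow start, then exponential-like acceleration.

def train(steps=100, lr=0.5, w_init=0.01, x=1.0, y=1.0):
    w1, w2 = w_init, w_init
    products = []                  # effective end-to-end weight w1*w2
    for _ in range(steps):
        z = w1 * x
        y_hat = w2 * z
        err = y_hat - y            # d(0.5*(y_hat - y)^2) / d(y_hat)
        # Each weight's gradient is proportional to the OTHER weight,
        # so learning starts very slowly when both are near zero.
        grad_w2 = err * z          # = err * w1 * x
        grad_w1 = err * w2 * x
        w1 -= lr * grad_w1
        w2 -= lr * grad_w2
        products.append(w1 * w2)
    return products

p = train()
# Early steps barely move, then growth accelerates, and the product
# w1*w2 settles at 1, i.e. the target mapping y_hat = x.
print(p[0], p[5], p[10], p[-1])
```

Plotting `products` on a log scale in the early phase would show a roughly straight line, which is the exponential-like growth the lecture asks you to explore.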