If you take a look at the sample sentences generated after every epoch, you can see the generated sentences get more coherent. Pretty slick, right? If you want to train this further for better results yourself, just construct the model, load the saved weights, and you're good to go. I'll leave a link to this code in the description down below. A few things to take away from this video. Recurrent neural nets are basically a feedforward neural network layer copied at every time step, with the same weights shared across all of those copies. They learn those parameters through truncated backpropagation through time, which is basically backprop applied to the network unrolled over time, cut off after a fixed number of time steps to keep it tractable.
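To make that "copy and pasted layer" idea concrete, here's a minimal sketch in plain NumPy (the sizes and variable names are just illustrative, not from the video's code): the forward pass reuses the exact same weight matrices `W_x` and `W_h` at every time step, which is all an unrolled RNN is.

```python
import numpy as np

# Hypothetical sizes for illustration only.
rng = np.random.default_rng(0)
input_size, hidden_size, seq_len = 4, 8, 5

# ONE set of weights, shared by every time step ("copy" of the layer).
W_x = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_h = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden
b = np.zeros(hidden_size)

def rnn_forward(xs):
    """Unroll the same layer over time: h_t = tanh(W_x x_t + W_h h_{t-1} + b)."""
    h = np.zeros(hidden_size)
    hs = []
    for x in xs:  # each iteration is one "pasted copy" of the layer
        h = np.tanh(W_x @ x + W_h @ h + b)
        hs.append(h)
    return hs

xs = rng.normal(size=(seq_len, input_size))
hs = rnn_forward(xs)
print(len(hs), hs[-1].shape)  # -> 5 (8,)
```

Truncated backpropagation through time would then run this unrolled loop for only a fixed window of steps (say, the last 25) and backprop through just that window, rather than through the whole sequence.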