Welcome to week seven. Today we will talk about more modern, state-of-the-art convnets and transfer learning, two of my favorite topics in deep learning. But first, let's recap what we did last week. We learned how to build a convnet: how to write the neural network itself and how to write the training loop. We learned about convolution layers, max pool layers, and dropout layers, and how to train with them. We learned about data augmentation and regularization and how useful they are. And we learned a bit about the general domains where convnets are useful. So let's take a little time in groups to talk about last week. Discuss with your group: What did you learn last week? What do you still hope to learn? What is unclear? What was most surprising? I hope you'll have a good discussion.
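As a quick reference for the discussion, here is a minimal sketch of what the two core layers from last week compute. It uses plain NumPy rather than a deep learning framework, and the function names (`conv2d`, `max_pool2d`) are just illustrative, not from any particular library:

```python
import numpy as np

def conv2d(image, kernel):
    # "Valid" 2-D cross-correlation, which is what deep learning
    # convolution layers actually compute (no kernel flipping).
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool2d(x, size=2):
    # Non-overlapping max pooling: stride equals the window size.
    h, w = x.shape
    h2, w2 = h // size, w // size
    return x[:h2*size, :w2*size].reshape(h2, size, w2, size).max(axis=(1, 3))

image = np.arange(16.0).reshape(4, 4)       # toy 4x4 "image"
edge_kernel = np.array([[1.0, -1.0]])       # simple horizontal edge detector
feat = conv2d(image, edge_kernel)           # shape (4, 3)
pooled = max_pool2d(feat, size=2)           # shape (2, 1)
```

A real convnet stacks many such layers (with learned kernels, nonlinearities, and dropout in between) and trains them end to end, but the per-layer computation is exactly this kind of sliding window.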