5 p.m., New York City, live. Welcome to my channel, and to this fall 2022 edition of the deep learning course. We are back online, and this time we're going to have an incremental version: it's based on the spring 2021 edition, but I will show you only the videos that are different, so that you don't have to re-watch content you've already seen.

This past year I spent a lot of time writing the textbook, so I have actually started teaching from it a little bit in class. You're going to get a preview of a few parts of what I've written so far. In particular, we're going to look at backpropagation, learning different aspects of this algorithm. We will introduce the energy concepts right from the beginning, when we talk about classification, and so we will see the first instance of contrastive learning. Moreover, we're going to go further in the exploration of generative models, seen through the lens of the energy perspective. We will talk about architectural techniques, for example K-means and undercomplete autoencoders; regularized ones, such as sparse coding, variational autoencoders, and contractive autoencoders; and then, finally, some contrastive techniques, namely the denoising autoencoder and generative adversarial networks. Perhaps we're going to have more content; I don't know yet, because I haven't put everything online, but as we go through this online, virtual edition of the course, I will decide what to include in the class and what not.

Anyway, for all the necessary information, check out the website that you find below every video, and keep up with the latest news through Twitter, as long as it stays online, okay? Enjoy the video, bye-bye.