Published on Oct 20, 2018
In this video, let's cover a simple and practical example to understand neural networks.
ERRATA: At 17:06, when I present the code, it shows me using an IDENTITY function rather than ReLU (rectifier), which is inconsistent with the slides. While this technically worked, ReLU was my intended activation function, and I forgot to correct this before recording. I also could have done this without a middle layer, since this toy example is likely linear, but I wanted it to resemble a real-world convolutional network.
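For anyone curious about the difference the errata describes, here is a minimal sketch (not the video's actual code, and using NumPy as an assumption) contrasting the identity activation with ReLU:

```python
import numpy as np

def identity(x):
    # Identity activation: passes values through unchanged,
    # so a network of stacked layers stays purely linear.
    return x

def relu(x):
    # ReLU (rectifier): zeroes out negative values, keeps positives,
    # which is what introduces nonlinearity between layers.
    return np.maximum(0.0, x)

z = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(identity(z))  # values unchanged
print(relu(z))      # negatives clipped to zero
```

On a linearly separable toy problem, both activations can reach a working solution, which is why the identity mix-up "technically worked" in the video.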