Machine Learning 1: Lesson 9

Published on Sep 21, 2018

Today we continue building our logistic regression from scratch, and we add the most important feature to it: regularization. We'll learn about L1 vs L2 regularization, and how they can be implemented.
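As a rough sketch of what that looks like in code (assuming a PyTorch-style logistic regression, not the lesson's exact implementation), the penalty is just an extra term added to the loss: L2 sums the squared weights, L1 sums their absolute values.

import torch
import torch.nn as nn

class LogReg(nn.Module):
    def __init__(self, n_in):
        super().__init__()
        self.lin = nn.Linear(n_in, 1)          # weights + bias

    def forward(self, x):
        return torch.sigmoid(self.lin(x)).squeeze(1)

def regularized_loss(model, preds, targets, alpha=1e-3, kind="l2"):
    # Binary cross-entropy plus a weight penalty:
    #   kind="l2" adds the sum of squared weights (weight decay),
    #   kind="l1" adds the sum of absolute weights (pushes weights toward zero).
    bce = nn.functional.binary_cross_entropy(preds, targets)
    w = model.lin.weight
    penalty = (w ** 2).sum() if kind == "l2" else w.abs().sum()
    return bce + alpha * penalty

# Toy usage with random data (illustrative only):
x = torch.randn(64, 10)
y = (torch.rand(64) > 0.5).float()
model = LogReg(10)
loss = regularized_loss(model, model(x), y, alpha=1e-3, kind="l1")
loss.backward()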

We also talk more about how learning rates work, and how to pick one for your problem.
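To make the learning rate's role concrete, here is a minimal hand-rolled SGD step (a sketch for illustration, not the lesson's code): each parameter moves by -lr times its gradient, so the learning rate directly scales the size of every update. Trying a few values and watching whether the training loss diverges, stalls, or falls steadily is one common way to pick one.

import torch

def sgd_step(params, lr):
    # One SGD update: p <- p - lr * grad(p), then clear the gradients.
    with torch.no_grad():
        for p in params:
            if p.grad is not None:
                p -= lr * p.grad
                p.grad.zero_()

# Hypothetical usage after loss.backward():
#   sgd_step(model.parameters(), lr=0.1)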

In the second half of the lesson, we start our discussion of natural language processing (NLP). We'll build a "bag of words" representation of the popular IMDb text dataset, using sparse matrices to ensure good performance and reasonable memory use.
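A minimal sketch of such a term-document matrix, using scikit-learn's CountVectorizer, which returns a SciPy sparse matrix so the mostly-zero counts stay cheap to store; the two toy reviews below stand in for the real IMDb data.

from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "this movie was great , really great",
    "terrible plot and terrible acting",
]
vectorizer = CountVectorizer()
term_doc = vectorizer.fit_transform(docs)   # sparse matrix, shape (n_docs, n_terms)

print(type(term_doc), term_doc.shape)       # scipy.sparse CSR matrix
print(vectorizer.get_feature_names_out())   # the learned vocabulary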

We'll build a number of models from this, including naive Bayes and logistic regression, and improve these models by adding n-gram features.
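As an illustrative sketch (toy data standing in for IMDb, and scikit-learn estimators rather than the lesson's from-scratch versions), naive Bayes and logistic regression can both be fit on the same sparse counts, and n-grams are added by widening ngram_range.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB

docs = ["great movie", "not great at all", "loved it", "not good"]
labels = [1, 0, 1, 0]

# ngram_range=(1, 2) keeps single words and adds bigrams such as "not great",
# which lets linear models pick up simple negation.
vec = CountVectorizer(ngram_range=(1, 2))
X = vec.fit_transform(docs)

nb = MultinomialNB().fit(X, labels)
lr = LogisticRegression().fit(X, labels)

print(nb.predict(vec.transform(["not great"])))
print(lr.predict(vec.transform(["loved this movie"])))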
