[MUSIC]

Great, we've now seen the notion of overfitting in classification, especially in logistic regression. We've seen how decision boundaries get really complicated as we start overfitting, how those parameter values tend to increase, and how we get overconfident about our predictions. We introduced the notion of regularization to try to mitigate the blowing up of those parameters. By doing so, we've seen how we can take the ideas of logistic regression, which so far were more on the theoretical side, and make them ready for real practical work. You should be able to take those ideas and implement them in practice using L2 regularization, and you'll also see how L1 regularization can make sense. We haven't described how to implement that, but it's similar to what you did in the previous problem.

[MUSIC]
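To make the recap concrete, here is a minimal sketch of what an L2-regularized logistic regression fit via gradient ascent might look like, along the lines discussed in this module. The function names (fit_l2_logistic, predict_probability), the step size, and the iteration count are illustrative assumptions, not code from the course.

```python
import numpy as np

def predict_probability(X, w):
    # P(y = +1 | x, w) = 1 / (1 + exp(-w^T h(x)))
    return 1.0 / (1.0 + np.exp(-X @ w))

def fit_l2_logistic(X, y, l2_penalty, step_size=1e-4, n_iter=500):
    """Gradient ascent on the L2-regularized log likelihood.

    X: (n, d) feature matrix, with column 0 assumed to be the intercept.
    y: labels in {+1, -1}.
    l2_penalty: the regularization strength (the lambda from lecture).
    """
    w = np.zeros(X.shape[1])
    indicator = (y == +1).astype(float)  # 1 if y_i = +1, else 0
    for _ in range(n_iter):
        # Gradient of the log likelihood: sum_i h(x_i) * (1[y_i=+1] - P(y=+1|x_i,w))
        errors = indicator - predict_probability(X, w)
        gradient = X.T @ errors
        # Subtract the derivative of the L2 penalty, 2*lambda*w_j,
        # leaving the intercept w[0] unpenalized.
        gradient[1:] -= 2.0 * l2_penalty * w[1:]
        w += step_size * gradient
    return w

# Toy usage: two features plus an intercept column.
X = np.array([[1.0, 0.5, 1.2],
              [1.0, -1.0, 0.3],
              [1.0, 2.0, -0.5],
              [1.0, 0.1, 0.9]])
y = np.array([+1, -1, +1, -1])
w_hat = fit_l2_logistic(X, y, l2_penalty=1.0)
```

For L1 regularization, the implementation is not covered here; one practical route, if you are using scikit-learn, is a solver that handles the non-differentiable penalty, e.g. LogisticRegression(penalty='l1', solver='liblinear').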