And now we're back. So, we computed, or rather we defined, the likelihood function for logistic regression, we computed its derivative, and we talked about a simple gradient ascent algorithm for finding the best parameters. For those who went through that full, kind of slow, mathematical derivation, you saw where that gradient came from, but that was totally optional.

We can go back and summarize where we are right now. In our learning problem, we have some training data, we feed it through some feature extractor that gets us h(x), and we use a logistic regression model that says the probability that it's a positive review is one over one plus e to the minus w transpose h. Then we define the quality metric, which is the likelihood function, and we use gradient ascent to optimize it to get w-hat. And the gradient ascent algorithm is pretty simple and pretty intuitive.

Now, you're able to define what the quality metric is for logistic regression. You can interpret the likelihood function as being, kind of, the probability that you get the training data right, and you're going to maximize that. We talked about the gradient ascent algorithm that does it with really simple updates, and we had this optional section where we derived the gradient ascent algorithm from scratch.

Next module, we're going to take this one step further and explore the idea of regularization and overfitting in logistic regression, which is a really important thing in practice. So that's where we're going to go in the next module. But even at this point, you're ready to implement your own logistic regression algorithm from scratch, which is super exciting.
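To make that concrete, here is a minimal sketch of what such a from-scratch implementation might look like, using NumPy. This is not the course's reference code: the function names (predict_probability, log_likelihood, gradient_ascent) and the fixed step size and iteration count are illustrative choices. It assumes the feature extractor has already produced a matrix whose rows are h(x) for each example (with a constant column serving as the intercept) and that labels take values in {-1, +1}.

```python
import numpy as np

def predict_probability(features, weights):
    """P(y = +1 | x, w) = 1 / (1 + exp(-w^T h(x))), for each row of features."""
    scores = features @ weights
    return 1.0 / (1.0 + np.exp(-scores))

def log_likelihood(features, labels, weights):
    """Log-likelihood of the training data under the current weights.

    For y in {-1, +1}: log P(y_i | x_i, w) = -log(1 + exp(-y_i * w^T h(x_i))).
    """
    scores = features @ weights
    return -np.sum(np.log(1.0 + np.exp(-labels * scores)))

def gradient_ascent(features, labels, step_size=1e-4, max_iter=500):
    """Maximize the log-likelihood with simple gradient ascent updates."""
    weights = np.zeros(features.shape[1])
    indicator = (labels == +1).astype(float)  # 1 if positive review, else 0
    for _ in range(max_iter):
        # Gradient of the log-likelihood:
        #   sum_i h(x_i) * (1[y_i = +1] - P(y = +1 | x_i, w))
        errors = indicator - predict_probability(features, weights)
        gradient = features.T @ errors
        weights += step_size * gradient  # step in the direction of the gradient
    return weights
```

On the sentiment task from the lectures, `features` would be the extracted word counts for each review and `labels` the +1/-1 sentiment; depending on the scale of the features, a smaller step size or more iterations may be needed for the log-likelihood to converge, which you can monitor by printing `log_likelihood` every few updates.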