1
00:00:00,000 --> 00:00:04,684
[MUSIC]

2
00:00:04,684 --> 00:00:06,040
Wow, exciting.

3
00:00:06,040 --> 00:00:07,890
We got through our first module.

4
00:00:07,890 --> 00:00:09,920
We got through a representation for

5
00:00:09,920 --> 00:00:14,450
linear classifiers, in much more depth than we did in the first course.

6
00:00:14,450 --> 00:00:17,800
In fact, we had a chance to dive in and

7
00:00:17,800 --> 00:00:21,630
really understand how these models are built, the parameters and the mathematical

8
00:00:21,630 --> 00:00:25,640
foundation associated with them, as well as how to extend them to other settings.

9
00:00:25,640 --> 00:00:31,480
So now we're able to take our learning loop that goes

10
00:00:31,480 --> 00:00:37,120
from training data to creating features h(x) to this logistic regression model.

11
00:00:37,120 --> 00:00:42,360
And define a learning algorithm, which we've only talked about at a high level so far but

12
00:00:42,360 --> 00:00:45,980
which we'll cover in more detail in the next module, to come up with w hat, and

13
00:00:45,980 --> 00:00:47,330
we're going to make predictions from it.

14
00:00:48,700 --> 00:00:53,790
So now you can take everything you've learned and be able to talk about

15
00:00:53,790 --> 00:00:58,460
what a linear classifier looks like, what the logistic regression model is,

16
00:00:58,460 --> 00:01:03,080
how that relates to the general score function, like we have with regression,

17
00:01:03,080 --> 00:01:08,290
but squeezed to the interval [0, 1] to predict probabilities.

18
00:01:08,290 --> 00:01:12,750
How it's affected by the parameters, how to extend it to the multiclass setting, and

19
00:01:12,750 --> 00:01:14,800
how you deal with categorical data.

20
00:01:14,800 --> 00:01:16,390
So, we've done a lot.

21
00:01:16,390 --> 00:01:21,993
We're now ready to think deeply about classifiers.

22
00:01:21,993 --> 00:01:22,493
[MUSIC]
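The prediction step the recap describes — a score w·h(x) squeezed through the sigmoid into the interval (0, 1) — can be sketched as below. This is a minimal illustration, not code from the course; the function names and the example weights w_hat and features h(x) are made up for demonstration.

```python
import math

def sigmoid(score):
    # Squeeze a real-valued score into the interval (0, 1).
    return 1.0 / (1.0 + math.exp(-score))

def predict_proba(w, h_x):
    # Score = w . h(x); the sigmoid turns it into P(y = +1 | x).
    score = sum(wi * hi for wi, hi in zip(w, h_x))
    return sigmoid(score)

# Hypothetical learned coefficients (w hat) and features h(x) for one input:
w_hat = [0.0, 1.0, -1.5]       # intercept weight plus two feature weights
h_x = [1.0, 2.0, 0.5]          # h0(x) = 1 (constant feature), h1(x), h2(x)
p = predict_proba(w_hat, h_x)  # score = 0 + 2 - 0.75 = 1.25
print(round(p, 3))             # probability the label is +1
```

A score of 0 maps to probability 0.5, large positive scores approach 1, and large negative scores approach 0 — which is exactly how the score function from regression gets reused for classification.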