So we talked about polynomial regression, where you have different powers of your input. And we also talked about seasonality, where you have these sine and cosine bases. But we can really think about any function of our single input.

So let's write down our model a little bit more generically, in terms of some set of features. And I'm gonna denote each one of my features with this function h. So h_0 is gonna be my first feature, h_1 my second feature, and h_D, with capital D, my last feature. We can more compactly represent this model using the sigma notation that we introduced previously, where we put an index j going up to capital D, saying we're summing over each of these features. And just to be very clear, h_j(x) is our jth feature, and w_j is the regression coefficient, or weight, associated with that feature.

So just to give some examples that we've gone through: this first feature might just be 1, the constant feature that we've used in all of the past examples. Our second feature, h_1, might just be our linear term, x. Our third feature might be x squared, or maybe it's our sine basis.
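As a minimal sketch of this generic model, we can write each feature h_j as a Python function and the prediction as the weighted sum over those features. The particular feature choices below (constant, linear, squared, sine) are just the illustrative examples from the lecture, not a fixed recipe:

```python
import math

# Illustrative feature functions h_0, ..., h_3 (choices are assumptions,
# mirroring the examples in the lecture).
features = [
    lambda x: 1.0,          # h_0: constant feature
    lambda x: x,            # h_1: linear term
    lambda x: x ** 2,       # h_2: quadratic term
    lambda x: math.sin(x),  # h_3: a sine basis function
]

def predict(w, x):
    """Generic regression model: y_hat = sum over j of w_j * h_j(x)."""
    return sum(w_j * h_j(x) for w_j, h_j in zip(w, features))
```

Note that however nonlinear the features are in x, the model is still linear in the weights w, which is what keeps it a linear regression problem.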
Or we could think of lots of other feature examples. And when we get to our capital Dth feature, maybe it's just our input raised to the pth power, when we're thinking about polynomial regression.

So, going back to our regression flow chart, or block diagram here, we kinda swept something under the rug before. We never really highlighted this blue feature extraction box; we just said the output of it was x. Now that we've learned a little bit more about regression and this notion of features, the output of this feature extraction is really not x but h(x): our features of our input x. So x is really the input to our feature extractor, and the output is some set of functions of x. So for the remainder of this course, we're gonna assume that the output of this feature extraction box is h(x).
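The feature extraction box can be sketched as a function that maps a raw input x to the vector h(x), after which fitting the weights is ordinary least squares on the resulting feature matrix. This is only an illustration under assumed choices (NumPy for the solve, a hypothetical quadratic feature set, synthetic noise-free data):

```python
import numpy as np

# Hypothetical feature set for illustration: h_0(x)=1, h_1(x)=x, h_2(x)=x**2.
feature_funcs = [lambda x: 1.0, lambda x: x, lambda x: x ** 2]

def extract_features(x):
    """The feature extraction box: input x, output h(x) = [h_0(x), ..., h_D(x)]."""
    return np.array([h(x) for h in feature_funcs])

# Because the model is linear in the weights, we can fit it with ordinary
# least squares on the feature matrix H, whose ith row is h(x_i).
x_train = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y_train = 2.0 + 3.0 * x_train - 0.5 * x_train ** 2  # synthetic, noise-free data
H = np.vstack([extract_features(x) for x in x_train])
w, *_ = np.linalg.lstsq(H, y_train, rcond=None)
# w should recover roughly [2.0, 3.0, -0.5] on this clean data
```

Swapping in a different `feature_funcs` list (sines and cosines for seasonality, higher powers for polynomial regression) changes the model without touching the fitting code, which is the point of making the feature extractor explicit.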