Okay, well in summary, we've really learned a lot this module. We've thought about this task of feature selection, and we've described ways of searching, first, over all possible sets of features that we might want to include to come up with the best model. We talked about the computational challenges of that, then we turned to thinking about greedy algorithms, and then we discussed this regularized regression approach, lasso, for addressing the feature selection task. So we've really covered a lot of ground, and these are really important concepts in machine learning. And this lasso regularized regression approach, although really, really simple, has dramatically transformed the fields of machine learning, statistics, and engineering. It's shown its utility in a variety of different applied domains.

But I want to mention a really important issue, which we kind of alluded to, which is that for feature selection, not just with lasso but in general, you have to be really careful about interpreting the features that you selected. Some reasons for this include the fact that the features you selected are always just in the context of what you provided as the set of possible features to choose from to begin with. Likewise, the set of selected features is really sensitive to correlations between features, and in those cases, small changes in the data can lead to different features being included. So to say that one feature is important and another isn't, you have to be careful with statements like that. And of course, the set of selected features also depends on which algorithm you use. We especially saw this when we talked about those greedy algorithms, like the forward stepwise procedure. But I did want to mention that there are some nice theoretical guarantees for lasso under very specific conditions.

So in conclusion, here's a very long list of things that you can do now that you've completed this module. Everything from searching over the discrete set of possible models to do feature selection, using all subsets or these greedy algorithms, to formulating a regularized regression approach, lasso, that implicitly does this feature selection by searching over a continuous space indexed by this tuning parameter lambda. We talked about formulating the objective and the geometric interpretations of why the lasso objective leads to sparsity. And we talked about using coordinate descent as an algorithm for solving lasso. Coordinate descent itself is an algorithm that generalizes well beyond lasso, so that was an important concept that we got out of this module as well. And finally, if you watched the optional video, we talked about some really technical concepts relating to subgradients. To conclude this module, we talked about some of the challenges associated with lasso, as well as some of the potential impact that this method has, because it's really quite an important tool. And like I've mentioned, it's really shown a lot of promise in many different domains.
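Since coordinate descent for lasso came up in the recap, here is a minimal sketch of it in NumPy. It assumes the objective RSS(w) + lambda * ||w||_1 with feature columns normalized to unit 2-norm, as in the lectures; the function and variable names are illustrative and are not taken from the course code.

```python
import numpy as np

def soft_threshold(rho, lam):
    # Closed-form solution for a single coordinate of the objective
    # RSS(w) + lam * ||w||_1 when the feature column has unit 2-norm.
    if rho < -lam / 2.0:
        return rho + lam / 2.0
    elif rho > lam / 2.0:
        return rho - lam / 2.0
    return 0.0

def lasso_coordinate_descent(X, y, lam, num_iters=100):
    # Cycle through the coordinates, optimizing one weight at a time
    # while holding all the others fixed.
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(num_iters):
        for j in range(d):
            # Residual with feature j's current contribution removed.
            partial_residual = y - X @ w + X[:, j] * w[j]
            rho_j = X[:, j] @ partial_residual
            w[j] = soft_threshold(rho_j, lam)
    return w
```

The same cycle-through-one-coordinate-at-a-time structure is what makes coordinate descent useful well beyond lasso; only the per-coordinate update changes.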
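And to make the caution about correlated features concrete, here is a small illustrative experiment. It uses NumPy and scikit-learn, which are assumptions rather than the tools used in this course, and the variable names are hypothetical: two nearly identical features are created, and refitting lasso on slightly different subsets of the data can shift the nonzero weight from one feature to the other.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)       # x2 is nearly a copy of x1
y = 3.0 * x1 + rng.normal(scale=0.1, size=n)

for trial in range(3):
    # Refit on a slightly different random subset of the data each time.
    idx = rng.choice(n, size=80, replace=False)
    X = np.column_stack([x1[idx], x2[idx]])
    model = Lasso(alpha=0.1).fit(X, y[idx])
    print(f"trial {trial}: coefficients = {model.coef_}")

# The nonzero weight can move between x1 and x2 across refits, so claiming
# "x1 is important and x2 is not" from a single fit is risky.
```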