[MUSIC] I want to talk about this notion of overfitting, because it's something we've touched on before in the course. I want to formalize it here, and we're going to discuss it a lot more in the remainder of the course. So, suppose you have some model with estimated parameters w hat, meaning the model has some complexity and some associated estimated parameters. We say this model is overfit if there exists another model, with estimated parameters I'll call w prime, so some other point here, such that two conditions hold. One, the training error of w hat is less than the training error of w prime. Two, the true error of w hat is greater than the true error of w prime.

This might not seem that intuitive, so let me go through it in terms of the picture here, which shows exactly what conditions one and two are saying. There is a wide range of models whose true error is larger than, for example, the true error of this w prime. But the overfit ones are the ones that also have smaller training error: they are fit very closely to the training data set but don't generalize well. The points in the other half of this space are models that are not well fit to the training data and also don't generalize well. So this is, formally, our notion of what an overfit model is. [MUSIC]
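As a minimal sketch of this definition, not part of the lecture, the snippet below fits two polynomial models to the same small noisy training set and approximates the true error with a large held-out sample drawn from the same distribution. The function names, degrees, noise level, and random seed are all illustrative assumptions; with a flexible enough high-degree fit, it will typically satisfy both conditions relative to the simpler fit, though the exact numbers depend on the random draw.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_true(x):
    # Underlying function generating the data (an assumption for this sketch).
    return np.sin(2 * np.pi * x)

def make_data(n):
    x = rng.uniform(0, 1, n)
    y = f_true(x) + rng.normal(0, 0.3, n)  # noisy observations
    return x, y

def fit_poly(x, y, degree):
    # Least-squares polynomial fit; returns the estimated coefficients.
    return np.polyfit(x, y, degree)

def mean_squared_error(w, x, y):
    return np.mean((np.polyval(w, x) - y) ** 2)

# Small training set; large held-out set standing in for the true error.
x_train, y_train = make_data(15)
x_test, y_test = make_data(100_000)

w_hat = fit_poly(x_train, y_train, degree=9)    # flexible, high-complexity model
w_prime = fit_poly(x_train, y_train, degree=3)  # simpler alternative model

train_hat = mean_squared_error(w_hat, x_train, y_train)
train_prime = mean_squared_error(w_prime, x_train, y_train)
true_hat = mean_squared_error(w_hat, x_test, y_test)
true_prime = mean_squared_error(w_prime, x_test, y_test)

# Both conditions from the definition:
# (1) lower training error than w_prime, (2) higher true error than w_prime.
is_overfit = (train_hat < train_prime) and (true_hat > true_prime)
print(f"training error: w_hat={train_hat:.4f}, w_prime={train_prime:.4f}")
print(f"approx. true error: w_hat={true_hat:.4f}, w_prime={true_prime:.4f}")
print("w_hat is overfit relative to w_prime:", is_overfit)
```

Note that the true error is never available in practice; the large held-out sample here is only a stand-in used to illustrate the two conditions in the definition.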