So far in this course we've avoided talking about performance or program efficiency, and we've done that quite deliberately. The reason is that it's very easy for programmers to worry too much about efficiency, to worry too soon about efficiency, or, to be honest, to worry just plain incorrectly about efficiency. As a general rule, it's a better idea to design a simple program that's easy to understand and easy to change, and then worry about efficiency later, once the program is running. What you can do then is run the program against a sample data set, measure its performance, and see where the real performance problems are. What you'll find is that you often get surprised. Some of the things you thought would be a performance problem get taken care of automatically by the programming language implementation. Other things you might have thought would be a performance problem turn out not to be. And then, unfortunately, there are some things you didn't think would be performance problems that turn out to be performance problems.

That said, there's one category of performance problem you really need to fix as part of the design, and that's exponential growth in the time it takes your program to run as the data gets larger. Exponential growth in running time is not going to be a good property for your program to have. So what we're going to do in this video is look at how you can use local to deal with an important category of these problems.

I'm back in a version of the file system program, FSV6. In this version the definitions are the same, and I've got this function make-skinny. What make-skinny does is make trees that are very skinny; they're not bushy at all. Each element has just one child, all the way down to the leaf. If I say make-skinny of 2, for example, I get a tree that looks like this: there's an element called X at the top.
It has one child called X, and that has one child called Y, which is the leaf. So it's X, then X, then Y at the bottom. Make-skinny of 10 is ten X's and a Y; make-skinny of 100 is 100 X's and a Y. That's all make-skinny does. And then find is the function we designed before, which just tries to find an element with a given name.

So what I'm going to do now is call find of y on trees of different depths. And I'm using a new BSL primitive here, time. It's a very special thing; it's more like an if than a function, because it evaluates its operand in a special way. What time does is evaluate its operand, see how long that takes, and give us back the time. So what you can see here is that I'm timing how long it takes to find y in a tree 10 deep, a tree 11 deep, 12, 13, 14 and so on.

Okay, so here we go. If I run that, then the times I get (let me scroll this to the top so you can see): the time it takes in a tree that's 10 deep, so y is the 11th element, is 2 milliseconds. These are all in milliseconds. In the next one it's 3 milliseconds, in the next one it's 7, then it's 20, 35, 57. So now what I'm going to do is plot that.

Let's look at this plot. A bad thing is happening here, isn't it? Each time we make the skinny tree one deeper, what's happening to the time it takes for this to run? It's increasing by pretty close to a factor of 2. Let me just add a couple more to see; we'll make that 16 and 17. If we extend the plot, the numbers change a little bit. 'Kay? The numbers change a little bit because it doesn't always compute the exact same time; it turns out that measuring how long a program takes to run is a very complicated thing. But these numbers are a good indication of how long it takes to run. So with this extended plot (let me scroll it a bit so we can see all the data) it's really clear that it seems to be taking about twice as long to run each time we make the tree one deeper. Now that's a serious performance problem.
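The BSL code itself isn't reproduced in this transcript. As an illustration only, here is a rough Python sketch of the same setup; `make_skinny`, `find`, and the `(name, children)` tree representation are assumptions standing in for the video's BSL definitions, not the actual course code.

```python
# A tree element is modeled as a (name, children) pair, where
# children is a list of elements. (This encoding is an assumption;
# the course uses a BSL structure instead.)

def make_skinny(n):
    """Build a chain of n "x" nodes ending in a single "y" leaf."""
    if n == 0:
        return ("y", [])
    return ("x", [make_skinny(n - 1)])

def find(name, elt):
    """Naive find, mirroring the version in the video: it searches a
    child once to check whether the name is there, then searches the
    same child AGAIN to produce the result."""
    node_name, children = elt
    if node_name == name:
        return elt
    for child in children:
        if find(name, child) is not False:  # first search of the child
            return find(name, child)        # second search: recomputes!
    return False
```

Calling `find("y", make_skinny(10))` produces the `("y", [])` leaf; the doubled recursive call inside the loop is exactly what the timings in the transcript are exposing.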
The tree gets deeper by one, and find takes twice as long. That's an exponential performance problem, because as the tree gets deeper and deeper, the running time is basically 2 to the depth of the tree. That's bad.

So why is that happening? Well, here's why it's happening, right here in find. Look at what find does here. It says: go look for n in the first child, and if that doesn't produce false, then that's the value you want, so produce that value. From the perspective of producing the right value, this program is perfectly correct and good and wonderful. But the problem it has is that in a branch where it finds the value it's looking for, it searches that branch twice. So in this particular degenerate case of the tree, where the tree is one long first branch, at the very top we search the next level down twice. And then for each of those, we search the next level down twice, because each time we first do the search to see if the value is there, and when we find out it is there, we search again to produce the result. So we search this level twice, we search this level four times: two times two; each time it's two times more. And there's the exponential growth. So right there, there it is. We can see why it's a factor of two each time.

So how do we fix that? Well, it's really easy to fix with local; local is exactly what we want to use here. Basically, we're calling a function here, and then we're calling it again here, to recompute the same value. What we can do is use local to avoid recomputing that same value. Now watch, here is the systematic way I do it. I find the nearest enclosing expression that contains both computations: here's one, and here's the other. In this case it's this if. I wrap that in a local, I say define, and I think of some name here; I can call it try. And I take this value here.
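The factor-of-two-per-level argument can be checked by counting calls instead of timing them. Here's a hypothetical instrumentation in Python (the global counter and the tree encoding are assumptions for illustration, not course code):

```python
calls = 0  # global counter: how many times find is entered

def make_skinny(n):
    # a chain of n "x" nodes ending in a "y" leaf
    return ("y", []) if n == 0 else ("x", [make_skinny(n - 1)])

def find(name, elt):
    # the naive version: each matching child is searched twice
    global calls
    calls += 1
    node_name, children = elt
    if node_name == name:
        return elt
    for child in children:
        if find(name, child) is not False:  # search once to check
            return find(name, child)        # search again to produce
    return False

def count_calls(depth):
    """Reset the counter, search a skinny tree, report the call count."""
    global calls
    calls = 0
    find("y", make_skinny(depth))
    return calls

print([count_calls(d) for d in (4, 5, 6, 7)])  # each entry roughly doubles
```

On a skinny tree of depth d this makes 2^(d+1) - 1 calls, so each extra level roughly doubles the count, which is the same doubling the video's plot shows in the run times.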
This expression, and I name its result try. So now I'm saying: hey, go do this find, and let's name the result try. Then each place I used to have the expression, I put try. And I need to close that local, so that's going to be right there after the if; I add one more paren. I do a command-I to fix the indentation, and now what this says is: go look in the first child, call the result we get from that try, and now I can look at the value try as many times as I want, in this case potentially twice, without recomputing it. I computed it only once here, I called it try, and then each time I use try, I don't recompute it.

So if we try this version of the program... ha, sorry, pardon the pun. If we run this version of the program and look at the performance numbers we get, look: no exponential growth. So this is a case where we use local to avoid recomputation.

Now, I want to be really clear here. The reason this recomputation matters is that it's recursive; this recomputation leads to exponential growth because of the recursion. Here's a thing I would encourage you not to do. If, for example, you're going to take an argument you consume, add one to it, and use that twice, using local to avoid computing the addition twice isn't worth doing. That's going to make your program harder to read, for little, and perhaps to be honest with you no, performance gain at all, because programming language implementations are good at spotting things like that and doing them for you. But here, because we have a recursive function, if we do the recomputation at each level we get exponential growth, and exponential performance problems are definitely bad; no one's ever going to say that's not a problem. So here it's worth using local to avoid the recomputation. So there you go, that's our second important use of local: we use it to avoid recomputing values.
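In Python terms, the local refactoring amounts to computing the recursive result once and binding it to a name (try in the video; spelled `try_` below because `try` is a Python keyword). This is an illustrative sketch under the same assumed tree encoding, not the course's BSL code:

```python
calls = 0  # counter, so we can compare against the naive version

def make_skinny(n):
    return ("y", []) if n == 0 else ("x", [make_skinny(n - 1)])

def find(name, elt):
    global calls
    calls += 1
    node_name, children = elt
    if node_name == name:
        return elt
    for child in children:
        try_ = find(name, child)  # compute once, name the result (the video's "try")
        if try_ is not False:
            return try_           # reuse the named value: no recomputation
    return False

def count_calls(depth):
    """Reset the counter, search a skinny tree, report the call count."""
    global calls
    calls = 0
    find("y", make_skinny(depth))
    return calls

print([count_calls(d) for d in (4, 5, 6, 7)])  # linear: depth + 1 calls each
```

With the result named and reused, a skinny tree of depth d takes d + 1 calls instead of 2^(d+1) - 1, which matches the flat timing numbers the video reports after the fix.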
Especially in recursive functions, because in those recursive functions the recomputation can lead to exponential performance problems. And the general refactoring is: you find the nearest expression that encloses all of the recomputed values, you wrap a local right around that expression, you give the value some name, like try or something else, and then you use that name in place of each of the original expressions that recomputed the value.