Okay, so good morning again. Discrete Optimization: LDS and LNS. In this lecture you're going to see two really amazing things. They're really useful in practice, and they're also truly beautiful ideas. One is a strategy for exploring a search tree; the other is a hybrid optimization technique. It's a very short lecture, but the two ideas you'll see here are ones you will be able to reuse all over the place.

Essentially, what I'm going to talk about today is searching. The first topic is limited discrepancy search, LDS, which gives you another way of searching a tree. So far we have mostly seen depth-first search and best-first search. This is a very different way of exploring the tree: the order in which you visit the leaves is going to be very different, and it's based on a very clever idea. The second topic is called large neighborhood search. It's a hybridization of constraint programming and local search, and it's very effective for solving very large-scale problems.

So let's first talk about limited discrepancy search. The key assumption is that you have a good heuristic for your problem: a heuristic that typically gives you a good solution. Maybe not the optimal, maybe not even close to the optimal, but a heuristic that typically makes very few mistakes. We are also going to assume that the search tree is binary. This is not a real limitation, we could generalize, but many problems are naturally binary: in scheduling, for instance, the decision is whether or not a task is scheduled first on a machine. And then the convention we adopt is that, inside the tree, following the heuristic means taking the left branch; that's typically how the tree is built. So following the heuristic means branching left, and a mistake is when you branch right, because the heuristic would always branch left.
If the heuristic makes a mistake, you reconsider that choice and go right. So basically, when you go left the heuristic is right; when you go right the heuristic is wrong. The key idea behind limited discrepancy search, and this is very useful for a variety of scheduling problems, is that you want to use this heuristic to guide the search over the whole tree. Not only in choosing the first leaf to visit, but in deciding which part of the tree to explore first. You are going to explore the tree in increasing order of mistakes: essentially, you trust the heuristic less and less over time. Initially you trust it completely, then you trust it a little less, and so on.

What does that mean? It means you explore the tree in a sequence of waves. The first wave allows no mistakes. The next wave allows one mistake, the third wave two mistakes, and so on. Once again, this is much easier to understand with a picture, so I'm going to disappear so that you can see the tree. Initially you trust the heuristic completely: you assume it makes no mistakes, so you always branch left, and that gives you a single leaf. That leaf is where the heuristic was always right; it could be the optimal solution. But obviously the heuristic makes some mistakes. So in the next wave, we assume the heuristic is really, really good, except that it made one mistake, and we don't know which one. That means that instead of going left all the time, you go left everywhere except once.
Once, somewhere, you go right, and those are the leaves explored in this wave. You can make the mistake at the very beginning and then go left all the way down; or go left first, make the mistake afterwards, and go left from there. You have to go left everywhere else, because you can make only one mistake: the heuristic was right twice, then you make the mistake, and so on. So now you have explored the leaves of the second wave, and one of the big things you see is that this order is not at all what depth-first search gives you. Depth-first search would have explored the leftmost part of the tree first. And this is a tiny tree; imagine a tree of depth twenty. You would have explored some leaves on the far left, but you would also have explored a leaf in the right part of the tree. A very, very different order from what depth-first search would do.

That's the second wave. The third wave, which is actually a book by Alvin Toffler, but you don't care, right? The third wave explores the leaves with two mistakes. Assume you make no mistake at the top; then you have to make two mistakes below, and that's one of the leaves you reach with two mistakes. Or you make one mistake, go left, and then make another mistake: that's another leaf. Or you make a mistake, then another mistake, and after that the heuristic has to be right. That is what happens in the third wave: it explores all the leaves the heuristic can reach by making exactly two mistakes. And then you have the last wave, which in this tiny case is just the one remaining leaf; of course, when the tree is much larger, you explore many, many more of these waves. That's the basic idea behind limited discrepancy search.
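To make the wave order concrete, here is a minimal sketch in Python (my illustration, not code from the lecture). It enumerates the leaves of a complete binary tree in LDS order, encoding a leaf as a string where 'L' means following the heuristic and 'R' is a discrepancy; the ordering inside each wave is one arbitrary choice among the possible variations.

```python
from itertools import combinations

def lds_leaves(depth):
    """Yield the leaves of a complete binary tree of the given depth in
    limited-discrepancy order.  Wave k yields every leaf reached with
    exactly k right branches (mistakes)."""
    for k in range(depth + 1):                 # wave 0, then 1, 2, ...
        for rights in combinations(range(depth), k):
            yield ''.join('R' if i in rights else 'L'
                          for i in range(depth))

# For a depth-3 tree: LLL first (trust the heuristic completely), then
# the three leaves with one mistake, then three with two, then RRR.
for leaf in lds_leaves(3):
    print(leaf)
```

Running this prints LLL, then RLL, LRL, LLR, then RRL, RLR, LRR, and finally RRR: exactly the waves described above.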
There are variations on this idea. One thing that is typically interesting in practice is that you have less information at the root, so the mistakes are more likely near the top of the tree, and you can design algorithms that take that into account; one of my colleagues actually did something like that. So there are variations around the scheme, but the key point is that it is always based on a good heuristic that you are trying to follow. That's the first thing I wanted to talk about: how you explore a tree using a good heuristic.

The second thing I want to talk about is this clever idea of large neighborhood search. It's a combination of constraint programming and local search, and it's basically a four-step process. First, you find a feasible solution, and we know that CP is very good at finding feasible solutions, so that's a CP step. Second, you select a neighborhood; that's typically a local search step, and I will talk about it in a moment. Third, you optimize inside that neighborhood with CP: you select the best neighbor, or at least a very good neighbor, using constraint programming. Now you have a better solution. Fourth, you repeat from step two: you select another neighborhood and re-optimize again using CP.

Now, what is a neighborhood? There are many ways to define one, but the basic intuition you need to remember is this: you fix a subset of the variables. That subset is fixed; you cannot change it. The other variables you free, and you can find new values for them. So you fix part of the solution, let the rest of the solution be free, and re-optimize that part. Which subset? That is problem specific: you can exploit the structure of the problem, and you can exploit the information you have in the constraints at that particular point, so different problems will have different neighborhoods. And that's normal, right?
It's the same as in local search: depending on the problem, you have a different neighborhood. In scheduling problems you might look at the machines to select which variables to free; in routing problems you look at the vehicles, and so on.

So why do we do LNS? The big reason is that constraint programming is very good at finding feasible solutions and very good at exploring small combinatorial spaces. What is difficult is to scale when you have a lot of variables, and the local search step here gives you that scalability: you solve the global problem by solving a lot of smaller problems of exactly the same type. That's what you are doing in large neighborhood search. Note that this generalizes to MIP directly: replace CP with a MIP solver and you also have a large neighborhood algorithm. The only difference is that the MIP solver is the one used for finding a feasible solution and for re-optimizing the neighborhoods. Once again, there are variations around this algorithm, and it's a very nice technique; in the MIP community, things like the feasibility pump resemble this as well. But what I want to convey here is, once again, the intuition of how you can hybridize some of the techniques we have seen in this class.
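The four-step process above fits in a few lines. Here is a hedged sketch in Python of the generic loop; `select_free`, `reoptimize`, and `cost` are hypothetical hooks I introduce for illustration, with `reoptimize` standing in for the CP (or MIP) solve over the freed variables.

```python
def lns(initial, variables, select_free, reoptimize, cost, iterations=100):
    """Generic large neighborhood search loop.

    select_free(best, variables) -> subset of variables to relax
        (the local-search step: everything else stays fixed).
    reoptimize(best, free) -> best solution found when only the
        variables in `free` may change (the exact CP/MIP step).
    cost(solution) -> objective value to minimize.
    """
    best = initial                                # step 1: feasible start
    for _ in range(iterations):
        free = select_free(best, variables)       # step 2: neighborhood
        candidate = reoptimize(best, free)        # step 3: optimize it
        if cost(candidate) < cost(best):          # step 4: keep & repeat
            best = candidate
    return best
```

The point of the design is that each `reoptimize` call is a small problem of the same type as the original, so the exact solver stays in the regime where it is strong.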
Let me give you an example that I really like for explaining this: the asymmetric traveling salesman problem with time windows. Think of the problem this way: you have a set of locations to visit, as in a TSP, and you spend a certain amount of time at each location, the service time. Each location also has a time window: you can only visit it during that period. And the distances between the locations are asymmetric: going from A to B can be shorter than going from B to A, which complicates matters. What you are trying to find is a Hamiltonian path, a path that visits every one of these locations exactly once, such that the time windows are satisfied and the total distance traveled along the path is minimized. That's what you're trying to do.

Now, here is a very simple constraint programming model for this, expressed as a scheduling problem. You define an activity for every location, and the duration of each activity is its service time. There is one resource, the vehicle, which moves from one activity to the next and must serve every one of the activities. What you minimize is the total transition time between the activities, and you make sure you satisfy the time window constraints. Once again, I don't want you to understand this in any detail; the key point is the size of the model: it's about fifteen lines of code. And what I'm going to show you is that once you apply large neighborhood search to this model, it becomes a very, very effective technique for solving this problem.
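For a feel of how compact such a model can be, here is a sketch in Python with OR-Tools CP-SAT. This is not the lecture's scheduling formulation (which was written in a CP language with activities and a vehicle resource); it is an analogous model I write using the circuit constraint, assuming node 0 is a depot so the Hamiltonian path closes into a circuit. The data layout (`dist`, `service`, `windows`) is my assumption.

```python
from ortools.sat.python import cp_model

def atsp_tw(dist, service, windows):
    """dist[i][j]: asymmetric travel time, service[i]: service time,
    windows[i]: (earliest, latest) start of the visit at location i."""
    n = len(dist)
    model = cp_model.CpModel()
    # Start time of each visit, constrained to its time window.
    start = [model.NewIntVar(windows[i][0], windows[i][1], f's{i}')
             for i in range(n)]
    lit, arcs = {}, []
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            lit[i, j] = model.NewBoolVar(f'x{i}_{j}')
            arcs.append((i, j, lit[i, j]))
            if j != 0:  # no time propagation back into the depot
                # If we travel i -> j, visit j starts after i is served.
                model.Add(start[j] >= start[i] + service[i] + dist[i][j]
                          ).OnlyEnforceIf(lit[i, j])
    model.AddCircuit(arcs)  # visit every location exactly once
    model.Minimize(sum(dist[i][j] * lit[i, j] for (i, j) in lit))
    return model, start, lit
```

Like the lecture's model, the whole thing is on the order of fifteen lines of actual modeling code.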
Here is an illustration of what this problem is all about. Every one of these lines is a location: the blue bar is its service time, and the white interval is its time window. What you don't see is where the locations are in space; the display is purely temporal, so the travel from one location to the next is not shown.

So what will large neighborhood search do? It looks at this and says: I have a feasible solution, great, step one is done. Now I select a neighborhood. I take this box: everything outside the box is fixed, I'm not going to touch those tasks, they stay scheduled exactly where the feasible solution put them. What I want to do is re-optimize the path inside the box. I keep re-optimizing until I find the optimal solution to that subproblem, or until I'm tired of optimizing it, and then I take another box and do the same thing. Or you can do something a little more interesting: randomly choose the tasks to relax. Here, the tasks in pink are the ones I am going to relax; all the other ones are fixed. The relaxed tasks don't have to be consecutive in the tour; they can be anywhere, and I re-optimize that part as well. So this algorithm has basically two neighborhoods: either a consecutive sequence of tasks, or a random set of tasks, with everything else fixed. We re-optimize and keep doing this for a while.
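These two neighborhoods are easy to express in code. Here is a sketch in Python (the function names, the tour-as-list representation, and the selection scheme are my assumptions, not the lecture's code): each function returns the set of visits to free, and everything outside that set stays fixed when the CP model re-optimizes.

```python
import random

def box_neighborhood(tour, size, rng):
    """The 'box' from the lecture: free a window of consecutive visits
    in the current tour; everything outside the window stays fixed."""
    i = rng.randrange(len(tour) - size + 1)
    return set(tour[i:i + size])

def random_neighborhood(tour, size, rng):
    """Free a random subset of visits; they need not be consecutive."""
    return set(rng.sample(tour, size))

# On each LNS iteration, pick one of the two neighborhoods at random,
# fix every visit outside the returned set, and re-optimize the freed
# visits with the CP model.
rng = random.Random(0)
select = rng.choice([box_neighborhood, random_neighborhood])
```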
Now let me show you the kind of results you get. These are old results, from six or seven years ago, but they are very interesting. What we did at that point was compare against a very sophisticated branch-and-cut algorithm for this particular problem, built by the amazing team in Berlin led by Martin Grötschel. They had this very sophisticated algorithm for finding high-quality solutions, and sometimes proving optimality, although that is very difficult: only on the very tiny instances could optimality be proved. The baseline is the best solution they found in essentially five hours of CPU time, on instances whose sizes go from 40 to 233 locations. So what we did was take this very, very simple constraint programming model, hybridize it with local search to get the LNS algorithm, and see how fast we could match or beat those values, because almost none of them had been proved optimal. And this is what you get: in essentially five minutes, the LNS algorithm could improve on almost all of these solutions, and it needed ten minutes for finding the remaining ones. What I'm showing you is the power of this method: a very simple algorithm, with this hybridization, gives you a dramatic improvement over a very, very sophisticated algorithm. That is not to disparage it; it was a very sophisticated algorithm.

So this is the power of large neighborhood search. It has been applied to many, many problems and it has been generalized in many ways, but it is a very nice hybridization technique for finding high-quality solutions very quickly. And once you have a high-quality solution, you can turn to branch and cut or a pure constraint programming system for proving optimality. Finding high-quality solutions very quickly is what large neighborhood search will allow you to do.

Okay, so this is the end. Thank you very much. I hope you use these two techniques; they are very useful in practice.