Okay, so this is Discrete Optimization, Mixed Integer Programming. We'll talk about MIP. This is one of these magical tools we have in optimization, and it's going to be very different from what you have seen so far. When it works, it's really amazing; sometimes it works and sometimes it doesn't, but you will see it's a magical tool. So what I'm going to do in this lecture is give you a basic introduction to Mixed Integer Programming (we call them MIPs, and I'll say MIP all the time today), and then we'll talk about branch and bound, which is the basic technique for solving them. It's a basic introduction; this is a big field, and what I want to give you is a feeling for what it is about.

So, this is an integer program. Think of it as a linear program: we still have an objective function that we are minimizing, we have n variables and m constraints, all the constraints are inequalities, and the variables are constrained to be non-negative. The only difference is that the variables all have to take integer values, and that makes a huge difference in practice.

A mixed integer program is when not all the variables have to be integers: some are and some aren't. Some variables are like in linear programming, they can take rational values, real numbers, and some of them have to be integers. That's why it's called mixed, a mixture of the two kinds of variables. Once again, the big thing to notice is that the constraints are linear inequalities and we are minimizing, so all the techniques I showed you for linear programming still work. Maximizing is easy to do: you minimize the negation. Going back and forth between inequalities and equations is done exactly as we did for linear programs. And if you want a variable that can take negative values, you can always write it as the difference of two non-negative variables, as we have done before. So most of the same techniques apply; what is key is that the constraints and the objective function are linear, but now some of the variables are integers.

The difference between having integer variables and not having them is big: that's the gap between being polynomial and being NP-complete. The entire complexity of the problem changes as soon as you introduce integer variables. For many people in industry this is surprising; they say, "but it should be easier, there are fewer integers than real numbers." No: having integers is what makes the problem really complex.

So here is an example we have seen before, the knapsack problem, and it is a mixed integer program. We have a linear objective function, and the variables x_i are constrained to be 0 or 1; they have to take integer values, because you either take an item or you don't, not a fraction of it. And then we have the linear capacity inequality. That's about the simplest mixed integer program we have.
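To make that concrete, here is the general shape of such a program next to the knapsack written in that form. The symbols are my own shorthand, not necessarily the slide's notation: c is the objective vector, A and b are the constraint matrix and right-hand side, I is the set of variables required to be integer, and v_i, w_i, K are the knapsack values, weights, and capacity.

```latex
% A mixed integer program: linear objective, linear inequality constraints,
% non-negative variables, integrality on a subset I of the variables.
\begin{aligned}
\min\;& c^{\top} x \\
\text{s.t.}\;& A x \ge b \\
& x \ge 0, \qquad x_j \in \mathbb{Z} \ \text{for } j \in I
\end{aligned}

% The knapsack problem in that form (all variables integer, in fact 0/1).
\begin{aligned}
\max\;& \sum_{i} v_i x_i \\
\text{s.t.}\;& \sum_{i} w_i x_i \le K \\
& x_i \in \{0,1\} \ \text{for every item } i
\end{aligned}
```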
So, what we're going to do now is look at warehouse location. We saw this problem before when we did local search, and now we are going to develop the MIP model for it. I'll go over what the problem is again and then we'll see how we design the model; actually, we'll come back to this problem several times.

We have various locations where we can open a warehouse. When we put a warehouse at a particular location, we say we open that warehouse; if we don't put one there, we say it's closed. And then we have a bunch of customers spread all over the place. The goal is to choose which warehouses to open so that you can serve all the customers, and every customer will be allocated to exactly one warehouse. What you want to minimize is the cost of opening and maintaining the warehouses, which is a fixed cost for every warehouse, plus the transportation cost from the customers to their warehouses. You want to minimize the sum of these two things. That's the problem. So here is one solution where you open a single warehouse and serve all the customers from it. Here is another solution where we open two warehouses and the customers are served in the way depicted here; once again, note that every customer is assigned to exactly one warehouse. And here is yet another solution with two warehouses and a different customer allocation.

So now the question is: can we find a MIP model for warehouse location? As with constraint programming, there are only three things we have to do: find the decision variables, express the constraints in terms of those decision variables (linear constraints, since we are in a MIP context), and express the objective function, which also has to be linear. Can we do that for warehouse location? Obviously the answer is yes, otherwise I wouldn't be talking about this problem.

So, what are the decision variables? In this particular case we are going to use two types of decision variables. The first type is associated with every warehouse, because this is what we want in the end: to know which warehouses to open, how many, and where. We'll have a decision variable x_w for every warehouse w, and that variable is going to be 1 if the warehouse is open, meaning we build a warehouse at that location, and 0 if we don't build a warehouse there. That first type of variable tells us where to build the warehouses and how many. Then we need a second thing: for every customer, we need to know which warehouse it is allocated to. So we use a second set of variables, the y variables. y_wc is going to be equal to 1 if customer c is served by warehouse w, and 0 otherwise; 0 means warehouse w does not serve customer c. So, once again, two types of variables: whether we open a warehouse or not, that's the x variables, and the y variables tell me, for a pair of a warehouse and a customer, whether these two are connected, whether this warehouse serves that customer.
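In symbols, writing W for the set of candidate warehouse locations and C for the set of customers (my naming, not necessarily the slide's):

```latex
x_{w}  \in \{0,1\} \quad \text{for } w \in W
   \qquad (x_{w} = 1 \iff \text{warehouse } w \text{ is open})

y_{wc} \in \{0,1\} \quad \text{for } w \in W,\ c \in C
   \qquad (y_{wc} = 1 \iff \text{customer } c \text{ is served by warehouse } w)
```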
So, now that we have the decision variables, the only thing left is to express the constraints. What are the constraints in this particular setting? Think about it. There is one constraint which is really important and which we don't want to miss: we can serve a customer from a warehouse only if that warehouse is open. We don't want to connect a customer to a warehouse that doesn't exist. It's a very simple constraint, but it's very important; otherwise we wouldn't have a valid solution. The second one is that we want to make sure every customer is served by exactly one warehouse: the customer has to be served, but it can only be served by one warehouse. That's the second type of constraint we have to express.

Now we have to express these constraints as linear constraints, and that's what this slide is showing you. The first constraint, saying that a warehouse has to be open to serve a customer, is a simple inequality involving the two types of variables: y_wc has to be smaller than or equal to x_w. What does that mean? If we close warehouse w, that is, we don't build it, then x_w is 0, and if x_w is 0 then y_wc is forced to be 0: we can't serve that customer from that warehouse. That's exactly what we want. And if x_w is 1, the warehouse is open and we can use it; that doesn't mean we have to allocate customer c to that warehouse, but we can. The variable y_wc is then free: it can take value 0 or value 1. That's the meaning of the constraint. In a sense we are encoding the relationship between these decisions with an inequality, and this works because both variables take 0/1 values; we will come back to the fact that in a MIP a lot of the decision variables are 0/1.

The second constraint tells us that a customer should be served by exactly one warehouse. We look at one customer, we look at all the possible warehouses, and we make sure that the sum of the variables denoting whether each warehouse serves that customer is equal to 1. That's what you see there: a sum over all the warehouses of the variables y_wc, equal to 1. Since these variables are 0/1, that means exactly one of the warehouses serves that customer.
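Written out, with the same W and C as before, the two families of constraints are:

```latex
y_{wc} \le x_{w}            \qquad \text{for all } w \in W,\ c \in C
   \quad \text{(serve a customer only from an open warehouse)}

\sum_{w \in W} y_{wc} = 1   \qquad \text{for all } c \in C
   \quad \text{(every customer is served by exactly one warehouse)}
```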
So, beautiful, we have a model if we can express the objective function. The objective function is the sum of the fixed costs and the transportation costs, and it is reasonably easy to state. The first part is the fixed cost: we sum, over all the warehouses, the fixed cost c_w of the warehouse times whether the warehouse is open or not. If the warehouse is closed, there is no cost; if it is open, we incur the cost c_w. The second part is the transportation cost: we look at every pair of a warehouse w and a customer c, and if w serves c, the variable y_wc is 1 and we incur the transportation cost t_wc; if y_wc is 0, the two are not connected and no cost is incurred. So, fixed costs plus transportation costs: we have the objective function, and we have the entire model.

This is the entire MIP model for warehouse location: the objective function with the fixed costs and the transportation costs; the constraints stating that you can serve a customer from a warehouse only if that warehouse is open, and that every customer is served exactly once (you look at all the warehouses and exactly one of them must serve that customer); and finally the two types of variables, which have to be 0 or 1: we open a warehouse or we don't, warehouse w serves customer c or it doesn't. That's the entire model.

It's interesting that the variables here are all 0/1, and I'm going to come back to that, because we could imagine another model. Instead of having all these 0/1 variables y_wc for all the warehouses, why not simply have a variable y_c which denotes the warehouse serving customer c? This is a question for you: try to build a MIP model using that decision variable y_c and see what the challenges are. It's a very interesting exercise to see what becomes hard if you use those decision variables instead of the 0/1 variables I showed you.

The thing to take away is that, in general, MIP models love 0/1 variables. You'll see a huge number of 0/1 variables in MIP models and typically very few general integer variables; the integer variables are typically 0/1. MIP models like decisions which are yes or no, which makes them very different from constraint programming models. The good point is that once you have the 0/1 variables, expressing linear constraints is typically very simple: the constraints may have many indices, but they are easy to state. That doesn't mean that modeling in MIP is easy, and we'll see some of these issues later on; there are still many, many possible models you can design. What are the basic decision variables you are going to use? What are the constraints? What is the objective? Everything we said for constraint programming is still valid: finding a good model is not necessarily easy. We'll see some of these things in a moment, or in the next lecture.
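Before moving on to solving, here is a minimal sketch of that warehouse location model in code, using the PuLP modeling library. PuLP is my choice of tool, not something the lecture prescribes, and the data (warehouse names, fixed costs c_w, transportation costs t_wc) is made up purely for illustration.

```python
import pulp

# Illustrative data only: three candidate warehouses, four customers.
warehouses = ["w1", "w2", "w3"]
customers = ["c1", "c2", "c3", "c4"]
fixed_cost = {"w1": 100, "w2": 120, "w3": 90}                       # c_w
transport = {(w, c): 10 for w in warehouses for c in customers}     # t_wc, dummy values

model = pulp.LpProblem("warehouse_location", pulp.LpMinimize)

# x[w] = 1 if warehouse w is open; y[w, c] = 1 if warehouse w serves customer c.
x = pulp.LpVariable.dicts("x", warehouses, cat=pulp.LpBinary)
y = pulp.LpVariable.dicts("y", [(w, c) for w in warehouses for c in customers],
                          cat=pulp.LpBinary)

# Objective: fixed costs of the open warehouses plus transportation costs.
model += (pulp.lpSum(fixed_cost[w] * x[w] for w in warehouses)
          + pulp.lpSum(transport[w, c] * y[(w, c)]
                       for w in warehouses for c in customers))

# Serve a customer only from an open warehouse: y_wc <= x_w.
for w in warehouses:
    for c in customers:
        model += y[(w, c)] <= x[w]

# Every customer is served by exactly one warehouse: sum_w y_wc = 1.
for c in customers:
    model += pulp.lpSum(y[(w, c)] for w in warehouses) == 1

model.solve()
print(pulp.LpStatus[model.status], pulp.value(model.objective))
```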
Okay, so now we have a beautiful MIP model: how do we solve it? This has been an active research area for many, many decades, and it's still a very active, fascinating area. One of the basic techniques for solving MIP problems is branch and bound. Remember branch and bound: we saw it in the first or second lecture, with bounding and branching. Bounding is finding an optimistic evaluation of the objective function, an optimistic relaxation (we love relaxing, right?), and branching is splitting the problem into subproblems and then applying the same algorithm to the subproblems. Those are the two basic steps.

Now, what is nice about MIP models is that they have a very natural relaxation: linear programming. That's why we talked about linear programming so much. We have this linear relaxation that we can use as the bounding part of branch and bound for MIP. The key idea is that you remove the integrality constraints and you get a relaxation. That's the only thing you do: you look at this big MIP model, you remove the integrality constraints, and what do you have? A linear program. You can solve that very fast, and that's what we are going to do. The bounding part, the optimistic relaxation, is obtained by relaxing the integrality constraints: you enlarge the set of values the variables can take, so it is a relaxation.

So branch and bound for MIP models solves the linear relaxation at every node. You look at a node and solve the linear relaxation there. If the linear relaxation is worse than the best solution you have found so far, you can prune that node away: the subproblem associated with that node is provably suboptimal, so you can get rid of it. If not, you look at the linear relaxation more closely and check whether all its values are integer, which can happen. If it is, that means the linear programming relaxation, over this large space of solutions, returned an integer value for every variable, so it is obviously a solution to the MIP at that node: you have found a feasible solution, and you can update the best solution found so far.
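Here is a minimal sketch of that bounding step at a single node, for a small knapsack, using SciPy's LP solver. The item data and the incumbent value are made up for illustration; the only point is that dropping integrality simply means letting each x_i range over [0, 1].

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative knapsack data (not the lecture's example).
values = np.array([10.0, 13.0, 7.0])
weights = np.array([3.0, 4.0, 2.0])
capacity = 6.0
incumbent = 18.0  # hypothetical best integer solution found so far

# Linear relaxation: maximize sum(v_i x_i) s.t. sum(w_i x_i) <= capacity, 0 <= x_i <= 1.
# linprog minimizes, so we negate the objective.
res = linprog(c=-values,
              A_ub=weights.reshape(1, -1), b_ub=[capacity],
              bounds=[(0.0, 1.0)] * len(values),
              method="highs")

bound = -res.fun   # optimistic evaluation of the best the MIP can do at this node
x_lp = res.x       # may contain fractional values

if bound <= incumbent:
    print("prune: this node cannot beat the best solution found so far")
elif np.allclose(x_lp, np.round(x_lp)):
    print("relaxation is integral: new feasible solution of value", bound)
else:
    print("neither applies, so we will have to branch:", x_lp)
```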
Now, if neither of those two things applies, you need to branch. The interesting thing about MIP solvers is that they generally use the linear relaxation to drive the branching. There are many ways to do this, and it's also a very active area: how you branch, what information you use, whether you use the objective function, the constraints, and so on. I'm just going to mention the simplest scheme that uses the linear relaxation.

What you do is look at the relaxation and notice that some of the integer variables have fractional values, which they should not. You take one of them, say a variable x with fractional value f, and you create two subproblems. In one of them you require x to be smaller than or equal to the floor of f, the largest integer smaller than f. In the other you require x to be greater than or equal to the ceiling of f, the smallest integer larger than f. These are two constraints you can add to the constraints you already have, giving two subproblems, and then you repeat the branch and bound on each.

The key point, if you remember constraint programming: constraint programming was obsessed with constraints and branched using feasibility information. Here we are obsessed with the objective. We get this, hopefully good, optimistic evaluation of the objective, the linear relaxation gives us something that looks like a very good solution except that it isn't feasible, and we try to use that for branching. In a sense, we are branching using optimality information.

So, summarizing what I just said: we are basically focusing on the objective here. Once again I'm exaggerating, of course; MIP systems in practice also look at the constraints, but it's good to exaggerate to drive the point home. We focus on the objective, and the relaxation gives an optimistic bound on the best value the MIP can ever achieve. The pruning is based on the objective and on suboptimality: when the linear relaxation is not better than the best solution found so far, you prune the node away. The relaxation is obtained by relaxing the integrality constraints, and we keep a global view of all the constraints: in constraint programming all the constraints were acting independently, whereas here we relax the integrality constraints but look at all the remaining constraints globally, in one linear program.

In the case of the knapsack, the relaxation looks at the MIP model and only touches the 0/1 requirement you see there, replacing it by the statement that the variable does not have to take the value 0 or 1, it can take any value between 0 and 1. That's the linear relaxation, and we can do that for any MIP model: we just remove the integrality constraints. For the knapsack, the linear relaxation is the same as the greedy approach I showed you: order the items by value per unit of weight, select them in that order until you reach an item that no longer fits entirely, and take a fraction of that item. That's what the linear programming relaxation is going to do.
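Here is a minimal sketch of that greedy computation of the knapsack relaxation; the function name and the item data are mine, chosen just for illustration.

```python
def knapsack_lp_relaxation(values, weights, capacity):
    """Knapsack LP relaxation, computed greedily: take items in decreasing
    value-per-weight order, then a fraction of the first item that no longer fits."""
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    x = [0.0] * len(values)
    bound, remaining = 0.0, capacity
    for i in order:
        if weights[i] <= remaining:
            x[i] = 1.0                       # item fits entirely
            bound += values[i]
            remaining -= weights[i]
        else:
            x[i] = remaining / weights[i]    # fractional item: fill what is left
            bound += values[i] * x[i]
            break
    return bound, x

# Illustrative data, not the lecture's example.
print(knapsack_lp_relaxation([10, 13, 7], [3, 4, 2], 6))   # (20.25, [1.0, 0.25, 1.0])
```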
I'm going to talk a lot about this linear programming relaxation in the next couple of classes. It's a beast, and you'll see it has very interesting behavior. Now, when we branch, we branch on a fractional value, and in the case of the knapsack the fractional variable is the item with the best value-to-weight ratio that we can't fit entirely: we take all the items that are more valuable per unit of weight, and then there is that last item we would like to put in the knapsack but can only put in a fraction of. That's the item we branch on, because it is the only one with a fractional value.

Here is a question for you to think about. These are 0/1 variables, so when we round the fraction downwards on one side and upwards on the other, we get 0 on one side and 1 on the other: we either don't take the item or we take it. If we don't take the item, can you figure out what the linear relaxation is going to do? A lot of the game in MIP is figuring out what the linear relaxation will do, and it is very capricious, you'll see. So, which variable becomes fractional if you don't take the item? Think about it. I decide not to take the item. All the items that were more valuable per weight, what happens to them? You still take them, and then one or more of the items later in the ratio order becomes fractional. That's what happens if you don't take the item. If you do take the item, what happens? Before, you couldn't fit this item in the knapsack; now you have decided to take it, which means some item that fit before no longer fits, and an item that was more valuable per weight becomes fractional. This is the kind of reasoning you sometimes need: think about what the linear relaxation is doing, because it gives you insight into how to improve your models.

So here is a very simple illustration. It's a really, really simple example, but it will drive two points home. We have a knapsack with three items; the value of item i is v_i and its weight is w_i, for items 1, 2, and 3. The most valuable item per unit of weight is item 3: 35 divided by 3 is above 10, which is pretty good. Then 45 divided by 5 is 9, and 48 divided by 8 is 6. So from the value-to-weight point of view, item 3 is the most valuable, item 1 is next, and item 2 is the least valuable.
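Before walking through the search tree by hand, here is a sketch of the whole branch-and-bound loop on exactly these three items, using SciPy's LP solver for the relaxation at each node and branching on the most fractional variable. The capacity of 10 is my assumption; it is not stated explicitly, but it is consistent with the fractions that appear in the tree we are about to look at.

```python
import math
import numpy as np
from scipy.optimize import linprog

values = np.array([45.0, 48.0, 35.0])    # v_1, v_2, v_3
weights = np.array([5.0, 8.0, 3.0])      # w_1, w_2, w_3
capacity = 10.0                          # assumed; consistent with the lecture's trace

best_value, best_x = -math.inf, None

def relax(lo, hi):
    """Solve the LP relaxation with per-variable bounds [lo_i, hi_i]."""
    res = linprog(c=-values, A_ub=weights.reshape(1, -1), b_ub=[capacity],
                  bounds=list(zip(lo, hi)), method="highs")
    return (-res.fun, res.x) if res.success else None

def branch_and_bound(lo, hi):
    global best_value, best_x
    node = relax(lo, hi)
    if node is None:
        return                                    # infeasible node
    bound, x = node
    if bound <= best_value + 1e-9:
        return                                    # prune: provably suboptimal
    frac = np.abs(x - np.round(x))                # distance to the nearest integer
    if frac.max() < 1e-6:
        best_value, best_x = bound, np.round(x)   # integral relaxation: new incumbent
        return
    j = int(np.argmax(frac))                      # most fractional variable
    left_hi = hi.copy();  left_hi[j] = math.floor(x[j])    # branch x_j <= floor(f)
    branch_and_bound(lo, left_hi)
    right_lo = lo.copy(); right_lo[j] = math.ceil(x[j])    # branch x_j >= ceil(f)
    branch_and_bound(right_lo, hi)

branch_and_bound(np.zeros(3), np.ones(3))
print(best_value, best_x)   # expected: 80.0 with items 1 and 3 selected
```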
So, at the top of the tree, at the root node, the value is 92, and that value of 92 is obtained by selecting items. Well, we can let the linear relaxation do it, and this is what the linear relaxation does: it takes value 1 for x1 (I'm taking item 1) and value 1 for x3 (I'm taking item 3), the two items with the best ratios, and then only a fraction of the second item, because it can't put the entire item in. Items 1 and 3 weigh 5 and 3, item 2 weighs 8, and I can only take a quarter of it. That's where the value comes from: 45 plus 35 is 80, plus a quarter of 48, which is 12, gives 92. So the linear relaxation gives you this solution and that particular value.

What do we do now? As I told you, we look at this solution and ask which value is fractional. In this particular case it's x2, so we branch on x2 and create two subproblems: one where x2 is 0 and one where x2 is 1.

What happens if x2 is 0? That's the node you see over there. The linear relaxation is now 80: you can only select items 1 and 3. We solve the linear relaxation at that node and obtain x1 = 1, x2 = 0, x3 = 1. The linear relaxation here finds an integral solution, so we know we don't have to branch any more: this is the best we can possibly do at that node, it's all integral, we're done. So the linear relaxation also tells you when to stop branching, in a sense. We have found a solution and evaluated it.

The other thing we wanted to do was to choose x2 equal to 1. So we fix the value of x2 to 1, solve the linear relaxation again, and it gives us a value of 71.33. You see that now x3 is fractional, even though it was not fractional in the original relaxation. We would have to continue branching, but the value of the linear relaxation is 71.33 and the best solution we have found so far is 80, so we can prune that node away already. With one branching, essentially, we have solved this problem optimally.

This illustrates all the concepts. At a particular node you solve the linear relaxation; the value of the objective function of that linear program gives you an optimistic evaluation of what the MIP can do at that node; then you look at the variables, take a fractional one, and branch left and right based on its fractional value.

Now, one of the questions you have to think about is: when is branch and bound going to be effective? It is going to be really effective when the value of the linear relaxation is tight, that is, when the linear programming relaxation is very close to the optimal value of the MIP. Is that sufficient? Think about it.
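For reference, here is the arithmetic behind those three nodes, again assuming a capacity of 10, which is what the fractions in the trace imply:

```latex
\begin{aligned}
\text{root:}\quad   & x_1 = 1,\ x_3 = 1,\ x_2 = \tfrac{10 - 5 - 3}{8} = \tfrac{1}{4},
   && 45 + 35 + \tfrac{1}{4}\cdot 48 = 92 \\
x_2 = 0:\quad       & x_1 = 1,\ x_3 = 1 \ \text{(integral, weight } 8 \le 10\text{)},
   && 45 + 35 = 80 \\
x_2 = 1:\quad       & x_1 = 0,\ x_3 = \tfrac{10 - 8}{3} = \tfrac{2}{3},
   && 48 + \tfrac{2}{3}\cdot 35 \approx 71.33 < 80 \ \Rightarrow\ \text{prune}
\end{aligned}
```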
So, if I tell you, and I guarantee it, that the linear relaxation is always within 0.5% of the best value of the MIP, is that sufficient for the branch and bound to behave nicely all the time? That's a question for you. I wish we had quizzes in this class; we don't, but anyway, think about it: is it sufficient?

So, what is a good MIP model? A good MIP model is a model with a good linear relaxation. That is not necessarily sufficient, which is the question I just asked, but it is a necessary condition: you want a model with a very good linear relaxation. Why? Because you are going to prune the search space much better.

Now, which variable should we branch on? We may have plenty of variables with fractional values; which one do we pick? One good heuristic is to branch on the most fractional value. Why would we do that? Once again, exaggerate. Suppose you have a variable that can take the values 0 or 1 and is not very fractional, say 0.99999999. It is really close to 1, so branching on it probably doesn't make too much sense. If a variable is at 0.5, what does that mean? It means the linear relaxation is saying: hmm, maybe I should take this item, maybe I should not, I really don't know. Those are the variables you want to branch on.

So, this was a basic introduction to MIP. Next time we are going to see really cute things about how to build MIP models and what it means to be a good MIP model. Come back. Thank you.