Design and Analysis of Meta-heuristics for Constrained Optimization Problems

Raymond Hsu (CSE) and Rob Callan (ESE)
Faculty Advisors: Steven Kimbrough (OPIM) and Lyle Ungar (CIS)

CSE 400/401 Senior Design Project
Paper - Poster





Overview

        In our senior design project we designed, implemented, and analyzed the effectiveness of different meta-heuristic algorithms at solving two constrained optimization problems: the Bin Packing Problem (BPP) and the Generalized Assignment Problem (GAP).
        Specifically, we focused on applying a particular meta-heuristic technique, the Genetic Algorithm (GA).  Although genetic algorithms have been used to solve these problems before, we implemented a new type of genetic algorithm: the Feasible-Infeasible Two Population Genetic Algorithm (FI-2Pop GA).  We compared its performance to traditional genetic algorithms and simulated annealing (SA) by implementing the algorithms in Java and then running numerous simulations.
        Initially, our intention was to focus only on BPP.  We implemented SA and GA (both traditional and FI-2Pop) for BPP and also developed a Visualization Tool for BPP.  However, we found that neither of these meta-heuristics performed as well as simple approximation algorithms like First Fit, Best Fit, and Worst Fit.
        We discussed this finding with our advisor, who suggested we move to GAP instead.  Before doing any coding, we first researched GAP and determined that it does not have any approximation algorithms or other solving techniques that perform especially well.  We then took a similar approach with GAP, implementing both SA and a GA (with traditional and FI-2Pop capabilities).
        See Results and Conclusions.


What are Meta-heuristics?

        Meta-heuristics combine heuristics in an efficient way to solve a problem better than is possible with any individual heuristic.  The two meta-heuristics used in our project are Genetic Algorithms and Simulated Annealing.  See the Wikipedia entry on meta-heuristics.
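
        Since simulated annealing is not described in detail elsewhere on this page, the sketch below shows a minimal SA loop on a toy one-dimensional cost function (illustrative Java, not our project code): an improving neighbor is always accepted, a worsening neighbor is accepted with probability e^(-delta/T), and the temperature T is gradually lowered.  The cost function, neighbor move, and cooling rate are arbitrary placeholder choices.

import java.util.Random;

// Minimal simulated annealing sketch (illustrative only, not our project code).
// Minimizes a toy one-dimensional cost function; the neighbor move, cooling
// rate, and cost function are placeholders for problem-specific choices.
public class SimpleSA {
    public static void main(String[] args) {
        Random rng = new Random();
        double x = rng.nextDouble() * 20 - 10;   // current solution
        double best = x;
        double temperature = 10.0;

        while (temperature > 1e-3) {
            double neighbor = x + (rng.nextDouble() - 0.5);  // small random move
            double delta = cost(neighbor) - cost(x);
            // Always accept improvements; accept worse moves with probability e^(-delta/T).
            if (delta < 0 || rng.nextDouble() < Math.exp(-delta / temperature)) {
                x = neighbor;
            }
            if (cost(x) < cost(best)) best = x;
            temperature *= 0.995;                            // geometric cooling schedule
        }
        System.out.println("Best x found: " + best + ", cost: " + cost(best));
    }

    static double cost(double x) {
        return (x - 3) * (x - 3);   // toy cost function with minimum at x = 3
    }
}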

Bin Packing Problem (BPP)

        The Bin Packing Problem asks: given a set of items with different weights, what is the fewest number of bins needed to hold all of the items?  An unlimited number of identical bins (i.e. all bins have the same capacity) is provided, and each item must be assigned to exactly one bin.  A real-life application of BPP is determining the number of shipping containers necessary to ship a set of goods.
        The smallest instances of BPP we considered contained 120 items, which means there are 120^120 possible assignments.  The largest instances of BPP we considered contained 500 items.
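
        For reference, here is a minimal sketch of the First Fit approximation algorithm mentioned in the Overview (illustrative Java, not our project code): each item is placed into the first open bin that still has enough remaining capacity, and a new bin is opened only when no existing bin fits.

import java.util.ArrayList;
import java.util.List;

// Minimal First Fit sketch for BPP (illustrative only, not our project code).
// Items are given by their weights; every bin has the same capacity.
public class FirstFit {

    // Returns the number of bins used by the First Fit heuristic.
    public static int firstFit(int[] weights, int capacity) {
        List<Integer> remaining = new ArrayList<>(); // remaining capacity of each open bin
        for (int w : weights) {
            boolean placed = false;
            for (int i = 0; i < remaining.size(); i++) {
                if (remaining.get(i) >= w) {          // first open bin that still fits the item
                    remaining.set(i, remaining.get(i) - w);
                    placed = true;
                    break;
                }
            }
            if (!placed) {
                remaining.add(capacity - w);          // no bin fits, so open a new one
            }
        }
        return remaining.size();
    }

    public static void main(String[] args) {
        int[] weights = {42, 69, 67, 57, 93, 90, 38, 36, 45, 42};
        System.out.println("Bins used: " + firstFit(weights, 100));
    }
}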


Generalized Assignment Problem (GAP)

        Given a set of jobs and agents, the Generalized Assignment Problem seeks the assignment of jobs to agents that maximizes profit.  Each agent is constrained by its resource capacity, and each job requires a different amount of resources and generates a different profit depending on which agent it is assigned to.  Each job must be assigned to exactly one agent.  A real-life application of GAP is, given a set of computing tasks and a set of processors to complete the tasks, determining what assignment of tasks to processors will allow one to finish all the tasks in the least amount of time.
        The smallest instances of GAP we considered contained 5 agents and 15 jobs, which means there are 5^15 possible assignments.  The largest instances of GAP we considered contained 80 agents and 400 jobs.
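
        To make the problem data concrete, the sketch below shows one way a candidate GAP solution could be evaluated (illustrative Java with hypothetical array names, not our project code): the profit of the assignment is summed, and the solution is feasible only if no agent's resource capacity is exceeded.

// Minimal GAP evaluation sketch (illustrative only; array names are hypothetical).
public class GapEvaluator {

    // profit[a][j]   = profit earned if job j is assigned to agent a
    // resource[a][j] = resources job j consumes on agent a
    // capacity[a]    = resource capacity of agent a
    // assignment[j]  = agent assigned to job j

    public static int totalProfit(int[][] profit, int[] assignment) {
        int sum = 0;
        for (int j = 0; j < assignment.length; j++) {
            sum += profit[assignment[j]][j];
        }
        return sum;
    }

    public static boolean isFeasible(int[][] resource, int[] capacity, int[] assignment) {
        int[] used = new int[capacity.length];
        for (int j = 0; j < assignment.length; j++) {
            used[assignment[j]] += resource[assignment[j]][j];
        }
        for (int a = 0; a < capacity.length; a++) {
            if (used[a] > capacity[a]) {
                return false;   // this agent's resource capacity is exceeded
            }
        }
        return true;
    }
}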

GAP Resources:
  • Wikipedia entry on GAP
  • Collection of GAP instances used in our simulations from Beasley's OR Library
  • A collection of harder GAP instances, also used in our simulations

Genetic Algorithms (GAs)

        Genetic Algorithms combine the concepts of Darwinian natural selection, sexual reproduction, and genetics from biology and apply them to solving a problem. The GA works to solve a problem using the following steps:
  1. Initial Population Generation. Generate a population of candidate solutions.
  2. Parent Selection. Select two solutions from the population (“parents”) for breeding.
  3. Crossover. Use crossover between parents to produce a new solution (“child”) that contains “DNA” from both parents.
  4. Mutation. Allow for stochastic variation by possibly mutating the child.
  5. Insertion into Population. Calculate the child’s fitness level based on the fitness function and (possibly) insert the child into the population of candidate solutions.
  6. Repeat steps 2-5 until either a specified fitness level is reached, a specified number of generations have elapsed, or a specified amount of time has elapsed.
        Over time, the overall fitness of the population is expected to increase as more fit solutions are given a greater chance to reproduce and less fit solutions are gradually weeded out.  The GA keeps track of the best solution encountered throughout its execution.
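
        The sketch below walks through steps 1-6 on a toy problem (maximizing the number of 1s in a bit string).  It is illustrative Java rather than our project code; the binary tournament selection, single-point crossover, and mutation rate are arbitrary placeholder choices.

import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Minimal GA sketch following steps 1-6 above on a toy problem
// (maximize the number of 1s in a bit string).  Illustrative only, not our project code.
public class SimpleGA {
    static final Random RNG = new Random();

    // Fitness function: the number of 1 bits in the candidate solution.
    static int fitness(boolean[] s) {
        int f = 0;
        for (boolean bit : s) if (bit) f++;
        return f;
    }

    // Parent selection: binary tournament, the fitter of two random members wins.
    static boolean[] tournament(List<boolean[]> pop) {
        boolean[] a = pop.get(RNG.nextInt(pop.size()));
        boolean[] b = pop.get(RNG.nextInt(pop.size()));
        return fitness(a) >= fitness(b) ? a : b;
    }

    public static void main(String[] args) {
        int popSize = 50, length = 40, generations = 2000;

        // 1. Initial population generation.
        List<boolean[]> pop = new ArrayList<>();
        for (int i = 0; i < popSize; i++) {
            boolean[] s = new boolean[length];
            for (int j = 0; j < length; j++) s[j] = RNG.nextBoolean();
            pop.add(s);
        }

        boolean[] best = pop.get(0);
        // 6. Repeat steps 2-5 for a fixed number of generations.
        for (int g = 0; g < generations; g++) {
            // 2. Parent selection.
            boolean[] p1 = tournament(pop), p2 = tournament(pop);

            // 3. Crossover: single cut point, first part from p1, rest from p2.
            int cut = RNG.nextInt(length);
            boolean[] child = new boolean[length];
            for (int j = 0; j < length; j++) child[j] = j < cut ? p1[j] : p2[j];

            // 4. Mutation: flip each bit with a small probability.
            for (int j = 0; j < length; j++)
                if (RNG.nextDouble() < 0.01) child[j] = !child[j];

            // 5. Insertion: replace a random member and remember the best solution seen.
            pop.set(RNG.nextInt(popSize), child);
            if (fitness(child) > fitness(best)) best = child;
        }
        System.out.println("Best fitness found: " + fitness(best) + " out of " + length);
    }
}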


Feasible-Infeasible Two Population GA (FI-2Pop GA)

        The FI-2Pop GA was invented by Kimbrough (our faculty advisor), Koehler, Lu, and Wood.  The components of a traditional GA described above are all found in a FI-2Pop GA as well.  The difference is that the traditional GA only considers a population of feasible solutions, immediately “killing off” any infeasible solutions that are produced.  The FI-2Pop GA, however, maintains two populations - one feasible and one infeasible.  The idea is that since an optimal solution to any constrained optimization problem is likely to lie close to the boundary between the regions of feasible and infeasible solutions, having two populations allows the GA to converge on the optimal solution from both directions.  Implementing a FI-2Pop GA requires specialized Parent Selector and Inserter methods that account for the presence of the two populations.
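
        As a rough illustration of the two-population idea (our own simplified sketch with hypothetical names, not the published FI-2Pop implementation), an Inserter might route each child to the appropriate population and trim each population differently: the feasible population competes on the real objective, while the infeasible population competes on low constraint violation, which pulls it toward the feasibility boundary.

import java.util.Comparator;
import java.util.List;

// Illustrative FI-2Pop-style inserter sketch (hypothetical names and structure,
// not the published implementation).  Children are routed to one of two
// populations, each kept at a fixed maximum size.
class TwoPopulationInserter {

    interface Problem {
        boolean isFeasible(int[] solution);
        double objective(int[] solution);            // e.g. GAP profit (higher is better)
        double constraintViolation(int[] solution);  // total amount by which constraints are violated
    }

    private final int maxSize;

    TwoPopulationInserter(int maxSize) {
        this.maxSize = maxSize;
    }

    void insert(int[] child, Problem problem,
                List<int[]> feasiblePop, List<int[]> infeasiblePop) {
        if (problem.isFeasible(child)) {
            // Feasible solutions compete on the real objective.
            feasiblePop.add(child);
            if (feasiblePop.size() > maxSize) {
                feasiblePop.remove(worst(feasiblePop,
                        Comparator.comparingDouble(problem::objective)));
            }
        } else {
            // Infeasible solutions compete on (low) constraint violation,
            // pulling this population toward the feasibility boundary.
            infeasiblePop.add(child);
            if (infeasiblePop.size() > maxSize) {
                infeasiblePop.remove(worst(infeasiblePop,
                        Comparator.comparingDouble(problem::constraintViolation).reversed()));
            }
        }
    }

    // Returns the member ranked lowest by the given "higher is better" ordering.
    private int[] worst(List<int[]> pop, Comparator<int[]> higherIsBetter) {
        return pop.stream().min(higherIsBetter).orElseThrow();
    }
}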


Results and Conclusions

(A full table of our simulation results for BPP and GAP can be found in our paper.)

Results for BPP
  • SA performs better than GA.
  • FI-2Pop GA performs better than traditional GA.
  • Simple approximation algorithms First-Fit, Best-Fit, and Worst-Fit perform better than either SA or GA.
Results for GAP
  • GA performs better than SA.
  • FI-2Pop GA and traditional GA perform about the same.
  • Most significant performance improvement comes from use of Local Improvement Procedures (Chu and Beasley, 1997); a simplified sketch appears after this list.
  • The effect of other configuration changes besides the use of the Local Improvement Procedures was minimal, although certain Population Generation and Crossover procedures did have a slightly positive effect.
  • GAs can find a near-optimal solution rapidly, but have difficulty reaching the optimal solution.  In many cases, especially among the small-sized problems, the GA was able to reach within 2-3% of the optimal solution, but could not actually reach the optimal solution.  A chart of the typical performance of a GA can be found here.
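
        The sketch below shows a simplified local improvement step in the spirit of, though not identical to, Chu and Beasley's procedure (illustrative Java with hypothetical array names): each job is tentatively moved to an agent that yields more profit whenever the move keeps the solution feasible.

// Simplified local improvement step for GAP, in the spirit of Chu and Beasley
// (1997) but not a faithful reproduction.  Array names are hypothetical.
public class LocalImprovement {

    // Tries to move each job, one at a time, to an agent that yields more
    // profit without exceeding any agent's remaining resource capacity.
    public static void improve(int[] assignment, int[][] profit,
                               int[][] resource, int[] capacity) {
        int agents = capacity.length;
        int[] used = new int[agents];
        for (int j = 0; j < assignment.length; j++) {
            used[assignment[j]] += resource[assignment[j]][j];
        }

        for (int j = 0; j < assignment.length; j++) {
            int current = assignment[j];
            for (int a = 0; a < agents; a++) {
                boolean fits = used[a] + resource[a][j] <= capacity[a];
                boolean better = profit[a][j] > profit[current][j];
                if (a != current && fits && better) {
                    used[current] -= resource[current][j];  // release the old agent's resources
                    used[a] += resource[a][j];              // consume the new agent's resources
                    assignment[j] = a;
                    current = a;
                }
            }
        }
    }
}
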
Conclusions

        In spite of its simplicity, SA performed reasonably well.  SA did better than the GAs for BPP, while the GAs did better than SA for GAP.  For GAP, algorithms must take into account the tradeoff between profit gained and the cost of resources when making an assignment.  For BPP, algorithms only need to consider cost - the number of bins used.  It is possible that SA is better suited for simpler problems with fewer constraints that are easier to optimize.  When the problem becomes more complex and the constraints increase, the GA may be able to use information more effectively than SA.
        Our investigation into the efficacy of the FI-2Pop GA versus traditional GAs was largely inconclusive.  Overall, it appears that using the FI-2Pop GA had a neutral to slightly positive effect on GA performance.  For BPP, the FI-2Pop GA did noticeably better than the traditional GA; however, this encouraging finding was overshadowed by the superior performance of the approximation algorithms.  For GAP, it appears the FI-2Pop GA did not significantly affect performance one way or the other: the set of FI-2Pop GAs on average performed roughly the same as the set of traditional GAs.
        The two meta-heuristics we have studied in our project, simulated annealing and genetic algorithms, are effective tools to solve constrained optimization problems.  Their general, problem-independent nature allows them to be applied to a variety of problems without significant changes.  However, their general nature is also a weakness because it appears problem-specific information is crucial to a highly successful algorithm for constrained optimization problems.


Visualization Tool for BPP (screenshot)

        We developed a visualization tool for BPP in Java.  This point-and-click tool may be used to visually compare the performance of SA vs. GA.  The tool lets instructors show their students meta-heuristics “in action”, with the ability to display the results of the meta-heuristic as it is running.  One window plots the fitness of solutions over time.  Two other windows display a representation of the best solution encountered so far.  Users are also able to tweak different setup parameters.  For example, the user can change the capacity of the bins or the weights of the items and generate a random instance of BPP to test the algorithms on.  Alternatively, we give the user the option to load previously serialized instances of BPP from a file.  Finally, the user can also change how the meta-heuristic runs (e.g. changing the schedule length in SA or changing the sizes of the feasible and infeasible populations in the FI-2Pop GA).
        Note: Since no good graphical representations exist for GAP, a visualization tool was not possible for GAP.  


About the authors

        Raymond Hsu is a senior at the University of Pennsylvania ('07).  He is pursuing dual degrees: a B.S.E. in Computer Science & Engineering (CSE) from the School of Engineering and Applied Science (SEAS) and a B.S. in Economics with a concentration in Operations & Information Management (OPIM) from the Wharton School.  He will be working in New York City after he graduates.
        Rob Callan is a senior at the University of Pennsylvania ('07).  He is pursuing a B.S.E. in Electrical and Systems Engineering (ESE) from the School of Engineering and Applied Science (SEAS).  He will be attending graduate school after he graduates.