Abstract
Search algorithms for Pareto optimization are designed to obtain multiple solutions, each offering a different tradeoff of the problem objectives. To make the different solutions available at the end of an algorithm run, procedures are needed for storing them, one by one, as they are found. In a simple case, this may be achieved by placing each point (the image of a solution in objective space) that is found into an "archive" which maintains only nondominated points and discards all others. However, even a set of mutually nondominated points is potentially very large (infinite in continuous objective spaces), necessitating a bound on the archive's capacity. But with such a bound in place, it is no longer obvious which points should be maintained and which discarded; we would like the archive to maintain a representative and well-distributed subset of the points generated by the search algorithm, and for this set to converge. To achieve these objectives, we propose an adaptive archiving algorithm, suitable for use with any Pareto optimization algorithm, with the following useful properties: it maintains an archive of bounded size, encourages an even distribution of points across the Pareto front, is computationally efficient, and admits (with caveats) a proof of a form of convergence. Previously proposed archiving algorithms, which we also discuss, either have more general convergence properties at the expense of not maintaining an even distribution of points along the front, are very computationally expensive, or do not guarantee a certain minimum number of points in the archive. In contrast, the method proposed here maintains evenness, efficiency, and cardinality, and provably converges under certain conditions (e.g., when there are two objectives), though not under all. Finally, the notions underlying our convergence proofs support a new way to rigorously define what is meant by a "good spread of points" across a Pareto front, in the context of grid-based archiving schemes. This leads to proofs and conjectures applicable to archive sizing and grid sizing in any Pareto optimization algorithm maintaining a grid-based archive.
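The sketch below illustrates, in Python, the general grid-based bounded archiving idea described in the abstract: nondominated points are accepted, and when the capacity is exceeded a point is evicted from the most crowded grid cell. All names (`GridArchive`, `dominates`) and details such as the fixed grid bounds are illustrative assumptions, not the paper's exact algorithm; the adaptive grid archiving method analyzed in the paper adapts its grid to the current point set and uses specific acceptance rules on which its convergence results depend.

```python
import random

def dominates(a, b):
    """True if point a Pareto-dominates point b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

class GridArchive:
    """Bounded archive of mutually nondominated points.

    Illustrative sketch: a fixed grid is laid over objective space and,
    when capacity is exceeded, a point is evicted from the most crowded
    grid cell to encourage an even spread along the front.
    """

    def __init__(self, capacity, lower, upper, divisions):
        self.capacity = capacity
        self.lower = lower          # assumed per-objective lower bounds
        self.upper = upper          # assumed per-objective upper bounds
        self.divisions = divisions  # grid divisions per objective
        self.points = []

    def _cell(self, p):
        """Map a point (assumed within bounds) to its grid-cell coordinates."""
        coords = []
        for x, lo, hi in zip(p, self.lower, self.upper):
            width = (hi - lo) / self.divisions
            coords.append(min(int((x - lo) / width), self.divisions - 1))
        return tuple(coords)

    def add(self, p):
        """Try to insert point p; return True if p is kept."""
        if any(dominates(q, p) or q == p for q in self.points):
            return False                       # dominated or duplicate: reject
        self.points = [q for q in self.points if not dominates(p, q)]
        self.points.append(p)
        if len(self.points) > self.capacity:
            # Evict a random point from the most crowded grid cell
            # (which may occasionally be the newcomer itself).
            cells = {}
            for q in self.points:
                cells.setdefault(self._cell(q), []).append(q)
            crowded = max(cells.values(), key=len)
            self.points.remove(random.choice(crowded))
        return p in self.points

# Hypothetical usage on a two-objective minimization problem:
archive = GridArchive(capacity=20, lower=[0.0, 0.0], upper=[1.0, 1.0], divisions=8)
for _ in range(1000):
    archive.add((random.random(), random.random()))
print(len(archive.points))  # at most 20 mutually nondominated points
```

The crowding-based eviction is what encourages evenness: a capacity bound with arbitrary eviction would keep the archive small but could let points cluster in one region of the front.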
Publication Info
- Year: 2003
- Type: article
- Volume: 7
- Issue: 2
- Pages: 100-116
- Citations: 297
- Access: Closed
Identifiers
- DOI: 10.1109/tevc.2003.810755