Computer Science – Neural and Evolutionary Computing
Scientific paper
2003-03-31
10 pages, LaTeX, see http://www.neuroinformatik.rub.de/PROJECTS/SONN/
The sharpened No-Free-Lunch theorem (NFL theorem) states that, averaged over any finite set F of target functions, the performance of all optimization algorithms is equal if and only if F is closed under permutation (c.u.p.) and each target function in F is equally likely. In this paper, we first summarize some recently proven consequences of this theorem: the average number of evaluations needed to find a desirable (e.g., optimal) solution can be calculated; the number of subsets that are c.u.p. is negligible compared to the total number of possible subsets; and problem classes relevant in practice are unlikely to be c.u.p. Second, as the main result, the NFL theorem is extended: necessary and sufficient conditions for NFL results to hold are given for arbitrary, non-uniform distributions of target functions. This yields the most general NFL theorem for optimization presented so far.
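The closure-under-permutation condition can be made concrete with a minimal sketch (not taken from the paper): the search space X = {0, 1, 2}, the value set Y = {0, 1}, and the modelling of an algorithm as a fixed, non-repeating visiting order of X are all illustrative assumptions chosen small enough to enumerate. Fixed orderings are only a subset of the deterministic non-repeating algorithms the theorem covers, but they suffice to show both directions of the intuition: over a c.u.p. set all algorithms tie on average, while a non-c.u.p. set breaks the tie.

```python
# Toy illustration of the NFL theorem. X, Y, and the fixed-ordering model of
# an algorithm are illustrative assumptions, not definitions from the paper.
from itertools import permutations, product

X = (0, 1, 2)   # search space
Y = (0, 1)      # possible function values

# F = Y^X, all |Y|^|X| = 8 target functions, each a tuple (f(0), f(1), f(2)).
# The set of all functions is trivially closed under permutation (c.u.p.).
F = list(product(Y, repeat=len(X)))

def evaluations_to_optimum(f, order):
    """Evaluations a fixed-order search needs to first hit max(f)."""
    best = max(f)
    for k, x in enumerate(order, start=1):
        if f[x] == best:
            return k
    return len(order)

def average_cost(order, functions):
    return sum(evaluations_to_optimum(f, order) for f in functions) / len(functions)

# NFL: averaged over the c.u.p. set F, every ordering performs identically.
costs = {order: average_cost(order, F) for order in permutations(X)}
assert len(set(costs.values())) == 1   # all six orderings tie on average

# A set that is NOT c.u.p. breaks the tie: a single function with its
# optimum at x = 0 favours algorithms that evaluate x = 0 first.
G = [(1, 0, 0)]
print(average_cost((0, 1, 2), G))   # finds the optimum in 1 evaluation
print(average_cost((2, 1, 0), G))   # needs all 3 evaluations
```

Enumerating all six visiting orders makes the "if" direction visible directly, and the singleton set G, which is not closed under permutation, shows why the condition is also necessary.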
Christian Igel
Marc Toussaint
Recent Results on No-Free-Lunch Theorems for Optimization