
Metaheuristics Cheat Sheet

In mathematical optimization, the goal is to select the best element, with regard to some criterion, from a set of available alternatives. A metaheuristic is a high-level strategy designed to find, generate, or select a heuristic that may provide a sufficiently good solution to such an optimization problem.

Evolutionary Computation

Evolutionary Computing

Evolutionary Computing is a research area within computer science. As the name suggests, it is inspired by Darwin's theory of natural evolution.
In an environment with limited resources, the fittest individuals survive and have more chances to produce offspring.

Evolutionary Algorithms Terminology

Individual
Represents one candidate solution
Phenotype
The individual as it appears in the problem domain, i.e. the candidate solution that is evaluated
Genotype
The encoding of the individual that the algorithm manipulates (e.g., a bit string or a vector)
Gene
A single value within the genotype
Fitness
A numeric value that represents the quality of the solution
Population
A group of individuals that recombine and mutate their properties. The initial population is randomly created
Selection of parents
Parents are selected based on their fitness
Crossover (Reproduction)
The offspring inherit characteristics from their parents
Mutation
A random change to one or more genes of an individual, introducing variation into the population
Survival
The fittest individuals survive and pass to the next generation
Termination Condition
If a target fitness value is known, the algorithm can stop once an individual reaches it; otherwise a limit on generations or evaluations is used

Pseudocode

BEGIN
     INITIALISE population with random candidate solutions;
     EVALUATE each candidate; 
     REPEAT UNTIL ( TERMINATION CONDITION is satisfied ) DO
          1 SELECT parents;
          2 RECOMBINE pairs of parents;
          3 MUTATE the resulting offspring;
          4 EVALUATE new candidates; 
          5 SELECT individuals for the next generation; 
     OD
END
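
A minimal Python sketch of this loop, assuming a user-supplied fitness function to maximise and real-valued individuals; all names and parameter values here are illustrative, not from any specific library.

import random

def evolve(fitness, dim=10, pop_size=50, generations=100,
           mutation_rate=0.1, sigma=0.3):
    # INITIALISE population with random candidate solutions
    pop = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):                      # REPEAT UNTIL termination
        # EVALUATE each candidate and SELECT parents (here: the fitter half)
        parents = sorted(pop, key=fitness, reverse=True)[:pop_size // 2]
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = random.sample(parents, 2)
            # RECOMBINE pairs of parents (uniform crossover)
            child = [random.choice(genes) for genes in zip(p1, p2)]
            # MUTATE the resulting offspring (Gaussian perturbation)
            child = [g + random.gauss(0, sigma) if random.random() < mutation_rate else g
                     for g in child]
            offspring.append(child)
        # SELECT individuals for the next generation (generational replacement)
        pop = offspring
    return max(pop, key=fitness)

# Example: maximise the negative sphere function (optimum at the origin)
best = evolve(lambda x: -sum(v * v for v in x))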

Evolutionary Programming (EP)

Evolutionary Programming (EP)

In the classical example of EP, predictors were evolved in the form of finite state machines.
A finite state machine (FSM) is a transducer that can be stimulated by a finite alphabet of input symbols and can respond in a finite alphabet of output symbols.
It consists of a number of states S and a number of state transitions.
The state transitions define the working of the FSM: depending on the current state and the current input symbol, they define an output symbol and the next state to go to.

Mutation operators to generate new FSMs

The idea was to evolve finite state machines (FSMs). There are five generally usable mutation operators to generate new FSMs (see the sketch after this list):
Changing an output symbol
Changing a state transition
Adding a state
Deleting a state
Changing the initial state
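
A minimal sketch of how such an FSM and two of these mutation operators could be encoded in Python; the dictionary-based representation and the function names are illustrative assumptions, not the original EP implementation.

import random

# FSM: for each (state, input_symbol) pair, store (output_symbol, next_state)
fsm = {
    "start": "A",
    "transitions": {
        ("A", 0): (1, "B"),
        ("A", 1): (0, "A"),
        ("B", 0): (0, "A"),
        ("B", 1): (1, "B"),
    },
}

def mutate_output_symbol(machine, output_alphabet=(0, 1)):
    # Operator 1: change the output symbol of a randomly chosen transition
    key = random.choice(list(machine["transitions"]))
    _, nxt = machine["transitions"][key]
    machine["transitions"][key] = (random.choice(output_alphabet), nxt)

def mutate_initial_state(machine):
    # Operator 5: change the initial state
    states = {s for (s, _) in machine["transitions"]}
    machine["start"] = random.choice(sorted(states))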

Evolutionary Programming Terminology

Representation
Real-valued vectors
Parent selection
Deterministic (each parent creates one offspring via mutation)
Recombination
None
Mutation
Gaussian perturbation
Survivor selection
(πœ‡ + πœ‡)
Specialty
Self-adaptation of mutation step sizes
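
A minimal sketch of modern EP mutation with self-adapted step sizes, assuming each individual carries its own sigma values alongside the solution vector; the learning-rate formula and names are illustrative.

import math, random

def ep_mutate(x, sigmas, tau=None):
    # Each parent creates one offspring: Gaussian perturbation with
    # self-adapted step sizes (the sigmas are mutated as well).
    tau = tau or 1.0 / math.sqrt(2.0 * len(x))
    new_sigmas = [s * math.exp(tau * random.gauss(0, 1)) for s in sigmas]
    new_x = [xi + s * random.gauss(0, 1) for xi, s in zip(x, new_sigmas)]
    return new_x, new_sigmas

child, child_sigmas = ep_mutate([0.5, -1.2, 3.0], [0.1, 0.1, 0.1])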

Particle Swarm Optimization

Particle Swarm Optimization

Inspired by the movement of a flock of birds when searching for food.

Particle Representation

Each particle i represents a solution to the problem. At time t, it has a position xα΅’(t) and a velocity vα΅’(t), both vectors in ℝᡈ, where d is the problem dimension.

Position and Velocity Update

The positions and velocities are updated with the following equations, where Pbestα΅’ is the best position particle i has visited, Gbest is the best position found so far by the whole swarm, r₁ and rβ‚‚ are random numbers between 0 and 1, and w, c₁, and cβ‚‚ are hyperparameters (w can be initialized at 0.9 and gradually reduced to 0.1):

vα΅’(t+1) = wΒ·vα΅’(t) + c₁·r₁·(Pbestα΅’ βˆ’ xα΅’(t)) + cβ‚‚Β·rβ‚‚Β·(Gbest βˆ’ xα΅’(t))
xα΅’(t+1) = xα΅’(t) + vα΅’(t+1)
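
A minimal Python sketch of this update for a single particle, assuming positions and velocities are lists of floats; the parameter values shown are illustrative.

import random

def pso_update(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    # One velocity/position update for one particle.
    new_x, new_v = [], []
    for xi, vi, pi, gi in zip(x, v, pbest, gbest):
        r1, r2 = random.random(), random.random()
        vel = w * vi + c1 * r1 * (pi - xi) + c2 * r2 * (gi - xi)
        new_v.append(vel)
        new_x.append(xi + vel)
    return new_x, new_v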
 

Genetic Algorithms (GA)

Genetic Algorithms (GA)

John Holland proposed Genetic Algorithms in the 1970s. Initially, they were called β€œReproductive Plans.” These algorithms are perhaps the best known of the evolutionary algorithm family.

The inspiration comes from the structure of DNA, the genetic code of living organisms. All the information is stored in chromosomes, which contain many genes. Holland’s proposal represents solutions as binary arrays.

Selection of Parents

Roulette selection
Imagine a roulette wheel where each section is assigned to an individual. If there are 10 individuals, the wheel is divided into 10 sections, and each section's size is proportional to the individual's fitness.
Tournament selection
Randomly choose k individuals and select the fittest one; k is the tournament size.
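
Minimal Python sketches of both selection schemes, assuming positive fitness values; the function names are illustrative.

import random

def roulette_selection(population, fitnesses):
    # Spin a roulette wheel whose section sizes are proportional to fitness
    # (assumes all fitness values are positive).
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]

def tournament_selection(population, fitnesses, k=3):
    # Pick k individuals at random and return the fittest of them.
    contestants = random.sample(range(len(population)), k)
    best = max(contestants, key=lambda i: fitnesses[i])
    return population[best]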

Reprod­uction (crossover or recomb­ina­tion)

1 point crossover
Both parents are split at a randomly chosen crossover point and the resulting sections are exchanged.
N point crossover
The parents are split at N randomly chosen points into several sections, which are exchanged alternately.
Uniform crossover
For each gene, the value is copied from the first or the second parent at random.
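
Minimal sketches of 1 point and uniform crossover on bit strings (lists of 0/1), the representation Holland proposed; names are illustrative.

import random

def one_point_crossover(p1, p2):
    # Split both parents at a random point and exchange the tails.
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def uniform_crossover(p1, p2):
    # For each gene, copy it from the first or the second parent at random.
    return [random.choice(pair) for pair in zip(p1, p2)]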

Mutation

Bitwise mutation
Randomly select one or several genes and flip their values (0 ↔ 1).
Random resetting
Randomly select one or several genes and reset their values to a randomly chosen permissible value.
Uniform mutation
Randomly select one or several genes and replace each with a random value between the minimum and maximum allowed values.
Swap mutation
Randomly select two elements and swap their values.
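
Minimal sketches of bitwise and swap mutation; the per-gene mutation probability is an illustrative assumption.

import random

def bitwise_mutation(genome, p=0.01):
    # Flip each bit independently with probability p.
    return [1 - g if random.random() < p else g for g in genome]

def swap_mutation(genome):
    # Swap the values of two randomly chosen positions (useful for permutations).
    i, j = random.sample(range(len(genome)), 2)
    genome = list(genome)
    genome[i], genome[j] = genome[j], genome[i]
    return genome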

Differential Evolution

Differential Evolution (DE)

It is a robust algorithm for solving continuous multidimensional optimization problems. In this algorithm, individuals are seen as vectors.
The novelty is the mutation operator, which uses three individuals to mutate another one; the size of the mutation depends on the distance between them.

Differential Evolution Example

Differential Evolution Terminology

Representation
The individuals are represented as vectors whose entries are the variable values.
Mutation
Mutation is the main operation in Differential Evolution. The mutant vector vα΅’ is built from three randomly chosen individuals and a scale factor F:
vα΅’ = xα΅£β‚ + F Β· (xα΅£β‚‚ βˆ’ xα΅£β‚ƒ)
Crossover
For each variable k of the trial vector uα΅’, the value is taken at random from either the mutant vα΅’ or the target vector xα΅’.
Selection
Selection is a one-to-one tournament: the trial vector replaces the target vector if it is at least as good.
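
A minimal sketch of one DE generation (the common DE/rand/1/bin variant), assuming real-valued lists, a minimisation objective, and illustrative values for F and CR.

import random

def de_step(population, objective, F=0.8, CR=0.9):
    # One generation of DE/rand/1/bin (minimisation).
    new_population = []
    n, dim = len(population), len(population[0])
    for i, target in enumerate(population):
        # Mutation: v = x_r1 + F * (x_r2 - x_r3), with r1, r2, r3 distinct from i
        r1, r2, r3 = random.sample([j for j in range(n) if j != i], 3)
        x1, x2, x3 = population[r1], population[r2], population[r3]
        mutant = [a + F * (b - c) for a, b, c in zip(x1, x2, x3)]
        # Crossover: each variable of the trial vector comes from the mutant
        # with probability CR, otherwise from the target (one is forced from the mutant)
        j_rand = random.randrange(dim)
        trial = [mutant[k] if (random.random() < CR or k == j_rand) else target[k]
                 for k in range(dim)]
        # Selection: one-to-one tournament between target and trial
        new_population.append(trial if objective(trial) <= objective(target) else target)
    return new_population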

Constraint Handling

Disadvantages of Constraints

In general, the presence of constraints divides the space of potential solutions into two or more disjoint regions: the feasible region, containing the candidate solutions that satisfy the given feasibility condition, and the infeasible region, containing those that do not.

Penalty Functions

Static
A fixed penalty is added to infeasible solutions. It relies on the ability to specify a distance metric that accurately reflects the difficulty of repairing the solution, which is problem-dependent and may vary from constraint to constraint.
Dynamic
The penalty changes during the run: the fitness used to evaluate the population combines a distance metric for each constraint i (the distance of the infeasible point to the feasible region) with weights that grow over the generations, so that in the limit it behaves like a death penalty for solutions violating constraints.
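
A minimal sketch of a penalised objective for minimisation, assuming each constraint function returns its violation amount (0 if satisfied); the dynamic weight schedule shown is an illustrative example, not a prescribed formula.

def penalised_objective(x, objective, constraints, weight):
    # Each constraint returns a non-negative violation: a distance-like
    # measure of how far x is from the feasible region (0 means satisfied).
    violation = sum(c(x) for c in constraints)
    return objective(x) + weight * violation

def dynamic_weight(generation, C=0.5, alpha=2):
    # Static penalty: keep the weight fixed.  Dynamic penalty: increase it
    # with the generation number so late infeasible solutions are punished more.
    return (C * generation) ** alpha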

Constraints in EAs

The presence of constraints implies that not all possible combinations of variable values represent valid solutions to the problem at hand.
Unfortunately, constraint handling is not straightforward in an EA, because the variation operators are typically β€œblind” to constraints.
There is no guarantee that, even if the parents satisfy some constraints, the offspring will satisfy them as well.

Repair Functions

Takes an infeasible point and generates a feasible solution based on it. In some problems, this technique is relatively simple.
In general, defining a repair function may be as complex as solving the problem itself.
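
A minimal sketch of the simple case: a repair function for box constraints that clips each variable back into its bounds. Real repair functions are usually problem-specific; the names here are illustrative.

def repair_to_bounds(x, lower, upper):
    # Move an infeasible point to a feasible one by clipping every
    # variable into its [lower, upper] interval.
    return [min(max(xi, lo), hi) for xi, lo, hi in zip(x, lower, upper)]

repaired = repair_to_bounds([1.7, -0.4, 3.2], [0, 0, 0], [1, 1, 1])  # [1, 0, 1]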
 

Evolution Strategies (ES)

Evolution Strategies (ES)

The goal is to solve continuous multidimensional optimization problems.
The main characteristic is the self-adaptation of parameters: some algorithm parameters change during the execution.
Those parameters are included in the individual representation and evolve at the same time as the solution.

Evolution Strategies Terminology

Individual's Representation
The individuals are represented as vectors whose entries are the values of the variables
Mutation
The individual's position is modified by adding a random number (noise) to each entry
Recombination
In ES there are two recombination variants
Intermediate recombination
The values of the parents are averaged
Discrete recombination
One of the parents' values is chosen at random, with equal chance for either parent
Parent selection
Parent selection in ES is completely random, because the whole population is treated as the parent pool
Survivor Selection
(πœ‡, πœ†) selection: only the best πœ‡ offspring are selected. (πœ‡ + πœ†) selection: the best πœ‡ individuals from the union of parents and offspring are selected
Specialty
Self-adaptation of mutation step sizes
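
A minimal sketch of producing one ES offspring and applying (πœ‡, πœ†) survivor selection, assuming a single self-adapted step size per individual and a minimisation objective; constants and names are illustrative.

import math, random

def es_offspring(parents, sigmas, tau=0.2):
    # Parent selection is random: pick two parents from the whole population.
    i, j = random.sample(range(len(parents)), 2)
    # Intermediate recombination: average the parents (and their step sizes).
    child = [(a + b) / 2 for a, b in zip(parents[i], parents[j])]
    sigma = (sigmas[i] + sigmas[j]) / 2
    # Self-adaptation: mutate the step size, then use it to mutate the solution.
    sigma *= math.exp(tau * random.gauss(0, 1))
    child = [c + sigma * random.gauss(0, 1) for c in child]
    return child, sigma

def comma_selection(offspring, objective, mu):
    # (mu, lambda) selection: keep the mu offspring with the lowest objective
    # value; the parents are discarded.
    return sorted(offspring, key=objective)[:mu]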

Genetic Programming

Genetic Programming (GP)

Automatically solves problems without requiring the user to know or specify the structure of the solution in advance.
The main idea of GP is to evolve a population of computer programs, where individuals are commonly represented as syntax trees.

Elements of a GP individual

Terminals
Leaf nodes in a syntax tree: variables and constants, which can be predefined or randomly generated
Functions
Internal nodes in a syntax tree: operations applied to their arguments

Genetic Programming Terminology

Initial population
1. Full, where all the trees are randomly created and all the leaves have the same depth
2. Grow, where each node selects an element randomly from either the function set or the terminal set
3. Ramped half-and-half, where half of the population is created with the full technique and the other half with grow
Selection
Tournament Selection
Crossover
Classic Crossover
Mutation
Subtree mutation: randomly select a mutation point in a tree and substitute the subtree rooted there with a randomly generated subtree
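
A minimal sketch of a GP syntax tree as nested tuples, grow initialisation, and subtree mutation; the function set, terminal set, probabilities, and depth limit are illustrative assumptions.

import random, operator

FUNCTIONS = {"+": operator.add, "-": operator.sub, "*": operator.mul}
TERMINALS = ["x", 1.0, 2.0]

def grow(depth=3):
    # Grow initialisation: each node is drawn from the function or terminal set.
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    op = random.choice(list(FUNCTIONS))
    return (op, grow(depth - 1), grow(depth - 1))

def evaluate(tree, x):
    # Leaves are variables or constants; internal nodes apply their operation.
    if not isinstance(tree, tuple):
        return x if tree == "x" else tree
    op, left, right = tree
    return FUNCTIONS[op](evaluate(left, x), evaluate(right, x))

def subtree_mutation(tree, depth=2):
    # Walk down randomly and replace the chosen subtree with a freshly grown one.
    if not isinstance(tree, tuple) or random.random() < 0.3:
        return grow(depth)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, subtree_mutation(left, depth), right)
    return (op, left, subtree_mutation(right, depth))

individual = grow()
print(evaluate(subtree_mutation(individual), x=2.0))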

Ant Colony Optimization (ACO)

Ant Colony Optimization (ACO)

Solves problems of finding paths in graphs. It is inspired by the behavior of ants when searching for food. The ants leave pheromones that allow other ants to follow the path to food. The more ants follow a specific path, the more pheromone it accumulates.

Example

In this algorithm, an artificial ant simulates a path starting at a specific point. It moves node by node, choosing the next node based on the pheromone and heuristic values of each edge.

Ant Colony Terminology

cα΅’β±Ό
Path (edge) from node i to node j
τᡒⱼ
Pheromone on the path from node i to node j
Ξ·α΅’β±Ό
Heuristic value of the path from node i to node j
ρ
Evaporation rate, between 0 and 1

Steps

First
All the pheromones can be initialized with a small value.
Second
Ants move around the graph node by node; in the standard rule, the probability of moving from node i to node j is proportional to τᡒⱼ^Ξ± Β· Ξ·α΅’β±Ό^Ξ², where Ξ± and Ξ² weight the influence of the pheromone and the heuristic.
Last
The pheromones are updated: each ant deposits pheromone on the edges of its path, typically in inverse proportion to the path length, and existing pheromone evaporates at rate ρ.
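
A minimal sketch of one ant step and the pheromone update, assuming a pheromone matrix tau, a heuristic matrix eta (e.g. 1/distance), tours given as lists of node indices, and illustrative values for alpha, beta, and rho.

import random

def choose_next(current, unvisited, tau, eta, alpha=1.0, beta=2.0):
    # Probability of edge (current, j) is proportional to tau^alpha * eta^beta.
    weights = [(tau[current][j] ** alpha) * (eta[current][j] ** beta) for j in unvisited]
    pick = random.uniform(0, sum(weights))
    running = 0.0
    for j, w in zip(unvisited, weights):
        running += w
        if running >= pick:
            return j
    return unvisited[-1]

def update_pheromones(tau, tours, tour_lengths, rho=0.1):
    # Evaporation first, then each ant deposits pheromone on its path,
    # in inverse proportion to the length of its tour.
    n = len(tau)
    for i in range(n):
        for j in range(n):
            tau[i][j] *= (1.0 - rho)
    for tour, length in zip(tours, tour_lengths):
        for a, b in zip(tour, tour[1:]):
            tau[a][b] += 1.0 / length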
 
