
The gatool in MATLAB lets researchers quickly apply genetic-algorithm optimization to suitable problems. As with all MATLAB toolboxes, gatool offers both a command-line interface and a GUI; we will use the GUI since it makes modifying the toolbox parameters easier. The gatool implements a canonical genetic algorithm and accepts custom functions for the mutation, crossover and selection operators. Before going any further, let's review Introduction To MathWorks MATLAB. Now that you are back, let's see how this can be done.

We will need to create two functions for use with the gatool: (1) an objective function and (2) a creation function. The objective function is minimized by the genetic algorithm to give an ideal solution to our problem, while the creation function is used to create the initial individuals of the genetic algorithm's population.

Objective Function:

function rtn = objective(param)
% Fitness: count how many entries of param do NOT floor to the
% target sequence [1 2 3 4 5]; the GA drives this count toward zero.
seq = [1 2 3 4 5];
temp = 5;
for i = 1:length(param)
    if eq(seq(i), floor(param(i)))
        temp = temp - 1;
    end
end
rtn = temp;

Creation Function:

function rtn = creator(genomeLength, fitnessFn, options)
% Seed each individual with 5 random values in [0,5); the GA then evolves
% the real-valued genome toward values that floor to [1 2 3 4 5].
% genomeLength, fitnessFn and options are required by the gatool even
% though they are unused here.
temp = floor(5.*rand(5,1));
rtn = temp';

Gotchas: The arguments for the respective functions must be declared as shown, otherwise the toolbox will not be able to pass the necessary parameters to the functions.

Launch gatool: type gatool in the MATLAB command window.

GATOOL Interface: Only parameters marked in red were modified.

Results: Click the start button on the gatool to start the optimization routine.

Results Plot:

This post used the gatool in MATLAB to evolve a vector of the integers 1 to 5. If you take the floor of the final point listed, you will see that this aim was achieved with a fitness of 0.0. As an exercise, you could modify the other parameters in the gatool and re-run the routine to observe any changes that may occur.

Particle Swarm Optimization (PSO)


Nature is a great teacher as it relates to optimization and evolutionary computing. Among the many lessons taught by nature is the concept of the swarm; whether it's a swarm of locusts, a school of piranhas or a flock of birds, the group and its collective consciousness serve a very important and dominating purpose. In the case of the locusts and piranhas, that purpose is embodied in their very optimized feeding technique, which has evolved over millennia into the efficiency we see today. Particle Swarm Optimization (PSO) is just one of the many and varied optimization techniques borrowed from nature. Many see Particle Swarm Optimization as a hybrid Genetic Algorithm (GA) implementation which allows a more deterministic search of the problem's solution space.

Particle Swarm Optimization takes advantage of a mathematical formula that tells each candidate solution (CS) how far it is from optimal and what it needs to do to move closer to the swarm's optimal position. Depending on the calculated position of the CS and the speed of convergence, the CS is guided to a new position at each epoch.
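That update formula can be sketched in a few lines of Python. This is a minimal sketch of the canonical velocity/position rule applied to a toy sum-of-squares objective; the inertia weight w and acceleration constants c1, c2 are conventional textbook values, not something prescribed by this post:

```python
import random

def sphere(x):
    # Toy objective to minimize: sum of squares, optimal at the origin.
    return sum(v * v for v in x)

def pso(obj, dim=2, swarm_size=20, epochs=100, w=0.7, c1=1.5, c2=1.5):
    """Canonical PSO: each epoch, every candidate solution is pulled toward
    its own best position (pbest) and the swarm's best position (gbest)."""
    random.seed(1)  # deterministic run, for illustration only
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm_size)]
    vel = [[0.0] * dim for _ in range(swarm_size)]
    pbest = [p[:] for p in pos]          # each particle's best position so far
    gbest = min(pbest, key=obj)[:]       # best position found by the whole swarm
    for _ in range(epochs):
        for i in range(swarm_size):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])   # pull toward own best
                             + c2 * r2 * (gbest[d] - pos[i][d]))     # pull toward swarm best
                pos[i][d] += vel[i][d]
            if obj(pos[i]) < obj(pbest[i]):
                pbest[i] = pos[i][:]
                if obj(pbest[i]) < obj(gbest):
                    gbest = pbest[i][:]
    return gbest
```

Calling `pso(sphere)` steers the swarm toward the origin, where the objective is zero.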

We can visualize this by looking at a school of mackerel trying to escape the jaws of a hungry great white shark. Each mackerel is aware of its own position in the school, the position of the shark and also the position and well-being of its neighbour. This swarm feature allows the school to function as a single cohesive organism. It is quite obvious to each fish that if its neighbour turns left and is still not eaten, then it is a pretty safe bet that when it turns left it too may be swimming off into the sunset and not into the hungry jaws of the shark.

The PSO algorithm is a much simplified expression of the previous narrative; if you are already familiar with the workings of a genetic algorithm, you should be quite at home with this incarnation. A GA can be converted into a PSO with some simple modifications: its operators are changed so that each CS is aware of its immediate neighbours and their fitness. Another advantage of PSOs is the reduced number of parameters which must be configured for the technique to work; many different problems can be solved without much modification to the algorithm.

Using A Perceptron ANN With Genetic Algorithm Trainer To Solve Logical AND


Neural networks (NN) had always been a "black box" fascination for me until recently, when I had my eureka moment. Neural networks are mainly used for classification and prediction, with applications including stock market trends, feature recognition and some types of optimization techniques. A NN can be trained in two ways: supervised and unsupervised.

Supervised: The NN is exposed to a dataset containing the training values and the correct outputs for those values. A training algorithm such as backpropagation is used to minimize the error between the resulting outputs and the correct outputs.

Unsupervised: The NN is allowed to form its own conclusions based on the dataset provided. These types of networks are usually used for classification-type problems; a very good example of this is the SOM (Self-Organizing Map) network.

That said, if you have not already read my post on Genetic Algorithms (GA), please do so now, since we will be using GAs as the means of optimizing the required weights for each input of the perceptron.

Wow, you are back already? OK, let's look at a simple NN to see how it works. The NN we will be reviewing is called the perceptron; this particular configuration has only two input neurons, since we will be dealing with a binary operation, and one output neuron.

Simple Perceptron

i = inputs
w = weights
o = output
b = bias

A bias is also associated with the network; the bias is simply a neuron holding a constant value multiplied by a weight. It is added to the activation function so that the NN can recognize problems that are not linearly separable. Each input is multiplied by a corresponding weight, a small value which is adjusted until the output neuron is activated. Activation occurs when the activation function Σ(i1*w1+i2*w2)+b is greater than a certain threshold. The output of the activation function is then sent to a transfer function, which interprets that output as 1 or 0 depending on the inputs supplied to the NN.
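As a concrete illustration, here is a minimal Python sketch of that forward pass. The weights and bias below are hand-picked to solve logical AND, just to show that a solving weight set exists; in this post's setup those values would instead be evolved by the GA trainer:

```python
def perceptron(i1, i2, w1, w2, b, threshold=0.0):
    # Activation function: weighted sum of the inputs plus the bias term.
    activation = i1 * w1 + i2 * w2 + b
    # Step transfer function: interpret the activation as 1 or 0.
    return 1 if activation > threshold else 0

# Hand-picked weights that solve logical AND: only inputs (1, 1)
# push the activation (1 + 1 - 1.5 = 0.5) above the threshold.
for a, b_in in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b_in, '->', perceptron(a, b_in, w1=1.0, w2=1.0, b=-1.5))
```

The loop prints 1 only for the input pair (1, 1), matching the AND truth table.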


Simple Genetic Algorithm To Evolve A String of Integers


Genetic algorithms are a means of optimization copied from the natural world. According to the theory of evolution, nature has a way of selecting the best (fittest) individuals to mate (crossover) and reproduce, thus carrying on the best features of each selected individual. Even though traits from both parents are carried over into children, there is still an element of randomness involved (mutation) that gives the offspring the ability to explore their fitness landscape and adapt to their environment. Genetic algorithms try to mimic this behavior with three common operators: (1) selection, (2) crossover, (3) mutation.

The selection operator determines how the individuals of a population are selected to mate. The most popular selection method is called elitism, and this is the method we will use in our genetic algorithm implementation.

The crossover operator determines how the parents are recombined to form offspring. We will be using single point crossover in this implementation.

Mutation inserts randomness into the genotype of each offspring, giving it the ability to diversify from the features of its parents.
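The three operators can be sketched briefly in Python. This is an illustrative sketch, not the post's C# implementation; the function names and the digit-genome representation are assumptions chosen to mirror the 1–9 sequence problem:

```python
import random

def select_elite(population, fitness, k):
    # Elitism: the k fittest individuals survive into the next generation.
    return sorted(population, key=fitness, reverse=True)[:k]

def single_point_crossover(mum, dad):
    # Single-point crossover: cut both parents at one random point
    # and swap the tails to produce two children.
    point = random.randint(1, len(mum) - 1)
    return mum[:point] + dad[point:], dad[:point] + mum[point:]

def mutate(genome, rate=0.1):
    # With probability `rate`, replace each gene with a random digit.
    return [random.randint(1, 9) if random.random() < rate else g
            for g in genome]
```

For example, crossing `[1, 1, 1, 1, 1]` with `[2, 2, 2, 2, 2]` yields two children that each begin with one parent's genes and end with the other's.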

Implementing genetic algorithms can be seen as somewhat of an art, because almost all of the code is boilerplate except for the chromosome representation used and how the fitness of each individual is calculated. These two factors usually have the most impact on the accuracy and speed of the genetic algorithm and are the most difficult to get right. They, along with the mutation rate, crossover rate and selection method, have to be tinkered with until a viable configuration is reached.

This implementation will evolve the number sequence 123456789 in that specific order.

Interesting Functions

private static void CalculateFitness(sequence seq)
{
    // Fitness = number of genes already in the correct position.
    var ordered = new sequence { buffer = new List<string> { "1", "2", "3", "4", "5", "6", "7", "8", "9" } };

    int result = 0;
    for (int i = 0; i < 9; i++)
    {
        if (ordered.buffer[i] == seq.buffer[i]) result++;
    }
    seq.fitness = result;
}


private void Epoch(List<sequence> population)
{
    // Carry the fittest individuals over unchanged (elitism).
    survivors.AddRange(population.Where(i => i.fitness >= survivorThreshold));

    // Mutate individuals whose fitness is above the mutation threshold.
    Mutate(population.Where(i => i.fitness >= mutantThreshold));

    // Recombine individuals whose fitness is above the crossover threshold.
    CrossOver(population.Where(i => i.fitness >= crossoverThreshold));

    // Top up the next generation with fresh random sequences.
    for (int i = 0; i < populationSize - survivors.Count; i++)
    {
        var temp = GenerateSequence();
        survivors.Add(temp);
    }
}

The full code listing can be found on GitHub.