Neural Networks

Designing A Custom Neural Network In MATLAB

Posted on Updated on

The MATLAB Neural Network Toolbox ships with numerous predefined and canonical neural nets; however, sometimes you may need to create a custom net with just the right connections, biases and hidden layers to suit your particular problem domain. To achieve this goal we can use the MATLAB network object. The network object allows granular design of neural networks by exposing all properties of the net that we are designing. The following code demonstrates how to build a simple neural network to learn the truth table for Logical AND.

First lets look at the Logical AND truth table:

p | q | p ∧ q
T | T |   T
T | F |   F
F | T |   F
F | F |   F

Open a new edit window in MATLAB and enter the following code:

    1. This creates an empty network object and assigns it to the net variable, sets the number of inputs, and uses cell-array syntax to index into its properties.
      %% Design Custom Neural Network
      net = network;                                  % create empty network object
      net.numInputs = 2;                              % set number of inputs
      net.inputs{1}.size = 1;                         % each input carries a single value
      net.inputs{2}.size = 1;
      net.numLayers = 1;                              % add 1 layer to network
      net.layers{1}.size = 1;                         % set number of neurons in layer
      net.inputConnect(1) = 1;                        % connect input 1 to layer 1
      net.inputConnect(2) = 1;                        % connect input 2 to layer 1
      net.biasConnect(1) = 1;                         % connect bias to layer 1
      net.biases{1}.learnFcn = 'learnp';              % set bias learning function
      net.biases{1}.initFcn = 'initzero';             % set bias init function
      
      net.outputConnect(1) = 1;                       % expose layer 1 as the network output
      
      net.layers{1}.transferFcn = 'hardlim';          % set layer transfer function [hard limit]
      net.inputWeights{1}.initFcn = 'initzero';       % set input weights init function
      net.inputWeights{1}.learnFcn = 'learnp';        % set input weight learning function
      net.inputWeights{2}.learnFcn = 'learnp';
      net.inputWeights{2}.initFcn = 'initzero';
      
      net.initFcn = 'initlay';                        % set network init function
      net.trainFcn = 'trainc';                        % set network training function
      net.performFcn = 'mae';                         % set network perf evaluation function
      
      view(net)
      net = train(net,[0 0 1 1;0 1 0 1],[0 0 0 1]);   % train network
      
    2. Custom Network Diagram:
    3. Test Network
      In the command window type

      net([1;1])
      

      This should output a 1 to the command window indicating 1 AND 1 = 1
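To check the full truth table at once, the trained net can be given all four input pairs in a single call, using the same 2 x 4 matrix that was used for training:

```matlab
% Evaluate all four rows of the AND truth table in one call;
% each column is one [p;q] input pair.
inputs  = [0 0 1 1;
           0 1 0 1];
outputs = net(inputs);   % expected after successful training: [0 0 0 1]
disp(outputs)
```

Since AND is linearly separable, the perceptron learning rule is guaranteed to converge, so the trained net should reproduce the truth table exactly.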

Image Classification Using MATLAB SOM/LVQ

Posted on Updated on

I like to think of myself as a hacker :-), not in today’s sense of the word [a person who breaks into secured computer systems] but as a hacker in the sense of the first two definitions found here. I like to experiment with things, especially things related to computers and Artificial Intelligence in particular. MATLAB happens to be a veritable toolbox of the trade, pun intended :-); using some of its toolboxes we will see how we can solve a ubiquitous problem that faces most persons with a nice camera and a voracious appetite for taking pictures.

Now, I don’t have an expensive Nikon, but I do have loads of pictures; and one day I was trying to find this particular picture when it occurred to me that if I could not remember the name or the date when I took the picture, I would have to search blindly through every picture I had in order to find that special one. Now what if I had some way of finding the picture based on what I could remember of it? I.e. the environment in which it was taken, the colours and objects, along with some other visual specifications. Wouldn’t that be cool?

So I went for the proverbial toolbox, MATLAB. Which tools will I need?

  1. Neural Network    -> selforgmap, lvqnet, vec2ind, ind2vec
  2. Image Processing -> imhist, imresize, rgb2gray, imread

Other:

  1. mspaint

Note: For this demonstration I will be using:

  • MATLAB R2011b on Windows 7
  • Pictures found at  C:\Users\Public\Pictures\Sample Pictures

OK, let’s do it: start up MATLAB, then copy and paste all the pics from the above directory into MATLAB’s current directory. {Chrysanthemum, Desert, Hydrangeas, Jellyfish, Koala, Lighthouse, Penguins, Tulips}.

  1. In a new edit window copy and paste the code as given below. Save the file as scan.m
    function scan(img)
    files = dir('*.jpg');
    hist = [];
    for n = 1 : length(files)
        filename = files(n).name;
        file = imread(filename);
    
        hist = [hist, imhist(rgb2gray(imresize(file,[ 50 50])))]; %#ok
    end
    
    som = selforgmap([10 10]);
    som = train(som, hist);
    t   = som(hist); %extract class data
    
    net = lvqnet(10);
    net = train(net, hist, t);
    
    like(img, hist, files, net)
    end
                   

    Links to the functions used were provided above, therefore I will not be going into the details of how they work; however, here is a narrative with regard to the workings of the code:

    The code starts by searching the current MATLAB directory for all files with a .jpg extension. On each iteration of the loop an image is loaded and resized to 50 x 50, it is then converted to greyscale, and a histogram measurement is taken of its pixels [the feature vector]; the results are then appended to a 256 x n matrix, with n being the number of images scanned.

    A self organizing map network is then used to identify classes into which the images fall. The feature matrix and the class data are used to train a Learning Vector Quantization neural network, which will be used for classification of images presented to it.

  2. Next we will create a function to display all matching images for an image we submit to the LVQ network.
    function like(im, hist, files , net)
        hs = imhist(rgb2gray(imresize(im,[50 50])));
        cls = vec2ind(net(hs));
    
        [~, n] = size(hist);
        for i = 1 : n
            if(cls == vec2ind(net(hist(:, i))))
                figure('name', files(i).name);
                imshow(imresize(imread(files(i).name), [100 100]))
            end
        end
    end
    
  3. Download a picture of a koala and save it outside your MATLAB path as koalatest.jpg
  4. At the MATLAB command prompt type scan(imread('[replace with path to koalatest]\koalatest.jpg'))
  5. After a minute or two the networks should be trained, and a figure displaying the matching Koala.jpg image will be shown to you.
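As a quick sanity check on the feature extraction used in scan.m, you can compute the feature vector for a single image and confirm it is 256 x 1 (one histogram bin per grey level). The file name here is just one of the sample pictures; any image in the directory will do:

```matlab
% Build the same feature vector scan.m computes for each image:
% resize to 50x50, convert to greyscale, take the 256-bin histogram.
img = imread('Koala.jpg');    % any of the sample pictures
hs  = imhist(rgb2gray(imresize(img, [50 50])));
size(hs)                      % 256 x 1 feature vector
```

If this shape is wrong for any image (for example a greyscale jpg with no colour channels), rgb2gray will error, which is worth knowing before pointing scan.m at your own photo collection.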

NOTE: As explained above, this is hacking, not production code; I wrote it up in about 20 minutes as a demonstration of image classification. With imagination this could be extended to classify things like sound, for example using a feature map created from humming a tune to find a song with a similar melody.

The LVQ network being trained:

Predicting The Lottery With MATLAB® Neural Network

Posted on Updated on

DISCLAIMER: This post does not in any way prove or disprove the validity of using neural networks to predict the lottery. It is purely for the purpose of demonstrating certain capabilities available in MATLAB®. The results and conclusions are my opinion and may or may not constitute applicable techniques for predicting the popular Jamaican Lottery Cash Pot game.

Background:

Supreme Ventures Jamaica Limited has a lottery game called Cash Pot (CP). The game is based on 36 balls being loaded into a chamber and one ball being selected at random from the grouping. The game is run four (4) times each day, seven (7) days per week.

Anecdotal Heuristics:

While doing a little tongue-in-cheek research at my favorite barbershop, I stumbled upon some heuristics that are employed by most patrons who play the CP game. One involved writing down the day, time, and winning number for each day’s lottery. After building up a sufficient dataset, they could then query a particular day and time, and with some simple arithmetic tally the most likely number to be played on that day and time. I was informed that this proved to be a very efficient way of telling which number would be played next. Another popular heuristic involved pre-assigned symbols; these symbols were associated with each of the thirty-six (36) numbers. Then, based on dreams aka “rakes”, numbers would be chosen that matched the symbols seen in the “rake”. These two methods were the favorites amongst the players of Cash Pot.

Procedure for predicting Cash Pot with MATLAB ANN:

  1. Get the dataset from Supreme Ventures Jamaica website. [contains all winning numbers with date and time]
  2. We will need to do some twiddling with the file in order to get it into a format that MATLAB can use. To do that we need to remove all headings/sub-headings and labels.
  3. Next remove the DRAW# and MARK columns since we will not be using those in our analysis.
  4. In column D use the =WEEKDAY() formula to get the day number from the corresponding date: repeat for all rows.
  5. Use find and replace to replace MORNING with 1, MIDDAY with 2, DRIVETIME with 3 and EVENING with 4. [Save the file]
  6. Using MATLAB folder explorer, navigate to the file then double click on it to run the import tool.
  7. Select columns B and D then hit the import button; this should import only columns B and D, rename the imported matrix to cpInputs .
  8. Select column C and hit the import button; this should import column C only, rename the imported matrix to  cpTargets.
  9. Because MATLAB sees Neural Network(NN) features as rows, transpose the two matrices using
  10. cpInputs = cpInputs';
    cpTargets = cpTargets';
    
    
  11. In the MATLAB command window type nntool.
  12. Import cpInputs and cpTargets into the NN data manager.
  13. Hit the new button on the Neural Network Data Manager and change the default name to cpNN.
  14. Set Input data to cpInputs, Target data to cpTargets.
  15. Hit the create button to create the NN.
  16.  

    Note:

    The newly created NN has two inputs, the first being the day of the week on which the [CP] draw is scheduled to be played and the second being the time of day at which the [CP] draw is scheduled to be played. It also has a hidden layer of 10 neurons with an associated bias, and an output layer with 1 neuron and its associated bias. The output is a scalar double which represents the predicted winning number.

  17. Let’s go ahead and train this network. On the train tab of the Network: cpNN dialog, select cpInputs for Inputs and cpTargets for Targets; then press the Train Network button to start the network training.
  18. Results of training.
  19. After training the network to the desired tolerances, go back to the Neural Network/Data Manager dialog box and hit the export button; select cpNN from the list, then hit the export button.
  20. Go back to the MATLAB command window and type
  21. cpNN([2;3]) % [day;time]
    
  22. The resulting value will be the NN’s best guess of what will be the winning entry for Cash Pot on a Tuesday at DRIVETIME.
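For readers who prefer scripts to the nntool GUI, the steps above can be sketched as plain code. This is a hedged equivalent, not the exact network the GUI builds: fitnet with 10 hidden neurons matches the architecture described in the note above, and cpInputs/cpTargets are assumed to already be in the workspace (transposed so that features are rows):

```matlab
% Scripted rough equivalent of the nntool steps: a 2-input fitting
% network with one hidden layer of 10 neurons, trained on
% [day; time] -> winning number.
cpNN = fitnet(10);                        % 10 hidden neurons, as in the GUI
cpNN = train(cpNN, cpInputs, cpTargets);  % same data as steps 7-8
prediction = cpNN([2; 3])                 % day 2 at DRIVETIME (3), per the encoding above
```

This makes it easy to re-run the whole experiment after appending new draw results to the dataset, instead of clicking through the data manager again.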

Conclusions:

My initial analysis of the results of the NN was not conclusive, maybe the parameters of the NN could be adjusted and the results compared to actual winning numbers. However, even after doing so one may find that the outputs are still random and contain no discernible patterns, which should be the case for a supposedly random game of chance.

Using A Perceptron ANN With Genetic Algorithm Trainer To Solve Logical AND

Posted on Updated on

Neural nets (NN) have always been a “black box” fascination for me until recently, when I had my eureka moment. Neural networks are mainly used for classification and prediction; applications include stock market trend prediction, feature recognition and some types of optimization techniques. A NN can be trained in two ways: supervised and unsupervised.

Supervised
The NN is exposed to a dataset with the training values and the correct outputs to those values. A Training algorithm such as back propagation is used to minimize the error between resulting outputs and the correct outputs.

Unsupervised
The NN is allowed to form its own conclusions based on the dataset provided. These types of networks are usually used for classification-type problems; a very good example of this is the SOM (Self Organizing Map) network.

That said, if you have not already read my post on Genetic Algorithms (GA) please do so now, since we will be using GA’s as the means of optimizing the required weights for each input of the Perceptron.

Wow, you are back already? OK, let’s look at a simple NN to see how it works. The NN we will be reviewing is called the Perceptron; this particular configuration has only two input neurons, since we will be dealing with a binary operation, and one output neuron.

Simple Perceptron

i = inputs
w = weights
o = output
b = bias

A bias is also associated with the network; this bias is simply a neuron with a constant value multiplied by a weight. The bias is added to the activation function in order for the NN to recognize problems that are not linearly separable. Each input is multiplied by a corresponding weight; a weight is a small value which will be adjusted until the output neuron is activated correctly. Activation occurs when the activation function Σ(i1*w1+i2*w2)+b is greater than a certain threshold. The output of the activation function is then sent to a transfer function, which interprets that output as 1 or 0 depending on the inputs supplied to the NN.
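As a worked sketch of the formula above, here is a tiny hand-rolled perceptron for Logical AND. The weights w = [1 1] and bias b = -1.5 are example values a GA trainer might converge to; they are assumptions for illustration, not the output of an actual GA run:

```matlab
% Hand-rolled perceptron computing Logical AND.
% Activation: a = i1*w1 + i2*w2 + b; output 1 when a > 0, else 0.
w = [1 1];                              % example weights (assumed, not GA-derived)
b = -1.5;                               % example bias
step = @(a) double(a > 0);              % hard-limit transfer function
perceptron = @(i) step(i * w' + b);     % i is a row vector [i1 i2]

perceptron([0 0])     % 0  (0 + 0 - 1.5 = -1.5, below threshold)
perceptron([0 1])     % 0  (0 + 1 - 1.5 = -0.5)
perceptron([1 0])     % 0  (1 + 0 - 1.5 = -0.5)
perceptron([1 1])     % 1  (1 + 1 - 1.5 =  0.5, above threshold)
```

Only the [1 1] input pushes the activation above zero, which is exactly the decision boundary a GA would need to discover by evolving w and b.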