Archive | May 2015

Neural Networks

This has been a very busy semester at UÉ, so I haven’t written much for this blog. In this post I give an update on something that has kept me busy recently: Neural Networks.

I’ve taken a course on Machine Learning, and one of the topics I wanted to explore was Neural Networks. To do so I started by reading books on the subject, then found a few things online and worked from those.

Learning

The book “Machine Learning”, by Tom Mitchell, was a great starting point. It talked about Perceptrons and Sigmoid units, gave an introduction to Gradient Descent, introduced the concept of a Neural Network built from Sigmoid Neurons, and finally talked about Back Propagation and how this algorithm is used to train the network.
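To make those pieces a bit more concrete, here is a minimal Python+NumPy sketch of a sigmoid neuron and a single Gradient Descent step. The input, weight, bias, learning rate and gradient values are all made up for illustration, not taken from any particular example.

```python
import numpy as np

def sigmoid(z):
    # Squashes any real-valued input into the (0, 1) range.
    return 1.0 / (1.0 + np.exp(-z))

# A single sigmoid neuron: weighted sum of the inputs plus a bias,
# passed through the sigmoid activation.
x = np.array([0.5, -1.0, 2.0])   # made-up inputs
w = np.array([0.1, 0.4, -0.2])   # made-up weights
b = 0.3                          # made-up bias
output = sigmoid(np.dot(w, x) + b)

# One Gradient Descent step: move the weights a small amount (the
# learning rate eta) against the gradient of the error function.
eta = 0.1
grad = np.array([0.02, -0.01, 0.05])  # placeholder gradient
w = w - eta * grad
```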

After that I looked for things online. Of course, there are many resources available. I found two websites to be of particular interest:

Neural Networks and Deep Learning

and

Nature of Code

The Nature of Code website offers a free book (which can also be bought) that covers NNs and their applications in one chapter. The book is not only about machine learning; it covers several other topics as well. It also has a lot of examples using Processing.js, which is a JavaScript (:D) port of Processing. Really an interesting read.

The first website, “Neural Networks and Deep Learning”, really caught my attention. It explains NNs in full detail and goes on to show an example of using NNs to recognize handwritten digits from the famous MNIST Dataset. The network takes a 28×28 pixel image of a handwritten digit from 0 to 9 and recognizes which digit it actually represents. The site explains all the equations used in detail and also provides a Python+Numpy implementation.
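To give an idea of what such a network looks like, here is a rough Python+NumPy sketch of the feedforward pass for a network with 784 inputs (the 28×28 pixels) and 10 outputs, one per digit. The 30-neuron hidden layer and the random “image” are placeholder choices of mine, just to show the shapes involved.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Layer sizes: 784 inputs (28x28 pixels), one hidden layer
# (30 neurons, an arbitrary choice here), 10 outputs (digits 0-9).
sizes = [784, 30, 10]

# Random initial biases and weights for every layer after the input.
biases = [np.random.randn(y, 1) for y in sizes[1:]]
weights = [np.random.randn(y, x) for x, y in zip(sizes[:-1], sizes[1:])]

def feedforward(a):
    # Propagate a 784x1 input vector through every layer in turn.
    for b, w in zip(biases, weights):
        a = sigmoid(np.dot(w, a) + b)
    return a

image = np.random.rand(784, 1)              # stand-in for a flattened MNIST image
prediction = np.argmax(feedforward(image))  # index of the most active output neuron
```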

Implementation

I wanted to implement my own version of the algorithms. I prefer to do so because I learn much more than by just using an implementation found on the Internet. My version is very similar to the one by the author of the book, but it does have a few differences.

Implementing this algorithm took a few hours of understanding every little detail of the equations that describe how Back Propagation calculates the derivatives of the quadratic error function, how Gradient Descent uses those results, how training is done, etc. It was worth it.
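As a rough illustration of those pieces (a sketch, not my exact implementation), here is a minimal Back Propagation pass for the quadratic error on a tiny made-up network, followed by a single Gradient Descent update from one training example. The network sizes and the example pair (x, y) are arbitrary.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # Derivative of the sigmoid, needed by Back Propagation.
    return sigmoid(z) * (1.0 - sigmoid(z))

# Tiny network for illustration: 3 inputs, 4 hidden neurons, 2 outputs.
sizes = [3, 4, 2]
biases = [np.random.randn(y, 1) for y in sizes[1:]]
weights = [np.random.randn(y, x) for x, y in zip(sizes[:-1], sizes[1:])]

def backprop(x, y):
    # Gradients of the quadratic error C = 1/2 * ||output - y||^2
    # with respect to every bias and weight, for one example (x, y).
    nabla_b = [np.zeros(b.shape) for b in biases]
    nabla_w = [np.zeros(w.shape) for w in weights]

    # Forward pass: store every weighted input z and activation a.
    activation = x
    activations = [x]
    zs = []
    for b, w in zip(biases, weights):
        z = np.dot(w, activation) + b
        zs.append(z)
        activation = sigmoid(z)
        activations.append(activation)

    # Backward pass: error at the output layer, then propagate it back.
    delta = (activations[-1] - y) * sigmoid_prime(zs[-1])
    nabla_b[-1] = delta
    nabla_w[-1] = np.dot(delta, activations[-2].T)
    for l in range(2, len(sizes)):
        delta = np.dot(weights[-l + 1].T, delta) * sigmoid_prime(zs[-l])
        nabla_b[-l] = delta
        nabla_w[-l] = np.dot(delta, activations[-l - 1].T)
    return nabla_b, nabla_w

# One Gradient Descent update from a single made-up example.
x, y = np.random.rand(3, 1), np.array([[1.0], [0.0]])
nabla_b, nabla_w = backprop(x, y)
eta = 0.5
weights = [w - eta * nw for w, nw in zip(weights, nabla_w)]
biases = [b - eta * nb for b, nb in zip(biases, nabla_b)]
```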

The code can be found on my GitHub page.

Experiments

I’ve made two simple experiments. They consist of creating a training set of random points, each labelled as belonging to a given class. The network then learns from that training set and is able to classify new points into their correct classes.
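As an illustration of what I mean, here is a hypothetical way to build such a training set in Python+NumPy. The labelling rule used here (which side of the line y = x a point falls on) is just an example, not necessarily the one used in the actual experiments.

```python
import numpy as np

# 200 random 2D points, labelled by which side of the line y = x they
# fall on (an example rule, not necessarily the one from my experiments).
points = np.random.uniform(-1.0, 1.0, size=(200, 2))
labels = (points[:, 1] > points[:, 0]).astype(float)

# Pair each point with its target in the (input vector, target) shape
# that the network sketches above expect.
training_set = [(p.reshape(2, 1), np.array([[t]]))
                for p, t in zip(points, labels)]
```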

I don’t have anything too formal to show yet, or any real example. In the following weeks I will be looking into real applications of NNs and I will use my implementation on the MNIST Dataset, as in the book.