XOR toy example with MLP

User 5344 | 6/30/2016, 9:56:10 PM

How exactly should I build this simple XOR example with `graphlab.deeplearning.MultiLayerPerceptrons`?

I use this code for the net:

```python
net = gl.deeplearning.MultiLayerPerceptrons(num_hidden_layers=2, num_hidden_units=[3, 2])
```

Then the model is this:

```python
XOR_ann_model = gl.neuralnet_classifier.create(data, target='Y', network=net)
```

SFrame used:

```python
data = graphlab.SFrame({'X1': [0, 0, 1, 1], 'X2': [0, 1, 0, 1], 'Y': [0, 1, 1, 0]})
```

It seems I always get a training accuracy of 50%. Is there something wrong with my parameterization?

Comments

User 940 | 7/1/2016, 5:59:12 PM

Hi @angelcon ,

Initialization and learning rate are both important in neural nets, even for simple problems. Here's how I would modify your network (I did some experimentation).

```python
net.layers[0] = graphlab.deeplearning.layers.FullConnectionLayer(num_hidden_units=3, init_random='xavier')
net.layers[2] = graphlab.deeplearning.layers.FullConnectionLayer(num_hidden_units=2, init_random='xavier')
```

This changes the initialization from Gaussian to Xavier, which automatically scales the initial weights based on the number of input and output neurons of each layer.
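If you're curious what that scaling rule looks like outside of GraphLab, here is a minimal NumPy sketch of the Xavier (Glorot) idea; the helper name and the fan-in/fan-out values are just illustrative, not part of the GraphLab API:

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng):
    # Xavier/Glorot: draw weights with variance 2 / (fan_in + fan_out),
    # so the scale shrinks as layers get wider
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))

rng = np.random.default_rng(0)
# First layer of the XOR net: 2 inputs -> 3 hidden units
W = xavier_init(2, 3, rng)
print(W.shape)  # (2, 3)
```

The point is that the spread of the initial weights is tied to layer width, which keeps activations from saturating early; a fixed-variance Gaussian init can start the net in a region where gradients barely move, which is one way to get stuck at 50% on XOR.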

I would also change the learning rate to 1 and increase the number of iterations.

Here's how you do that:

```python
XOR_ann_model = gl.neuralnet_classifier.create(data, target='Y', network=net, learning_rate=1, max_iterations=100)
```

Now it reaches 100% training accuracy, as you'd expect for XOR.
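As a sanity check that a small hidden layer can represent XOR at all, here is a framework-free NumPy sketch with hand-set weights (an OR unit and an AND unit feeding one output unit); no training involved, and these weights are just one of many valid choices:

```python
import numpy as np

def step(z):
    # Hard threshold activation
    return (z > 0).astype(int)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

# Hidden layer: unit 1 fires on OR(x1, x2), unit 2 fires on AND(x1, x2)
W1 = np.array([[1.0, 1.0],
               [1.0, 1.0]])
b1 = np.array([-0.5, -1.5])
h = step(X @ W1 + b1)

# Output: OR minus AND, i.e. "at least one but not both" -> XOR
w2 = np.array([1.0, -1.0])
b2 = -0.5
y = step(h @ w2 + b2)
print(y.tolist())  # [0, 1, 1, 0]
```

This is why the 50% plateau is an optimization problem (bad init / learning rate), not a capacity problem: the architecture can compute XOR exactly, the training just has to find such a configuration.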

I hope this helps.

Cheers! -Piotr