User 1319 | 3/6/2015, 3:05:42 AM
I am training a deep learning model (for image classification) with a modified <i class="Italic">Imagenet</i> net (I changed <i class="Italic">num_hidden_units</i> in layer 22 to match the number of classes in my application). I'm using GraphLab Create 1.3 on EC2's NVIDIA GRID K520 GPU (3,072 cores, 8 GB). The training set is around 30K images (resized to 256×256) with 5 classes.
I would greatly appreciate it if you kindly answer the following questions:
1- Is it possible to retrain a deep learning model by passing a trained network to the <i class="Italic">net</i> argument of <i class="Italic">gl.neuralnet_classifier.create</i>? This would save a lot of time when more iterations are required to improve the model's accuracy, or when new observations become available.
2- What is the stopping criterion for GraphLab's deep learning neural network?
3- From your experience, what would be an educated guess for <i class="Italic">max_iterations</i>?
4- Given the same number of observations in two datasets, does a difference in the number of classes greatly increase the required <i class="Italic">max_iterations</i> (for example, two datasets with 50K images each, but one with 10 classes and the other with 100)?
5- I've noticed that <i class="Italic">gl.neuralnet_classifier.create</i> does not have a <i class="Italic">class_weights</i> argument. Does <i class="Italic">gl.neuralnet_classifier.create</i> handle an imbalanced dataset automatically, or do I have to under-sample my dataset to balance the classes before training the model?
6- In my model, I kept <i class="Italic">validation='auto'</i>. Is the 5% validation set chosen using stratified random sampling (to maintain a similar class distribution)?
7- To make the results of deep learning models reproducible, would it be possible to add a <i class="Italic">seed</i> argument to <i class="Italic">gl.neuralnet_classifier.create</i> (similar to <i class="Italic">random_split(0.8, seed=5)</i>)? This would also be useful for other stochastic models (e.g., Boosted Trees).
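For question 5, here is a plain-Python sketch of the kind of under-sampling I would otherwise do myself before training (this is not GraphLab API; the <i class="Italic">undersample</i> helper and the list-of-dicts row format are just illustrative):

```python
import random
from collections import defaultdict

def undersample(rows, label_key, seed=5):
    """Down-sample every class to the size of the rarest class.

    rows: list of dicts, each with a class label under `label_key`.
    Returns a shuffled, class-balanced list of rows.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    by_class = defaultdict(list)
    for row in rows:
        by_class[row[label_key]].append(row)
    n_min = min(len(members) for members in by_class.values())
    balanced = []
    for members in by_class.values():
        # sample without replacement down to the rarest class size
        balanced.extend(rng.sample(members, n_min))
    rng.shuffle(balanced)
    return balanced
```

If the classifier weighted classes internally (a <i class="Italic">class_weights</i> argument), this preprocessing step would not be needed.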
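And for question 6, this is what I mean by a stratified validation split: hold out the same fraction of each class rather than a single random 5% of all rows (again a plain-Python illustration, not GraphLab code; the <i class="Italic">stratified_split</i> helper is my own):

```python
import random
from collections import defaultdict

def stratified_split(rows, label_key, valid_fraction=0.05, seed=5):
    """Split rows into (train, valid), preserving per-class proportions.

    Takes `valid_fraction` of each class (at least one row per class)
    so the validation set mirrors the training class distribution.
    """
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for row in rows:
        by_class[row[label_key]].append(row)
    train, valid = [], []
    for members in by_class.values():
        members = members[:]        # copy before shuffling in place
        rng.shuffle(members)
        n_valid = max(1, round(len(members) * valid_fraction))
        valid.extend(members[:n_valid])
        train.extend(members[n_valid:])
    return train, valid
```

With a plain (non-stratified) 5% split, a rare class could end up with very few or zero validation examples.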
Thank you for your time.