GraphLab does not converge when running SGD

User 934 | 11/11/2014, 4:19:30 PM

I am currently running GraphLab and have found that SGD (stochastic gradient descent) does not converge unless I set max_iter. I also set tol (tolerance) to a large value (1000), but it makes no difference. This is the command line I used to run SGD:

./sgd --matrix /project/aachien/scratch/inputfy/syntheticdata/als0-0 --D 20 --tol 1000 --ncpus 4

Is setting a maximum number of iterations a requirement for running this algorithm? Can't SGD terminate on its own once its prediction error falls within a small range?
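For reference, the kind of tolerance-based stopping rule you describe usually compares the improvement in training RMSE between full sweeps against tol and stops once it falls below that threshold. The sketch below is an illustration of that idea only, not GraphLab's actual implementation; the function name, parameters, and defaults are all made up for the example.

```python
import math
import random

def sgd_factorize(ratings, D=20, lr=0.05, reg=0.05, tol=1e-6, max_iter=300):
    """Toy SGD matrix factorization with a tolerance-based stopping rule.

    ratings: list of (row, col, value) triples.
    Stops when the per-sweep improvement in training RMSE drops below
    `tol`, or after `max_iter` sweeps. Returns (rmse, sweeps_run).
    """
    random.seed(0)  # deterministic for the example
    n_rows = max(r for r, _, _ in ratings) + 1
    n_cols = max(c for _, c, _ in ratings) + 1
    U = [[random.gauss(0, 0.1) for _ in range(D)] for _ in range(n_rows)]
    V = [[random.gauss(0, 0.1) for _ in range(D)] for _ in range(n_cols)]

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    prev_rmse = float("inf")
    for it in range(max_iter):
        for r, c, v in ratings:
            err = v - dot(U[r], V[c])
            for k in range(D):
                u, w = U[r][k], V[c][k]  # read both before updating either
                U[r][k] += lr * (err * w - reg * u)
                V[c][k] += lr * (err * u - reg * w)
        rmse = math.sqrt(
            sum((v - dot(U[r], V[c])) ** 2 for r, c, v in ratings) / len(ratings)
        )
        if prev_rmse - rmse < tol:  # improvement below tolerance: stop
            return rmse, it + 1
        prev_rmse = rmse
    return rmse, max_iter
```

Note that under this semantics a *large* tol makes the loop stop almost immediately (any sweep whose improvement is under 1000 triggers termination), so in this sketch tol=1000 would cut training short rather than relax it.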

Comments

User 6 | 11/11/2014, 4:23:54 PM

Hi, PowerGraph's SGD is not guaranteed to converge, because we reorder updates and some updates are applied with a delay.
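The effect of delayed updates can be seen even in a deterministic toy problem. The sketch below illustrates stale gradients in general (it is not PowerGraph's scheduler): plain gradient descent on f(x) = x², where each gradient may be computed from a parameter value several steps old. A step size that converges with immediate updates can diverge once updates arrive late.

```python
def gradient_descent_with_delay(lr, delay, steps, x0=1.0):
    """Minimize f(x) = x**2 with grad f(x) = 2x, applying each
    gradient `delay` steps late. delay=0 is plain gradient descent."""
    history = [x0] * (delay + 1)  # history[-1] is the current iterate
    for _ in range(steps):
        stale_x = history[-(delay + 1)]  # parameter value `delay` steps old
        history.append(history[-1] - lr * 2 * stale_x)
    return history[-1]
```

With lr = 0.7, the immediate version contracts by a factor of |1 - 2·0.7| = 0.4 per step and converges, while the same step size with a 3-step delay oscillates with growing amplitude, which is the kind of behavior that makes convergence guarantees hard under delayed updates.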

I highly recommend switching to GraphLab Create, where we have improved our SGD significantly.