User 934 | 11/11/2014, 4:19:30 PM
I am currently running GraphLab and found that SGD (stochastic gradient descent) does not converge unless I set max_iter. I also set tol (tolerance) to a large value (1000), but it made no difference. This is the command line I used to run SGD:
./sgd --matrix /project/aachien/scratch/inputfy/syntheticdata/als0-0 --D 20 --tol 1000 --ncpus 4
Is setting a maximum number of iterations required to run this algorithm? Can't SGD terminate on its own once its prediction error falls within a small range?
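To illustrate the kind of termination I have in mind: stop once the epoch-to-epoch improvement in training error drops below tol, with max_iter only as a safety cap. This is a minimal NumPy sketch of matrix-factorization SGD with such a rule; it is not GraphLab's actual implementation, and the names (`sgd_mf`, `lr`, `reg`) are illustrative, not GraphLab parameters.

```python
import numpy as np

def sgd_mf(R, D=2, lr=0.05, reg=0.02, tol=1e-5, max_iter=2000, seed=0):
    """Factorize R (np.nan marks missing ratings) as U @ V.T via SGD.

    Terminates when the drop in training RMSE between consecutive
    epochs falls below tol; max_iter is only a safety cap.
    (Illustrative sketch, not GraphLab's SGD.)
    """
    rng = np.random.default_rng(seed)
    n, m = R.shape
    U = 0.1 * rng.standard_normal((n, D))
    V = 0.1 * rng.standard_normal((m, D))
    # Collect the observed (row, col, rating) triples.
    obs = [(i, j, R[i, j])
           for i in range(n) for j in range(m)
           if not np.isnan(R[i, j])]
    prev_rmse = np.inf
    for epoch in range(1, max_iter + 1):
        for k in rng.permutation(len(obs)):  # shuffle each epoch
            i, j, r = obs[k]
            e = r - U[i] @ V[j]              # prediction error
            Ui = U[i].copy()                 # cache pre-update row
            U[i] += lr * (e * V[j] - reg * U[i])
            V[j] += lr * (e * Ui - reg * V[j])
        rmse = np.sqrt(np.mean([(r - U[i] @ V[j]) ** 2
                                for i, j, r in obs]))
        if prev_rmse - rmse < tol:           # converged within tolerance
            return U, V, epoch, rmse
        prev_rmse = rmse
    return U, V, max_iter, rmse

# Tiny demo on a 3x3 rating matrix with missing entries.
R = np.array([[5.0, 3.0, np.nan],
              [4.0, np.nan, 1.0],
              [1.0, 1.0, 5.0]])
U, V, epochs, rmse = sgd_mf(R)
```

With a tolerance-based rule like this, max_iter never needs to fire on a well-behaved problem; it only guards against non-converging runs, which is the behavior I expected from the --tol flag.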