Solved it by using SFrame.read_csv() instead. Now the program runs successfully, but there's a new problem:
I'm trying to process an 18MB input dataset in which each row has ~1700 features. In a test run where I picked out only 10 rows of the file, the program ran without problems. But on the full dataset, it fails with:
Unable to reach server for 3 consecutive pings. Server is considered dead. Please exit and restart.
Traceback (most recent call last):
File "LogisticRegression.py", line 7, in <module>
File "/usr/local/lib/python2.7/dist-packages/graphlab/toolkits/regression/logisticregression.py", line 303, in create
ret = graphlab.toolkits.main.run("regressiontraininit", opts)
File "/usr/local/lib/python2.7/dist-packages/graphlab/toolkits/main.py", line 73, in run
(success, message, params) = unity.runtoolkit(toolkitname, options)
File "cyunity.pyx", line 59, in graphlab.cython.cyunity.UnityGlobalProxy.runtoolkit
File "cyunity.pyx", line 63, in graphlab.cython.cyunity.UnityGlobalProxy.runtoolkit
RuntimeError: Communication Failure: 113.
[INFO] Stopping the server connection.
Unable to reach server for 4 consecutive pings. Server is considered dead. Please exit and restart.
[WARNING] <type 'exceptions.IOError'>
[WARNING] <type 'exceptions.ValueError'>
[INFO] GraphLab server shutdown
It seems the program cannot handle this 18MB input file (which doesn't seem that big to me). I'm running on a 2-core machine with 7GB of memory. Any suggestions? Thanks!
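For context, here is a rough back-of-envelope estimate of what an 18MB CSV with ~1700 features per row might occupy once parsed into a dense numeric matrix. The per-value byte counts below are assumptions for illustration, not measurements of the actual file, and the real footprint inside the GraphLab server could be considerably larger (temporary copies, string columns, per-feature metadata during training).

```python
# Back-of-envelope memory estimate for the 18MB, ~1700-feature CSV.
# BYTES_PER_VALUE_TEXT is an assumed average width of one value on disk
# (e.g. "3.14," is 5 characters) -- adjust for your data.
FILE_SIZE = 18 * 1024 * 1024        # 18MB on disk
FEATURES = 1700
BYTES_PER_VALUE_TEXT = 5            # assumed: digits plus delimiter
BYTES_PER_VALUE_DOUBLE = 8          # each value parsed into a C double

rows = FILE_SIZE // (FEATURES * BYTES_PER_VALUE_TEXT)
dense_bytes = rows * FEATURES * BYTES_PER_VALUE_DOUBLE

print("estimated rows: %d" % rows)                                   # ~2220
print("dense double matrix: %.1f MB" % (dense_bytes / 1024.0 / 1024.0))  # ~28.8
```

Under these assumptions the parsed matrix is only tens of MB, well under 7GB, which suggests the crash is more likely a server-side failure (the "Communication Failure: 113" / dead-server messages) than a simple out-of-memory condition, though the toolkit's internal copies during training could still multiply the footprint.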