Hi Danny, thanks for replying. Yes, I'm planning to use HDFS. My cluster has approximately 20 machines and the data I want to process is huge, so I cannot ignore HDFS.
Even when compiling the specific toolkit I get the same error; it is still trying to build the Hadoop it downloaded.
~/graphlab/release/toolkits/graph_analytics $ make
[ 0%] Performing build step for 'hadoop'
./libtool: eval: line 4783: unexpected EOF while looking for matching `''
./libtool: eval: line 4784: syntax error: unexpected end of file
make: *** [libhdfs.la] Error 2
make: *** [../deps/hadoop/src/hadoop-stamp/hadoop-build] Error 2
make: *** [CMakeFiles/hadoop.dir/all] Error 2
make: *** [all] Error 2
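For what it's worth, a libtool "unexpected EOF while looking for matching `'`" at an eval usually points to an unbalanced quote leaking in from the environment (e.g. CFLAGS/LDFLAGS) or to a shell mismatch, rather than to the package being built. A rough diagnostic sketch (paths and variable names are my assumptions, not from the GraphLab docs):

```shell
# Hedged diagnostic sketch for the libtool eval EOF error.

# 1) Check which shell /bin/sh points to; generated libtool scripts are
#    sensitive to shell differences (e.g. dash vs bash).
ls -l /bin/sh
bash --version | head -n 1

# 2) Look for an unbalanced quote in the flags the build inherits.
for v in CFLAGS CXXFLAGS LDFLAGS CPPFLAGS; do
    printf '%s=%s\n' "$v" "$(printenv "$v")"
done

# 3) (Manual step) Re-run the failing libtool invocation with shell
#    tracing from inside the downloaded hadoop source dir under deps/,
#    to see exactly what the eval receives:
# sh -x ./libtool --mode=link ...   # append the original link command
```

If step 2 shows a flag with a stray single quote, exporting a corrected value and re-running make may be enough.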
I checked both config.log and configure.deps but could not find any problem there. Here is the content of configure.deps, and I've attached the log:
Release build directory:
Debug build directory (optimization disabled):
Directory in which graphlab is installed (prefix):
Is experimental (research) code enabled:
The graphlab home directory:
The directory in which graphlab installs external dependencies:
Use OpenMP? This can accelerate some graph building code:
Use MPI? Without MPI GraphLab cannot run distributed:
Use tcmalloc? Thread-Caching Malloc improves memory allocation:
The c compiler to use:
The c++ compiler to use:
Any additional user-defined CFLAGS:
The Java compiler:
The cmake binary used to generate the project:
Thanks for your help.
NOTE: I have Hadoop installed on my system. I hope that won't be an issue while installing GraphLab, because I see the error involves building the Hadoop that GraphLab downloads when running make for the first time.
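To check whether the system-wide Hadoop is leaking into the build environment, something like this might help (these are the usual Hadoop environment variable names; which ones matter for GraphLab's build is my assumption):

```shell
# Sketch: see whether an existing Hadoop install is visible to the build.
which hadoop || echo "no hadoop on PATH"
printf 'HADOOP_HOME=%s\n'   "${HADOOP_HOME:-unset}"
printf 'HADOOP_PREFIX=%s\n' "${HADOOP_PREFIX:-unset}"
printf 'JAVA_HOME=%s\n'     "${JAVA_HOME:-unset}"
printf 'CLASSPATH=%s\n'     "${CLASSPATH:-unset}"
```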
EDIT: This error also occurs when compiling without Hadoop, so it seems the error is related to something else, not to GraphLab itself.
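Since the failure is independent of Hadoop, one way to confirm it isn't coming from local environment settings is to retry the build under a stripped-down environment (a sketch; the PATH value is an assumption for a typical Linux box):

```shell
# Run a command with a minimal environment so locally exported flags,
# aliases, and variables cannot interfere; replace the echo with `make`
# from the graph_analytics build directory to retry the failing step.
env -i PATH=/usr/bin:/bin HOME="$HOME" sh -c 'echo "clean env PATH=$PATH"'
```

If the build succeeds in the clean environment, the culprit is something exported in the login shell rather than GraphLab or Hadoop.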