Big Graph Partitioning

User 2049 | 6/12/2015, 5:29:27 PM

Hi all,

I used to have GraphLab PowerGraph installed on my laptop for graph partitioning. However, I only worked with small graphs (fewer than about 5 million vertices).

Now I want to partition a big graph with about 10 million vertices and 30 million edges, which is difficult for me.

As far as I know, I must install GraphLab on a Hadoop system, so I set up a Hadoop cluster with 16 nodes (32 map tasks and 32 reduce tasks).

Can you help me install GraphLab (Dato) on my Hadoop system to partition a big graph?

Thank you very much, Quyet NV

Comments

User 19 | 6/14/2015, 8:49:12 PM

Hi Quyet,

The PowerGraph open source project is no longer supported, but you can try GraphLab Create (for free), which offers a scalable graph data structure (the SGraph) that can handle graphs of that size on a single machine.
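For example, loading an edge list into an SGraph might look roughly like the sketch below; the file name and the 'src'/'dst' column names are placeholders for whatever your own data uses.

```python
import graphlab

# Load an edge list from CSV. 'edges.csv' and the 'src'/'dst' column
# names are placeholders -- substitute your own file and field names.
edges = graphlab.SFrame.read_csv('edges.csv')

# Build the graph by adding the edges; vertices are created implicitly
# from the source and destination IDs.
g = graphlab.SGraph().add_edges(edges, src_field='src', dst_field='dst')

# Check how many vertices and edges were loaded.
print(g.summary())
```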

What kinds of algorithms are you aiming to use for graph partitioning?

Cheers, Chris