PyLambda Worker Memory Management

User 190 | 11/5/2014, 4:04:16 PM

Hello, I'm running GraphLab Create inside an IPython notebook. On especially large projects I run out of memory at certain stages of the process.

When I check htop, I see 16 pylambda workers, each consuming ~1.5 GB of RAM.

Is there a way to manage this better? Can I get them to release the memory or decrease their memory footprint?

Comments

User 14 | 11/5/2014, 6:17:21 PM

A Python process does not free memory until it is shut down. I guess we can expose functionality to reset those workers, since they are stateless anyway.

Meanwhile, you can try reducing the number of lambda workers that GraphLab Create starts by calling

<pre><code>gl.set_runtime_config('GRAPHLAB_DEFAULT_NUM_PYLAMBDA_WORKERS', 8)</code></pre>
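
For context, a rough sketch of how that fits into a notebook session (assuming the usual <code>import graphlab as gl</code> alias; the worker count of 4 and the SArray example are purely illustrative, and I'm assuming the setting needs to be in place before the first lambda-based operation so it applies when the worker pool is spawned):

<pre><code>import graphlab as gl

# Assumption: the pylambda worker pool is created lazily, so set the
# worker count before the first lambda-based operation runs.
gl.set_runtime_config('GRAPHLAB_DEFAULT_NUM_PYLAMBDA_WORKERS', 4)

# Illustrative lambda-based operation that fans out to the worker pool.
sa = gl.SArray(range(1000000))
squared = sa.apply(lambda x: x * x)
</code></pre>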


User 190 | 11/5/2014, 8:19:20 PM

That'd be pretty key -- those little workers get pretty heavy and need to go on a memory diet once in a while.


User 1933 | 7/7/2015, 8:03:59 PM

Has there been any movement on this? This issue is totally breaking a model we're working on...