memory limit of 4GB in GPU?

User 2266 | 9/19/2015, 4:14:56 AM

Hi, I tried running the GPU version of gl with my Titan X. The Titan X comes with 12GB of memory; however, it seems like gl is not fully using the card's memory.

Is there a configuration option to set the maximum memory to be used on the GPU?

Thanks

Comments

User 940 | 9/19/2015, 5:23:20 PM

Hi @sjl070707,

The GPU is used to store and update the neural net parameters, so the amount of memory used on the GPU should be directly correlated with the size of the neural network. At some point there's simply nothing more to store on the GPU, even though there's plenty of free space left.
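To see why a modest net leaves most of 12GB free, you can estimate the parameter memory by hand. A minimal sketch, using hypothetical layer sizes (not taken from the thread) and assuming float32 parameters at 4 bytes each:

```python
# Rough estimate of parameter memory for a small fully connected net.
# Layer sizes here are hypothetical, just for illustration.
layer_sizes = [784, 4096, 4096, 10]

params = 0
for fan_in, fan_out in zip(layer_sizes, layer_sizes[1:]):
    params += fan_in * fan_out + fan_out  # weight matrix + bias vector

bytes_per_param = 4  # float32
mb = params * bytes_per_param / (1024 ** 2)
print(params, "parameters, about", round(mb, 1), "MB")
```

Even a few fully connected layers with thousands of units come out to well under 100 MB of parameters, which is why GPU memory usage can plateau far below the card's capacity.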

Are you training your own neural net? If so, if you increase the size of the network (maybe by increasing the number of units in a fully connected layer), does the GPU usage go up? If not, then we should investigate.

Hope this helps!

Cheers! -Piotr


User 2266 | 9/29/2015, 2:08:28 AM

Hi Piotr, as I increased the batch size during training (from 150), I saw the memory usage going up. Since the Titan X comes with 12GB of memory, I was able to push the maximum memory usage up to 11GB as reported by nvidia-smi.

Thank you again for your help


User 940 | 10/6/2015, 5:39:44 PM

@sjl070707 ,

Yes, GPU memory usage scales linearly with batch size. However, if you grow the batch size too much, it may hurt training. How large did you set it?
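The linear relationship is easy to see on paper: parameter memory is fixed, while activation memory is one value per unit per example. A minimal sketch with hypothetical layer sizes (not from the thread), assuming float32 activations:

```python
# Sketch: activation memory grows linearly with batch size,
# while parameter memory stays constant.
# Layer sizes are hypothetical, for illustration only.
layer_sizes = [784, 4096, 4096, 10]
bytes_per_float = 4  # float32

def activation_mb(batch_size):
    # One float32 activation per unit per example, across all layers.
    floats = batch_size * sum(layer_sizes)
    return floats * bytes_per_float / (1024 ** 2)

for bs in (150, 300, 600):
    print(bs, round(activation_mb(bs), 1), "MB of activations")
```

Doubling the batch size doubles the activation memory, which matches the nvidia-smi readings going up as the batch size grows.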

Cheers! -Piotr