The GPU is mainly used to store and update the neural network's parameters, so GPU memory usage should correlate directly with the size of the network. At some point there is simply nothing more to store on the GPU, even though plenty of free memory remains.
Are you training your own neural net? If so, does GPU usage go up when you increase the size of the network (for example, by increasing the number of units in a fully connected layer)? If not, then we should investigate.
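As a rough sanity check, you can estimate how much parameter memory a size change should add before looking at the GPU counters. This is a minimal sketch (the layer sizes and the 4-bytes-per-parameter fp32 assumption are just illustrative; optimizers such as Adam keep extra per-parameter state on top of this):

```python
def fc_params(n_in, n_out):
    # A fully connected layer stores a weight matrix (n_in x n_out)
    # plus one bias per output unit.
    return n_in * n_out + n_out

def approx_bytes(param_count, bytes_per_param=4):
    # fp32 weights take 4 bytes each; gradients and optimizer state
    # typically multiply this by roughly 2-3x (assumption, varies).
    return param_count * bytes_per_param

small = fc_params(1024, 256)    # 262,400 parameters
large = fc_params(1024, 1024)   # 1,049,600 parameters
print(approx_bytes(small), approx_bytes(large))
```

If quadrupling the units in a layer doesn't move GPU memory usage by roughly the amount this predicts, that's a hint the model isn't actually living on the GPU.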
Hope this helps!