Welcome to MilkyWay@home

GPU RAM Requirements

Message boards : Application Code Discussion : GPU RAM Requirements


kevinjos

Joined: 13 Mar 18
Posts: 9
Credit: 66,232,294
RAC: 0
Message 67313 - Posted: 7 Apr 2018, 2:27:48 UTC

I am seeing errors such as the following when trying to run 8 WUs per GPU:

Error creating context (-6): CL_OUT_OF_HOST_MEMORY

https://milkyway.cs.rpi.edu/milkyway/result.php?resultid=2304133664

How much GPU RAM do WUs require on average? Is it unreasonable to expect 8 WUs to run on a GPU with only 12 GB of RAM? I ask because when I run only 4 WUs per GPU, GPU utilization is nowhere near 100%, which is why I would like to run a greater number at once. Could someone point me to the relevant lines in the code? I'd be happy to take a look to better understand the GPU RAM allocations.
ID: 67313
Cautilus

Joined: 29 Jul 14
Posts: 19
Credit: 3,451,802,406
RAC: 0
Message 67322 - Posted: 11 Apr 2018, 3:31:41 UTC

Yeah, it seems like the work units use up to 1.5 GB of VRAM on NVIDIA cards for whatever reason. On AMD cards they only use around 100 MB per work unit; I'm not sure why the NVIDIA work units use so much more VRAM. But I would also like to know if there's a way to reduce the VRAM usage on NVIDIA cards.
ID: 67322
kevinjos

Joined: 13 Mar 18
Posts: 9
Credit: 66,232,294
RAC: 0
Message 67327 - Posted: 13 Apr 2018, 2:09:29 UTC

Yeah it seems like the work units use up to 1.5GB of VRAM on NVIDIA cards for whatever reason. On AMD cards they only use like 100MB per work unit, I'm not sure why NVIDIA work units use so much more VRAM. But I would also like to know if there's a way to reduce the VRAM usage for NVIDIA cards.


Thanks for sharing! This provides a good lead into where the issue may be. The GPUs I am using are from NVIDIA. Also, the 1.5 GB VRAM observation is consistent with what I'm seeing: 8 WUs/GPU * 1.5 GB/WU = 12 GB, which pushes the limit of a 12 GB card.
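For anyone else tuning WUs per GPU, here's a quick back-of-the-envelope helper. This is just a sketch: the 1.5 GB/WU figure is the rough estimate from this thread (not a measured constant), and the headroom value is an assumption to account for driver/display overhead.

```python
def max_concurrent_wus(vram_gb, per_wu_gb=1.5, headroom_gb=0.5):
    """Estimate how many work units fit in GPU memory.

    per_wu_gb is the rough per-WU VRAM usage reported in this thread
    for NVIDIA cards; headroom_gb reserves space for the driver/display.
    """
    usable = vram_gb - headroom_gb
    return max(int(usable // per_wu_gb), 0)

# A 12 GB card at ~1.5 GB per WU: 8 WUs would need the full 12 GB,
# so with any headroom reserved, only 7 fit comfortably.
print(max_concurrent_wus(12))  # 7
```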
ID: 67327
363fc9cda368b2e14d4322e60afde2...

Joined: 28 Sep 17
Posts: 19
Credit: 60,732,047
RAC: 0
Message 67556 - Posted: 31 May 2018, 19:17:06 UTC - in response to Message 67327.  

Maybe the Volta cards run so fast they need more VRAM to keep up.
ID: 67556


©2024 Astroinformatics Group