[chimerax-users] Specify GPU id for ChimeraX
Tom Goddard
goddard at sonic.net
Tue Nov 24 10:47:16 PST 2020
Hi Shasha,
ChimeraX does not use CUDA. It only uses the graphics card through OpenGL for graphics rendering, not for non-graphical calculations. The one exception is the ISOLDE plugin to ChimeraX, which can use CUDA if you tell it to.
So I think the environment variable you would need to use is NVIDIA_VISIBLE_DEVICES. I don't know why that would not work. ChimeraX uses Qt to create a QOpenGLContext(); that Python code is in your distribution in the file
chimera/lib/python3.7/site-packages/chimerax/graphics/opengl.py
# Create context
from PyQt5.QtGui import QOpenGLContext
qc = QOpenGLContext()
qc.setScreen(self._screen)
As far as I know, the Qt window toolkit has no capability to choose the GPU. I don't have a multi-GPU NVIDIA system to test on, but I tried starting ChimeraX from a bash shell with
NVIDIA_VISIBLE_DEVICES=1 chimerax
and added code to print the environment variables just before the QOpenGLContext is created; the variable is indeed present there with the value I set. I was worried that ChimeraX might remove some environment variables, but that does not happen. So I cannot explain why the environment variable does not work.
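For reference, that check is only a few lines of Python; this is just an illustrative sketch rather than the actual code I added:

# Illustrative sketch (not ChimeraX code): print the GPU-related variables
# this process actually sees, just before the OpenGL context would be created.
import os
for name in ("NVIDIA_VISIBLE_DEVICES", "CUDA_VISIBLE_DEVICES", "DISPLAY"):
    print(name, "=", os.environ.get(name, "<not set>"))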
I know nothing about nvidia-smi, but I am surprised that it can choose between different graphics cards while rendering to the same screen. I am more familiar with macOS with an external GPU and two displays. On that operating system, if I run ChimeraX on the built-in iMac or MacBook display it uses the computer's graphics, and if I run ChimeraX on an external display attached to the external GPU it uses the external GPU -- in other words, the display you run on controls which GPU is used. In fact, on macOS it remarkably switches GPUs if I simply drag the ChimeraX window from one display to the other. Of course Ubuntu is entirely different, and it seems like NVIDIA_VISIBLE_DEVICES=1 should work.
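If it helps, here is a small standalone sketch for checking which GPU an OpenGL context actually lands on. It is not ChimeraX code; it assumes PyQt5 and PyOpenGL are installed. Running it with and without NVIDIA_VISIBLE_DEVICES set and comparing the renderer string would show whether the variable has any effect.

# Standalone sketch (assumes PyQt5 and PyOpenGL): create a Qt OpenGL context
# the same general way ChimeraX does and ask the driver which GPU it chose.
import sys
from PyQt5.QtGui import QGuiApplication, QOffscreenSurface, QOpenGLContext
from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER

app = QGuiApplication(sys.argv)

surface = QOffscreenSurface()
surface.create()

context = QOpenGLContext()
context.create()
context.makeCurrent(surface)

# The renderer string names the device, e.g. "GeForce RTX 2070/PCIe/SSE2".
print("GL vendor:  ", glGetString(GL_VENDOR).decode())
print("GL renderer:", glGetString(GL_RENDERER).decode())

context.doneCurrent()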
Tom
> On Nov 24, 2020, at 9:21 AM, Shasha Feng <shaalltime at gmail.com> wrote:
>
> Hi Guillaume and Eric,
>
> Thanks for the tip. Temporarily restricting which GPU devices are visible is exactly what I want. However, the recipe of using 'CUDA_VISIBLE_DEVICES=1' does not seem to work, at least on my Ubuntu 20.04 with ChimeraX 1.0. I also tried Eric's suggestion just now.
>
> sf at sf-MS-7C35:~$ echo $CUDA_VISIBLE_DEVICES
>
> sf at sf-MS-7C35:~$ export CUDA_VISIBLE_DEVICES=1
> sf at sf-MS-7C35:~$ echo $CUDA_VISIBLE_DEVICES
> 1
> sf at sf-MS-7C35:~$ chimerax &
> [1] 673010
> sf at sf-MS-7C35:~$ nvidia-smi
> Tue Nov 24 12:09:28 2020
> +-----------------------------------------------------------------------------+
> | NVIDIA-SMI 450.66       Driver Version: 450.66       CUDA Version: 11.0     |
> |-------------------------------+----------------------+----------------------+
> | GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
> | Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
> |                               |                      |               MIG M. |
> |===============================+======================+======================|
> |   0  GeForce RTX 207...  Off  | 00000000:2D:00.0  On |                  N/A |
> | 60%   74C    P2   191W / 215W |    763MiB /  7974MiB |     99%      Default |
> |                               |                      |                  N/A |
> +-------------------------------+----------------------+----------------------+
> |   1  GeForce RTX 207...  Off  | 00000000:2E:00.0 Off |                  N/A |
> |  0%   34C    P8    14W / 215W |     14MiB /  7982MiB |      0%      Default |
> |                               |                      |                  N/A |
> +-------------------------------+----------------------+----------------------+
>
> +-----------------------------------------------------------------------------+
> | Processes:                                                                  |
> |  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
> |        ID   ID                                                   Usage      |
> |=============================================================================|
> |    0   N/A  N/A      1343      G   /usr/lib/xorg/Xorg                 35MiB |
> |    0   N/A  N/A      2338      G   /usr/lib/xorg/Xorg                174MiB |
> |    0   N/A  N/A      2463      G   /usr/bin/gnome-shell              233MiB |
> |    0   N/A  N/A    671633      G   ...AAAAAAAAA= --shared-files       45MiB |
> |    0   N/A  N/A    672504      C   /opt/conda/bin/python             229MiB |
> |    0   N/A  N/A    673010      G   chimerax                           33MiB |
> |    1   N/A  N/A      1343      G   /usr/lib/xorg/Xorg                  4MiB |
> |    1   N/A  N/A      2338      G   /usr/lib/xorg/Xorg                  4MiB |
> +-----------------------------------------------------------------------------+
>
> After setting the environment variable and running chimerax in the same session, it still runs on GPU 0.
> I also tried a recipe shared here [https://stackoverflow.com/a/58445444] that sets
> "export NVIDIA_VISIBLE_DEVICES=1
> export CUDA_VISIBLE_DEVICES=0". It does not work either.
>
> To ChimeraX developers,
> I wonder how ChimeraX is exposed to CUDA. I have some background in CUDA computing and in using CUDA from Python. If you can give me some clues, that would be great.
>
> Best,
> Shasha
>
> On Tue, Nov 24, 2020 at 12:18 PM Eric Pettersen <pett at cgl.ucsf.edu> wrote:
> To supplement Guillaume's very helpful answer: you could create an alias to reduce the typing involved, and you could put the alias in your shell startup file. For the bash shell, the syntax for making an alias named 'cx' for the command would be:
>
> alias cx="CUDA_VISIBLE_DEVICES=1 chimerax"
>
> Other shells have similar (but not necessarily identical) syntaxes.
>
> --Eric
>
> Eric Pettersen
> UCSF Computer Graphics Lab
>
>
>> On Nov 24, 2020, at 12:09 AM, Guillaume Gaullier <guillaume at gaullier.org> wrote:
>>
>> Hello,
>>
>> You can restrict which of your GPUs ChimeraX will be able to detect by starting it from the shell like so:
>>
>> CUDA_VISIBLE_DEVICES=1 chimerax
>>
>> Replace 1 with the device number you want; it is the same number reported by nvidia-smi. This applies only to that one launch: the next time you run ChimeraX, you again need to put the environment variable before the "chimerax" command.
>>
>> You can also make this environment variable stay around until you close the shell session like so:
>>
>> export CUDA_VISIBLE_DEVICES=1
>>
>> then you can open ChimeraX from the same shell session, close it, and reopen it with only the "chimerax" command, and it should still see only the GPU you indicated.
>>
>> When you close and restart your shell, you will have to export the environment variable again. I don’t recommend adding the export to your ~/.bashrc or other shell initialization script, because then all your shell sessions will have this environment variable set, so all the commands you run will only see this GPU, which is probably not what you want. It is less likely to get in your way down the road if you only set this environment variable for the duration of a shell you opened specifically to run ChimeraX from.
>>
>> I hope this helps,
>>
>> Guillaume
>>
>>
>>> On 24 Nov 2020, at 01:51, Shasha Feng <shaalltime at gmail.com> wrote:
>>>
>>> Hi Tom,
>>>
>>> Sorry about not clarifying my operating system. I am using Ubuntu 20.04 with two NVIDIA GPU cards.
>>> Do I need to change an OpenGL setting or reconfigure the NVIDIA settings?
>>>
>>> Thanks,
>>> Shasha
>>>
>>>
>>>
>>> On Mon, Nov 23, 2020 at 6:58 PM Tom Goddard <goddard at sonic.net> wrote:
>>> Hi Shasha,
>>>
>>> ChimeraX has no way to select which GPU it uses; the operating system or OpenGL driver decides. You didn't mention which operating system you are using. Here is an example of how to set the default OpenGL GPU on Windows.
>>>
>>> https://www.techadvisor.co.uk/how-to/pc-components/how-set-default-graphics-card-3612668/
>>>
>>> Tom
>>>
>>>
>>>> On Nov 23, 2020, at 2:38 PM, Shasha Feng <shaalltime at gmail.com> wrote:
>>>>
>>>> Hi,
>>>>
>>>> Is there any way to specify which GPU device ChimeraX runs on? Currently it uses the default GPU 0, which can disturb existing jobs. Thanks.
>>>>
>>>> Best,
>>>> Shasha
>>>>
>>>
>>>
>>
>
>
> _______________________________________________
> ChimeraX-users mailing list
> ChimeraX-users at cgl.ucsf.edu
> Manage subscription:
> https://www.rbvi.ucsf.edu/mailman/listinfo/chimerax-users