ParaView on Levante#

There are two main ways to use ParaView on Levante. You can use client-server mode, where the client runs on your local machine and connects to a ParaView server on Levante, or you can allocate a (fraction of a) GPU node and run ParaView there via (Turbo)VNC.

Using the Client-Server mode#

For using ParaView on Levante in client-server mode, ParaView 5.10.1 is currently supported. To use it, download ParaView 5.10.1 to your local computer.

Warning

You need exactly version 5.10.1! Not 5.11.X (the default on the ParaView download page).

These instructions and the setups described here are under active development. Fail fast, try again, fail better. In case of issues, please contact us via support@dkrz.de.

The paraview-internal solution#

For this solution, you need ssh-askpass (available in basically all Linux distributions), password-less SSH authentication (passwordless keys or ssh-agent), or PuTTY with SSH keys on Windows.

For information on setting up SSH connections, see Access and Environment.
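If you go the ssh-agent route, a minimal sketch on Linux/macOS looks like this (the key path is only an example; use your own key, and replace YOUR_DKRZ_USER with your DKRZ user name):

eval "$(ssh-agent -s)"                          # start an agent in this shell
ssh-add ~/.ssh/id_ed25519                       # add your key; enter its passphrase once
ssh YOUR_DKRZ_USER@levante.dkrz.de hostname     # should now work without a password prompt

ParaView then has to be started from this same shell (or one that inherits SSH_AUTH_SOCK) so that it can use the agent.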

Download the profile file, then start ParaView 5.10.1 on your local system. (On macOS, open /Applications/ParaView-5.10.1.app will pass the SSH settings from your shell into ParaView.)

  • File -> Connect…

  • [ Load Servers ]

  • Load the profile file.

  • Choose DKRZ Levante

  • [ Connect ]

  • Adjust “SSH Username” to your DKRZ user name and “Account to be charged” to your DKRZ project ID (e.g. ab1234).

  • Choose a fairly random “pvserver port” somewhere in the thousands (to avoid collisions with other users).

  • The other settings can usually stay at their defaults. You might need to adjust the ssh command (on Windows, use plink, not putty.exe, from your PuTTY installation). If the path to plink contains spaces (which it most likely does), enclose the path in quotes after locating the file (see the example after this list).

  • Currently only Mesa rendering is supported. GPU support will follow.

  • [ OK ]
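If you are on Windows and had to quote the path to plink, the ssh command field might end up looking like the following (the installation path is only an example; use the location of your own plink.exe):

"C:\Program Files\PuTTY\plink.exe"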

Known issues#

  • Your SSH command (plink / ssh) might not have connected to levante6.dkrz.de before. Run "PATH_TO_PLINK/plink" levante6.dkrz.de (resp. ssh levante6.dkrz.de) once in a terminal to get rid of error messages regarding host keys and the like.

  • This approach works best with passwordless keys. If you use keys with a passphrase, you will either need to use ssh-agent and add your key to it before connecting (and make sure ParaView knows about the agent), or use ssh-askpass.

  • ssh-askpass is not really supported on macOS. You can install it with MacPorts, but then ParaView (or your ssh?) might look for it in /usr/X11R6/bin/ssh-askpass, while MacPorts installs it in /opt/local/bin/ssh-askpass. A symlink fixes this (see the sketch after this list).

  • Older versions of the script used levante6.dkrz.de instead of levante.dkrz.de because there were differences in the SSH configurations of the different login nodes. levante.dkrz.de should also work, and we have updated the script accordingly.
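A sketch of the symlink fix on macOS, assuming ssh-askpass was installed via MacPorts into /opt/local/bin (note that System Integrity Protection may prevent writing below /usr on recent macOS versions):

sudo mkdir -p /usr/X11R6/bin                                       # create the directory that is searched
sudo ln -s /opt/local/bin/ssh-askpass /usr/X11R6/bin/ssh-askpass   # link to the MacPorts binary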

The Shell script solution#

On U*X systems#

  • Download the connect-script (currently in early draft status).

  • Adapt the following variables (an example is shown after this list):

    • PARAVIEW_ACCTCODE (l55) to your DKRZ project ID (ab1234),

    • PARAVIEW_USERNAME (l62) to your DKRZ user name,

    • PARAVIEW_CLIENT (l67) to the path of the ParaView binary you just downloaded.

  • Execute the script, and you should be set.

  • Come back regularly to this page to check for updates.
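After the adaptation, the relevant lines of the script could look roughly like this (all values are placeholders; the client path depends on where you installed ParaView 5.10.1):

PARAVIEW_ACCTCODE=ab1234                                                     # your DKRZ project ID
PARAVIEW_USERNAME=k123456                                                    # your DKRZ user name (placeholder)
PARAVIEW_CLIENT=/Applications/ParaView-5.10.1.app/Contents/MacOS/paraview    # local ParaView binary (macOS example)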

If you don’t have bash and ssh#

You need to allocate a node (a full node from the compute partition, or a fraction of one via interactive):

salloc -N 1 -p compute,interactive --mem=0 -A ab1234

and build an ssh tunnel to it – the equivalent of

ssh -J levante.dkrz.de -L 11111:localhost:11111 THE_NAME_OF_THE_NODE_YOU_JUST_ALLOCATED

Then connect to localhost:11111 from ParaView (File -> Connect…).
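For the connection to succeed, a pvserver has to be listening on port 11111 on the allocated node. A minimal sketch, run on that node (the module name is an assumption; adjust to whatever ParaView/pvserver installation is available, or call a pvserver binary directly):

module load paraview                 # assumed module providing pvserver
pvserver --server-port=11111         # wait until it reports that it is accepting connections, then connect the client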

The VNC Solution#

You can use TurboVNC on the GPU nodes of Levante. Currently, 4 GPU nodes with 4 NVIDIA A100 GPUs each are available to all users. The easiest way to access these nodes is via the JupyterHub interface.

If you want to use the native TurboVNC client of your system, you need to deal with the fact that these nodes are behind the system's firewall, so a plausible workflow is along the lines of:

1st terminal (Windows users can use PuTTY instead of ssh):

ssh levante
salloc --mem=100G --gpus=1 -p gpu -A YOUR_PROJECT_ID
module purge # some modules conflict with the window manager of vnc
/opt/TurboVNC/bin/vncserver -wm /sw/bin/vncsession -geometry 2400x1600 &          # this will tell you a display id
hostname # l12345.atos.local

2nd terminal (Windows users use plink instead of ssh):

ssh -L 5901:l12345:590X levante.dkrz.de # where 'l12345' is the id from 'hostname' above, and 'X' is the display id you just got, e.g. 1
  • Start your local TurboVNC client and connect to localhost:1, where the 1 is the last digit of the local port 5901 on the left of l12345 in the ssh command.

  • Open a terminal in the VNC desktop environment.

  • Set export VGL_DISPLAY=egl0

  • Call vglrun /work/k20200/k202160/paraview-build/install_v5.13.0_1e357ac85b42/bin/paraview or similar (the two commands are shown together below).
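Put together, the two steps in the VNC desktop terminal are (the ParaView path is the example build from above; any other EGL-capable ParaView build should work as well):

export VGL_DISPLAY=egl0                                                                 # render on the node's GPU (EGL device 0)
vglrun /work/k20200/k202160/paraview-build/install_v5.13.0_1e357ac85b42/bin/paraview    # start ParaView through VirtualGL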

Known issues#

  • Sometimes ParaView will show rendering artefacts. If similar artefacts occur with vglrun glxgears, please report the node id to support@dkrz.de.