
Table of Contents

...

Name | Version | Location / Module | Category
Apptainer | 1.2.5-1.el9 | (loaded automatically on all compute nodes) | Run Docker-like containers
cmake | 3.20.2 | (loaded automatically on all compute nodes) |
cuda | 12.3 | (loaded automatically on GPU nodes) | GPU Computing
gcc | 11.4.1 | (loaded automatically on all compute nodes) | Compiler - C / C++
gdal, gdal-devel libraries | 3.4.3 | (loaded automatically on all compute nodes) |
gsl, gsl-devel libraries | 2.6-7 | (loaded automatically on all compute nodes) | GNU Scientific Library
gurobi | 10.0.3 | module load gurobi/10.0.3 | Prescriptive analytics platform and decision-making technology
hdf5, hdf5-devel | 1.12.1-7 | (loaded automatically on all compute nodes) | Hierarchical Data Format version 5
hdf5p | 1.10.7 and 1.14.3 | module load hdf5p | Hierarchical Data Format version 5, parallel version
Intel oneAPI toolkit | various | module load intel-oneAPI-toolkit <library> | Core set of tools and libraries for developing high-performance, data-centric applications across diverse architectures
julia | 1.5.3 | module load julia/1.5.3 | Programming Language
knitro | 13.2.0 | module load knitro/13.2.0 | Software package for solving large-scale nonlinear mathematical optimization problems; short for "Nonlinear Interior point Trust Region Optimization"
make | 4.3 | (loaded automatically on all compute nodes) |
Mathematica | 14.0 | (loaded automatically on all compute nodes) | Numerical Computing
MATLAB | R2023b | module load MATLAB/2023b | Numerical Computing
openmpi | 5.0.2 | module load openmpi5/5.0.2 | OpenMPI (provided by Nvidia/Mellanox)
Python (incl. many libraries such as numpy, torch, TensorFlow, scipy, and more) | 3.9.18 | (loaded automatically on all compute nodes) | Python for Scientific Computing
Qt 5 | 5.15.9-1 | (loaded automatically on all compute nodes) |
R | 4.3.2 | (loaded automatically on all compute nodes) | Programming Language
Schrodinger | 2024-1 | module load schrodinger | A collection of software for chemical and biochemical use. It offers various tools that facilitate the investigation of the structures, reactivity and properties of chemical systems.
Singularity | | | Now called Apptainer; see the Apptainer entry above
Visual Studio Code Server | | Not a module | A server-side Integrated Development Environment hosted on Insomnia compute nodes
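Anything in the table with a "module load" command must be loaded into your shell environment before use. A minimal sketch, assuming the standard Environment Modules commands are available and using module names taken from the table above:

Code Block
# List the modules and versions currently available
module avail

# Load a specific version from the table (MATLAB shown as an example)
module load MATLAB/2023b

# Show what is currently loaded, and unload when finished
module list
module unload MATLAB/2023b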

...

The first time you launch Mathematica you will need to provide the host details of the MathLM (license) server. Following the Activating Mathematica guide, click 'Other ways to activate', then choose 'Connect to a Network License Server' and enter the IP address 128.59.30.140.

OpenMPI Settings 

...

The default OpenMPI on Insomnia is openmpi-5.0.2, which is provided by Nvidia/Mellanox and optimized for the MOFED stack. You will receive the following warnings when using mpirun/mpiexec:

...

You can pass the following options, which select UCX (the default as of version 3.x) and exclude the legacy openib BTL:

--mca pml ucx --mca btl '^openib' 

To help with the following warning:

Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

You can also add:

--mca orte_base_help_aggregate 0
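Putting these together, a minimal sketch of an mpirun invocation; the rank count and the program name ./my_mpi_app are placeholders for your own job:

Code Block
# Hypothetical example: select the UCX PML, exclude the openib BTL,
# and disable help-message aggregation so every rank's errors are shown
mpirun -np 8 \
    --mca pml ucx --mca btl '^openib' \
    --mca orte_base_help_aggregate 0 \
    ./my_mpi_app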

If you choose to use the openmpi/gcc/64/4.1.1_cuda_11.0.3_aware module, this version expects a GPU and will throw the following warning on non-GPU nodes:

The library attempted to open the following supporting CUDA libraries, but each of them failed. CUDA-aware support is disabled.
libcuda.so.1: cannot open shared object file: No such file or directory
libcuda.dylib: cannot open shared object file: No such file or directory
/usr/lib64/libcuda.so.1: cannot open shared object file: No such file or directory
/usr/lib64/libcuda.dylib: cannot open shared object file: No such file or directory

If you are not interested in CUDA-aware support, then run with --mca opal_warn_on_missing_libcuda 0 to suppress this message. If you are interested in CUDA-aware support, then try setting LD_LIBRARY_PATH to the location of libcuda.so.1 to get passed this issue.

You can pass this option:

...

Insomnia has a few MPI options loadable as modules in addition to Intel oneAPI/hpctoolkit/mpi/2021.11:

• openmpi5/5.0.2
• mpi/mpich-x86_64/4.1.1
• mpi/openmpi-x86_64/4.1.1

If you find that an mpirun hangs or does not complete, try adding the following option:

 -mca coll ^hcoll
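For example, a minimal sketch with the hcoll collective component excluded (again, the rank count and ./my_mpi_app are placeholders):

Code Block
# Hypothetical example: exclude hcoll if collective operations hang
mpirun -np 8 --mca coll '^hcoll' ./my_mpi_app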

RStudio in an Apptainer container

...

Code Block
apptainer pull --name rstudio.simg docker://rocker/rstudio:4.3.1

For RStudio to start in a browser from an interactive session, you will need the IP address of the compute node. Note that the IP shown below will likely be different for you:

Code Block
$ hostname -i 
10.197.16.39           (REMEMBER, this is only an example IP. Yours will likely be different)

From RStudio 4.2 onward, added security features require binding a locally created database file to the database location in the container. Don't forget to change the password in the command below.

Code Block
# Create writable directories the container expects for runtime and state files
mkdir -p run var-lib-rstudio-server

# Minimal database configuration required by RStudio 4.2 and later
printf 'provider=sqlite\ndirectory=/var/lib/rstudio-server\n' > database.conf

# Launch rserver; set your own password in place of CHANGEME
PASSWORD='CHANGEME' apptainer exec \
   --bind run:/run,var-lib-rstudio-server:/var/lib/rstudio-server,database.conf:/etc/rstudio/database.conf \
   rstudio.simg \
   /usr/lib/rstudio-server/bin/rserver --auth-none=0 --auth-pam-helper-path=pam-helper --server-user=$USER

This will run rserver in the Apptainer container.



Now open another terminal and connect to the running RStudio rserver session using port forwarding, which maps a local port on your computer to the remote one on Insomnia.
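A minimal sketch of the port-forwarding step, assuming rserver is listening on its default port 8787; <uni>, <login_host>, and the compute-node IP are placeholders to replace with your own values:

Code Block
# Hypothetical example: forward local port 8787 to the compute node running rserver
ssh -L 8787:10.197.16.39:8787 <uni>@<login_host>

# Then browse to http://localhost:8787 on your computer and log in as your
# cluster username with the password you set in PASSWORD above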

...

Visual Studio Code Server

Note

A pre-existing GitHub account is now required to use the instructions below.


Visual Studio Code is an Integrated Development Environment that many people use on their laptops. The HPC hosts a server-side version on the compute nodes (NOT the login nodes), which lets you connect the local VS Code application on your laptop to Insomnia and open files in your Insomnia folder directly. To use it, do the following:

...

Code Block
* Visual Studio Code Server
*
* By using the software, you agree to
* the Visual Studio Code Server License Terms (https://aka.ms/vscode-server-license) and
* the Microsoft Privacy Statement (https://privacy.microsoft.com/en-US/privacystatement).
*
[2024-04-05 16:12:16] info Using Github for authentication, run `code tunnel user login --provider <provider>` option to change this.
To grant access to the server, please log into https://github.com/login/device and use code <###-###>


When you open the device login URL, you will first get a page asking you to log in with your GitHub credentials.

After logging in, you will get a page asking you to enter the device code given above (represented as <###-###> here).

Next, you will see a page asking you to authorize Visual Studio Code Server's access permissions.

After that, when you use the local VS Code application on your computer, you will see the running SSH tunnel listed. Double-click it to connect; this can take a moment to finish.

Once connected, you can open files in your Insomnia folder just as you would files on your local computer.