...
The Fitzpatrick Lab currently has 6 workstations that are managed by Zuckerman Institute Research Computing. 4 of these workstations were built by Single Particle, 1 was built by Lambda Labs, and 1 was built by Exxact. The following table summarizes the hardware specifications for these workstations:
...
Hostname | Vendor | CPU | Logical Cores | CPU Clock Speed (GHz) | GPU Model | Number of GPUs | Operating System |
---|---|---|---|---|---|---|---|
exxgpu1.fitzpatrick | Exxact | Intel(R) Core(TM) i7-8700 CPU | 12 | 3.2 | NVIDIA GeForce GTX 1060 | 1 | CentOS 7 |
spgpu3.fitzpatrick | Single Particle | Xeon(R) Silver 4116 | 48 | 2.1 | GeForce RTX 2080 Ti | 1 | CentOS 7 |
spgpu2.fitzpatrick | Single Particle | Xeon(R) Silver 4116 | 48 | 2.1 | GeForce RTX 2080 Ti | 1 | CentOS 7 |
spgpu1.fitzpatrick | Single Particle | Xeon(R) Silver 4116 | 48 | 2.1 | GeForce RTX 2080 Ti | 1 | CentOS 7 |
spgpu4.fitzpatrick | Single Particle | Xeon(R) Silver 4116 | 48 | 2.1 | GeForce RTX 2080 Ti | 1 | CentOS 7 |
warp.fitzpatrick | Lambda Labs | AMD Threadripper 3960X | 48 | 4.5 | GeForce RTX 2080 Ti | 1 | Windows 10 Enterprise Edition |
Linux Workstations
Software Installed
Each of the Linux workstations listed above has the following software installed:
...
Note that some modules conflict with the sbgrid module and will refuse to load while it is active, for example:

```
[zrcadmin@spgpu3 ~]$ module load imod
Lmod has detected the following error:  Cannot load module "imod" because these module(s) are loaded:
   sbgrid

While processing the following module(s):
    Module fullname  Module Filename
    ---------------  ---------------
    imod             /opt/lmod/modulefiles/Linux/imod.lua
```
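If you hit this conflict, one way to proceed (assuming you do not need the SBGrid environment in the same shell session) is to unload sbgrid first. A sketch using standard Lmod commands:

```shell
# Unload the conflicting sbgrid module first (assumes you don't need
# SBGrid software in this shell session)
module unload sbgrid

# The requested module should now load without the conflict error
module load imod

# Confirm which modules are currently loaded
module list
```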
Application-Specific Notes
cryoCARE
cryoCARE is set up with a custom wrapper script for ease of use. The following syntax is used by this wrapper script:
...
```
singularity run --nv -B /opt/cryocare/user:/run/user -B /opt/cryocare/example/:/notebooks -B <DATA DIRECTORY>:/data /opt/cryocare/cryoCARE_v0.1.1.simg
```
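For example, substituting a hypothetical data directory for the `<DATA DIRECTORY>` placeholder, the underlying command would look like:

```shell
# /mnt/scratch/tomo_project is a hypothetical example path; substitute the
# directory containing your own data for the /data bind mount.
singularity run --nv \
  -B /opt/cryocare/user:/run/user \
  -B /opt/cryocare/example/:/notebooks \
  -B /mnt/scratch/tomo_project:/data \
  /opt/cryocare/cryoCARE_v0.1.1.simg
```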
CryoSPARC
To use CryoSPARC, first start it up (if it isn't already running), then navigate to http://localhost:39000 if you are working directly on the machine running it, or to http://spgpu#.fitzpatrick.zi.columbia.edu:39000, where # is the number associated with the workstation (see the table above). Note that the full spgpu domain name will only work if you are on the Columbia campus or using the CUIT VPN.
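CryoSPARC is managed with its `cryosparcm` command-line tool; a sketch of checking and starting it, assuming `cryosparcm` is on the PATH of the account that owns the CryoSPARC installation on that workstation:

```shell
# Check whether CryoSPARC is already running
cryosparcm status

# Start it if it is not running
cryosparcm start

# The web interface should then be reachable on port 39000
```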
...
**Note:** CryoSPARC does not start automatically after a reboot; it must be started manually from the command line.
Remote Access and Network Restrictions
To access a Linux workstation remotely, you can type the following command:
...
For security reasons, the Fitzpatrick workstations are only remotely accessible if you are on the Columbia campus or using the CUIT VPN.
Windows Workstation
Software Installed
The Windows workstation has the following software installed on it:
These software applications all have shortcuts on the desktop of the main account (Anthony Fitzpatrick).
Network Storage
The Fitzpatrick lab's Engram storage is configured to be mapped to the D drive automatically when the main account (Anthony Fitzpatrick) logs in.
Application-Specific Notes
M
M takes data produced by RELION as input, as described in the pipeline outline here. Since RELION is not a native Windows application, this requires heavy use of the Fitzpatrick lab's Engram storage, which (as mentioned above) is mapped to the D drive. Roughly, the workflow is:

1. Perform the first steps of the pipeline (preprocessing, particle image/sub-tomogram export) on warp.fitzpatrick using Warp.
2. Run classification and refinement on the cryoem cluster or a Linux-based workstation, saving the results to Engram.
3. Perform the final steps of the pipeline on warp.fitzpatrick using M.