Articles Tagged: Coral

By default, each user is allocated a Home directory (/icto/user/$USER) and a Data directory (/data/home/$USER) for data storage. Additionally, three scratch directories (/scratch, /scratch2, /scratch3) are available for temporary storage during job execution. For FHS users, the Data directory is located at /home/$USER. Faculty members and PIs have the option to purchase additional disk storage […]
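As a quick check, the standard shell commands below confirm these locations; this is a minimal sketch, assuming the paths listed above are mounted as described.

    echo $HOME                          # Home directory, e.g. /icto/user/$USER
    ls -ld /data/home/$USER             # Data directory (FHS users: /home/$USER)
    df -h /scratch /scratch2 /scratch3  # free space on the scratch areas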
Contributing to the HPCC environment
ICTO encourages faculty members to contribute to the HPCC environment by purchasing servers, which provides the following benefits: Professional Management: When you purchase computing equipment, it is essential to consider factors such as space and a suitable server environment. Additionally, you need to handle maintenance tasks such as system patching and hardware replacement in case of failure. By […]
Useful Job Submission Queue and Linux Commands
What is the queuing and scheduling scheme in HPCC?
The HPC cluster runs under the control of a queue management system, SLURM (Simple Linux Utility for Resource Management). SLURM is similar to most other queue systems: you write a batch script and submit it to the queue manager, which then schedules your job to run on the designated queue (called a PARTITION in […]
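A minimal batch script of the kind described above might look like the following sketch; the partition name 'debug' and the resource limits are assumptions, so check the output of 'sinfo' for the partitions actually configured.

    #!/bin/bash
    #SBATCH --job-name=hello     # job name shown in the queue
    #SBATCH --partition=debug    # partition (queue) name is site-specific (assumed here)
    #SBATCH --nodes=1            # run on a single node
    #SBATCH --ntasks=1           # with a single task
    #SBATCH --time=00:10:00      # wall-clock limit of 10 minutes

    echo "Running on $(hostname)"

You would then submit the script with 'sbatch job.sh' and monitor it with 'squeue -u $USER', both standard SLURM commands.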
What is “Environment Modules” in HPCC?
In order to use the installed software, it is necessary to configure the relevant environment variables for compilers and libraries so that search paths are correct, the corresponding libraries can be linked, license servers can be connected, etc. To make it easier for you to select and switch between different software packages, 'Environment Modules' (modules.sourceforge.net) […]
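A typical Environment Modules session looks like the sketch below; the module names 'gcc' and 'intel' are placeholders and may differ from what is actually installed.

    module avail             # list all modulefiles available on the system
    module load gcc          # set up the environment for a package
    module list              # show currently loaded modules
    module switch gcc intel  # swap one loaded module for another
    module purge             # unload all loaded modules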
What is the available software installed in HPCC?
Please refer to the table here for a rough overview of the available software. You may also refer to the output of the following command to get the most up-to-date information: 'module whatis'
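For example, run without arguments the command prints a one-line description of every available module; with a module name ('python' below is a placeholder) it describes just that package:

    module whatis           # one-line description of every available module
    module whatis python    # description of a single module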
How do I connect to HPCC?
If you are trying to access the HPCC from off-campus, please connect to the SSL VPN first.

Web Portal: https://login2.coral.um.edu.mo/ (FAQ: visit here for more information)

SSH (Port: 22; SSH Client: visit here for more information)
- Host: login0.coral.um.edu.mo
- Host: login1.coral.um.edu.mo (GPU: 2 x NVIDIA K40m)
- Host: login2.coral.um.edu.mo (GPU: 1 x NVIDIA A100)

Graphical User Interface (GUI): https://login1.coral.um.edu.mo:4443/ […]
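For example, a standard SSH client can reach the login nodes listed above ('userid' is a placeholder for your own account name):

    ssh userid@login0.coral.um.edu.mo         # basic login session (port 22 is the default)
    ssh -p 22 userid@login2.coral.um.edu.mo   # port given explicitly; this node has the A100
    ssh -X userid@login1.coral.um.edu.mo      # -X enables X11 forwarding for GUI programs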