The Hazel High Performance Computing (HPC) cluster is a shared system that helps researchers and students run computational work that is too large, too time-consuming, or too complex for a typical desktop or lab server. HPC enables faster turnaround, larger models, and the ability to run many analyses at once, supporting research and instruction across a broad range of disciplines.
Work on the HPC cluster is submitted as a job to a scheduler. Users request the resources their work needs, such as CPU cores, memory, GPUs, and run time, and the scheduler runs the job when those resources become available. This approach supports efficient, fair sharing while enabling jobs that scale from quick tests to large production campaigns.
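As an illustrative sketch of this request-and-schedule model, a batch job script might look like the following. This assumes an LSF-style scheduler submitted with `bsub`; the script name, resource values, and software module are placeholders, and the exact directives used on Hazel should be taken from the cluster documentation.

```shell
#!/bin/bash
# Illustrative batch script (LSF-style directives assumed; values are examples)
#BSUB -n 8                    # request 8 CPU cores
#BSUB -W 2:00                 # request 2 hours of wall-clock time
#BSUB -J my_analysis          # job name
#BSUB -o out.%J               # standard output file (%J expands to the job ID)
#BSUB -e err.%J               # standard error file

# Load software and run the work (module and script names are placeholders)
module load python
python analyze.py
```

The script is submitted with `bsub < job.sh`; the scheduler queues the job and starts it when the requested cores and time are available.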
Common HPC usage patterns include:
The HPC cluster supports a broad mix of software: licensed commercial applications, community-supported research codes, and custom tools developed by research groups.
Many researchers use HPC to scale widely adopted commercial applications (such as Ansys and Gaussian), running larger models, higher-fidelity simulations, or more design iterations:
HPC is also commonly used for open research codes that are maintained by scientific communities and optimized for parallel computing, such as WRF (the Weather Research and Forecasting Model).
A substantial portion of HPC use is research-group models, pipelines, and analysis tools written in Python, C/C++, Fortran, R, CUDA, and other languages. These applications often combine multiple steps (simulation, data processing, visualization, AI) and scale across many cores, nodes, or GPUs.
Increasingly, applications are available as containers, which enhance the reproducibility of results as well as ease of use. Apptainer is provided on the HPC cluster for running containers.
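As a brief sketch of the container workflow, the commands below pull a public image from a registry and run a command inside it. The image name is a placeholder chosen for illustration; any registry image a research group relies on would be used the same way.

```shell
# Pull a container image from a registry into a local .sif file
# (image name is a placeholder example, not a recommendation)
apptainer pull lolcow.sif docker://sylabsio/lolcow

# Run a command inside the container
apptainer exec lolcow.sif cowsay "hello from a container"
```

Because the software stack is packaged inside the image, the same container produces the same environment on a laptop, the HPC cluster, or a collaborator's system.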
HPC is optimized for batch, compute-intensive workloads. In some situations, other computing options are a better fit:
HPC helps NC State users:
Go to the Request Access page for instructions on how to start a project.
Take a look at what OIT-HPC has to offer and schedule a consultation at any time.