Research Computing Environment
Location: Stanley Hydraulics Laboratory, fourth floor
Contact: Brian Miller
IIHR maintains a diverse set of computing resources and facilities. Over the past two decades, IIHR has been at the forefront of parallel HPC applications, moving from several large Silicon Graphics Power Challenge Array shared-memory systems and a Sun Microsystems distributed-memory system to today's large-node distributed-memory clusters. Our codes are also being implemented on highly parallel Nvidia Kepler/Xeon Phi systems and within various cloud computing environments.
The Neon and Argon clusters are currently the primary central HPC resources, following the recent retirement of our initial HPC system, Helium. Together the two systems comprise over 488 compute nodes with more than 10,640 processor cores. Each system has an internal high-performance message-passing network (Infiniband on Neon, Omnipath on Argon), and the two systems are connected by trunked high-speed 10 Gb Ethernet links.
Both Neon and Argon are managed so that investor queues are quickly made available to members of the investing group; when idle, those resources are released for use by others. This model has worked well and will continue to be the basis for UI HPC resource sharing.
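To illustrate how this sharing model looks from a user's perspective, the sketch below submits a batch job to a group's investor queue and falls back to the shared pool if that queue is not available to the user. It assumes an SGE-style scheduler (qsub) and uses hypothetical queue names; the actual scheduler commands and queue names on Neon and Argon may differ.

```python
"""Minimal sketch of submitting work under the investor-queue model.

Assumes an SGE-style scheduler exposing `qsub -q <queue>` and uses
hypothetical queue names ("iihr.q" for the investor queue, "all.q"
for the shared pool); actual names on Neon/Argon may differ.
"""
import subprocess


def submit(script_path: str, queue: str) -> str:
    """Submit a batch script to the named queue and return the scheduler's reply."""
    result = subprocess.run(
        ["qsub", "-q", queue, script_path],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()


if __name__ == "__main__":
    try:
        # Investor jobs start quickly on the group's dedicated queue.
        print(submit("cfd_run.job", queue="iihr.q"))
    except subprocess.CalledProcessError:
        # If the investor queue rejects the submission, target the shared
        # pool of released, idle resources instead.
        print(submit("cfd_run.job", queue="all.q"))
```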
Neon Cluster
The older HPC system, Neon, came online in December 2013 to augment the HPC resources available to IIHR researchers. Like Helium before it, Neon is operated by IIHR—Hydroscience & Engineering in conjunction with ITS and a group of collaborating researchers from around the university. Neon is a shared system that currently provides 4,256 standard cores, 2,280 Xeon Phi cores, 27 TB of memory, 500 TB of storage, and a 40 Gbps QDR Infiniband message-passing fabric.
Argon Cluster
The newest HPC system, Argon, came online in January 2017. Like its predecessors, Helium and Neon, Argon is jointly operated by IIHR, ITS, and a group of collaborating researchers from around the university. Argon is a shared system that currently provides 6,400 standard cores, 2,280 Xeon Phi cores, 58 TB of memory, 100 TB of NFS scratch storage, and a 100 Gbps Omnipath message-passing fabric with 5:1 oversubscription. Each user has a 1 TB home directory allocation.
The exact count of each node type is in flux, as additional nodes are still being purchased.
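For context, the sketch below shows the distributed-memory, message-passing programming model that these clusters' fabrics support. It is a minimal example assuming the mpi4py package and an MPI launcher (e.g., mpirun) are available in the user's environment; it is not tied to any particular IIHR code.

```python
"""Minimal distributed-memory example, assuming mpi4py and an MPI
launcher are available (e.g., `mpirun -np 16 python hello_mpi.py`)."""
from mpi4py import MPI

comm = MPI.COMM_WORLD      # communicator spanning all launched ranks
rank = comm.Get_rank()     # this process's index within the job
size = comm.Get_size()     # total number of ranks across the nodes

# Each rank contributes its index; the reduction's inter-node traffic
# travels over the cluster's message-passing fabric (Infiniband/Omnipath).
total = comm.reduce(rank, op=MPI.SUM, root=0)

if rank == 0:
    print(f"{size} ranks reporting; sum of ranks = {total}")
```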
The following is a description of other major computing resources, equipment, services, and software available to all IIHR affiliates and students.
- IIHR operates several large-scale data harvesting and processing systems related to flood sensing and modeling. The IFIS system collects LDM and other weather data and builds a sequence of products for later modeling: raw data packets are ingested on one system, passed to a second system for processing and storage in a database, and served to users through a third, web-based system (a schematic sketch of this ingest-process-serve pattern follows this list). Similarly, a network of bridge-mounted flow sensors supplies data to servers, where it is handled in much the same way as the IFIS data. This architecture has proven scalable and reliable.
- HPC at IIHR is augmented by 18 Silicon Mechanics storage units, providing 750 TB of storage in a RAID 60 configuration. This storage is replicated to an offsite location, and hourly snapshots are taken to support user-invoked file recovery.
- Supporting the local centralized facilities are 80 Linux workstations and more than 300 individual PCs running MS Windows 7. There are 30 PC-based servers handling web, FTP, security, and specialized database services; many of these servers are virtualized on VMware hosts at IIHR and at the centralized Information Technology Facility (ITF). In addition, a number of user-located storage devices, publication-quality color printers, scanners, cameras, and other peripherals are in use.
- This hardware is complemented by a carefully selected set of public domain, commercial, and proprietary software packages, including Tecplot, Gridgen, Fluent, FlowLab, Matlab, Origin, ERDAS, ERMapper, ESRI, Skyview, and the core GNU utilities. Additionally, software such as AutoCAD, MS Windows, MS Office, OS X, Mathematica, IDL, SigmaPlot, and SAS is used under university-wide site licenses.
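As referenced in the flood-sensing item above, the following is a schematic sketch of the ingest-process-serve pipeline used by IFIS and the bridge-sensor network. The table layout, field names, and use of SQLite here are illustrative assumptions for the sketch, not the production IFIS implementation.

```python
"""Schematic sketch of an ingest-process-serve sensor pipeline: one stage
decodes raw packets, a second processes and stores them in a database, and
a web layer queries the stored products. Names and schema are illustrative."""
import json
import sqlite3


def ingest(raw_packet: bytes) -> dict:
    """Stage 1: decode a raw sensor packet into a record."""
    return json.loads(raw_packet.decode("utf-8"))


def process_and_store(record: dict, db: sqlite3.Connection) -> None:
    """Stage 2: persist a derived product for later modeling."""
    db.execute(
        "INSERT INTO stage_readings (site, stage_m, observed_at) VALUES (?, ?, ?)",
        (record["site"], record["stage_m"], record["observed_at"]),
    )
    db.commit()


def latest_reading(db: sqlite3.Connection, site: str) -> tuple:
    """Stage 3: the kind of query a web front end would run to serve the product."""
    cur = db.execute(
        "SELECT stage_m, observed_at FROM stage_readings "
        "WHERE site = ? ORDER BY observed_at DESC LIMIT 1",
        (site,),
    )
    return cur.fetchone()


if __name__ == "__main__":
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE stage_readings (site TEXT, stage_m REAL, observed_at TEXT)")
    packet = b'{"site": "IowaRiver-01", "stage_m": 2.41, "observed_at": "2017-06-01T12:00:00Z"}'
    process_and_store(ingest(packet), db)
    print(latest_reading(db, "IowaRiver-01"))
```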