Research Computing Environment
Location: Stanley Hydraulics Laboratory, fourth floor
Contact: Mark Wilson
IIHR—Hydroscience & Engineering (IIHR) maintains a diverse set of computing resources and facilities. Over the past two decades, IIHR has been at the forefront of high-performance computing (HPC) parallel applications, moving from several large Silicon Graphics Power Challenge Array shared-memory systems, through a Sun Microsystems distributed-memory system, to the current large-node distributed-memory systems. Our codes are being implemented on Nvidia Kepler and Intel Xeon Phi highly parallel systems and within various cloud computing environments.
The following is a partial list of equipment, services, and software available to all IIHR affiliates and students.
- The primary compute platform is a high-performance computing (HPC) cluster called Helium. It is a shared system built from Hewlett-Packard DL160 nodes that features 3,800 total cores, 12 TB of memory, more than 500 TB of storage, a 40 Gbps Voltaire InfiniBand QDR message-passing fabric for MPI communications, and three Ethernet networks for management and NFS storage. The cluster queuing system, Sun Grid Engine, allows individual users to run jobs far beyond the limits of their own dedicated hardware. The programming environment includes OpenMP, MPI, and the Intel and GNU compiler and tool suites. The cluster was acquired with funding from the NIH, AFOSR, and a number of individual researcher-led contributions, in addition to monies from the College of Engineering and the university provost. It is operated by IIHR in conjunction with ITS and a group of collaborating researchers.
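To illustrate how a batch job might be prepared for the Sun Grid Engine queue described above, the sketch below assembles a submission script. The parallel-environment name (`orte`), queue name (`all.q`), and job parameters are hypothetical placeholders; the real values depend on the local cluster configuration.

```python
def sge_job_script(job_name, cores, hours, command):
    """Build a Sun Grid Engine submission script for an MPI job.

    The parallel environment ("orte") and queue ("all.q") names are
    placeholders; consult the local cluster documentation for real ones.
    """
    return "\n".join([
        "#!/bin/bash",
        f"#$ -N {job_name}",          # job name shown by qstat
        "#$ -cwd",                    # run from the submission directory
        f"#$ -pe orte {cores}",       # request slots in a parallel environment
        f"#$ -l h_rt={hours}:00:00",  # hard wall-clock limit
        "#$ -q all.q",                # target queue (placeholder)
        f"mpirun -np {cores} {command}",
    ])

script = sge_job_script("flow_sim", 64, 12, "./solver input.dat")
print(script)
```

The resulting text would be saved to a file and handed to `qsub`; SGE reads the `#$` comment directives to obtain the resource requests.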
- A second HPC system, Neon, came online in December 2013 to augment the HPC resources available to IIHR researchers. Like Helium, Neon is operated by IIHR in conjunction with ITS and a group of collaborating researchers from around the university. Neon is a shared system with (currently) 3,884 standard cores, 2,280 Xeon Phi cores, 27 TB of memory, 500 TB of storage, and a 40 Gbps InfiniBand QDR message-passing fabric.
- IIHR operates several large-scale data harvesting and processing systems related to flood sensing and modeling. The Iowa Flood Information System (IFIS) collects LDM and other weather data and builds a sequence of products for later modeling. Raw data packets are ingested on one system and passed to another for processing and storage in a database; a third system provides web-based access to the resulting data products. Similarly, a network of bridge-mounted flow sensors supplies data to servers that handle it in a manner similar to the IFIS feeds. This architecture has proven scalable and reliable.
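The three-tier ingest/process/serve flow described above can be sketched in miniature. The stage functions, sensor identifiers, and record layout below are invented for illustration and do not reflect the actual IFIS implementation, which spans separate systems and a real database.

```python
from datetime import datetime, timezone

def ingest(packet):
    """Stage 1: accept a raw sensor packet and timestamp it on arrival."""
    return {"raw": packet, "received": datetime.now(timezone.utc).isoformat()}

def process(record):
    """Stage 2: parse the raw payload into a structured data product."""
    sensor_id, value = record["raw"].split(",")
    record.update({"sensor": sensor_id, "stage_ft": float(value)})
    return record

def store(record, database):
    """Stage 3: persist the product; a real system would write to a database."""
    database.setdefault(record["sensor"], []).append(record)
    return database

# Hypothetical packets from two bridge-mounted sensors flow through the stages.
db = {}
for packet in ["bridge-017,12.4", "bridge-017,12.6", "bridge-042,8.1"]:
    store(process(ingest(packet)), db)

print(len(db["bridge-017"]))  # → 2
```

Separating the stages this way is what makes the architecture scale: each tier can run on its own host and be replicated independently.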
- HPC at IIHR is augmented by 18 Silicon Mechanics storage units providing 750 TB of storage in a RAID 60 configuration. This storage is replicated to an offsite location, with hourly snapshots enabling user-initiated file recovery.
- Very large-scale computations are performed at national and international computation centers accessed through longstanding institutional relationships. In addition to the NSF and DOD/DOE centers (e.g., NCSA, Argonne National Laboratory), IIHR has developed a continuing collaboration with the National Center for High Performance Computing (NCHC) in Taiwan.
- Eighty Linux workstations and more than 300 individual PCs running MS Windows 7 complement the centralized facilities. Thirty PC-based servers handle web, FTP, security, and specialized database services. Many of the servers are virtualized on VMware hosts at IIHR and at the centralized Information Technology Facility (ITF). In addition, a number of user-located storage devices, publication-quality color printers, scanners, cameras, and other peripherals are in use.
- This hardware is complemented by a carefully selected set of public-domain, commercial, and proprietary software packages, including Tecplot, Gridgen, Fluent, FlowLab, Matlab, Origin, ERDAS, ERMapper, ESRI, Skyview, and the core GNU utilities. Additionally, software such as AutoCAD, MS Windows, MS Office, OS X, Mathematica, IDL, SigmaPlot, and SAS is used under university-wide site licenses.