Article

Inference-Optimized AI and High Performance Computing for Gravitational Wave Detection at Scale

Journal

FRONTIERS IN ARTIFICIAL INTELLIGENCE
Volume 5, 2022

Publisher

FRONTIERS MEDIA SA
DOI: 10.3389/frai.2022.828672

Keywords

gravitational waves; black holes; AI; HPC; GPU-accelerated computing

Funding

  1. National Science Foundation (NSF) [OAC-1931561, OAC-1934757]
  2. NSF's Major Research Instrumentation program
  3. HAL cluster [DE-AC05-00OR22725]
  4. University of Illinois at Urbana-Champaign
  5. NVIDIA
  6. DOE Office of Science User Facility
  7. [OAC-1725729]
  8. [DE-AC02-06CH11357]

This study presents an ensemble of AI models for gravitational wave detection, trained and optimized on supercomputers, which achieved fast processing and maintained the accuracy of traditional models. It offers the necessary tools for accelerated, AI-driven gravitational wave detection at scale.
We introduce an ensemble of artificial intelligence models for gravitational wave detection that we trained on the Summit supercomputer, using 32 nodes (equivalent to 192 NVIDIA V100 GPUs) within 2 h. Once fully trained, we optimized these models for accelerated inference using NVIDIA TensorRT. We deployed our inference-optimized AI ensemble on the ThetaGPU supercomputer at the Argonne Leadership Computing Facility to conduct distributed inference. Using the entire ThetaGPU supercomputer, consisting of 20 nodes, each with 8 NVIDIA A100 Tensor Core GPUs and 2 AMD Rome CPUs, our NVIDIA TensorRT-optimized AI ensemble processed an entire month of advanced LIGO data (including Hanford and Livingston data streams) within 50 s. Our inference-optimized AI ensemble retains the same sensitivity as traditional AI models: it identifies all known binary black hole mergers previously identified in this advanced LIGO dataset and reports no misclassifications, while also providing a 3X inference speedup over traditional artificial intelligence models. We used time slides to quantify the performance of our AI ensemble when processing up to 5 years' worth of advanced LIGO data. In this synthetically enhanced dataset, our AI ensemble reports an average of one misclassification per month of searched advanced LIGO data. We also present the receiver operating characteristic curve of our AI ensemble computed with this 5-year-long advanced LIGO dataset. This approach provides the tools required to conduct accelerated, AI-driven gravitational wave detection at scale.
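The time-slide technique used above to quantify the background can be sketched in a few lines: shifting one detector's trigger times by lags much larger than the ~10 ms Hanford-Livingston light-travel time destroys any true astrophysical coincidences, so coincidences that survive the shifts measure the accidental (noise) background rate. The function names, trigger representation, and parameter values below are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

def count_coincidences(t_a, t_b, window):
    """Count triggers in t_a that have a partner in t_b within +/- window seconds."""
    t_b = np.sort(t_b)
    idx = np.searchsorted(t_b, t_a)
    lo = np.clip(idx - 1, 0, len(t_b) - 1)
    hi = np.clip(idx, 0, len(t_b) - 1)
    nearest = np.minimum(np.abs(t_a - t_b[lo]), np.abs(t_a - t_b[hi]))
    return int(np.sum(nearest <= window))

def time_slide_background(t_h1, t_l1, obs_time, n_slides=100,
                          slide_step=7.0, window=0.01):
    """Estimate the accidental-coincidence rate (per second) by circularly
    shifting the L1 trigger times by multiples of slide_step (seconds).

    Each nonzero lag yields an independent realization of the noise
    background, so n_slides slides effectively multiply the analyzed
    background time by n_slides (how a month of data probes years of
    background). slide_step must exceed the coincidence window plus the
    ~10 ms light-travel time between detectors.
    """
    total = 0
    for k in range(1, n_slides + 1):
        shifted = (t_l1 + k * slide_step) % obs_time  # circular time slide
        total += count_coincidences(t_h1, shifted, window)
    return total / (n_slides * obs_time)
```

A false-alarm rate of one per month, as reported for the ensemble, would correspond to this function returning roughly 1 / (30 * 86400) per second on real trigger lists.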
