Article

Detecting Marine Organisms Via Joint Attention-Relation Learning for Marine Video Surveillance

Journal

IEEE Journal of Oceanic Engineering
Volume 47, Issue 4, Pages 959-974

Publisher

IEEE - Institute of Electrical and Electronics Engineers, Inc.
DOI: 10.1109/JOE.2022.3162864

Keywords

Detectors; Organisms; Video surveillance; Oceans; Monitoring; Visualization; Real-time systems; Convolutional neural network (CNN); marine organism detection; marine video surveillance; relation model; visual attention

Funding

  1. National Natural Science Foundation of China [62171421, 61771440]
  2. Qingdao Postdoctoral Applied Research Project of China

This article presents a method for marine organism detection based on a visual attention and relation mechanism. Applying an improved attention-relation module to an efficient marine organism detector enhances the discrimination of organisms in complex underwater environments, and the proposed method outperforms state-of-the-art approaches on the experimental data sets.
A better way to understand marine life and ecosystems is to surveil and analyze the activities of marine organisms, and research on marine video surveillance has recently become increasingly popular. With the rapid development of deep learning (DL), convolutional neural networks (CNNs) have made remarkable progress in image/video understanding tasks. In this article, we explore a visual attention and relation mechanism for marine organism detection and propose a new way to apply an improved attention-relation (AR) module to an efficient marine organism detector (EMOD), which effectively enhances the discrimination of organisms in complex underwater environments. We design our EMOD by integrating current state-of-the-art (SOTA) detection methods so that organisms can be detected and marine environments surveilled in a fast, real-time fashion for high-resolution marine video surveillance. We implement our EMOD and AR module on the annotated video data sets provided by the public data challenges held in conjunction with the CVPR 2018 and 2019 workshops, which are supported by the National Oceanic and Atmospheric Administration (NOAA) and its research work (NMFS-PIFSC-83). Experimental results and visualizations demonstrate that our application of the AR module is effective and efficient, and that our EMOD equipped with AR modules outperforms SOTA performance on the experimental data sets. We also provide application suggestions for the EMOD framework. Our code is publicly available at https://github.com/zhenglab/EMOD.
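The abstract does not spell out the internals of the AR module; the authors' actual implementation is in the linked repository. Purely as an illustrative sketch, the PyTorch-style code below shows one common way such a block can be realized: channel attention that re-weights feature maps, followed by a non-local relation operation over spatial positions, added back through a residual connection. The class name AttentionRelationBlock and the parameters channels and reduction are hypothetical and not taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionRelationBlock(nn.Module):
    """Illustrative attention-relation block (not the paper's exact design):
    channel attention followed by a self-attention-style relation over
    spatial positions, with a residual connection."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        # Channel attention (squeeze-and-excitation style gate)
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Relation (non-local) operation over spatial positions
        self.query = nn.Conv2d(channels, channels // 2, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // 2, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.out = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x):
        # Re-weight channels to emphasize discriminative responses
        x = x * self.channel_gate(x)
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (B, HW, C/2)
        k = self.key(x).flatten(2)                      # (B, C/2, HW)
        v = self.value(x).flatten(2).transpose(1, 2)    # (B, HW, C)
        # Pairwise relations between all spatial positions
        attn = F.softmax(q @ k / (q.shape[-1] ** 0.5), dim=-1)  # (B, HW, HW)
        rel = (attn @ v).transpose(1, 2).reshape(b, c, h, w)
        # Residual connection keeps the detector backbone stable
        return x + self.out(rel)

# Example usage (hypothetical): refine a 256-channel backbone feature map
# feat = torch.randn(1, 256, 64, 64)
# refined = AttentionRelationBlock(256)(feat)

In a detector, a block like this would typically sit between the backbone and the detection head so that feature maps are refined before box prediction; how and where the AR module is actually inserted in EMOD is described in the paper and repository, not here.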
