Journal
IEEE SENSORS JOURNAL
Volume 23, Issue 6, Pages 5988-5996
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/JSEN.2023.3240470
Keywords
Sensors; Sensor arrays; Gases; Gas detectors; Quantization (signal); Feature extraction; Sensor phenomena and characterization; Gas classification; gas sensor array (GSA); long short-term memory (LSTM); self-attention mechanism
Abstract
Gas sensor array (GSA) data is a sequential series of values that represents the temporal conditions of the existence/absence/mixture of gases and exhibits similarities to the textual stream of natural languages that represents semantic information. We speculate, and subsequently prove, that self-attention mechanisms also exist in GSA data and can be exploited for gas classification and recognition. We first convert GSA data into a 1-D token series (called WORDs in this work) through sampling and quantization of the sensor values and then use an enhanced long short-term memory (LSTM) revision network, called LSTM-attention, to extract the self-attention mechanism in the GSA data. We demonstrate that LSTM-attention achieves a much better performance (99.6%) than CNN-based networks as well as other GSA data-processing techniques on the UCI dynamic gases dataset. We also find that the self-attention mechanism varies with different sampling and quantization levels during data acquisition.
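The sampling-and-quantization step described above, which maps continuous sensor readings onto a 1-D token series (the WORDs), can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: the function name, the uniform binning scheme, and the default level count are all assumptions.

```python
import numpy as np

def to_word_tokens(readings, n_levels=16, sample_every=1):
    """Quantize a 1-D sensor time series into integer tokens ('WORDs').

    Hedged sketch of the sampling-and-quantization idea: subsample the
    series, then map each value onto one of n_levels uniform bins
    spanning the observed range. All parameter choices are assumptions.
    """
    x = np.asarray(readings, dtype=float)[::sample_every]  # sampling
    lo, hi = x.min(), x.max()
    if hi == lo:
        # constant signal: every sample falls in the lowest bin
        return np.zeros(len(x), dtype=int)
    # quantization: scale to [0, n_levels) and floor to a bin index
    tokens = np.floor((x - lo) / (hi - lo) * n_levels).astype(int)
    return np.clip(tokens, 0, n_levels - 1)

# example: a rising sensor response becomes a monotone token series
print(to_word_tokens([0.0, 1.0, 2.0, 3.0], n_levels=4).tolist())  # → [0, 1, 2, 3]
```

The resulting integer sequence can then be fed to a sequence model (here, the paper's LSTM-attention network) in the same way a tokenized sentence is fed to a language model; varying `n_levels` and `sample_every` corresponds to the different quantization and sampling levels whose effect the abstract mentions.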