Proceedings Paper

Faulty Requirements Made Valuable: On the Role of Data Quality in Deep Learning

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/AIRE51212.2020.00016

Keywords

Data quality; stationarity; recurrent neural network; metamorphic testing; smart sewer systems

Funding

  1. U.S. National Science Foundation [CCF 1350487]


Large collections of data help evolve deep learning into the state-of-the-art in solving many artificial intelligence problems. However, the requirements engineering (RE) community has yet to adapt to such sweeping changes caused exclusively by data. One reason is that traditional requirements quality attributes, such as unambiguity, become less applicable to data, and so do requirements fault detection techniques like inspections. In this paper, we view deep learning as a class of machines whose effects must be evaluated with direct consideration of inherent data quality attributes: accuracy, consistency, currentness, etc. We substantiate this view by altering the stationarity of multivariate time-series data, and by further analyzing how the stationarity changes affect the behavior of a recurrent neural network in the context of predicting combined sewer overflow. Our work sheds light on the active role RE plays in deep learning.
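The paper's stationarity alterations are specific to its sewer-overflow dataset and are not detailed in this record. As an illustrative sketch only (not the authors' procedure), one common way to perturb the stationarity of a time series is to inject a deterministic trend, which makes the mean drift over time:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# A (weakly) stationary AR(1) series: mean and variance stay constant over time.
stationary = np.zeros(n)
for t in range(1, n):
    stationary[t] = 0.5 * stationary[t - 1] + rng.normal()

# Injecting a linear trend breaks stationarity: the mean now grows with t.
trend = 0.02 * np.arange(n)
nonstationary = stationary + trend

def halves_mean_gap(x):
    """Absolute difference between the means of the two halves of the series."""
    mid = len(x) // 2
    return abs(x[mid:].mean() - x[:mid].mean())

# The trended series shows a much larger gap between its early and late means,
# a simple symptom of non-stationarity.
print(halves_mean_gap(stationary), halves_mean_gap(nonstationary))
```

Feeding both variants to the same recurrent model and comparing predictions is the kind of metamorphic-style check the abstract alludes to, since the perturbation's effect on the data distribution is known in advance.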


