3.8 Proceedings Paper

Faulty Requirements Made Valuable: On the Role of Data Quality in Deep Learning

Publisher

IEEE Computer Society
DOI: 10.1109/AIRE51212.2020.00016

Keywords

Data quality; stationarity; recurrent neural network; metamorphic testing; smart sewer systems

Funding

  1. U.S. National Science Foundation [CCF-1350487]


Abstract

Large collections of data have helped evolve deep learning into the state of the art for solving many artificial intelligence problems. However, the requirements engineering (RE) community has yet to adapt to such sweeping changes driven exclusively by data. One reason is that traditional requirements quality attributes like unambiguity become less applicable to data, as do requirements fault detection techniques like inspections. In this paper, we view deep learning as a class of machines whose effects must be evaluated with direct consideration of inherent data quality attributes: accuracy, consistency, currentness, etc. We substantiate this view by altering the stationarity of multivariate time-series data, and by further analyzing how the stationarity changes affect the behavior of a recurrent neural network in the context of predicting combined sewer overflow. Our work sheds light on the active role RE plays in deep learning.
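The abstract's central manipulation, altering the stationarity of time-series inputs and observing how a recurrent network responds, can be illustrated with a short sketch. The Python snippet below is a hypothetical example, not the authors' method: it uses the standard Augmented Dickey-Fuller (ADF) test from statsmodels to check stationarity and plain first-differencing to alter it. The random-walk data, the alpha threshold, and the is_stationary helper are illustrative assumptions; the paper's sewer-overflow data and exact transformations are not reproduced here.

```python
# Minimal sketch (not the authors' code): alter the stationarity of a
# time series and verify the change with the Augmented Dickey-Fuller test.
import numpy as np
from statsmodels.tsa.stattools import adfuller

def is_stationary(series: np.ndarray, alpha: float = 0.05) -> bool:
    """ADF test: reject the unit-root null (p < alpha) => treat as stationary."""
    p_value = adfuller(series)[1]  # adfuller returns (stat, pvalue, ...)
    return p_value < alpha

rng = np.random.default_rng(seed=42)
walk = np.cumsum(rng.normal(size=500))  # random walk: classic non-stationary signal

print(is_stationary(walk))           # typically False: unit root present
print(is_stationary(np.diff(walk)))  # typically True: increments are i.i.d. noise
```

In a metamorphic-testing setup of the kind the keywords suggest, the original and the stationarity-altered series would be fed to the same recurrent network and its prediction behavior compared across the two inputs.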
