Journal
APPLIED SCIENCES-BASEL
Volume 9, Issue 23
Publisher
MDPI
DOI: 10.3390/app9235191
Keywords
machine learning; black box model; model interpretability; feature interaction; feature importance
Funding
- ICT R&D program of MSIT/IITP [2018-0-00242]
Abstract
There has been considerable development in machine learning in recent years, with some remarkable successes. Although there are many high-performance methods, the interpretation of learning models remains challenging: understanding the underlying reasoning behind a specific prediction is difficult. Various studies have attempted to explain the working principle of learning models using techniques such as feature importance, partial dependency, feature interaction, and the Shapley value. This study introduces a new feature interaction measure. While recent studies have measured feature interaction using partial dependency, this study redefines feature interaction in terms of prediction performance. The proposed measure is easy to interpret, faster than partial dependency-based measures, and useful for explaining feature interactions that affect prediction performance in both regression and classification models.
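The abstract defines feature interaction in terms of prediction performance rather than partial dependency. As an illustrative sketch only (not the paper's actual measure), one performance-based formulation compares the accuracy drop from jointly permuting two features with the sum of the drops from permuting each feature alone; all function names and the exact formula below are assumptions for illustration.

```python
# Hedged sketch of a performance-based feature-interaction measure.
# If shuffling two features together degrades accuracy more than the
# sum of shuffling each alone, the pair is treated as interacting.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def permuted_score(model, X, y, cols):
    """Accuracy after jointly shuffling the given columns."""
    Xp = X.copy()
    idx = rng.permutation(len(X))
    Xp[:, cols] = Xp[idx][:, cols]
    return accuracy_score(y, model.predict(Xp))

def interaction(model, X, y, i, j):
    """Illustrative interaction score for features i and j."""
    base = accuracy_score(y, model.predict(X))
    drop_i = base - permuted_score(model, X, y, [i])
    drop_j = base - permuted_score(model, X, y, [j])
    drop_ij = base - permuted_score(model, X, y, [i, j])
    # Positive when the joint effect exceeds the sum of marginal effects.
    return drop_ij - (drop_i + drop_j)

X, y = make_classification(n_samples=500, n_features=5,
                           n_informative=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print(round(interaction(model, X_te, y_te, 0, 1), 3))
```

Because the score depends only on model predictions and a loss, it applies unchanged to regression (e.g., with R² or MSE in place of accuracy), which matches the abstract's claim of covering both regression and classification models.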