3.8 Proceedings Paper

Shapley-NAS: Discovering Operation Contribution for Neural Architecture Search

Related references

Note: Only a subset of the references is listed.
Article Computer Science, Artificial Intelligence

Exploiting Operation Importance for Differentiable Neural Architecture Search

Yuan Zhou et al.

Summary: Recent differentiable neural architecture search (NAS) methods have made significant progress in reducing computational costs. A novel indicator is proposed to represent operation importance and guide the model search effectively. Experimental results show that the method discovers high-performance architectures while preserving search efficiency.

IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2022)

Proceedings Paper Computer Science, Artificial Intelligence

Who's Responsible? Jointly Quantifying the Contribution of the Learning Algorithm and Data

Gal Yona et al.

Summary: When a model trained by a learning algorithm on a dataset performs poorly on specific subpopulations, responsibility may lie with the dataset, the algorithm, or both. As machine learning becomes more widespread, this joint credit-assignment problem between algorithms and datasets grows increasingly significant.

AIES '21: PROCEEDINGS OF THE 2021 AAAI/ACM CONFERENCE ON AI, ETHICS, AND SOCIETY (2021)

Article Computer Science, Artificial Intelligence

From local explanations to global understanding with explainable AI for trees

Scott M. Lundberg et al.

NATURE MACHINE INTELLIGENCE (2020)

Proceedings Paper Computer Science, Artificial Intelligence

Densely Connected Convolutional Networks

Gao Huang et al.

30TH IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR) (2017)

Article Computer Science, Interdisciplinary Applications

Polynomial calculation of the Shapley value based on sampling

Javier Castro et al.

COMPUTERS & OPERATIONS RESEARCH (2009)