Journal
PATTERN RECOGNITION
Volume 88, Pages 321-330
Publisher
ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2018.11.032
Keywords
Naive Bayes; Attribute weighting; Weight optimization
Funding
- National Natural Science Foundation of China [U1711267]
- Fundamental Research Funds for the Central Universities [CUG2018JM18]
Due to its ease of construction and interpretation, along with its good performance, naive Bayes (NB) is widely used to address classification problems in real-world applications. To alleviate its conditional independence assumption, many attribute weighting approaches have been proposed. However, almost all of these approaches assign each attribute the same (global) weight for all classes. In this paper, we call this general attribute weighting and argue that, for NB, attribute weighting should be class-specific (class-dependent). Based on this premise, we propose a new attribute weighting paradigm called class-specific attribute weighting, which discriminatively assigns each attribute a specific weight for each class. We call the resulting model class-specific attribute weighted naive Bayes (CAWNB). CAWNB selects class-specific attribute weights to maximize the conditional log likelihood (CLL) objective function or to minimize the mean squared error (MSE) objective function, yielding two versions, which we denote CAWNB_CLL and CAWNB_MSE, respectively. Extensive empirical studies show that both CAWNB_CLL and CAWNB_MSE obtain more satisfactory experimental results than NB and other existing state-of-the-art general attribute weighting approaches. We believe that, for NB, class-specific attribute weighting could be a more fine-grained approach than general attribute weighting. (C) 2018 Elsevier Ltd. All rights reserved.
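To make the class-specific weighting idea concrete, the following is a minimal sketch (not the authors' implementation) of a weighted NB classifier over categorical attributes in which each attribute j carries a per-class weight w[c][j], applied as an exponent on P(a_j | c). The class and method names are hypothetical, and the CLL/MSE weight optimization described in the paper is omitted here: the weights are simply supplied by the caller (uniform weights recover plain NB with Laplace smoothing).

```python
from collections import defaultdict
import math

class ClassSpecificWeightedNB:
    """Weighted naive Bayes over categorical attributes.

    Scores a class c for instance x = (a_1, ..., a_m) as
        log P(c) + sum_j w[c][j] * log P(a_j | c),
    i.e. each attribute gets a class-specific weight, used as an
    exponent on its conditional probability. (Sketch only; the
    paper's CLL/MSE weight optimization is not implemented.)
    """

    def fit(self, X, y, weights):
        # weights: dict mapping class label -> list of per-attribute weights
        self.weights = weights
        self.classes = sorted(set(y))
        n_attrs = len(X[0])
        # Observed value set per attribute (used for Laplace smoothing).
        self.values = [set(row[j] for row in X) for j in range(n_attrs)]
        self.prior = {}
        self.cond = {}  # (class, attr index, value) -> smoothed P(value | class)
        for c in self.classes:
            rows = [x for x, label in zip(X, y) if label == c]
            # Laplace-smoothed class prior.
            self.prior[c] = (len(rows) + 1) / (len(X) + len(self.classes))
            for j in range(n_attrs):
                counts = defaultdict(int)
                for row in rows:
                    counts[row[j]] += 1
                for v in self.values[j]:
                    # Laplace-smoothed conditional probability.
                    self.cond[(c, j, v)] = (
                        (counts[v] + 1) / (len(rows) + len(self.values[j]))
                    )
        return self

    def predict(self, x):
        best, best_score = None, float("-inf")
        for c in self.classes:
            score = math.log(self.prior[c])
            for j, v in enumerate(x):
                # Class-specific weight scales this attribute's log-probability.
                score += self.weights[c][j] * math.log(self.cond[(c, j, v)])
            if score > best_score:
                best, best_score = c, score
        return best
```

With uniform weights (all 1.0) this reduces to standard NB; the point of the paper is that learning a separate weight vector per class (e.g. by maximizing CLL or minimizing MSE on the training data) gives a strictly more expressive family than a single global weight per attribute.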