Article

Three-way fusion measures and three-level feature selections based on neighborhood decision systems

Journal

APPLIED SOFT COMPUTING
Volume 148, Article 110842

Publisher

ELSEVIER
DOI: 10.1016/j.asoc.2023.110842

Keywords

Neighborhood decision system; Uncertainty measure; Feature selection; Three-way decision; Three-level analysis; Granular computing

This paper systematically constructs three-way fusion measures by combining algebraic and informational measures, and it hierarchically investigates three-level feature selections. Data experiments show that the new algorithms outperform existing ones in classification performance.
Uncertainty measures exhibit algebraic and informational perspectives, and integrating the two views facilitates feature selection in classification learning. For neighborhood decision systems (NDSs), two basic feature-selection algorithms (called JE-FS and DE-FS) already exist, using joint and decisional entropies, respectively, but they leave room for improvement by informationally fusing algebraic measures. In this paper, on NDSs, three-way fusion measures are systematically constructed by combining three-way algebraic and informational measures, and three-level feature selections are then hierarchically investigated using the corresponding monotonic and nonmonotonic measures and strategies. First, the accuracy, granularity, and composite granularity-accuracy constitute the three-way algebraic measures, while the joint, conditional, and decisional entropies (JE, CE, DE) formulate the three-way informational measures. Then, the three-way algebraic and informational measures are combined via normalization and multiplication, so three-way fusion measures based on JE, CE, and DE are established; these new measures exhibit granulation monotonicity and nonmonotonicity. Furthermore, by the relevant measures and their monotonicity/nonmonotonicity, three-level feature selections (with null, single, and double fusion levels) related to JE, CE, and DE are proposed, and corresponding heuristic algorithms are designed by monotonic and nonmonotonic principles. In total, 4 x 3 = 12 selection algorithms emerge, and they extend and improve the current JE-FS and DE-FS. Finally, data experiments validate the related uncertainty measures and granulation properties, and all 12 selection algorithms are compared in classification learning. As a result, the new algorithms outperform JE-FS and DE-FS in classification performance, and the algorithmic improvements accord with the fusion-hierarchical deepening and entropy-systematic development of uncertainty measures. (c) 2023 Elsevier B.V. All rights reserved.
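
The abstract does not spell out the measure definitions, so the following is only a hedged sketch using formulations that are common in the neighborhood rough-set literature; the paper's exact definitions may differ. For a feature subset $B$, with $\delta_B(x)$ the neighborhood of sample $x$ and $[x]_D$ its decision class, typical neighborhood joint and conditional entropies read

$$NH_{B\cup D}(U) = -\frac{1}{|U|}\sum_{x\in U}\log_2\frac{|\delta_B(x)\cap[x]_D|}{|U|}, \qquad NH(D\mid B) = -\frac{1}{|U|}\sum_{x\in U}\log_2\frac{|\delta_B(x)\cap[x]_D|}{|\delta_B(x)|},$$

while algebraic measures are positive-region or granularity quantities such as the dependency degree $\gamma_B(D)=|POS_B(D)|/|U|$. A fusion of the "normalization and multiplication" type described in the abstract then has the generic shape

$$F(B) = A(B)\cdot\widetilde{I}(B),$$

where $A(B)$ is an algebraic term and $\widetilde{I}(B)$ an informational term normalized into $[0,1]$.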
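
A minimal computational sketch of the fusion-and-selection idea appears below. It is not the authors' algorithm: the Chebyshev neighborhood, the particular normalization, and the helper names (neighborhoods, joint_entropy, dependency, fused_measure, greedy_select) are illustrative assumptions; only the overall pattern, fusing a normalized entropy with an algebraic measure and greedily adding the feature with the largest gain, mirrors the abstract's description.

```python
# Illustrative sketch only: greedy feature selection on a neighborhood decision
# system, driven by a fused (algebraic x informational) measure. Assumes the
# feature matrix X is scaled to [0, 1]; the paper's exact measures may differ.
import numpy as np

def neighborhoods(X, features, delta=0.15):
    """Boolean matrix: entry (i, j) is True if x_j lies in the delta-neighborhood
    of x_i under the selected features (Chebyshev distance)."""
    sub = X[:, features]
    dist = np.max(np.abs(sub[:, None, :] - sub[None, :, :]), axis=2)
    return dist <= delta

def joint_entropy(X, y, features, delta=0.15):
    """Neighborhood joint entropy of the features and the decision (common form)."""
    n = len(y)
    nbr = neighborhoods(X, features, delta)
    dec = y[:, None] == y[None, :]                  # same-decision indicator
    joint = np.logical_and(nbr, dec).sum(axis=1)    # |delta_B(x) intersect [x]_D|
    return -np.mean(np.log2(joint / n))

def dependency(X, y, features, delta=0.15):
    """Algebraic term: fraction of samples whose whole neighborhood is
    decision-consistent (a positive-region-style dependency degree)."""
    nbr = neighborhoods(X, features, delta)
    dec = y[:, None] == y[None, :]
    consistent = np.all(~nbr | dec, axis=1)
    return consistent.mean()

def fused_measure(X, y, features, delta=0.15):
    """Normalize the informational term into [0, 1] and multiply it with the
    algebraic one, mirroring the 'normalization and multiplication' fusion."""
    je_norm = joint_entropy(X, y, features, delta) / np.log2(len(y))
    return dependency(X, y, features, delta) * (1.0 - je_norm)

def greedy_select(X, y, delta=0.15, eps=1e-4):
    """Heuristic forward selection: add the feature giving the largest gain of
    the fused measure until no feature improves it by more than eps."""
    selected, remaining, best = [], list(range(X.shape[1])), 0.0
    while remaining:
        score, f = max((fused_measure(X, y, selected + [f], delta), f)
                       for f in remaining)
        if score - best <= eps:
            break
        selected.append(f)
        remaining.remove(f)
        best = score
    return selected
```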
