Article

Feature selection based on the best-path algorithm in high dimensional graphical models

Journal

INFORMATION SCIENCES
Volume 649

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2023.119601

Keywords

Graphical models; Chow-Liu algorithm; Automatic feature selection; Mutual information; Econometric linear models


This paper proposes a new algorithm for automatic feature selection in High Dimensional Graphical Models. The algorithm, called the Best-Path Algorithm (BPA), is a filter method that performs feature selection based on mutual information. In recent years, filter methods have been successfully employed to reduce the size of the input dataset while retaining the feature information relevant to modeling and classification problems. However, existing filter algorithms are mostly heuristic or require high computational effort. The BPA overcomes these drawbacks by exploiting the links between variables brought to the fore by Edwards's algorithm. Once the High Dimensional Graphical Model depicting the probabilistic structure of the variables is determined, the BPA selects the best subset of features by analyzing its path-steps. The path-step containing the variables with the most predictive power for the target variable is identified via the entropy correlation coefficient. This index, based on the (symmetric) Kullback-Leibler divergence, is closely connected to the mutual information that the path-step variables share with the variable of interest. Applying the BPA to simulated and real-world benchmark datasets highlights its potential and its greater effectiveness compared to alternative methods.
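The sketch below is a minimal, hypothetical reading of the pipeline described in the abstract, not the authors' published implementation. It assumes the graphical model is a Chow-Liu-style tree built as a maximum spanning tree over pairwise mutual information, takes the entropy correlation coefficient to be the common normalisation 2*I(X;Y)/(H(X)+H(Y)), and guesses that "path-steps" group features by their distance from the target node in the tree; the paper's exact definitions may differ. All function names are illustrative.

```python
# Hypothetical BPA-style filter sketch; the path-step rule and ECC
# normalisation are assumptions made for illustration only.
import numpy as np
import networkx as nx

def _entropy(p):
    """Shannon entropy (nats) of a probability vector, ignoring zero cells."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mutual_information(x, y):
    """I(X;Y) for two discrete, integer-coded samples of equal length."""
    xs, xi = np.unique(x, return_inverse=True)
    ys, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((xs.size, ys.size))
    np.add.at(joint, (xi, yi), 1.0)
    joint /= joint.sum()
    hx, hy = _entropy(joint.sum(axis=1)), _entropy(joint.sum(axis=0))
    return hx + hy - _entropy(joint.ravel())

def entropy_correlation_coefficient(x, y):
    """One common normalisation: ECC = 2*I(X;Y) / (H(X)+H(Y)), in [0, 1]."""
    _, xi = np.unique(x, return_inverse=True)
    _, yi = np.unique(y, return_inverse=True)
    hx = _entropy(np.bincount(xi) / xi.size)
    hy = _entropy(np.bincount(yi) / yi.size)
    denom = hx + hy
    return 0.0 if denom == 0 else 2.0 * mutual_information(x, y) / denom

def bpa_like_selection(X, y):
    """X: (n_samples, n_features) discrete array; y: discrete target.
    1. Build a Chow-Liu-style tree: maximum spanning tree on pairwise MI.
    2. Group features into 'path-steps' by graph distance from the target
       (an illustrative reading of the paper's path-steps).
    3. Return the step whose features have the highest mean ECC with y."""
    n_feat = X.shape[1]
    cols = [X[:, j] for j in range(n_feat)] + [y]   # target is node n_feat
    G = nx.Graph()
    for a in range(len(cols)):
        for b in range(a + 1, len(cols)):
            G.add_edge(a, b, weight=mutual_information(cols[a], cols[b]))
    tree = nx.maximum_spanning_tree(G)
    dist = nx.single_source_shortest_path_length(tree, n_feat)
    steps = {}
    for j in range(n_feat):
        steps.setdefault(dist[j], []).append(j)
    best_step = max(
        steps,
        key=lambda d: np.mean([entropy_correlation_coefficient(X[:, j], y)
                               for j in steps[d]]),
    )
    return steps[best_step]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, 500)
    X = np.column_stack([y ^ rng.integers(0, 2, 500),   # weakly informative
                         rng.integers(0, 3, 500),        # pure noise
                         y])                             # copy of the target
    print(bpa_like_selection(X, y))                      # indices of the best step
```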


