Article

Normalized Mutual Information Feature Selection

Journal

IEEE TRANSACTIONS ON NEURAL NETWORKS
Volume 20, Issue 2, Pages 189-201

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNN.2008.2005601

Keywords

Feature selection; genetic algorithms; multilayer perceptron (MLP) neural networks; normalized mutual information (MI)

Funding

  1. Conicyt-Chile
  2. Fondecyt [1050751, 7060211]

Abstract

A filter method of feature selection based on mutual information, called normalized mutual information feature selection (NMIFS), is presented. NMIFS is an enhancement over Battiti's MIFS, MIFS-U, and mRMR methods. The average normalized mutual information is proposed as a measure of redundancy among features. NMIFS outperformed MIFS, MIFS-U, and mRMR on several artificial and benchmark data sets without requiring a user-defined parameter. In addition, NMIFS is combined with a genetic algorithm to form a hybrid filter/wrapper method called GAMIFS, which includes an initialization procedure and a mutation operator based on NMIFS to speed up the convergence of the genetic algorithm. GAMIFS overcomes the limitations of incremental search algorithms that are unable to find dependencies between groups of features.
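For illustration, the sketch below shows a greedy NMIFS-style selection loop: each step adds the candidate feature whose relevance I(C; f_i) minus its average normalized mutual information with the already selected features is largest, with NI(f_i; f_s) = I(f_i; f_s) / min{H(f_i), H(f_s)}. The histogram-based MI estimator, the bin count, and the function names are assumptions made for this sketch, not the estimator used in the paper.

    import numpy as np

    def entropy(x, bins=10):
        # Shannon entropy (nats) of a discretized 1-D variable.
        p, _ = np.histogram(x, bins=bins)
        p = p / p.sum()
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def mutual_info(x, y, bins=10):
        # I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from a joint histogram.
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = pxy / pxy.sum()
        px = pxy.sum(axis=1)
        py = pxy.sum(axis=0)
        hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
        hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
        hxy = -np.sum(pxy[pxy > 0] * np.log(pxy[pxy > 0]))
        return hx + hy - hxy

    def normalized_mi(x, y, bins=10):
        # NI(X;Y) = I(X;Y) / min{H(X), H(Y)}, bounded in [0, 1].
        denom = min(entropy(x, bins), entropy(y, bins))
        return mutual_info(x, y, bins) / denom if denom > 0 else 0.0

    def nmifs(X, y, k, bins=10):
        # Greedy NMIFS-style selection of k feature indices from X (n_samples, n_features).
        n_features = X.shape[1]
        relevance = [mutual_info(X[:, i], y, bins) for i in range(n_features)]
        selected = [int(np.argmax(relevance))]      # start with the most relevant feature
        candidates = set(range(n_features)) - set(selected)
        while len(selected) < k and candidates:
            best, best_score = None, -np.inf
            for i in candidates:
                # Average normalized MI with already selected features (redundancy term).
                redundancy = np.mean([normalized_mi(X[:, i], X[:, s], bins) for s in selected])
                score = relevance[i] - redundancy   # relevance minus redundancy
                if score > best_score:
                    best, best_score = i, score
            selected.append(best)
            candidates.remove(best)
        return selected

On a toy data set, a call such as nmifs(X, y, k=3) returns the indices of three features chosen to balance relevance to the class variable against redundancy with the features already selected, which is the trade-off the NMIFS criterion formalizes.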
