4.5 Article

Variable selection for linear regression in large databases: exact methods

Journal

APPLIED INTELLIGENCE
Volume 51, Issue 6, Pages 3736-3756

Publisher

SPRINGER
DOI: 10.1007/s10489-020-01927-6

Keywords

Variable selection; Linear regression; Branch & Bound methods; Heuristics

Funding

  1. FEDER funds [BU062U16, COV2000375]
  2. Spanish Ministry of Economy and Competitiveness [ECO2016-76567-C4-2-R, PID2019-104263RB-C44]
  3. Regional Government of Castilla y Leon, Spain [BU329U14, BU071G19]
  4. Regional Government of Castilla y Leon

Abstract

This paper analyzes the variable selection problem in the context of linear regression for large databases. The problem consists of selecting a small subset of independent variables that performs the prediction task optimally. It has a wide range of applications: one important type is the design of composite indicators in areas such as sociology and economics, and other important applications of variable selection in linear regression arise in fields such as chemometrics, genetics, and climate prediction, among many others. For this problem, we propose a Branch & Bound method. This is an exact method and therefore guarantees optimal solutions. We also provide strategies that enable the method to be applied to very large databases (with hundreds of thousands of cases) in moderate computation time. A series of computational experiments shows that our method performs well compared to well-known methods from the literature and to commercial software.
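
The sketch below is a generic, minimal illustration of the kind of branch-and-bound pruning used in exact best-subset selection for linear regression; it is not the authors' implementation, and the function names (rss, best_subset_bb), the no-intercept formulation, and the fixed subset size k are illustrative assumptions. The key idea it shows: fitting the currently chosen variables together with all remaining candidates gives a residual sum of squares that lower-bounds every completion of that node, so branches that cannot improve on the incumbent are discarded.

```python
# Illustrative branch-and-bound best-subset selection for linear regression.
# NOT the paper's algorithm; a textbook-style sketch under the assumptions above.
import numpy as np

def rss(X, y, cols):
    """Residual sum of squares of an OLS fit (no intercept) on the given columns."""
    if not cols:
        return float(y @ y)
    beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
    resid = y - X[:, cols] @ beta
    return float(resid @ resid)

def best_subset_bb(X, y, k):
    """Return (best_rss, best_cols) over all subsets of exactly k columns."""
    p = X.shape[1]
    best = {"rss": np.inf, "cols": None}

    def recurse(chosen, next_var):
        remaining = list(range(next_var, p))
        # Not enough candidates left to reach size k: dead end.
        if len(chosen) + len(remaining) < k:
            return
        if len(chosen) == k:
            r = rss(X, y, chosen)
            if r < best["rss"]:
                best["rss"], best["cols"] = r, list(chosen)
            return
        # Bound: fitting chosen plus ALL remaining variables can only lower the
        # RSS, so it lower-bounds every subset reachable from this node.
        if rss(X, y, chosen + remaining) >= best["rss"]:
            return  # prune this branch
        recurse(chosen + [next_var], next_var + 1)   # include next_var
        recurse(chosen, next_var + 1)                # exclude next_var

    recurse([], 0)
    return best["rss"], best["cols"]

# Tiny usage example on synthetic data (expected to recover columns [1, 4]).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = 2.0 * X[:, 1] - 1.5 * X[:, 4] + rng.normal(scale=0.1, size=200)
print(best_subset_bb(X, y, k=2))
```

The pruning rule relies only on the monotonicity of RSS (adding regressors never increases it), which is what makes the search exact; in practice, scaling such a search to very large databases requires the additional strategies discussed in the paper.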
