Article

Application of gradient boosting regression model for the evaluation of feature selection techniques in improving reservoir characterisation predictions

Journal

Journal of Petroleum Science and Engineering
Publisher

Elsevier
DOI: 10.1016/j.petrol.2021.109244

Keywords

Feature selection techniques; Dimensionality reduction techniques; Reservoir characterisation; Ensemble machine learning; Decision tree algorithm

Funding

  1. Universiti Teknologi PETRONAS
  2. Centre of Research in Enhanced Oil Recovery [015LCO-105]


Feature selection is a critical data preprocessing step in machine learning and an effective way to remove irrelevant variables, thereby reducing the dimensionality of the input features. Removing uninformative or, worse, misinformative input columns helps a machine learning model generalise and perform better on new, unseen data. In this paper, eight feature selection techniques paired with a gradient boosting regressor were evaluated through a statistical comparison of their prediction errors and computational efficiency in characterising a shallow marine reservoir. Analysis of the results shows that the best techniques for selecting relevant logs for permeability, porosity and water saturation prediction were the Random Forest, SelectKBest and Lasso regularisation methods, respectively. These techniques not only reduced the dimensionality of the high-dimensional dataset but also achieved low prediction errors, measured by MAE and RMSE, and improved computational efficiency. This indicates that Random Forest, SelectKBest and Lasso regularisation can identify the best input features for permeability, porosity and water saturation predictions, respectively.
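For concreteness, the sketch below pairs the three selectors highlighted in the abstract with scikit-learn's GradientBoostingRegressor and scores each pipeline by MAE and RMSE. It is a minimal illustration under stated assumptions: the synthetic data, the choice of k, the Lasso alpha, the SelectFromModel wrapper and all hyperparameters are placeholders, not the authors' well-log dataset or tuned workflow.

# Hedged sketch: feature selection paired with gradient boosting regression.
# Data, k, alpha and thresholds are illustrative assumptions only.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.feature_selection import SelectFromModel, SelectKBest, f_regression
from sklearn.linear_model import Lasso
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a high-dimensional well-log dataset.
X, y = make_regression(n_samples=500, n_features=20, n_informative=8,
                       noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

selectors = {
    # Tree-based importances (reported best for permeability in the abstract).
    "random_forest": SelectFromModel(
        RandomForestRegressor(n_estimators=200, random_state=0)),
    # Univariate F-test scores (reported best for porosity).
    "select_k_best": SelectKBest(score_func=f_regression, k=8),
    # L1 regularisation shrinking weak coefficients (reported best for water saturation).
    "lasso": SelectFromModel(Lasso(alpha=0.01)),
}

for name, selector in selectors.items():
    X_tr = selector.fit_transform(X_train, y_train)   # keep only selected features
    X_te = selector.transform(X_test)
    model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_train)
    pred = model.predict(X_te)
    mae = mean_absolute_error(y_test, pred)
    rmse = np.sqrt(mean_squared_error(y_test, pred))
    print(f"{name}: {X_tr.shape[1]} features kept, MAE={mae:.3f}, RMSE={rmse:.3f}")

In practice each target property (permeability, porosity, water saturation) would be modelled separately, so the selector judged best for that target would be chosen per property rather than one selector for all three.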

