Article

Reducing Octane Number Loss in Gasoline Refining Process by Using the Improved Sparrow Search Algorithm

Journal

SUSTAINABILITY
Volume 15, Issue 8, Pages: -

Publisher

MDPI
DOI: 10.3390/su15086571

Keywords

research octane number (RON) loss; sparrow search algorithm (SSA); extreme gradient boosting (XGboost); optimization model; fitness function

Abstract
Gasoline is the primary fuel used in small cars, and the exhaust emissions from gasoline combustion have a significant impact on the atmosphere. Efforts to clean up gasoline have therefore focused primarily on reducing its olefin and sulfur content while preserving as much of the octane number as possible. With the aim of minimizing octane loss, this study investigated various machine learning algorithms to identify the best fitness function, developed an improved octane loss optimization model, and identified the best octane loss calculation algorithm. Firstly, in the data pre-processing stage, the operational and non-operational variables were separated, and the variables were then filtered using the random forest method and the grey correlation degree, respectively. Secondly, octane loss prediction models were built using four different machine learning techniques: back propagation (BP), radial basis function (RBF), ensemble learning represented by extreme gradient boosting (XGboost), and support vector regression (SVR). The prediction results show that the XGboost model is optimal. Finally, taking the minimum octane loss as the optimization objective and a sulfur content of less than 5 μg/g as the constraint, an octane loss optimization model was established. The XGboost prediction model trained above was substituted as the fitness function into the genetic algorithm (GA), the sparrow search algorithm (SSA), particle swarm optimization (PSO), and the grey wolf optimization (GWO) algorithm, and the optimization results of the four algorithms were compared. The findings demonstrate that, among nine randomly selected sample points, SSA outperforms the other three methods in optimization stability and slightly outperforms them in optimization accuracy. For 252 of the 326 samples (about 77%), the RON loss reduction reached 30%, which is better than the optimization results published in the previous literature.
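The variable-filtering step pairs random-forest importance with the grey correlation (grey relational) degree. As a minimal sketch of the latter, assuming min-max normalization and the conventional resolution coefficient ρ = 0.5 (both are assumptions; the paper's exact pre-processing choices are not reproduced here), the grey relational grade of each candidate variable series against a reference series (e.g. the observed RON loss) can be computed as:

```python
def grey_relational_grades(reference, factors, rho=0.5):
    """Grey relational grade of each factor series against the reference.

    Each series is first min-max normalized to [0, 1]; the grade is the
    mean grey relational coefficient over all sample points.
    """
    def norm(seq):
        lo, hi = min(seq), max(seq)
        return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in seq]

    ref = norm(reference)
    # Absolute point-wise differences between the reference and each factor
    all_deltas = [[abs(r - v) for r, v in zip(ref, norm(f))] for f in factors]
    dmin = min(min(d) for d in all_deltas)   # global minimum difference
    dmax = max(max(d) for d in all_deltas)   # global maximum difference
    grades = []
    for deltas in all_deltas:
        coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in deltas]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

grades = grey_relational_grades([1, 2, 3, 4, 5],
                                [[1, 2, 3, 4, 5], [5, 4, 3, 2, 1]])
# the series identical to the reference gets the maximum grade of 1.0
```

Variables whose grade tracks the reference closely survive the filter; a typical rule keeps those above a chosen threshold.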
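The optimization stage couples the trained predictor with a metaheuristic: the prediction model is the fitness function and the sulfur limit enters as a constraint. The paper's implementation is not published, so the following is a simplified sketch of a sparrow search algorithm in pure Python; `predict_ron_loss` and `predict_sulfur` are hypothetical surrogates standing in for the trained XGboost models, and the sulfur constraint (< 5 μg/g) is handled with a penalty term.

```python
import math
import random

random.seed(0)

DIM = 4                       # number of operational variables (illustrative)
BOUNDS = [(0.0, 1.0)] * DIM   # normalized variable ranges (assumed)

def predict_ron_loss(x):
    # Stand-in for the trained XGboost RON-loss predictor (hypothetical surrogate).
    return sum((xi - 0.3) ** 2 for xi in x)

def predict_sulfur(x):
    # Stand-in for a sulfur-content predictor in μg/g (hypothetical surrogate).
    return 2.0 + 10.0 * x[0] * x[1]

def fitness(x):
    # RON loss plus a large penalty when the sulfur constraint is violated.
    excess = max(0.0, predict_sulfur(x) - 5.0)
    return predict_ron_loss(x) + 1e3 * excess

def clip(x):
    return [min(max(xi, lo), hi) for xi, (lo, hi) in zip(x, BOUNDS)]

def ssa(pop_size=30, iters=100, pd_frac=0.2, sd_frac=0.1, st=0.8):
    pop = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(pop_size)]
    fit = [fitness(x) for x in pop]
    gbest, gfit = min(zip(pop, fit), key=lambda p: p[1])
    gbest = gbest[:]
    for _ in range(iters):
        order = sorted(range(pop_size), key=lambda i: fit[i])
        best, worst = pop[order[0]][:], pop[order[-1]][:]
        n_prod = max(1, int(pd_frac * pop_size))
        r2 = random.random()                      # alarm value
        for rank, i in enumerate(order):
            if rank < n_prod:                     # producers: wide exploration
                if r2 < st:
                    pop[i] = clip([xi * math.exp(-(rank + 1) / (random.random() * iters + 1e-9))
                                   for xi in pop[i]])
                else:
                    pop[i] = clip([xi + random.gauss(0, 1) for xi in pop[i]])
            elif rank > pop_size // 2:            # worst scroungers: fly elsewhere
                pop[i] = clip([random.gauss(0, 1) * math.exp((wj - xi) / (rank + 1) ** 2)
                               for xi, wj in zip(pop[i], worst)])
            else:                                 # scroungers: follow the best producer
                pop[i] = clip([bj + abs(xi - bj) * random.choice([-1.0, 1.0])
                               for xi, bj in zip(pop[i], best)])
            fit[i] = fitness(pop[i])
            if fit[i] < gfit:
                gbest, gfit = pop[i][:], fit[i]
        # scouts: a random fraction relocates relative to the current best
        for i in random.sample(range(pop_size), max(1, int(sd_frac * pop_size))):
            pop[i] = clip([bj + random.gauss(0, 1) * (xi - bj)
                           for xi, bj in zip(pop[i], best)])
            fit[i] = fitness(pop[i])
            if fit[i] < gfit:
                gbest, gfit = pop[i][:], fit[i]
    return gbest, gfit

x_opt, f_opt = ssa()
```

Tracking `gbest` across all evaluations returns the best point ever visited rather than the best in the final population; in practice the surrogates would wrap the serialized XGboost models' `predict` calls, and GA/PSO/GWO would be run against the same penalized fitness for comparison.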
