Article

Spatial-Temporal Self-Attention Transformer Networks for Battery State of Charge Estimation

Journal

ELECTRONICS
Volume 12, Issue 12

Publisher

MDPI
DOI: 10.3390/electronics12122598

Keywords

lithium-ion battery; SOC; deep learning; estimation; transformer; electric vehicle

Over the past ten years, breakthroughs in battery technology have dramatically propelled the evolution of electric vehicle (EV) technologies. For EV applications, accurately estimating the state-of-charge (SOC) is critical for ensuring safe operation and prolonging battery lifespan, particularly under complex loading scenarios. Despite progress in this area, modeling and forecasting the evolution of multiphysics and multiscale electrochemical systems under realistic conditions using first-principles and atomistic calculations remains challenging. This study proposes a solution by designing a specialized Transformer-based network architecture, called Bidirectional Encoder Representations from Transformers for Batteries (BERTtery), which uses only time-resolved battery data (i.e., current, voltage, and temperature) as input to estimate SOC. To enhance the Transformer model's generalization, it was trained and tested under a wide range of working conditions, including diverse aging states (ranging from 100% to 80% of the nominal capacity) and varying temperature windows (from 35 °C to −5 °C). To ensure the model's effectiveness, its performance was rigorously tested at the pack level, which allows cell-level predictions to be translated to real-life conditions with hundreds of cells in series. The best models achieve a root mean square error (RMSE) of less than 0.5 on the test set and approximately 0.1% average percentage error (APE), with maximum absolute errors (MAE) of 2% on the test dataset, accurately estimating SOC under dynamic operating and aging conditions with widely varying operational profiles. These results demonstrate the power of the self-attention Transformer-based model to predict the behavior of complex multiphysics and multiscale battery systems.
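The abstract reports three error metrics for SOC estimation: RMSE, average percentage error (APE), and maximum absolute error (MAE). A minimal sketch of how such metrics are conventionally computed is shown below; the function name and exact definitions are assumptions for illustration, as the paper's precise formulas are not given in this record.

```python
import numpy as np

def soc_error_metrics(soc_true, soc_pred):
    """Conventional SOC error metrics (hypothetical implementation;
    the paper's exact definitions may differ).

    soc_true, soc_pred: arrays of SOC values in percent (0-100).
    """
    err = np.asarray(soc_pred, dtype=float) - np.asarray(soc_true, dtype=float)
    # Root mean square error over the test sequence
    rmse = float(np.sqrt(np.mean(err ** 2)))
    # Average percentage error relative to the true SOC
    ape = float(np.mean(np.abs(err) / np.abs(soc_true)) * 100.0)
    # Maximum absolute error, as "MAE" is used in this abstract
    mae = float(np.max(np.abs(err)))
    return rmse, ape, mae
```

Note that this abstract uses "MAE" for *maximum* absolute error rather than the more common mean absolute error, so the worst-case 2% bound is a stricter claim than an averaged metric would be.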

