Article

Explainable machine learning for understanding and predicting geometry and defect types in Fe-Ni alloys fabricated by laser metal deposition additive manufacturing

Journal

Journal of Materials Research and Technology

Publisher

ELSEVIER
DOI: 10.1016/j.jmrt.2022.11.137

Keywords

Metal additive manufacturing; Porosity; Geometry; Explainable machine learning; Shapley additive explanations


Abstract

Recently, there has been development toward metal additive manufacturing (MAM) because of its benefits, such as the fabrication of complex geometries, waste minimization, freedom of design, and low-cost customization. Despite these advantages, the influence of the processing parameters on the properties of MAM products is neither well understood nor easily predictable. In this study, explainable machine learning (xML) models were applied to predict and understand the geometry and types of defects in MAM-processed Fe-Ni alloys. Gaussian process regression (GPR) was used to predict the as-printed height and porosity using data from Fe-Ni alloys produced via laser metal deposition (LMD) processing. Defect types (gas porosity, keyhole, and lack of fusion) were classified with a support vector machine (SVM) by comparing the measured porosities with those predicted by GPR. The Shapley additive explanation (SHAP) approach for xML was used to analyze feature importance for both the GPR and SVM models. This study provides insight into the use of xML models in MAM to link processing with results.

© 2022 The Author(s). Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
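As a rough illustration of the workflow the abstract describes (GPR for height/porosity regression, SVM for defect-type classification, SHAP for feature importance), the sketch below uses scikit-learn and the shap package. It is not the authors' implementation: the process parameters (laser power, scan speed, powder feed rate), the synthetic data, the toy defect-labeling rule, and all hyperparameters are assumptions made purely for demonstration.

```python
# Minimal sketch of the GPR -> SVM -> SHAP pipeline described in the abstract.
# All data, feature names, and settings are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel
from sklearn.svm import SVC
import shap

rng = np.random.default_rng(0)

# Hypothetical LMD process parameters: laser power (W), scan speed (mm/s),
# powder feed rate (g/min). Targets (height, porosity) are synthetic here.
X = rng.uniform([200.0, 2.0, 1.0], [1000.0, 20.0, 10.0], size=(120, 3))
height = 0.5 + 0.001 * X[:, 0] - 0.02 * X[:, 1] + rng.normal(0, 0.05, 120)
porosity = np.clip(0.02 * X[:, 1] - 1e-5 * X[:, 0] + rng.normal(0, 0.01, 120), 0, None)

# Step 1: Gaussian process regression for as-printed height and porosity.
kernel = ConstantKernel(1.0) * RBF(length_scale=[100.0, 5.0, 2.0])
gpr_height = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, height)
gpr_porosity = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, porosity)
porosity_pred, porosity_std = gpr_porosity.predict(X, return_std=True)

# Step 2: SVM classification of defect type (gas porosity / keyhole /
# lack of fusion). Labels come from a toy rule, only to make the demo run.
defect_type = np.where(X[:, 1] > 12, "lack_of_fusion",
                       np.where(X[:, 0] > 800, "keyhole", "gas_porosity"))
svm = SVC(kernel="rbf", probability=True).fit(
    np.column_stack([X, porosity_pred]), defect_type)

# Step 3: SHAP feature importance for the GPR porosity model.
background = X[:30]  # background sample for the kernel explainer
explainer = shap.KernelExplainer(gpr_porosity.predict, background)
shap_values = explainer.shap_values(X[:10])
print("Predicted height of first sample:", gpr_height.predict(X[:1])[0])
print("Mean |SHAP| per feature (power, speed, feed):",
      np.abs(shap_values).mean(axis=0))
```

In this sketch the GPR porosity prediction is appended to the raw process parameters as an extra SVM input, mirroring the abstract's idea of classifying defect types from measured and predicted porosities; the SHAP magnitudes then indicate which process parameter most influences the porosity prediction.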
