Abstract

This study presents novel predictive models using Graph Neural Networks (GNNs) for simulating thermal dynamics in Laser Powder Bed Fusion (L-PBF) processes. By developing and validating Single-Laser GNN (SL-GNN) and Multi-Laser GNN (ML-GNN) surrogates, the research introduces a scalable data-driven approach that learns the fundamental physics from small-scale Finite Element Analysis (FEA) simulations and applies it to larger domains. With the baseline SL-GNN model achieving a Mean Absolute Percentage Error (MAPE) of 3.77%, the results show that GNNs can learn effectively from high-resolution simulations and generalize well to larger geometries. The proposed models capture the complexity of the heat transfer process in L-PBF while significantly reducing computational costs. For example, a thermomechanical simulation for a 2 mm x 2 mm domain typically requires about 4 hours, whereas the SL-GNN model can predict thermal distributions almost instantly. Calibrating the models to larger domains further enhances predictive performance, with significant drops in MAPE for 3 mm x 3 mm and 4 mm x 4 mm domains, highlighting the scalability and efficiency of this approach. Additionally, the models show a decreasing trend in Root Mean Square Error (RMSE) when tuned to larger domains, suggesting the potential to become geometry-agnostic. The interaction of multiple lasers complicates heat transfer, necessitating larger model architectures and advanced feature engineering. Using hyperparameters obtained from Gaussian process-based Bayesian optimization, the best ML-GNN model demonstrates a 46.4% improvement in MAPE over the baseline ML-GNN model. In summary, this approach enables more efficient and flexible predictive modeling in L-PBF additive manufacturing.
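
To make the surrogate idea concrete, below is a minimal sketch of a GNN that regresses nodal temperatures on a mesh-derived graph, together with the MAPE metric quoted above. It assumes PyTorch Geometric; the node features, layer widths, and the ThermalSurrogateGNN class are illustrative assumptions, not the SL-GNN architecture reported in the paper.

```python
# Minimal sketch of a GNN thermal surrogate in the spirit of the SL-GNN idea:
# FEA mesh nodes become graph nodes, mesh connectivity becomes edges, and the
# network regresses a temperature per node. Feature choices and layer sizes
# below are illustrative assumptions, not the paper's reported architecture.
import torch
from torch import nn
from torch_geometric.nn import GCNConv
from torch_geometric.data import Data


class ThermalSurrogateGNN(nn.Module):
    def __init__(self, in_dim: int = 4, hidden_dim: int = 64):
        super().__init__()
        # Two message-passing layers followed by a nodewise regression head.
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, hidden_dim)
        self.head = nn.Linear(hidden_dim, 1)  # predicted temperature per node

    def forward(self, x, edge_index):
        h = torch.relu(self.conv1(x, edge_index))
        h = torch.relu(self.conv2(h, edge_index))
        return self.head(h).squeeze(-1)


def mape(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Mean Absolute Percentage Error, the headline metric in the abstract."""
    return 100.0 * torch.mean(torch.abs(pred - target) / torch.abs(target))


# Toy example: 5 mesh nodes with hypothetical features (e.g., coordinates,
# distance to the laser spot, time) and made-up FEA reference temperatures.
x = torch.rand(5, 4)
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3, 3, 4],
                           [1, 0, 2, 1, 3, 2, 4, 3]], dtype=torch.long)
y = 300.0 + 1500.0 * torch.rand(5)  # reference temperatures in kelvin
data = Data(x=x, edge_index=edge_index, y=y)

model = ThermalSurrogateGNN()
pred = model(data.x, data.edge_index)
print(f"MAPE on toy data: {mape(pred, data.y):.2f}%")
```

In the workflow described by the abstract, such a surrogate would be trained on small-scale, high-resolution FEA simulations and then applied, or lightly calibrated, on larger build domains, which is where the reported speedup over the roughly 4-hour thermomechanical simulations comes from.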
