Multiscale modeling is arguably the branch of computational mechanics that has suffered most from the limits imposed by computational cost, since computations must be performed on more than one scale. Consequently, using machine learning (ML) methods to bypass costly lower-scale or scale-bridging simulations has tremendous potential to accelerate multiscale modeling and to open new opportunities for simulating the complex mechanics of materials and structures. We will discuss several recent applications of ML to computational mechanics, all united by the ambition to accelerate simulations across scales. One example is ML-based surrogate material modeling to bypass expensive material point calculations, e.g., by using Recurrent Neural Operators (RNOs) as surrogates for computationally expensive constitutive laws. This is demonstrated for finite-strain crystal plasticity models of magnesium, which are typically too expensive to be used in a multiscale setting. Similarly, FE2 simulations can be considerably accelerated when the microscale boundary value problem is replaced by an efficient ML-based surrogate model [1], as we will show for multiscale simulations of beam-based metamaterials accelerated by a Kolmogorov-Arnold network (KAN). Another example is constitutive artificial neural networks for architected materials, which learn the complex constitutive response as a function of the underlying structural architecture [2] for use as surrogate models in macroscale boundary value problems. Overall, we demonstrate how integrating the principles of mechanics and thermodynamics with ML tools can yield (sufficiently) accurate and efficient AI-accelerated solutions for computational multiscale modeling. We close by highlighting the potential of such tools for design optimization, e.g., of architected materials.
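
To make the surrogate-modeling idea concrete, the following is a minimal sketch (not the specific models presented in the talk) of a recurrent constitutive surrogate in PyTorch: a hidden vector of learned internal variables is updated at every load step, so the predicted stress depends on the full strain history, which is the basic structure behind recurrent surrogates for path-dependent constitutive laws. All class names, layer sizes, and the explicit-update form are illustrative assumptions.

```python
import torch
import torch.nn as nn

class RecurrentConstitutiveSurrogate(nn.Module):
    """Sketch of a recurrent surrogate for a history-dependent constitutive law.

    The hidden vector h acts as a set of learned internal variables updated at
    each load step, so the stress output depends on the strain history.
    All architecture choices below are placeholders, not the models of the talk.
    """

    def __init__(self, n_strain=6, n_internal=8, n_hidden=64):
        super().__init__()
        self.n_internal = n_internal
        # Internal-variable update: h_{k+1} = h_k + dt * f(eps_k, h_k)
        self.internal_update = nn.Sequential(
            nn.Linear(n_strain + n_internal, n_hidden), nn.Tanh(),
            nn.Linear(n_hidden, n_internal),
        )
        # Stress map: sigma_k = g(eps_k, h_k)
        self.stress_map = nn.Sequential(
            nn.Linear(n_strain + n_internal, n_hidden), nn.Tanh(),
            nn.Linear(n_hidden, n_strain),
        )

    def forward(self, strain_path, dt=1.0):
        # strain_path: (batch, n_steps, n_strain) strain history in Voigt notation
        batch, n_steps, _ = strain_path.shape
        h = torch.zeros(batch, self.n_internal, device=strain_path.device)
        stresses = []
        for k in range(n_steps):
            eps = strain_path[:, k, :]
            stresses.append(self.stress_map(torch.cat([eps, h], dim=-1)))
            h = h + dt * self.internal_update(torch.cat([eps, h], dim=-1))
        return torch.stack(stresses, dim=1)  # (batch, n_steps, n_strain)
```

Once trained on strain-stress paths generated by the expensive lower-scale model, such a surrogate can be evaluated at every material point of a macroscale simulation in place of the original constitutive update.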