
Abstract

Differentiable numerical simulations of physical systems have gained increasing attention in recent years with the development of automatic differentiation tools. This paper presents JAX-SSO, a differentiable finite element analysis solver built with JAX, Google's high-performance computing library, to assist efficient structural design in the built environment. Combining the adjoint method with automatic differentiation, JAX-SSO efficiently and automatically evaluates gradients of physical quantities, enabling accurate sensitivity calculation in structural optimization problems. Written in Python with JAX, JAX-SSO sits naturally within the machine learning ecosystem, so it can be seamlessly integrated with neural networks to train machine learning models that incorporate physics. Moreover, JAX-SSO supports GPU acceleration to further speed up finite element analysis. Several examples showcase the capabilities and efficiency of JAX-SSO: i) shape optimization of grid-shells and continuous shells; ii) size (thickness) optimization of continuous shells; iii) simultaneous shape and topology optimization of continuous shells; and iv) training of physics-informed neural networks for structural optimization. We believe that JAX-SSO can facilitate research on differentiable physics and machine learning to further address problems in structural and architectural design.
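The gradient-through-solver idea described in the abstract can be illustrated with a few lines of plain JAX. The sketch below is not the JAX-SSO API: the toy stiffness assembly, load vector, and `compliance` function are hypothetical stand-ins, but they show how automatic differentiation propagates a design sensitivity through a linear finite element solve, which is the kind of gradient an adjoint-based structural optimizer needs.

```python
import jax
import jax.numpy as jnp

# Hypothetical toy model (not JAX-SSO): a 2-DOF "structure" whose
# stiffness scales linearly with a thickness-like design variable t.
def stiffness(t):
    k0 = jnp.array([[ 2.0, -1.0],
                    [-1.0,  2.0]])
    return t * k0

f = jnp.array([1.0, 0.5])  # nodal load vector

def compliance(t):
    # Solve K(t) u = f, then return the compliance c = f^T u.
    u = jnp.linalg.solve(stiffness(t), f)
    return f @ u

t0 = 0.1
dc_dt = jax.grad(compliance)(t0)  # sensitivity dc/dt via automatic differentiation
print(compliance(t0), dc_dt)
```

Because `jax.grad` differentiates through the linear solve, the same pattern extends to shape, size, and topology variables in a full FE model, and the objective can just as easily be the output of a neural network wrapped around the solver.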
