ElastoGen: 4D Generative Elastodynamics

University of Utah1, Zhejiang University2, University of California, San Diego3, University of California, Los Angeles4
*: Equal contribution.

ElastoGen is a knowledge-driven model that generates physically accurate and coherent 4D elastodynamics.

Abstract

We present ElastoGen, a knowledge-driven AI model for physically accurate 4D elastodynamics generation. Unlike deep models that learn from video- or image-based observations, ElastoGen leverages the principles of physics and learns from established mathematical and optimization procedures. The core idea of ElastoGen is to convert the differential equation corresponding to nonlinear force equilibrium into a series of iterative, local, convolution-like operations, which naturally fit deep architectures. We carefully build our network modules following this overarching design philosophy. ElastoGen is much more lightweight, in terms of both training requirements and network scale, than deep generative models. Because it aligns with actual physical procedures, ElastoGen efficiently generates accurate dynamics for a wide range of hyperelastic materials and can be easily integrated with upstream and downstream deep modules to enable end-to-end 4D generation.
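To make the core idea concrete, the sketch below shows how a (linearized) equilibrium system can be relaxed through purely local, convolution-like updates. This is an illustrative toy, not ElastoGen's actual operator: the matrix here is a discrete 3D Laplacian, and each Jacobi iteration is one stencil pass, i.e., a 3D convolution.

```python
import numpy as np
from scipy.ndimage import convolve

# Illustrative only: relax a discrete equilibrium system K u = f,
# where K is the 3D 7-point Laplacian, by Jacobi iteration.
# Each iteration is a local stencil application, i.e., a convolution.
def jacobi_relax(f, iters=500):
    stencil = np.zeros((3, 3, 3))
    for off in [(0, 1, 1), (2, 1, 1), (1, 0, 1),
                (1, 2, 1), (1, 1, 0), (1, 1, 2)]:
        stencil[off] = 1.0          # the 6 face neighbors
    u = np.zeros_like(f)
    for _ in range(iters):
        # Jacobi update: u <- (f + neighbor sum) / diagonal
        u = (f + convolve(u, stencil, mode='constant')) / 6.0
    return u
```

Because every update only touches a voxel's immediate neighbors, the whole solve decomposes into repeated convolutions, which is what makes this family of procedures a natural fit for deep architectures.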

Pipeline & Network Architecture



ElastoGen rasterizes an input 3D model (with boundary conditions) and generates the parameters that fill our NeuralMTL module. Conceptually, NeuralMTL predicts the locally concentrated strain of the object, which is then relaxed by a nested RNN loop.
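The rasterization step can be sketched as scattering points sampled from the model into a voxel occupancy grid, with a mask marking pinned (boundary-condition) voxels. The function below is a minimal stand-in for this stage; its name and the pinning scheme are illustrative, not ElastoGen's actual rasterizer.

```python
import numpy as np

# Illustrative sketch: rasterize sampled surface/volume points into a
# voxel occupancy grid plus a boolean mask of pinned voxels.
def rasterize(points, pinned, res=16):
    lo, hi = points.min(0), points.max(0)
    # map each point into [0, res-1] integer voxel coordinates
    idx = ((points - lo) / (hi - lo + 1e-9) * (res - 1)).astype(int)
    occ = np.zeros((res, res, res), dtype=bool)
    pin = np.zeros_like(occ)
    occ[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    pin[idx[pinned, 0], idx[pinned, 1], idx[pinned, 2]] = True
    return occ, pin
```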


The RNN predicts the future trajectory of the object. There are two sub-RNN modules. RNN-1 repeatedly relaxes the local stress via 3D convolutions. The relaxed strains are converted to positional signals, and RNN-2 merges the local deformations into a displacement field over the object. ElastoGen automatically checks the accuracy of both RNN loops' predictions and outputs the final prediction of the next position once the prediction error falls below the prescribed threshold.
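The two nested loops above follow the pattern of a local-global (projective-dynamics-style) solver. The sketch below illustrates that pattern on a 1D spring chain; the chain, the update rules, and the role labels are hypothetical stand-ins, not ElastoGen's actual modules.

```python
import numpy as np

def relax_chain(x, rest_len=1.0, tol=1e-8, max_iters=100):
    """Illustrative local-global loop on a 1D spring chain.

    "RNN-1" role: project each edge onto its rest length (local relax).
    "RNN-2" role: merge per-edge targets into positions (global merge).
    Iterate until the positional change falls below tol.
    """
    x = x.astype(float).copy()
    for _ in range(max_iters):
        d = np.diff(x)
        proj = np.sign(d) * rest_len  # per-edge stress-free target
        x_new = x.copy()
        # interior points average the two estimates they receive;
        # endpoints stay pinned (a simple boundary condition)
        x_new[1:-1] = 0.5 * ((x[:-2] + proj[:-1]) + (x[2:] - proj[1:]))
        if np.abs(x_new - x).max() < tol:  # convergence check
            return x_new
        x = x_new
    return x
```

The convergence test plays the role of the automatic accuracy check described above: iteration stops once successive predictions agree to within the prescribed tolerance.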

Experiments


Feasibility across Various Shapes: To demonstrate the feasibility of our method across various shapes, we conduct experiments on multiple models from ShapeNet with different force and pin settings.

Versatility across Geometric Representations: Our method can be applied to any geometric representation. For instance, when using implicit Neural Radiance Fields (NeRF) to describe 3D models, we employ the technique from PIE-NeRF. We first voxelize the NeRF based on its density field, then generate dynamics using ElastoGen, and finally obtain a dynamic NeRF through linear ray warping.
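Assuming a queryable density field, the voxelization step can be sketched as sampling the field on a regular grid and thresholding occupancy. `density_fn` and the threshold below are illustrative stand-ins for a trained NeRF, not the exact PIE-NeRF procedure.

```python
import numpy as np

# Illustrative sketch: voxelize a density field by sampling it on a
# regular grid over [-1, 1]^3 and thresholding into occupancy.
def voxelize_density(density_fn, res=32, threshold=0.5):
    lin = np.linspace(-1.0, 1.0, res)
    xs, ys, zs = np.meshgrid(lin, lin, lin, indexing='ij')
    pts = np.stack([xs, ys, zs], axis=-1).reshape(-1, 3)
    occ = density_fn(pts) > threshold  # boolean occupancy per sample
    return occ.reshape(res, res, res)
```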

Capability on Complex Meshes: Our method is also applicable to complex explicit meshes. We test our approach on meshes with intricate geometries, achieving similarly impressive results. With adaptive resolution for voxelization, ElastoGen produces visually pleasing and physically accurate dynamics while preserving the dynamic details of the fine structures.


BibTeX

@misc{feng2024elastogen,
    title={ElastoGen: 4D Generative Elastodynamics}, 
    author={Yutao Feng and Yintong Shang and Xiang Feng and Lei Lan and Shandian Zhe and Tianjia Shao and Hongzhi Wu and Kun Zhou and Hao Su and Chenfanfu Jiang and Yin Yang},
    year={2024},
    eprint={2405.15056},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}