Learning to Simulate Complex Physics
with Graph Networks
01 Introduction
02 Related Work
03 GNS Model
04 Experimental Methods
05 Results
06 Conclusion
01
Introduction
Idea
Traditional simulators
● High-quality simulators require substantial computational resources.
● Even the best are often inaccurate due to insufficient knowledge of, or difficulty in approximating, the underlying physics and parameters.
Machine learning simulators
● Can train simulators directly from observed data.
● The large state spaces and complex dynamics have been difficult for standard end-to-end learning approaches to overcome.
Graph Network-based Simulators (GNS)
The framework imposes strong inductive biases, where
rich physical states are represented by graphs of
interacting particles, and complex dynamics are
approximated by learned message-passing among nodes.
02
Related Work
Traditional Ways
Graph Networks
● A type of graph neural network.
● A GN maps an input graph to an output graph with the same structure but potentially different node, edge, and graph-level attributes, and can be trained to learn a form of message-passing, where latent information is propagated between nodes via the edges.
● GNs and their variants, e.g., "interaction networks", can learn to simulate rigid body, mass-spring, n-body, and robotic control systems, as well as non-physical systems.
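A single learned message-passing step can be sketched as follows. This is an illustrative NumPy sketch, not the paper's implementation: the two-layer MLPs, sum aggregation, and latent size are hypothetical stand-ins.

```python
import numpy as np

def mlp(x, w1, b1, w2, b2):
    # Two-layer MLP with ReLU; used as both the edge and node update function.
    return np.maximum(x @ w1 + b1, 0.0) @ w2 + b2

def make_params(d, rng):
    # Random (untrained) weights for illustration; hidden size equals latent size d.
    def layer(n_in, n_out):
        return (rng.standard_normal((n_in, d)) * 0.1, np.zeros(d),
                rng.standard_normal((d, n_out)) * 0.1, np.zeros(n_out))
    return {"edge": layer(3 * d, d), "node": layer(2 * d, d)}

def gn_step(nodes, edges, senders, receivers, params):
    """One message-passing step: update each edge latent from its two endpoint
    nodes, sum incoming messages per node, then update each node latent."""
    # Edge update: message depends on the edge and both endpoint nodes.
    edge_in = np.concatenate([edges, nodes[senders], nodes[receivers]], axis=1)
    new_edges = mlp(edge_in, *params["edge"])
    # Aggregate messages arriving at each receiver node.
    agg = np.zeros((nodes.shape[0], new_edges.shape[1]))
    np.add.at(agg, receivers, new_edges)
    # Node update from the current latent plus the aggregated messages.
    node_in = np.concatenate([nodes, agg], axis=1)
    new_nodes = mlp(node_in, *params["node"])
    return new_nodes, new_edges

rng = np.random.default_rng(0)
d = 8
nodes = rng.standard_normal((5, d))       # 5 particles, latent size 8
edges = rng.standard_normal((4, d))       # 4 directed edges
senders = np.array([0, 1, 2, 3])
receivers = np.array([1, 2, 3, 4])
params = make_params(d, rng)
new_nodes, new_edges = gn_step(nodes, edges, senders, receivers, params)
```

The output graph has the same structure as the input, only the node and edge attributes change, which is what lets a stack of such steps be applied repeatedly.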
03
GNS Model
Framework
Model Structure
Graph Networks
04
Experimental
Methods
GNS Implementation Details
We implemented the components of the GNS framework using standard deep learning building blocks, and used standard nearest neighbor algorithms to construct the graph.
Input
x_i^{t_k} = [p_i^{t_k}, ṗ_i^{t_k−C+1}, . . . , ṗ_i^{t_k}, f_i], with C = 5, where
p_i = position,
ṗ_i = previous velocities,
f_i = features that capture static material properties.
ENCODER
Constructs the graph structure G^0; MLPs encode the node features and edge features into the latent vectors, v_i and e_{i,j}, of size 128.
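The "standard nearest neighbor" graph construction can be sketched as a radius query: connect every pair of particles closer than a connectivity radius. A pure-NumPy illustration (the radius value below is a hypothetical choice, not the paper's tuned setting):

```python
import numpy as np

def build_graph(positions, radius):
    """Return sender/receiver index arrays for all directed edges between
    distinct particles whose distance is below the connectivity radius."""
    # Pairwise displacement and distance matrices, shape (N, N).
    diff = positions[:, None, :] - positions[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    # Keep pairs within the radius, excluding self-edges (distance 0).
    senders, receivers = np.nonzero((dist < radius) & (dist > 0.0))
    return senders, receivers

positions = np.array([[0.0, 0.0], [0.05, 0.0], [1.0, 0.0]])
senders, receivers = build_graph(positions, radius=0.1)
# Only particles 0 and 1 are within the radius, giving two directed edges.
```

The O(N²) distance matrix is fine for a sketch; a KD-tree or cell list would be used at the particle counts the paper targets.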
GNS Implementation Details
PROCESSOR
Uses a stack of M GNs with identical structure, with MLPs as the internal edge and node update functions.
DECODER
The DECODER's learned function, δv, is an MLP. After the DECODER, the future position and velocity are updated using an Euler integrator, so y_i corresponds to accelerations, p̈_i.
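The decoder-to-state update can be sketched as a semi-implicit Euler step, assuming the decoded output is an acceleration (the explicit dt parameter here is illustrative; in the paper the timestep is effectively folded into the learned quantities):

```python
def euler_update(position, velocity, acceleration, dt=1.0):
    """Semi-implicit Euler: update velocity from the predicted acceleration
    first, then advance the position with the new velocity."""
    new_velocity = velocity + dt * acceleration
    new_position = position + dt * new_velocity
    return new_position, new_velocity

# One step from rest-plus-unit-velocity under unit acceleration.
p, v = euler_update(position=0.0, velocity=1.0, acceleration=1.0, dt=1.0)
```

Because the integrator supplies the kinematics, the network only has to predict accelerations, which is a weaker, easier-to-learn target than next positions.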
Neural network parameters
05
Results
Simulating Complex Materials
● GNS can operate at resolutions substantially greater than demonstrated in previous methods, high enough for practical prediction tasks and high-quality 3D renderings.
● Although our models were trained to make one-step
predictions, the long-term trajectories remain plausible
even over thousands of rollout timesteps.
[Figure: rollout MSE comparison, SPH vs. GNS]
Multiple Interacting Materials
Train a single architecture with a single set of parameters to simulate all of our different materials,
interacting with each other in a single system.
Generalization
The GNS generalizes well even beyond its training distributions, which suggests it learns a more
general-purpose understanding of the materials and physical processes experienced during training.
Trained: 1×1 domain, 2.5k particles, 600 steps.
Inference: 8×4 domain.
Key Architectural Choices
While our GNS model was generally robust to architectural and hyperparameter settings, we also identified several factors with a more substantial impact:
1. the number of message-passing steps,
2. shared vs. unshared PROCESSOR GN parameters,
3. the connectivity radius,
4. the scale of noise added to the inputs during training,
5. relative vs. absolute ENCODER.
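Item 4 (input noise) can be sketched as corrupting the input velocity history with accumulating Gaussian noise, so that later steps drift more, mimicking the error accumulation the model sees at rollout time. The noise scale below is a hypothetical value, not the paper's tuned setting:

```python
import numpy as np

def add_random_walk_noise(velocities, noise_std=3e-4, rng=None):
    """velocities: array of shape (steps, particles, dim), the input history.
    Per-step Gaussian noise is accumulated along the time axis so the total
    std at the last step is approximately noise_std."""
    rng = np.random.default_rng() if rng is None else rng
    steps = velocities.shape[0]
    per_step = rng.standard_normal(velocities.shape) * (noise_std / steps ** 0.5)
    return velocities + np.cumsum(per_step, axis=0)

history = np.ones((6, 3, 2))  # 6 input steps, 3 particles, 2D
noisy = add_random_walk_noise(history, rng=np.random.default_rng(1))
```

Training on slightly corrupted inputs teaches the one-step model to correct its own accumulated errors, which is what keeps long rollouts stable.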
Comparisons to Previous Models
● CConv: CConv performs well for domains like water, which it was built for, but struggles with some of our more complex materials. Similarly, in a CConv rollout of the BOXBATH domain, the rigid box loses its shape.
● DPI: While DPI uses hard-coded constraints to keep the box shape consistent, our model achieves this without any special treatment of the solid particles.
06
Conclusion
Conclusion
● Simpler
● More accurate
● Better generalization