
Numerical Gaussian Process for Solving Differential Equations

Hao Li

University of California
hao li@ucsb.com

December 10, 2020


Overview

1 Introduction
Motivating Example

2 Gaussian Process

3 Numerical Gaussian Process


Burgers Equation
Numerical Gaussian Process

Numerical Gaussian processes are defined as Gaussian processes
with covariance functions resulting from the temporal
discretization of time-dependent partial differential equations.
Motivating Example

Road traffic density $\rho(t, x)$; velocity $u(t, x)$


Motivating Example

Consider the one-dimensional case:

$$\frac{\partial \rho}{\partial t} + \frac{\partial (\rho u)}{\partial x} = 0$$
Motivating Example

Let $u_{\max}$ be the maximum speed and $\rho_{\max}$ the maximum
density of cars on the road. The velocity-density relation is

$$u(\rho) = u_{\max}\left(1 - \rho/\rho_{\max}\right).$$

If $u$ is measured in units of $u_{\max}$ and $\rho$ in units of
$\rho_{\max}$, we obtain the following equation:

$$\frac{\partial \rho}{\partial t} + \frac{\partial \big(\rho(1 - \rho)\big)}{\partial x} = 0$$
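For completeness, here is a sketch of the rescaling step that the slides state only as a result (rescaling $\rho$ by $\rho_{\max}$ and time by $1/u_{\max}$ is my choice of nondimensionalization):

```latex
% Substitute u(\rho) = u_{\max}(1 - \rho/\rho_{\max}) into the continuity equation,
% then set \tilde{\rho} = \rho/\rho_{\max} and \tilde{t} = u_{\max} t:
\frac{\partial \rho}{\partial t}
  + \frac{\partial}{\partial x}\Big[u_{\max}\,\rho\big(1 - \rho/\rho_{\max}\big)\Big] = 0
\;\Longrightarrow\;
\rho_{\max} u_{\max}\left[
  \frac{\partial \tilde{\rho}}{\partial \tilde{t}}
  + \frac{\partial\big(\tilde{\rho}(1 - \tilde{\rho})\big)}{\partial x}\right] = 0 .
% Dividing by \rho_{\max} u_{\max} and dropping the tildes gives the equation above.
```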
Motivating Example

$$\frac{\partial \rho}{\partial t} + \frac{\partial \big(\rho(1 - \rho)\big)}{\partial x} = 0$$

The initial condition is $\rho(0, x) = \rho_0(x)$. Sometimes
$\rho_0(x)$ is a black-box function, which means it can only be
observed through noisy measurements.
Gaussian Process

$$f(x) \sim \mathcal{GP}\big(0, K(x, x'; \theta)\big)$$

Then we have

$$\begin{bmatrix} f(x) \\ f(x') \end{bmatrix}
\sim \mathcal{GP}\left(0,\;
\begin{bmatrix}
K(x, x; \theta) & K(x, x'; \theta) \\
K(x', x; \theta) & K(x', x'; \theta)
\end{bmatrix}\right).$$
Training and Prediction of Gaussian Process

Given a data set $(X, Y)$ and a kernel function with hyperparameters $\theta$,
we can train $\theta$ by minimizing the negative log marginal likelihood.

Once the hyperparameters are trained, we can use the posterior distribution
to make predictions at a new test point $x_*$.
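As an illustration, here is a minimal sketch (my own, not from the slides) of this train-then-predict workflow for a one-dimensional GP with a squared-exponential kernel; the synthetic data, the log-parameterization, and the use of scipy's L-BFGS-B optimizer are all assumptions made for the example:

```python
# Minimal GP regression sketch: train hyperparameters by minimizing the negative
# log marginal likelihood, then predict at new test points x_star.
import numpy as np
from scipy.optimize import minimize

def kernel(X1, X2, log_sigma0_sq, log_sigma_sq):
    # Squared-exponential kernel K(x, x'; theta) with theta = (sigma0^2, sigma^2).
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(log_sigma0_sq) * np.exp(-0.5 * d2 / np.exp(log_sigma_sq))

def neg_log_marginal_likelihood(params, X, Y):
    # NLML of Y under the GP prior with observation noise variance exp(log_noise).
    log_sigma0_sq, log_sigma_sq, log_noise = params
    K = kernel(X, X, log_sigma0_sq, log_sigma_sq) + (np.exp(log_noise) + 1e-8) * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))
    return 0.5 * Y @ alpha + np.sum(np.log(np.diag(L))) + 0.5 * len(X) * np.log(2 * np.pi)

# Hypothetical noisy observations of a black-box initial condition.
X = np.linspace(-1.0, 1.0, 20)
Y = -np.sin(np.pi * X) + 0.05 * np.random.randn(20)

# Train theta (and the noise level) by minimizing the NLML.
res = minimize(neg_log_marginal_likelihood, x0=np.zeros(3), args=(X, Y), method="L-BFGS-B")
log_s0, log_s, log_n = res.x

# Posterior prediction at new test points x_star.
X_star = np.linspace(-1.0, 1.0, 100)
K = kernel(X, X, log_s0, log_s) + (np.exp(log_n) + 1e-8) * np.eye(len(X))
K_star = kernel(X_star, X, log_s0, log_s)
mean = K_star @ np.linalg.solve(K, Y)                                                 # posterior mean
cov = kernel(X_star, X_star, log_s0, log_s) - K_star @ np.linalg.solve(K, K_star.T)   # posterior covariance
```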
Numerical Gaussian Process
Example - Burgers Equation

Burgers’ equation is a fundamental nonlinear partial differential
equation arising in various areas of applied mathematics, including
fluid mechanics, nonlinear acoustics, gas dynamics, and traffic flow.

In one space dimension, Burgers’ equation reads

$$u_t + u u_x = \nu u_{xx}$$

with Dirichlet boundary conditions $u(t, -1) = u(t, 1) = 0$, where
$u(t, x)$ is the unknown solution and $\nu$ is the viscosity coefficient.
Problem Setup

Assume that the observed initial data $(X^0, U^0)$ come from a
black-box initial function, which means we can only obtain noisy
measurements of it.

Given these noisy measurements, we would like to solve Burgers’
equation while propagating the uncertainty associated with the
noisy initial data through time.
Solution:
Backward Euler

Applying the backward Euler method to Burgers’ equation, we have

$$u^n + \Delta t\, u^n \frac{d}{dx} u^n - \nu \Delta t \frac{d^2}{dx^2} u^n = u^{n-1}.$$

Approximating the nonlinear term by replacing the factor $u^n$ in front of
the derivative with the posterior mean $\mu^{n-1}$ of the previous time step,
so that the equation becomes linear in $u^n$:

$$u^n + \Delta t\, \mu^{n-1} \frac{d}{dx} u^n - \nu \Delta t \frac{d^2}{dx^2} u^n = u^{n-1}. \qquad (*)$$
Backward Euler

Make the prior assumption that $u^n(x) \sim \mathcal{GP}\big(0, k(x, x'; \theta)\big)$
is a Gaussian process with hyperparameters $\theta = (\sigma_0^2, \sigma^2)$. Then

$$\begin{bmatrix} u^n \\ u^{n-1} \end{bmatrix}
\sim \mathcal{GP}\left(0,\;
\begin{bmatrix}
k^{n,n}_{u,u} & k^{n,n-1}_{u,u} \\
k^{n-1,n}_{u,u} & k^{n-1,n-1}_{u,u}
\end{bmatrix}\right).$$

By $(*)$, we can derive $k^{n,n-1}_{u,u}$ and $k^{n-1,n-1}_{u,u}$ from
$k^{n,n}_{u,u}(x, x') = k(x, x'; \theta)$.
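Concretely, since $(*)$ applies the linear operator $\mathcal{L} = 1 + \Delta t\,\mu^{n-1}\frac{d}{dx} - \nu\Delta t\frac{d^2}{dx^2}$ to $u^n$, the remaining covariance blocks are $k^{n,n-1}_{u,u}(x, x') = \mathcal{L}_{x'} k(x, x')$ and $k^{n-1,n-1}_{u,u}(x, x') = \mathcal{L}_x \mathcal{L}_{x'} k(x, x')$. The following is a minimal symbolic sketch (my own, not from the slides) of this derivation, assuming a squared-exponential kernel and, for simplicity, treating $\mu^{n-1}$ as a constant rather than a function of $x$:

```python
# Symbolically derive the covariance blocks induced by the backward-Euler operator.
import sympy as sp

x, xp = sp.symbols('x xp')
sigma0_sq, sigma_sq, dt, nu, mu = sp.symbols('sigma0_sq sigma_sq dt nu mu', positive=True)

# Prior kernel k^{n,n}(x, x') = k(x, x'; theta), theta = (sigma0^2, sigma^2).
k = sigma0_sq * sp.exp(-(x - xp) ** 2 / (2 * sigma_sq))

def L(expr, var):
    # Backward-Euler operator L = 1 + dt*mu*d/dvar - nu*dt*d^2/dvar^2 applied to expr.
    return expr + dt * mu * sp.diff(expr, var) - nu * dt * sp.diff(expr, var, 2)

k_n_nm1 = L(k, xp)         # k^{n,n-1}(x, x') = L_{x'} k(x, x')
k_nm1_nm1 = L(k_n_nm1, x)  # k^{n-1,n-1}(x, x') = L_x L_{x'} k(x, x')

print(sp.simplify(k_n_nm1))
print(sp.simplify(k_nm1_nm1))
```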
Training

The hyperparameters $\theta$ and the noise parameters
$\sigma_n^2, \sigma_{n-1}^2$ can be trained by minimizing the
negative log marginal likelihood resulting from

$$\begin{bmatrix} u^n_b \\ u^{n-1} \end{bmatrix} \sim \mathcal{N}(0, \mathbf{K}).$$
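For reference, the quantity being minimized is the standard negative log marginal likelihood of this joint Gaussian (a textbook formula, not transcribed from the slides):

```latex
% NLML for the stacked data vector u = [u^n_b; u^{n-1}], where K is the joint
% covariance matrix (including the noise variances \sigma_n^2, \sigma_{n-1}^2 on
% its diagonal blocks) and N is the total number of training points.
\mathrm{NLML}(\theta, \sigma_n^2, \sigma_{n-1}^2)
  = \tfrac{1}{2}\,\mathbf{u}^\top \mathbf{K}^{-1}\mathbf{u}
  + \tfrac{1}{2}\log\det\mathbf{K}
  + \tfrac{N}{2}\log 2\pi .
```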
Predicting and Propagating the Uncertainty

We use the conditional distribution

$$u^n(x^n) \mid u^n_b, u^{n-1} \sim \mathcal{N}\big(\mu^n(x^n),\; \Sigma^{n,n}(x^n, x^n)\big)$$

to predict $u^n$ at a new test point $x^n$.
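For concreteness, $\mu^n$ and $\Sigma^{n,n}$ follow from standard Gaussian conditioning; the vector $\mathbf{q}(x^n)$ below is my shorthand for the cross-covariances between $u^n(x^n)$ and the training data, not notation from the slides:

```latex
% Standard Gaussian conditioning on the training data [u^n_b; u^{n-1}] with joint
% covariance matrix K.
\mu^{n}(x^{n}) = \mathbf{q}(x^{n})^\top \mathbf{K}^{-1}
  \begin{bmatrix} u^{n}_{b} \\ u^{n-1} \end{bmatrix},
\qquad
\Sigma^{n,n}(x^{n}, x^{n}) = k^{n,n}_{u,u}(x^{n}, x^{n})
  - \mathbf{q}(x^{n})^\top \mathbf{K}^{-1}\, \mathbf{q}(x^{n}).
```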


Error Analysis:
Conclusion

Numerical Gaussian processes are designed to deal with the following cases:

Block 1
All we observe is noisy data on black-box initial conditions.

Block 2
We are interested in quantifying the uncertainty associated with
noisy data in our solutions to partial differential equations.
The End
