
Article

Bayesian Causal Inference with Information Field Theory †
Andrija Kostić 1,2, Philipp Frank 1,2, Matteo Guardiani 1,2, Sebastian Hutschenreuter 4,
Maximilian Kurthen 5, Reimar Leike 1,2 and Torsten Enßlin 1,2,3
1 Max Planck Institute for Astrophysics, Karl-Schwarzschild-Straße 1, 85748 Garching bei München, Germany;
{akostic, matteani}@mpa-garching.mpg.de
2 Fakultät für Physik, Ludwig-Maximilians-Universität München, Geschwister-Scholl-Platz 1, 80539 München,
Germany; {XXX, matteo.guardiani}@physik.lmu.de
3 Excellence Cluster ORIGINS, Boltzmannstraße 2, 85748 Garching bei München, Germany
4 Radboud University, Houtlaan 4, 6525 XZ Nijmegen, The Netherlands
5 b.telligent, München, Germany
* Andrija Kostić and Matteo Guardiani.
† Submitted to International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and
Engineering, IHP, Paris, July 18-22, 2022.

Version April 17, 2022 submitted to Entropy

Abstract: The fundamental problem of causal inference is to discover causal relations between
variables used to describe observational data. We address this problem within the formalism of
Information Field Theory (IFT). Specifically, we focus on the problems of bivariate causal discovery
(X → Y, Y → X) and of the inference of a confounder (X ← Z → Y) from an observational
dataset (X, Y). Bivariate causal discovery is especially interesting because the usual methods of
statistical-independence testing are not useful in this regime. Even more so, inferring the existence
of a confounder Z requires inferring both the correct causal direction and the distribution of the
latent variable Z. Here, we propose solutions to these problems that exploit Bayesian hierarchical
modeling and Additive Noise Models to provide non-parametric reconstructions of the observational
distributions. To identify the correct causal direction, we compare the performance of our newly
developed Bayesian inference algorithms for the different causal models (X → Y, Y → X, X ← Z → Y)
by calculating the evidence lower bound (ELBO). We develop a new method for ELBO estimation that
exploits the variational inference scheme used for parameter inference. Finally, we compare our
approach to state-of-the-art causal inference methods and show that it achieves comparable accuracy
on typical benchmark datasets.
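As a rough illustration of the model-comparison step summarized above, the following minimal Python sketch (not the authors' IFT/NIFTy implementation) scores the two additive-noise-model directions and keeps the better one. Here the log marginal likelihood of a Gaussian-process regression fit stands in for the ELBO used in the paper; the helper names (rbf_kernel, gp_log_marginal_likelihood, causal_direction) and all hyperparameter values are illustrative assumptions.

import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    # Squared-exponential covariance between 1-D sample arrays a and b.
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_log_marginal_likelihood(x, y, length_scale=1.0, variance=1.0, noise=0.1):
    # Log evidence of the additive noise model y = f(x) + n, with a GP prior on f
    # and Gaussian noise n; used here as a stand-in for the ELBO-based score.
    n = len(x)
    K = rbf_kernel(x, x, length_scale, variance) + noise**2 * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.sum(np.log(np.diag(L))) - 0.5 * n * np.log(2.0 * np.pi)

def causal_direction(x, y, **kw):
    # Score both additive noise models on standardised data and keep the better one.
    xs, ys = (x - x.mean()) / x.std(), (y - y.mean()) / y.std()
    score_xy = gp_log_marginal_likelihood(xs, ys, **kw)  # candidate model X -> Y
    score_yx = gp_log_marginal_likelihood(ys, xs, **kw)  # candidate model Y -> X
    return ("X -> Y" if score_xy > score_yx else "Y -> X", score_xy, score_yx)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    y = np.tanh(2.0 * x) + 0.1 * rng.normal(size=200)  # ground truth: X -> Y
    print(causal_direction(x, y))

In the paper the scores are ELBO values obtained from the variational approximation of each candidate model's posterior rather than closed-form GP evidences, but the decision rule of preferring the model with the larger evidence bound is the same.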

Keywords: Causal inference; Bayesian inference; Machine Learning; Artificial Intelligence; Bayesian
model selection; Information Field Theory.

© 2022 by the authors. Submitted to Entropy for possible open access publication under the terms and conditions
of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

