Keywords: causality, computational statistics, machine learning, independence testing.
My work focuses mainly on causal inference: we try to learn causal structures either from purely observational data or from a combination of observational and interventional data, developing both theory and methodology. Our work relates to areas such as high-dimensional statistics, computational statistics, and graphical models. It's an exciting research area with lots of open questions!
Most of the publications are also on Google Scholar.
Even if there is currently no open PhD call, please send me an email if you are thinking about doing a PhD or postdoc in causality.
We have written a book on causality that is currently being copy-edited and will appear open access with MIT Press. A final draft (6 October 2017) is available for download (see below).
Jonas Peters, Dominik Janzing, Bernhard Schölkopf: Elements of Causal Inference: Foundations and Learning Algorithms
Link to bibtex
Link to MIT Press
The pdf can be downloaded for free from the MIT Press website (look for "This is an open access title" on the left-hand side).
- Together with Miriam Akkermann, Ulrike Endesfelder, and other members/alumni of the Junge Akademie, we have created a calendar for 2019 (Thorbecke Verlag) with open scientific questions (in German).
- 'Die Vermessung unserer Persoenlichkeit' ('Measuring our personality'), published in Junge Akademie Magazin, Issue 24, September 2017
- 'Can chocolate make you smart?', published in Junge Akademie Magazin, Issue 23, February 2017
Jonas is an associate professor at the Department of Mathematical Sciences at
the University of Copenhagen,
and is a member of the Junge Akademie.
Previously, Jonas led the causality group at the
MPI for Intelligent Systems in Tübingen
and was a Marie Curie fellow at the
Seminar für Statistik, ETH Zurich.
He studied Mathematics at the University of Heidelberg and the University of Cambridge and did his PhD with
Jonas has done research internships with Léon Bottou at Microsoft Research (WA, USA),
Martin Wainwright at UC Berkeley (CA, USA)
and Peter Spirtes at CMU (Pittsburgh, USA).
Jonas is interested in inferring causal relationships from observational data and works both on theory and methodology.
His work relates to areas such as computational statistics, graphical models, independence testing, and high-dimensional statistics.
A full CV can be found here.
Danish Society for Theoretical Statistics,
Royal Statistical Society
Causality in 4 Steps
- Consider the following problem: we are given data on the activity of gene A (or B) and a phenotype. In both cases, the variables are correlated. What is the best prediction for the phenotype if we delete gene A (or B), so that its activity becomes zero?
- Causality matters: Intuitively, the optimal prediction should depend on the underlying causal structure:
If we do not accept any causal notion, however, we cannot distinguish between these two cases, and our best prediction must be: "I do not know."
Causal Model: To describe the above situation properly, we need a so-called causal model that (1) models the observational distribution, (2) models interventional distributions (e.g., the distribution that arises after the gene deletion), and (3) induces a graph. Functional causal models (also called structural equation models) are one class of such models, see the figure on the right. If you are interested in more details, see the script below, for example.
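To make the gene example concrete, here is a minimal sketch of a structural equation model in Python. The structure and numbers are illustrative assumptions (not from the figure): gene A causes the phenotype P, and a gene deletion is modeled as the intervention do(A = 0).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical SCM, assuming gene A causes phenotype P:
#   A := N_A,            N_A ~ N(0, 1)
#   P := 2 * A + N_P,    N_P ~ N(0, 1)
def sample(delete_gene=False):
    """Sample from the SCM; deleting the gene replaces the first
    structural equation by A := 0 (the intervention do(A = 0))."""
    a = np.zeros(n) if delete_gene else rng.normal(size=n)
    p = 2 * a + rng.normal(size=n)
    return a, p

# Observationally, A and P are strongly correlated (corr = 2/sqrt(5) ~ 0.894).
a_obs, p_obs = sample()
print(np.corrcoef(a_obs, p_obs)[0, 1])

# Under do(A = 0), P only carries its own noise: its variance drops from 5 to 1.
_, p_int = sample(delete_gene=True)
print(p_int.var())
```

If instead the phenotype caused the gene activity (the second graph in the figure), deleting the gene would not change the phenotype distribution at all, which is exactly why the two cases lead to different optimal predictions.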
Examples of questions that are studied in this field: How can one compute intervention distributions from the graph and the observational distribution efficiently? What if some of the variables are unobserved? What are nice graphical representations? Under which assumptions can we reconstruct the causal model from the observational distribution ("causal discovery")? What if we are also given data from some of the intervention distributions? Does causal knowledge help in more "classical" tasks in machine learning and statistics?
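The first question above, computing an intervention distribution from the graph and the observational distribution, can be illustrated with the back-door adjustment formula p(y | do(x)) = sum_z p(y | x, z) p(z), valid when Z blocks all back-door paths from X to Y. A small sketch with binary variables and a made-up distribution (all numbers are illustrative assumptions):

```python
# Illustrative binary model with confounder Z -> X, Z -> Y and X -> Y.
p_z = {0: 0.5, 1: 0.5}                   # P(Z = z)
p_x_given_z = {0: 0.2, 1: 0.8}           # P(X = 1 | Z = z)
p_y_given_xz = {(0, 0): 0.1, (0, 1): 0.5,  # P(Y = 1 | X = x, Z = z)
                (1, 0): 0.4, (1, 1): 0.8}

def p_joint(z, x, y):
    """Joint probability P(Z = z, X = x, Y = y) under the model above."""
    px = p_x_given_z[z] if x == 1 else 1 - p_x_given_z[z]
    py = p_y_given_xz[(x, z)] if y == 1 else 1 - p_y_given_xz[(x, z)]
    return p_z[z] * px * py

# Observational conditional P(Y = 1 | X = 1):
num = sum(p_joint(z, 1, 1) for z in (0, 1))
den = sum(p_joint(z, 1, y) for z in (0, 1) for y in (0, 1))
p_y_given_x1 = num / den

# Back-door adjustment: P(Y = 1 | do(X = 1)) = sum_z P(Y = 1 | X = 1, Z = z) P(Z = z)
p_y_do_x1 = sum(p_y_given_xz[(1, z)] * p_z[z] for z in (0, 1))

# The two differ because Z confounds X and Y: conditioning is not intervening.
print(p_y_given_x1, p_y_do_x1)
```

In this toy example the observational conditional overstates the interventional effect, because individuals with high X also tend to have the Z value that raises Y.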
I have written a script (lecture notes) on causality, on which I am more than happy to receive feedback. Please note that it is still missing some sections. It can be downloaded here.