David Pfau

I'm a research scientist at Google DeepMind. We're on a mission to solve artificial general intelligence. My own research interests span artificial intelligence, machine learning and computational neuroscience.

As a PhD student at the Center for Theoretical Neuroscience at Columbia, I worked with Liam Paninski on algorithms for analyzing and understanding high-dimensional data from neural recordings, and with Frank Wood on nonparametric Bayesian methods for predicting time series data. Prior to joining DeepMind I also consulted for Qadium, working on Data Microscopes, an open source library of fast, modular nonparametric Bayesian models.


Research

2016 · 2015 · 2014 · 2013 · 2012 · 2010 · 2009


Talks


Software

You can find my personal Github here. Notable projects include a collection of methods for learning state space models for neuroscience data, some of which have been integrated into the pop_spike_dyn package, a Matlab implementation of Learning Recurrent Neural Networks with Hessian-Free Optimization, and the Java implementation of the Probabilistic Deterministic Infinite Automata used in our paper. For those interested in probabilistic programming, I have also provided a PDIA implementation in WebChurch. I also contributed a C++ implementation of Beam Sampling for the Infinite Hidden Markov Model to the Data Microscopes project. Running roughly 40 times faster than existing Matlab code, it is likely the fastest beam sampler for the iHMM in the world.


Other Writing

Not everything makes it into a paper, but that doesn't mean it's not important. Short notes and other writings that don't have a home elsewhere are collected here.

  • A Generalized Bias-Variance Decomposition for Bregman Divergences
    [note]   [tl;dr]  

    A simple result that I haven't seen published elsewhere. Historically, other research on generalized bias-variance decompositions has focused on 0-1 loss, which is relevant to classification and boosting. In probabilistic modeling, error is measured through log probabilities rather than classification accuracy, often with distributions in the exponential family. Exponential family likelihoods and Bregman divergences are closely related, and it turns out to be straightforward to generalize the bias-variance decomposition for squared error to all Bregman divergences; a sketch of the statement is given below.
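
    As a rough sketch of the statement (the notation here is introduced for illustration rather than copied from the note): let $D_F$ be the Bregman divergence generated by a strictly convex, differentiable function $F$,

    \[
    D_F(p, q) = F(p) - F(q) - \langle \nabla F(q),\, p - q \rangle .
    \]

    If the target $y$ and the estimator $\hat{y}$ (fit on an independent training set) are independent, the expected divergence splits into noise, bias and variance terms,

    \[
    \mathbb{E}_{y,\hat{y}}\bigl[D_F(y, \hat{y})\bigr]
      = \underbrace{\mathbb{E}_{y}\bigl[D_F(y, \bar{y})\bigr]}_{\text{noise}}
      + \underbrace{D_F(\bar{y}, \tilde{y})}_{\text{bias}}
      + \underbrace{\mathbb{E}_{\hat{y}}\bigl[D_F(\tilde{y}, \hat{y})\bigr]}_{\text{variance}},
    \]

    where $\bar{y} = \mathbb{E}[y]$ and $\tilde{y} = (\nabla F)^{-1}\bigl(\mathbb{E}_{\hat{y}}[\nabla F(\hat{y})]\bigr)$ is the mean prediction taken in the dual (gradient) space. Taking $F(x) = \|x\|^2$ makes $\nabla F$ linear, so $\tilde{y} = \mathbb{E}[\hat{y}]$ and the familiar squared-error decomposition is recovered.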