David Wolpert’s lecture takes place on April 4 at 11am at the Hub. He will give a high-level overview of two of the several projects he is involved in.
1) Reducing the Error of Monte Carlo Algorithms by Learning Control Variates (joint work with Brendan Tracey)
Control variates are a powerful technique to reduce the error of Monte Carlo, but conventionally it is necessary to know a good function approximation a priori. Stacked Monte Carlo (StackMC) is a post-processing technique that overcomes this limitation by constructing a control variate from data samples. This talk shows extensions to StackMC for use with importance sampling, Latin-hypercube sampling and quasi-Monte Carlo sampling, as well as use with multiple fitting functions and discrete input spaces.
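The core idea above can be illustrated with a minimal sketch (this is not the StackMC algorithm itself; StackMC avoids the bias of fitting and evaluating on the same samples via cross-validation folds). Here a quadratic surrogate is fit to Monte Carlo samples of an integrand and used as a control variate, since a polynomial's integral is known in closed form. The integrand and sample size are illustrative choices, not from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    return np.exp(x)  # illustrative integrand on [0, 1]; true integral is e - 1

n = 1000
x = rng.random(n)
fx = f(x)

# Plain Monte Carlo estimate of the integral of f over [0, 1].
plain_mc = fx.mean()

# Fit a quadratic surrogate g(x) = c0 + c1*x + c2*x^2 to the samples
# (a simple stand-in for the fitting functions used by StackMC).
coeffs = np.polyfit(x, fx, deg=2)   # returns [c2, c1, c0], highest power first
g = np.polyval(coeffs, x)

# The surrogate's integral over [0, 1] is known exactly.
c2, c1, c0 = coeffs
g_integral = c2 / 3 + c1 / 2 + c0

# Control-variate estimate: Monte Carlo only the residual f - g,
# then add back the exact integral of g.
cv_mc = (fx - g).mean() + g_integral

true_value = np.e - 1
print(f"plain MC error:        {abs(plain_mc - true_value):.2e}")
print(f"control-variate error: {abs(cv_mc - true_value):.2e}")
```

Because the Monte Carlo variance now comes only from the residual f - g, a good surrogate can reduce the error by orders of magnitude at the same sample count, which is the effect the post-processing in StackMC exploits.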
2) Thermodynamics of Computation: Beyond Bit Erasure (joint work with Artemy Kolchinsky and Jeremy Owen)
5% of the current energy usage in the US goes to computation. Moreover, 50% of the lifetime budget of a modern high-performance computing center goes to paying the energy bill. All of that energy ultimately ends up as heat, with no physical work done at all. Indeed, Google has placed some of its data servers next to rivers in order to use the river water to cool the servers, i.e., to remove the heat they generate. As another example, one of the major challenges facing current efforts to build exascale computers is keeping them from melting. Clearly the thermodynamics of computation has major engineering consequences, which are becoming a huge challenge to humanity.
Fortunately, the last two decades have seen a revolution in nonequilibrium statistical physics. This has resulted in some extremely powerful tools for analyzing the thermodynamic properties and fundamental limits of far-from-equilibrium systems – like computers. As a result, we are witnessing the genesis of a new field of science and engineering: a modern thermodynamics of computation.
In this talk I discuss these developments. I also highlight some of the major extensions to theoretical computer science that are arising as we combine thermodynamic considerations with the traditional computer science concerns of space/time tradeoffs. In particular, I present some novel issues in the thermodynamics of computation that involve machine learning.