Running Probabilistic Programs Backward
Wednesday, December 18, 2013, 11:00 am-12:00 pm
Abstract

To be useful in Bayesian practice, a probabilistic language must support conditioning: imposing constraints in a way that preserves the relative probabilities of program outputs. Every language to date that supports probabilistic conditioning also places seemingly artificial restrictions on legal programs, such as disallowing unbounded loops and recursion, and restricting conditions to simple equality constraints such as x = 2.
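The notion of conditioning described above — imposing a constraint while preserving the relative probabilities of the outputs that satisfy it — can be illustrated with a standard rejection-sampling reading of conditioning. This is a minimal sketch for intuition only, not the semantics presented in the talk; the `model` and the sum-equals-7 constraint are hypothetical examples.

```python
import random

def model():
    # The prior: two independent fair dice.
    return (random.randint(1, 6), random.randint(1, 6))

def condition(pred, n=100_000):
    """Condition by rejection: keep only samples satisfying the constraint.
    Among the kept samples, the relative probabilities of consistent
    outcomes are preserved."""
    return [s for s in (model() for _ in range(n)) if pred(s)]

# Condition on the sum being 7; the six consistent pairs remain
# equally likely, as they were under the prior.
samples = condition(lambda s: s[0] + s[1] == 7)
```

Rejection sampling like this only works when the constraint has positive probability — one reason equality constraints on continuous values, such as x = 2, are hard to support in general.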

These common restrictions arise from reasoning about probability using only probability densities. To overcome them, we have developed a probabilistic language semantics using a measure-theoretic approach: defining program output distributions by the probabilities of preimages.
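To give a feel for what "defining distributions by the probabilities of preimages" involves computationally, here is a toy sketch of approximating a preimage by recursive interval splitting. This is an illustration of the preimage idea only, not the talk's semantics; the helper names (`refine`, `F`, `inside`, `outside`) are hypothetical.

```python
def refine(F, inside, outside, box, depth):
    """Approximate the preimage of a target output set under a function,
    given F: an interval extension mapping (lo, hi) to an interval
    enclosing the image, and inside/outside tests on output intervals.
    Returns (sure, maybe): boxes surely in / possibly in the preimage."""
    lo, hi = box
    img = F(lo, hi)
    if outside(img):
        return [], []                     # image misses the target entirely
    if inside(img):
        return [box], []                  # image lies wholly in the target
    if depth == 0:
        return [], [box]                  # undecided at this resolution
    mid = (lo + hi) / 2                   # split and recurse on both halves
    s1, m1 = refine(F, inside, outside, (lo, mid), depth - 1)
    s2, m2 = refine(F, inside, outside, (mid, hi), depth - 1)
    return s1 + s2, m1 + m2

# Example: f(x) = x*x on [0, 1]; target output set [0, 0.25].
# The exact preimage is [0, 0.5].
F = lambda lo, hi: (0.0 if lo * hi < 0 else min(lo * lo, hi * hi),
                    max(lo * lo, hi * hi))
inside = lambda I: I[1] <= 0.25
outside = lambda I: I[0] > 0.25
sure, maybe = refine(F, inside, outside, (0.0, 1.0), 12)
```

The probability of an output event is then the probability mass of its preimage under the input distribution, which is what makes the measure-theoretic approach insensitive to whether densities exist.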

This semantics' translation compiles probabilistic programs into any functional language capable of uncountably infinite operations. Unfortunately, no such language is actually realizable (as far as we know). We have therefore derived an approximating semantics, whose translation compiles programs into Turing-equivalent functional languages, and have proved that it is sound. We have implemented the approximating semantics almost directly in Typed Racket.

We demonstrate the implementation's expressive power on some typical Bayesian problems. We also demonstrate it on problems whose straightforward solutions cannot be expressed using densities, such as those involving bounded measuring devices and stochastic ray tracing.
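To see why a bounded measuring device defeats density-based conditioning, consider a sensor that saturates: the observed reading has a point mass at the saturation level, so the condition "reading equals the cap" has positive probability but no density. This is a hypothetical sketch (the `CAP` value and `measure` model are illustrative, not from the talk), showing that the event is nonetheless perfectly well-defined by sampling.

```python
import random

CAP = 1.0  # hypothetical sensor saturation level

def measure():
    x = random.gauss(0.0, 1.0)  # latent quantity, standard normal prior
    return x, min(x, CAP)       # sensor clips its reading at CAP

# "reading == CAP" carries a point mass: every x >= CAP produces it.
# A density ratio cannot express this condition, but its probability
# is simply P(X >= CAP), about 0.159 for a standard normal.
n = 200_000
posterior = [x for x, y in (measure() for _ in range(n)) if y == CAP]
frac = len(posterior) / n
```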

Bio

Neil Toronto is a doctoral student in Computer Science at Brigham Young University.

This talk is organized by David Van Horn.