Papers & code for selected research projects. See my complete publication list for more detail.
Hierarchical Dirichlet processes (HDPs) underlie Bayesian nonparametric mixture models, topic models, temporal models, and relational models. We develop a scalable family of variational inference algorithms that allows the number of clusters/topics/states/communities to be adapted online as data are observed.
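For reference, a compact sketch of the standard stick-breaking representation of the HDP on which such variational methods are built; the symbols are generic, and practical algorithms optimize a truncated version of this construction.

```latex
% Stick-breaking representation of the HDP: global weights beta are shared,
% and each group j has its own distribution pi_j over the same atoms theta_k.
\beta \mid \gamma \sim \mathrm{GEM}(\gamma), \qquad
\pi_j \mid \alpha, \beta \sim \mathrm{DP}(\alpha, \beta), \qquad
\theta_k \sim H, \qquad
z_{ji} \sim \pi_j, \qquad x_{ji} \sim F(\theta_{z_{ji}}).
```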
We use RGB-D images to learn contextual relationships between object categories and the 3D layout of indoor scenes. Our cloud of oriented gradient (COG) descriptor links the 2D appearance and 3D pose of object categories, accounting for perspective projection to produce state-of-the-art object detectors.
A fusion of max-product belief propagation and particle filters that reliably finds modes of continuous posterior distributions. A submodular optimization algorithm replaces the fragile stochastic resampling of standard particle methods. Applications include human pose estimation and protein structure prediction.
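As a rough illustration, the max-product message update that particle-based methods of this kind approximate, with the continuous maximization restricted to a finite particle set; the particle notation is illustrative, and the submodular particle-selection step is not shown.

```latex
% Particle-based max-product message from node t to neighbor s, with the
% continuous max restricted to a set of N particles at node t:
m_{t \to s}(x_s) \;=\; \max_{x_t \in \{x_t^{(1)}, \ldots, x_t^{(N)}\}}
  \psi_{s,t}(x_s, x_t)\, \psi_t(x_t)
  \prod_{u \in \Gamma(t) \setminus s} m_{u \to t}(x_t).
```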
The automated processing of multiple seismic signals to detect and localize seismic events is a central tool in both geophysics and nuclear treaty verification. Our Bayesian seismic monitoring system, NET-VISA, is learned from historical data provided by the UN Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). We reduce the number of missed events by 60%.
Layered models simultaneously segment scenes into regions of coherent structure and estimate dense motion (optical flow) fields. By explicitly modeling occlusion relationships and designing sophisticated CRF priors, we achieve state-of-the-art motion estimates and interpretable segmentations.
The distance dependent Chinese restaurant process (ddCRP) is a flexible nonparametric prior for data clustering. With a hierarchical generalization of the ddCRP and spatio-temporal distance measures, we apply MCMC learning to segment text, image, video, and 3D mesh data.
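For concreteness, the ddCRP prior in its standard form, where each data point links to another with probability governed by a decay function of their pairwise distance; the notation follows the usual presentation rather than any specific implementation.

```latex
% ddCRP prior over per-datum links c_i; clusters are the connected components
% of the resulting link graph, and f is a decay function of the distance d_{ij}.
p(c_i = j \mid D, \alpha) \propto
  \begin{cases}
    f(d_{ij}) & j \neq i, \\
    \alpha    & j = i .
  \end{cases}
```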
Loopy belief propagation (BP) is an often highly accurate approximate inference algorithm for probabilistic graphical models with cycles. We have contributed to the theory of loopy BP, written tutorial articles, and developed improved optimization algorithms for the associated Bethe variational objective.
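As a minimal illustration of the underlying algorithm, a sketch of parallel sum-product updates on a discrete pairwise MRF; the input names (node_pot, edge_pot, edges) and the flooding schedule are assumptions for this sketch, not a particular published implementation.

```python
# A minimal sketch of sum-product loopy BP on a discrete pairwise MRF.
# Assumed inputs: node_pot[i] is a length-K positive vector, edge_pot[(i, j)]
# is a K-by-K matrix (rows index x_i), and edges lists undirected pairs (i, j).
import numpy as np

def loopy_bp(node_pot, edge_pot, edges, n_iters=50):
    K = len(next(iter(node_pot.values())))
    # Messages stored for both directions of every edge, initialized uniform.
    msgs = {(i, j): np.ones(K) / K for (a, b) in edges for (i, j) in [(a, b), (b, a)]}
    neighbors = {}
    for (a, b) in edges:
        neighbors.setdefault(a, []).append(b)
        neighbors.setdefault(b, []).append(a)
    for _ in range(n_iters):
        new_msgs = {}
        for (i, j) in msgs:
            # Product of node potential and incoming messages, excluding target j.
            belief = node_pot[i].copy()
            for k in neighbors[i]:
                if k != j:
                    belief *= msgs[(k, i)]
            pot = edge_pot[(i, j)] if (i, j) in edge_pot else edge_pot[(j, i)].T
            m = pot.T @ belief                # sum over x_i of psi(x_i, x_j) * belief(x_i)
            new_msgs[(i, j)] = m / m.sum()    # normalize for numerical stability
        msgs = new_msgs
    # Approximate marginals: node potential times all incoming messages.
    marginals = {}
    for i in neighbors:
        b = node_pot[i].copy()
        for k in neighbors[i]:
            b *= msgs[(k, i)]
        marginals[i] = b / b.sum()
    return marginals
```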
Using infinite feature representations derived from the beta process, this Bayesian nonparametric model discovers a set of latent dynamical behaviors, and uses them to segment a library of observed time series. Applications include human activity understanding from video or motion capture data.
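A brief sketch of the beta-Bernoulli process construction that such feature models build on; the notation is generic, and hyperparameters and likelihoods are omitted.

```latex
% Beta process prior over a countable set of behaviors theta_k, with a
% Bernoulli process draw selecting which behaviors each time series i uses:
B = \sum_k \omega_k\, \delta_{\theta_k} \sim \mathrm{BP}(c, B_0), \qquad
X_i \mid B \sim \mathrm{BeP}(B)
\;\;\Longleftrightarrow\;\;
f_{ik} \mid \omega_k \sim \mathrm{Bernoulli}(\omega_k).
```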
The sticky hierarchical Dirichlet process HMM allows an unbounded number of latent states to be learned from unlabeled sequential data. By capturing the "sticky" temporal persistence of real dynamical states, we learn improved models of financial indices, human speech, and honeybee dances.
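The defining prior places extra self-transition mass on each state's transition distribution, shown here in the standard sticky HDP-HMM notation.

```latex
% Sticky HDP-HMM transition prior: each state j's transition distribution pi_j
% shares global weights beta, with extra mass kappa on the self-transition.
\beta \mid \gamma \sim \mathrm{GEM}(\gamma), \qquad
\pi_j \mid \alpha, \kappa, \beta \sim
  \mathrm{DP}\!\left(\alpha + \kappa,\; \frac{\alpha \beta + \kappa\, \delta_j}{\alpha + \kappa}\right).
```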
A stick-breaking representation allows Bayesian nonparametric learning of an unbounded number of topics from text data, or communities from network data. We learn correlations in topic/community membership, and better predict relationships by exploiting document/entity metadata.
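As a small illustration, a truncated stick-breaking draw of mixture weights; the truncation level K and the function name are illustrative only, not tied to the specific topic or community models above.

```python
# A minimal sketch of a truncated stick-breaking draw of mixture weights.
import numpy as np

def stick_breaking_weights(gamma, K, rng=None):
    # Draw K stick proportions v_k ~ Beta(1, gamma) and convert them to
    # weights pi_k = v_k * prod_{l < k} (1 - v_l).
    if rng is None:
        rng = np.random.default_rng()
    v = rng.beta(1.0, gamma, size=K)
    remaining = np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    return v * remaining
```

Under truncation the weights sum to slightly less than one, with the deficit shrinking as K grows.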
Hierarchical probabilistic models for objects, the parts composing them, and the visual scenes surrounding them. We capture geometric context via spatial transformations, and use hierarchical Dirichlet processes to learn new objects and parts from partially labeled images or stereo pairs.
Nonparametric belief propagation (NBP) generalizes discrete BP to graphical models with non-Gaussian continuous variables, using sample-based marginal approximations inspired by particle filters. Applications include kinematic tracking of visual motion and distributed localization in sensor networks.
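Schematically, each NBP message is represented by a kernel density estimate over weighted particles; the Gaussian-kernel form and bandwidth notation here are illustrative.

```latex
% NBP message from node t to neighbor s, approximated by N weighted particles
% smoothed with Gaussian kernels of bandwidth Lambda_t:
m_{t \to s}(x_s) \;\approx\; \sum_{i=1}^{N} w_t^{(i)}\,
  \mathcal{N}\!\left(x_s;\; \mu_t^{(i)},\, \Lambda_t\right).
```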
An inference algorithm for Gaussian graphical models that exploits embedded, tree-structured graphs. Generalizing loopy belief propagation, our embedded trees algorithm not only rapidly computes posterior means, but also correctly estimates posterior variances (uncertainties).
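A minimal numerical sketch of the embedded-trees mean iteration, assuming a dense information matrix J and a boolean mask marking a chosen spanning tree; a real implementation would solve each tree system with efficient tree-structured belief propagation rather than a dense solve.

```python
# Sketch of the embedded-trees iteration for posterior means of a Gaussian MRF
# in information form (J, h): split J = J_tree - K, where J_tree keeps the
# diagonal plus a spanning tree's edges, then repeatedly solve the tree system.
# The names J, h, tree_mask are illustrative assumptions for this sketch.
import numpy as np

def embedded_trees_mean(J, h, tree_mask, n_iters=100):
    # tree_mask: boolean matrix marking the diagonal and spanning-tree entries of J.
    J_tree = np.where(tree_mask, J, 0.0)
    K = J_tree - J                      # off-tree ("cut") edge terms
    x = np.zeros_like(h, dtype=float)
    for _ in range(n_iters):
        x = np.linalg.solve(J_tree, h + K @ x)   # tree solve per iteration
    return x
```

When the splitting converges, the fixed point is the exact posterior mean J^{-1} h; recovering correct posterior variances requires the additional machinery described in the papers.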