January 9th, Wednesday 14:15, Room 303, Jacobs Building

Title: Perturbation, Optimization and Statistics for Effective Machine Learning

Lecturer: Tamir Hazan

Lecturer homepage: http://ttic.uchicago.edu/~tamir/Tamir_Hazans_web_page.html

Affiliation: Toyota Technological Institute at Chicago


Predictions in modern statistical inference problems can increasingly be understood in terms of discrete structures, such as arrangements of objects in computer vision, phonemes in speech recognition, parses in natural language processing, or molecular structures in computational biology. For example, in image scene understanding one needs to jointly predict a discrete semantic label for every pixel, e.g., whether it describes a person, bicycle, bed, etc. A fully probabilistic treatment considers all possible alternative assignments, which requires estimating exponentially many structures together with their respective weights. To relax this exponential complexity, we describe two different approaches: dual decomposition (e.g., convex belief propagation) and random maximum a-posteriori (MAP) perturbations. The second approach leads us to a new approximate inference framework that is based on MAP statistics and therefore does not depend on pseudo-probabilities, in contrast to the current framework of dual decomposition. We demonstrate the effectiveness of our approaches on different computer vision tasks, outperforming state-of-the-art results in scene understanding, depth estimation, semantic segmentation and shape reconstruction.
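
The following is a minimal, illustrative sketch (not the lecturer's actual implementation) of the idea underlying random MAP perturbations: for a Gibbs distribution p(x) proportional to exp(theta(x)), perturbing each assignment's score with i.i.d. Gumbel noise and taking the argmax yields an exact sample. In the structured setting the exhaustive argmax below would be replaced by a MAP solver with low-dimensional perturbations; the toy label space and scores here are assumptions for illustration only.

import numpy as np

rng = np.random.default_rng(0)

theta = np.array([1.0, 0.2, -0.5, 2.0])      # scores theta(x) for 4 toy assignments
gibbs = np.exp(theta) / np.exp(theta).sum()   # target Gibbs distribution p(x)

def perturb_and_map(theta, rng):
    """Draw one sample by maximizing Gumbel-perturbed scores."""
    gumbel = rng.gumbel(size=theta.shape)     # i.i.d. Gumbel(0, 1) perturbations
    return np.argmax(theta + gumbel)          # MAP of the perturbed model

samples = np.array([perturb_and_map(theta, rng) for _ in range(20000)])
empirical = np.bincount(samples, minlength=theta.size) / samples.size

print("Gibbs distribution :", np.round(gibbs, 3))
print("Perturb-and-MAP    :", np.round(empirical, 3))

Running the sketch shows the empirical frequencies of the perturbed-MAP samples matching the Gibbs probabilities, which is the statistical property that the MAP-statistics framework builds on.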