Approximate Inference in Graphical Models using LP Relaxations
Graphical models such as Markov random fields have been successfully applied to a wide variety of fields, from computer vision and natural language processing to computational biology. Exact probabilistic inference is generally intractable in complex models with many dependencies among the variables.
In this talk, I will discuss recent work on using linear programming relaxations to perform approximate inference. By solving the LP relaxations in the dual, we obtain efficient message-passing algorithms that, when the relaxations are tight, can provably find the most likely (MAP) configuration.
Our algorithms succeed at finding the MAP configuration in protein side-chain placement, protein design, and stereo vision problems. More broadly, this talk will highlight emerging connections between machine learning, polyhedral combinatorics, and combinatorial optimization.
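As a minimal illustration of the approach, the sketch below solves the standard local-polytope LP relaxation for MAP on a toy pairwise model: two binary variables joined by a single edge. The potentials are hypothetical and chosen only for the example; since the graph is a tree, the relaxation is tight and the LP optimum is the true MAP value. (This solves the primal LP directly with an off-the-shelf solver, rather than the dual message-passing algorithms discussed in the talk.)

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy model (not from the talk): two binary variables, one edge.
theta1 = np.array([0.0, 1.0])        # unary potential for x1
theta2 = np.array([0.0, 0.5])        # unary potential for x2
theta12 = np.array([[1.0, 0.0],
                    [0.0, 1.0]])     # pairwise potential rewarding agreement

# LP variables (pseudo-marginals): mu1(0), mu1(1), mu2(0), mu2(1),
#                                  mu12(00), mu12(01), mu12(10), mu12(11)
c = -np.concatenate([theta1, theta2, theta12.ravel()])  # linprog minimizes

# Local-polytope constraints: normalization and edge-to-node consistency.
A_eq = np.array([
    [1, 1, 0, 0, 0, 0, 0, 0],    # mu1 sums to 1
    [0, 0, 1, 1, 0, 0, 0, 0],    # mu2 sums to 1
    [-1, 0, 0, 0, 1, 1, 0, 0],   # sum_{x2} mu12(0, x2) = mu1(0)
    [0, -1, 0, 0, 0, 0, 1, 1],   # sum_{x2} mu12(1, x2) = mu1(1)
    [0, 0, -1, 0, 1, 0, 1, 0],   # sum_{x1} mu12(x1, 0) = mu2(0)
    [0, 0, 0, -1, 0, 1, 0, 1],   # sum_{x1} mu12(x1, 1) = mu2(1)
])
b_eq = np.array([1, 1, 0, 0, 0, 0])

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 8)
map_value = -res.fun                 # tight on a tree, so this is the MAP value
assignment = (res.x[1] > 0.5, res.x[3] > 0.5)
# Here the optimum is integral: x1 = 1, x2 = 1, with value 1 + 0.5 + 1 = 2.5.
```

On graphs with cycles the same LP can return fractional pseudo-marginals, which is exactly the regime where tightening the relaxation, as discussed in the talk, becomes important.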
Speaker Bio
David Sontag is a post-doc at Microsoft Research New England. He received his Ph.D. in Computer Science at MIT under the supervision of Tommi Jaakkola. His research interests include areas of machine learning such as structured prediction and inference in graphical models. While at MSR he has been working on information retrieval and medical informatics. David is the recipient of an outstanding student paper award at NIPS 2007, a best paper award at UAI 2008, a best paper award at EMNLP 2010, and a 2009 Google Fellowship in Machine Learning.
- Date:
- Speaker:
- David Sontag
- Affiliation:
- MIT