FREE
via Coursera

Probabilistic Graphical Models 2: Inference

  • Free courses (Audit)
  • English
  • Always Open
  • Coursera Registration Guide
About this course

  • Inference Overview
    • This module provides a high-level overview of the main types of inference tasks typically encountered in graphical models: conditional probability queries and finding the most likely assignment (MAP inference).
  • Variable Elimination
    • This module presents the simplest algorithm for exact inference in graphical models: variable elimination. We describe the algorithm and analyze its complexity in terms of properties of the graph structure. (A minimal code sketch appears after this list.)
  • Belief Propagation Algorithms
    • This module describes an alternative view of exact inference in graphical models: that of message passing between clusters, each of which encodes a factor over a subset of variables. This framework provides a basis for a variety of exact and approximate inference algorithms. We focus here on the basic framework and on its instantiation in the exact case of clique tree propagation; a small message-passing sketch appears after this list. An optional lesson describes the loopy belief propagation (LBP) algorithm and its properties.
  • MAP Algorithms
    • This module describes algorithms for finding the most likely assignment for a distribution encoded as a PGM (a task known as MAP inference). We describe message passing algorithms, which are very similar to the algorithms for computing conditional probabilities, except that we also need to consider how to decode the results to construct a single assignment (see the max-product sketch after this list). In an optional module, we describe a few other algorithms that use very different techniques by exploiting the combinatorial optimization nature of the MAP task.
  • Sampling Methods
    • In this module, we discuss a class of algorithms that uses random sampling to provide approximate answers to conditional probability queries. Most commonly used among these is the class of Markov Chain Monte Carlo (MCMC) algorithms, which includes the simple Gibbs sampling algorithm as well as a family of methods known as Metropolis-Hastings; a Gibbs sampling sketch appears after this list.
  • Inference in Temporal Models
    • In this brief lesson, we discuss some of the complexities of applying the exact and approximate inference algorithms covered earlier in this course to dynamic Bayesian networks.
  • Inference Summary
    • This module summarizes some of the topics that we covered in this course and discusses tradeoffs between different algorithms. It also includes the course final exam.
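
To make the variable elimination module concrete, here is a minimal Python sketch; this is not the course's reference code, and the factor representation, function names, and CPD numbers are all illustrative. A factor over binary variables is a (variable-tuple, table) pair; eliminating a variable multiplies every factor that mentions it and then sums it out.

```python
from itertools import product

# A factor over binary variables: a tuple of variable names plus a table
# mapping each joint assignment in {0,1}^n to a nonnegative weight.

def factor_product(f, g):
    """Multiply two factors; the result ranges over the union of their variables."""
    fv, ft = f
    gv, gt = g
    out_vars = fv + tuple(v for v in gv if v not in fv)
    table = {}
    for asg in product((0, 1), repeat=len(out_vars)):
        env = dict(zip(out_vars, asg))
        table[asg] = (ft[tuple(env[v] for v in fv)] *
                      gt[tuple(env[v] for v in gv)])
    return out_vars, table

def sum_out(f, var):
    """Marginalize a single variable out of a factor."""
    fv, ft = f
    keep = tuple(v for v in fv if v != var)
    table = {}
    for asg, w in ft.items():
        key = tuple(x for v, x in zip(fv, asg) if v != var)
        table[key] = table.get(key, 0.0) + w
    return keep, table

def variable_elimination(factors, order):
    """Eliminate variables in `order`; what remains covers the query variables."""
    factors = list(factors)
    for var in order:
        involved = [f for f in factors if var in f[0]]
        factors = [f for f in factors if var not in f[0]]
        prod = involved[0]
        for f in involved[1:]:
            prod = factor_product(prod, f)
        factors.append(sum_out(prod, var))
    result = factors[0]
    for f in factors[1:]:
        result = factor_product(result, f)
    z = sum(result[1].values())  # normalize to a distribution
    return result[0], {a: w / z for a, w in result[1].items()}

# Example: a two-node network A -> B with hypothetical CPDs P(A) and P(B | A);
# query P(B) by eliminating A.
pA = (("A",), {(0,): 0.6, (1,): 0.4})
pBgA = (("A", "B"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})
print(variable_elimination([pA, pBgA], order=["A"]))
# -> P(B=0) ≈ 0.62, P(B=1) ≈ 0.38
```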
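
The clique tree propagation described in the belief propagation module can be sketched on the smallest possible clique tree: cliques {A, B} and {B, C} sharing sepset {B}. The message from the first clique sums out its non-sepset variable; multiplying it into the second clique yields a belief from which marginals can be read off. The factors and numbers below are hypothetical.

```python
phi1 = {(0, 0): 0.54, (0, 1): 0.06,   # factor over (A, B)
        (1, 0): 0.08, (1, 1): 0.32}
phi2 = {(0, 0): 0.7, (0, 1): 0.3,     # factor over (B, C)
        (1, 0): 0.4, (1, 1): 0.6}

# Message mu(B) = sum_A phi1(A, B), passed along the sepset {B}.
mu = {b: sum(phi1[(a, b)] for a in (0, 1)) for b in (0, 1)}

# Belief at the second clique: beta2(B, C) = phi2(B, C) * mu(B).
beta2 = {(b, c): phi2[(b, c)] * mu[b] for b in (0, 1) for c in (0, 1)}

# Marginal P(C): sum out B from the belief, then normalize.
pC = {c: sum(beta2[(b, c)] for b in (0, 1)) for c in (0, 1)}
z = sum(pC.values())
print({c: p / z for c, p in pC.items()})  # -> P(C=0) ≈ 0.586, P(C=1) ≈ 0.414
```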
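
For the MAP module, the same elimination machinery works with max in place of sum, together with a traceback that records each maximizing value so a single best assignment can be decoded. A self-contained sketch on the same two-variable example (hypothetical numbers):

```python
# Joint weights P(A, B) for the same hypothetical A -> B network as above.
joint_vars = ("A", "B")
joint = {(0, 0): 0.54, (0, 1): 0.06,
         (1, 0): 0.08, (1, 1): 0.32}

def max_out(vs, table, var):
    """Max a variable out of a factor, remembering the maximizing value."""
    keep = tuple(v for v in vs if v != var)
    maxed, argmax = {}, {}
    for asg, w in table.items():
        key = tuple(x for v, x in zip(vs, asg) if v != var)
        if key not in maxed or w > maxed[key]:
            maxed[key] = w
            argmax[key] = dict(zip(vs, asg))[var]
    return keep, maxed, argmax

keep, maxed, arg_a = max_out(joint_vars, joint, "A")
best_b = max(maxed, key=maxed.get)           # B value with highest max-marginal
print({"B": best_b[0], "A": arg_a[best_b]})  # -> {'B': 0, 'A': 0}
```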
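
Finally, a sketch of Gibbs sampling for the sampling methods module, again on the hypothetical A -> B network: each step resamples one variable from its full conditional given the current values of the others, and long-run sample frequencies approximate the conditional probability query.

```python
import random

def gibbs_estimate_p_b1(n_samples=50_000, seed=0):
    """Estimate P(B = 1) in the hypothetical A -> B network by Gibbs sampling."""
    rng = random.Random(seed)
    a, b = 0, 0  # arbitrary initial state
    hits = 0
    for _ in range(n_samples):
        # Resample A from P(A | B), proportional to P(A) * P(B | A).
        w0 = 0.6 * (0.9 if b == 0 else 0.1)
        w1 = 0.4 * (0.2 if b == 0 else 0.8)
        a = 0 if rng.random() < w0 / (w0 + w1) else 1
        # Resample B from P(B | A), which is just the CPD.
        b = 1 if rng.random() < (0.1 if a == 0 else 0.8) else 0
        hits += b
    return hits / n_samples

print(gibbs_estimate_p_b1())  # should approach the exact answer P(B=1) = 0.38
```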