Gibbs sampler and coordinate ascent variational inference: A set-theoretical review

@article{Lee2020GibbsSA,
  title={Gibbs sampler and coordinate ascent variational inference: A set-theoretical review},
  author={Se Yoon Lee},
  journal={Communications in Statistics - Theory and Methods},
  year={2020},
  volume={51},
  pages={1549--1568},
  url={https://api.semanticscholar.org/CorpusID:220935477}
}
  • Se Yoon Lee
  • Published 3 August 2020
  • Mathematics
  • Communications in Statistics - Theory and Methods
This paper defines fundamental sets of densities frequently used in Bayesian inference and provides an alternative, set-theoretical mechanism for analyzing the two schemes (the Gibbs sampler and coordinate ascent variational inference), endowed with pedagogical insights.

Unsupervised AoA Estimation Based on Dual-Path Knowledge-Aware Auto-Encoders

An unsupervised deep learning framework based on dual-path model-driven auto-encoders for angle-of-arrival (AoA) estimation in massive MIMO systems, addressing two key challenges in unsupervised learning: the lack of interpretability and convergence to local optima.

Effects of Multi-Omics Characteristics on Identification of Driver Genes Using Machine Learning Algorithms

This study presents a framework to analyze the effects of multi-omics characteristics on the identification of cancer driver genes and provides a more comprehensive understanding of cancer mechanisms.

Regularization for Wasserstein distributionally robust optimization

This paper derives a general strong duality result for regularized Wasserstein distributionally robust problems in the case of entropic regularization and provides an approximation result when the regularization parameters vanish.

Course Project Report: Comparing MCMC and Variational Inference for Bayesian Probabilistic Matrix Factorization on the MovieLens Dataset

Two Bayesian inference methods, Markov chain Monte Carlo (MCMC) and variational inference (VI), are employed to approximate the posterior; experimental results demonstrate that VI offers faster convergence, while MCMC provides more accurate posterior estimates.

Singular Control in Inventory Management with Smooth Ambiguity

We consider singular control in inventory management under Knightian uncertainty, where decision makers have a smooth ambiguity preference over Gaussian-generated priors. We demonstrate that…

Combining Bayesian Inference and Reinforcement Learning for Agent Decision Making: A Review

This paper summarizes how Bayesian methods work in the data collection, data processing, and policy learning stages of RL, paving the way for better agent decision-making strategies.

Quantum Neural Network Restatement of Markov Jump Process

In this strand of research, quantum information is one of the most favorable candidates for the direct treatment and description of such abstract notions of learning theory in terms of quantum mechanical systems.

Hierarchical mixtures of Unigram models for short text clustering: the role of Beta-Liouville priors

This paper presents a variant of the Multinomial mixture model, based on the Beta-Liouville distribution, tailored to the unsupervised classification of short text data; the Beta-Liouville prior offers a more flexible correlation structure than the Dirichlet.

Provable Accuracy Bounds for Hybrid Dynamical Optimization and Sampling

This work provides non-asymptotic convergence guarantees for hybrid LNLS by reducing it to block Langevin diffusion (BLD) algorithms, and proves exponential KL-divergence convergence for randomized and cyclic block selection strategies using ideal DXs.

Variational Inference: A Review for Statisticians

Variational inference (VI), a method from machine learning that approximates probability densities through optimization, is reviewed and a variant that uses stochastic optimization to scale up to massive data is derived.
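
As a concrete illustration of the coordinate ascent scheme such a review covers, below is a minimal CAVI sketch for a conjugate Normal model with unknown mean and precision; the model choice, hyperparameter values, and variable names are illustrative assumptions, not taken from the paper.

    import numpy as np

    # Mean-field CAVI for x_i ~ N(mu, 1/tau) with conjugate priors
    # mu | tau ~ N(mu0, 1/(lam0*tau)) and tau ~ Gamma(a0, b0),
    # using the factorization q(mu, tau) = q(mu) q(tau).
    rng = np.random.default_rng(0)
    x = rng.normal(loc=2.0, scale=1.5, size=500)   # synthetic data (assumed)
    n, xbar = x.size, x.mean()

    mu0, lam0, a0, b0 = 0.0, 1.0, 1.0, 1.0         # assumed hyperparameters
    E_tau = a0 / b0                                 # initial guess for E_q[tau]

    for _ in range(50):                             # coordinate ascent sweeps
        # Optimal q(mu) = N(m, v) given the current E_q[tau]
        m = (lam0 * mu0 + n * xbar) / (lam0 + n)
        v = 1.0 / ((lam0 + n) * E_tau)
        # Optimal q(tau) = Gamma(a, b) given the current q(mu)
        a = a0 + 0.5 * (n + 1)
        b = b0 + 0.5 * (np.sum((x - m) ** 2) + n * v
                        + lam0 * ((m - mu0) ** 2 + v))
        E_tau = a / b

    print(f"E_q[mu]  ~ {m:.3f} (data generated with mean 2.0)")
    print(f"E_q[tau] ~ {E_tau:.3f} (true precision {1 / 1.5**2:.3f})")

Each sweep maximizes the evidence lower bound over one factor while holding the other fixed, which is exactly the coordinate ascent structure that distinguishes CAVI from sampling-based schemes.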

Explaining Variational Approximations

The ideas of variational approximation are illustrated using terminology, notation, and examples familiar to statisticians.

Information Theory and Statistics: A Tutorial

This tutorial is concerned with applications of information theory concepts in statistics in the finite alphabet setting; it provides an introduction to the theory of universal coding and to statistical inference via the minimum description length principle motivated by that theory.

Explaining the Gibbs Sampler

A simple explanation of how and why the Gibbs sampler works is given; its properties are established analytically in a simple case, and insight is provided for more complicated cases.
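
The canonical teaching example for this explanation is the standard bivariate normal, whose two full conditionals are themselves normal; a minimal sketch of that sampler follows, with the correlation value and chain length as illustrative assumptions.

    import numpy as np

    # Gibbs sampler for a standard bivariate normal with correlation rho:
    # the full conditionals are X | Y=y ~ N(rho*y, 1 - rho^2) and
    # Y | X=x ~ N(rho*x, 1 - rho^2), so each update is a single draw.
    rng = np.random.default_rng(1)
    rho, n_iter, burn = 0.8, 10_000, 1_000          # assumed settings
    sd = np.sqrt(1.0 - rho ** 2)

    x, y = 0.0, 0.0                                 # arbitrary starting point
    samples = np.empty((n_iter, 2))
    for t in range(n_iter):
        x = rng.normal(rho * y, sd)                 # draw from p(x | y)
        y = rng.normal(rho * x, sd)                 # draw from p(y | x)
        samples[t] = x, y

    post = samples[burn:]                           # discard burn-in
    print("sample correlation:", np.corrcoef(post.T)[0, 1])  # close to 0.8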

Probability and Measure

Probability. Measure. Integration. Random Variables and Expected Values. Convergence of Distributions. Derivatives and Conditional Probability. Stochastic Processes. Appendix. Notes on the Problems.…

Diagnostics: A Comparative Review

Yes, but Did It Work?: Evaluating Variational Inference

Two diagnostic algorithms are proposed that give a goodness-of-fit measurement for joint distributions while simultaneously reducing the error in the estimate.

Log-concave sampling: Metropolis-Hastings algorithms are fast!

A non-asymptotic upper bound on the mixing time of the Metropolis-adjusted Langevin algorithm (MALA) is proved, and the gains of MALA over the unadjusted Langevin algorithm (ULA) for weakly log-concave densities are demonstrated.
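
For orientation, here is a minimal MALA sketch on a simple log-concave target (a standard d-dimensional Gaussian); the step size, dimension, and chain length are illustrative assumptions and do not reproduce the paper's setting.

    import numpy as np

    # MALA: propose a Langevin step x' = x + h*grad_log_pi(x) + sqrt(2h)*xi,
    # then apply a Metropolis accept/reject correction so the chain targets
    # pi exactly. Here pi is a standard Gaussian, so grad log pi(x) = -x.
    rng = np.random.default_rng(2)
    d, h, n_iter = 10, 0.1, 5_000                   # assumed settings

    def log_pi(x):                                  # log-density up to a constant
        return -0.5 * x @ x

    def grad_log_pi(x):
        return -x

    def log_q(xp, x):                               # log proposal density q(xp | x)
        mu = x + h * grad_log_pi(x)
        return -np.sum((xp - mu) ** 2) / (4.0 * h)

    x, accepted = np.zeros(d), 0
    for _ in range(n_iter):
        prop = x + h * grad_log_pi(x) + np.sqrt(2 * h) * rng.standard_normal(d)
        log_alpha = (log_pi(prop) + log_q(x, prop)) - (log_pi(x) + log_q(prop, x))
        if np.log(rng.uniform()) < log_alpha:       # Metropolis correction
            x, accepted = prop, accepted + 1

    print("acceptance rate:", accepted / n_iter)

Dropping the accept/reject step recovers ULA, which is cheaper per iteration but biased; the gains the abstract refers to concern how this correction affects mixing.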

Advances in Variational Inference

An overview of recent trends in variational inference is given and a summary of promising future research directions is provided.

Theoretical and Computational Guarantees of Mean Field Variational Inference for Community Detection

The mean field method for community detection under the Stochastic Block Model is shown to have a linear convergence rate and to achieve the minimax rate within $\log n$ iterations; similar optimality results are obtained for Gibbs sampling and for an iterative procedure for maximum likelihood estimation, which may be of independent interest.
...