Gibbs sampler and coordinate ascent variational inference: A set-theoretical review
@article{Lee2020GibbsSA,
  title   = {Gibbs sampler and coordinate ascent variational inference: A set-theoretical review},
  author  = {Se Yoon Lee},
  journal = {Communications in Statistics - Theory and Methods},
  year    = {2020},
  volume  = {51},
  pages   = {1549--1568},
  url     = {https://api.semanticscholar.org/CorpusID:220935477}
}
This paper defines fundamental sets of densities frequently used in Bayesian inference, and provides an alternative mechanism for analyzing the two schemes endowed with pedagogical insights.
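To make the coordinate ascent scheme concrete, here is a minimal CAVI sketch for a toy bivariate Gaussian target with known correlation (an illustrative assumption, not an example taken from the paper):

```python
RHO = 0.8  # correlation of the toy target N(0, [[1, RHO], [RHO, 1]])

def cavi(m1=1.0, m2=-1.0, iters=100):
    """Coordinate ascent on the mean-field factors q(x1) q(x2).

    For a bivariate Gaussian target, the ELBO-optimal q(x1) given q(x2) is
    N(RHO * E_q[x2], 1 - RHO**2), so each sweep just updates the means."""
    for _ in range(iters):
        m1 = RHO * m2  # maximize the ELBO in the q(x1) coordinate
        m2 = RHO * m1  # then in the q(x2) coordinate
    return (m1, m2), 1.0 - RHO**2

(m1, m2), var = cavi()
print(m1, m2)  # both converge to the true means (0, 0)
print(var)     # 1 - RHO^2 = 0.36: mean-field underestimates the marginal variance of 1
```

Each coordinate update here is an exact conditional optimization, mirroring how a Gibbs sampler draws each coordinate from its exact full conditional; that structural parallel is the correspondence the paper examines.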
34 Citations
Unsupervised AoA Estimation Based on Dual-Path Knowledge-Aware Auto-Encoders
- 2025
Computer Science, Engineering
An unsupervised deep learning framework built on dual-path model-driven auto-encoders for angle-of-arrival (AoA) estimation in massive MIMO systems, addressing two key challenges in unsupervised learning: the lack of interpretability and convergence to local optima.
Effects of Multi-Omics Characteristics on Identification of Driver Genes Using Machine Learning Algorithms
- 2022
Computer Science, Medicine
This study presents a framework to analyze the effects of multi-omics characteristics on the identification of cancer driver genes and provides a more comprehensive understanding of cancer mechanisms.
Regularization for Wasserstein distributionally robust optimization
- 2023
Computer Science, Mathematics
This paper derives a general strong duality result of regularized Wasserstein distributionally robust problems in the case of entropic regularization and provides an approximation result when the regularization parameters vanish.
Course Project Report: Comparing MCMC and Variational Inference for Bayesian Probabilistic Matrix Factorization on the MovieLens Dataset
- 2025
Computer Science, Mathematics
Two Bayesian inference methods, Markov chain Monte Carlo (MCMC) and variational inference (VI), are employed to approximate the posterior; experimental results demonstrate that VI offers faster convergence, while MCMC provides more accurate posterior estimates.
Singular Control in Inventory Management with Smooth Ambiguity
- 2025
Business, Mathematics
We consider singular control in inventory management under Knightian uncertainty, where decision makers have a smooth ambiguity preference over Gaussian-generated priors. We demonstrate that…
Combining Bayesian Inference and Reinforcement Learning for Agent Decision Making: A Review
- 2025
Computer Science
This paper summarizes how Bayesian methods work in the data collection, data processing, and policy learning stages of RL, paving the way for better agent decision-making strategies.
Quantum Neural Network Restatement of Markov Jump Process
- 2025
Computer Science, Physics
In this strand of research, the direct treatment and description of such abstract notions of learning theory in terms of quantum mechanical systems is among the most favorable candidates.
Hierarchical mixtures of Unigram models for short text clustering: the role of Beta-Liouville priors
- 2025
Computer Science, Mathematics
This paper presents a variant of the Multinomial mixture model tailored to the unsupervised classification of short text data based on the Beta-Liouville distribution, which offers a more flexible correlation structure than the Dirichlet.
Simple variational inference based on minimizing Kullback–Leibler divergence
- 2024
Mathematics, Computer Science
Provable Accuracy Bounds for Hybrid Dynamical Optimization and Sampling
- 2024
Computer Science, Engineering
This work provides non-asymptotic convergence guarantees for hybrid LNLS by reducing to block Langevin Diffusion (BLD) algorithms, and proves exponential KL-divergence convergence for randomized and cyclic block selection strategies using ideal DXs.
53 References
Variational Inference: A Review for Statisticians
- 2016
Mathematics, Computer Science
Variational inference (VI), a method from machine learning that approximates probability densities through optimization, is reviewed and a variant that uses stochastic optimization to scale up to massive data is derived.
Explaining Variational Approximations
- 2010
Mathematics, Computer Science
The ideas of variational approximation are illustrated using terminology, notation, and examples familiar to statisticians.
Information Theory and Statistics: A Tutorial
- 2004
Mathematics, Computer Science
This tutorial is concerned with applications of information theory concepts in statistics, in the finite alphabet setting, and an introduction is provided to the theory of universal coding, and to statistical inference via the minimum description length principle motivated by that theory.
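The Kullback–Leibler divergence that underlies both the tutorial above and the variational methods reviewed here has a simple finite-alphabet form; a minimal sketch (the distributions `p` and `q` are illustrative, not drawn from the tutorial):

```python
import math

def kl_divergence(p, q):
    """D(p || q) = sum_x p(x) * log(p(x) / q(x)) over a finite alphabet.

    Convention: terms with p(x) == 0 contribute 0; returns infinity when
    q(x) == 0 at a point where p(x) > 0 (absolute continuity fails)."""
    total = 0.0
    for px, qx in zip(p, q):
        if px > 0:
            if qx == 0:
                return math.inf
            total += px * math.log(px / qx)
    return total

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # positive whenever p != q
print(kl_divergence(p, p))  # 0.0: KL vanishes iff the distributions agree
print(kl_divergence(p, q) == kl_divergence(q, p))  # False: KL is asymmetric
```

The asymmetry matters for variational inference, which minimizes D(q || p) rather than D(p || q) and therefore tends to under-cover the target.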
Explaining the Gibbs Sampler
- 1992
Mathematics, Computer Science
A simple explanation of how and why the Gibbs sampler works is given; its properties are established analytically in a simple case, and insight is provided for more complicated cases.
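A minimal Gibbs sampler in the spirit of such tutorials, alternating between the two full conditionals of a beta-binomial pair (the specific constants are illustrative assumptions):

```python
import random

def gibbs_beta_binomial(n=16, a=2.0, b=4.0, iters=20000, seed=1):
    """Gibbs sampler for the joint (x, theta) with
       x | theta ~ Binomial(n, theta)  and  theta | x ~ Beta(x + a, n - x + b)."""
    rng = random.Random(seed)
    theta = 0.5  # arbitrary starting point
    xs = []
    for _ in range(iters):
        x = sum(rng.random() < theta for _ in range(n))  # Binomial(n, theta) draw
        theta = rng.betavariate(x + a, n - x + b)        # conjugate update
        xs.append(x)
    return xs

xs = gibbs_beta_binomial()
print(sum(xs) / len(xs))  # ≈ n * a / (a + b): mean of the beta-binomial marginal of x
```

Because each full conditional is sampled exactly, the chain's stationary distribution is the intended joint, and the marginal of `x` can be checked against the closed-form beta-binomial mean.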
Probability and Measure
- 1979
Mathematics
Probability. Measure. Integration. Random Variables and Expected Values. Convergence of Distributions. Derivatives and Conditional Probability. Stochastic Processes. Appendix. Notes on the Problems.…
Yes, but Did It Work?: Evaluating Variational Inference
- 2018
Mathematics, Computer Science
Two diagnostic algorithms are proposed that give a goodness of fit measurement for joint distributions, while simultaneously improving the error in the estimate.
Log-concave sampling: Metropolis-Hastings algorithms are fast!
- 2018
Mathematics, Computer Science
A non-asymptotic upper bound on the mixing time of the Metropolis-adjusted Langevin algorithm (MALA) is proved, and the gains of MALA over ULA for weakly log-concave densities are demonstrated.
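A minimal MALA sketch for a one-dimensional log-concave target (a standard normal, chosen here purely for illustration): a Langevin drift proposal followed by a Metropolis-Hastings accept/reject step.

```python
import math
import random

def mala(grad_log_pi, log_pi, x0=3.0, step=0.5, iters=20000, seed=2):
    """Metropolis-adjusted Langevin algorithm: gradient-informed Gaussian
    proposal, corrected by a Metropolis-Hastings acceptance test."""
    rng = random.Random(seed)
    x = x0
    out = []
    for _ in range(iters):
        mean_fwd = x + 0.5 * step * grad_log_pi(x)
        prop = mean_fwd + math.sqrt(step) * rng.gauss(0.0, 1.0)
        mean_bwd = prop + 0.5 * step * grad_log_pi(prop)
        # log proposal densities (Gaussian, constants cancel in the ratio)
        log_q_fwd = -((prop - mean_fwd) ** 2) / (2 * step)
        log_q_bwd = -((x - mean_bwd) ** 2) / (2 * step)
        log_alpha = log_pi(prop) - log_pi(x) + log_q_bwd - log_q_fwd
        if math.log(rng.random()) < log_alpha:
            x = prop
        out.append(x)
    return out

# Target: standard normal, log pi(x) = -x^2 / 2 up to an additive constant.
xs = mala(grad_log_pi=lambda x: -x, log_pi=lambda x: -0.5 * x * x)
m = sum(xs) / len(xs)
v = sum((x - m) ** 2 for x in xs) / len(xs)
print(m, v)  # sample mean ≈ 0, sample variance ≈ 1
```

Dropping the accept/reject step recovers the unadjusted Langevin algorithm (ULA), which is biased at finite step size; the MH correction is what makes the chain exact, at the cost of occasional rejections.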
Advances in Variational Inference
- 2019
Computer Science, Mathematics
An overview of recent trends in variational inference is given and a summary of promising future research directions is provided.
Theoretical and Computational Guarantees of Mean Field Variational Inference for Community Detection
- 2020
Computer Science, Mathematics
The mean field method for community detection under the Stochastic Block Model has a linear convergence rate and converges to the minimax rate within $\log n$ iterations; similar optimality results for Gibbs sampling and for an iterative procedure to calculate the maximum likelihood estimate are obtained, which can be of independent interest.