Stochastic Gradient Variational Bayes in the Stochastic Blockmodel
title: Stochastic Gradient Variational Bayes in the Stochastic Blockmodel
publish date: 2024-10-03
authors: Pedro Regueiro et al.
paper id: 2410.02649v1
download: https://arxiv.org/abs/2410.02649v1
abstract:
Stochastic variational Bayes algorithms have become very popular in the machine learning literature, particularly in the context of nonparametric Bayesian inference. These algorithms replace the true but intractable posterior distribution with the best (in the sense of Kullback-Leibler divergence) member of a tractable family of distributions, using stochastic gradient algorithms to perform the optimization step. Stochastic variational Bayes inference implicitly trades off computational speed for accuracy, but the loss of accuracy is highly model- (and even dataset-) specific. In this paper we carry out an empirical evaluation of this trade-off in the context of stochastic blockmodels, which are a widely used class of probabilistic models for network and relational data. Our experiments indicate that, in the context of stochastic blockmodels, relatively large subsamples are required for these algorithms to find accurate approximations of the posterior, and that even then the quality of the approximations provided by stochastic gradient variational algorithms can be highly variable.
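To make the procedure in the abstract concrete, below is a minimal sketch (not the authors' implementation) of stochastic gradient variational Bayes for a two-block stochastic blockmodel, written in PyTorch. The mean-field family q(z), the point estimate for the block-probability matrix B, the subsample size `batch`, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch of stochastic gradient variational Bayes for a stochastic
# blockmodel. Illustrative only: mean-field q(z) with a point estimate for
# the block-probability matrix B; flat priors are omitted from the objective.
import torch

torch.manual_seed(0)

# --- Simulate a small two-block SBM ---
n, K = 60, 2
z_true = torch.randint(0, K, (n,))                    # latent block labels
B_true = torch.tensor([[0.30, 0.05],
                       [0.05, 0.30]])                 # within/between link probs
A = torch.bernoulli(B_true[z_true][:, z_true])        # adjacency matrix
A = torch.triu(A, 1)
A = A + A.T                                           # symmetric, no self-loops

# --- Variational parameters (unconstrained) ---
phi_logits = torch.zeros(n, K, requires_grad=True)    # q(z_i) = Cat(softmax(.))
B_logits = torch.zeros(K, K, requires_grad=True)      # B = sigmoid(.)

opt = torch.optim.Adam([phi_logits, B_logits], lr=0.05)
pairs = torch.triu_indices(n, n, offset=1).T          # all node pairs i < j
batch = 200                                           # subsample size per step

for step in range(2000):
    sub = pairs[torch.randint(len(pairs), (batch,))]  # subsample node pairs
    i, j = sub[:, 0], sub[:, 1]
    phi = torch.softmax(phi_logits, dim=1)
    B = torch.sigmoid(B_logits)
    # Expected log-likelihood of each sampled pair under q:
    #   sum_{k,l} phi_ik phi_jl [A_ij log B_kl + (1 - A_ij) log(1 - B_kl)]
    a = A[i, j].view(-1, 1, 1)
    ll = a * torch.log(B) + (1 - a) * torch.log1p(-B)
    pair_ell = torch.einsum('bk,bl,bkl->b', phi[i], phi[j], ll)
    # Rescale the minibatch so the likelihood term is an unbiased estimate
    # over all pairs, then add the entropy of q(z) to get a noisy ELBO.
    elbo = pair_ell.mean() * len(pairs) \
        + torch.distributions.Categorical(probs=phi).entropy().sum()
    loss = -elbo
    opt.zero_grad()
    loss.backward()
    opt.step()

print("estimated block probabilities:\n", torch.sigmoid(B_logits).detach())
```

Varying `batch` mimics the trade-off the paper studies: smaller subsamples make each step cheaper but the stochastic gradient estimate of the ELBO noisier, so the fitted approximation becomes more variable from run to run.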
QA: coming soon
Compiled by: wanghaisheng. Updated: October 7, 2024.