Computer Science, 2023, Vol. 50, Issue (2): 89-105. doi: 10.11896/jsjkx.220100001

• Database & Big Data & Data Science •

  • Corresponding author: CHENG Yurong (yrcheng@bit.edu.cn)
  • About author: (yukiho1124z@gmail.com)

Overview of Research on Bayesian Inference and Parallel Tempering

ZHAN Jin, WANG Xuefei, CHENG Yurong, YUAN Ye   

  1. School of Computer,Beijing Institute of Technology,Beijing 100081,China
  • Received:2022-01-04 Revised:2022-06-23 Online:2023-02-15 Published:2023-02-22
  • Supported by:
    National Natural Science Foundation of China(61902023,U1811262,U21B2007,61932004,61572119,61622202) and Fundamental Research Funds for the Central Universities of Ministry of Education of China(N181605012)



Abstract: Bayesian inference is one of the main problems in statistics. It aims to update the prior knowledge of a probability distribution model based on observed data. For posterior probabilities that cannot be observed or are difficult to compute directly, as is often the case in practice, Bayesian inference can provide a good approximation; it is an important method grounded in Bayes' theorem. Many machine learning problems, such as classification models, topic modeling, and data mining, involve simulating and approximating the true distribution of various kinds of feature data, so Bayesian inference has important and unique research value in today's machine learning field. With the advent of the big data era, researchers collect massive experimental data, and the target distributions to be simulated and computed have become very complex. How to perform approximate inference over complex data that is both accurate and time-efficient has therefore become a major difficulty in Bayesian inference. Aiming at inference under such complex distribution models, this paper systematically introduces and surveys the two main families of methods for solving Bayesian inference problems in recent years: variational inference and sampling methods. First, it gives the problem definition and theoretical background of variational inference, describes in detail the variational inference algorithm based on coordinate ascent, and presents existing applications and future prospects of this method. Next, it reviews existing research on sampling methods at home and abroad, gives the concrete algorithmic procedures of the main sampling methods, and summarizes and compares their characteristics, advantages, and disadvantages. Finally, it introduces the parallel tempering technique, outlines its basic theory and methods, discusses the combination and application of parallel tempering with sampling methods, and explores new research directions for the future development of Bayesian inference.
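The coordinate-ascent variational inference that the abstract highlights can be illustrated with the standard textbook example of a univariate Gaussian with unknown mean and precision: observations x_i ~ N(mu, 1/tau) under a Normal-Gamma prior, approximated with the mean-field factorization q(mu, tau) = q(mu) q(tau). The sketch below is illustrative only and is not code from the surveyed work; the function name `cavi_normal`, the prior hyperparameters, and the iteration count are our own assumptions.

```python
import numpy as np

def cavi_normal(x, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, iters=50):
    """Coordinate-ascent VI for x_i ~ N(mu, 1/tau) under a Normal-Gamma
    prior, with the mean-field factorization q(mu, tau) = q(mu) q(tau)."""
    n, xbar = len(x), np.mean(x)
    # By conjugacy, the mean of q(mu) is fixed; only its precision iterates.
    mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
    a_n = a0 + (n + 1) / 2.0          # shape of q(tau) also never changes
    e_tau = a0 / b0                   # initial guess for E_q[tau]
    for _ in range(iters):
        lam_n = (lam0 + n) * e_tau                  # update q(mu) = N(mu_n, 1/lam_n)
        e_mu, e_mu2 = mu_n, mu_n**2 + 1.0 / lam_n   # first two moments of q(mu)
        # update q(tau) = Gamma(a_n, b_n) using expectations under q(mu)
        b_n = b0 + 0.5 * (np.sum(x**2) - 2.0 * e_mu * np.sum(x) + n * e_mu2
                          + lam0 * (e_mu2 - 2.0 * mu0 * e_mu + mu0**2))
        e_tau = a_n / b_n
    return mu_n, lam_n, a_n, b_n
```

On data drawn from N(2, 1), mu_n converges near the sample mean and a_n / b_n (the posterior mean of the precision) near 1; each coordinate update holds one factor fixed while re-optimizing the other, which is exactly the coordinate-ascent scheme described above.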

Key words: Variational inference, Sampling methods, Parallel tempering, Approximate computation
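Among the sampling methods the survey reviews, random-walk Metropolis-Hastings is the canonical example: perturb the current state with a symmetric proposal and accept the move with a probability that leaves the target distribution invariant. A minimal sketch follows; the function name and step-size default are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal:
    accept x' with probability min(1, p(x') / p(x))."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_target(x0)
    samples = np.empty(n_samples)
    for i in range(n_samples):
        prop = x + step * rng.standard_normal()   # symmetric proposal
        lp_prop = log_target(prop)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis acceptance test
            x, lp = prop, lp_prop
        samples[i] = x                            # rejected moves repeat x
    return samples
```

For example, targeting a standard normal via `log_target = lambda x: -0.5 * x * x`, the chain's long-run mean and standard deviation approach 0 and 1; only the unnormalized log-density is needed, which is what makes the method attractive for intractable posteriors.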

CLC number: 

  • TP181
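The parallel tempering scheme discussed in the abstract runs replicas of a sampler at several inverse temperatures beta_k, each targeting p(x)^beta_k, and occasionally swaps states between adjacent temperature levels so that the hot, fast-mixing chains help the cold chain escape local modes. A minimal one-dimensional sketch, under assumed choices (the temperature ladder, the step scaling 1/sqrt(beta), and the function name are ours):

```python
import numpy as np

def parallel_tempering(log_target, betas, n_sweeps, step=1.0, seed=0):
    """One random-walk MH chain per inverse temperature beta; after each
    sweep, attempt a state swap between one adjacent pair of levels."""
    rng = np.random.default_rng(seed)
    k = len(betas)
    xs = np.zeros(k)                              # current state of each replica
    lps = np.array([log_target(x) for x in xs])   # cached log-densities
    cold = np.empty(n_sweeps)                     # samples from the betas[0] chain
    for t in range(n_sweeps):
        for j in range(k):                        # local MH move per temperature
            prop = xs[j] + (step / np.sqrt(betas[j])) * rng.standard_normal()
            lp_prop = log_target(prop)
            if np.log(rng.random()) < betas[j] * (lp_prop - lps[j]):
                xs[j], lps[j] = prop, lp_prop
        j = rng.integers(k - 1)                   # propose one adjacent-pair swap,
        # accepted with prob min(1, exp((beta_j - beta_{j+1}) (L_{j+1} - L_j)))
        if np.log(rng.random()) < (betas[j] - betas[j + 1]) * (lps[j + 1] - lps[j]):
            xs[j], xs[j + 1] = xs[j + 1], xs[j]
            lps[j], lps[j + 1] = lps[j + 1], lps[j]
        cold[t] = xs[0]
    return cold
```

On a well-separated bimodal target such as an equal mixture of N(-3, 1) and N(3, 1), a single random-walk chain tends to get stuck in one mode, whereas the cold chain here visits both, because mode-crossing happens at high temperature and is propagated down through swaps.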