Academic Lecture: Deterministic Sampling of Expensive Posteriors Using Kullback-Leibler Divergence


Speaker: Prof. Fasheng Sun (孙法省), Northeast Normal University

Time: November 1, 2020, 8:30–10:30 a.m.

Venue: Lecture Hall 201, School of Mathematics and Information Science


Abstract: This paper introduces a new way of discretely approximating a continuous probability distribution $F$ by a set of representative points. These points are generated by minimizing the Kullback-Leibler divergence, a statistical potential measure between two probability distributions used for testing goodness-of-fit. The Kullback-Leibler divergence is nonnegative, and equals zero exactly when the two probability distributions are equal. Using this property, we show that the empirical distribution of these representative points converges to the distribution $F$. The advantages of these points over Monte Carlo and other deterministic sampling methods are illustrated in simulations. Two important applications of such points are then highlighted: (a) simulation from complex probability densities, and (b) exploration and optimization of expensive black-box functions.
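The two properties the abstract relies on (nonnegativity, and zero exactly at equality) are easy to check numerically in the discrete case. The sketch below is my own illustration, not the speaker's method or code; the helper name `kl_divergence` and the example probability vectors are assumptions.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D(P || Q).

    p, q: probability vectors over the same support; q is assumed
    strictly positive wherever p is positive.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.3, 0.4, 0.3])

print(kl_divergence(p, q) >= 0)    # True: nonnegativity (Gibbs' inequality)
print(kl_divergence(p, p) == 0.0)  # True: zero when the distributions coincide
```

Minimizing this divergence between the empirical distribution of the candidate points and the target $F$ is, informally, what drives the representative points toward $F$.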


About the Speaker:

        Fasheng Sun is a professor and doctoral supervisor at Northeast Normal University, a Young Scholar of the Ministry of Education's Changjiang Scholars Program, and an Outstanding Teacher of Jilin Province. His research focuses on big-data sampling and analysis, design and analysis of computer experiments, and high-dimensional data analysis. He has published nearly 30 papers in top statistics journals, including the Annals of Statistics, JASA, and Biometrika. He has received a Second Prize of the Ministry of Education Natural Science Award, a National Statistical Science Research Outstanding Achievement Award, and a Second Prize of the Jilin Province Natural Science Academic Achievement Award.


All faculty and students are welcome to attend!