Speaker: Haoran Yang (Peking University)
Time: Friday, December 12, 2025, 10:00–11:30
Venue: Room 711, South Building, Science and Technology Building
Abstract: This talk presents two results concerning rates of convergence in stochastic systems: the convergence rate of the stable central limit theorem (stable CLT) and error estimates for the noised online stochastic gradient descent with momentum (noised SGDM) algorithm. The stable CLT extends the classical CLT to heavy-tailed distributions. We prove that the stable CLT holds in the total variation distance and obtain its optimal convergence rate for all α ∈ (0, 2). Our method relies on measure decompositions, one-step estimates, and a delicate bootstrapping argument with respect to α. Regarding the noised SGDM, we establish time-uniform error estimates in both the L^1-Wasserstein distance and the total variation distance. Leveraging Malliavin analysis and the exponential ergodicity of stochastic processes, we characterize the long-time behavior of this algorithm, even for non-convex objective functions. This talk is based on joint work with Arnaud Guillin (LMBP), Xiang Li (SUSTech), Yu Wang (UM), and Lihu Xu (UM), covering two related projects.
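To fix ideas, the iteration underlying noised SGDM can be sketched as follows. This is a minimal generic sketch, assuming a standard heavy-ball momentum update with injected Gaussian noise; the step size `eta`, momentum parameter `beta`, and noise level `sigma` are illustrative, and the precise scheme analyzed in the talk may differ.

```python
import numpy as np

def noised_sgdm(grad, x0, eta=0.05, beta=0.9, sigma=0.01, n_steps=2000, seed=0):
    """Generic noised SGD with momentum (heavy-ball form):

        m_{k+1} = beta * m_k + grad(x_k) + sigma * xi_k,   xi_k ~ N(0, I)
        x_{k+1} = x_k - eta * m_{k+1}
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)
    for _ in range(n_steps):
        # Momentum accumulates the (noised) gradient signal.
        m = beta * m + grad(x) + sigma * rng.standard_normal(x.shape)
        x = x - eta * m
    return x

# Toy convex objective f(x) = 0.5 * ||x - 1||^2, gradient x - 1, minimizer at 1.
x_final = noised_sgdm(lambda x: x - 1.0, x0=np.array([5.0, -3.0]))
```

With small injected noise, the iterates settle into a neighborhood of the minimizer; the time-uniform error estimates in the talk quantify the distance between the law of such iterates and the target distribution, uniformly in the number of steps.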
Speaker bio: Haoran Yang holds a Ph.D. from Peking University. His main research interests lie at the interface of stochastic analysis and biology.
Host: Fuke Wu