# A Yale Professor on the Global Convergence of the EM Algorithm

The Online Seminar on Mathematical Foundations of Data Science (Math for DS) [1] is a weekly online seminar series devoted to the mathematics behind data science, machine learning, statistics, and optimization, featuring invited talks by many well-known scholars from North America. As media partners, 『运筹OR帷幄』 and 『机器之心』 will publish replay videos of past sessions on Bilibili. In this session, the invited speaker will give a talk titled "Global Convergence of EM?".


Math for DS Session 38: Live Stream Announcement

In this talk I will first discuss a recent joint work with Yihong Wu: https://arxiv.org/abs/1908.10935. We show that the randomly initialized EM algorithm for parameter estimation in symmetric two-component Gaussian mixtures converges to the MLE in at most $\sqrt{n}$ iterations with high probability. Then I will mention the limitations of that work and propose an extension to general Gaussian mixtures via overparameterization.
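For readers unfamiliar with the setting, the model in the abstract is the mixture $\tfrac{1}{2}N(\theta, 1) + \tfrac{1}{2}N(-\theta, 1)$, for which the EM update collapses to a one-dimensional fixed-point map. The following is a minimal illustrative sketch (not code from the paper), assuming unit variance and equal mixing weights; the variable names and the choice of $\theta^\* = 1.5$ are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate from the symmetric mixture 0.5*N(theta*, 1) + 0.5*N(-theta*, 1).
theta_true = 1.5
n = 10_000
signs = rng.choice([-1.0, 1.0], size=n)
x = signs * theta_true + rng.standard_normal(n)

# For this model the EM update reduces to the scalar map
#   theta <- (1/n) * sum_i x_i * tanh(theta * x_i),
# where tanh(theta*x_i) is the posterior "soft sign" of sample i.
theta = rng.standard_normal()  # random initialization, as in the talk's setting
for _ in range(int(np.sqrt(n)) + 1):  # about sqrt(n) iterations
    theta = np.mean(x * np.tanh(theta * x))

# The sign of theta is not identifiable; |theta| should be near theta_true.
print(abs(theta))
```

The sketch runs EM for roughly $\sqrt{n}$ iterations from a random start, matching the iteration budget in the abstract's convergence guarantee; with $n = 10{,}000$ samples the final $|\theta|$ lands close to the true parameter.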

Harrison Zhou is a Henry Ford II Professor and Chair of the Department of Statistics and Data Science at Yale. His main research interests include asymptotic decision theory, estimation of large covariance matrices, graphical models, Bayesian nonparametrics, statistical network analysis, sparse canonical correlation analysis and principal component analysis, and the analysis of iterative algorithms. His research has been recognized with awards including the National Science Foundation CAREER Award, the Noether Young Scholar Award from the American Statistical Association, the Tweedie Award, an IMS Medallion Lecture, and election as a Fellow of the Institute of Mathematical Statistics.

Official Bilibili account: 运筹OR帷幄

https://space.bilibili.com/403058474

Ethan X. Fang, Niao He, Junwei Lu, Zhaoran Wang,  Zhuoran Yang, Tuo Zhao