Time: November 9, 2018, 16:00-17:00
Venue: Room 1016, Mingde Main Building
Title: Distributed stochastic gradient descent with diverging dimensions
Abstract:
In this talk, we will investigate the statistical estimation error of the stochastic gradient descent (SGD) method when the dimension goes to infinity. The results are then applied to distributed statistical estimation with divide-and-conquer SGD. In particular, we will show that to achieve the optimal estimation rates, a necessary condition on the number of machines $K$ is $K = O(\sqrt{n/p})$, where $n$ is the sample size and $p$ is the dimension. To avoid this restrictive condition, we will further introduce a stochastic approximate Newton-type method for distributed statistical estimation.
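For readers unfamiliar with the divide-and-conquer scheme the abstract refers to, below is a minimal sketch for linear regression with squared loss: split the data across $K$ machines, run one pass of SGD locally on each shard, and average the local estimates. The functions `local_sgd` and `divide_conquer_sgd` and the step-size schedule are illustrative assumptions, not the speaker's actual algorithm or setting.

```python
import numpy as np

def local_sgd(X, y, lr0=0.5):
    """One pass of SGD on a single machine's shard (squared loss)."""
    n, p = X.shape
    theta = np.zeros(p)
    for i in range(n):
        grad = (X[i] @ theta - y[i]) * X[i]   # stochastic gradient at sample i
        theta -= lr0 / np.sqrt(i + 1) * grad  # decaying step size (illustrative)
    return theta

def divide_conquer_sgd(X, y, K):
    """Split data across K machines, run SGD locally, average the estimates."""
    shards = zip(np.array_split(X, K), np.array_split(y, K))
    return np.mean([local_sgd(Xk, yk) for Xk, yk in shards], axis=0)

# Toy usage: per the abstract, the optimal rate requires K = O(sqrt(n/p)).
rng = np.random.default_rng(0)
n, p = 10_000, 20
theta_star = rng.normal(size=p)
X = rng.normal(size=(n, p))
y = X @ theta_star + rng.normal(size=n)
K = int(np.sqrt(n / p))
theta_hat = divide_conquer_sgd(X, y, K)
print(np.linalg.norm(theta_hat - theta_star))  # estimation error
```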
About the Speaker:
Weidong Liu (刘卫东) is a Professor at the Institute of Natural Sciences, Shanghai Jiao Tong University. He received his Ph.D. from Zhejiang University in 2008 and conducted postdoctoral research at the Hong Kong University of Science and Technology and the Wharton School of the University of Pennsylvania from 2008 to 2011.
In 2018 he was awarded the National Science Fund for Distinguished Young Scholars. His research focuses on statistical inference for high-dimensional data, and he has published more than forty papers in the four top statistics journals (AOS, JASA, JRSSB, Biometrika) and top probability journals (AOP, PTRF).