Time: March 31, 2023, 10:00-11:00 a.m.
Venue: Room 1037, Mingde Main Building, Renmin University of China (Tencent Meeting ID: 549-125-631)
Speaker: Qian Lin (林乾)
Title: Generalization ability of wide neural networks on ℝ
Abstract:
We perform a study on the generalization ability of the wide two-layer ReLU neural network on ℝ. We first establish some spectral properties of the neural tangent kernel (NTK): a) K_d, the NTK defined on ℝ^d, is positive definite; b) λ_i(K_1), the i-th largest eigenvalue of K_1, is proportional to i^{-2}. We then show that: i) when the width m → ∞, the neural network kernel (NNK) uniformly converges to the NTK; ii) the minimax rate of regression over the RKHS associated with K_1 is n^{-2/3}; iii) if one adopts the early stopping strategy in training a wide neural network, the resulting neural network achieves the minimax rate; iv) if one trains the neural network until it overfits the data, the resulting neural network cannot generalize well. Finally, we provide an explanation to reconcile our theory and the widely observed "benign overfitting phenomenon".
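(As a brief illustration of how b) relates to ii), not the talk's own derivation: by standard kernel-regression theory, a kernel whose eigenvalues decay polynomially at rate i^{-β} admits a minimax regression rate of n^{-β/(β+1)} over its RKHS; plugging in the decay exponent β = 2 from b) gives the n^{-2/3} rate in ii):)

\[
  \lambda_i(K_1) \asymp i^{-\beta},\ \beta = 2
  \;\Longrightarrow\;
  \inf_{\hat f}\,\sup_{f^* \in \mathcal{H}_{K_1}}
  \mathbb{E}\,\bigl\|\hat f - f^*\bigr\|_{L^2}^2
  \asymp n^{-\beta/(\beta+1)} = n^{-2/3}.
\]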
Speaker Bio:
Qian Lin is an Associate Professor at the Center for Statistical Science, Tsinghua University. He received his Ph.D. from the Department of Mathematics at MIT in 2010 and has been on the faculty of Tsinghua University since August 2017. His research focuses on high-dimensional sufficient dimension reduction and the mathematical foundations of deep learning.
Scan the QR code below to register ↘
All announcements will be posted in both groups.
Please do not join more than one group.