Rangqin Mathematics · Expert Lecture
Title: Deep Neural Networks and Finite Elements
Speaker: Juncai He, Lecturer (Research Scientist) (Host: Liuqiang Zhong)
King Abdullah University of Science and Technology (KAUST)
Time: July 7, 14:30-15:30
Venue: Conference Room, 2nd Floor, West Building, School of Mathematics
Speaker Bio:
Juncai He is currently a Research Scientist at King Abdullah University of Science and Technology (KAUST). He received his B.S. degree in Pure and Applied Mathematics from Sichuan University in 2014 and his Ph.D. degree in Computational Mathematics from Peking University under the supervision of Prof. Jinchao Xu and Prof. Jun Hu. From 2019 to 2020, he was a Postdoctoral Scholar supervised by Prof. Jinchao Xu at The Pennsylvania State University. From 2020 to 2022, he was an R.H. Bing Instructor/Fellow working with Prof. Richard Tsai and Prof. Rachel Ward at UT Austin. His research focuses on mathematical analysis, algorithm development, and their applications in machine learning and scientific computing, spanning both the data and physical sciences.
Abstract:
In this talk, I will discuss our new research on the connection between finite element methods and deep neural network (DNN) functions. At the beginning of the talk, we will showcase some successful applications of neural networks in both the physical and data sciences. Then, I will recall our previous result that any linear finite element function, regardless of dimension or mesh, can be represented by a DNN with the ReLU activation function. Extending this result to finite element functions of arbitrary order has been a challenging open problem. In this presentation, we will unveil a solution to this problem. Specifically, we will demonstrate that finite element functions of any order, constructed on arbitrary simplicial meshes in any dimension, can be represented by a type of DNN equipped with appropriately selected activation functions. Additionally, we will illustrate how this ability to express finite element functions can be used to derive an approximation rate for such DNNs.
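As a minimal illustration of the linear-element case mentioned above (a sketch for this announcement, not material from the talk itself): in one dimension, the piecewise-linear hat basis function on a uniform mesh can be written exactly as a shallow ReLU network with three hidden units. The function names below are illustrative, not taken from the speaker's work.

```python
import numpy as np

def relu(x):
    """ReLU activation: max(x, 0)."""
    return np.maximum(x, 0.0)

def hat(x, xi, h):
    """Linear finite element hat function centered at node xi, mesh size h."""
    return np.maximum(0.0, 1.0 - np.abs(x - xi) / h)

def hat_via_relu(x, xi, h):
    """Exact representation of the hat function as a one-hidden-layer
    ReLU network with three neurons and output weights (1, -2, 1)/h."""
    return (relu(x - (xi - h)) - 2.0 * relu(x - xi) + relu(x - (xi + h))) / h

# The two expressions agree pointwise on the whole real line.
x = np.linspace(-1.0, 2.0, 1001)
assert np.allclose(hat(x, 0.5, 0.25), hat_via_relu(x, 0.5, 0.25))
```

Since every linear finite element function is a linear combination of such hat functions, this identity already gives the 1D case of the representation result; the talk addresses the far harder setting of arbitrary order, dimension, and mesh.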
Faculty and students are welcome to attend and join the discussion!