
勷勤数学 • Expert Lecture


Title: Transformer Meets Boundary Value Inverse Problems: Structure-Conforming Operator Learning


Speaker: Prof. Ruchi Guo (Host: Liuqiang Zhong)

The Chinese University of Hong Kong


Time: July 6, 10:30–11:30


Venue: Conference Room, 2nd Floor, West Building, School of Mathematical Sciences


Speaker Biography:

       Ruchi Guo received his Ph.D. from Virginia Tech in 2019. He subsequently served as a Zassenhaus Assistant Professor at The Ohio State University and as a Visiting Assistant Professor at the University of California, Irvine, during which time his research was funded by the U.S. National Science Foundation (NSF). He is currently a Research Assistant Professor at The Chinese University of Hong Kong. His research lies in scientific computing, in particular numerical methods for partial differential equations, including unfitted-mesh methods for interface problems and reconstruction algorithms for interface inverse problems, such as immersed finite element methods, virtual element methods, optimization algorithms, and deep learning algorithms. He has published in leading computational mathematics journals including SIAM J. Numer. Anal., M3AS, SIAM J. Sci. Comput., J. Comput. Phys., IMA J. Numer. Anal., ESAIM: M2AN, J. Sci. Comput., and Comput. Methods Appl. Mech. Eng., as well as at the deep learning conference ICLR.


Abstract:

       A Transformer-based deep direct sampling method is proposed for solving a class of boundary value inverse problems. Real-time reconstruction is achieved by evaluating the learned inverse operator that maps carefully designed data to the reconstructed images. An effort is made to give a case study for a fundamental and critical question: whether and how one can benefit from the theoretical structure of a mathematical problem to develop task-oriented and structure-conforming deep neural networks. Inspired by direct sampling methods for inverse problems, the 1D boundary data are preprocessed by a partial differential equation-based feature map to yield 2D harmonic extensions in different frequency input channels. Then, by introducing a learnable non-local kernel, the approximation in direct sampling is recast as a modified attention mechanism. The proposed method is applied to electrical impedance tomography, a well-known severely ill-posed nonlinear inverse problem. The new method achieves superior accuracy over its predecessors and contemporary operator learners, and shows robustness with respect to noise. This research strengthens the insight that the attention mechanism, despite being invented for natural language processing tasks, offers great flexibility to be modified in conformity with a priori mathematical knowledge, ultimately leading to the design of more physics-compatible neural architectures.
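To make the "attention as a learnable non-local kernel" idea concrete, below is a minimal, purely illustrative sketch. It is not the speaker's implementation: the sizes, random stand-in weights, and the single-head formulation are all assumptions. It only shows how a softmax attention matrix acts as a non-local kernel over grid points, playing a role analogous to the integral pairing of probing functions against data in classical direct sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: n grid points of a coarse 2D mesh (flattened),
# c input channels (each a harmonic extension of 1D boundary data at a
# different frequency), d attention width. All values are placeholders.
n, c, d = 64, 4, 16

X = rng.standard_normal((n, c))   # preprocessed 2D feature channels

# Random stand-ins for learned projection weights.
Wq = rng.standard_normal((c, d))
Wk = rng.standard_normal((c, d))
Wv = rng.standard_normal((c, 1))

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Single-head scaled dot-product attention: the (n, n) matrix A is a
# learnable non-local kernel; each output value is a weighted average
# over all grid points.
Q, K, V = X @ Wq, X @ Wk, X @ Wv
A = softmax(Q @ K.T / np.sqrt(d))   # rows sum to 1: a discrete kernel
index_map = A @ V                   # one "index function" value per grid point
```

Here `index_map` stands in for the direct-sampling index function evaluated on the reconstruction grid; in a trained network the projections would be learned rather than random.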


          Faculty members and students are welcome to attend and exchange ideas!