Transformer meets boundary value inverse problems: structure-conforming operator learning



Speaker: Ruchi Guo, Visiting Assistant Professor, University of California, Irvine


Time: 10:00, April 26, 2023


Venue: Tencent Meeting 345 803 597


Host: School of Mathematics and Physics (数理学院)


About the speaker: Dr. Ruchi Guo received his Ph.D. from Virginia Tech in 2019, then served as a Zassenhaus Assistant Professor at The Ohio State University, and is currently a Visiting Assistant Professor at the University of California, Irvine. His research focuses on scientific computing, in particular numerical methods for partial differential equations, including unfitted-mesh methods for interface problems and reconstruction algorithms for interface inverse problems; specific topics include immersed finite element methods, virtual element methods, and optimization-based, direct, and deep-learning algorithms for inverse problems. He has published numerous papers in computational mathematics journals such as SIAM J. Numer. Anal., M3AS, SIAM J. Sci. Comput., J. Comput. Phys., IMA J. Numer. Anal., ESAIM: M2AN, and J. Sci. Comput.


Abstract: A Transformer-based deep direct sampling method is proposed for solving a class of boundary value inverse problems. Real-time reconstruction is achieved by evaluating the learned inverse operator that maps carefully designed boundary data to the reconstructed images. An effort is made to give a case study for a fundamental and critical question: whether and how one can benefit from the theoretical structure of a mathematical problem to develop task-oriented and structure-conforming deep neural networks. Inspired by direct sampling methods for inverse problems, the 1D boundary data are preprocessed by a partial differential equation-based feature map to yield 2D harmonic extensions as different frequency input channels. Then, by introducing a learnable non-local kernel, the approximation of the direct sampling method is recast as a modified attention mechanism. The proposed method is applied to electrical impedance tomography, a well-known severely ill-posed nonlinear inverse problem. The new method achieves superior accuracy over its predecessors and contemporary operator learners, and shows robustness with respect to noise. This research strengthens the insight that the attention mechanism, despite being invented for natural language processing tasks, offers great flexibility to be modified in conformity with a priori mathematical knowledge, which ultimately leads to the design of more physics-compatible neural architectures.
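To give a concrete picture of the two ingredients described in the abstract, the following minimal NumPy sketch (an illustration, not the speaker's implementation) shows (i) a harmonic extension of 1D Dirichlet boundary data to a 2D image channel via Jacobi iteration, and (ii) a softmax attention layer acting on flattened grid points, where the Q-K product plays the role of a learnable non-local kernel. All function names, the grid setup, and the weight matrices Wq, Wk, Wv are assumptions made for illustration only.

```python
import numpy as np

def harmonic_extension(g_top, g_bottom, g_left, g_right, n_iter=5000):
    """Extend 1D Dirichlet boundary data on the four edges of a square grid to a
    2D harmonic function (discrete Laplace equation) via Jacobi iteration."""
    n = len(g_top)
    u = np.zeros((n, n))
    u[0, :], u[-1, :] = g_top, g_bottom      # top / bottom edges
    u[:, 0], u[:, -1] = g_left, g_right      # left / right edges
    for _ in range(n_iter):
        u[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1]
                                + u[1:-1, :-2] + u[1:-1, 2:])
    return u  # one 2D input channel built from 1D boundary measurements

def nonlocal_attention(x, Wq, Wk, Wv):
    """Softmax attention over flattened grid points x of shape (N, d); the
    Q K^T matrix acts as a learnable, data-dependent non-local kernel."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])             # (N, N) pairwise kernel
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)                  # row-wise softmax
    return w @ v                                        # non-local aggregation

# Illustrative usage: two frequency channels stacked per grid point.
n, d = 32, 2
rng = np.random.default_rng(0)
theta = np.linspace(0, 2 * np.pi, n)
channels = [harmonic_extension(np.sin(k * theta), np.sin(k * theta),
                               np.cos(k * theta), np.cos(k * theta))
            for k in (1, 2)]
x = np.stack(channels, axis=-1).reshape(-1, d)          # (n*n, d) tokens
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = nonlocal_attention(x, Wq, Wk, Wv)                 # (n*n, d) features
```

In the setting of the talk, several such harmonic extensions at different frequencies serve as the input channels, and the attention kernel is modified in conformity with the direct sampling operator rather than used off the shelf.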
