About me
I am a postdoctoral fellow at the Institute for Foundations of Data Science (FDS), Yale University. I obtained my PhD in Applied Mathematics and Computational Science at the University of Pennsylvania in 2023, advised by Prof. Paris Perdikaris. Before that, I received my B.S. degree in Mathematics from Wuhan University. My research interests lie at the intersection of machine learning, scientific computing, and computational physics. I am particularly interested in developing scalable and robust algorithms for solving partial differential equations, and in leveraging these algorithms to tackle challenging problems in science and engineering.
News
[02/2025] Our new paper “Sharp-PINNs: Staggered hard-constrained physics-informed neural networks for phase field modelling of corrosion” is now available on arXiv.
[02/2025] Our new paper “Gradient Alignment in Physics-informed Neural Networks: A Second-Order Optimization Perspective” is now available on arXiv.
[01/2025] Our work “CViT: Continuous Vision Transformer for Operator Learning” has been accepted to ICLR 2025.
[12/2024] Our paper “PirateNets: Physics-informed Deep Learning with Residual Adaptive Networks” has been accepted to JMLR.
[09/2024] Our paper “Micrometer: Micromechanics Transformer for Predicting Mechanical Responses of Heterogeneous Materials” is now available on arXiv.
[05/2024] Our paper “Bridging Operator Learning and Conditioned Neural Fields: A Unifying Perspective” is now available on arXiv.
[02/2024] Our new paper “PirateNets: Physics-informed Deep Learning with Residual Adaptive Networks” is now available on arXiv.
[01/2024] Our paper “Respecting causality for training physics-informed neural networks” has been accepted to CMAME.