Multiscale interactions in nonlinear physical phenomena have long posed a challenge for scientific computing. Researchers Changhong Mou, Yeyu Zhang, Xuewen Zhu, and Qiao Zhuang address this challenge with PAS-Net, a physics-informed Adaptive-Scale Deep Operator Network. The method is designed to learn solution operators of nonlinear and singularly perturbed evolution partial differential equations (PDEs), problems characterized by small parameters and localized features.
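To make the problem class concrete, a singularly perturbed evolution PDE is one in which a small parameter multiplies the highest-order term; the generic form below is an illustrative sketch rather than the paper's exact setup:

$$ \partial_t u + \mathcal{N}(u) = \varepsilon\, \partial_{xx} u, \qquad 0 < \varepsilon \ll 1 . $$

As $\varepsilon$ shrinks, solutions develop thin internal or boundary layers whose width vanishes with $\varepsilon$, which is precisely the kind of localized feature that standard neural operators resolve poorly.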
PAS-Net builds on the physics-informed Deep Operator Network (PI-DeepONet) by augmenting its trunk input with a prescribed or learnable locally rescaled coordinate transformation centered at reference points, introducing a multiscale feature embedding. Because this embedding acts as an architecture-independent preconditioner, it significantly improves the representation of localized, stiff, and multiscale dynamics.
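As a rough illustration of what such a trunk-input augmentation can look like, the PyTorch sketch below appends a locally rescaled coordinate, centered at a reference point x_ref with a learnable scale delta, to the raw coordinate before it enters the trunk network. The reference point, the exponential parameterization of the scale, and the tanh squashing are our assumptions for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class AdaptiveScaleEmbedding(nn.Module):
    """Augment trunk coordinates with a locally rescaled feature centered at x_ref.

    The rescaled coordinate xi = (x - x_ref) / delta magnifies the neighborhood of
    the reference point, so steep local features appear O(1) to the network.
    x_ref and delta are illustrative; they could be prescribed or learned.
    """
    def __init__(self, x_ref=0.0, log_delta_init=-3.0, learnable=True):
        super().__init__()
        x_ref = torch.tensor([float(x_ref)])
        log_delta = torch.tensor([float(log_delta_init)])  # delta = exp(log_delta) stays positive
        if learnable:
            self.x_ref = nn.Parameter(x_ref)
            self.log_delta = nn.Parameter(log_delta)
        else:
            self.register_buffer("x_ref", x_ref)
            self.register_buffer("log_delta", log_delta)

    def forward(self, x):
        # x: (batch, 1) coordinate; append the rescaled coordinate as an extra feature
        xi = (x - self.x_ref) * torch.exp(-self.log_delta)
        return torch.cat([x, torch.tanh(xi)], dim=-1)  # tanh keeps the feature bounded

# Example: feed augmented coordinates into a small trunk network
trunk = nn.Sequential(AdaptiveScaleEmbedding(x_ref=0.5), nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 64))
x = torch.linspace(0, 1, 101).unsqueeze(-1)
print(trunk(x).shape)  # torch.Size([101, 64])
```

Since the embedding only changes the input features, it can be attached to essentially any trunk architecture, which is what lets it behave like an architecture-independent preconditioner.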
From an optimization perspective, the adaptive-scale embedding modifies the geometry of the network's Neural Tangent Kernel (NTK), increasing its smallest eigenvalue; this improves spectral conditioning and accelerates gradient-based convergence. The authors show that the adaptive-scale mechanism explicitly speeds up training when approximating functions with steep transitions and strong asymptotic behavior, and they prove this function-approximation result rigorously within the finite-dimensional NTK matrix framework.
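For intuition, the standard NTK-regime argument (stated here in its generic form, not as the paper's exact theorem) bounds the training residual under gradient flow on the squared loss, with a constant NTK Gram matrix $K$, by

$$ \|u(\theta(t)) - y\|_2 \;\le\; e^{-\lambda_{\min}(K)\, t}\, \|u(\theta(0)) - y\|_2 , $$

so any mechanism that raises $\lambda_{\min}(K)$, as the adaptive-scale embedding is argued to do, directly tightens this convergence rate.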
To validate PAS-Net, the researchers test it on three benchmark problems: the one-dimensional viscous Burgers equation, a nonlinear diffusion-reaction system with sharp spatial gradients, and a two-dimensional eikonal equation. Across all three, PAS-Net consistently achieves higher accuracy and faster convergence than the standard DeepONet and PI-DeepONet models at comparable training cost.
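The first of these benchmarks illustrates why an adaptive scale helps. The one-dimensional viscous Burgers equation reads

$$ \partial_t u + u\, \partial_x u = \nu\, \partial_{xx} u , $$

where a small viscosity $\nu$ produces near-discontinuous fronts of width $O(\nu)$ that a uniformly scaled coordinate input resolves poorly. (The specific viscosity and the initial and boundary conditions used in the paper are not reproduced here.)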
This research opens new avenues for solving complex multiscale problems in nonlinear physics. Potential applications range from fluid dynamics to materials science, and the adaptive-scale idea may carry over more broadly to machine-learning methods for scientific computing. As the boundaries of scientific computing continue to expand, PAS-Net illustrates what careful interplay between numerical analysis and deep learning can achieve.



