Self-supervised neural operator for solving partial differential equations
Wen You, Shaoqian Zhou, Xuhui Meng
Published: 2025/8/31
Abstract
Neural operators (NOs) provide a new paradigm for efficiently solving partial differential equations (PDEs), but their training depends on costly high-fidelity data from numerical solvers, limiting applications in complex systems. We propose a self-supervised neural operator (SNO) that generates accurate and diverse training data on the fly without numerical solvers. SNO consists of three components: a physics-informed sampler (PI-sampler) based on Bayesian PINNs for efficient data generation, a function encoder (FE) for compact input-output representations, and an encoder-only Transformer for operator learning, which maps boundary/initial conditions, source terms, and geometries to PDE solutions. We validate SNO on 1D steady/unsteady nonlinear reaction-diffusion equations, a 2D nonlinear PDE with varying geometries, and vortex-induced vibration of a flexible cylinder in fluid dynamics. SNO achieves high accuracy in all cases, and lightweight finetuning (O(100) trainable parameters) further improves predictions within only a few hundred steps. This work provides a new route toward pretrained foundation models as efficient PDE surrogates.
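
The sketch below is a minimal, illustrative rendering (not the authors' implementation) of the inference pipeline the abstract describes: a function encoder (FE) compresses discretized input functions (e.g., a boundary/initial condition and a source term) into latent tokens, and an encoder-only Transformer maps those tokens to a predicted PDE solution. All module names, layer sizes, and the grid resolution (FunctionEncoder, SNO, latent_dim, n_points) are assumptions for illustration only; the PI-sampler used to generate training data is not shown.

```python
# Hypothetical sketch of the SNO forward pass; sizes and names are assumed.
import torch
import torch.nn as nn


class FunctionEncoder(nn.Module):
    """Encode a discretized input function (values on a fixed grid) into a latent token."""
    def __init__(self, n_points: int, latent_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_points, 128), nn.GELU(), nn.Linear(128, latent_dim)
        )

    def forward(self, f: torch.Tensor) -> torch.Tensor:
        # (batch, n_points) -> (batch, latent_dim)
        return self.net(f)


class SNO(nn.Module):
    """Encoder-only Transformer over latent tokens from several input functions."""
    def __init__(self, n_points: int, latent_dim: int = 64, n_out: int = 101):
        super().__init__()
        self.fe_bc = FunctionEncoder(n_points, latent_dim)   # boundary/initial condition
        self.fe_src = FunctionEncoder(n_points, latent_dim)  # source term
        layer = nn.TransformerEncoderLayer(d_model=latent_dim, nhead=4, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=4)
        self.decoder = nn.Linear(latent_dim, n_out)          # decode to solution values on a grid

    def forward(self, bc: torch.Tensor, src: torch.Tensor) -> torch.Tensor:
        # Stack one latent token per input function, mix them with self-attention,
        # pool, and decode to the solution grid.
        tokens = torch.stack([self.fe_bc(bc), self.fe_src(src)], dim=1)  # (batch, 2, latent_dim)
        h = self.transformer(tokens).mean(dim=1)
        return self.decoder(h)                                           # (batch, n_out)


# Usage: predict solutions for a batch of sampled boundary conditions and source terms.
model = SNO(n_points=101)
bc = torch.randn(8, 101)
src = torch.randn(8, 101)
u_pred = model(bc, src)  # (8, 101) predicted PDE solution on the output grid
```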