PINNAS: Physics-Informed Neural Network Architecture Search for Solving Integral Equations
Abstract
Physics-informed neural networks (PINNs) have emerged as a powerful framework for solving partial differential equations (PDEs) by incorporating physical laws into neural network training. However, their effectiveness on integral equations (IEs) remains constrained by suboptimal manual architecture design and by the computational inefficiency of existing neural architecture search (NAS) methods. This paper introduces PINNAS (Physics-Informed Neural Architecture Search), a framework that integrates gradient-based NAS with PINNs to automate architecture optimization for solving IEs. Its key innovations are: 1) a differentiable neural architecture search (DNAS) mechanism that relaxes the discrete search space into a continuous domain, enabling joint optimization of network weights and architecture parameters; 2) dynamic masking techniques that resolve tensor-shape mismatches between variable-width layers; and 3) domain-specific search spaces tailored to integral operators. Extensive experiments on six IE types (one- and two-dimensional, linear and nonlinear, Volterra and Fredholm) demonstrate that PINNAS-optimized architectures achieve 15%--30% lower mean squared error (MSE) than manually designed networks while using 40% fewer parameters. Crucially, we find that non-uniform layer widths outperform uniform configurations, challenging conventional NAS practice. This work bridges automated machine learning and scientific computing, offering a scalable strategy for solving IEs.
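To make the two core mechanisms named above concrete, the following is a minimal NumPy sketch of a DARTS-style continuous relaxation of a layer-width choice combined with dynamic masking. All names (`CANDIDATE_WIDTHS`, `mixed_layer`, the tanh activation) are illustrative assumptions, not the paper's actual implementation: the layer is allocated at the maximum candidate width, each candidate width contributes a binary mask, and a softmax over architecture parameters `alpha` blends the masks into a soft mask, so the width choice becomes differentiable and tensor shapes stay fixed.

```python
import numpy as np

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

# Hypothetical width search space for one hidden layer.
CANDIDATE_WIDTHS = [8, 16, 32]
MAX_W = max(CANDIDATE_WIDTHS)

def width_masks(widths, max_w):
    # One binary mask per candidate width: zeros out units beyond that width.
    return np.array([[1.0] * w + [0.0] * (max_w - w) for w in widths])

def mixed_layer(x, W, b, alpha):
    # W: (in_dim, MAX_W), b: (MAX_W,), alpha: one logit per candidate width.
    h = np.tanh(x @ W + b)                       # full-width activation
    probs = softmax(alpha)                       # continuous relaxation of the discrete choice
    mix = (probs[:, None] * width_masks(CANDIDATE_WIDTHS, MAX_W)).sum(axis=0)
    return h * mix                               # soft mask keeps shapes fixed at MAX_W

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))
W = rng.normal(size=(3, MAX_W))
b = np.zeros(MAX_W)
alpha = np.zeros(len(CANDIDATE_WIDTHS))          # architecture params, learned jointly with W, b

out = mixed_layer(x, W, b, alpha)
print(out.shape)  # (4, 32)
```

Because `alpha` enters only through a softmax and an elementwise product, gradients flow to it through the same loss as the weights; after search, the layer is discretized to the width with the largest `alpha` entry.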