Evolutionary Optimization of Physics-Informed Neural Networks: Evo-PINN Frontiers and Opportunities

Journal Title:
IEEE Computational Intelligence Magazine
Publication Date:
14 January 2026
Citation:
J. C. Wong, A. Gupta, C. C. Ooi, P.-H. Chiu, J. Liu and Y.-S. Ong, "Evolutionary Optimization of Physics-Informed Neural Networks: Evo-PINN Frontiers and Opportunities," in IEEE Computational Intelligence Magazine, vol. 21, no. 1, pp. 16-36, Feb. 2026, doi: 10.1109/MCI.2025.3607749. Keywords: Deep learning; Technological innovation; Machine learning algorithms; Translation; Computational modeling; Neural networks; Mathematical models; Data models; Optimization; Evolutionary computation.
Abstract:
Deep learning models trained on finite data lack a complete understanding of the physical world. Physics-informed neural networks (PINNs), by contrast, are infused with such knowledge through the incorporation of mathematically expressible laws of nature into their training loss function. By complying with physical laws, PINNs offer advantages over purely data-driven models in limited-data regimes and represent a promising route toward Physical AI. This feature has propelled them to the forefront of scientific machine learning, a domain characterized by scarce and costly data. However, the vision of accurate physics-informed learning comes with significant challenges. This work examines PINNs in terms of model optimization and generalization, shedding light on the need for new algorithmic advances to overcome issues pertaining to the training speed, precision, and generalizability of today's PINN models. Of particular interest are gradient-free evolutionary algorithms (EAs) for optimizing the uniquely complex loss landscapes arising in PINN training. Methods synergizing gradient descent and EAs for discovering bespoke neural architectures and balancing multiple terms in physics-informed learning objectives are positioned as important avenues for future research. Another exciting track is to cast EAs as meta-learners of generalizable PINN models. To substantiate these proposed avenues, we highlight results from recent literature showcasing the early success of such approaches in addressing the aforementioned challenges in PINN optimization and generalization.
License type:
Publisher Copyright
Funding Info:
This research / project is supported by the NRF - AI-based urban cooling technology development
Grant Reference no. : AISG3-TC-2024-014-SGKR
Description:
© 2026 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
ISSN:
1556-603X
1556-6048