Current Research and Scholarly Interests
Professor Darve's research focuses on the development of numerical methods for high-performance scientific computing, numerical linear algebra, fast algorithms, parallel computing, anomaly detection, and machine learning, with applications in engineering. In many engineering applications, the computational cost of simulating large, complex systems is substantial and, in many instances, beyond current computing capabilities. The Darve research group develops innovative numerical techniques to reduce this cost and enable the simulation of complex systems over realistic timescales.

Keywords: numerical linear algebra (fast linear solvers, fast QR factorization, eigenvalue solvers; applications in geoscience and the electric power grid); physics-informed machine learning (inverse modeling using PhysML, autoencoders, GANs for uncertainty quantification in predictive and inverse modeling, Kriging and statistical inversion; applications in geoscience, fluid mechanics, and computational mechanics); anomaly detection (GAN-based algorithms, self-supervised machine learning; applications with Ford and the SLAC linear accelerator); reinforcement learning for engineering applications (optimal control; applications in 3D metal printing).