Numerical Instability in Automatic Differentiation for Scientific Machine Learning

2025-09-18
Scientific machine learning (SciML) relies heavily on automatic differentiation (AD) for gradient-based optimization. This talk examines the numerical challenges of AD, particularly its stability and robustness when applied to ordinary differential equations (ODEs) and partial differential equations (PDEs). Using examples from JAX and PyTorch, the presentation demonstrates how inaccuracies in AD can produce significant gradient errors (60% or more) even on simple linear ODEs. The speaker will discuss the non-standard modifications implemented in the Julia SciML libraries to address these issues, and the engineering trade-offs they entail.
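To make the setting concrete, here is a minimal sketch (not the talk's own code, which uses JAX/PyTorch) of differentiating *through* an ODE solver: forward-mode AD via dual numbers applied to an explicit Euler solve of the linear ODE u' = a·u. The key point is that AD differentiates the discretized solver, so the computed sensitivity inherits the solver's truncation error on top of floating-point round-off.

```python
# Forward-mode AD with dual numbers, differentiating an explicit Euler
# solve of u' = a*u, u(0) = 1, with respect to the parameter a.
# Illustrative toy only; the talk's examples use JAX and PyTorch.
import math

class Dual:
    """Number of the form v + d*eps with eps**2 = 0 (forward-mode AD)."""
    def __init__(self, v, d=0.0):
        self.v, self.d = v, d
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.v + o.v, self.d + o.d)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.v * o.v, self.v * o.d + self.d * o.v)
    __rmul__ = __mul__

def euler_solve(a, T=1.0, n=1000):
    """Explicit Euler for u' = a*u on [0, T] with u(0) = 1."""
    h = T / n
    u = Dual(1.0) if isinstance(a, Dual) else 1.0
    for _ in range(n):
        u = u + h * (a * u)   # u_{k+1} = u_k + h * a * u_k
    return u

a = Dual(0.5, 1.0)            # seed the derivative direction d/da
u = euler_solve(a)
# Exact solution u(T) = exp(a*T); exact sensitivity du/da = T*exp(a*T).
exact_u, exact_du = math.exp(0.5), 1.0 * math.exp(0.5)
err_u, err_du = abs(u.v - exact_u), abs(u.d - exact_du)
print(err_u, err_du)          # the gradient error exceeds the solution error
```

Note that the sensitivity error is larger than the solution error itself: AD computes the exact derivative of the *discretization*, not of the underlying continuous solution, which is one face of the instabilities the talk explores.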


Explicit vs. Implicit ODE Solvers: Stability, Robustness, and Practical Implications

2025-09-16
This article delves into the strengths and weaknesses of explicit and implicit ordinary differential equation (ODE) solvers. While implicit methods are often considered more robust due to their superior stability, the author argues that explicit methods can be preferable for certain problems, especially those requiring the preservation of oscillations. Through linear ODE analysis, the concept of stability regions, and real-world examples (like cooling and oscillatory systems), the article illustrates the performance of both methods in different scenarios. It emphasizes that selecting the appropriate solver requires a nuanced understanding of the problem at hand, rather than a blanket approach.
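The stability contrast the article analyzes can be sketched on the standard stiff test problem u' = -λu, u(0) = 1 (a plain-Python illustration of the linear analysis, not the article's code). Explicit Euler amplifies each step by (1 - hλ) and is stable only when |1 - hλ| ≤ 1, i.e. h ≤ 2/λ; implicit Euler damps by 1/(1 + hλ) and is stable for any h > 0 when λ > 0.

```python
# Explicit vs. implicit Euler on the stiff test problem u' = -lam*u,
# u(0) = 1, with a step size outside the explicit stability region.

def explicit_euler(lam, h, n):
    u = 1.0
    for _ in range(n):
        u = u + h * (-lam * u)     # u_{k+1} = (1 - h*lam) * u_k
    return u

def implicit_euler(lam, h, n):
    u = 1.0
    for _ in range(n):
        u = u / (1.0 + h * lam)    # solves u_{k+1} = u_k - h*lam*u_{k+1}
    return u

lam, h, n = 100.0, 0.05, 40        # h*lam = 5 > 2: explicit Euler is unstable
ue = explicit_euler(lam, h, n)     # per-step factor 1 - 5 = -4: diverges
ui = implicit_euler(lam, h, n)     # per-step factor 1/6: decays, as exp(-lam*t) should
print(ue, ui)
```

The same damping that makes implicit Euler unconditionally stable is what the article cautions about for oscillatory systems: applied to an undamped oscillator, that per-step contraction artificially kills oscillations that an explicit method (run within its stability region) would preserve.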
