Numerical Instability in Automatic Differentiation for Scientific Machine Learning
Scientific machine learning (SciML) relies heavily on automatic differentiation (AD) for gradient-based optimization. However, this talk examines the numerical challenges of AD, particularly its stability and robustness when applied to ordinary differential equations (ODEs) and partial differential equations (PDEs). Using examples from JAX and PyTorch, the presentation demonstrates how inaccuracies in AD can lead to significant errors (60% or more) even in simple linear ODEs. The speaker will discuss non-standard modifications implemented in the Julia SciML libraries to address these issues and the engineering trade-offs they involve.
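To make the setup concrete, here is a minimal, hypothetical JAX sketch (not taken from the talk) of differentiating through an ODE solver's time-stepping loop for a simple linear ODE and comparing the result to the analytic sensitivity. The solver choice (fixed-step forward Euler), step count, and parameter values are illustrative assumptions; AD here differentiates the discrete scheme exactly, so the mismatch shown is the gap between the derivative of the discretization and the derivative of the true solution, one of the error sources the talk explores in more detail.

```python
# Illustrative sketch (assumptions, not the talk's code): differentiate an
# ODE solution with respect to a parameter and compare to the analytic value.
# Linear ODE: u'(t) = a*u(t), u(0) = u0, so u(T) = u0*exp(a*T)
# and the true sensitivity is du(T)/da = u0*T*exp(a*T).
import jax
import jax.numpy as jnp

def solve_euler(a, u0=1.0, T=1.0, n_steps=100):
    """Integrate u' = a*u with fixed-step forward Euler (for illustration only)."""
    dt = T / n_steps
    def step(u, _):
        return u + dt * a * u, None
    u_final, _ = jax.lax.scan(step, u0, None, length=n_steps)
    return u_final

a = 2.0
grad_ad = jax.grad(solve_euler)(a)        # AD through the discrete solver loop
grad_exact = 1.0 * 1.0 * jnp.exp(a * 1.0) # analytic sensitivity u0*T*exp(a*T)
print(grad_ad, grad_exact)                # the two differ by the discretization error
```

Refining the step size shrinks this particular gap, but the talk's examples concern subtler failure modes (e.g., adaptive solvers and stiff problems) where naively differentiating through the solver remains inaccurate.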