Scientific Machine Learning

Objective

Scientific Machine Learning (SciML) aims to solve complex problems in science and engineering by combining scientific computing with Machine Learning (ML). SciML incorporates uncertainty quantification, uncertainty propagation, and physics-based models to handle high-dimensional, noisy, multi-scale, and sparse data. This course equips students with state-of-the-art tools such as physics-informed neural networks, neural operators, and polynomial chaos theory to prepare them for the next wave of data science.

Modus Operandi

  • Classroom activities: 10%
    • Participate in classroom discussions and ask quality questions
    • Five-minute teaching: pick any small topic taught in class before your scheduled slot and teach it for five minutes.
    • Attendance
  • Course project: 20%
  • Mid-sem: 30%
  • End-sem: 40%

Pre-requisite

Foundations of Machine Learning (AI60203)

Syllabus

  • Module 1: Introduction to Scientific ML
  • Module 2: Neural Differential Equations
  • Module 3: Physics-Informed Neural Networks
  • Module 4: Neural Operators
  • Module 5: Gaussian Process (GP) Regression
  • Module 6: Uncertainty Quantification
| S.N. | Module | Unit | Hours (48) |
|------|--------|------|------------|
| 1 | Introduction to Scientific ML | Basics of Artificial Intelligence, Machine Learning, Deep Learning, Optimization and Python libraries | 2 |
| 2 | | Differences between general ML and scientific ML, motivating applications of scientific ML | 2 |
| 3 | Neural Differential Equations | Neural ODEs, Multistep neural networks | 2 |
| 4 | | Recurrent neural networks, Structure-preserving neural networks (Symplectic and Poisson nets, generic framework) | 2 |
| 5 | Physics-Informed Neural Networks | PINN for Burgers' Equation and Boundary Value Problems | 2 |
| 6 | | Constrained and Variational Neural Networks | 2 |
| 7 | | Convergence Theory | 2 |
| 8 | | Extensions (Gradient-enhanced PINNs, Conservative PINNs, Extended PINNs, PINNs for fractional PDEs, PINNs for stochastic PDEs) | 4 |
| 9 | Neural Operators | Universal approximation theorem for operators | 2 |
| 10 | | DeepONet (Theory, convergence, variations) | 2 |
| 11 | | Transforms and operators (Fourier, Laplace and Wavelet and their extensions) | 4 |
| 12 | Gaussian Process Regression | GP for multi-fidelity data | 2 |
| 13 | | Physics-informed and NN-induced GP kernels | 4 |
| 14 | | Deep multi-fidelity and diffusion-manifold driven GP | 4 |
| 15 | Uncertainty Quantification | Methods for UQ | 2 |
| 16 | | Deep Ensembles and Snapshot Ensembles | 2 |
| 17 | | BNN and BPINN | 4 |
| 18 | | Unified view (UQ in PINNs, UQ in DeepONet) | 2 |
| 19 | | Polynomial Chaos Theory | 2 |
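
To make one of the syllabus topics concrete, here is a minimal Gaussian Process regression sketch (Module 5) in plain NumPy. The squared-exponential kernel, its hyperparameters, and the sine test function are illustrative assumptions for this sketch, not prescriptions from the course material:

```python
import numpy as np

def rbf_kernel(xa, xb, length_scale=1.0, variance=1.0):
    # Squared-exponential (RBF) covariance between two sets of 1-D inputs.
    d = xa[:, None] - xb[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-6):
    # Standard GP regression: posterior mean and covariance at x_test,
    # computed via a Cholesky factorization of the training covariance.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    Kss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)                                  # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))  # K^{-1} y
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    cov = Kss - v.T @ v
    return mean, cov

# Toy data: noisy-free observations of sin(x) at five points.
x_train = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y_train = np.sin(x_train)
mean, cov = gp_posterior(x_train, y_train, np.array([0.5]))
```

With near-zero observation noise the posterior mean interpolates the training targets; increasing `noise` trades interpolation for smoothing, which is the usual knob for noisy scientific data.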

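As a second illustration, the Polynomial Chaos Theory unit (Module 6) can be sketched in a few lines: expand f(X), with X ~ N(0,1), in probabilists' Hermite polynomials and read the mean and variance directly off the coefficients. The test function f(x) = x² is an illustrative assumption chosen so the expansion is exact at low order:

```python
import numpy as np
from math import factorial

def pce_coefficients(f, order):
    # Project f(X), X ~ N(0,1), onto probabilists' Hermite polynomials He_k:
    #   c_k = E[f(X) He_k(X)] / k!
    # using Gauss-Hermite quadrature matched to the standard normal weight.
    nodes, weights = np.polynomial.hermite_e.hermegauss(order + 1)
    weights = weights / np.sqrt(2 * np.pi)   # normalize to the N(0,1) density
    coeffs = []
    for k in range(order + 1):
        He_k = np.polynomial.hermite_e.hermeval(nodes, [0] * k + [1])
        coeffs.append(np.sum(weights * f(nodes) * He_k) / factorial(k))
    return np.array(coeffs)

c = pce_coefficients(lambda x: x ** 2, 4)
mean = c[0]                                                   # E[f(X)]
variance = sum(c[k] ** 2 * factorial(k) for k in range(1, 5)) # Var[f(X)]
```

For f(x) = x² the expansion terminates at order 2, since x² = He₂(x) + 1, so the recovered mean is 1 and the variance is 2, matching E[X²] = 1 and Var(X²) = 2 for a standard normal.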
Resources

  1. Deep Learning for Science and Engineering Teaching Kit by NVIDIA and Brown University
  2. Rackauckas, C., 2022. Parallel computing and scientific machine learning (SciML): Methods and applications [online]
  3. Lecture notes on “Introduction to Scientific Machine Learning” by Ilias Bilionis
  4. Smith, E., 2022. Scientific Machine Learning with PyTorch, Introduction to the Tools of Scientific Computing (Vol. 25). Springer Nature.
  5. Garikipati, K., 2024. Data-driven Modelling and Scientific Machine Learning in Continuum Physics, Springer.
  6. Baker, N., Alexander, F., Bremer, T., Hagberg, A., Kevrekidis, Y., Najm, H., Parashar, M., Patra, A., Sethian, J., Wild, S. and Willcox, K., 2019. Workshop report on basic research needs for scientific machine learning: Core technologies for artificial intelligence. USDOE Office of Science (SC), Washington, DC (United States).
  7. Brunton, S.L. and Kutz, J.N., 2022. Data-driven science and engineering: Machine learning, dynamical systems, and control. Cambridge University Press.
  8. Géron, A., 2022. Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow. O'Reilly Media.
  9. Subramanian, S., Harrington, P., Keutzer, K., Bhimji, W., Morozov, D., Mahoney, M.W. and Gholami, A., 2024. Towards foundation models for scientific machine learning: Characterizing scaling and transfer behavior. Advances in Neural Information Processing Systems, 36.
  10. Lagaris, I.E., Likas, A. and Fotiadis, D.I., 1998. Artificial neural networks for solving ordinary and partial differential equations. IEEE Transactions on Neural Networks.
  11. Raissi, M., Perdikaris, P. and Karniadakis, G.E., 2019. Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. Journal of Computational Physics.
  12. Kovachki, N., Li, Z., Liu, B., Azizzadenesheli, K., Bhattacharya, K., Stuart, A. and Anandkumar, A., 2023. Neural operator: Learning maps between function spaces with applications to PDEs. Journal of Machine Learning Research.
  13. Lu, L., Jin, P., Pang, G., Zhang, Z. and Karniadakis, G.E., 2021. Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators. Nature Machine Intelligence.
  14. Mishra, P.K., Paulson, J.A. and Braatz, R.D., 2024. Polynomial Chaos-based Stochastic Model Predictive Control: An Overview and Future Research Directions. arXiv preprint arXiv:2406.10734.
  15. Fan, D., Jodin, G., Consi, T.R., Bonfiglio, L., Ma, Y., Keyes, L.R., Karniadakis, G.E. and Triantafyllou, M.S., 2019. A robotic intelligent towing tank for learning complex fluid-structure dynamics. Science Robotics, 4(36).
  16. Yang, L., Zhang, D. and Karniadakis, G.E., 2020. Physics-informed generative adversarial networks for stochastic differential equations. SIAM Journal on Scientific Computing, 42(1)
  17. Cybenko, G., 1989. Approximation by superpositions of a sigmoidal function. Mathematics of control, signals and systems, 2(4).