Spring 2024: Foundations of Computational Dynamical Systems (CSE 8803)

Across science and engineering, several computational questions are asked of dynamical systems, such as: i) how sensitive are the orbits and orbit structure of the dynamics to one-time perturbations, to persistent deterministic perturbations, and to noisy perturbations? ii) given some measurements or observations of the dynamics, how do we forecast orbits into the future or predict their statistical moments? and iii) how can we construct a simpler dynamical system (with fewer variables) that captures the essential features of the original? Putting such questions in a mathematical language and studying rigorous methods for their solution are the primary goals of this course.

We will start with an overview of nonlinear deterministic systems and their numerical analysis, using tools from dynamical systems and ergodic theory. We will then cover some stochastic analysis and methods from computational statistics. After establishing these mathematical foundations, we will discuss rigorous computations applicable to dynamical systems that arise in the geosciences and in machine learning (optimization algorithms). Here is a sample of topics we will cover: one-dimensional dynamics (fixed points, periodic orbits, bifurcations, chaos), stability of fixed points and periodic orbits, the stable/unstable manifold theorem, SRB measures, Itô calculus, Lyapunov exponents and Lyapunov vectors, data assimilation, reduced-order modeling, optimization algorithms, transformers and sequence models, and the climate model hierarchy.
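As a small taste of the one-dimensional dynamics and Lyapunov-exponent topics above, here is a minimal illustrative sketch (not course material; the map, parameter values, and function names are chosen for illustration) that iterates the classic logistic map and estimates its Lyapunov exponent by averaging the log-derivative along an orbit:

```python
import math

def logistic(x, r=4.0):
    # Logistic map x_{n+1} = r x (1 - x); r = 4 is the classic chaotic regime.
    return r * x * (1.0 - x)

def lyapunov_exponent(x0=0.1234, r=4.0, n=100_000, burn_in=1_000):
    # The Lyapunov exponent is estimated as the orbit average of log|f'(x_n)|,
    # where f'(x) = r (1 - 2x) for the logistic map.
    x = x0
    for _ in range(burn_in):          # discard transient
        x = logistic(x, r)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = logistic(x, r)
    return total / n

print(lyapunov_exponent())  # for r = 4, theory gives log 2 ≈ 0.693
```

A positive exponent, as here, signals sensitive dependence on initial conditions; for r = 4 the logistic map is conjugate to the tent map, so the exact value log 2 is known in closed form.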

General information

  • 3 credits, two lectures per week, 6 homeworks and 1 final project (no exams).
  • Class time and location: Mondays and Wednesdays, 2:00 pm -- 3:15 pm, Molecular Sciences and Engineering Building G021
  • Class dates: Jan 08, 2024 -- May 02, 2024
  • Office hours: Fridays 10:00 am -- 11:00 am

A detailed syllabus is available on GitHub. The GitHub site will have lecture notes, but the main website (only accessible to registered students) will be Canvas.

Fall 2023: Computational Data Analysis (CSE 6740 A/ISyE 6740)

In this course, we will learn the mathematical and computational foundations of machine learning methods, with the goal of understanding i) how and when they do and do not work; and ii) how to use them in a principled manner. Besides neural networks, we will cover classical statistical models and data analysis methods that predate deep learning and are still widely used and/or remain relevant to our fundamental understanding of learning, prediction, and estimation with data.

We will start with an overview of the following topics and try to connect them with the state-of-the-art in ML/statistics research:

  • Statistical foundations of learning, learning models and algorithms: Empirical risk minimization, regression models, classifiers, PAC learning, boosting, decision trees, clustering, support vector machines (SVMs), neural networks
  • Optimization methods and statistics: Convex optimization, stochastic gradient descent and variants, generalization, kernel methods, model selection and cross-validation, bias-variance tradeoff
  • Additional estimation/inference/learning models: Multiclass ranking, compressed sensing, principal component analysis, generative modeling, graphical models
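As a flavor of the optimization topics listed above, here is a minimal stochastic gradient descent sketch for one-dimensional least-squares regression (illustrative only; the function name, learning rate, and toy dataset are invented for this example, not course code):

```python
import random

def sgd_least_squares(data, lr=0.01, epochs=200, seed=0):
    # Fit y ≈ w*x + b by stochastic gradient descent on the squared error.
    # Each update uses the gradient at a single sample rather than the full dataset.
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(data)             # visit samples in a fresh random order
        for x, y in data:
            err = (w * x + b) - y     # residual at this sample
            w -= lr * err * x         # gradient of 0.5*err^2 w.r.t. w
            b -= lr * err             # gradient of 0.5*err^2 w.r.t. b
    return w, b

# Noise-free data from the line y = 2x + 1; SGD should recover slope and intercept.
pts = [(x / 10.0, 2.0 * (x / 10.0) + 1.0) for x in range(-20, 21)]
w, b = sgd_least_squares(pts)
print(w, b)
```

Because the toy data are noise-free and linear, the per-sample updates drive every residual toward zero, so the iterates approach the exact solution; with noisy data, a decaying learning rate is the standard remedy for the residual fluctuation.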

General information

  • 3 credits, two lectures per week, 4 homeworks, 2 midterms and 1 final project.
  • Class time and location: Tuesdays and Thursdays, 12:30 pm -- 1:45 pm, East Architecture 123.
  • Office hours: 30 minutes after each lecture
  • Instructor email: nishac @ gatech.edu
  • TAs: Darryl Jacob, Atharva Ketkar, Chengrui Li, Akpevwe Ojameruaye, Yusen Su, and Mithilesh Vaidya

A detailed syllabus is available on GitHub. The GitHub site will be used to post some lecture summaries, but the main website (only accessible to registered students) will be Canvas.