Graduate Special Topics

Department: 
MATH
Course Number: 
8803
Hours - Lecture: 
3
Hours - Total Credit: 
3
Typical Scheduling: 
Every Fall and Spring Semester

The following table contains a list of all graduate special topics courses offered by the School of Math within the last 5 years. More information on courses offered in the current/upcoming semester follows below. 

Semester      Instructor            Title
Spring 2025   Gong Chen             Nonlinear Dispersive Equations
              Alex Dunn             Analytic Number Theory II
              Jen Hom               Knot Concordance and Homology Cobordism
              Heinrich Matzinger    AI, Transformers, and Machine Learning Methods: Theory and Applications
Fall 2024     Alex Blumenthal       Big and Noisy: Ergodic Theory for Stochastic and Infinite-Dimensional Dynamical Systems
              Mohammad Ghomi        Geometric Inequalities
              Michael Lacey         Discrete Harmonic Analysis
              Rose McCarty          Structure for Dense Graphs
              John McCuan           Mathematical Capillarity
              Haomin Zhou           Machine Learning Methods for Numerical PDEs
Spring 2024   Anton Bernshteyn      Set Theory
              Greg Blekherman       Convex Geometry
              Hannah Choi           Mathematical Neuroscience
              Alex Dunn             Analytic Number Theory I
              John Etnyre           3-Dimensional Contact Topology
              Chongchun Zeng        Topics in PDE Dynamics II
Fall 2023     Jen Hom               Knots, 3-Manifolds, and 4-Manifolds
              Tom Kelly             Absorption Methods for Hypergraph Embeddings and Decompositions
              Zhiwu Lin             Topics in PDE Dynamics I
              Galyna Livshyts       Concentration of Measure and Convexity
              Cheng Mao             Statistical Inference in Networks
Spring 2023   Igor Belegradek       Diffeomorphism Groups
Fall 2022     Hannah Choi           Neuronal Dynamics and Networks
              John Etnyre           Topics in Algebraic Topology
              Christopher Heil      Measure Theory for Engineers
Fall 2021     Anton Bernshteyn      Descriptive Combinatorics
              John Etnyre           The Topology of 3-Manifolds
              Christopher Heil      Measure Theory for Engineers
              Zhiyu Wang            Spectral Graph Theory
Spring 2021   Wade Bloomquist       Intro to Topological Quantum Computing/Representations
              John McCuan           Mathematical Capillarity
Fall 2020     Matt Baker            Topics in Matroid Theory
              Jonathan Beardsley    Topics in Algebraic Topology
              Michael Damron        Percolation Models
              Dan Margalit          Mapping Class Groups

In the lists below, Math 8803-XXX refers to the special topics course taught by the instructor whose last name begins with XXX. 

Prerequisites: 

Spring 2025: 

Math 8803-CHE: Real Analysis at the level of Math 4317, and PDEs at the level of Math 4347

Math 8803-DUN: Complex Analysis at the level of MATH 4320, Number Theory at the level of MATH 4150, and completion of the Analytic Number Theory I special topics course MATH 8803-DUN taught in Spring 2024; or permission of the instructor.

Math 8803-HOM: Math 6441

Math 8803-MAT: Probability and statistics at least at the level of Math 3235 and 3236, and basic linear algebra

Course Text: 

Spring 2025:

Math 8803-CHE: See Syllabus

Math 8803-DUN: See Syllabus

Math 8803-HOM: See Syllabus

Math 8803-MAT: Our discussion will align with the foundational concepts presented in Artificial Intelligence: A Modern Approach by Russell and Norvig.

Topic Outline: 

Spring 2025:

Math 8803-CHE: In this class, we will review techniques from harmonic analysis and spectral theory, and then apply them to study the dynamics of nonlinear dispersive PDEs.

Math 8803-DUN: This will be a follow-up to the course taught in Spring 2024. Potential topics include an introduction to sieve theory, advanced topics on primes, or modular forms.

Math 8803-HOM: We will study knot concordance and homology cobordism, using both classical and modern tools.

Math 8803-MAT: 

Recently, transformers have achieved a major breakthrough and revolutionized society, especially since ChatGPT became publicly accessible in the fall of 2022. Despite their impact, the inner workings of transformers remain poorly understood to this day. For further progress in AI, we believe that understanding logical thinking processes in terms of first-order logic will be crucial.

In this course, we explore various aspects such as memorization, information compression, and logical reasoning.

Traditional AI courses typically cover topics like first-order logic and ontologies, but GPT-based transformers (also known as large language models) were not built with these concepts in mind.

We will explore the structure of modern transformers, such as GPT-3.5 and GPT-4, which power ChatGPT, alongside traditional AI approaches like first-order logic and ontology building.

We will cover several key topics, including:

a) The basic structure of transformers, first-order logic, and Boolean logic.

b) How the transformer architecture can effectively express certain forms of first-order logic.

c) Numerical aspects of gradient-descent training in transformers, focusing on how a simple, single-layer transformer trained to learn basic logic may encounter issues with unwanted zeros, which can be resolved using normalization techniques.

d) How a neural network (e.g., ResNet-64) can mistakenly classify grass as a bear, illustrating a phenomenon known as shortcut learning. We will present a general approach to understanding this issue, which we believe also applies to transformers.

e) Intent classification for chatbots: a chatbot often needs to infer the caller's intent from the first utterance, which can be framed as a text-classification problem. We will review several NLP methods that can be used alongside transformers, as well as how transformers themselves can be applied to this task.

f) An introduction to the Hugging Face ecosystem, which enables you to quickly run transformer models like GPT-J (6B) and experiment with them within minutes.

g) Ethical and philosophical questions: Is current AI truly intelligent? Could it be trained to be impartial and unbiased?
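As a taste of topic a), the central building block of the transformer architecture, scaled dot-product attention, can be sketched in a few lines of plain Python. This is an illustrative sketch only (the toy vectors and function names below are not course material): each output row is softmax(QKᵀ/√d)·V, a weighted average of the value vectors.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention on lists of row vectors.

    Q, K, V: lists of equal-length float lists (one row per token).
    Returns one output row per query: softmax(Q K^T / sqrt(d)) V.
    """
    d = len(K[0])  # key dimension used for the 1/sqrt(d) scaling
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)  # attention weights sum to 1
        # Output row = convex combination of the value rows.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Two toy tokens in a 2-dimensional embedding space.
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
result = attention(Q, K, V)
```

Because the attention weights for each query sum to 1, every output row lies inside the convex hull of the value rows; real transformers apply this map with learned projection matrices and many heads in parallel.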