WSU Vancouver Mathematics and Statistics Seminar (Spring 2025)

Welcome to the WSU Vancouver Seminar in Mathematics and Statistics! The Seminar meets on Wednesdays, 2:10–3:00 PM, in VUB 122 (unless noted otherwise). VUB is the Undergraduate Building (marked "N" on the campus map). The seminar is open to the public, and here is some information for visitors.

Students can sign up for Math 592 (titled "Seminar in Analysis") for 1 credit. Talks will be given by external speakers as well as by WSUV faculty and students. Contact the organizer Bala Krishnamoorthy if you would like to suggest a speaker or give a talk.

Seminars from previous semesters


Date Speaker Topic
Jan  8 No seminar
Jan 15 WSUV Math Grad Students JMM Recap: Discussion

Abstract

The Math grad students all attended the Joint Mathematics Meetings last week. We will have a discussion on what they learned from the conference, as well as their overall experience. We will also talk about the benefits of attending conferences for students.

Jan 22 Bala Krishnamoorthy DeepCurrents: ML in Geometric Measure Theory

Abstract

I will present a broad overview of area-minimizing surface problems from Geometric Measure Theory (GMT). Then I'll discuss the use of deep learning frameworks in this context, drawing mostly on results from the paper on DeepCurrents. The goal is to keep the discussion accessible to everyone, even those without any background in GMT or machine learning.

Jan 29 Elizabeth Thompson, WSU Using Persistent Homology for Classification

Abstract

We review the Persistent Homology Classification Algorithm (PHCA), a method that uses the lifetimes of topological features in data to predict the classes of future data points. Unlike other persistent homology classifiers, PHCA learns the topological features of entire classes of data points rather than those of each point individually. We share results comparing PHCA to other binary and multi-class classifiers, in which it is experimentally shown to perform on par with, or better than, some of these current methods. We present these results on diverse data sets, including iris plants, wheat seeds, and social network ads.

Feb 12 Jeff Ovall, PSU Localization Phenomena and Efficient Computation for the Magnetic Schrödinger Equation

Abstract

The magnetic Schrödinger equation provides the standard mathematical model for the motion of a charged particle in a magnetic field. The solution of this equation, called the wave function of the particle, provides (via its modulus squared) the probability density of the particle being located in a specific region at a specific time. As with many space-time partial differential equations, significant insight can be gleaned by considering the associated time-independent eigenvalue problem, and this will be the focus of our discussion. More specifically, we will first consider the phenomenon of eigenvector localization, wherein the "mass" of some eigenvectors is strongly concentrated in relatively small portions of the underlying spatial domain. The mechanisms driving this localization are only partially understood, and we will provide both theoretical and empirical insight on the matter. We will also demonstrate that, through a prudent choice of similarity transform, which physicists would call a gauge transform, we can significantly reduce the cost of approximating eigenvalues and eigenvectors to a prescribed level of accuracy.

Feb 19 Zach Fendler, WSU Machine Learning in Optimization

Abstract

We will introduce Mixed Integer Programs (MIPs), a class of optimization problems, and the Branch-and-Bound (BNB) method for solving them. We will then discuss machine learning (ML) techniques used to learn effective branching strategies, including an approach based on a Tree Markov Decision Process (tMDP). We will highlight key results from the paper Learning to Branch with Tree MDPs, focusing on how ML techniques are applied in optimization to improve branching decisions while addressing challenges such as the credit assignment problem.

Feb 28 Safa Mote, PSU Ensemble Oscillation Correction (EnOC): Leveraging Oscillatory Modes to Improve Forecasts of Chaotic Systems

Abstract

Oscillatory modes of the climate system are among its most predictable features, especially at intraseasonal time scales. These oscillations can be predicted well with data-driven methods, often with better skill than dynamical models. However, since the oscillations only represent a portion of the total variance, a method for beneficially combining oscillation forecasts with dynamical forecasts of the full system was not previously known. We introduce Ensemble Oscillation Correction (EnOC), a general method to correct oscillatory modes in ensemble forecasts from dynamical models. We compute the ensemble mean—or the ensemble probability distribution—with only the best ensemble members, as determined by their discrepancy from a data-driven forecast of the oscillatory modes. We also present an alternative method that uses ensemble data assimilation to combine the oscillation forecasts with an ensemble of dynamical forecasts of the system (EnOC-DA). The oscillatory modes are extracted with a time series analysis method called multichannel singular spectrum analysis (M-SSA), and forecast using an analog method. We test these two methods using chaotic toy models with significant oscillatory components and show that they robustly reduce error compared to the uncorrected ensemble. We discuss the applications of this method to improve prediction of monsoons as well as other parts of the climate system. We also discuss possible extensions of the method to other data-driven forecasts, including machine learning.

Related reading:

  1. EnOC (2021)
  2. EnOC Monsoon (2024)

Mar  5 Izhak Shafran, Google DeepMind (VECS 120) Unpacking Large Language Models: Challenges, Costs, and Open Problems

Abstract

In this talk, we'll explore the inner workings of Large Language Models, diving into their two-phase training process, the cost functions, and the constraints that shape their optimization. After this overview, we'll shift gears and look at some of the open problems in the field, including challenges with mixture-of-experts models and the complexities of scaling model sizes. It's a fascinating area with plenty of room for growth and improvement, so come along for a discussion on where things stand and where they could go next!

Mar 19 Deepti Singh, SoE, WSU The changing likelihood of heat extremes and their societal risks

Abstract

Heat is the deadliest climate-related hazard in the US. Several recent heatwaves have shattered historical records and had catastrophic impacts on human and natural communities. This talk will focus on the drivers and impacts of extreme heat. I will discuss research from my lab that investigates the characteristics of extreme heat on global to regional scales, and briefly discuss how we quantify the role of human activities in shaping individual heatwaves.

Apr  2 Anne-Marie Greggs, WSU


Last modified: Tue Mar 25 22:21:18 PDT 2025