- Series
- Stochastics Seminar
- Time
- Thursday, April 9, 2026 - 3:30pm for 1 hour (actually 50 minutes)
- Location
- Skiles 006
- Speaker
- Jacob Aguirre – Georgia Tech – aguirre@gatech.edu – http://www.jacobaguirre.com/
- Organizer
- Dmitrii Ostrovskii
We study robust D-optimal experiment design in generalized linear models, choosing design weights to maximize the worst-case determinant of the Fisher information matrix over a convex parameter uncertainty set. This can be viewed as the natural generalization of D-optimal design in linear regression; the key challenge is that the information matrix depends on the parameter, so the resulting minimax problem is generally not convex-concave. We show that the desired convexity-concavity in fact reduces to a scalar curvature condition on the log-partition function of the exponential family: its second derivative h must satisfy the inequality h''h ≥ q(h')² for some q > 1. This insight is connected to the notion of Volumetric Barrier (VB) convexity for self-concordant functions, first introduced by Tseng et al. (2025) in the context of online quantum state estimation. With self-concordant barriers on the design-weights simplex and the parameter uncertainty set, the regularized saddle objective becomes a self-concordant convex-concave (SCCC) function, enabling the efficient minimax interior-point methods developed by Nemirovski. We also consider a generalization of the framework in which convexity in the model parameter breaks but the q-inequality holds up to a deficit proportional to h; in this case, our methods remain just as applicable. It turns out that this class includes all canonical GLMs, and we identify logistic regression as the hardest model in the class.
This is joint work with Dmitrii Ostrovskii.
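As a quick illustration of the scalar curvature condition (this sketch is not from the talk materials), consider the logistic model, whose log-partition is A(t) = log(1 + eᵗ), so that h = A'' = σ(1 − σ) with σ the sigmoid. Differentiating gives h' = h(1 − 2σ) and h'' = h(1 − 2σ)² − 2h², from which one obtains the exact identity h·h'' − (h')² = −2h³; thus the strict q-inequality with q > 1 fails (e.g. at t = 0, where h' = 0 but h·h'' < 0), and it holds only up to a deficit term, consistent with the abstract's description of logistic regression. The closed-form derivative expressions below are our own derivation, included purely for verification:

```python
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

# Logistic log-partition A(t) = log(1 + e^t); h = A'' = s(1 - s), s = sigmoid(t).
def h(t):
    s = sigmoid(t)
    return s * (1.0 - s)

# Closed-form first and second derivatives of h (derived by hand, not from the talk).
def h_prime(t):
    s = sigmoid(t)
    return h(t) * (1.0 - 2.0 * s)

def h_second(t):
    s = sigmoid(t)
    return h(t) * (1.0 - 2.0 * s) ** 2 - 2.0 * h(t) ** 2

# Numerically verify the identity h*h'' - (h')^2 = -2*h^3 at several points,
# showing the q-inequality h''h >= q(h')^2 holds only up to a deficit.
for t in [-3.0, -0.5, 0.0, 1.2, 4.0]:
    lhs = h(t) * h_second(t) - h_prime(t) ** 2
    rhs = -2.0 * h(t) ** 3
    assert abs(lhs - rhs) < 1e-12
```

At t = 0 in particular, h = 1/4 and h' = 0, so h·h'' = −1/32 < 0 = q(h')² for any q, exhibiting the failure of the strict inequality for logistic regression.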