IFDS operates in the Central and Pacific Time zones. Please note the time zone when accessing a particular event.
Speaker: Lang Liu
Title: Non-Asymptotic Analysis of M-Estimation for Statistical Learning and Inference under Self-Concordance

Abstract: In this talk, I discuss the problem of M-estimation for statistical learning and inference. It is well known from classical asymptotic theory that the properly centered and normalized estimator has a limiting Gaussian distribution with a sandwich covariance. I first establish a finite-sample bound for the estimator, characterizing its asymptotic behavior in a non-asymptotic fashion. An important feature of the bound is that its dimension dependence is governed by the effective dimension, the trace of the limiting sandwich covariance, which can be much smaller than the parameter dimension in some regimes. I then illustrate how the bound can be used to obtain a confidence set whose shape adapts to the local curvature of the population risk. In contrast to previous work, which relies heavily on the strong convexity of the learning objective, I only assume that the Hessian is lower bounded at the optimum and allow it to gradually become degenerate elsewhere. This property is formalized by the notion of self-concordance, which originates in convex optimization. Finally, I apply these techniques to semi-parametric estimation and derive state-of-the-art finite-sample bounds for double machine learning and orthogonal statistical learning.
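
As a rough illustration of the objects the abstract refers to, the following minimal Python sketch computes a plug-in sandwich covariance and its trace (the effective dimension) for logistic regression, a standard example of a (pseudo) self-concordant loss. The synthetic data, the Newton solver, and all variable names here are assumptions made for the sketch, not the speaker's method.

import numpy as np
from scipy.special import expit

# Synthetic logistic-regression data (illustrative only).
rng = np.random.default_rng(0)
n, d = 2000, 5
X = rng.normal(size=(n, d))
theta_star = rng.normal(size=d)
y = rng.binomial(1, expit(X @ theta_star))

# Fit the logistic M-estimator by Newton's method on the empirical risk.
theta = np.zeros(d)
for _ in range(25):
    mu = expit(X @ theta)
    grad = X.T @ (mu - y) / n
    H = (X * (mu * (1 - mu))[:, None]).T @ X / n   # empirical Hessian
    theta -= np.linalg.solve(H, grad)

# Plug-in sandwich covariance H^{-1} G H^{-1} at the fitted parameter,
# where G is the second moment of the per-sample score.
mu = expit(X @ theta)
scores = X * (mu - y)[:, None]                     # per-sample gradients
G = scores.T @ scores / n
H = (X * (mu * (1 - mu))[:, None]).T @ X / n
sandwich = np.linalg.inv(H) @ G @ np.linalg.inv(H)

# Effective dimension in the abstract's sense: trace of the sandwich covariance.
print("parameter dimension:", d)
print("effective dimension:", np.trace(sandwich))

In a well-specified model G and H coincide in the limit, so the sandwich reduces to the inverse Hessian; under misspecification the two matrices differ, which is why the sandwich form appears in the limiting covariance.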