IFDS is located in the Central and Pacific time zones. Please note the time zone when accessing a particular event.


Speaker 1: Puqian Wang

Robustly Learning Single-Index Models via Alignment Sharpness

Abstract: We study the problem of learning Single-Index Models under the $L_2^2$ loss in the agnostic model. We give an efficient learning algorithm, achieving a constant factor approximation to the optimal loss, that succeeds under a range of distributions (including log-concave distributions) and a broad class of monotone and Lipschitz link functions. This is the first efficient constant factor approximate agnostic learner, even for Gaussian data and for any nontrivial class of link functions. Prior work for the case of an unknown link function either applies only in the realizable setting or does not attain a constant factor approximation. The main technical ingredient enabling our algorithm and analysis is a novel notion of a local error bound in optimization, which we term alignment sharpness and which may be of broader interest.
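For context, a sketch of the standard setup in notation the abstract leaves implicit: the learner draws labeled examples $(x, y)$ with $x \sim \mathcal{D}$ on $\mathbb{R}^d$, and a single-index model predicts through a link function $\sigma$ applied to a one-dimensional projection,

$f_w(x) = \sigma(\langle w, x \rangle), \qquad \|w\|_2 = 1.$

Writing $\mathrm{OPT} = \min_w \mathbb{E}[(f_w(x) - y)^2]$ for the optimal $L_2^2$ loss, a constant factor approximate agnostic learner outputs a hypothesis $h$ with

$\mathbb{E}[(h(x) - y)^2] \le C \cdot \mathrm{OPT} + \epsilon$

for a universal constant $C$; the realizable setting is the special case $\mathrm{OPT} = 0$.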


Speaker 2: Lisheng Ren

SQ Lower Bounds for Non-Gaussian Component Analysis with Weaker Assumptions

Abstract: The talk focuses on the complexity of Non-Gaussian Component Analysis (NGCA) in the Statistical Query (SQ) model. NGCA has served as a useful framework for obtaining SQ hardness results for a variety of statistical problems. In particular, it was previously known that for any univariate distribution $A$ satisfying certain conditions, it is SQ-hard to distinguish between a standard multivariate Gaussian and a distribution that behaves like $A$ in a random hidden direction and like a standard Gaussian in the orthogonal complement. This required that (1) $A$ match many low-order moments of the standard Gaussian, and (2) the chi-squared norm of $A$ with respect to the standard Gaussian be finite, where the second restriction was needed for technical reasons. In this talk, we will present a new result showing that Condition (2) is in fact not necessary.
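Concretely, in notation the abstract leaves implicit, the hidden-direction distribution $P_A^v$ pairs $A$ along a unit vector $v$ with a standard Gaussian on its orthogonal complement,

$P_A^v(x) = A(\langle v, x \rangle)\, \phi_{\perp v}(x),$

where $\phi_{\perp v}$ denotes the standard Gaussian density on the subspace orthogonal to $v$, and the testing problem is to distinguish $\mathcal{N}(0, I_d)$ from $P_A^v$ for a hidden random direction $v$. Condition (1) asks that $\mathbb{E}_{X \sim A}[X^k] = \mathbb{E}_{X \sim \mathcal{N}(0,1)}[X^k]$ for all low-order $k$, and Condition (2) asks that $\int A(x)^2 / \phi(x)\, dx$ be finite, with $\phi$ the standard univariate Gaussian density.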
