BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//IFDS - ECPv6.0.1.1//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://ifds.info
X-WR-CALDESC:Events for IFDS
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/Chicago
BEGIN:DAYLIGHT
TZOFFSETFROM:-0600
TZOFFSETTO:-0500
TZNAME:CDT
DTSTART:20240310T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0500
TZOFFSETTO:-0600
TZNAME:CST
DTSTART:20241103T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20240304T123000
DTEND;TZID=America/Chicago:20240304T133000
DTSTAMP:20260514T200248Z
CREATED:20240318T214117Z
LAST-MODIFIED:20240318T214117Z
UID:2902-1709555400-1709559000@ifds.info
SUMMARY:IFDS Ideas Forum Double Header
DESCRIPTION:Speaker 1: Puqian Wang \nRobustly Learning Single-Index Models via Alignment Sharpness \nAbstract: We study the problem of learning Single-Index Models under the L_2^2 loss in the agnostic model. We give an efficient learning algorithm\, achieving a constant factor approximation to the optimal loss\, that succeeds under a range of distributions (including log-concave distributions) and a broad class of monotone and Lipschitz link functions. This is the first efficient constant factor approximate agnostic learner\, even for Gaussian data and for any nontrivial class of link functions. Prior work for the case of an unknown link function either works in the realizable setting or does not attain a constant factor approximation. The main technical ingredient enabling our algorithm and analysis is a novel notion of a local error bound in optimization that we term alignment sharpness\, which may be of broader interest. \n  \nSpeaker 2: Lisheng Ren \nSQ Lower Bounds for Non-Gaussian Component Analysis with Weaker Assumptions \nAbstract: This talk focuses on the complexity of Non-Gaussian Component Analysis (NGCA) in the Statistical Query (SQ) model. NGCA has been a useful problem framework for obtaining SQ hardness results for a variety of statistical problems. In particular\, it was previously known that for any univariate distribution $A$ satisfying certain conditions\, it is SQ-hard to distinguish between a standard multivariate Gaussian and a distribution that behaves like $A$ in a random hidden direction and like a standard Gaussian in the orthogonal complement. This required that (1) $A$ matches many low-order moments with the standard Gaussian\, and (2) the chi-squared norm of $A$ with respect to the standard Gaussian is finite\, where the chi-squared restriction was needed for technical reasons. In this talk\, we will present a new result showing that Condition (2) above is in fact not necessary.
URL:https://ifds.info/event/ifds-ideas-forum-double-header/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:IFDS Ideas Forum
END:VEVENT
END:VCALENDAR