BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//IFDS - ECPv6.0.1.1//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://ifds.info
X-WR-CALDESC:Events for IFDS
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/Los_Angeles
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20220313T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20221106T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20220311T133000
DTEND;TZID=America/Los_Angeles:20220311T143000
DTSTAMP:20260425T183112Z
CREATED:20220325T193738Z
LAST-MODIFIED:20220325T194845Z
UID:1893-1647005400-1647009000@ifds.info
SUMMARY:ML-Opt @ UWash: Sean Welleck
DESCRIPTION:Speaker: Sean Welleck\nTitle: Constrained text generation through discrete and continuous inference\nAbstract: Neural text generation has shifted to generating text with large-scale\, general-purpose models coupled with generic inference algorithms. An important open question is how to efficiently offer control over generated text. We describe two algorithms for enabling control through inference-time constraints: (i) A*-Neurologic\, a discrete search algorithm for incorporating logical constraints through estimates of the future\, and (ii) COLD decoding\, which treats generation as continuous gradient-based sampling from an energy function that captures task-relevant constraints. Our algorithms can be applied directly to off-the-shelf models without the need for task-specific finetuning\, and result in strong performance on a variety of generation tasks.
URL:https://ifds.info/event/ml-opt-uwash-sean-welleck/
CATEGORIES:MLOpt@UWash
END:VEVENT
END:VCALENDAR