BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//IFDS - ECPv6.0.1.1//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://ifds.info
X-WR-CALDESC:Events for IFDS
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/Chicago
BEGIN:DAYLIGHT
TZOFFSETFROM:-0600
TZOFFSETTO:-0500
TZNAME:CDT
DTSTART:20220313T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0500
TZOFFSETTO:-0600
TZNAME:CST
DTSTART:20221106T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VTIMEZONE
TZID:America/Los_Angeles
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20220313T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20221106T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220131T123000
DTEND;TZID=America/Chicago:20220131T133000
DTSTAMP:20260407T035214Z
CREATED:20220119T200612Z
LAST-MODIFIED:20220119T200918Z
UID:1753-1643632200-1643635800@ifds.info
SUMMARY:IFDS Ideas Forum
DESCRIPTION:
URL:https://ifds.info/event/ifds-ideas-forum/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220131T140000
DTEND;TZID=America/Chicago:20220131T140000
DTSTAMP:20260407T035214Z
CREATED:20230313T142147Z
LAST-MODIFIED:20230313T142147Z
UID:2425-1643637600-1643637600@ifds.info
SUMMARY:IFDS Ideas Forum
DESCRIPTION:Lightning talks for ICML+COLT submissions
URL:https://ifds.info/event/ifds-ideas-forum-5/
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220202T123000
DTEND;TZID=America/Chicago:20220202T133000
DTSTAMP:20260407T035214Z
CREATED:20220119T203515Z
LAST-MODIFIED:20220119T203750Z
UID:1787-1643805000-1643808600@ifds.info
SUMMARY:SILO: Kassem Fawaz
DESCRIPTION:
URL:https://ifds.info/event/kassem-fawaz/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20220204T123000
DTEND;TZID=America/Los_Angeles:20220204T133000
DTSTAMP:20260407T035214Z
CREATED:20220325T195559Z
LAST-MODIFIED:20220325T195650Z
UID:1915-1643977800-1643981400@ifds.info
SUMMARY:IFDS Monthly All-Hands: Rebecca Willett
DESCRIPTION:Speaker: Prof. Rebecca Willett\, Statistics and CS\, University of Chicago   \nTitle: The Role of Linear Layers in Nonlinear Interpolating Networks   \nAbstract: In this discussion\, we will explore the implicit bias of overparameterized neural networks of depth greater than two layers. Our framework considers a family of networks of varying depth that all have the same capacity but different implicitly defined representation costs. The representation cost of a function induced by a neural network architecture is the minimum sum of squared weights needed for the network to represent the function; it reflects the function space bias associated with the architecture. Our results show that adding linear layers to a ReLU network yields a representation cost that reflects a complex interplay between the alignment and sparsity of ReLU units. Specifically\, using a neural network to fit training data with minimum representation cost yields an interpolating function that is constant in directions perpendicular to a low-dimensional subspace on which a parsimonious interpolant exists. This is joint work with Greg Ongie.
URL:https://ifds.info/event/ifds-monthly-all-hands-rebecca-willett/
CATEGORIES:Monthly All-Hands
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220207T123000
DTEND;TZID=America/Chicago:20220207T133000
DTSTAMP:20260407T035214Z
CREATED:20220119T200612Z
LAST-MODIFIED:20220204T164344Z
UID:1754-1644237000-1644240600@ifds.info
SUMMARY:IFDS Ideas Forum: Rare Gems: Finding Lottery Tickets at Initialization
DESCRIPTION:Title: Rare Gems: Finding Lottery Tickets at Initialization\nSpeaker: Jy-yong Sohn\nTime + location: Monday 7 Feb 2022\, 12:30-13:30\, Orchard View Room\n \nAbstract: It has been widely observed that large neural networks can be pruned to a small fraction of their original size\, with little loss in accuracy\, by typically following a time-consuming “train\, prune\, re-train” approach. Frankle and Carbin in 2019 conjecture that we can avoid this by training lottery tickets\, i.e.\, special sparse subnetworks found at initialization\, that can be trained to high accuracy. However\, a subsequent line of work presents concrete evidence that current algorithms for finding trainable networks at initialization fail simple baseline comparisons\, e.g.\, against training random sparse subnetworks. Finding lottery tickets that train to better accuracy compared to simple baselines remains open. In this work\, we resolve this open problem by discovering rare gems: subnetworks at initialization that attain considerable accuracy\, even before training. Refining these rare gems – by means of fine-tuning – beats current baselines and leads to accuracy competitive with or better than magnitude pruning methods.\nBio: Jy-yong is a post-doctoral researcher in the Department of Electrical and Computer Engineering (ECE) at the University of Wisconsin-Madison\, working with Prof. Dimitris Papailiopoulos and Prof. Kangwook Lee. He is interested in the intersection of machine learning\, information theory\, and distributed algorithms. He received his Ph.D. degree in 2020 from KAIST\, under the supervision of Prof. Jaekyun Moon. He is a recipient of the IEEE ICC Best Paper Award\, Qualcomm Innovation Awards\, and NRF Korea Post-doctoral Fellowship.
URL:https://ifds.info/event/ifds-ideas-forum-2/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220207T140000
DTEND;TZID=America/Chicago:20220207T140000
DTSTAMP:20260407T035214Z
CREATED:20230313T142147Z
LAST-MODIFIED:20230313T142147Z
UID:2426-1644242400-1644242400@ifds.info
SUMMARY:IFDS Ideas Forum
DESCRIPTION:Rare Gems: Finding Lottery Tickets at Initialization
URL:https://ifds.info/event/ifds-ideas-forum-6/
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220209T123000
DTEND;TZID=America/Chicago:20220209T133000
DTSTAMP:20260407T035214Z
CREATED:20220119T203515Z
LAST-MODIFIED:20220119T205017Z
UID:1788-1644409800-1644413400@ifds.info
SUMMARY:SILO: Yea-Seul Kim
DESCRIPTION:
URL:https://ifds.info/event/yea-seul-kim/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20220211T123000
DTEND;TZID=America/Los_Angeles:20220211T133000
DTSTAMP:20260407T035214Z
CREATED:20220325T195326Z
LAST-MODIFIED:20220325T195446Z
UID:1910-1644582600-1644586200@ifds.info
SUMMARY:ML Opt @ UW: Vincent Roulet
DESCRIPTION:Speaker: Vincent Roulet \nTitle: Complexity Bounds of Iterative Linearization Algorithms for Discrete-Time Nonlinear Control \nAbstract: We revisit the nonlinear optimization approach to discrete-time nonlinear control and optimization algorithms based on iterative linearization. While widely popular in many domains\, these algorithms have mainly been analyzed from an asymptotic viewpoint. We establish non-asymptotic complexity bounds and global convergence for a class of generalized Gauss-Newton algorithms relying on iterative linearization of the nonlinear control problem\, which call iterative linear quadratic regulator or differential dynamic programming algorithms as subroutines. The sufficient conditions for global convergence are examined for multi-rate sampling schemes given the existence of a feedback linearization scheme. We illustrate the algorithms in synthetic experiments and provide a software library based on reverse-mode automatic differentiation to reproduce the numerical results.
URL:https://ifds.info/event/ml-opt-uw-vincent-roulet/
CATEGORIES:MLOpt@UWash
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220214T123000
DTEND;TZID=America/Chicago:20220214T133000
DTSTAMP:20260407T035214Z
CREATED:20220119T200612Z
LAST-MODIFIED:20220119T201137Z
UID:1755-1644841800-1644845400@ifds.info
SUMMARY:IFDS Ideas Forum: TBD
DESCRIPTION:
URL:https://ifds.info/event/ifds-ideas-forum-3/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220214T140000
DTEND;TZID=America/Chicago:20220214T140000
DTSTAMP:20260407T035214Z
CREATED:20230313T142147Z
LAST-MODIFIED:20230313T142147Z
UID:2427-1644847200-1644847200@ifds.info
SUMMARY:IFDS Ideas Forum
DESCRIPTION:
URL:https://ifds.info/event/ifds-ideas-forum-7/
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20220216T120000
DTEND;TZID=America/Los_Angeles:20220216T130000
DTSTAMP:20260407T035214Z
CREATED:20210921T203244Z
LAST-MODIFIED:20210921T203302Z
UID:1686-1645012800-1645016400@ifds.info
SUMMARY:E & A SIG
DESCRIPTION:
URL:https://ifds.info/event/e-a-sig-5/
CATEGORIES:E & A SIG
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220216T123000
DTEND;TZID=America/Chicago:20220216T133000
DTSTAMP:20260407T035214Z
CREATED:20220119T203515Z
LAST-MODIFIED:20220119T205129Z
UID:1789-1645014600-1645018200@ifds.info
SUMMARY:SILO: Victor M Zavala
DESCRIPTION:
URL:https://ifds.info/event/victor-m-zavala/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20220218T123000
DTEND;TZID=America/Los_Angeles:20220218T133000
DTSTAMP:20260407T035214Z
CREATED:20220325T194026Z
LAST-MODIFIED:20220325T195105Z
UID:1900-1645187400-1645191000@ifds.info
SUMMARY:ML Opt @ UW: Yifang Chen
DESCRIPTION:Speaker: Yifang Chen \nTitle: Active Multi-Task Representation Learning \nAbstract: To leverage the power of big data from source tasks and overcome the scarcity of the target task samples\, representation learning based on multi-task pretraining has become a standard approach in many applications. However\, up until now\, choosing which source tasks to include in the multi-task learning has been more art than science. In this paper\, we give the first formal study on resource task sampling by leveraging the techniques from active learning. We propose an algorithm that iteratively estimates the relevance of each source task to the target task and samples from each source task based on the estimated relevance. Theoretically\, we show that for the linear representation class\, to achieve the same error rate\, our algorithm can save up to a "number of source tasks" factor in the source task sample complexity\, compared with the naive uniform sampling from all source tasks. We also provide experiments on real-world computer vision datasets to illustrate the effectiveness of our proposed method on both linear and convolutional neural network representation classes.
URL:https://ifds.info/event/ml-opt-uw-yifang-chen/
CATEGORIES:MLOpt@UWash
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220221T123000
DTEND;TZID=America/Chicago:20220221T133000
DTSTAMP:20260407T035214Z
CREATED:20220119T200612Z
LAST-MODIFIED:20220218T203311Z
UID:1756-1645446600-1645450200@ifds.info
SUMMARY:IFDS Ideas Forum: Ankit Pensia & Thanasis Pittas
DESCRIPTION:THIS IS A ZOOM ONLY EVENT. NO LIVE AUDIENCE. \nTitle: Hypothesis Testing under Communication Constraints\nSpeaker: Ankit Pensia\nAbstract: Simple hypothesis testing is a fundamental problem in statistics and it is well-known that its sample complexity is characterized by the Hellinger distance between the two candidate distributions. In this talk\, we discuss the problem of simple hypothesis testing under communication constraints\, wherein each sample is mapped to a message from a finite set of messages before being revealed to the statistician. We show that it is possible to map samples to messages such that the sample complexity is only an extra logarithmic factor larger than the non-constrained setting. Our proofs rely on a reverse data processing inequality and a reverse Markov’s inequality\, which might be of independent interest. This is joint work with Po-Ling Loh and Varun Jog.\nBio: Ankit Pensia is a graduate student in the CS department. He is interested in robust statistics\, learning theory\, and high-dimensional statistics. Website: https://ankitp.net \n\nTitle: Streaming Algorithms for High-Dimensional Robust Statistics\nSpeaker: Thanasis Pittas\nAbstract: We study high-dimensional robust statistics tasks in the streaming model. A recent line of work obtained computationally efficient algorithms for a range of high-dimensional robust statistics tasks. Unfortunately\, all previous algorithms require storing the entire dataset\, incurring memory at least quadratic in the dimension. In this work\, we develop the first efficient streaming algorithms for high-dimensional robust statistics with near-optimal memory requirements (up to logarithmic factors). Our main result is for the task of high-dimensional robust mean estimation in (a strengthening of) Huber’s contamination model. We give an efficient single-pass streaming algorithm for this task with near-optimal error guarantees and space complexity nearly-linear in the dimension. As a corollary\, we obtain streaming algorithms with near-optimal space complexity for several more complex tasks\, including robust covariance estimation\, robust regression\, and more generally robust stochastic optimization.\nBio: Thanasis Pittas is a PhD student at the University of Wisconsin-Madison\, advised by Prof. Ilias Diakonikolas. He works on theoretical machine learning and robust statistics. Thanasis was an IFDS RA during the summer of 2021. He did his undergraduate studies in Greece\, at the National Technical University of Athens.
URL:https://ifds.info/event/ifds-ideas-forum-4/
LOCATION:Zoom
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220221T140000
DTEND;TZID=America/Chicago:20220221T140000
DTSTAMP:20260407T035214Z
CREATED:20230313T142147Z
LAST-MODIFIED:20230313T142147Z
UID:2428-1645452000-1645452000@ifds.info
SUMMARY:IFDS Ideas Forum
DESCRIPTION:
URL:https://ifds.info/event/ifds-ideas-forum-8/
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220223T123000
DTEND;TZID=America/Chicago:20220223T133000
DTSTAMP:20260407T035214Z
CREATED:20220119T203515Z
LAST-MODIFIED:20220119T205253Z
UID:1790-1645619400-1645623000@ifds.info
SUMMARY:SILO: Mengdi Wang
DESCRIPTION:
URL:https://ifds.info/event/mengdi-wang/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220225T123000
DTEND;TZID=America/Chicago:20220225T133000
DTSTAMP:20260407T035214Z
CREATED:20220325T193820Z
LAST-MODIFIED:20220325T194938Z
UID:1896-1645792200-1645795800@ifds.info
SUMMARY:ML-Opt @ UWash: Krishna Pillutla
DESCRIPTION:Title: MAUVE: Measuring the Gap Between Neural Text and Human Text using Divergence Frontiers \nAbstract: As major progress is made in open-ended text generation\, measuring how close machine-generated text is to human language remains a critical open problem. We introduce MAUVE\, a comparison measure for open-ended text generation\, which directly compares the learnt distribution from a text generation model to the distribution of human-written text using divergence frontiers. MAUVE scales up to modern text generation models by computing information divergences in a quantized embedding space. Through an extensive empirical study on three open-ended generation tasks\, we find that MAUVE identifies known properties of generated text\, scales naturally with model size\, and correlates with human judgments\, with fewer restrictions than existing distributional evaluation metrics.
URL:https://ifds.info/event/ml-opt-uwash-zaid-harchaoui/
CATEGORIES:MLOpt@UWash
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220228T000000
DTEND;TZID=America/Chicago:20220228T000000
DTSTAMP:20260407T035214Z
CREATED:20220119T200613Z
LAST-MODIFIED:20220119T201515Z
UID:1757-1646006400-1646006400@ifds.info
SUMMARY:IFDS Ideas Forum: RA Talk
DESCRIPTION:Jiaxin Hu & Yuchen Zeng
URL:https://ifds.info/event/ifds-ideas-forum-ra-talk/
LOCATION:Zoom
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220228T140000
DTEND;TZID=America/Chicago:20220228T140000
DTSTAMP:20260407T035214Z
CREATED:20230313T142147Z
LAST-MODIFIED:20230313T142147Z
UID:2429-1646056800-1646056800@ifds.info
SUMMARY:IFDS Ideas Forum
DESCRIPTION:A Review of Iterate Retraction in First Order Optimization
URL:https://ifds.info/event/ifds-ideas-forum-9/
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220302T123000
DTEND;TZID=America/Chicago:20220302T133000
DTSTAMP:20260407T035214Z
CREATED:20220119T203515Z
LAST-MODIFIED:20220119T205403Z
UID:1791-1646224200-1646227800@ifds.info
SUMMARY:SILO: Mario Figueiredo
DESCRIPTION:
URL:https://ifds.info/event/mario-figueiredo/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20220304T123000
DTEND;TZID=America/Los_Angeles:20220304T133000
DTSTAMP:20260407T035214Z
CREATED:20220325T194131Z
LAST-MODIFIED:20220325T200316Z
UID:1904-1646397000-1646400600@ifds.info
SUMMARY:IFDS All-Hands: Zaid Harchaoui
DESCRIPTION:
URL:https://ifds.info/event/ifds-all-hands-zaid-harchaoui-2/
CATEGORIES:Monthly All-Hands
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220307T123000
DTEND;TZID=America/Chicago:20220307T133000
DTSTAMP:20260407T035214Z
CREATED:20220119T200618Z
LAST-MODIFIED:20220308T155114Z
UID:1758-1646656200-1646659800@ifds.info
SUMMARY:IFDS Ideas Forum: Multiscale inverse problem\, from Schroedinger to Newton to Boltzmann
DESCRIPTION:Title: Multiscale inverse problem\, from Schroedinger to Newton to Boltzmann \nSpeaker: Qin Li\, Department of Mathematics \nDate + Location: 7 March (Monday)\, Orchard View Room \nAbstract: Inverse problems are ubiquitous. People probe the media with sources and measure the outputs. At the scale of quantum\, classical\, statistical and fluid\, these are inverse Schroedinger\, inverse Newton’s second law\, inverse Boltzmann problem\, and inverse diffusion respectively. The universe\, however\, should have a universal mathematical description\, as Hilbert proposed in 1900. In this talk\, we present a line of research results that unify all these inverse problems. Facing the IFDS crowd\, I’d like to ask the following question: how to integrate the mathematical equivalence into the optimization formulation for a more efficient algorithmic pipeline than the traditional PDE-constrained optimization? \nBio: Qin Li is an associate professor of mathematics at UW-Madison. Her research lies between scientific computing and PDE-constrained optimization.
URL:https://ifds.info/event/ifds-ideas-forum-some-ill-conditioned-and-well-conditioned-inverse-problems/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220307T140000
DTEND;TZID=America/Chicago:20220307T140000
DTSTAMP:20260407T035214Z
CREATED:20230313T142151Z
LAST-MODIFIED:20230313T142151Z
UID:2430-1646661600-1646661600@ifds.info
SUMMARY:IFDS Ideas Forum
DESCRIPTION:Multiscale inverse problem\, from Schroedinger to Newton to Boltzmann
URL:https://ifds.info/event/ifds-ideas-forum-10/
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220309T123000
DTEND;TZID=America/Chicago:20220309T133000
DTSTAMP:20260407T035214Z
CREATED:20220119T203519Z
LAST-MODIFIED:20220119T205458Z
UID:1792-1646829000-1646832600@ifds.info
SUMMARY:SILO: Ulugbek Kamilov
DESCRIPTION:
URL:https://ifds.info/event/ulugbek-kamilov/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20220311T133000
DTEND;TZID=America/Los_Angeles:20220311T143000
DTSTAMP:20260407T035214Z
CREATED:20220325T193738Z
LAST-MODIFIED:20220325T194845Z
UID:1893-1647005400-1647009000@ifds.info
SUMMARY:ML-Opt @ UWash: Sean Welleck
DESCRIPTION:Speaker: Sean Welleck \nTitle: Constrained text generation through discrete and continuous inference \nAbstract: Neural text generation has shifted to generating text with large-scale\, general-purpose models coupled with generic inference algorithms. An important open question is how to efficiently offer control over generated text. We describe two algorithms for enabling control through inference-time constraints: (i) A*-Neurologic\, a discrete search algorithm for incorporating logical constraints through estimates of the future\, and (ii) COLD decoding\, which treats generation as continuous gradient-based sampling from an energy function that captures task-relevant constraints. Our algorithms can be applied directly to off-the-shelf models without the need for task-specific finetuning\, and result in strong performance on a variety of generation tasks.  
URL:https://ifds.info/event/ml-opt-uwash-sean-welleck/
CATEGORIES:MLOpt@UWash
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220314T140000
DTEND;TZID=America/Chicago:20220314T140000
DTSTAMP:20260407T035214Z
CREATED:20230313T142152Z
LAST-MODIFIED:20230313T142152Z
UID:2431-1647266400-1647266400@ifds.info
SUMMARY:IFDS Ideas Forum
DESCRIPTION:
URL:https://ifds.info/event/ifds-ideas-forum-11/
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220321T123000
DTEND;TZID=America/Chicago:20220321T133000
DTSTAMP:20260407T035214Z
CREATED:20220119T200618Z
LAST-MODIFIED:20220308T161728Z
UID:1760-1647865800-1647869400@ifds.info
SUMMARY:IFDS Ideas Forum: How to Make the Gradients Small in Convex and Min-Max Optimization
DESCRIPTION:Speaker: Jelena Diakonikolas
URL:https://ifds.info/event/ifds-ideas-forum-03212022/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220321T140000
DTEND;TZID=America/Chicago:20220321T140000
DTSTAMP:20260407T035214Z
CREATED:20230313T142152Z
LAST-MODIFIED:20230313T142152Z
UID:2432-1647871200-1647871200@ifds.info
SUMMARY:IFDS Ideas Forum
DESCRIPTION:How to Make the Gradients Small in Convex and Min-Max Optimization
URL:https://ifds.info/event/ifds-ideas-forum-12/
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220323T123000
DTEND;TZID=America/Chicago:20220323T133000
DTSTAMP:20260407T035214Z
CREATED:20220119T203519Z
LAST-MODIFIED:20220119T205625Z
UID:1794-1648038600-1648042200@ifds.info
SUMMARY:SILO: Gitta Kutyniok
DESCRIPTION:
URL:https://ifds.info/event/gitta-kutyniok/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20220328T123000
DTEND;TZID=America/Chicago:20220328T133000
DTSTAMP:20260407T035214Z
CREATED:20220119T200618Z
LAST-MODIFIED:20220411T163036Z
UID:1761-1648470600-1648474200@ifds.info
SUMMARY:IFDS Ideas Forum: A Review of Neural Collapse
DESCRIPTION:Speaker: Greg Canal \nAbstract: Neural Collapse is a recently discovered phenomenon in deep neural network training that describes class separation in the final network layers. When a classification network is trained past the point of zero training error\, it has been observed that the penultimate layer activations collapse to their respective class means\, the means themselves form a simplex equiangular tight frame\, and the final layer linear classifiers align with each respective mean. Neural collapse has been demonstrated both empirically on deep networks as well as theoretically on simplified models\, and has inspired new questions on generalization\, robustness\, and architecture design. In this talk I will review the original discovery of neural collapse\, as well as recent literature that expands on related questions.
URL:https://ifds.info/event/ifds-ideas-forum-03282022/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:IFDS Ideas Forum
END:VEVENT
END:VCALENDAR