BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//IFDS - ECPv6.0.1.1//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:IFDS
X-ORIGINAL-URL:https://ifds.info
X-WR-CALDESC:Events for IFDS
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/Chicago
BEGIN:DAYLIGHT
TZOFFSETFROM:-0600
TZOFFSETTO:-0500
TZNAME:CDT
DTSTART:20210314T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0500
TZOFFSETTO:-0600
TZNAME:CST
DTSTART:20211107T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VTIMEZONE
TZID:America/Los_Angeles
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20210314T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20211107T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210929T123000
DTEND;TZID=America/Chicago:20210929T123000
DTSTAMP:20260407T052611Z
CREATED:20210909T195344Z
LAST-MODIFIED:20210909T195344Z
UID:1661-1632918600-1632918600@ifds.info
SUMMARY:SILO-Jeff Linderoth
DESCRIPTION:Subspace cluster with missing data via integer programming
URL:https://ifds.info/event/silo-jeff-linderoth/
CATEGORIES:SILO
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210927T123000
DTEND;TZID=America/Chicago:20210927T133000
DTSTAMP:20260407T052611Z
CREATED:20210909T191648Z
LAST-MODIFIED:20210909T192810Z
UID:1623-1632745800-1632749400@ifds.info
SUMMARY:IFDS Ideas Forum-Greg Canal
DESCRIPTION:UNTIL FURTHER NOTICE: Seminars are hybrid in-person and via Zoom.
URL:https://ifds.info/event/ifds-ideas-forum-09272021/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210922T123000
DTEND;TZID=America/Chicago:20210922T123000
DTSTAMP:20260407T052611Z
CREATED:20210909T195344Z
LAST-MODIFIED:20210909T195344Z
UID:1660-1632313800-1632313800@ifds.info
SUMMARY:SILO-Michael Unser
DESCRIPTION:New representer theorems for inverse problems and machine learning
URL:https://ifds.info/event/silo-michael-unser/
CATEGORIES:SILO
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210920T123000
DTEND;TZID=America/Chicago:20210920T130000
DTSTAMP:20260407T052611Z
CREATED:20210909T191549Z
LAST-MODIFIED:20210909T192658Z
UID:1621-1632141000-1632142800@ifds.info
SUMMARY:IFDS Ideas Forum-Jasper Lee
DESCRIPTION:UNTIL FURTHER NOTICE: Seminars are hybrid in-person and via Zoom.
URL:https://ifds.info/event/ifds-ideas-forum-09202021/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210915T123000
DTEND;TZID=America/Chicago:20210915T123000
DTSTAMP:20260407T052611Z
CREATED:20210909T195343Z
LAST-MODIFIED:20210909T195343Z
UID:1659-1631709000-1631709000@ifds.info
SUMMARY:SILO-Greg Canal
DESCRIPTION:
URL:https://ifds.info/event/silo-greg-canal/
CATEGORIES:SILO
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20210915T120000
DTEND;TZID=America/Los_Angeles:20210915T130000
DTSTAMP:20260407T052611Z
CREATED:20210921T202200Z
LAST-MODIFIED:20210921T202325Z
UID:1675-1631707200-1631710800@ifds.info
SUMMARY:E & A SIG: Krishna Pillutla
DESCRIPTION:
URL:https://ifds.info/event/krishna-pillutla/
CATEGORIES:E & A SIG
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210913T123000
DTEND;TZID=America/Chicago:20210913T130000
DTSTAMP:20260407T052611Z
CREATED:20210909T185851Z
LAST-MODIFIED:20210909T192642Z
UID:1616-1631536200-1631538000@ifds.info
SUMMARY:IFDS Ideas Forum-Ahmet Alacaoglu
DESCRIPTION:UNTIL FURTHER NOTICE: Seminars are hybrid in-person and via Zoom.
URL:https://ifds.info/event/ifds-ideas-forum-09132021/
LOCATION:Orchard View Room\, 330 N. Orchard Street\, 3rd Floor NE\, Madison\, Wisconsin\, 53715\, United States
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210908T123000
DTEND;TZID=America/Chicago:20210908T123000
DTSTAMP:20260407T052611Z
CREATED:20210909T195343Z
LAST-MODIFIED:20210909T195343Z
UID:1658-1631104200-1631104200@ifds.info
SUMMARY:SILO-Shivani Agrawal
DESCRIPTION:
URL:https://ifds.info/event/silo-shivani-agrawal/
CATEGORIES:SILO
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20210802
DTEND;VALUE=DATE:20210805
DTSTAMP:20260407T052611Z
CREATED:20210527T151703Z
LAST-MODIFIED:20210527T152440Z
UID:1297-1627862400-1628121599@ifds.info
SUMMARY:MadLab Workshop
DESCRIPTION:
URL:https://ifds.info/event/madlab-workshop/
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20210726
DTEND;VALUE=DATE:20210731
DTSTAMP:20260407T052611Z
CREATED:20210527T151603Z
LAST-MODIFIED:20210622T185804Z
UID:1295-1627257600-1627689599@ifds.info
SUMMARY:IFDS Summer School
DESCRIPTION:
URL:https://ifds.info/event/ifds-summer-school/
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20210610
DTEND;VALUE=DATE:20210612
DTSTAMP:20260407T052611Z
CREATED:20210527T145754Z
LAST-MODIFIED:20210527T152640Z
UID:1293-1623283200-1623455999@ifds.info
SUMMARY:TRIPODS PI meeting
DESCRIPTION:
URL:https://ifds.info/event/tripods-pi-meeting/
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20210604T133000
DTEND;TZID=America/Los_Angeles:20210604T143000
DTSTAMP:20260407T052611Z
CREATED:20210602T191959Z
LAST-MODIFIED:20210602T192210Z
UID:1320-1622813400-1622817000@ifds.info
SUMMARY:ML-Opt: Romain Camilleri and Swati Padmanabhan
DESCRIPTION:The final talks of the ML-OPT seminar of the spring quarter will be given Friday (6/4) at 1:30pm PST by Romain Camilleri and Swati Padmanabhan.\n\n\nTitle: High-Dimensional Experimental Design and Kernel Bandits\n\nAbstract: I will talk about high-dimensional bandits. First I want to review how the classical approach to solving linear bandits motivates an experimental design problem. Then I plan to justify why common rounding techniques cannot be applied in a potentially infinite-dimensional space. Lastly\, I will show that one can avoid relying on rounding techniques by using a Catoni estimator.\n\nBio: Romain Camilleri is a 3rd year Ph.D. student at the Paul G. Allen School of Computer Science and Engineering at the University of Washington\, where he is advised by Kevin Jamieson.\n\n----------------\nTitle: Computing Lewis Weights to High Precision\n\nAbstract: We present an algorithm for computing high-precision approximate L_p Lewis weights for p > 2. Given an m x n real full-rank matrix A and p >= 3\, our algorithm computes epsilon-approximate L_p Lewis weights using O(p^3 log(m p / epsilon)) iterations\, where each iteration takes time linear in the sparsity of the input matrix plus the time to compute the leverage scores of a diagonal rescaling of A. Previously\, such iteration complexities were known only for 0 < p < 4 [CohenPeng2015]. Consequently\, our result helps complete the picture on near-optimal reduction from leverage scores to L_p Lewis weights for all p > 0.\n[Joint work with Maryam Fazel\, Yin Tat Lee\, and Aaron Sidford]\n\n\nBio: Swati is a graduate student working on problems in convex optimization\, advised by Yin Tat Lee.
URL:https://ifds.info/event/ml-opt-romain-camilleri-and-swati-padmanabhan/
CATEGORIES:MLOpt@UWash
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210519T123000
DTEND;TZID=America/Chicago:20210519T133000
DTSTAMP:20260407T052611Z
CREATED:20210202T202131Z
LAST-MODIFIED:20210517T133821Z
UID:1029-1621427400-1621431000@ifds.info
SUMMARY:SILO: Dimitris Tsipras
DESCRIPTION:Title: Robust Machine Learning: The Worst-Case and Beyond \nAbstract:\nOne of the key challenges in the real-world deployment of machine learning models is their brittleness: their performance significantly degrades when exposed to even small variations of their training environments. \nHow can we build ML models that are more robust? \nIn this talk\, I will present a methodology for training models that are invariant to a broad family of worst-case input perturbations. I will then describe how such robust learning leads to models that learn fundamentally different data representations\, and how this can be useful even outside the adversarial context. Finally\, I will discuss model robustness beyond the worst-case: ways in which our models fail to generalize and how we can guide further progress on this front. \nBio:\nDimitris Tsipras is a PhD student in the MIT EECS Department\, advised by Aleksander Mądry. His work revolves around the reliability and robustness of machine learning systems\, as well as the science of modern machine learning. He is currently supported by a Facebook PhD Fellowship. \nUNTIL FURTHER NOTICE: Seminars are virtual. Sign up for the SILO email list to receive the links to each talk at https://groups.google.com/ and browse for silo
URL:https://ifds.info/event/silo-05192021/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210514T084500
DTEND;TZID=America/Chicago:20210514T150000
DTSTAMP:20260407T052611Z
CREATED:20210512T165210Z
LAST-MODIFIED:20210512T165210Z
UID:1250-1620981900-1621004400@ifds.info
SUMMARY:Data Science Day
DESCRIPTION:Uses and Abuses of Data in Higher Education \nVirtual Event via Zoom with Opportunities for Engagement\nDetails and registration: https://citl.ucsc.edu/data-science-day-2021/
URL:https://ifds.info/event/data-science-day/
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210512T123000
DTEND;TZID=America/Chicago:20210512T133000
DTSTAMP:20260407T052611Z
CREATED:20210202T202020Z
LAST-MODIFIED:20210512T132548Z
UID:1027-1620822600-1620826200@ifds.info
SUMMARY:SILO: Rashmi Vinayak
DESCRIPTION:Title: Convertible Codes: Efficient Conversion of Coded Data in Large-scale Storage Systems \nAbstract:\nIn large-scale data storage systems\, failures are the norm in day-to-day operations. To protect data in the face of such failures\, erasure codes (a tool from coding theory) are employed to store data in a redundant fashion. In this setting\, a set of k data blocks to be stored is encoded using an [n\, k] code to generate n blocks that are then stored on distinct storage devices. In a recent work\, we showed that the failure rates of storage devices vary considerably over time\, and that dynamically tuning the parameters n and k of the code provides significant reduction in storage cost. However\, traditional codes suffer from prohibitively high resource overheads in changing the code parameters on already encoded data. \nMotivated by this application\, in this talk\, we:\n1. Present a new theoretical framework to formalize the notion of “code conversion”—the process of converting data encoded using an [n\, k] code into data encoded using a code with different parameters [n’\, k’]\, while maintaining desired decodability properties\,\n2. Introduce “convertible codes”\, a new class of codes that enable resource-efficient conversion\,\n3. Prove tight bounds on two important metrics for code conversion: (a) the number of nodes accessed\, and (b) the bandwidth consumed\,\n4. Present practical constructions of convertible codes for a broad range of parameters. \nBio:\nRashmi Vinayak is an assistant professor in the Computer Science department at Carnegie Mellon University. Her research interests broadly lie in computer/networked systems and information/coding theory\, and the wide spectrum of intersection between the two areas. Her current focus is on fault tolerance and resource efficiency in data systems. \nRashmi is a recipient of the NSF CAREER Award\, the Tata Institute of Fundamental Research Memorial Lecture Award 2020\, the Facebook Distributed Systems Research Award 2019\, the Google Faculty Research Award 2018\, and the UC Berkeley Eli Jury Award 2016 for “outstanding achievement in the area of systems\, communications\, control\, or signal processing”\, as well as the Facebook Communications and Networking Research Award 2017. Her work has received the USENIX NSDI 2021 Community (Best Paper) Award\, and the IEEE Data Storage Best Paper and Best Student Paper Awards for the years 2011/2012. Rashmi received her Ph.D. from UC Berkeley in 2016\, and was a postdoctoral scholar at UC Berkeley’s AMPLab/RISELab from 2016-17. During her Ph.D. studies\, Rashmi was a recipient of the Facebook Fellowship 2012-13\, the Microsoft Research PhD Fellowship 2013-15\, and the Google Anita Borg Memorial Scholarship 2015-16.\nWebpage: http://www.cs.cmu.edu/~rvinayak/ \n \nUNTIL FURTHER NOTICE: Seminars are virtual. Sign up for the SILO email list to receive the links to each talk at https://groups.google.com/ and browse for silo
URL:https://ifds.info/event/silo-05122021/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20210510T133000
DTEND;TZID=America/Los_Angeles:20210510T143000
DTSTAMP:20260407T052611Z
CREATED:20210521T152822Z
LAST-MODIFIED:20210521T153150Z
UID:1265-1620653400-1620657000@ifds.info
SUMMARY:IFDS Ideas Forum: Subhojyoti Mukherjee
DESCRIPTION:
URL:https://ifds.info/event/idfs-ideas-forum-subjoyoti-mukherjee/
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20210507T133000
DTEND;TZID=America/Los_Angeles:20210507T143000
DTSTAMP:20260407T052611Z
CREATED:20210504T203604Z
LAST-MODIFIED:20210504T203845Z
UID:1230-1620394200-1620397800@ifds.info
SUMMARY:IFDS All-Hands: Rina Foygel Barber
DESCRIPTION:Convergence for nonconvex ADMM\, with applications to CT imaging\nThe alternating direction method of multipliers (ADMM) algorithm is a powerful and flexible tool for complex optimization problems of the form min{f(x)+g(y):Ax+By=c}. ADMM exhibits robust empirical performance across a range of challenging settings including nonsmoothness and nonconvexity of the objective functions f and g\, and provides a simple and natural approach to the inverse problem of image reconstruction for computed tomography (CT) imaging. From the theoretical point of view\, existing results for convergence in the nonconvex setting generally assume smoothness in at least one of the component functions in the objective. In this work\, our new theoretical results provide convergence guarantees under a restricted strong convexity assumption without requiring smoothness or differentiability\, while still allowing differentiable terms to be treated approximately if needed. We validate these theoretical results empirically\, with a simulated example where both f and g are nondifferentiable (and thus outside the scope of existing theory)\, as well as a simulated CT image reconstruction problem. \n\n\nBio: Rina Foygel Barber is a Louis Block Professor in the Department of Statistics at the University of Chicago. She was an NSF postdoctoral fellow during 2012-13 in the Department of Statistics at Stanford University\, supervised by Emmanuel Candès. She received her PhD in Statistics at the University of Chicago in 2012\, advised by Mathias Drton and Nati Srebro\, and an MS in Mathematics at the University of Chicago in 2009. Prior to graduate school\, she was a mathematics teacher at the Park School of Baltimore from 2005 to 2007.
URL:https://ifds.info/event/ifds-all-hands-rina-foygel-barber/
CATEGORIES:Monthly All-Hands
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210505T123000
DTEND;TZID=America/Chicago:20210505T133000
DTSTAMP:20260407T052611Z
CREATED:20210202T193340Z
LAST-MODIFIED:20210219T193043Z
UID:1002-1620217800-1620221400@ifds.info
SUMMARY:SILO: Yuanzhi Li
DESCRIPTION:
URL:https://ifds.info/event/silo-05052021/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210503T140000
DTEND;TZID=America/Chicago:20210503T143000
DTSTAMP:20260407T052611Z
CREATED:20210114T205134Z
LAST-MODIFIED:20210115T195541Z
UID:833-1620050400-1620052200@ifds.info
SUMMARY:IFDS Ideas Forum: Shuqi Yu
DESCRIPTION:Title: TBD \nAbstract: \nShuqi Yu (Mathematics)\, advised by Sebastien Roch (Mathematics) and working with Karl Rohe (Statistics) on large scale network models. She aims to establish theoretical guarantees for a new estimator of the number of communities in a stochastic blockmodel. She is also interested in phylogenetics questions\; in particular\, she works on the identifiability of the species phylogeny under a horizontal gene transfer model.
URL:https://ifds.info/event/ifds-ideas-forum-shuqi-yu/
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210503T133000
DTEND;TZID=America/Chicago:20210503T140000
DTSTAMP:20260407T052611Z
CREATED:20210114T204824Z
LAST-MODIFIED:20210115T200804Z
UID:831-1620048600-1620050400@ifds.info
SUMMARY:IFDS Ideas Forum: Changhun Jo
DESCRIPTION:Title: \nAbstract: \nChanghun Jo (Mathematics)\, advised by Kangwook Lee (Electrical and Computer Engineering) and Sebastien Roch (Mathematics)\, is working on the theoretical understanding of machine learning. His recent work focuses on finding an optimal data poisoning algorithm against a fairness-aware learner. He also works on finding the fundamental limit on sample complexity of matrix completion in the presence of graph side information.
URL:https://ifds.info/event/ifds-ideas-forum-changhun-jo/
LOCATION:Webex
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210428T123000
DTEND;TZID=America/Chicago:20210428T133000
DTSTAMP:20260407T052611Z
CREATED:20210202T201908Z
LAST-MODIFIED:20210202T201908Z
UID:1025-1619613000-1619616600@ifds.info
SUMMARY:SILO: Martin Wainwright
DESCRIPTION:Title: TBD \nUNTIL FURTHER NOTICE: Seminars are virtual. Sign up for the SILO email list to receive the links to each talk at https://groups.google.com/ and browse for silo
URL:https://ifds.info/event/silo-martin-wainwright/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210426T140000
DTEND;TZID=America/Chicago:20210426T143000
DTSTAMP:20260407T052611Z
CREATED:20210114T204543Z
LAST-MODIFIED:20210115T200856Z
UID:829-1619445600-1619447400@ifds.info
SUMMARY:IFDS Ideas Forum: Shashank Rajput
DESCRIPTION:Title: \nAbstract: \nShashank Rajput (Computer Science)\, advised by Dimitris Papailiopoulos (Electrical and Computer Engineering)\, Kangwook Lee (Electrical and Computer Engineering) and Stephen Wright (Computer Science)\, works on problems in distributed machine learning and optimization. He works on developing techniques which are fast and scalable for practical use\, and have theoretical guarantees. Currently\, he is working on finding better shuffling mechanisms that beat random reshuffling as well as developing algorithms for training neural networks by only pruning them.
URL:https://ifds.info/event/ifds-ideas-forum-shashank-rajput/
LOCATION:Webex
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210426T133000
DTEND;TZID=America/Chicago:20210426T140000
DTSTAMP:20260407T052611Z
CREATED:20210114T204232Z
LAST-MODIFIED:20210115T200951Z
UID:827-1619443800-1619445600@ifds.info
SUMMARY:IFDS Ideas Forum: Shi Chen
DESCRIPTION:Title: \nAbstract: \nShi Chen (Mathematics)\, advised by Qin Li (Mathematics) and Stephen J. Wright (Computer Science)\, works on problems in the interdisciplinary area of applied math and machine learning. He is interested in connecting machine learning to various mathematical physics problems\, including the homogenization of PDEs and inverse problems. His recent work focuses on applying deep learning to PDE inverse problems.
URL:https://ifds.info/event/ifds-ideas-forum-shi-chen/
LOCATION:Webex
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20210423T133000
DTEND;TZID=America/Los_Angeles:20210423T143000
DTSTAMP:20260407T052611Z
CREATED:20210422T180907Z
LAST-MODIFIED:20210422T181817Z
UID:1146-1619184600-1619188200@ifds.info
SUMMARY:ML-Opt@UW: Yue Sun
DESCRIPTION:Title: Subspace Based Meta-learning\nAbstract:\nMeta-learning typically involves two phases. First\, one learns a suitable representation from the previously seen tasks. Second\, this representation is used for learning a new task using only a few samples (i.e.\, few-shot learning). In this talk I will discuss:\n1. Linear meta-learning: sample complexity of representation learning with general covariance\n2. Linear meta-learning: algorithm & analysis for overparameterized few-shot learning\n3. Generalization to nonlinear meta-learning\n\n\nBio:\nYue Sun is a 5th year PhD student at the University of Washington\, Seattle. He is interested in the theoretical understanding of optimization\, machine learning\, and control. His research includes:\n1. Nonconvex optimization on Riemannian manifolds (UW)\n2. Low order linear system identification (UW)\n3. Subspace based meta-learning (UW)\n4. Nonconvex optimization applied in optimal control (UW)\n5. Online optimization for video coding (Google\, 2019)\n6. Compressive sensing and phase retrieval (Ohio State U\, 2015\; Nokia Bell Labs\, 2021)
URL:https://ifds.info/event/mloptuw/
CATEGORIES:MLOpt@UWash
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210421T123000
DTEND;TZID=America/Chicago:20210421T133000
DTSTAMP:20260407T052611Z
CREATED:20210202T201514Z
LAST-MODIFIED:20210219T192936Z
UID:1021-1619008200-1619011800@ifds.info
SUMMARY:SILO: Emmanuel Abbe
DESCRIPTION:Title: TBD \nUNTIL FURTHER NOTICE: Seminars are virtual. Sign up for the SILO email list to receive the links to each talk at https://groups.google.com/ and browse for silo
URL:https://ifds.info/event/silo-04212021/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210419T133000
DTEND;TZID=America/Chicago:20210419T143000
DTSTAMP:20260407T052611Z
CREATED:20210202T203343Z
LAST-MODIFIED:20210316T143205Z
UID:1041-1618839000-1618842600@ifds.info
SUMMARY:IFDS Ideas Forum: Zhiyan Ding
DESCRIPTION:Person: Zhiyan Ding \nTitle:  Connection between optimization and sampling \nAbstract: \nBio: I am a third-year PhD student in the Department of Mathematics\, University of Wisconsin-Madison. My advisor is Qin Li.
URL:https://ifds.info/event/ifds-ideas-forum-04192021/
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20210416T133000
DTEND;TZID=America/Los_Angeles:20210416T143000
DTSTAMP:20260407T052611Z
CREATED:20210412T182350Z
LAST-MODIFIED:20210412T182350Z
UID:1138-1618579800-1618583400@ifds.info
SUMMARY:ML-Opt: Adhyyan Narang
DESCRIPTION:Title: Classification vs Adversarial Examples for the Overparameterized Linear Model \n(Joint work with Vidya Muthukumar and Anant Sahai at UC Berkeley) \nAbstract: In modern machine learning\, overparameterized models are often used. It has been empirically observed that these often generalize well and display double-descent\, but are susceptible to adversarial perturbations. Past theoretical explanations of these phenomena usually focus on linear models where the adversary has the power to perturb the features directly. However\, the field of meta-learning has revealed that neural networks can be interpreted as first learning a feature representation and then learning the best linear model on these learned features. The role of lifting in the adversarial susceptibility of models is largely unaddressed\, primarily because the problem of finding an adversarial example for lifted models is nonconvex and difficult to solve. \nIn this talk\, I will use concepts from signal processing to propose a toy model that exhibits all of the aforementioned phenomena\, most crucially lifting. The toy nature of the model allows us to overcome the challenge of solving the adversarial-search problem. We learn that the adversarial vulnerability arises because of a phenomenon we term spatial localization: the predictions of the learned model are markedly more sensitive in the vicinity of training points than elsewhere. Despite the adversarial susceptibility\, we find that classification using spatially localized features can be “easier”\, i.e.\, less sensitive to the strength of the prior than in independent feature setups. \nBio: Adhyyan Narang (ECE) is a first-year PhD student\, advised by Maryam Fazel and Lilian Ratliff. He is interested in fundamental theoretical questions about learning from data\, and broadly works at the intersection of machine learning\, optimization\, and game theory.
URL:https://ifds.info/event/mlopt-adhyyan-narang/
CATEGORIES:MLOpt@UWash
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210414T123000
DTEND;TZID=America/Chicago:20210414T133000
DTSTAMP:20260407T052611Z
CREATED:20210202T201325Z
LAST-MODIFIED:20210412T181842Z
UID:1019-1618403400-1618407000@ifds.info
SUMMARY:SILO: Merve Bodur
DESCRIPTION:Title: Copositive Duality for Discrete Markets and Games \nProblems involving binary decisions are often modelled as mixed-integer programs (MIPs). Such models are nonconvex and lack strong duality\, which prevents the use of tools such as shadow prices and KKT conditions. For example\, in convex markets\, shadow (dual) prices are associated with market equilibrium\, and for convex games the existence and uniqueness of Nash equilibrium can be proven via fixed-point theorems and KKT conditions. Those results are lacking in their nonconvex counterparts. We use copositive programming to formulate discrete problems in applications including nonconvex energy markets and nonconvex games\, to leverage its convexity and strong duality features. We obtain several novel theoretical and numerical results for those applications\, including a new revenue-adequate pricing scheme for energy markets\, and existence\, uniqueness\, and KKT conditions for the pure-strategy Nash equilibrium in discrete games. We also propose a novel and purely MIP-based cutting-plane algorithm for mixed-integer copositive programs\, and employ it in our applications. This is joint work with Cheng Guo and Josh A. Taylor. \nUNTIL FURTHER NOTICE: Seminars are virtual. Sign up for the SILO email list to receive the links to each talk at https://groups.google.com/ and browse for silo
URL:https://ifds.info/event/silo-rashmi-vimayak/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210412T133000
DTEND;TZID=America/Chicago:20210412T143000
DTSTAMP:20260407T052611Z
CREATED:20210202T203236Z
LAST-MODIFIED:20210211T154258Z
UID:1039-1618234200-1618237800@ifds.info
SUMMARY:IFDS Ideas Forum: Vivak Patel
DESCRIPTION:Person: Vivak Patel \nTitle: Adaptive Iterative Methods for Linear Systems \nAbstract: \nBio: My research is at the intersection of numerical optimization and statistical estimation with applications to dynamical systems. On the one hand\, I interpret this as using numerical optimization ideas to design statistical estimators that are practically computable\, yet still retain certain desirable statistical properties. On the other hand\, I interpret this as using statistical ideas to design optimization methods for solving minimization problems that involve uncertainty. Beyond these two main areas of work\, I also do some work in machine learning\, numerical linear algebra and load modeling for power systems.
URL:https://ifds.info/event/ifds-ideas-forum-04122021/
CATEGORIES:IFDS Ideas Forum
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20210409T133000
DTEND;TZID=America/Los_Angeles:20210409T143000
DTSTAMP:20260407T052611Z
CREATED:20210412T183054Z
LAST-MODIFIED:20210412T183353Z
UID:1140-1617975000-1617978600@ifds.info
SUMMARY:ML-Opt: Krishna Pillutla
DESCRIPTION:Title: Distributionally Robust Machine Learning with the Superquantile: 1) For Supervised Learning\, 2) For Federated Learning\n\n\nAbstract: I will talk about distributionally robust machine learning\, a principled approach for robust performance across subpopulations and shifting distributions. We will focus on the superquantile\, a.k.a. the Conditional Value at Risk (CVaR)\, which was popularized by the seminal work of UW’s own R. T. Rockafellar and co-authors in the field of computational finance and economics in the early 2000s.\nWe will first review the use of the superquantile for distributionally robust supervised learning. We will prove a generalization bound from first principles.\nSecond\, we will discuss an application of the superquantile in the field of federated learning\, i.e.\, the distributed training of machine learning models on mobile phones. We will quantify the extent to which a user conforms to the population distribution and show how the superquantile can be leveraged to improve performance on users who do not conform to the population. We will round off the discussion with a communication-efficient training algorithm and experimental results on heterogeneous datasets.\n\nBio: Krishna Pillutla is a 5th year Ph.D. student at the Paul G. Allen School of Computer Science and Engineering at the University of Washington\, where he is advised by Zaid Harchaoui and Sham Kakade. Krishna is broadly interested in machine learning and optimization and works in the particular areas of structured prediction and federated learning. Krishna was a 2019-20 JP Morgan Ph.D. Fellow.
URL:https://ifds.info/event/ml-opt-krishna-pillutla/
CATEGORIES:MLOpt@UWash
END:VEVENT
END:VCALENDAR