BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//IFDS - ECPv6.0.1.1//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:IFDS
X-ORIGINAL-URL:https://ifds.info
X-WR-CALDESC:Events for IFDS
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/Chicago
BEGIN:DAYLIGHT
TZOFFSETFROM:-0600
TZOFFSETTO:-0500
TZNAME:CDT
DTSTART:20210314T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0500
TZOFFSETTO:-0600
TZNAME:CST
DTSTART:20211107T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VTIMEZONE
TZID:America/Los_Angeles
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20210314T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20211107T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210519T123000
DTEND;TZID=America/Chicago:20210519T133000
DTSTAMP:20260409T110159Z
CREATED:20210202T202131Z
LAST-MODIFIED:20210517T133821Z
UID:1029-1621427400-1621431000@ifds.info
SUMMARY:SILO: Dimitris Tsipras
DESCRIPTION:Title: Robust Machine Learning: The Worst-Case and Beyond \nAbstract:\nOne of the key challenges in the real-world deployment of machine learning models is their brittleness: their performance significantly degrades when exposed to even small variations of their training environments. \nHow can we build ML models that are more robust? \nIn this talk\, I will present a methodology for training models that are invariant to a broad family of worst-case input perturbations. I will then describe how such robust learning leads to models that learn fundamentally different data representations\, and how this can be useful even outside the adversarial context. Finally\, I will discuss model robustness beyond the worst-case: ways in which our models fail to generalize and how we can guide further progress on this front. \nBio:\nDimitris Tsipras is a PhD student in the MIT EECS Department\, advised by Aleksander Mądry. His work revolves around the reliability and robustness of machine learning systems\, as well as the science of modern machine learning. He is currently supported by a Facebook PhD Fellowship. \nUNTIL FURTHER NOTICE: Seminars are virtual. Sign up for the SILO email list to receive the links to each talk at https://groups.google.com/ and browse for silo
URL:https://ifds.info/event/silo-05192021/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210514T084500
DTEND;TZID=America/Chicago:20210514T150000
DTSTAMP:20260409T110159Z
CREATED:20210512T165210Z
LAST-MODIFIED:20210512T165210Z
UID:1250-1620981900-1621004400@ifds.info
SUMMARY:Data Science Day
DESCRIPTION:Uses and Abuses of Data in Higher Education \nVirtual Event via Zoom with Opportunities for Engagement\nDetails and registration: https://citl.ucsc.edu/data-science-day-2021/
URL:https://ifds.info/event/data-science-day/
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210512T123000
DTEND;TZID=America/Chicago:20210512T133000
DTSTAMP:20260409T110159Z
CREATED:20210202T202020Z
LAST-MODIFIED:20210512T132548Z
UID:1027-1620822600-1620826200@ifds.info
SUMMARY:SILO: Rashmi Vinayak
DESCRIPTION:Title: Convertible Codes: Efficient Conversion of Coded Data in Large-scale Storage Systems \nAbstract:\nIn large-scale data storage systems\, failures are the norm in day-to-day operations. To protect data in the face of such failures\, erasure codes (a tool from coding theory) are employed to store data in a redundant fashion.  In this setting\, a set of k data blocks to be stored is encoded using an [n\, k] code to generate n blocks that are then stored on distinct storage devices. In a recent work\, we showed that the failure rate of storage devices vary considerably over time\, and that dynamically tuning the parameters n and k of the code provides significant reduction in storage cost. However\, traditional codes suffer from prohibitively high resource overheads in changing the code parameters on already encoded data. \nMotivated by this application\, in this talk\, we:\n1. Present a new theoretical framework to formalize the notion of “code conversion”—the process of converting data encoded using an [n\, k] code into data encoded using a code with different parameters [n’\, k’]\, while maintaining desired decodability properties\,\n2. Introduce “convertible codes”\, a new class of codes that enable resource-efficient conversion\,\n3. Prove tight bounds on two important metrics for code conversion (a) the number of nodes accessed\, and (b) bandwidth consumed\,\n4. Present practical constructions of convertible codes for a broad range of parameters. \nBio:\nRashmi Vinayak is an assistant professor in the Computer Science department at Carnegie Mellon University. Her research interests broadly lie in computer/networked systems and information/coding theory\, and the wide spectrum of intersection between the two areas. Her current focus is on fault tolerance and resource efficiency in data systems. 
Rashmi is a recipient of the NSF CAREER Award\, Tata Institute of Fundamental Research Memorial Lecture Award 2020\, Facebook Distributed Systems Research Award 2019\, Google Faculty Research Award 2018\, Facebook Communications and Networking Research Award 2017\, and UC Berkeley Eli Jury Award 2016 for “outstanding achievement in the area of systems\, communications\, control\, or signal processing”. Her work has received the USENIX NSDI 2021 Community (Best Paper) Award\, and IEEE Data Storage Best Paper and Best Student Paper Awards for the years 2011/2012. Rashmi received her Ph.D. from UC Berkeley in 2016\, and was a postdoctoral scholar at UC Berkeley’s AMPLab/RISELab from 2016-17. During her Ph.D. studies\, Rashmi was a recipient of the Facebook Fellowship 2012-13\, the Microsoft Research PhD Fellowship 2013-15\, and the Google Anita Borg Memorial Scholarship 2015-16.\nWebpage: http://www.cs.cmu.edu/~rvinayak/ \n\nUNTIL FURTHER NOTICE: Seminars are virtual. Sign up for the SILO email list to receive the links to each talk at https://groups.google.com/ and browse for silo
URL:https://ifds.info/event/silo-05122021/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20210510T133000
DTEND;TZID=America/Los_Angeles:20210510T143000
DTSTAMP:20260409T110159Z
CREATED:20210521T152822Z
LAST-MODIFIED:20210521T153150Z
UID:1265-1620653400-1620657000@ifds.info
SUMMARY:IFDS Ideas Forum: Subjoyoti Mukherjee
DESCRIPTION:
URL:https://ifds.info/event/idfs-ideas-forum-subjoyoti-mukherjee/
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20210507T133000
DTEND;TZID=America/Los_Angeles:20210507T143000
DTSTAMP:20260409T110159Z
CREATED:20210504T203604Z
LAST-MODIFIED:20210504T203845Z
UID:1230-1620394200-1620397800@ifds.info
SUMMARY:IFDS All-Hands: Rina Foygel Barber
DESCRIPTION:Convergence for nonconvex ADMM\, with applications to CT imaging\nThe alternating direction method of multipliers (ADMM) algorithm is a powerful and flexible tool for complex optimization problems of the form min{f(x)+g(y):Ax+By=c}. ADMM exhibits robust empirical performance across a range of challenging settings including nonsmoothness and nonconvexity of the objective functions f and g\, and provides a simple and natural approach to the inverse problem of image reconstruction for computed tomography (CT) imaging. From the theoretical point of view\, existing results for convergence in the nonconvex setting generally assume smoothness in at least one of the component functions in the objective. In this work\, our new theoretical results provide convergence guarantees under a restricted strong convexity assumption without requiring smoothness or differentiability\, while still allowing differentiable terms to be treated approximately if needed. We validate these theoretical results empirically\, with a simulated example where both f and g are nondifferentiable (and thus outside the scope of existing theory)\, as well as a simulated CT image reconstruction problem. \n\n\nBio: Rina Foygel Barber is a Louis Block Professor in the Department of Statistics at the University of Chicago. She was an NSF postdoctoral fellow during 2012-13 in the Department of Statistics at Stanford University\, supervised by Emmanuel Candès. She received her PhD in Statistics at the University of Chicago in 2012\, advised by Mathias Drton and Nati Srebro\, and an MS in Mathematics at the University of Chicago in 2009. Prior to graduate school\, she was a mathematics teacher at the Park School of Baltimore from 2005 to 2007.
URL:https://ifds.info/event/ifds-all-hands-rina-foygel-barber/
CATEGORIES:Monthly All-Hands
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210505T123000
DTEND;TZID=America/Chicago:20210505T133000
DTSTAMP:20260409T110159Z
CREATED:20210202T193340Z
LAST-MODIFIED:20210219T193043Z
UID:1002-1620217800-1620221400@ifds.info
SUMMARY:SILO: Yuanzhi Li
DESCRIPTION:
URL:https://ifds.info/event/silo-05052021/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210503T140000
DTEND;TZID=America/Chicago:20210503T143000
DTSTAMP:20260409T110159Z
CREATED:20210114T205134Z
LAST-MODIFIED:20210115T195541Z
UID:833-1620050400-1620052200@ifds.info
SUMMARY:IFDS Ideas Forum: Shuqi Yu
DESCRIPTION:Title: TBD \nAbstract: \nShuqi Yu (Mathematics)\, advised by Sebastien Roch (Mathematics) and working with Karl Rohe (Statistics) on large scale network models. She aims to establish theoretical guarantees for a new estimator of the number of communities in a stochastic blockmodel. She is also interested in phylogenetics questions\; in particular\, she works on the identifiability of the species phylogeny under a horizontal gene transfer model.
URL:https://ifds.info/event/ifds-ideas-forum-shuqi-yu/
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210503T133000
DTEND;TZID=America/Chicago:20210503T140000
DTSTAMP:20260409T110159Z
CREATED:20210114T204824Z
LAST-MODIFIED:20210115T200804Z
UID:831-1620048600-1620050400@ifds.info
SUMMARY:IFDS Ideas Forum: Changhun Jo
DESCRIPTION:Title: \nAbstract: \nChanghun Jo (Mathematics)\, advised by Kangwook Lee (Electrical and Computer Engineering) and Sebastien Roch (Mathematics)\, is working on the theoretical understanding of machine learning. His recent work focuses on finding an optimal data poisoning algorithm against a fairness-aware learner. He also works on finding the fundamental limit on sample complexity of matrix completion in the presence of graph side information.
URL:https://ifds.info/event/ifds-ideas-forum-changhun-jo/
LOCATION:Webex
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210428T123000
DTEND;TZID=America/Chicago:20210428T133000
DTSTAMP:20260409T110159Z
CREATED:20210202T201908Z
LAST-MODIFIED:20210202T201908Z
UID:1025-1619613000-1619616600@ifds.info
SUMMARY:SILO: Martin Wainwright
DESCRIPTION:Title: TBD \nUNTIL FURTHER NOTICE: Seminars are virtual. Sign up for the SILO email list to receive the links to each talk at https://groups.google.com/ and browse for silo
URL:https://ifds.info/event/silo-martin-wainwright/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210426T140000
DTEND;TZID=America/Chicago:20210426T143000
DTSTAMP:20260409T110159Z
CREATED:20210114T204543Z
LAST-MODIFIED:20210115T200856Z
UID:829-1619445600-1619447400@ifds.info
SUMMARY:IFDS Ideas Forum: Shashank Rajput
DESCRIPTION:Title: \nAbstract: \nShashank Rajput (Computer Science)\, advised by Dimitris Papailiopoulos (Electrical and Computer Engineering)\, Kangwook Lee (Electrical and Computer Engineering) and Stephen Wright (Computer Science)\, works on problems in distributed machine learning and optimization. He works on developing techniques which are fast and scalable for practical use\, and have theoretical guarantees. Currently\, he is working on finding better shuffling mechanisms that beat random reshuffling as well as developing algorithms for training neural networks by only pruning them.
URL:https://ifds.info/event/ifds-ideas-forum-shashank-rajput/
LOCATION:Webex
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210426T133000
DTEND;TZID=America/Chicago:20210426T140000
DTSTAMP:20260409T110159Z
CREATED:20210114T204232Z
LAST-MODIFIED:20210115T200951Z
UID:827-1619443800-1619445600@ifds.info
SUMMARY:IFDS Ideas Forum: Shi Chen
DESCRIPTION:Title: \nAbstract: \nShi Chen (Mathematics)\, advised by Qin Li (Mathematics) and Stephen J. Wright (Computer Science)\, works on problems in the interdisciplinary area of applied math and machine learning. He is interested in connecting machine learning to various mathematical physics problems\, including the homogenization of PDEs and inverse problems. His recent work focuses on applying deep learning to PDE inverse problems.
URL:https://ifds.info/event/ifds-ideas-forum-shi-chen/
LOCATION:Webex
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20210423T133000
DTEND;TZID=America/Los_Angeles:20210423T143000
DTSTAMP:20260409T110159Z
CREATED:20210422T180907Z
LAST-MODIFIED:20210422T181817Z
UID:1146-1619184600-1619188200@ifds.info
SUMMARY:ML-Opt@UW: Yue Sun
DESCRIPTION:Title: Subspace Based Meta-learning\nAbstract:\nMeta-learning typically involves two phases. First\, one learns a suitable representation from the previously seen tasks. Second\, this representation is used for learning a new task using only a few samples (i.e.\, few-shot learning). In this talk I will discuss:\n1. Linear meta-learning: sample complexity of representation learning with general covariance\n2. Linear meta-learning: algorithm & analysis for overparameterized few-shot learning\n3. Generalization to nonlinear meta-learning\n\n\nBio:\nYue Sun is a 5th year PhD student at the University of Washington\, Seattle. He is interested in the theoretical understanding of optimization\, ML and control. His research includes:\n1. Nonconvex optimization on Riemannian manifolds (UW)\n2. Low order linear system identification (UW)\n3. Subspace based meta-learning (UW)\n4. Nonconvex optimization applied in optimal control (UW)\n5. Online optimization for video coding (Google\, 2019)\n6. Compressive sensing and phase retrieval (Ohio State U\, 2015\; Nokia Bell Labs\, 2021)
URL:https://ifds.info/event/mloptuw/
CATEGORIES:MLOpt@UWash
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210421T123000
DTEND;TZID=America/Chicago:20210421T133000
DTSTAMP:20260409T110159Z
CREATED:20210202T201514Z
LAST-MODIFIED:20210219T192936Z
UID:1021-1619008200-1619011800@ifds.info
SUMMARY:SILO: Emmanuel Abbe
DESCRIPTION:Title: TBD \nUNTIL FURTHER NOTICE: Seminars are virtual. Sign up for the SILO email list to receive the links to each talk at https://groups.google.com/ and browse for silo
URL:https://ifds.info/event/silo-04212021/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210419T133000
DTEND;TZID=America/Chicago:20210419T143000
DTSTAMP:20260409T110159Z
CREATED:20210202T203343Z
LAST-MODIFIED:20210316T143205Z
UID:1041-1618839000-1618842600@ifds.info
SUMMARY:IFDS Ideas Forum: Zhiyan Ding
DESCRIPTION:Person: Zhiyan Ding \nTitle: Connection between optimization and sampling \nAbstract: \nBio: I am a third-year PhD student in the Department of Mathematics\, University of Wisconsin-Madison. My advisor is Qin Li.
URL:https://ifds.info/event/ifds-ideas-forum-04192021/
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20210416T133000
DTEND;TZID=America/Los_Angeles:20210416T143000
DTSTAMP:20260409T110159Z
CREATED:20210412T182350Z
LAST-MODIFIED:20210412T182350Z
UID:1138-1618579800-1618583400@ifds.info
SUMMARY:ML-Opt: Adhyyan Narang
DESCRIPTION:Title: Classification vs Adversarial Examples for the Overparameterized Linear Model \n(Joint work with Vidya Muthukumar and Anant Sahai at UC Berkeley) \nAbstract: In modern machine learning\, overparameterized models are often used. It has been empirically observed that these often generalize well and display double-descent\, but are susceptible to adversarial perturbations. Past theoretical explanations of these phenomena usually focus on linear models where the adversary has the power to perturb the features directly. However\, the field of meta-learning has revealed that neural networks can be interpreted as first learning a feature-representation and then learning the best linear model on these learned features. The role of lifting in adversarial susceptibility of models is largely unaddressed\, primarily because the problem of finding an adversarial example for lifted models is nonconvex and difficult to solve. \nIn this talk\, I will use concepts from signal-processing to propose a toy model that exhibits all of the aforementioned phenomena\, most crucially lifting. The toy nature of the model allows us to overcome the challenge of solving the adversarial-search problem. We learn that the adversarial vulnerability arises because of a phenomenon we term spatial localization: the predictions of the learned model are markedly more sensitive in the vicinity of training points than elsewhere. Despite the adversarial susceptibility\, we find that classification using spatially localized features can be “easier”\, i.e.\, less sensitive to the strength of the prior than in independent feature setups. \nBio: Adhyyan Narang (ECE) is a first-year PhD student\, advised by Maryam Fazel and Lilian Ratliff. He is interested in fundamental theoretical questions about learning from data\, and broadly works in the intersection of machine learning\, optimization and game theory.
URL:https://ifds.info/event/mlopt-adhyyan-narang/
CATEGORIES:MLOpt@UWash
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210414T123000
DTEND;TZID=America/Chicago:20210414T133000
DTSTAMP:20260409T110159Z
CREATED:20210202T201325Z
LAST-MODIFIED:20210412T181842Z
UID:1019-1618403400-1618407000@ifds.info
SUMMARY:SILO: Merve Bodur
DESCRIPTION:Title: Copositive Duality for Discrete Markets and Games \nProblems involving binary decisions are often modelled as mixed-integer programs (MIPs). Such models are nonconvex and lack strong duality\, which prevents the use of tools such as shadow prices and KKT conditions. For example\, in convex markets\, shadow (dual) prices are associated with market equilibrium\, and for convex games the existence and uniqueness of Nash equilibrium can be proven via fixed-point theorems and KKT conditions. Those results are lacking in their nonconvex counterparts. We use copositive programming to formulate discrete problems in applications including nonconvex energy markets and nonconvex games\, to leverage its convexity and strong duality features. We obtain several novel theoretical and numerical results for those applications\, including a new revenue-adequate pricing scheme for energy markets\, and existence\, uniqueness\, and KKT conditions for the pure-strategy Nash equilibrium in discrete games. We also propose a novel and purely MIP-based cutting-plane algorithm for mixed-integer copositive programs\, and employ it in our applications. This is joint work with Cheng Guo and Josh A. Taylor. \nUNTIL FURTHER NOTICE: Seminars are virtual. Sign up for the SILO email list to receive the links to each talk at https://groups.google.com/ and browse for silo
URL:https://ifds.info/event/silo-rashmi-vimayak/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210412T133000
DTEND;TZID=America/Chicago:20210412T143000
DTSTAMP:20260409T110159Z
CREATED:20210202T203236Z
LAST-MODIFIED:20210211T154258Z
UID:1039-1618234200-1618237800@ifds.info
SUMMARY:IFDS Ideas Forum: Vivak Patel
DESCRIPTION:Person: Vivak Patel \nTitle: Adaptive Iterative Methods for Linear Systems \nAbstract: \nBio: My research is at the intersection of numerical optimization and statistical estimation with applications to dynamical systems. On the one hand\, I interpret this as using numerical optimization ideas to design statistical estimators that are practically computable\, yet still retain certain desirable statistical properties. On the other hand\, I interpret this as using statistical ideas to design optimization methods for solving minimization problems that involve uncertainty. Beyond these two main areas of work\, I also do some work in machine learning\, numerical linear algebra and load modeling for power systems.
URL:https://ifds.info/event/ifds-ideas-forum-04122021/
CATEGORIES:IFDS Ideas Forum
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20210409T133000
DTEND;TZID=America/Los_Angeles:20210409T143000
DTSTAMP:20260409T110159Z
CREATED:20210412T183054Z
LAST-MODIFIED:20210412T183353Z
UID:1140-1617975000-1617978600@ifds.info
SUMMARY:ML-Opt: Krishna Pillutla
DESCRIPTION:Title: Distributionally Robust Machine Learning with the Superquantile 1) For Supervised Learning\, 2) For Federated Learning\n\n\nAbstract: I will talk about distributionally robust machine learning\, a principled approach for robust performance across subpopulations and shifting distributions. We will focus on the superquantile\, a.k.a. the Conditional Value at Risk (CVaR)\, which was popularized by the seminal work of UW’s own R. T. Rockafellar and co-authors in the field of computational finance and economics in the early 2000s.\nWe will first review the use of the superquantile for distributionally robust supervised learning. We will prove a generalization bound from first principles.\nSecond\, we will discuss an application of the superquantile in the field of federated learning\, i.e.\, the distributed training of machine learning models on mobile phones. We will quantify the extent to which a user conforms to the population distribution and show how the superquantile can be leveraged to improve performance on users who do not conform to the population. We will round off the discussion with a communication-efficient training algorithm and experimental results on heterogeneous datasets.\n\nBio: Krishna Pillutla is a 5th year Ph.D. student at the Paul G. Allen School of Computer Science and Engineering at the University of Washington\, where he is advised by Zaid Harchaoui and Sham Kakade. Krishna is broadly interested in machine learning and optimization and works in the particular areas of structured prediction and federated learning. Krishna was a 2019-20 JP Morgan Ph.D. Fellow.
URL:https://ifds.info/event/ml-opt-krishna-pillutla/
CATEGORIES:MLOpt@UWash
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210407T123000
DTEND;TZID=America/Chicago:20210407T133000
DTSTAMP:20260409T110159Z
CREATED:20210202T193325Z
LAST-MODIFIED:20210412T181944Z
UID:998-1617798600-1617802200@ifds.info
SUMMARY:SILO: Rebecca Willett
DESCRIPTION:
URL:https://ifds.info/event/silo-anna-choromanska/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210405T133000
DTEND;TZID=America/Chicago:20210405T143000
DTSTAMP:20260409T110159Z
CREATED:20210202T203119Z
LAST-MODIFIED:20210211T154049Z
UID:1037-1617629400-1617633000@ifds.info
SUMMARY:IFDS Ideas Forum: Liu Yang
DESCRIPTION:Person: Liu Yang \nTitle: \nAbstract: \nBio:
URL:https://ifds.info/event/ifds-ideas-forum-04052021/
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20210402T133000
DTEND;TZID=America/Los_Angeles:20210402T143000
DTSTAMP:20260409T110159Z
CREATED:20210115T201436Z
LAST-MODIFIED:20210331T134630Z
UID:880-1617370200-1617373800@ifds.info
SUMMARY:IFDS All-Hands: Chaobing Song & Xuezhou Zhang
DESCRIPTION:Speaker 1: Dr. Chaobing Song\, IFDS Postdoctoral Scholar at U Wisconsin (advised by Profs. Jelena Diakonikolas and Steve Wright)\n\nTitle: Closing Convergence Gaps for Both Smooth and Nonsmooth Convex Finite-Sums\n \nAbstract: In the foundations of data science\, optimization problems with a finite-sum structure are ubiquitous\, such as the classical empirical risk minimization problem and deep neural network training. In the past decade\, one main concern in the study of first-order methods is how the finite-sum structure influences optimization efficiency and scalability. In this talk\, based on my work (https://arxiv.org/abs/2006.10281\, https://arxiv.org/abs/2102.13643)\, I will talk about a consistent approach based on dual averaging and a specially designed initialization strategy to close convergence gaps for smooth convex finite-sums and nonsmooth convex finite-sums. For the first time\, the proposed VRADA algorithm (https://arxiv.org/abs/2006.10281) matches the lower bounds of all three regimes for smooth convex finite-sums\; for the first time\, the proposed VRPDA^2 algorithm (https://arxiv.org/abs/2102.13643) shows an O(n) improvement theoretically over existing deterministic methods and stochastic primal-dual coordinate methods\, where n is the number of data samples. Both algorithms also have good empirical performance.\n\n \nSpeaker 2: Xuezhou Zhang\, IFDS RA (advised by Prof. Jerry Zhu)\n \n\nTitle: Statistical Robustness in Reinforcement Learning\n \nAbstract: Traditional robust statistics studies the problem of statistical estimation under data contamination. In this talk\, we extend this investigation to decision-making problems. In the first half of the talk\, I will illustrate the concept and unique challenges of robust decision making using a multi-armed bandit as an example. In the second half of the talk\, I will present some of the lower- and upper-bound results we have already established\, and discuss several open directions. 
This talk is partially based on our recent paper [https://arxiv.org/pdf/2102.05800.pdf].\n\nSpeaker Bios:\n \nDr. Chaobing Song: I am a postdoc researcher at the University of Wisconsin-Madison with Prof. Jelena Diakonikolas and Prof. Stephen J Wright. My research interests include optimization and machine learning. I obtained my Ph.D. degree at Tsinghua University in 2020\, supervised by Prof. Yi Ma of the University of California\, Berkeley (who is an affiliate professor at Tsinghua-Berkeley Shenzhen Institute\, Tsinghua University). I spent two wonderful years at Berkeley and finished my Ph.D. thesis there. The main focus of my current research is to apply a general\, powerful and concise framework [arXiv] to optimization problems in machine learning. The development based on this framework substantially improves complexity results in many well-known settings\, such as the setting of nonsmooth convex finite-sums in this talk. \n \nXuezhou Zhang: Xuezhou is a Ph.D. candidate in the computer sciences department at the University of Wisconsin-Madison\, advised by Professor Jerry Zhu. Before coming to Madison\, he obtained a Bachelor of Applied Mathematics degree from UCLA. His research interests lie in machine learning and reinforcement learning\, and his recent research focuses on designing reinforcement learning algorithms that learn more adaptively and robustly in non-stochastic environments.\n\nThese talks are remote via Zoom. Please contact the organizer if you need the link. \nAll-Hands titles and abstracts are tentative\, as of the posting date.
URL:https://ifds.info/event/ifds-all-hands-040221/
CATEGORIES:Monthly All-Hands
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210331T123000
DTEND;TZID=America/Chicago:20210331T133000
DTSTAMP:20260409T110159Z
CREATED:20210202T201200Z
LAST-MODIFIED:20210219T193005Z
UID:1017-1617193800-1617197400@ifds.info
SUMMARY:SILO: Jean Honorio
DESCRIPTION:Title: TBD \nUNTIL FURTHER NOTICE: Seminars are virtual. Sign up for the SILO email list to receive the links to each talk at https://groups.google.com/ and browse for silo
URL:https://ifds.info/event/silo-03312021/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210329T133000
DTEND;TZID=America/Chicago:20210329T140000
DTSTAMP:20260409T110159Z
CREATED:20210114T203812Z
LAST-MODIFIED:20210115T201047Z
UID:825-1617024600-1617026400@ifds.info
SUMMARY:IFDS Ideas Forum: Jeffrey Covington
DESCRIPTION:Title: \nAbstract: \nJeffrey Covington (Mathematics)\, advised by Nan Chen (Mathematics) and Sebastien Roch (Mathematics)\, works on data assimilation for model state estimation and prediction. His primary focus is on models with nonlinear and non-Gaussian features\, which present problems for traditional data assimilation techniques. Currently he is working on developing techniques for Lagrangian data assimilation problems\, which typically involve high-dimensionality and strong nonlinear interactions.
URL:https://ifds.info/event/ifds-ideas-forum-jeffrey-covington/
LOCATION:Webex
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210324T080000
DTEND;TZID=America/Chicago:20210324T170000
DTSTAMP:20260409T110159Z
CREATED:20210202T201035Z
LAST-MODIFIED:20210202T201035Z
UID:1015-1616572800-1616605200@ifds.info
SUMMARY:SILO: Alon Orlitsky
DESCRIPTION:Title: TBD \nUNTIL FURTHER NOTICE: Seminars are virtual. Sign up for the SILO email list to receive the links to each talk at https://groups.google.com/ and browse for silo
URL:https://ifds.info/event/silo-alon-orlitsky/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210322T133000
DTEND;TZID=America/Chicago:20210322T143000
DTSTAMP:20260409T110159Z
CREATED:20210201T203015Z
LAST-MODIFIED:20210219T192804Z
UID:1035-1616419800-1616423400@ifds.info
SUMMARY:IFDS Ideas Forum: Tara Javidi
DESCRIPTION:Person: Tara Javidi \nTitle: \nAbstract: \nBio:
URL:https://ifds.info/event/ifds-ideas-forum-03222021/
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210317T123000
DTEND;TZID=America/Chicago:20210317T133000
DTSTAMP:20260409T110159Z
CREATED:20210202T200937Z
LAST-MODIFIED:20210316T152530Z
UID:1013-1615984200-1615987800@ifds.info
SUMMARY:SILO: Sebastien Bubeck
DESCRIPTION:Title:  A law of robustness for two-layers neural networks \nUNTIL FURTHER NOTICE: Seminars are virtual. Sign up for the SILO email list to receive the links to each talk at https://groups.google.com/ and browse for silo
URL:https://ifds.info/event/silo-sebastien-bubeck/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210315T133000
DTEND;TZID=America/Chicago:20210315T140000
DTSTAMP:20260409T110159Z
CREATED:20210114T195355Z
LAST-MODIFIED:20210211T153918Z
UID:818-1615815000-1615816800@ifds.info
SUMMARY:IFDS Ideas Forum: Aditya Kumar Akash & Sixu Li
DESCRIPTION:Aditya Kumar Akash & Sixu Li \nTitle: Deep fusion through Wasserstein barycenters in TLp spaces \nAbstract:
URL:https://ifds.info/event/ifds-ideas-forum-liu-yang/
LOCATION:Webex
CATEGORIES:IFDS Ideas Forum
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210312T133000
DTEND;TZID=America/Chicago:20210312T143000
DTSTAMP:20260409T110159Z
CREATED:20210115T203442Z
LAST-MODIFIED:20210115T203442Z
UID:896-1615555800-1615559400@ifds.info
SUMMARY:ML-Opt @ UWash: Vincent Roulet
DESCRIPTION:Title: Global convergence of first-order methods for nonlinear control problems \nAbstract:
URL:https://ifds.info/event/ml-opt-uwash-vincent-roulet/
CATEGORIES:MLOpt@UWash
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210310T130000
DTEND;TZID=America/Chicago:20210310T133000
DTSTAMP:20260409T110159Z
CREATED:20210202T200752Z
LAST-MODIFIED:20210202T200808Z
UID:1010-1615381200-1615383000@ifds.info
SUMMARY:SILO: Yuanzhi Li
DESCRIPTION:Title: TBD \nUNTIL FURTHER NOTICE: Seminars are virtual. Sign up for the SILO email list to receive the links to each talk at https://groups.google.com/ and browse for silo
URL:https://ifds.info/event/silo-yuanzhi-li/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20210310T123000
DTEND;TZID=America/Chicago:20210310T133000
DTSTAMP:20260409T110159Z
CREATED:20210202T195033Z
LAST-MODIFIED:20210308T174906Z
UID:1008-1615379400-1615383000@ifds.info
SUMMARY:SILO: Zhao Song
DESCRIPTION:Title: Faster Optimization: From linear programming to semidefinite programming \nAbstract: Many important real-life problems\, in both convex and non-convex settings\, can be solved using path-following optimization methods. The running time of optimization algorithms is typically governed by two components: the number of iterations and the cost per iteration. For decades\, the vast majority of research effort was dedicated to improving the number of iterations required for convergence. A recent line of work of ours shows that the cost per iteration can be dramatically improved using a careful combination of dynamic data structures with ‘robust’ variants of the optimization method. A central ingredient is the use of randomized linear algebra for dimensionality reduction (e.g.\, linear sketching) for fast maintenance of dynamic matrix problems. This framework recently led to many breakthroughs on decades-old optimization problems. \nIn this talk\, I will present the framework underlying these breakthroughs\, focusing on faster algorithms for linear programming and semidefinite programming. We will first present how to use the above idea to speed up general LP solvers by providing an n^omega + n^{2+1/18} time algorithm. We then show how to apply similar ideas to SDP solvers by providing an n^omega + n^{2+1/4} time algorithm. For the current omega = 2.373\, we can solve LP and SDP as fast as solving linear systems. \nThis is joint work with\nBaihe Huang (undergraduate at Peking University)\,\nShunhua Jiang\, Runzhou Tao\, Hengjie Zhang (Ph.D. students at Columbia University)\,\nOmri Weinstein (Professor at Columbia University) \nLP paper: https://arxiv.org/abs/2004.07470\nSDP paper: https://arxiv.org/abs/2101.08208 \n\nUNTIL FURTHER NOTICE: Seminars are virtual. Sign up for the SILO email list to receive the links to each talk at https://groups.google.com/ and browse for silo
URL:https://ifds.info/event/silo-zhao-song/
CATEGORIES:SILO
ORGANIZER;CN="Rob Nowak":MAILTO:rdnowak@wisc.edu
END:VEVENT
END:VCALENDAR