D. Davis, D. Drusvyatskiy. Active strict saddles in nonsmooth optimization. To appear in Found. Comput. Math., 2021

D. Drusvyatskiy, L. Xiao. Stochastic optimization with decision-dependent distributions. Submitted, arXiv:2011.11173, 2021

D. Davis, D. Drusvyatskiy, L. Xiao, J. Zhang. From low probability to high confidence in stochastic optimization. Submitted, 2021; short version appeared in Conference on Learning Theory (pp. 1411-1427), PMLR.

V. Charisopoulos, Y. Chen, D. Davis, M. Diaz, L. Ding, D. Drusvyatskiy. Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence. To appear in Found. Comput. Math., 2021, doi:10.1007/s10208-020-09490-9

A. Stevens, R. Willett, A. Mamalakis, E. Foufoula-Georgiou, A. Tejedor, J. T. Randerson, P. Smyth, and S. Wright. Graph-guided regularized regression of Pacific Ocean climate variables to increase predictive skill of southwestern US winter precipitation. Journal of Climate, vol. 34, no. 2, pp. 737–754, 2021.

Yuji Roh, Kangwook Lee, Steven Euijong Whang, Changho Suh. FairBatch: Batch Selection for Model Fairness. Ninth International Conference on Learning Representations (ICLR) 2021

Luo, Y., Raskutti, G., Yuan, M., and Zhang, A. R. A sharp blockwise tensor perturbation bound for orthogonal iteration. Journal of Machine Learning Research, accepted pending minor revision, 2021+.

D. Wang, Z. Zhao, Y. Yu, and R. Willett. Functional linear regression with mixed predictors. arXiv preprint arXiv:2012.00460, 2020.

D. Gilton, G. Ongie, and R. Willett. Model adaptation for inverse problems in imaging. arXiv preprint arXiv:2012.00139, 2020.

D. Wang, Z. Zhao, R. Willett, and C. Y. Yau. Functional autoregressive processes in reproducing kernel Hilbert spaces. arXiv preprint arXiv:2011.13993, 2020.

D. Wang, Y. Yu, and R. Willett. Detecting abrupt changes in high-dimensional self-exciting Poisson processes. arXiv preprint arXiv:2006.03572, 2020.

Mukherjee, S., Tripathy, A. and Nowak, R. Generalized Chernoff Sampling for Active Learning and Structured Bandit Algorithms. arXiv preprint arXiv:2012.08073, 2020.

Malloy, M.L., Tripathy, A. and Nowak, R.D. Optimal confidence regions for the multinomial parameter. arXiv preprint arXiv:2002.01044, 2020.

Haotian Jiang, Tarun Kathuria, Yin Tat Lee, Swati Padmanabhan, Zhao Song. A Faster Interior Point Method for Semidefinite Programming. FOCS 2020

Jan van den Brand, Yin Tat Lee, Danupon Nanongkai, Richard Peng, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang. Bipartite Matching in Nearly-linear Time on Moderately Dense Graphs. FOCS 2020

Sébastien Bubeck, Ronen Eldan, Yin Tat Lee, Dan Mikulincer. Network size and size of the weights in memorization with two-layers neural networks. NeurIPS 2020

Yair Carmon, Arun Jambulapati, Qijia Jiang, Yujia Jin, Yin Tat Lee, Aaron Sidford, Kevin Tian. Acceleration with a Ball Optimization Oracle. NeurIPS 2020

Marek Eliáš, Michael Kapralov, Janardhan Kulkarni, Yin Tat Lee. Differentially Private Release of Synthetic Graphs. SODA 2020

Sébastien Bubeck, Bo’az Klartag, Yin Tat Lee, Yuanzhi Li, Mark Sellke. Chasing Nested Convex Bodies Nearly Optimally. SODA 2020

Sally Dong, Yin Tat Lee, Kent Quanrud. Computing Circle Packing Representations of Planar Graphs. SODA 2020

Jan van den Brand, Yin Tat Lee, Aaron Sidford, Zhao Song. Solving tall dense linear programs in nearly linear time. STOC 2020

Arun Jambulapati, Yin Tat Lee, Jerry Li, Swati Padmanabhan, Kevin Tian. Positive semidefinite programming: mixed, parallel, and width-independent. STOC 2020

Haotian Jiang, Yin Tat Lee, Zhao Song, Sam Chiu-wai Wong. An improved cutting plane method for convex optimization, convex-concave games, and its applications. STOC 2020

Aditi Laddha, Yin Tat Lee, Santosh S. Vempala.  Strong self-concordance and sampling. STOC 2020

Naman Agarwal, Sham M. Kakade, Rahul Kidambi, Yin Tat Lee, Praneeth Netrapalli, Aaron Sidford. Leverage Score Sampling for Faster Accelerated Regression and ERM. ALT 2020

Yin Tat Lee, Ruoqi Shen, Kevin Tian. Logsmooth Gradient Concentration and Tighter Runtimes for Metropolized Hamiltonian Monte Carlo. COLT 2020

Yin Tat Lee, Swati Padmanabhan. An Õ(m/ε^3.5)-Cost Algorithm for Semidefinite Programs with Diagonal Constraints. COLT 2020

Chanwoo Lee, Miaoyan Wang. Beyond the Signs: Nonparametric Tensor Completion via Sign Series. arXiv preprint arXiv:2102.00384, 2021.

A. Pensia, S. Rajput, A. Nagle, H. Vishwakarma, D. Papailiopoulos. Optimal Lottery Tickets via SubsetSum: Logarithmic Over-Parameterization is Sufficient. NeurIPS 2020 (spotlight). 

H. Wang, K. Sreenivasan, S. Rajput, H. Vishwakarma, S. Agarwal, J.-Y. Sohn, K. Lee, and D. Papailiopoulos. Attack Of The Tails: Yes, You Really Can Backdoor Federated Learning. NeurIPS 2020.

S. Liu, D. Papailiopoulos, D. Achlioptas. Bad Global Minima Exist and SGD Can Reach Them. NeurIPS 2020.

Miaoyan Wang and Lexin Li. Learning from Binary Multiway Data: Probabilistic Tensor Decomposition and Its Statistical Optimality. Journal of Machine Learning Research, 21(154): 1−38, 2020

Mardia, Jay, Jiantao Jiao, Ervin Tánczos, Robert D. Nowak, and Tsachy Weissman. Concentration inequalities for the empirical distribution of discrete distributions: beyond the method of types. Information and Inference: A Journal of the IMA 9, no. 4 (2020): 813-850.

Sen, Ayon, Xiaojin Zhu, Erin Marshall, and Robert Nowak. Popular Imperceptibility Measures in Visual Adversarial Attacks are Far from Human Perception. In International Conference on Decision and Game Theory for Security, pp. 188-199. Springer, Cham, 2020.

Parhi, Rahul, and Robert D. Nowak. The role of neural network activation functions. IEEE Signal Processing Letters, 27 (2020): 1779-1783.

Karzand, Mina, and Robert D. Nowak. MaxiMin Active Learning in Overparameterized Model Classes. IEEE Journal on Selected Areas in Information Theory (2020). https://doi.org/10.1109/JSAIT.2020.2991518

Mason, Blake, Lalit Jain, Ardhendu Tripathy, and Robert Nowak. Finding All ε-Good Arms in Stochastic Bandits. Advances in Neural Information Processing Systems 33 (2020).

Chanwoo Lee and Miaoyan Wang. Tensor denoising and completion based on ordinal observations. International Conference on Machine Learning (ICML), 2020.

Jiaxin Hu, Chanwoo Lee, and Miaoyan Wang. Supervised Tensor Decomposition with Interactive Side Information. Advances in Neural Information Processing Systems 33 (NeurIPS) Workshop on Machine Learning and the Physical Sciences, 2020.

Luo, Y., Huang, W., Li, X., and Zhang, A. R. Recursive importance sketching for rank constrained least squares: Algorithms and high-order convergence. submitted, 2020. https://arxiv.org/abs/2011.08360

Zhou, Y., Zhang, A. R., Zheng, L., and Wang, Y. Optimal Ultrahigh-order tensor SVD via tensor-train orthogonal iteration. submitted, 2020. https://arxiv.org/abs/2010.02482

Han, R., Luo, Y., Wang, M., and Zhang, A. R. Exact Clustering in Tensor Block Model: Statistical Optimality and Computational Limit. submitted, 2020. https://arxiv.org/abs/2012.09996

Luo, Y., Han, R. and Zhang, A. R. A Schatten-q matrix perturbation theory via perturbation projection error bound. submitted, 2020. https://arxiv.org/abs/2008.01312

Xia, D., Zhang, A. R., and Zhou, Y. Inference for low-rank tensors — no need to debias. submitted, 2020. https://arxiv.org/abs/2012.14844

Cai, T. T., Han, R., and Zhang, A. R. On the non-asymptotic concentration of heteroskedastic Wishart-type random matrix. submitted, 2020. https://arxiv.org/abs/2008.12434

Luo, Y. and Zhang, A. R. Tensor clustering with planted structures: Statistical optimality and computational limits. submitted, 2020. https://arxiv.org/abs/2005.10743

Zhang, C., Han, R., Zhang, A. R., and Voyles, P. M. Denoising Atomic Resolution 4D Scanning Transmission Electron Microscopy Data with Tensor Singular Value Decomposition. Ultramicroscopy, 219, 113123, 2020.

Luo, Y. and Zhang, A. R. Open Problem: Average-Case Hardness of Hypergraphic Planted Clique Detection. Proceedings of the 33rd Conference on Learning Theory (COLT), PMLR 125:3852-3856, 2020.

Brandon Legried, Erin Molloy, Tandy Warnow, and Sebastien Roch. Polynomial-Time Statistical Estimation of Species Trees Under Gene Duplication and Loss. Proceedings of RECOMB 2020, 120-135.

Wai-Tong Louis Fan, Brandon Legried, and Sebastien Roch. Impossibility of Consistent Distance Estimation from Sequence Lengths Under the TKF91 Model. Bulletin of Mathematical Biology, 82, Article 123, 2020.

Yuling Yan, Bret Hanlon, Sebastien Roch and Karl Rohe. Asymptotic seed bias in respondent-driven sampling. Electronic Journal of Statistics, 14(1):1577-1610, 2020.

Louis Fan and Sebastien Roch. Statistically consistent and computationally efficient inference of ancestral DNA sequences in the TKF91 model under dense taxon sampling. Bulletin of Mathematical Biology, 82(2):21, 2020.

Wai-Tong Louis Fan, Brandon Legried & Sebastien Roch. Impossibility of phylogeny reconstruction from k-mer counts. Submitted, 2020. https://arxiv.org/abs/2010.14460

Max Hill, Brandon Legried and Sebastien Roch. Species tree estimation under joint modeling of coalescence and duplication: sample complexity of quartet methods. Submitted, 2020. https://arxiv.org/abs/2007.06697

Gautam Dasarathy, Elchanan Mossel, Robert Nowak and Sebastien Roch. Coalescent-based species tree estimation: a stochastic Farris transform. Submitted, 2020. https://arxiv.org/abs/1707.04300

Yilin Zhang, Karl Rohe, Sebastien Roch. Reducing Seed Bias in Respondent-Driven Sampling by Estimating Block Transition Probabilities. Submitted, 2020. https://arxiv.org/abs/1812.01188

Shashank Rajput, Anant Gupta, Dimitris Papailiopoulos. Closing the convergence gap of SGD without replacement. February 2020 https://arxiv.org/abs/2002.10400

Z Charles, S Rajput, S Wright, D Papailiopoulos. Convergence and Margin of Adversarial Training on Separable Data. May 2019, https://arxiv.org/abs/1905.09209

Ng, T. L. and Newton, M. A., Random weighting to approximate posterior inference in LASSO regression. submitted, February, 2020. https://arxiv.org/abs/2002.02629 

Ke Chen, Qin Li, Kit Newton, Steve Wright. Structured random sketching for PDE inverse problems. submitted, 2020.

Zhiyan Ding, Lukas Einkemmer, and Qin Li. Error analysis of an asymptotic preserving dynamical low-rank integrator for the multi-scale radiative transfer equation. submitted, 2020.

Wright, S. J. and Lee, C.-p. Analyzing random permutations for cyclic coordinate descent. to appear in Mathematics of Computation, 2020.

Han, R., Willett, R., and Zhang, A. An optimal statistical and computational framework for generalized tensor estimation. under revision, October, 2020. https://arxiv.org/abs/2002.11255

Zhu, Z., Li, X., Wang, M., and Zhang, A. Learning Markov models via low-rank optimization. Operations Research, to appear, 2020. https://arxiv.org/abs/1907.00113

Cai, T. T., Zhang, A., and Zhou, Y. Sparse Group Lasso: Optimal Sample Complexity, Convergence Rate, and Statistical Inference. under revision, December, 2020. https://arxiv.org/abs/1909.09851

Anru Zhang, Yuetian Luo, Garvesh Raskutti, and Ming Yuan. ISLET: Fast and optimal low-rank tensor regression via importance sketching. SIAM Journal on Mathematics of Data Science, 2, 444-479, 2020. https://arxiv.org/abs/1911.03804

Y. Li, B. Mark, G. Raskutti, R. Willett, H. Song, and D. Neiman. Graph-based regularization for regression problems with alignment and highly-correlated designs. accepted to SIAM Journal on Mathematics of Data Science, arXiv:1803.07658, 2020.

W. J. Marais, R. E. Holz, J. S. Reid, and R. M. Willett. Leveraging spatial textures, through machine learning, to identify aerosol and distinct cloud types from multispectral observations. submitted, 2020.

G. Ongie, C. Metzler, A. Jalal, A. Dimakis, R. Baraniuk, and R. Willett. Deep learning techniques for inverse problems in imaging. submitted, 2020. 

G. Ongie, D. Pimentel-Alarcon, L. Balzano, R. Nowak, and R. Willett. Tensor methods for nonlinear matrix completion. submitted, 2020.

L. Zheng, R. Willett, and G. Raskutti. Context-dependent self-exciting point processes: models, methods, and risk bounds in high dimensions. submitted, 2020.

D. Wang, Y. Yu, A. Rinaldo, and R. Willett. Localizing changes in high-dimensional vector autoregressive processes. submitted, 2020.

Daren Wang, Kevin Lin, and Rebecca Willett. Statistically and computationally efficient change point localization in regression settings. arXiv preprint arXiv:1906.11364, 2019.

B. D. Luck, J. L. Drewry, R. D. Shaver, R. M. Willett, and L. F. Ferraretto. Predicting in situ dry matter degradability of chopped and processed corn kernels using image analysis techniques. submitted, 2020.

D. Gilton, G. Ongie, and R. Willett. Neumann networks for inverse problems in imaging. IEEE Transactions on Computational Imaging, vol. 6, no. 1, pp. 328–343, 2019. arXiv:1901.03707.

D. Gilton, R. Luo, R. Willett, and G. Shakhnarovich. Detection and description of change in visual streams. submitted, 2020.

Ke Chen, Qin Li, Jianfeng Lu, Steve Wright. Randomized sampling for basis functions construction in generalized finite element methods. accepted, SIAM-MMS, 2019

Curtis, F. E., Robinson, D. P., Royer, C. W., and Wright, S. J. Trust-region Newton-CG with strong second-order complexity guarantees for nonconvex optimization. submitted, December, 2019. https://arxiv.org/abs/1912.04365

Xie, Y. and Wright, S. J. Complexity of proximal augmented Lagrangian for nonconvex optimization with nonlinear equality constraints. September, 2019. https://arxiv.org/abs/1908.00131