Chanwoo Lee and Miaoyan Wang. Beyond the Signs: Nonparametric Tensor Completion via Sign Series. arXiv preprint arXiv:2102.00384, 2021.
A. Stevens, R. Willett, A. Mamalakis, E. Foufoula-Georgiou, A. Tejedor, J. T. Randerson, P. Smyth, and S. Wright. Graph-guided regularized regression of Pacific Ocean climate variables to increase predictive skill of southwestern US winter precipitation. Journal of Climate, vol. 34, no. 2, pp. 737–754, 2021. [Old preprint listed below]
D. Wang, Z. Zhao, Y. Yu, and R. Willett. Functional linear regression with mixed predictors. arXiv preprint arXiv:2012.00460, 2020.
D. Gilton, G. Ongie, and R. Willett. Model adaptation for inverse problems in imaging. arXiv preprint arXiv:2012.00139, 2020.
D. Wang, Z. Zhao, R. Willett, and C. Y. Yau. Functional autoregressive processes in reproducing kernel Hilbert spaces. arXiv preprint arXiv:2011.13993, 2020.
D. Wang, Y. Yu, and R. Willett. Detecting abrupt changes in high-dimensional self-exciting Poisson processes. arXiv preprint arXiv:2006.03572, 2020.
A. Pensia, S. Rajput, A. Nagle, H. Vishwakarma, and D. Papailiopoulos. Optimal Lottery Tickets via SubsetSum: Logarithmic Over-Parameterization is Sufficient. NeurIPS 2020 (spotlight).
H. Wang, K. Sreenivasan, S. Rajput, H. Vishwakarma, S. Agarwal, J. Y. Sohn, K. Lee, and D. Papailiopoulos. Attack of the Tails: Yes, You Really Can Backdoor Federated Learning. NeurIPS 2020.
S. Liu, D. Papailiopoulos, and D. Achlioptas. Bad Global Minima Exist and SGD Can Reach Them. NeurIPS 2020.
Miaoyan Wang and Lexin Li. Learning from Binary Multiway Data: Probabilistic Tensor Decomposition and Its Statistical Optimality. Journal of Machine Learning Research, 21(154):1–38, 2020.
Jay Mardia, Jiantao Jiao, Ervin Tánczos, Robert D. Nowak, and Tsachy Weissman. Concentration inequalities for the empirical distribution of discrete distributions: beyond the method of types. Information and Inference: A Journal of the IMA, 9(4):813–850, 2020.
Ayon Sen, Xiaojin Zhu, Erin Marshall, and Robert Nowak. Popular Imperceptibility Measures in Visual Adversarial Attacks are Far from Human Perception. International Conference on Decision and Game Theory for Security, pp. 188–199, Springer, Cham, 2020.
Rahul Parhi and Robert D. Nowak. The Role of Neural Network Activation Functions. IEEE Signal Processing Letters, 27:1779–1783, 2020.
Mina Karzand and Robert D. Nowak. MaxiMin Active Learning in Overparameterized Model Classes. IEEE Journal on Selected Areas in Information Theory, 2020.
Blake Mason, Lalit Jain, Ardhendu Tripathy, and Robert Nowak. Finding All ε-Good Arms in Stochastic Bandits. Advances in Neural Information Processing Systems 33 (2020).
Chanwoo Lee and Miaoyan Wang. Tensor denoising and completion based on ordinal observations. International Conference on Machine Learning (ICML), 2020.
Jiaxin Hu, Chanwoo Lee, and Miaoyan Wang. Supervised Tensor Decomposition with interactive side information. Advances in Neural Information Processing Systems 33 (NeurIPS) Workshop on Machine Learning and the Physical Sciences, 2020.
Luo, Y., Huang, W., Li, X., and Zhang, A. R. Recursive importance sketching for rank constrained least squares: Algorithms and high-order convergence. submitted, 2020. https://arxiv.org/abs/2011.08360
Zhou, Y., Zhang, A. R., Zheng, L., and Wang, Y. Optimal Ultrahigh-order tensor SVD via tensor-train orthogonal iteration. submitted, 2020. https://arxiv.org/abs/2010.02482
Han, R., Luo, Y., Wang, M., and Zhang, A. R. Exact Clustering in Tensor Block Model: Statistical Optimality and Computational Limit. submitted, 2020. https://arxiv.org/abs/2012.09996
Luo, Y., Han, R. and Zhang, A. R. A Schatten-q matrix perturbation theory via perturbation projection error bound. submitted, 2020. https://arxiv.org/abs/2008.01312
Xia, D., Zhang, A. R., and Zhou, Y. Inference for low-rank tensors — no need to debias. submitted, 2020. https://arxiv.org/abs/2012.14844
Cai, T. T., Han, R., and Zhang, A. R. On the non-asymptotic concentration of heteroskedastic Wishart-type random matrix. submitted, 2020. https://arxiv.org/abs/2008.12434
Luo, Y. and Zhang, A. R. Tensor clustering with planted structures: Statistical optimality and computational limits. submitted, 2020. https://arxiv.org/abs/2005.10743
Zhang, C., Han, R., Zhang, A. R., and Voyles, P. M. Denoising Atomic Resolution 4D Scanning Transmission Electron Microscopy Data with Tensor Singular Value Decomposition. Ultramicroscopy, 219, 113123, 2020.
Luo, Y. and Zhang, A. R. Open Problem: Average-Case Hardness of Hypergraphic Planted Clique Detection. Proceedings of the 33rd Conference on Learning Theory (COLT), 125:3852–3856, 2020.
Brandon Legried, Erin Molloy, Tandy Warnow, and Sebastien Roch. Polynomial-Time Statistical Estimation of Species Trees Under Gene Duplication and Loss. Proceedings of RECOMB 2020, 120-135.
Wai-Tong Louis Fan, Brandon Legried, and Sebastien Roch. Impossibility of Consistent Distance Estimation from Sequence Lengths Under the TKF91 Model. Bulletin of Mathematical Biology, 82:123, 2020.
Yuling Yan, Bret Hanlon, Sebastien Roch and Karl Rohe. Asymptotic seed bias in respondent-driven sampling. Electronic Journal of Statistics, 14(1):1577-1610, 2020.
Louis Fan and Sebastien Roch. Statistically consistent and computationally efficient inference of ancestral DNA sequences in the TKF91 model under dense taxon sampling. Bulletin of Mathematical Biology, 82(2):21, 2020.
Wai-Tong Louis Fan, Brandon Legried, and Sebastien Roch. Impossibility of phylogeny reconstruction from k-mer counts. Submitted, 2020. https://arxiv.org/abs/2010.14460
Max Hill, Brandon Legried and Sebastien Roch. Species tree estimation under joint modeling of coalescence and duplication: sample complexity of quartet methods. Submitted, 2020. https://arxiv.org/abs/2007.06697
Gautam Dasarathy, Elchanan Mossel, Robert Nowak and Sebastien Roch. Coalescent-based species tree estimation: a stochastic Farris transform. Submitted, 2020. https://arxiv.org/abs/1707.04300
Yilin Zhang, Karl Rohe, Sebastien Roch. Reducing Seed Bias in Respondent-Driven Sampling by Estimating Block Transition Probabilities. Submitted, 2020. https://arxiv.org/abs/1812.01188
Shashank Rajput, Anant Gupta, and Dimitris Papailiopoulos. Closing the convergence gap of SGD without replacement. February 2020. https://arxiv.org/abs/2002.10400
Z. Charles, S. Rajput, S. Wright, and D. Papailiopoulos. Convergence and Margin of Adversarial Training on Separable Data. May 2019. https://arxiv.org/abs/1905.09209
Ng, T. L. and Newton, M. A. Random weighting to approximate posterior inference in LASSO regression. submitted, February 2020. https://arxiv.org/abs/2002.02629
Ke Chen, Qin Li, Kit Newton, and Steve Wright. Structured random sketching for PDE inverse problems. submitted.
Zhiyan Ding, Lukas Einkemmer, and Qin Li. Error analysis of an asymptotic preserving dynamical low-rank integrator for the multi-scale radiative transfer equation. submitted.
Wright, S. J. and Lee, C.-p. Analyzing random permutations for cyclic coordinate descent. to appear in Mathematics of Computation, 2020.
Han, R., Willett, R., and Zhang, A. An optimal statistical and computational framework for generalized tensor estimation. under revision, October 2020. https://arxiv.org/abs/2002.11255
Zhu, Z., Li, X., Wang, M., and Zhang, A. Learning Markov models via low-rank optimization. Operations Research, to appear, 2020. https://arxiv.org/abs/1907.00113
Cai, T. T., Zhang, A., and Zhou, Y. Sparse Group Lasso: Optimal Sample Complexity, Convergence Rate, and Statistical Inference. under revision, December 2020. https://arxiv.org/abs/1909.09851
Anru Zhang, Yuetian Luo, Garvesh Raskutti, and Ming Yuan. ISLET: Fast and optimal low-rank tensor regression via importance sketching. SIAM Journal on Mathematics of Data Science, 2, 444-479, 2020. https://arxiv.org/abs/1911.03804
Y. Li, B. Mark, G. Raskutti, R. Willett, H. Song, and D. Neiman. Graph-based regularization for regression problems with alignment and highly-correlated designs. accepted to SIAM Journal on Mathematics of Data Science, arXiv:1803.07658, 2020.
W. J. Marais, R. E. Holz, J. S. Reid, and R. M. Willett. Leveraging spatial textures, through machine learning, to identify aerosol and distinct cloud types from multispectral observations. submitted, 2020.
G. Ongie, C. Metzler, A. Jalal, A. Dimakis, R. Baraniuk, and R. Willett. Deep learning techniques for inverse problems in imaging. submitted, 2020.
G. Ongie, D. Pimentel-Alarcon, L. Balzano, R. Nowak, and R. Willett. Tensor methods for nonlinear matrix completion. submitted, 2020.
L. Zheng, R. Willett, and G. Raskutti. Context-dependent self-exciting point processes: models, methods, and risk bounds in high dimensions. submitted, 2020.
A. Stevens, R. Willett, A. Mamalakis, E. Foufoula-Georgiou, A. Tejedor, J. Randerson, P. Smyth, and S. J. Wright. Graph-guided regularized regression of Pacific Ocean climate variables to increase predictive skill of southwestern US winter precipitation. submitted, 2020.
D. Wang, Y. Yu, A. Rinaldo, and R. Willett. Localizing changes in high-dimensional vector autoregressive processes. submitted, 2020.
Daren Wang, Kevin Lin, and Rebecca Willett. Statistically and computationally efficient change point localization in regression settings. arXiv preprint arXiv:1906.11364, 2019.
B. D. Luck, J. L. Drewry, R. D. Shaver, R. M. Willett, and L. F. Ferraretto. Predicting in situ dry matter degradability of chopped and processed corn kernels using image analysis techniques. Submitted, 2020.
D. Gilton, G. Ongie, and R. Willett. Neumann networks for inverse problems in imaging. IEEE Transactions on Computational Imaging, vol. 6, no. 1, pp. 328–343, 2020. arXiv preprint arXiv:1901.03707.
D. Gilton, R. Luo, R. Willett, and G. Shakhnarovich. Detection and description of change in visual streams. submitted, 2020.