Aaron Sidford

Assistant Professor of Management Science and Engineering and of Computer Science, Stanford University, Stanford, CA 94305, USA.
Email: sidford@stanford.edu. Administrative contact: Jackie Nguyen, Administrative Associate.
Links: CV; Theory Group; Group Resources.
(Please note: I often do not respond to emails about applications.)

About

Before joining Stanford in Fall 2016, I was an NSF post-doctoral fellow at Carnegie Mellon University. Here is a slightly more formal third-person biography, and here is a recent-ish CV.

Research

I develop new iterative methods and dynamic algorithms that complement each other, resulting in improved optimization algorithms, and I enjoy understanding the theoretical foundations of widely used algorithms. One recurring theme is dynamic algorithms: data structures that maintain properties of dynamically changing graphs and matrices -- such as distances in a graph, or the solution of a linear system -- as the input is updated. Another research focus is optimization algorithms.
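As a deliberately simple illustration of this "maintain a property under updates" pattern (my own toy sketch, not code from any paper or resource on this page), the following structure maintains graph connectivity, a much simpler invariant than distances or linear-system solutions, under edge insertions via union-find:

    # Toy illustration only: maintain connectivity of a graph under edge
    # insertions, using union-find with path halving (a form of path
    # compression) and union by rank. Dynamic structures in the literature
    # maintain far richer properties and also support deletions.

    class IncrementalConnectivity:
        def __init__(self, n):
            self.parent = list(range(n))  # parent[v] == v means v is a root
            self.rank = [0] * n           # upper bound on tree height

        def find(self, v):
            # Path halving: point every other node on the path at its grandparent.
            while self.parent[v] != v:
                self.parent[v] = self.parent[self.parent[v]]
                v = self.parent[v]
            return v

        def add_edge(self, u, v):
            # Union by rank: attach the shallower tree under the deeper one.
            ru, rv = self.find(u), self.find(v)
            if ru == rv:
                return
            if self.rank[ru] < self.rank[rv]:
                ru, rv = rv, ru
            self.parent[rv] = ru
            if self.rank[ru] == self.rank[rv]:
                self.rank[ru] += 1

        def connected(self, u, v):
            # Query the maintained property in near-constant amortized time.
            return self.find(u) == self.find(v)

    g = IncrementalConnectivity(5)
    g.add_edge(0, 1)
    g.add_edge(3, 4)
    print(g.connected(0, 1))  # True
    print(g.connected(1, 3))  # False

The sketch only shows the interface pattern that makes such structures useful inside iterative methods: perform an update, then query the maintained property cheaply instead of recomputing it from scratch.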
Selected results, talks, and code

- Computing a stationary solution for multi-agent RL is hard: indeed, computing a CCE for simultaneous games and an NE for turn-based games are both PPAD-hard. (Slides from my talk at ITCS.)
- "In this talk, I will present a new algorithm for solving linear programs." (BayLearn, 2019.)
- ADFOCS summer school lectures. (Unlike previous ADFOCS, this year the event will take place over the span of three weeks.)
- "Collection of variance-reduced / coordinate methods for solving matrix games, with simplex or Euclidean ball domains."
- "General variance reduction framework for solving saddle-point problems & improved runtimes for matrix games."
- "Streaming matching (and optimal transport) in \(\tilde{O}(1/\epsilon)\) passes and \(O(n)\) space."
- Accelerated methods for nonconvex optimization: this work presents an accelerated gradient method for nonconvex optimization problems with Lipschitz continuous first and second derivatives that is Hessian free, i.e., it only requires gradient computations, and is therefore suitable for large-scale applications.
- Lower bounds for finding stationary points: we establish lower bounds on the complexity of finding \(\epsilon\)-stationary points of smooth, non-convex high-dimensional functions using first-order methods.
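To unpack the last item: a point \(x\) is called \(\epsilon\)-stationary for a differentiable \(f\) when \(\|\nabla f(x)\| \le \epsilon\). As standard background (a textbook bound, not the lower-bound construction from the paper), if \(f\) is \(L\)-smooth and bounded below by \(f^*\), then gradient descent with step size \(1/L\) satisfies

\[
x_{t+1} = x_t - \tfrac{1}{L}\,\nabla f(x_t), \qquad \min_{0 \le t < T} \|\nabla f(x_t)\| \;\le\; \sqrt{\frac{2L\,\big(f(x_0) - f^*\big)}{T}},
\]

so \(T = O(\epsilon^{-2})\) iterations suffice to find an \(\epsilon\)-stationary point; lower bounds of the kind above quantify how much first-order methods can, or cannot, improve such guarantees.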
News

Congratulations to Prof. Aaron Sidford for receiving the Best Paper Award at the 2022 Conference on Learning Theory (COLT 2022)! The paper, Efficient Convex Optimization Requires Superlinear Memory, was co-authored with Stanford professor Gregory Valiant as well as current Stanford student Annie Marsden and alumnus Vatsal Sharan, and was chosen from more than 150 accepted papers at the conference.

Publications

Efficient Convex Optimization Requires Superlinear Memory, With Annie Marsden, Vatsal Sharan, and Gregory Valiant, In Conference on Learning Theory (COLT 2022), Best Paper Award (arXiv)
Efficiently Solving MDPs with Stochastic Mirror Descent, With Yujia Jin, In International Conference on Machine Learning (ICML 2020)
Lower Bounds for Finding Stationary Points II: First-Order Methods (arXiv)
Lower Bounds for Finding Stationary Points I, Mathematical Programming 184 (1-2), 71-120 (arXiv)
Derandomization beyond Connectivity: Undirected Laplacian Systems in Nearly Logarithmic Space, With Jack Murtagh, Omer Reingold, and Salil Vadhan, Book chapter in Building Bridges II: Mathematics of László Lovász, 2020 (arXiv)
Near-Optimal Time and Sample Complexities for Solving Markov Decision Processes with a Generative Model, In Advances in Neural Information Processing Systems 31 (NeurIPS 2018)
Variance Reduced Value Iteration and Faster Algorithms for Solving Markov Decision Processes, In Proceedings of the Twenty-Ninth Annual ACM-SIAM Symposium on Discrete Algorithms (SODA 2018)
Accelerated Methods for Nonconvex Optimization, SIAM Journal on Optimization 28 (2), 1751-1772
Accelerating Stochastic Gradient Descent for Least Squares Regression, With Prateek Jain, Sham M. Kakade, Rahul Kidambi, and Praneeth Netrapalli
Parallelizing Stochastic Gradient Descent for Least Squares Regression: Mini-Batching, Averaging, and Model Misspecification, With Prateek Jain, Sham M. Kakade, Rahul Kidambi, and Praneeth Netrapalli
Streaming PCA: Matching Matrix Bernstein and Near-Optimal Finite Sample Guarantees for Oja's Algorithm, With Prateek Jain, Chi Jin, Sham M. Kakade, and Praneeth Netrapalli
Convex Until Proven Guilty: Dimension-Free Acceleration of Gradient Descent on Non-Convex Functions, In International Conference on Machine Learning (ICML 2017), 654-663
Robust Shift-and-Invert Preconditioning: Faster and More Sample Efficient Algorithms for Eigenvector Computation, With Dan Garber, Elad Hazan, Chi Jin, Sham M. Kakade, Cameron Musco, and Praneeth Netrapalli, Journal of Machine Learning Research, 2017 (arXiv)
Spectrum Approximation Beyond Fast Matrix Multiplication: Algorithms and Hardness
A Faster Cutting Plane Method and its Implications for Combinatorial and Convex Optimization, In Symposium on Foundations of Computer Science (FOCS 2015), Machtey Award for Best Student Paper (arXiv)
Efficient Inverse Maintenance and Faster Algorithms for Linear Programming, In Symposium on Foundations of Computer Science (FOCS 2015) (arXiv)
Competing with the Empirical Risk Minimizer in a Single Pass, With Roy Frostig, Rong Ge, and Sham Kakade, In Conference on Learning Theory (COLT 2015) (arXiv)
Un-regularizing: Approximate Proximal Point and Faster Stochastic Algorithms for Empirical Risk Minimization, In International Conference on Machine Learning (ICML 2015) (arXiv)
Uniform Sampling for Matrix Approximation, With Michael B. Cohen, Yin Tat Lee, Cameron Musco, Christopher Musco, and Richard Peng, In Innovations in Theoretical Computer Science (ITCS 2015) (arXiv)
Path-Finding Methods for Linear Programming: Solving Linear Programs in Õ(√rank) Iterations and Faster Algorithms for Maximum Flow, In Symposium on Foundations of Computer Science (FOCS 2014), Best Paper Award and Machtey Award for Best Student Paper (arXiv)
Single Pass Spectral Sparsification in Dynamic Streams, With Michael Kapralov, Yin Tat Lee, Cameron Musco, and Christopher Musco, SIAM Journal on Computing 46 (1), 456-477 (arXiv before merge)
An Almost-Linear-Time Algorithm for Approximate Max Flow in Undirected Graphs, and its Multicommodity Generalizations, With Jonathan A. Kelner, Yin Tat Lee, and Lorenzo Orecchia, In Symposium on Discrete Algorithms (SODA 2014)
Efficient Accelerated Coordinate Descent Methods and Faster Algorithms for Solving Linear Systems, In Symposium on Foundations of Computer Science (FOCS 2013) (arXiv)
A Simple, Combinatorial Algorithm for Solving SDD Systems in Nearly-Linear Time, With Jonathan A. Kelner, Lorenzo Orecchia, and Zeyuan Allen Zhu, In Symposium on the Theory of Computing (STOC 2013) (arXiv)
Teaching

I teach a course introducing the theoretical foundations of discrete mathematics and algorithms, as well as Introduction to Optimization Theory (MS&E 213 / CS 269O). The optimization course page has information and lecture notes from Fall 2019, when I taught it; some of the notes I am still actively improving, and all of them I am happy to continue polishing.
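To give a flavor of the material in the optimization course, here is a minimal sketch of gradient descent for least squares regression (my own illustrative example, not code from the lecture notes; the data, dimensions, and step-size rule below are assumptions for the demo):

    import numpy as np

    # Illustrative only: gradient descent on the least squares objective
    # f(x) = ||A x - b||^2 / (2 n), a basic instance of the iterative
    # methods studied in an introductory optimization theory course.

    def least_squares_gd(A, b, iters=500):
        n, d = A.shape
        x = np.zeros(d)
        # Step size 1/L, where L = ||A||_2^2 / n is the smoothness
        # (Lipschitz-gradient) constant of f.
        step = n / np.linalg.norm(A, 2) ** 2
        for _ in range(iters):
            grad = A.T @ (A @ x - b) / n  # gradient of f at x
            x -= step * grad
        return x

    rng = np.random.default_rng(0)
    A = rng.normal(size=(100, 5))
    x_true = rng.normal(size=5)
    b = A @ x_true + 0.01 * rng.normal(size=100)

    x_hat = least_squares_gd(A, b)
    print(np.linalg.norm(x_hat - x_true))  # small; roughly the noise level

The 1/L step size matches the gradient descent guarantee sketched earlier on this page; for this smooth, strongly convex instance the iterates converge linearly to the least squares solution.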