|Email:||gittea at rpi dot edu|
|Office:||316 Lally Building|
|General Office Hours:||ThF 1-2pm|
|Address:||Room 316 Lally, CS Department, 110 8th St, Troy, NY 12180|
I am an assistant professor in the Computer Science department of Rensselaer Polytechnic Institute. My research focuses on using randomization to reduce the computational costs of extracting information from large datasets. My work lies at the intersection of randomized algorithms, numerical linear algebra, high-dimensional probability, and machine learning.
I earned my PhD in applied and computational mathematics at Caltech in 2013, under the supervision of Prof. Joel Tropp. From 2013 until 2015, I was a member of the machine learning research group at eBay. Following that, I was a postdoctoral fellow in the AMPLab at UC Berkeley and a member of the International Computer Science Institute. I joined RPI in January of 2017.
My current research interests include, in no particular order:
- Federated learning.
- Word embeddings.
- AutoML and the CASH (combined algorithm and hyperparameter selection) problem.
- Randomized numerical linear algebra (NLA), with applications to machine learning, tensor approximation, and general optimization.
- Matrix and tensor completion.
- Adversarially robust machine learning.
My current graduate students are Sharmishtha Dutta (PhD; AI for threat intelligence), Dong Hu (PhD; matrix completion and low-rank approximation), and Kevin Kim (PhD; tensor decompositions). In the summer of 2020, I am also working with Chris Jerrett (BS; tensor decomposition heuristics).
Publications (out of date; see Google Scholar)
- S. Tu, S. Venkataraman, A. Wilson, A. Gittens, M. Jordan, B. Recht. Breaking Locality Accelerates Block Gauss-Seidel. ICML 2017
- S. Wang, A. Gittens, M. Mahoney. Sketched Ridge Regression: Optimization Perspective, Statistical Perspective, and Model Averaging. ICML 2017
- A. Gittens, D. Achlioptas, M. Mahoney. Skip-Gram – Zipf + Uniform = Vector Additivity. ACL 2017
- A. Gittens, A. Devarakonda, E. Racah, et al. Matrix Factorizations at Scale: a Comparison of Scientific Data Analytics in Spark and C+MPI Using Three Case Studies. IEEE BigData 2016
- A. Gittens, J. Kottalam, J. Yang, et al. A multi-platform evaluation of the randomized CX low-rank matrix factorization in Spark. IPDPS ParLearning Workshop 2016
- A. Gittens, M. Mahoney. Revisiting the Nyström Method for Improved Large-scale Machine Learning. JMLR. 2016
- C. Boutsidis, P. Kambadur, A. Gittens. Spectral Clustering via the Power Method-Provably. ICML 2015
- D. Kuang, A. Gittens, R. Hamid. Hardware compliant approximate image codes. CVPR 2015
- R. Hamid, Y. Xiao, A. Gittens, D. DeCoste. Compact Random Feature Maps. ICML 2014
- C. Boutsidis, A. Gittens. Improved matrix algorithms via the subsampled randomized Hadamard transform. SIMAX. 2013
- R. Y. Chen, A. Gittens, J. A. Tropp. The masked sample covariance estimator: an analysis using matrix concentration inequalities. Information and Inference. 2012
- J. Yang, A. Gittens. Tensor machines for learning target-specific polynomial features. 2015.
- D. Kuang, A. Gittens, R. Hamid. piCholesky: Polynomial Interpolation of Multiple Cholesky Factors for Efficient Approximate Cross-Validation. 2014
- A. Gittens, J. A. Tropp. Tail bounds for all eigenvalues of a sum of matrices. 2011
- A. Gittens. The spectral norm error of the naive Nystrom extension. 2011
|Fall 2021||Teaching CSCI6961/4961, "Machine Learning and Optimization". This course formulates ML problems as optimization problems, then focuses on solving them efficiently using algorithms appropriate for large-scale applications (aka first-order algorithms). Another focus of the course is the specific architectures used in modern machine learning to impose helpful inductive biases (aka deep learning). It's a fun, challenging, and rewarding class. Take it.|
|Spring 2021||Taught CSCI2200, "Foundations of Computer Science", aka FOCS. This course serves as an introduction to discrete mathematics and the theory of computing for computer scientists.|
|Fall 2020||Taught CSCI4961/6961, "Machine Learning and Optimization". The focus is on understanding randomized optimization algorithms, motivated by applications in machine learning and data analysis.|
|Spring 2020||Taught CSCI2200, "Foundations of Computer Science", a discrete mathematics/theory of computing course. See the website for more information.|
|Fall 2019||Taught CSCI6220/4030, "Randomized Algorithms". See the website for the syllabus and assignments.|
|Spring 2019||Taught CSCI6971/CSCI4971, "Large Scale Matrix Computation and Machine Learning". See the website for the syllabus and assignments.|
|Fall 2018||Taught CSCI6220/4030, "Randomized Algorithms". See the website for the syllabus and assignments.|
|Spring 2018||Taught CSCI6971/CSCI4971, "Large Scale Matrix Computation and Machine Learning". See the website for the syllabus and assignments.|
|Fall 2017||Taught CSCI6220/4030, "Randomized Algorithms". See the website for the syllabus and assignments.|
|Spring 2017||Taught CSCI6971/CSCI4971, "Large Scale Matrix Computation and Machine Learning". See the syllabus.|