Randomized algorithms are central to contemporary algorithm design. They are usually simple, sometimes even easy to analyze, and they work well in practice. This course provides an introduction to the basic concepts in the design and analysis of randomized algorithms.
A more detailed version of the syllabus is available as an archival pdf.
Instructor: Alex Gittens (gittea at rpi dot edu)
Lectures: MTh 10am-11:50am; Greene 120
Office Hours: MTh 12pm-1pm, or by appointment; Lally 316
Course Text: (Recommended, not required) Probability and Computing, by M. Mitzenmacher and E. Upfal.
Grading:
- Homeworks, 50%
- In-class Pop Quizzes, 25%
- Project, 25%
Students are expected to have writing supplies on hand in each class to complete the in-class pop quizzes. If you are an athlete, or are otherwise unable to attend every class, make alternative arrangements with the instructor during the first two weeks of the course.
Letter grades will be computed from the semester average. Lower-bound cutoffs for A, B, C, and D grades are 90%, 80%, 70%, and 60%, respectively. These bounds may be moved lower at the instructor's discretion.
We will largely follow Mitzenmacher and Upfal, replacing the more ML-focused content with alternative material.
- Events and Probabilities
- Discrete Random Variables
- Continuous Random Variables
- Moments and Deviations
- Chernoff and Hoeffding Bounds
- Balls, Bins, and Graphs
- Randomized Data Structures: Treaps, Skip Lists, Hash Tables, Bloom Filters
- The Probabilistic Method
- Markov Chains and Random Walks
- The Gaussian Distribution and Dimensionality Reduction
- Monte Carlo Methods
- Streaming Algorithms
- Lecture 1 (8/29/2019) : Course logistics, Randomized Algorithms, Standard algorithmic uses for randomness, Probability spaces (MU 1.2, and Theorem 1.6), Karger's min-cut algorithm (MU 1.5)
- Lecture 2 (9/3/2019) : Finish Karger's min-cut algorithm, String equality testing via polynomial equality testing, Random variables and expectations (MU 2.1, 2.3), Bernoulli and Binomial random variables (MU 2.2), Start analysis of expected runtime of randomized quicksort (MU 2.5)
- Lecture 3 (9/6/2019) : Finish runtime of randomized quicksort, Monte Carlo and Las Vegas algorithms and MC to LV conversion, the Probabilistic method: application of the first moment method to the MAX k-SAT problem (MU 6.2.2).
- Lecture 4 (9/9/2019) : Recap first moment method for the MAX k-SAT problem, first moment method for a graph coloring problem (MU 6.1), definition of conditional expectation (MU 2.3), examples of working with conditional expectation.
- Lecture 5 (9/12/2019): Review of properties of expectation and conditional expectation. Introduction to the Kaczmarz algorithm and statement of its convergence result.
- No lecture September 16
- Lecture 6 (9/19/2019): Proof of convergence of the Kaczmarz algorithm. Lecture notes.
- Lecture 7 (9/23/2019): Variance. Markov's inequality. Chebyshev's inequality. Coupon collector problem (MU 2.4.1).
- Lecture 8 (9/26/2019): More on variance. More on the coupon collector problem. Weak law of large numbers for sums of i.i.d. random variables with finite second moments.
- Lecture 9 (9/30/2019): Moment tail bounds, Moment generating functions, Cramér-Laplace-Chernoff framework for tail bounds. (MU 4.1--4.2)
- Lecture 10 (10/3/2019): Jensen's inequality (MU 2.1.2), Chernoff bounds for sums of independent random trials (MU 4.2.1--4.3), Hoeffding bounds (MU 4.5).
- Lecture 11 (10/7/2019): Chebyshev applied to #SAT problem (MU 6.2.2), Introduction to randomized routing on hypercubes (MU 4.6).
- Lecture 12 (10/10/2019): Randomized routing on hypercubes.
- Lecture 13 (10/17/2019): Randomized data structures --- treaps. (essentially from Jeff Erickson's notes, except we use max-heaps instead of min-heaps)
- Lecture 14 (10/21/2019): Treaps.
- Lecture 15 (10/24/2019): Skiplists and the Tail Sum Formula. (also following Erickson's notes)
- Lecture 16 (10/28/2019): Randomized data structures --- hashing. (essentially following Erickson's notes)
- Lecture 17 (10/31/2019): Hashing.
- Lecture 18 (11/4/2019): Hashing.
- Lecture 19 (11/7/2019): Markov Chains: memorylessness and conditional independence of past and future given present, transition matrices, Chapman-Kolmogorov equations, classical examples of MCs.
- Lecture 20 (11/11/2019): Markov Chains: stationary distributions, irreducibility, aperiodicity, ergodicity and usefulness for Markov Chain Monte Carlo Methods
- Lecture 21 (11/14/2019): MCMC Methods: Sampling using the inverse CDF, Acceptance-rejection method, Gibbs sampling, Metropolis algorithm, Metropolis-Hastings
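To give a flavor of the algorithms in the lecture log: Karger's min-cut algorithm (Lectures 1--2, MU 1.5) repeatedly contracts a uniformly random edge until two super-vertices remain. A minimal sketch, not the course's reference implementation --- the union-find representation, edge-list input, and default trial count are illustrative choices:

```python
import random

def karger_min_cut(edges, n, trials=None):
    """Estimate the min cut of a connected multigraph, given as an edge
    list on vertices 0..n-1, by repeated random edge contraction."""
    # Each trial succeeds with probability >= 2/(n*(n-1)), so on the
    # order of n^2 log n independent trials suffice with high probability.
    if trials is None:
        trials = n * n
    best = len(edges)
    for _ in range(trials):
        # Union-find over vertices; contracting an edge merges components.
        parent = list(range(n))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path halving
                x = parent[x]
            return x
        # Processing a uniformly random permutation of the edges and
        # skipping self-loops is equivalent to picking a uniformly
        # random remaining edge at each step.
        pool = edges[:]
        random.shuffle(pool)
        components = n
        for u, v in pool:
            if components == 2:
                break
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                components -= 1
        # The cut is the set of edges crossing the two super-vertices.
        cut = sum(1 for u, v in edges if find(u) != find(v))
        best = min(best, cut)
    return best
```

For example, on two triangles joined by a single bridge edge, the sketch recovers the bridge as a min cut of size 1 with high probability.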
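Randomized quicksort (Lectures 2--3, MU 2.5) picks its pivot uniformly at random, so the expected number of comparisons is O(n log n) for every input. A short sketch of the algorithm itself, using out-of-place partitioning for readability rather than the in-place variant:

```python
import random

def randomized_quicksort(a):
    """Return a sorted copy of the list a, recursing on a uniformly
    random pivot; the randomness is over the algorithm's choices, not
    the input, so no adversarial input is bad in expectation."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)
```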
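The coupon collector problem (Lectures 7--8, MU 2.4.1) asks how many uniform draws from n coupon types are needed to see all of them; the expectation is n times the n-th harmonic number. A quick simulation for checking that formula empirically --- the function names and parameters are illustrative, not from the course materials:

```python
import random

def coupon_collector_draws(n):
    """Draw uniformly from n coupon types until every type has been
    seen at least once; return the total number of draws."""
    seen = set()
    draws = 0
    while len(seen) < n:
        seen.add(random.randrange(n))
        draws += 1
    return draws

def expected_draws(n):
    """Exact expectation n * H_n, where H_n is the n-th harmonic number."""
    return n * sum(1.0 / k for k in range(1, n + 1))
```

Averaging `coupon_collector_draws(20)` over a few thousand trials lands close to `expected_draws(20)`, which is about 72.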
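The Metropolis algorithm (Lecture 21) samples from a target distribution known only up to a normalizing constant. A minimal sketch on a finite state space, assuming positive unnormalized weights and a symmetric ±1 random-walk proposal on a cycle; all names and the example target are illustrative:

```python
import random

def metropolis_sample(weights, steps, start=0):
    """Metropolis chain targeting pi(i) proportional to weights[i] on
    {0, ..., len(weights)-1}. Assumes all weights are positive. Uses a
    symmetric proposal (step left or right, wrapping around), so the
    acceptance ratio is just pi(y)/pi(x). Returns the visited states."""
    n = len(weights)
    x = start
    chain = []
    for _ in range(steps):
        # Symmetric random-walk proposal on the cycle of n states.
        y = (x + random.choice([-1, 1])) % n
        # Accept with probability min(1, pi(y)/pi(x)); the unknown
        # normalizing constant cancels in the ratio.
        if random.random() < min(1.0, weights[y] / weights[x]):
            x = y
        chain.append(x)  # rejected moves repeat the current state
    return chain
```

With target weights proportional to (1, 2, 3, 4), a long chain visits the four states with empirical frequencies near (0.1, 0.2, 0.3, 0.4).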
All assignments must be typed (preferably in LaTeX) and are due at the start of class (defined as the first 10 minutes). Late assignments will not be accepted, unless you contact the instructor at least two days before the due date to request a deferral. Deferrals will be granted at the instructor's discretion.
- HW1: Probability Theory. Updated September 11 to correct the success probability in Problem 4 and to provide hints for all problems. Assigned Thursday September 5, due Monday September 23. Start early, and ask questions about both the content and your write-ups.
- HW2 Assigned Thursday September 19, due Thursday October 10.
- HW3 Assigned Thursday October 17, due Monday October 28.
- HW4 Assigned Monday October 28, due Thursday November 7.
- HW5 Assigned Thursday November 7, due Thursday November 21.
- Substitute Quiz Assigned Monday November 11, due Monday November 25. If your score on this quiz is higher than the average of your first six quiz grades, it will replace that average.
Working in the teams listed below, you will present theoretical and empirical evaluations of a modern randomized algorithm that arose in either your own research or a paper of your choice. See the project page for more details and deadlines.
- Samvit, Chang Ju
- Sola, Colin, Manqing, Rufeng
- Adam, Brandon, Jacob
- Georgios, Dong, Kevin
- Frederik, James, Nick
- Jessie, Daniel, Feimi, Harrison