Randomized algorithms are the state of the art in contemporary algorithm design. They are usually simple, sometimes even easy to analyze, and they work well in practice. This course provides an introduction to basic concepts in the design and analysis of randomized algorithms.
A more detailed version of the syllabus is available as an archival PDF.
Instructor: Alex Gittens (gittea at rpi dot edu)
Lectures: MTh 10am-11:50am; Greene 120
Office Hours: MTh 12pm-1pm, or by appointment; Lally 316
Course Text: (Recommended, not required) Probability and Computing, by M. Mitzenmacher and E. Upfal.
- Homeworks, 50%
- In-class Pop Quizzes, 25%
- Project, 25%
Students are expected to have writing supplies on hand in each class to complete the in-class pop quizzes. If you are a student athlete, or for some other reason are unable to attend every class, make alternative arrangements with the instructor within the first two weeks of the course.
Letter grades will be computed from the semester average. Lower-bound cutoffs for the A, B, C, and D grades are 90%, 80%, 70%, and 60%, respectively. These cutoffs may be lowered at the instructor's discretion.
We will largely follow Mitzenmacher and Upfal, replacing the more ML-focused content with alternative material.
- Events and Probabilities
- Discrete Random Variables
- Continuous Random Variables
- Moments and Deviations
- Chernoff and Hoeffding Bounds
- Balls, Bins, and Graphs
- Randomized Data Structures: Treaps, Skip Lists, Hash Tables, Bloom Filters
- The Probabilistic Method
- Markov Chains and Random Walks
- The Gaussian Distribution and Dimensionality Reduction
- Monte Carlo Methods
- Streaming Algorithms
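As a taste of the randomized data structures unit, here is a minimal Bloom filter sketch. The class name, the choice of salted SHA-256 as the hash family, and the parameters m=1024, k=3 are illustrative assumptions, not material from the course.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: a bit array probed by k hash functions.
    False positives are possible; false negatives are not."""

    def __init__(self, m=1024, k=3):
        self.m = m                 # number of bits
        self.k = k                 # number of hash functions
        self.bits = [False] * m

    def _indexes(self, item):
        # Derive k indices by salting one cryptographic hash (an
        # illustrative stand-in for k independent hash functions).
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        for idx in self._indexes(item):
            self.bits[idx] = True

    def __contains__(self, item):
        return all(self.bits[idx] for idx in self._indexes(item))

bf = BloomFilter()
bf.add("treap")
bf.add("skip list")
print("treap" in bf)     # True: Bloom filters never report false negatives
print("quadtree" in bf)  # likely False; false positives occur with small probability
```

With m bits, k hash functions, and n inserted items, the false-positive rate is roughly (1 - e^{-kn/m})^k, which is what makes the space/accuracy trade-off tunable.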
- Lecture 1 (8/29/2019) : Course logistics, Randomized Algorithms, Standard algorithmic uses for randomness, Probability spaces (MU 1.2, and Theorem 1.6), Karger's min-cut algorithm (MU 1.5)
- Lecture 2 (9/3/2019) : Finish Karger's min-cut algorithm, String equality testing via polynomial equality testing, Random variables and expectations (MU 2.1, 2.3), Bernoulli and Binomial random variables (MU 2.2), Start analysis of expected runtime of randomized quicksort (MU 2.5)
- Lecture 3 (9/6/2019) : Finish runtime of randomized quicksort, Monte Carlo and Las Vegas algorithms and MC to LV conversion, the Probabilistic method: application of the first moment method to the MAX k-SAT problem (MU 6.2.2).
- Lecture 4 (9/9/2019) : Recap first moment method for the MAX k-SAT problem, first moment method for a graph coloring problem (MU 6.1), definition of conditional expectation (MU 2.3), examples of working with conditional expectation.
- Lecture 5 (9/12/2019): Review of properties of expectation and conditional expectation. Introduction to Kaczmarz algorithm and statement of convergence result.
- No lecture September 16
- Lecture 6 (9/19/2019): Proof of convergence of Kaczmarz algorithm. Jensen's inequality. Outline of convergence of stochastic gradient descent for convex functions.
- Lecture 7 (9/23/2019): Markov's inequality. Variance of a random variable and its properties. Variance of common distributions. Chebyshev's inequality. Median vs. mean and their robustness properties. (MU 3.1-3.4)
- Lecture 8 (9/26/2019): Randomized linear-time median algorithm. (MU 3.5)
- Lecture 9 (9/30/2019): Central Limit Theorem and Berry–Esseen correction. Moment generating functions, Chernoff bounds for sums of independent random trials (MU 4.1-4.3).
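Karger's contraction algorithm from Lectures 1–2 can be sketched as follows; the function name, the union-find bookkeeping, and the trial count are my own illustrative choices, not the lecture's reference implementation.

```python
import random

def karger_min_cut(edges, trials=100):
    """Karger's randomized contraction: repeatedly contract a uniformly
    random edge until two super-vertices remain; the edges crossing the
    resulting partition form a cut. Repeating trials and keeping the
    smallest cut found boosts the success probability."""
    best = None
    for _ in range(trials):
        vertices = {v for e in edges for v in e}
        parent = {v: v for v in vertices}   # union-find forest

        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]  # path halving
                v = parent[v]
            return v

        remaining = len(vertices)
        while remaining > 2:
            u, v = random.choice(edges)
            ru, rv = find(u), find(v)
            if ru != rv:                    # skip self-loops
                parent[ru] = rv             # contract the edge
                remaining -= 1
        cut = sum(1 for (u, v) in edges if find(u) != find(v))
        best = cut if best is None else min(best, cut)
    return best

# A 4-cycle has minimum cut 2.
print(karger_min_cut([(0, 1), (1, 2), (2, 3), (3, 0)], trials=50))  # 2
```

A single trial finds a fixed minimum cut with probability at least 2/(n(n-1)), so on the order of n^2 log n independent trials suffice to succeed with high probability.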
All assignments must be typed (preferably in LaTeX) and are due at the start of class (defined as the first 10 minutes). Late assignments will not be accepted, unless you contact the instructor at least two days before the due date to receive a deferral. Deferrals will be granted at the instructor’s discretion, of course.
- HW1: Probability Theory (UPDATED Sept 11 to correct the success probability in Problem 4 and to provide hints for all problems). Assigned Thursday, September 5; due Thursday, September 19. Start early, and ask questions about both the content and your write-ups.
In teams of two or three, you will give a 20-minute presentation to the class on an interesting or cool result or algorithm (or class of algorithms) from a course-relevant technical or survey paper; if your own research is relevant, it may also serve as the topic of your presentation. The material may require no background beyond general knowledge of CS, or more niche/applied knowledge; in the latter case, quickly explain the relevant concepts and background. See the "Giving Talks" and "Reading Papers" slide decks from the CS RPI Graduate Skills Seminar.
Content: address the following points.
- Who are the authors, and the date and venue of publication?
- What is the problem that is addressed (pick one, if the paper addresses more than one), and why is it interesting or useful?
- What is the main result of the paper?
- Describe the result or algorithm and motivate it intuitively.
- What is the cost (time, space, or some other metric) of the algorithm, and how does it compare to prior algorithms for the same problem? (Address the analogous comparison for non-algorithmic results.)
- What performance guarantees, if any, are provided for the algorithm?
- Give an accurate description of the analysis given in the paper: in simple cases this may be a tour through the entire argument; when this is not possible, focus on explaining a core lemma/theorem that supports the claim of the paper.
- Provide an empirical evaluation of the algorithm: compare its performance to reasonable baselines, and explore relevant aspects of the algorithm (its variability, sensitivity to relevant properties of the input, etc.). If presenting a non-algorithmic result and it is possible, provide some experimental evidence of its sharpness or lack thereof.
- On-time team formation and paper selection, 10%
- Discussion of paper with me, 40%
- In-class presentation, 40%
- On-time deliverables, 10%