Department of Computer Science | Institute of Theoretical Computer Science
Frederik Benzing, Maxime Larcher, Anders Martinson, Raphaël Dang-Nhu, and Yassir Akram
Lecture: Tuesday 13:15 - 14:00 in CAB G 51 and Thursday 08:15 - 10:00 in CAB G 51.
Exercise Class: You can attend one of the two following exercise classes:
Exam: Saturday, 14.12, 10:00 - 13:00 in HG F1.
Students of Computer Science or Mathematics in the 5th semester or later. Knowledge of topics covered in the lecture "Algorithms, Probability, and Computing" is not required; both courses can be attended in parallel.
Your final grade will be calculated as the weighted average of:
Following the D-INFK guidelines for doctoral studies, PhD students get credit points according to the same rules that apply for Bachelor or Master students. That is, with a final grade of at least 4 (calculated as explained above) you will receive 7KE, and 0KE otherwise.
We provide the exams of previous years (HS16, HS17, HS18) so you can get an idea of what we expect. One of the theorems was removed from the lecture in 2018, so you may not be able to solve Exercise 3 of the 2017 exam.
The lecture notes can be found here.
The exercise classes are an integral part of the course. We therefore strongly recommend that you attend them regularly and solve the weekly exercise sheets.
Graded Homework 1 (Guidelines for GHW1)
Common Mistakes in Graded Homework 2
This PDF contains a short introduction to big-O notation and asymptotics, as well as some useful inequalities.
LaTeX template. To save the LaTeX template, right-click on the link and choose "Save target as…" to save it wherever you want. Just clicking on the link will not be of much use, as you will only see the raw LaTeX source.

Lecture 1 (17.09.): - The probabilistic method (Section 1.2 of the script) - Ramsey numbers (1.3) - SAT (1.4, partially covered)
Lecture 2 (19.09.): - Derandomization for SAT (1.4, continued) - Long and colourful paths (1.5) - Linearity of Expectation (2), Coupon Collector (2.2)
Lecture 3 (24.09.): - Independent Sets in Graphs (2.1) - Max 3-SAT (2.3)
Lecture 4 (26.09.): - Max 3-SAT, deterministic algorithm (2.3) - Markov and Chebyshev inequalities (3) - Calculating the Median (3.2)
Lecture 5 (01.10.): - First and Second Moment Method (4, partially covered)
Lecture 6 (03.10.): - First and Second Moment Method (4, continued) - Triangles in random graphs (4.1) - Thresholds in random graphs (4.1) - Chernoff bounds (5, partially covered)
Lecture 7 (08.10.): - Chernoff bounds (5, continued) - Target Shooting (5.1)
Lecture 8 (10.10.): - Routing in the Hypercube (5.2)
Lecture 9 (15.10.): - Negative Correlation (6, partially covered)
Lecture 10 (17.10.): - Negative Correlation (6, continued)
Lecture 11 (22.10.): - Azuma's inequality (7) - Concentration bounds for chromatic numbers (7.1)
Lecture 12 (24.10.): - Janson's inequality (7) - Wilf's algorithm (7.2)
Lecture 13 (29.10.): - Talagrand's inequality (8)
Lecture 14 (31.10.): - Introduction to Markov chains (9)
Lecture 15 (05.11.): - Markov chains - 3-SAT example (9.3)
Lecture 16 (07.11.): - Drift theorems (9.2)
Lecture 17 (12.11.): - Drift theorems (9.2, continued)
Lecture 18 (14.11.): - Markov chains, stationary distributions (10)
Lecture 19 (19.11.): - Rapidly mixing chains (10.1)
Lecture 20 (21.11.): - Examples: random walk on the hypercube, card shuffling (10.1)
Lecture 21 (26.11.): - Additional material
Lecture 22 (28.11.): - Example: generating and counting matchings (10.2)
Lecture 23 (03.12.): - Generating and counting matchings (10.2, continued)
Lecture 24 (05.12.): - (not examinable)
Lecture 25 (10.12.): - How to become rich (not examinable)
Introduction
Linearity of expectation
Markov and Chebyshev inequalities
First and second moment method
Chernoff bounds
Negative Correlation
Janson, Azuma and Talagrand inequalities
Markov chains
Outlook (if time permits)