Department of Computer Science | Institute of Theoretical Computer Science | CADMO

Theory of Combinatorial Algorithms

Prof. Emo Welzl and Prof. Bernd Gärtner

Optimization for Data Science (2019)

Lecturers: Bernd Gärtner (CAB G31.1);
David Steurer (CAB H37.1).
Assistants: Luis Barba (CAB G33.3), contact assistant;
Tommaso D'Orsi (CAB H36.2);
Hung Hoang (CAB G19.2);
Gleb Novikov (CAB H36.2);
Stefan Tiegel.
Lectures: Mon 15-16, HG E1.1,
Tue 10-12, ETF C1.
Credit Points: 8CP for Data Science Master, Computer Science Master and Computational Science and Engineering Master (261-5110-00L, 3V + 2U + 2A)
Language: English
Contents: This course provides an overview of modern optimization methods, with particular attention to applications in machine learning and data science.
  • In the first part of the course, we discuss how classic first- and second-order methods such as gradient descent and Newton's method can be adapted to scale to large datasets, in theory and in practice. We also cover some newer algorithms and paradigms that have been developed specifically in the context of data science. The emphasis is not so much on the applications of these methods (many of which are covered in other courses), but on understanding and analyzing the methods themselves.
  • In the second part, we discuss convex programming relaxations as a powerful and versatile paradigm for designing efficient algorithms for computational problems arising in data science. We develop a unified perspective on this paradigm through the lens of the sum-of-squares semidefinite programming hierarchy. As applications, we will discuss non-negative matrix factorization, compressed sensing and sparse linear regression, matrix completion and phase retrieval, as well as robust estimation.
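To give a flavor of the first part, here is a minimal gradient descent sketch. It is purely illustrative and not taken from the course notes: it minimizes the smooth convex toy function f(x) = (x - 3)^2, and the step size and iteration count are arbitrary choices.

```python
# Illustrative sketch only (not from the course notes): plain gradient
# descent on f(x) = (x - 3)^2, whose gradient is f'(x) = 2(x - 3).
# Step size and iteration count are arbitrary choices for this toy example.
def gradient_descent(grad, x0, step=0.1, iters=100):
    x = x0
    for _ in range(iters):
        x -= step * grad(x)  # move against the gradient direction
    return x

minimizer = gradient_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
# converges towards the true minimizer x* = 3
```

The lectures analyze exactly when and how fast such iterations converge, e.g. under smoothness and (strong) convexity assumptions.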
Prerequisites: As background, we require material taught in the course "252-0209-00L Algorithms, Probability, and Computing". It is not necessary that participants have actually taken the course, but they should be prepared to catch up if necessary.

Exams, Special Assignments and Grading

Grading: There will be a written exam in the examination session. In addition, there will be two mandatory written special assignments during the semester. The final grade for the course is a weighted average of the exam grade (80%) and the special assignment grades (20%).
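Concretely, the stated 80/20 weighting works out as follows; the grade values below are made-up example numbers, not real data.

```python
# Hypothetical illustration of the 80/20 weighting described above;
# the grades used are invented for the example.
def final_grade(exam_grade, special_assignment_avg):
    return 0.8 * exam_grade + 0.2 * special_assignment_avg

final_grade(5.0, 5.5)  # 0.8 * 5.0 + 0.2 * 5.5 = 5.1
```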
Special Assignments: Twice during the semester, we will hand out specially marked exercises or term projects; the written part of the solutions is expected to be typeset in LaTeX or a similar system. Solutions will be graded, and these grades account for 20% of the final grade. Assignments may be discussed with colleagues, but we expect an independent write-up.
Exam: Date to be determined. The exam is written, lasts 120 minutes, and is closed-book: no written material is permitted.
Past year exam: Questions, Solutions.

Regular Exercises

Theoretical Exercises

The theoretical exercises are discussed in exercise classes, which start in the second week of the semester, i.e., on 26 Feb 2019. Students are expected to attempt the problems beforehand. We have assigned students to classes according to their surnames; attendance according to these assignments is not compulsory but encouraged. The details of the classes are as follows.

Group | Time      | Room    | Surnames beginning with | Assistant
A     | Tue 13-15 | CHN G22 | A-C                     | Gleb Novikov
B     | Tue 13-15 | HG D3.2 | D-O                     | Tommaso D'Orsi
C     | Tue 13-15 | RZ F21  | P-Z                     | Stefan Tiegel

Practical Exercises

These form a self-study component that provides guidance on implementing some of the methods discussed in the lectures. Students are encouraged to attempt the exercises and to check their work against the suggested solutions, which will be made available online some time after the exercises are released. Although the practical exercises are not discussed in the regular classes, students can contact the practical exercise assistant (Hung Hoang) with any questions.


Important announcements about the course during the semester will be published here. Please check back regularly.

14 Feb 2019: The first exercise class on 19 Feb 2019 will be in HG D3.2 only. We will give installation support and an introduction to Jupyter Notebook for the practical exercises. Please refer to the exercise document in the Schedule section for more information.

12 Mar 2019: The past year's exam has been published. Please see the Exam section for versions with and without the solutions.

27 Mar 2019: Special Assignment 1 has been published. Please refer to the Schedule section.

2 Apr 2019: An updated addendum to Special Assignment 1 has been uploaded to the Schedule section.

2 Apr 2019: A new textbook by Martin J. Wainwright, released last month, has been added to the Literature section. It is available through the link in that section, using the ETH library access.

21 May 2019: Solution of Special Assignment 1 has been published. Please refer to the Schedule section.

24 May 2019: Special Assignment 2 has been published. Please refer to the Schedule section.

15 Jul 2019: Solution of Special Assignment 2 has been published. Please refer to the Schedule section.

13 Sep 2019: Exam inspection is at CAB G15.2, 10:30am-12:30pm on Friday 20 Sep 2019. Please come if you would like to inspect your exam.


In the table below are the lecture dates and the preliminary topics. The exercises and their solutions will be published here.

Note: Documents are password protected. Please use your nethz-login.

Calendar Week | Date | Topic | Exercises and Special Assignments (by due date)
8  | Mon 18.02.19, Tue 19.02.19 | Theory of Convex Functions (notes, slides); Blackboard: Def 1.9, Thm 1.10 (pp 8-9) | Exercise Set 1; Chapter 1 with Solutions (Note: Exercise 7 is not examinable)
9  | Mon 25.02.19, Tue 26.02.19 | Gradient Descent (notes, slides); Blackboard: Cauchy-Schwarz inequality, Sec 2.6 (pp 48-49) | Exercise Set 2; Chapter 2 with Solutions
10 | Mon 04.03.19 | Projected Gradient Descent (notes, slides); Blackboard: Sec 3.5 (pp 62-66) | Exercise Set 3; Chapter 3 with Solutions (put this in the same folder as the template file)
11 | Mon 11.03.19 | Subgradient Descent & Stochastic Gradient Descent (notes, slides); Blackboard: Nesterov's GD | Exercise Set 4; Chapters 4&5 with Solutions
12 | Mon 18.03.19 | Nonconvex Functions (notes, slides) | Exercise Set 5; Chapter 6 with Solutions
13 | Mon 25.03.19 | Newton's and Quasi-Newton Methods (notes, slides) | Special Assignment 1; Addendum to Special Assignment 1; Exercise Set 6; Solution of Special Assignment 1; Chapters 7&8 with Solutions
14 | Mon 01.04.19 | Quasi-Newton Method (cont'd) (notes, slides); Local Optimization Is Hard (slides) | Exercise Set 7
15 | Mon 08.04.19 | Mon: Sechseläuten – no lecture! Tue: Regression (notes, pp 2-6) | Exercises 1.1, 1.3, 1.4, 1.7; Chapter with Solutions
16 | Mon 15.04.19 | Out of Sample Prediction Error and Matrix Concentration Bound (notes) | Exercise Set 9; Solutions to Exercise Set 9
17 | Mon 22.04.19 | Easter – no lectures (Mon and Tue) |
18 | Mon 29.04.19 | Bayesian Linear Regression and Optimality of Least Squares (notes) | Exercise Set 10; Solutions to Exercise Set 10
19 | Mon 06.05.19 | Sparse Linear Regression, Likelihood Maximization and Guarantees of Best Subset Selection (pp 10-12) | Exercise Set 11; Solutions to Exercise Set 11
20 | Mon 13.05.19 | Statistical Guarantees of LASSO for Sparse Linear Regression, Slow and Fast Rates (pp 13-15) | Exercise Set 12; Solutions to Exercise Set 12
21 | Mon 20.05.19 | Factorization (notes) | Special Assignment 2; Exercises 2.1, 2.4, 2.5; Solution of Special Assignment 2; Solutions to Exercise Set 13
22 | Mon 27.05.19 | Factorization (cont'd) |