Stochastic Optimization and Automatic Differentiation for Machine Learning (Spring 2019)

Lab and tutorial (TP/TD) materials

Coding sessions

Projects (due May 23rd, 2019, by email)

You can choose either of the projects below. You can complete this assignment alone or as a group of two, but I will grade accordingly (I will have higher expectations for an assignment completed jointly by two students). Please send me an email with a zip file containing your report as a PDF (not a .doc) and your code in any format (a notebook is fine).

SVRG on nonconvex objectives

Implement one of the algorithms presented in this paper and compare it to standard SVRG on a convex problem (e.g. logistic regression with a LASSO penalty) and on a non-convex problem. Check whether you recover results comparable to those of the paper (not necessarily on the same datasets).
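As a starting point, here is a minimal sketch of standard SVRG (the baseline you are asked to compare against), written for a generic per-sample gradient oracle. The function and variable names (`svrg`, `grad_i`) are placeholders of my choosing, and the toy least-squares problem is just for illustration; you would swap in your own objective.

```python
import numpy as np

def svrg(grad_i, n, w0, lr=0.02, epochs=30, inner=None, rng=None):
    """Standard SVRG: at each epoch, take a snapshot and its full
    gradient, then run variance-reduced stochastic steps."""
    rng = np.random.default_rng(0) if rng is None else rng
    inner = n if inner is None else inner
    w = w0.copy()
    for _ in range(epochs):
        w_snap = w.copy()
        # full gradient at the snapshot (one pass over the data)
        full_grad = np.mean([grad_i(w_snap, i) for i in range(n)], axis=0)
        for _ in range(inner):
            i = rng.integers(n)
            # variance-reduced gradient estimate
            g = grad_i(w, i) - grad_i(w_snap, i) + full_grad
            w -= lr * g
    return w

# toy least-squares problem: min_w (1/2n) sum_i (x_i . w - y_i)^2
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
grad_i = lambda w, i: (X[i] @ w - y[i]) * X[i]
w_hat = svrg(grad_i, n=50, w0=np.zeros(3))
```

For the project itself, the inner update rule is what you would replace with the non-convex variant from the paper, keeping the snapshot/full-gradient structure.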


Implement the algorithm presented in this paper and test it on two problems of your choice (one convex, one non-convex).

Learning with a Wasserstein loss

Look at the paper described on this page. The paper presents a method to carry out multi-label prediction in a setting where a metric between labels is available to you. Implement that algorithm for a simple multi-class logistic regression problem involving only a few labels (<100) for which you can also define a ground metric. Implement the algorithm using automatic differentiation. To do so, you will need to restrict the algorithm to a fixed, predetermined number of Sinkhorn iterations.
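The key point about fixing the number of Sinkhorn iterations is that the loss then becomes a finite composition of differentiable operations, which an autodiff framework can unroll. Below is a sketch of that forward pass in plain NumPy (the names `sinkhorn_loss`, `reg`, `n_iters` and the 3-label toy metric are my own illustrative choices; in your project this would be written in an autodiff framework such as PyTorch or JAX so that gradients with respect to the predicted distribution flow through the loop).

```python
import numpy as np

def sinkhorn_loss(p, q, M, reg=0.1, n_iters=50):
    """Entropic-regularized OT cost <P, M> between histograms p and q,
    using a fixed number of Sinkhorn matrix-scaling iterations."""
    K = np.exp(-M / reg)                # Gibbs kernel
    u = np.ones_like(p)
    for _ in range(n_iters):            # fixed, predetermined unrolled loop
        v = q / (K.T @ u)               # match column marginals
        u = p / (K @ v)                 # match row marginals
    P = u[:, None] * K * v[None, :]     # approximate transport plan
    return np.sum(P * M)

# toy ground metric: 3 labels placed on a line, cost = distance
labels = np.array([0.0, 1.0, 2.0])
M = np.abs(labels[:, None] - labels[None, :])
p = np.array([0.8, 0.1, 0.1])   # predicted label distribution
q = np.array([0.1, 0.1, 0.8])   # target distribution
loss = sinkhorn_loss(p, q, M)
```

On this example the unregularized optimal transport cost is 1.4 (moving 0.7 units of mass over distance 2), and with a small regularization the Sinkhorn loss lands close to it; the loss penalizes predictions in proportion to how far they are from the target under the ground metric, which is exactly the property the paper exploits.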