Distributed & Stochastic Optimization for Machine Learning (Spring 2017)
Introduction [slides] (introductory material on SVMs)
Basics: convexity, duality [slides]
Setup instructions for Python: python3, numpy, numba, scikit-learn, IPython notebook / Jupyter.
Gradient descent, proximal gradient descent, stochastic gradient descent (SGD) [slides]
Incremental methods, coordinate ascent [slides]
SDCA, ADMM, COCOA, GPUs, automatic differentiation [slides]
Guest lecture by S. Combes [slides]
Coding sessions
IPython notebooks
Project: Choose one of the projects below, or suggest one by email.
Implement the SDCA algorithm to train Support Vector Machines. Test the algorithm on datasets of your choice and compare it with a subgradient descent approach such as this one; a minimal sketch of the SDCA update follows.
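For reference, here is one possible single-machine sketch of SDCA for the L2-regularized hinge-loss SVM (labels in {-1, +1}), using the closed-form per-coordinate dual update of Shalev-Shwartz & Zhang (2013). The function name and default parameters are illustrative, not a prescribed solution.

    import numpy as np

    def sdca_svm(X, y, lam=0.01, epochs=10, seed=0):
        """Minimal SDCA sketch for the hinge-loss SVM with L2 regularization.
        alpha[i] stores the dual variable beta_i = alpha_i * y_i, kept in [0, 1];
        w is maintained incrementally as w = sum_i beta_i y_i x_i / (lam * n)."""
        rng = np.random.RandomState(seed)
        n, d = X.shape
        alpha = np.zeros(n)
        w = np.zeros(d)
        sq_norms = np.einsum('ij,ij->i', X, X)   # precomputed ||x_i||^2
        for _ in range(epochs):
            for i in rng.permutation(n):
                if sq_norms[i] == 0.0:
                    continue
                # closed-form maximization of the dual over coordinate i
                margin = y[i] * (X[i] @ w)
                q = (1.0 - margin) * lam * n / sq_norms[i] + alpha[i]
                a_new = min(1.0, max(0.0, q))
                w += (a_new - alpha[i]) * y[i] * X[i] / (lam * n)
                alpha[i] = a_new
        return w

Because w is updated incrementally, each coordinate step costs O(d); the clipping to [0, 1] enforces the box constraint of the SVM dual.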
Implement COCOA for logistic regression on a single machine (the algorithm can be distributed across several machines using Spark, but that is not required here; a single-machine skeleton running on multiple cores is enough) and benchmark it against naive alternatives on a large dataset; a sketch of the communication pattern follows.
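As a starting point, the sketch below mimics the CoCoA outer loop on one machine: the data is split into blocks, each "worker" approximately solves its local dual subproblem, and the local updates are averaged with weight 1/K. As a stand-in local solver (the framework leaves this choice open) it takes clipped Newton steps on single dual coordinates of the logistic dual; all names and defaults are illustrative.

    import numpy as np

    def cocoa_logreg(X, y, lam=0.1, n_blocks=4, outer_rounds=20, local_passes=5, seed=0):
        """Single-machine sketch of the CoCoA communication pattern for
        L2-regularized logistic regression (labels in {-1, +1}).
        Dual variables alpha_i live in (0, 1), with w = X^T (alpha*y) / (lam*n)."""
        rng = np.random.RandomState(seed)
        n, d = X.shape
        alpha = np.full(n, 0.5)                   # start strictly inside (0, 1)
        w = X.T @ (alpha * y) / (lam * n)
        blocks = np.array_split(rng.permutation(n), n_blocks)
        eps = 1e-6
        for _ in range(outer_rounds):
            delta_w = np.zeros(d)
            for block in blocks:                  # sequential stand-in for parallel workers
                dw = np.zeros(d)                  # this worker's local primal correction
                a = alpha[block].copy()
                for _ in range(local_passes * len(block)):
                    j = rng.randint(len(block))
                    i = block[j]
                    m = y[i] * (X[i] @ (w + dw))  # margin under the local view
                    # first and second derivatives of the dual in coordinate i
                    g = (np.log((1 - a[j]) / a[j]) - m) / n
                    h = -(1 / a[j] + 1 / (1 - a[j])) / n - (X[i] @ X[i]) / (lam * n ** 2)
                    a_new = np.clip(a[j] - g / h, eps, 1 - eps)   # clipped Newton step
                    dw += (a_new - a[j]) * y[i] * X[i] / (lam * n)
                    a[j] = a_new
                # average the local updates (safe 1/K aggregation)
                alpha[block] += (a - alpha[block]) / n_blocks
                delta_w += dw / n_blocks
            w += delta_w
        return w

The inner loop over blocks is written sequentially here; in the actual project each block would be handled by a separate core, with only delta_w communicated per round.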
Implement three incremental gradient algorithms (SVRG, SAGA, MISO), discuss their efficiency, and benchmark them against batch gradient descent on a sparse logistic regression problem; an SVRG template follows.
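As a template for one of the three methods, here is a minimal SVRG sketch for L2-regularized logistic regression; SAGA and MISO differ mainly in how they store and reuse past gradients. The step size and epoch length are untuned placeholder choices, and dense numpy arrays are used for clarity (for the sparse problem you would swap in scipy.sparse rows).

    import numpy as np

    def svrg_logreg(X, y, lam=1e-3, eta=0.1, outer=20, seed=0):
        """Minimal SVRG sketch for L2-regularized logistic regression,
        labels in {-1, +1}.  Each inner step uses the variance-reduced
        direction grad_i(w) - grad_i(w_snap) + full_grad(w_snap)."""
        rng = np.random.RandomState(seed)
        n, d = X.shape

        def grad_i(w, i):
            # gradient of one logistic loss term plus the regularizer
            s = 1.0 / (1.0 + np.exp(y[i] * (X[i] @ w)))   # sigmoid(-y_i x_i^T w)
            return -y[i] * s * X[i] + lam * w

        w = np.zeros(d)
        for _ in range(outer):
            w_snap = w.copy()
            # full gradient at the snapshot
            s = 1.0 / (1.0 + np.exp(y * (X @ w_snap)))
            mu = -(X.T @ (y * s)) / n + lam * w_snap
            for _ in range(2 * n):                # inner epoch of length 2n (common choice)
                i = rng.randint(n)
                w -= eta * (grad_i(w, i) - grad_i(w_snap, i) + mu)
        return w

Running batch gradient descent with the same mu computation (and no inner loop) gives the baseline for the benchmark.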
Submit your code and a short PDF file describing your findings (in French or English).
Report submission deadline: May 5, 23:59. (3 points out of 20 deducted per day late)