
Summer school

On the 8th of August, we will have a summer school on energy and machine learning, two of the fastest-growing application areas of bilevel optimization. Each lecture will run for about half a day, organized in two parts of 90 minutes each, as follows:

08:30 - 10:00   Bilevel Optimization Algorithms & Models for Contemporary Energy Challenges (Part A)
10:00 - 10:30   Coffee break
10:30 - 12:00   Bilevel Optimization Algorithms & Models for Contemporary Energy Challenges (Part B)
12:00 - 13:15   Lunch break
13:15 - 14:45   Bilevel Optimization in Machine Learning (Part A)
14:45 - 15:15   Coffee break
15:15 - 16:45   Bilevel Optimization in Machine Learning (Part B)
17:00 - 19:00   Conference welcome reception

Bilevel Optimization Algorithms & Models for Contemporary Energy Challenges

The first part of the course will begin with the basic concepts of bilevel optimization. It will then focus on mathematical optimization algorithms for solving bilevel optimization problems. While most attention will be paid to problems in which the functions involved are linear or convex, some aspects of nonconvex problems will be discussed. The second part will present bilevel models that address contemporary challenges in electric energy systems.
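For orientation, here is a standard textbook sketch of a generic bilevel problem and of the KKT-based single-level reformulation that item A.1 below refers to (the notation and assumptions used in the lectures may differ):

\[
\min_{x,y}\; F(x,y) \quad \text{s.t.} \quad G(x,y) \le 0, \quad y \in \arg\min_{y'} \{\, f(x,y') : g(x,y') \le 0 \,\}.
\]

When the lower-level problem is convex in y' and satisfies a constraint qualification, the arg min can be replaced by its KKT conditions, yielding the single-level problem

\[
\min_{x,y,\lambda}\; F(x,y) \quad \text{s.t.} \quad G(x,y) \le 0,\;\; g(x,y) \le 0,\;\; \lambda \ge 0,\;\; \lambda^{\top} g(x,y) = 0,\;\; \nabla_{y} f(x,y) + \nabla_{y} g(x,y)^{\top} \lambda = 0,
\]

whose complementarity constraints are commonly handled with big-M or SOS1 techniques.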

 

Part A: Solving bilevel optimization problems

A.1 - Single-level reformulations

A.2 - Algorithms for linear and convex problems

A.3 - Nonconvex bilevel optimization: What is possible?

Part B: Models for contemporary electric energy challenges

B.1 - Residential demand response and energy storage

B.2 - Multinational carbon-credit market with distinct national strategies

B.3 - Unit commitment under demand uncertainty    

Lecturer

Miguel F. Anjos

Biography

Miguel F. Anjos holds the Chair of Operational Research at the School of Mathematics, University of Edinburgh, U.K., and is Schöller Senior Fellow at the University of Erlangen-Nuremberg, Germany. He previously held faculty positions at Polytechnique Montreal, the University of Waterloo, and the University of Southampton. He is the Founding Academic Director of the Trottier Institute for Energy at Polytechnique Montreal. His accolades include an Inria International Chair, a Canada Research Chair, the NSERC-Hydro-Quebec-Schneider Electric Industrial Research Chair, a Humboldt Research Fellowship, and the Queen Elizabeth II Diamond Jubilee Medal. He is a Fellow of EUROPT and of the Canadian Academy of Engineering.

The research interests of Professor Anjos are in the theory, algorithms and applications of mathematical optimization. He is particularly interested in the application of optimization techniques to problems in the areas of power systems, smart energy grids and facility layout. He has published four books and more than 100 scientific articles in leading international journals such as Mathematical Programming, SIAM Journal on Optimization, European Journal of Operational Research, IEEE Transactions on Power Systems, and IEEE Transactions on Smart Grid. He has led research collaborations with companies such as EDF, ExPretio, Hydro-Quebec, National Grid ESO, Rio Tinto, and Schneider Electric.

He served as Editor-in-Chief of Optimization and Engineering, and is currently Area Editor for the Journal of Optimization Theory and Applications and for RAIRO-OR, and a member of several other editorial boards.

  

Bilevel Optimization in Machine Learning

In this lecture, you will learn about the challenges of solving bilevel machine learning problems; popular examples are hyperparameter optimization and meta-learning. The focus will be on explaining efficient gradient-based methods that rely only on gradients and Jacobian-vector products, and on establishing quantitative theoretical guarantees for such methods.
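As a hedged illustration (in generic notation that may differ from the lecture's) of why only gradients and Jacobian-vector products are needed: writing the bilevel problem as minimizing f(\lambda) = E(w^*(\lambda), \lambda) with w^*(\lambda) = \arg\min_{w} L(w, \lambda), and assuming a smooth, strongly convex lower-level problem, the implicit function theorem gives the hypergradient

\[
\nabla f(\lambda) = \nabla_{\lambda} E(w^*, \lambda) - \nabla^{2}_{\lambda w} L(w^*, \lambda)\, \big[\nabla^{2}_{w w} L(w^*, \lambda)\big]^{-1} \nabla_{w} E(w^*, \lambda),
\]

which can be approximated either by solving the linear system with Hessian-vector products (approximate implicit differentiation) or by differentiating through the inner iterations (iterative differentiation), without ever forming a second-order matrix explicitly.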

Part A:
A.1 - Introduction and outline.
A.2 - Machine learning applications overview: hyperparameter optimization, meta-learning, ...
A.3 - Characteristics of bilevel problems in machine learning: large-scale and simple constraints.
A.4 - Implicit function theorem and the hypergradient.
A.5 - Hypergradient approximation methods, PyTorch implementation, and memory/time complexity (see the illustrative sketch after this list).
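To make A.5 concrete, here is a minimal, illustrative PyTorch sketch of the approximate implicit differentiation (AID) approach: it estimates the hypergradient by solving the inner Hessian system with a few conjugate-gradient steps, using only Hessian-vector and Jacobian-vector products. The names (hypergradient_aid, inner_loss, outer_loss) and the ridge-regression toy problem are hypothetical choices for illustration, not material taken from the course.

import torch

def hypergradient_aid(w, lam, inner_loss, outer_loss, cg_steps=20, tol=1e-8):
    """Estimate the gradient of outer_loss(w*(lam), lam) with respect to lam,
    where w is an (approximate) minimizer of inner_loss(., lam), via AID."""
    w = w.detach().requires_grad_(True)
    lam = lam.detach().requires_grad_(True)

    # Partial gradients of the outer objective at (w, lam).
    E = outer_loss(w, lam)
    grad_w_E, grad_lam_E = torch.autograd.grad(E, (w, lam), allow_unused=True)
    if grad_lam_E is None:
        grad_lam_E = torch.zeros_like(lam)

    # Inner gradient, keeping the graph so second-order products are available.
    grad_w_L = torch.autograd.grad(inner_loss(w, lam), w, create_graph=True)[0]

    def hvp(v):
        # Hessian-vector product (d^2 L / dw^2) v, with lam held fixed.
        return torch.autograd.grad(grad_w_L, w, grad_outputs=v, retain_graph=True)[0]

    # Conjugate gradient on (d^2 L / dw^2) v = grad_w_E.
    v = torch.zeros_like(grad_w_E)
    r = grad_w_E.clone()
    p = r.clone()
    rs_old = r.flatten().dot(r.flatten())
    for _ in range(cg_steps):
        Hp = hvp(p)
        alpha = rs_old / (p.flatten().dot(Hp.flatten()) + 1e-12)
        v = v + alpha * p
        r = r - alpha * Hp
        rs_new = r.flatten().dot(r.flatten())
        if rs_new.sqrt() < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new

    # Cross term (d^2 L / dlam dw) v, obtained as a vector-Jacobian product.
    cross = torch.autograd.grad(grad_w_L, lam, grad_outputs=v, allow_unused=True)[0]
    if cross is None:
        cross = torch.zeros_like(lam)

    # Hypergradient = direct term - cross term (implicit function theorem).
    return grad_lam_E - cross

# Hypothetical toy example: tuning a ridge penalty, lam = log-regularizer.
X, y = torch.randn(50, 10), torch.randn(50)
Xval, yval = torch.randn(20, 10), torch.randn(20)
inner = lambda w, lam: ((X @ w - y) ** 2).mean() + torch.exp(lam) * (w ** 2).sum()
outer = lambda w, lam: ((Xval @ w - yval) ** 2).mean()
w0 = torch.linalg.lstsq(X, y).solution   # rough stand-in for the inner minimizer
print(hypergradient_aid(w0, torch.tensor(0.0), inner, outer))

Replacing the conjugate-gradient loop with a fixed number of fixed-point iterations gives the other common AID variant, while iterative differentiation (ITD) would instead backpropagate through the inner optimization steps.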

Part B:
B.1 - Theoretical assumptions: smoothness and strong-convexity/contraction at the lower level.
B.2 - Error rates for Approximate Implicit Differentiation (AID) and Iterative Differentiation (ITD).
B.3 - Convergence rates for AID-based inexact (projected) hypergradient descent.
B.4 - Relaxing the assumptions: non-smoothness, multiple inner solutions, …

  

Lecturer

Massimiliano Pontil

Biography

Massimiliano Pontil is P.I. of the Computational Statistics and Machine Learning research unit at IIT, a professor at University College London, and a member of the UCL Centre for Artificial Intelligence. He is an ELLIS Fellow and co-director of the ELLIS Unit Genoa, a joint effort of IIT and the University of Genoa. He has been active in machine learning research for over twenty years, working on theory and algorithms in areas including kernel methods, multitask and transfer learning, online learning, sparse estimation, and statistical learning theory. His recent interests include meta-learning, algorithmic fairness, hyperparameter optimization, and learning with partial feedback. He received a best paper runner-up award at ICML 2013, and has served as an Area Chair for NeurIPS, ICML, and COLT and as an action editor for the Journal of Machine Learning Research.

  

Lecturer

Riccardo Grazzi

Biography

Riccardo Grazzi is a Computer Science PhD student at University College London (UCL) and the Italian Institute of Technology (IIT), under the supervision of Massimiliano Pontil. During his PhD he was an intern at Amazon AWS Berlin under the supervision of Matthias Seeger. Previously, he completed his Master's and Bachelor's degrees in computer engineering at the University of Florence. His main research focus is bilevel optimization in machine learning (two ICML and one AISTATS publications in this area), in particular the quantitative study of efficient gradient-based bilevel optimization algorithms.

  

