
Numerical Optimization: theory and applications #

Figure: Himmelblau's function.

Description #

This page gathers information, lecture documents, and practical labs for a course on numerical optimization given at the SIE doctoral school. This version concerns the year 2025. You can find, on the left, all the course materials, with solutions to the exercises and Python implementations of the programs.

Course Objectives #

This course provides an overview of modern optimization methods, with applications in inverse problems, machine learning, and data science. Alternating between the mathematical theory of optimization and practical lab sessions and projects, the course aims to give doctoral students the knowledge needed to solve common problems through numerical optimization.

At the end of this course, the students should:

  • Be able to recognize convex optimization problems that arise in scientific fields and design appropriate cost functions
  • Have an understanding of how such problems are solved, and gain some experience in solving them
  • Have knowledge of the underlying tools behind training of machine learning models
  • Be able to implement backpropagation and stochastic optimization algorithms for large models

Program #

Part I - Fundamental Theory (Week 1) #

The course starts with the fundamental mathematical concepts of optimization and convex optimization:

  • Formulating an optimization problem
  • Reminders of linear algebra and formulating a constrained optimization problem
  • Reminders on differentiability
  • Convexity theory
  • Gradient methods (a Python sketch follows the schedule below)
  • Second-order methods

| Session | Duration | Content | Date | Room | Slides |
|---------|----------|---------|------|------|--------|
| CM1 | 1.5h | Introduction, linear algebra and differentiation reminders, and exercises | 2 June 2025, 10am | B-120 | Slides |
| CM2 | 1.5h | Steepest descent algorithm, Newton method and convexity | 2 June 2025, 1.15pm | B-120 | Slides |
| TD1 | 1.5h | Application to linear regression (1/2) | 2 June 2025, 3pm | C-213 | |
| CM3 | 1.5h | Linesearch algorithms and their convergence | 3 June 2025, 10am | B-120 | Slides |
| TD2 | 1.5h | Linesearch in linear regression (2/2) | 3 June 2025, 1.15pm | C-213 | |
| CM4 | 1.5h | Constrained optimization: linear programming and Lagrangian methods | 3 June 2025, 3pm | B-120 | Slides |
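
As a first taste of the gradient methods listed above, here is a minimal sketch of fixed-step gradient descent applied to Himmelblau's function (pictured at the top of this page). The step size, starting point, and stopping tolerance are illustrative choices, not values prescribed by the course.

```python
import numpy as np

def himmelblau(p):
    """Himmelblau's function f(x, y) = (x^2 + y - 11)^2 + (x + y^2 - 7)^2."""
    x, y = p
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

def himmelblau_grad(p):
    """Analytical gradient of Himmelblau's function."""
    x, y = p
    return np.array([
        4 * x * (x**2 + y - 11) + 2 * (x + y**2 - 7),
        2 * (x**2 + y - 11) + 4 * y * (x + y**2 - 7),
    ])

def gradient_descent(grad, x0, step=0.01, tol=1e-8, max_iter=10_000):
    """Fixed-step gradient descent; step and tol are illustrative, not tuned."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop when the gradient (almost) vanishes
            break
        x = x - step * g
    return x, k

x_star, n_iter = gradient_descent(himmelblau_grad, x0=[0.0, 0.0])
print(f"x* = {x_star}, f(x*) = {himmelblau(x_star):.2e}, iterations = {n_iter}")
```

Depending on the starting point, the iterates converge to one of the function's four minima, a useful reminder that gradient descent only finds local solutions on non-convex problems.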

Part II - Application to Image / Remote Sensing Problems (Week 1) #

We then apply these concepts in practice to inverse problems, with examples from image denoising in remote sensing:

  • Formulating an unconstrained optimization problem and solving it for a practical example
  • Numerical implementation in Python (a sketch follows the schedule below)

This lab will be supervised by Yassine Mhiri.

| Session | Duration | Content | Date | Room |
|---------|----------|---------|------|------|
| TP | 4h | Implementation of inverse problems for image processing | 4 June 2025, 1pm | C-213 |
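
To give a flavour of the lab, below is a minimal sketch of gradient descent applied to a Tikhonov-regularized denoising problem, minimize 0.5·||x − y||² + (λ/2)·||∇x||², where y is the noisy image. The regularization weight, iteration count, and synthetic data are illustrative assumptions; the lab itself may use a different model.

```python
import numpy as np

def denoise_tikhonov(y, lam=2.0, n_iter=300):
    """Gradient descent on f(x) = 0.5*||x - y||^2 + 0.5*lam*||grad x||^2."""
    step = 1.0 / (1.0 + 8.0 * lam)  # 1/L, L bounding the gradient's Lipschitz constant
    x = y.copy()
    for _ in range(n_iter):
        # Discrete Laplacian with periodic boundaries (np.roll), kept simple on purpose.
        lap = (np.roll(x, 1, 0) + np.roll(x, -1, 0)
               + np.roll(x, 1, 1) + np.roll(x, -1, 1) - 4.0 * x)
        x = x - step * ((x - y) - lam * lap)  # gradient step on the objective
    return x

# Synthetic test: a piecewise-constant image corrupted by Gaussian noise.
rng = np.random.default_rng(0)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
restored = denoise_tikhonov(noisy)
```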

Part III - Advanced Topics (Week 2) #

We then move on to more advanced topics:

  • Newton with linesearch and quasi-Newton methods
  • Proximal algorithms (a sketch follows the schedule below)

| Session | Duration | Content | Date | Room | Slides |
|---------|----------|---------|------|------|--------|
| CM5 | 1.5h | Projected gradient, then Newton / quasi-Newton methods | 12 June 2025, 10am | B-120 | Slides 1 / Slides 2 |
| CM6 | 1.5h | Study in autonomy | 12 June 2025, 1.15pm | B-120 | |
| TD3 | 1.5h | Study in autonomy | 12 June 2025, 3pm | C-213 | |
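
As a preview of the proximal algorithms topic, here is a minimal sketch of proximal gradient descent (ISTA) for the l1-regularized least-squares problem minimize 0.5·||Ax − b||² + λ·||x||₁. The synthetic data and the value of λ are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    """Proximal gradient (ISTA) for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L with L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        # Gradient step on the smooth part, then prox of the l1 term.
        x = soft_threshold(x - step * A.T @ (A @ x - b), step * lam)
    return x

# Synthetic sparse-recovery example.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.0, -2.0, 1.5]
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = ista(A, b, lam=0.1)
```

The l1 prox has the closed-form soft-thresholding expression used above, which is what makes the non-smooth term tractable within a gradient-style iteration.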

Part IV - Application to Machine Learning (Week 3) #

We finally apply all these concepts with a focus on the training of machine learning models:

  • Stochastic optimization
  • Shallow models: optimization for linear/logistic regression, support vector machines, the perceptron, and MLPs
  • Deep models: formulating the backpropagation algorithm for common layers (a training sketch follows the schedule below)

For this part, we will rely on Rémi Flamary's slides, available here and here.

| Type | Session | Duration | Content | Date | Room |
|------|---------|----------|---------|------|------|
| Lecture | CM7 | 1.5h | Stochastic optimization | 12 June 2025, 1.15pm | |
| Lecture | CM8 | 1.5h | Optimization for shallow models: from perceptron to MLP to CNN | 16 June 2025, 3pm | |
| Exercise | TD4 | 1.5h | Implementing backpropagation for neural networks | 17 June 2025, 1.15pm | |
| Lecture | CM9 | 1.5h | Optimization for deep models: Adam and SGD | 17 June 2025, 3pm | |
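
To connect stochastic optimization with backpropagation, here is a minimal sketch of mini-batch SGD training a one-hidden-layer MLP for binary classification, with the gradients derived by hand. The architecture, toy data, and hyperparameters are illustrative assumptions, not taken from the course slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR-like labels, not linearly separable.
X = rng.standard_normal((1000, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)

# One hidden layer: 2 -> 16 (tanh) -> 1 (sigmoid).
W1 = 0.5 * rng.standard_normal((2, 16)); b1 = np.zeros(16)
W2 = 0.5 * rng.standard_normal((16, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, batch_size = 0.1, 32  # illustrative hyperparameters
for epoch in range(50):
    perm = rng.permutation(len(X))
    for i in range(0, len(X), batch_size):
        idx = perm[i:i + batch_size]
        xb, yb = X[idx], y[idx][:, None]
        # Forward pass.
        h = np.tanh(xb @ W1 + b1)
        p = sigmoid(h @ W2 + b2)
        # Backward pass: gradients of the mean binary cross-entropy.
        dz2 = (p - yb) / len(xb)      # dL/d(pre-sigmoid activation)
        dW2 = h.T @ dz2; db2 = dz2.sum(0)
        dh = dz2 @ W2.T
        dz1 = dh * (1.0 - h**2)       # chain rule through tanh
        dW1 = xb.T @ dz1; db1 = dz1.sum(0)
        # SGD update on the mini-batch gradient estimate.
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

pred = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5
print("train accuracy:", (pred[:, 0] == y.astype(bool)).mean())
```

Each mini-batch gradient is an unbiased estimate of the full-batch gradient, which is the property that underlies SGD's convergence analysis.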

Prerequisites #

Students should have a working understanding of basic linear algebra and of the differentiability of functions of several variables (reminders will nevertheless be given at the beginning). Some ability to program in Python, with knowledge of NumPy, is desired.

Teaching Method #

The course alternates between theoretical lectures, mathematical exercises, and practical Python implementations of the algorithms studied. A mini-project in remote sensing will also be given to illustrate the methods on a real-world problem.

Resources #

The course is based on several resources, including:

  • Numerical Optimization by J. Nocedal and S. Wright, for linesearch and Newton methods
  • Proximal Algorithms monograph by N. Parikh and S. Boyd, for proximal methods
  • Deep Learning by I. Goodfellow, Y. Bengio and A. Courville, for backpropagation and stochastic optimization

Registration #

Register through Adum and/or by email to ammar.mian@univ-smb.fr.