Teratec / NAG - Seminar on Mathematical Optimization

October 10th, 2019 - Bruyères-le-Châtel

This seminar will cover recent developments in state-of-the-art continuous optimization. The day is split into four thematic sessions: introduction to optimization; derivative-free optimization; modern convex optimization; and algorithmic differentiation. Each session will last approximately 90 minutes.

1. Introduction to optimization

This session focuses on general optimization principles. We will give an overview of the capabilities and limitations of modern numerical solvers and of how to use them correctly:

  • benchmarking and choosing the solver;
  • the importance of precise derivative evaluations (see the sketch after this list);
  • interpreting convergence measures;
  • understanding the output of the solver;
  • avoiding common modelling pitfalls.
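
As a brief aside on the second point (illustrative only, not part of the seminar material), the toy Python snippet below shows why derivative precision matters: a forward finite-difference estimate of the derivative of sin(x) at x = 1 first improves and then degrades as the step h shrinks, because rounding error eventually dominates truncation error:

    import numpy as np

    def f(x):
        return np.sin(x)

    x0 = 1.0
    exact = np.cos(x0)  # exact derivative, used here only for comparison
    for h in (1e-1, 1e-5, 1e-9, 1e-13):
        fd = (f(x0 + h) - f(x0)) / h  # forward finite difference
        print(f"h={h:.0e}  error={abs(fd - exact):.2e}")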

2. Derivative-free optimization

Calibrating numerical models is a common optimization problem. Very often, evaluating these models involves heavy computations that can be very expensive, or very noisy if they are not run to full convergence. In such cases, classical derivative-based optimization methods may not be advisable, since computing the model's gradient becomes a difficult challenge. Finite differences are often unrealistic in terms of computing time, and even more sophisticated methods such as algorithmic differentiation can fail in the presence of noise. In this session we describe state-of-the-art derivative-free algorithms and explain how they handle noise in the function evaluations.
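
As a rough sketch of the idea (using SciPy's general-purpose Nelder-Mead simplex method as a stand-in for the specialised solvers discussed in the session; the quadratic "model" and its noise level are invented for illustration), a derivative-free method never requests a gradient, so evaluation noise cannot corrupt finite-difference derivative estimates:

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    def noisy_model(x):
        # Stand-in for an expensive simulation not run to full convergence:
        # the true objective plus stochastic evaluation noise.
        true_value = (x[0] - 1.0)**2 + (x[1] + 2.0)**2
        return true_value + 1e-3 * rng.standard_normal()

    result = minimize(noisy_model, x0=np.zeros(2), method="Nelder-Mead",
                      options={"xatol": 1e-2, "fatol": 1e-2})
    print(result.x)  # close to (1, -2) despite the noise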

3. Modern convex optimization

When the modelled problem is convex, numerical optimization solvers can usually exploit its far stronger theoretical properties. However, formulating the problem in a form the solver can exploit can be quite a challenge. We will present some background on conic convex problems (Second-Order Cone Programming, SOCP, and Semi-Definite Programming, SDP) and give some hints on the modelling possibilities that such solvers open up. We will show some common convex reformulations and examples of their use in various industries.
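
As one possible illustration (using the open-source modelling package CVXPY, which is not necessarily a tool covered in the session; the data A and b are random placeholders), a non-smooth 2-norm objective is rewritten automatically by such a modelling layer into a second-order cone program of the form "minimize t subject to ||Ax - b||_2 <= t":

    import cvxpy as cp
    import numpy as np

    # Random placeholder data for a bound-constrained least-norm fit.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    b = rng.standard_normal(20)

    x = cp.Variable(5)
    # The modelling layer reformulates this objective as an SOCP before
    # handing it to a conic solver.
    objective = cp.Minimize(cp.norm(A @ x - b, 2))
    problem = cp.Problem(objective, [x >= -1, x <= 1])
    problem.solve()
    print(x.value)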

4. Algorithmic Differentiation (AD)

Derivatives are required in many different areas of numerical computation. Algorithmic differentiation is a technique that takes a function definition written in a high-level programming language such as C++ or Fortran, and automatically generates code to compute its exact derivatives. Compared with the commonly used finite-difference (or bumping) approach, AD not only computes exact derivatives but also allows fast computation of gradients. In this session we will give a short introduction to AD, demonstrate an operator-overloading AD tool, and discuss the use of AD in the context of an optimization problem.
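
To give a flavour of the operator-overloading approach (a toy Python sketch, not one of the production AD tools discussed in the session), each overloaded arithmetic operation propagates a derivative alongside the value, so the result is exact to machine precision rather than a finite-difference approximation:

    class Dual:
        """Minimal forward-mode AD value: carries f and df/dx together."""
        def __init__(self, value, deriv=0.0):
            self.value, self.deriv = value, deriv

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value + other.value, self.deriv + other.deriv)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # Product rule applied at every overloaded multiplication.
            return Dual(self.value * other.value,
                        self.deriv * other.value + self.value * other.deriv)

        __rmul__ = __mul__

    def f(x):
        return 3 * x * x + 2 * x + 1   # f'(x) = 6x + 2

    y = f(Dual(2.0, 1.0))              # seed dx/dx = 1
    print(y.value, y.deriv)            # 17.0 and exactly 14.0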

Program: 

  • 08h45 - 09h00 Welcoming participants
  • 09h00 - 09h15 Overview of the day
  • 09h15 - 10h45 Introduction to optimization
  • 10h45 - 11h00 Break
  • 11h00 - 12h30 Derivative-free optimization
  • 12h30 - 13h30 Buffet lunch – Networking
  • 13h30 - 15h00 Modern convex optimization
  • 15h00 - 15h15 Break
  • 15h15 - 16h45 Algorithmic Differentiation (AD)
  • 16h45 - 17h00 Conclusion

Free seminar, limited places!

  • Lunch will be offered as a buffet to all participants.
  • For more information, contact NAG by email or by phone at +33 6 85 46 42 03
  • Seminar reserved for professionals.
  • Register HERE