Summary Sheet
Academic Year 2019/2020
School: School of Industrial and Information Engineering
Course: 095972 - OPTIMIZATION
Lecturer: Amaldi Edoardo
CFU: 8.00  Course type: Monodisciplinary

Degree programmes (study plan previously approved: *AZZZZ):
- Ing Ind - Inf (Mag.)(ord. 270) - MI (263) MUSIC AND ACOUSTIC ENGINEERING: 051822 - NONLINEAR OPTIMIZATION
- Ing Ind - Inf (Mag.)(ord. 270) - MI (473) AUTOMATION AND CONTROL ENGINEERING - INGEGNERIA DELL'AUTOMAZIONE: 089187 - OTTIMIZZAZIONE DISCRETA; 051822 - NONLINEAR OPTIMIZATION; 051823 - DISCRETE OPTIMIZATION
- Ing Ind - Inf (Mag.)(ord. 270) - MI (474) TELECOMMUNICATION ENGINEERING - INGEGNERIA DELLE TELECOMUNICAZIONI: 051822 - NONLINEAR OPTIMIZATION
- Ing Ind - Inf (Mag.)(ord. 270) - MI (481) COMPUTER SCIENCE AND ENGINEERING - INGEGNERIA INFORMATICA: 051823 - DISCRETE OPTIMIZATION; 051822 - NONLINEAR OPTIMIZATION
- Ing Ind - Inf (Mag.)(ord. 270) - MI (487) MATHEMATICAL ENGINEERING - INGEGNERIA MATEMATICA: 095972 - OPTIMIZATION

Course objectives
 
The original 8-credit version of this course is linked to two joint courses: 051823 - DISCRETE OPTIMIZATION (5 credits) and 051822 - NONLINEAR OPTIMIZATION (5 credits). This description contains the objectives, programs and expected learning outcomes of the three versions of the course.

The purpose of the overall version of the course is to present the main concepts and methods of discrete optimization and nonlinear (continuous) optimization, which make it possible to tackle a wide variety of decision-making problems arising in science, engineering and management. Modeling and application-related aspects are also covered. Optimization problems from, among others, data mining, economics, energy, healthcare, machine learning, logistics, production, telecommunications and transportation are discussed.

After the introduction and the fundamentals of convex analysis for optimization (common to the two joint courses), the course is subdivided into two parts. The first is devoted to Discrete Optimization and in particular to integer linear programming, that is, optimization problems with a linear objective function and linear constraints where some (or all) variables are restricted to take discrete (integer) values. The second part is devoted to Nonlinear Optimization with continuous variables.
 
During the computer laboratory sessions, the students will learn how to use the AMPL modeling language (with state-of-the-art solvers such as CPLEX) and MATLAB to build optimization models and solve (mixed) integer linear and nonlinear optimization problems.
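As an illustration of the kind of model built in the labs, here is a sketch in Python using SciPy's `milp` (available in SciPy 1.9 and later) rather than the AMPL/CPLEX and MATLAB tools actually used in the course; the small production-mix instance is chosen here for illustration:

```python
# Illustrative sketch only: the course labs use AMPL (with CPLEX) and MATLAB;
# this shows the same kind of (integer) linear model in Python/SciPy.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Small production-mix ILP: maximize 5x + 4y
# subject to 6x + 4y <= 24, x + 2y <= 6, x, y >= 0 integer.
res = milp(
    c=np.array([-5.0, -4.0]),                        # milp minimizes: negate
    constraints=LinearConstraint([[6, 4], [1, 2]], ub=[24, 6]),
    integrality=np.array([1, 1]),                    # both variables integer
    bounds=Bounds(0, np.inf),
)
print(res.x, -res.fun)  # optimal integer solution and its value
```

Note that the LP relaxation of this instance has the fractional optimum (3, 1.5) with value 21, while the integer optimum is (4, 0) with value 20, which is exactly the gap that the discrete part of the course studies.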

Expected learning outcomes

 

In the following, the expected learning outcomes related to the joint courses Discrete Optimization and Nonlinear Optimization are labelled with, respectively, (1) and (2).

 

Knowledge and understanding

 

The students will learn:

- the key convex analysis concepts and results for optimization (1,2)

- a wide selection of optimization models arising in a variety of fields of application   (1)
- the main modeling techniques for Integer Linear Programming (ILP) problems  (1)
- the concepts of strong and ideal formulations, and how to strengthen a given formulation (1)
- some important relaxations, in particular the Lagrangian relaxation with the subgradient method (1)
- the cutting plane methods for generic and structured ILP problems  (1)
- the general scheme of the Branch-and-Cut method (1)
- the column generation method and Benders' decomposition for tackling large-scale problems (1)
- the greedy, local search and Tabu search heuristics (1)

- the optimality conditions for unconstrained nonlinear optimization  (2)
- the gradient, conjugate directions, Newton and quasi-Newton methods, with their convergence properties  (2)
- the subgradient method for convex non-smooth optimization  (2)
- the optimality conditions for constrained nonlinear optimization  (2)
- the concepts and results of Lagrangian duality  (2)
- the null-space and active set methods for quadratic programming problems  (2)
- the quadratic penalty method and the augmented Lagrangian method  (2)
- the logarithmic barrier method  (2)
- the fundamentals of sequential quadratic programming (SQP). (2)


Applying knowledge and understanding


The students will be able to:

- build optimization models starting from the decision problem descriptions, solve them using a mathematical modeling language and/or a state-of-the-art optimization solver  (1,2)

- enhance (linearize and strengthen) ILP formulations  (1)
- recognize an ideal ILP formulation based on the concept of totally unimodular matrices  (1)
- apply the studied methods, such as for instance cutting planes, column generation, Benders' decomposition and heuristics, to some small problems  (1)
- devise a Lagrangian relaxation for an assigned discrete optimization problem and discuss its strength  (1)

- establish whether a given nonlinear optimization problem is convex  (2)
- identify an appropriate method to tackle a given nonlinear optimization problem and apply it to small examples  (2)
- solve a constrained nonlinear optimization problem via the Karush-Kuhn-Tucker optimality conditions  (2)
- construct the Lagrangian dual of a given problem and establish the relationship with the primal problem  (2)
- solve (un)constrained nonlinear optimization problems with some of the studied methods in MATLAB  (2)

 

- leverage the acquired theoretical and algorithmic knowledge to answer relevant questions.  (1,2)

 

Communication skills


The students will be able to clearly describe any studied method, its main properties and its (dis)advantages with respect to alternative algorithms.  (1,2)

 


Topics covered
 
In the following, the topics related to the joint courses Discrete Optimization and Nonlinear Optimization are labelled with, respectively, (1) and (2).
 

1. Introduction  (1,2)

 

Discrete (integer) and nonlinear (continuous) optimization models, local and global optima. Three examples: 3-D image reconstruction (computed tomography), location and transportation problem, winner determination in combinatorial auctions. Some fields of application.

 

2. Fundamentals of convex analysis  (1,2)

 

Separating/supporting hyperplanes, main consequences (Farkas' Lemma), characterization of convex functions, subgradients.
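For reference, one standard form of Farkas' Lemma among its several equivalent statements:

```latex
% Farkas' Lemma (one of several equivalent forms).
% For $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$,
% exactly one of the following two systems has a solution:
\begin{align*}
\text{(i)}\quad  & Ax = b,\; x \ge 0, \\
\text{(ii)}\quad & A^{\mathsf T} y \ge 0,\; b^{\mathsf T} y < 0.
\end{align*}
```

Geometrically, either $b$ lies in the cone generated by the columns of $A$, or a hyperplane (with normal $y$) separates $b$ from that cone.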

 

3. Discrete optimization  (1)


- Integer linear programming (ILP) problems and modeling techniques with illustrative examples, including covering, packing, partitioning, location, routing, network flows, lot sizing and scheduling.
- Valid inequalities. Alternative, strong and ideal formulations.
- "Easy" ILP problems and totally unimodular matrices.
- Linear, surrogate and combinatorial relaxations.
- Lagrangian relaxation: Lagrangian subproblem, dual problem, subgradient method, choice of Lagrangian dual. Examples: symmetric travelling salesman and generalized assignment problems.
- Branching and bounding strategies for the Branch-and-Bound method (the general method is a prerequisite).
- Cutting plane methods: Chvátal-Gomory procedure, fractional and mixed integer Gomory cuts, facets of polyhedra, strong valid inequalities for structured ILP problems (cover inequalities for the binary knapsack problem, clique and odd cycle inequalities for the independent set problem, cut-set inequalities for TSP), separation problem and lifting procedure.
- Branch-and-Cut method: general scheme and main components.
- Large-scale problems: column generation method and Benders' decomposition.
- Heuristics (greedy, local search and Tabu search) and approximation algorithms with performance guarantee.
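As a small illustration of the greedy and local search heuristics listed above, here is a Python sketch for the 0-1 knapsack problem; the instance, function name and 1-flip neighbourhood are chosen here for illustration (Tabu search, also covered in the course, is omitted):

```python
# Greedy construction followed by 1-flip local search for 0-1 knapsack
# (illustrative sketch; not the course's reference implementation).
def knapsack_heuristic(values, weights, capacity):
    n = len(values)
    # Greedy: take items by decreasing value/weight ratio while they fit.
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)
    x = [0] * n
    load = 0
    for i in order:
        if load + weights[i] <= capacity:
            x[i] = 1
            load += weights[i]
    # Local search: flip a single item in/out as long as the solution
    # improves and remains feasible.
    improved = True
    while improved:
        improved = False
        for i in range(n):
            y = x[:]
            y[i] = 1 - y[i]
            w = sum(weights[j] for j in range(n) if y[j])
            v_new = sum(values[j] for j in range(n) if y[j])
            v_old = sum(values[j] for j in range(n) if x[j])
            if w <= capacity and v_new > v_old:
                x, improved = y, True
    return x

print(knapsack_heuristic([10, 13, 7, 4], [5, 8, 4, 3], 12))  # -> [1, 0, 1, 1]
```

On this instance the greedy solution happens to be optimal (value 21), but in general such heuristics give no optimality guarantee, which motivates the approximation algorithms with performance guarantees mentioned above.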


4. Unconstrained nonlinear optimization  (2)


- Examples: statistical estimation, training artificial neural networks.
- Optimality conditions based on feasible directions.
- Line search methods and convergence rate.
- One-dimensional optimization (Wolfe conditions).
- Line search methods: gradient, conjugate directions, Newton and quasi-Newton methods (DFP and BFGS).
- Subgradient method for convex non-smooth optimization.
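The line search methods above can be sketched, for the steepest-descent case, with an Armijo backtracking step (a simplification of the Wolfe conditions covered in the course; the test function and parameter values below are illustrative):

```python
# Gradient method with Armijo backtracking line search (illustrative sketch).
import numpy as np

def grad_descent(f, grad, x0, iters=200, alpha0=1.0, beta=0.5, c=1e-4):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        if np.linalg.norm(g) < 1e-8:        # (approximate) stationary point
            break
        d = -g                              # steepest-descent direction
        a = alpha0
        # Armijo condition: sufficient decrease along d
        while f(x + a * d) > f(x) + c * a * g.dot(d):
            a *= beta                       # backtrack
        x = x + a * d
    return x

# Convex quadratic with minimizer at (1, -2)
f = lambda x: (x[0] - 1)**2 + 2 * (x[1] + 2)**2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
print(grad_descent(f, grad, [5.0, 5.0]))
```

Since d is a descent direction (g·d = -||g||² < 0), the backtracking loop always terminates; the Wolfe conditions studied in the course add a curvature requirement on top of this sufficient-decrease test.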

 

5. Constrained nonlinear optimization  (2)

 

- Examples: designing linear classifiers and training Support Vector Machines (SVMs).
- Necessary optimality conditions: constraint qualification, Karush-Kuhn-Tucker conditions.
- Sufficient optimality conditions: Lagrange function, saddle points, weak and strong duality.
- Quadratic programming: examples from portfolio optimization and machine learning, null space method for equality constrained problems, active set methods for problems that also include inequality constraints.
- Quadratic penalty and augmented Lagrangian methods.
- Barrier method and an application to linear programming (interior point method).
- Fundamentals of sequential quadratic programming (SQP).
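As a minimal sketch of the quadratic penalty method on a small equality-constrained problem (the instance is chosen here for illustration; since the penalized problem stays quadratic, each subproblem is solved exactly by a linear system rather than iteratively):

```python
# Quadratic penalty method (illustrative sketch):
# minimize f(x) = ||x||^2  subject to  x1 + x2 = 1.
# Penalized problem: f(x) + (mu/2) * (e^T x - 1)^2, with e = (1, 1);
# mu is increased until the constraint is (nearly) satisfied.
import numpy as np

e = np.array([1.0, 1.0])
x = np.zeros(2)
for mu in [1.0, 10.0, 100.0, 1e4]:
    # Stationarity of the penalized objective: (2 I + mu e e^T) x = mu e
    H = 2.0 * np.eye(2) + mu * np.outer(e, e)
    x = np.linalg.solve(H, mu * e)
print(x)  # approaches the true minimizer (0.5, 0.5) as mu grows
```

The iterates are infeasible for every finite mu, which is exactly the ill-conditioning issue that the augmented Lagrangian method listed above is designed to avoid.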

 

A few topics will be assigned for independent study before related lectures or sessions devoted to exercises.

 

Organization


Besides the lectures, the course includes 9 sessions devoted to exercises (seven 90-minute sessions and two 60-minute sessions) and 6 sessions devoted to computer laboratory work (90 minutes each). Every week the students will be assigned a set of exercises to be solved before the next exercise session.


Joint courses: "Discrete Optimization" and "Nonlinear Optimization"


The courses "Discrete Optimization" and "Nonlinear Optimization" (5 credits each) correspond to two overlapping parts of the course "Optimization" (8 credits).
While "Optimization" includes all the material presented during the lectures, the exercise and computer laboratory sessions,
- the course "Discrete Optimization" (5 credits) includes Chapters 1-3, exercise sets 1-5 and computer laboratory sessions 1-3, using AMPL and CPLEX;
- the course "Nonlinear Optimization" (5 credits) includes Chapters 1, 2, 4 and 5, exercise sets 1 and 6-9 and computer laboratory sessions 4-6, using MATLAB.
The students enrolled in both courses "Discrete Optimization" (5 credits) and "Nonlinear Optimization" (5 credits) must take the written exam of "Optimization" (8 credits) and complete a project or an individual study (2 credits) to be defined with the instructor.

 

 


Prerequisites

The prerequisites for the course include linear programming (basic feasible solutions, simplex algorithm, duality), graph optimization (minimum spanning tree, shortest paths, maximum flow), the Branch-and-Bound method for ILP and combinatorial optimization problems, and the basics of NP-completeness theory. These topics are covered in the course "Foundations of Operations Research".


Assessment method

A written exam covering all the material presented during the lectures, the sessions devoted to exercises, and the computer laboratory sessions and assignments. For the joint courses Discrete Optimization and Nonlinear Optimization, the material is that related to the lectures, exercises and computer laboratories of the corresponding part of the course, as detailed in the above program.

The written exam consists of several questions aimed at evaluating the level of (applying) knowledge and understanding of the main concepts, models, results and methods covered in the course, as specified in the expected learning outcomes. In particular, it includes modeling questions, questions involving the application of the studied methods to assigned problems or the solution of numerical problems with appropriate algorithms, and theoretical questions related to the main concepts, methods and results. An open-answer question will also allow assessment of communication skills.

 

In addition, a few brief computer laboratory tests are scheduled at the end of some computer laboratory sessions.

 

The final grade is determined by adding to the grade of the written exam the points obtained in the computer laboratory tests.

 


Bibliography
Required resource: Slides of the lectures and material for the exercise and computer laboratory sessions, http://home.deib.polimi.it/amaldi/OPT-19-20.shtml
Note: the material will be made available during the semester.

Optional resource: M. S. Bazaraa, H. D. Sherali, C. M. Shetty, Nonlinear Programming: Theory and Algorithms, third edition, Wiley Interscience, 2006
Optional resource: S. Boyd, L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004, http://www.stanford.edu/~boyd/cvxbook/
Optional resource: M. Conforti, G. Cornuejols, G. Zambelli, Integer Programming, Springer, 2014
Optional resource: J. Nocedal, S. Wright, Numerical Optimization, first edition, Springer, 1999
Optional resource: L. A. Wolsey, Integer Programming, J. Wiley & Sons, 1998

Teaching formats
Format | Classroom hours (hh:mm) | Independent study hours (hh:mm)
Lectures | 52:00 | 89:00
Exercise sessions | 16:00 | 27:00
Computer laboratory | 12:00 | 4:00
Experimental laboratory | 0:00 | 0:00
Project laboratory | 0:00 | 0:00
Total | 80:00 | 120:00

Information in English to support internationalization
Course taught in English
Teaching material/slides available in English
Textbooks/bibliography available in English
The exam may be taken in English
Teaching support available in English
08/12/2019