MATP6640/ISYE6770 Linear and Conic Optimization

TF 12:00–1:50, Voorhees South, Spring 2022

Course Outline: Optimization is concerned with the optimal allocation of resources, and linear programming is the fundamental optimization problem. There has been a great deal of progress in linear programming, and huge problems can now be solved routinely using two different types of approaches: the simplex method and interior point methods. This course discusses these two classes of approaches, the theory underlying them, and extensions of them. Interior point methods can be regarded as nonlinear programming methods applied to linear programming. They can also be used to solve nonlinear programming problems, particularly conic optimization problems.

  1. Polyhedral theory: (1 week) The feasible region of a linear programming problem is a polyhedron, i.e., the intersection of a finite collection of half-spaces. The structure of polyhedra is discussed, including the Goldman resolution theorem and the Weyl and Minkowski theorems.
  2. Duality: (1–2 weeks) Duality is fundamental to the study of linear programming, and to conic optimization and nonlinear programming more generally. We will return to duality throughout the course; a small primal-dual example is sketched just after this list.
  3. The simplex algorithm: (1–2 weeks) The simplex method is the traditional method for solving linear programming problems. I will discuss the simplex method in matrix terms, and indicate how it can be implemented efficiently.
  4. Decomposition and column generation methods for large-scale problems: (3–4 weeks) Some large linear programs can be broken into smaller ones, and the polyhedral structure of linear programming can be exploited to develop decomposition algorithms. For other problems, the number of variables is very large, and an attractive approach is to generate columns of the constraint matrix only as needed.
  5. The ellipsoid algorithm: (1 lecture) The ellipsoid algorithm was the first polynomial algorithm for linear programming. It was developed originally as an algorithm for nonlinear programming.
  6. Interior point methods for linear programming: (3 weeks) Topics to be discussed include the similarity of these algorithms to traditional nonlinear programming methods, the polynomial complexity of many of these algorithms, the asymptotic rate of convergence of the algorithms, and the computational results achieved. Efficient linear algebra techniques will also be discussed.
  7. Conic optimization: (3 weeks) Interior point methods for semidefinite programming and second order cone programming problems will be discussed.
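
As a brief preview of the duality theme in item 2, here is one standard primal-dual pair for linear programming (the precise forms used in lectures may differ):

    \[
    \text{(P)}\quad \min \{\, c^{\top}x : Ax \ge b,\ x \ge 0 \,\}
    \qquad\qquad
    \text{(D)}\quad \max \{\, b^{\top}y : A^{\top}y \le c,\ y \ge 0 \,\}
    \]

Weak duality says that b^T y <= c^T x for every pair of feasible solutions; strong duality says that the two optimal values coincide whenever either problem has a finite optimum.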

Grading policy: The grade will be determined by three elements: homework, a midterm exam, and a project. The three elements will be weighted equally. Grades will be posted on LMS.

I will give out a homework assignment approximately every two weeks. The homeworks will probably contain some computational work using AMPL, a mathematical programming modeling language. You should learn a fair amount from the homeworks, so try working out the solutions on your own. If you have difficulties, you may talk to me or to other students about the homeworks, but you must write up your solutions on your own. You may be able to find solutions to some of the homework questions on the web: do not use these solutions!
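
For a sense of what the computational parts involve, the following is a minimal sketch of a tiny linear program; it is written in Python/Pyomo here purely for illustration (the homework models themselves will be in AMPL), and the data and the choice of the GLPK solver are assumptions made for this sketch.

    # A minimal sketch (not a required template) of the kind of small linear
    # program you will model in the homeworks, written in Pyomo/Python purely
    # for illustration since the homework models will be in AMPL.
    # The data and the GLPK solver choice are assumptions for this sketch.
    from pyomo.environ import (ConcreteModel, Var, Objective, Constraint,
                               NonNegativeReals, SolverFactory, maximize)

    model = ConcreteModel()
    model.x = Var(domain=NonNegativeReals)   # two nonnegative decision variables
    model.y = Var(domain=NonNegativeReals)
    model.obj = Objective(expr=3 * model.x + 2 * model.y, sense=maximize)
    model.c1 = Constraint(expr=model.x + model.y <= 4)      # resource constraint 1
    model.c2 = Constraint(expr=model.x + 3 * model.y <= 6)  # resource constraint 2

    SolverFactory('glpk').solve(model)       # any installed LP solver works here
    print('x =', model.x.value, 'y =', model.y.value)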

There will be an in-class midterm exam on or about Friday, March 18. The exam must be entirely your own work.

The project will involve modeling and computational testing. Depending on the project, the testing may use AMPL or Pyomo, or it may use a programming language such as MATLAB, Python, C, or Fortran. You will write up your results and give a presentation in class. The presentation will take place either during the last week of classes or during the time scheduled for a final exam. Your project can be one of the following:

Student learning objectives: By the end of the course, students should be able to:

  1. Demonstrate a thorough understanding of the theory and algorithms of linear programming.
  2. Perform standard manipulations of linear optimization problems, especially those related to duality.
  3. Solve optimization problems using modern packages.
  4. Apply knowledge of mathematics, science, and engineering.
  5. Design and conduct experiments, and analyze and interpret data.
  6. Identify, formulate, and solve engineering problems.

Textbooks:

Required:

D. Bertsimas and J. Tsitsiklis, Introduction to Linear Optimization, Athena Scientific, 1997.

S. Wright, Primal-Dual Interior-Point Methods, SIAM, 1997. Covers most of the second half of the course. Note: SIAM membership is free for RPI students, and SIAM books are discounted 30% for members if you buy directly from SIAM. Further, SIAM e-books, including this one, are free for RPI students.

Online texts and resources:

J. Lee, A First Course in Linear Optimization, Reex Press, 2013–21. Available online from https://sites.google.com/site/jonleewebpage/home/publications/#book

S. Boyd, Convex Optimization, a MOOC that originally ran January–April 2014.

The following textbooks each cover part of the course and are on reserve in the library:

V. Chvátal, Linear Programming, Freeman, 1983. I used to use this book, but it is now out of print.

K. G. Murty, Linear Programming, Wiley, 1983. Similar to Chvátal.

C. Roos, T. Terlaky, and J.-Ph. Vial, Theory and Algorithms for Linear Optimization, Wiley, 1997; 2nd ed. 2005 (Springer). Excellent for interior point methods.

R. Vanderbei, Linear Programming, Kluwer, 1996; 2nd ed. 2001, 3rd ed. 2007 (Springer). Good coverage of both simplex and interior point methods.

Y. Ye, Interior Point Algorithms, Wiley, 1997. Very good for potential reduction interior point methods.

A. V. Fiacco and G. P. McCormick, Nonlinear Programming: Sequential Unconstrained Minimization Techniques, Wiley, 1968; reprinted by SIAM. The techniques presented in this book can be specialized to give interior point methods for linear programming.


Office Hours: Thursday 11am–1pm, on Webex: https://rensselaer.webex.com/meet/mitchj

Remote: While classes are remote, lectures will be prerecorded and posted on LMS and in Box. Class time will be used for Webex office hours, replacing the regularly scheduled office hours.

Prerequisites: MATP4700/ISYE4770, or permission of instructor. Familiarity with linear programming and linear algebra will be assumed. Some elements of MATP6600/ISYE6780 and MATP4820/ISYE4780, such as the Karush-Kuhn-Tucker optimality conditions and Newton’s method, will be used, although it is not necessary to have seen this material before.

Attendance: Strongly encouraged. Active engagement in class will greatly help you in understanding the material.

The World Wide Web: This outline, the homeworks, and other information will be available via my homepage,

http://www.rpi.edu/~mitchj/matp6640

In addition, an RPI LMS page has been set up for this course. The software package AMPL will be available from there.

Academic Integrity: Student-teacher relationships are based on mutual trust. Acts which violate this trust undermine the educational process. The Rensselaer Handbook defines various forms of academic dishonesty and procedures for responding to them. The penalties for cheating can include failure in the course, as well as harsher punishments.

Appealing grades: As with any other administrative question regarding this course, see me in the first instance. If we are unable to reach agreement, you may appeal my decision to the Head of the Mathematical Sciences Department.

Cell phones: Please make sure that cell phone ringers are turned off during class.

      John Mitchell
      Amos Eaton 325
      518-276-6915
      mitchj at rpi dot edu
      Office hours: Thursday 11am–1pm, https://rensselaer.webex.com/meet/mitchj