Spring 2016

**Office hours:** Tuesdays 2-4pm, Wednesdays 12 noon-2pm, or by appointment.

**Material on reserve** in the library.

**Scores** are available on LMS.

- Homework 1: Mean 51.5/60, median 54.5, min 36, max 60, standard deviation 8.0.
- Homework 2: Mean 53.5/60, median 55, min 34, max 60, standard deviation 6.6.
- Homework 3: Mean 45.5/50, median 47, max 50, standard deviation 4.6.
- Homework 4: Mean 43.8/50, median 44, max 49, standard deviation 4.0.
- Homework 5: Mean 52.3/60, median 53.5, max 60, standard deviation 6.1.
- Homework 6: Mean 43.4/50, median 43.5, max 50, standard deviation 5.8.
- Totals on homeworks 1-6 for students who have handed in all the homeworks, out of 330:

317 314 312 312 311 309 309 305 305 305 301 297 293 293 281 270 248 239 206

- Midterm: Mean 74.2/100, median 68.5, min 41, max 100, standard deviation 18.5.
- Scores on the midterm:

100 100 99 99 97 88 85 80 78 70 67 65 64 63 63 62 60 58 45 41
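As a sanity check, the reported midterm statistics can be reproduced from the listed scores with Python's standard `statistics` module (the reported standard deviation is the sample standard deviation):

```python
import statistics

# Midterm scores as listed above
scores = [100, 100, 99, 99, 97, 88, 85, 80, 78, 70,
          67, 65, 64, 63, 63, 62, 60, 58, 45, 41]

mean = statistics.mean(scores)      # 74.2
median = statistics.median(scores)  # 68.5 (average of the two middle scores)
stdev = statistics.stdev(scores)    # sample standard deviation, about 18.5

print(round(mean, 1), median, max(scores), round(stdev, 1))
# → 74.2 68.5 100 18.5
```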

**Projects:**
Here is some information about the requirements for your project reports and presentations.

**Midterm Exam:**
In class on Friday, April 8.

It will cover all material presented in class through Tuesday, March 29.

You can bring one sheet of handwritten notes, no larger
than 8.5" x 11". You can write on both sides.

Here are the solutions.

**Old exams:**

- The midterm exam from 2014. Here are the solutions. Mean: 69%.
- The midterm exam from 2012. Here are the solutions. Mean: 66%.
- The midterm exam from 2010. Here are the solutions. Mean: 71%.
- The midterm exam from 2008. Here are the solutions. Mean: 68%.
- The midterm exam from 2006. Here are the solutions. Mean: 69%.
- The take-home midterm exam from 2004. Here are the solutions.
- The take-home midterm exam from 2002. Here are the solutions.
- The take-home midterm exam from 2000. Here are the solutions.

**Homework:**

- Homework 1. Here are the solutions.
- Homework 2. Here are the solutions.
- Homework 3. Here are the solutions.
- Homework 4. Here are the solutions.
- Homework 5. Here are the solutions.
- Homework 6. Here are the solutions.

**Information about AMPL**.

**Notes:**
Handwritten scanned copies of my notes from previous semesters (pdf files):

- Introduction, basic feasible solutions, duality, and the simplex algorithm.
- Introduction: Lecture 1: Pages 1-8. Pages 5-8 are covered in the first three handouts.
- Basic feasible solutions, degeneracy: Lecture 2: Pages 9-19. Here are typewritten notes for part of this material.
- Duality: Lecture 3: Pages 20-29. Here are typewritten notes for part of this material.
- The simplex algorithm: Lecture 4: Pages 30-42. See also two handouts.
- Multiple optimal solutions, handling free variables and upper bounds: Lecture 5: Pages 43-50. See also a handout.
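A quick reference for the duality material above (Lecture 3, with strong duality via Farkas in Lecture 7): the standard symmetric primal-dual pair, sketched here in one common standard form (the notes may use a different one):

```latex
\text{(P)}\quad \max_x \; c^{\top}x \;\;\text{s.t.}\;\; Ax \le b,\; x \ge 0
\qquad\qquad
\text{(D)}\quad \min_y \; b^{\top}y \;\;\text{s.t.}\;\; A^{\top}y \ge c,\; y \ge 0
```

Weak duality, $c^{\top}x \le y^{\top}Ax \le b^{\top}y$ for any pair of feasible solutions, follows directly from the constraints; strong duality says the two optimal values coincide whenever either problem has a finite optimum.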

- Revised simplex method, resolution.
- Lecture 6: Revised simplex method, pages 1-8. Introduction to AMPL.
- Lecture 7: Dual simplex method, via a handout. Fourier-Motzkin elimination, pages 16-22. Using Farkas to prove strong duality, page 29.
- Lecture 8: Resolution, pages 9-15. The Weyl and Minkowski theorems, pages 23-28. Here are typewritten notes for part of this material (updated from class to include a proof of the affine Minkowski Theorem and an example for the Goldman Resolution Theorem).
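Fourier-Motzkin elimination (Lecture 7 above) can be sketched in a few lines: to eliminate a variable from a system of linear inequalities, combine every inequality that bounds it from above with every one that bounds it from below so that the variable cancels. A minimal sketch (my own illustration, not the course's code; dense coefficient lists, no redundancy removal):

```python
def fm_eliminate(rows, j):
    """Eliminate variable j from a system of inequalities a.x <= b.

    Each row is (coeffs, rhs).  Every pair of rows in which x_j has
    opposite signs is combined with positive multipliers so that x_j
    cancels; rows with a zero coefficient on x_j are kept unchanged.
    """
    pos = [r for r in rows if r[0][j] > 0]   # upper bounds on x_j
    neg = [r for r in rows if r[0][j] < 0]   # lower bounds on x_j
    out = [r for r in rows if r[0][j] == 0]  # unaffected rows
    for (ap, bp) in pos:
        for (an, bn) in neg:
            # ap[j] * (an.x <= bn)  +  (-an[j]) * (ap.x <= bp)
            coeffs = [ap[j] * an[k] - an[j] * ap[k] for k in range(len(ap))]
            out.append((coeffs, ap[j] * bn - an[j] * bp))
    return out

# Eliminate x from { x <= 2,  y - x <= 0 }: yields y <= 2.
system = [([1, 0], 2), ([-1, 1], 0)]
print(fm_eliminate(system, 0))  # → [([0, 1], 2)]
```

Each elimination step can square the number of inequalities, which is why the method is mainly of theoretical interest (e.g., for proving Weyl/Minkowski-type results, as in Lecture 8).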

- The Klee-Minty cube: Lecture 7.
- Probabilistic analysis of the simplex method: Lecture 7. Focus on the results on pages 4, 5, and 9.
- The ellipsoid method: Lecture 8.
- Dantzig-Wolfe decomposition, cutting stock problems, network simplex, multicommodity network flows.
- Lectures 9 and 10: introduction to Dantzig-Wolfe, pages 1-5. Dantzig-Wolfe iteration and behavior, block angular structure, pages 6-17. Also, an example of Dantzig-Wolfe decomposition.
- Lectures 10 and 11: Lagrangian relaxation, cutting stock problems, pages 18-27. Also, interpreting Dantzig-Wolfe decomposition as a cutting plane method for the Lagrangian dual.
- Lectures 12 and 13: Network simplex, multicommodity network flow problems, pages 28-37. Here are some notes on network flows.

- Stochastic programming.
- Lecture 14: pages 1-5. Here's some material on the newsvendor problem.
- Lecture 15: pages 6-11, 17-19.
- Lecture 16: pages 20-26.
- The results on sample average approximation can be found in a tutorial on stochastic programming by Shapiro and Philpott.
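The newsvendor problem linked above is the canonical starter example for stochastic programming, and its solution is worth recording here (a standard result, stated in my notation rather than necessarily that of the linked material): with per-unit underage cost $c_u$, per-unit overage cost $c_o$, and demand distribution function $F$, the expected-cost-minimizing order quantity is the critical fractile

```latex
q^{*} \;=\; F^{-1}\!\left(\frac{c_u}{c_u + c_o}\right).
```

For example, if unmet demand costs $c_u = 3$ per unit and leftover stock costs $c_o = 1$ per unit, the optimal order is the 75th percentile of the demand distribution.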

- Introduction to interior point methods: strict complementarity, the central path, primal-dual framework, complexity theory, affine methods, the centering direction, a potential-reduction method.

Pages 24-27, 29, 1, 3-7 covered in Lecture 17.

Pages 7-8, 2, 9-17 covered in Lecture 18.

Pages 18-22 covered in Lecture 20.

- Barrier methods.

Pages 6-18 covered in Lecture 20.

- Long-steps, quadratic convergence, predictor-corrector, finite termination, crossover, infeasible methods.

Pages 6-12 covered in Lecture 21.

Pages 13-16 covered in Lecture 22.

- Homogenized methods, Mehrotra's predictor-corrector, sparse linear algebra, interior point cutting plane methods.

~~Pages 1-4 covered in Lecture 21.~~

Pages 5-9 covered in Lecture 22.

Pages 10-14 summarized in ten minutes in Lecture 22.

Pages 19-25 covered in Lecture 23.

Slides 3/72 to 30/72 of this talk on interior point cutting plane methods were covered in Lecture 23. The talk is based on a paper.

- SDP and SOCP.

Pages 1-7 covered in Lecture 23.

Pages 18-20, 14-15, and 17 covered in Lecture 24.

Pages 13, 27, 30, and 22-24 covered in Lecture 25.

Pages 8-12, 21, 25-26, and 28 covered in Lecture 26.

Pages 31-33 covered in Lecture 27. The example on sensor location is here.

~~Page 16 covered in Lecture 28.~~

~~Lecture 28: Pages 35/72 to 53/72 and 67/72 to 69/72 from this talk on interior point cutting plane methods should be covered. It is based on a paper.~~

The book Convex Optimization, by Boyd and Vandenberghe, contains a wealth of material on SDP, SOCP, and conic programming.

- Completely positive programming. Lecture 28.

~~The material on linear programs with complementarity constraints in Lecture 28 is from two talks: slides 4/54 to 14/54 from this one and slides 12/41 to 24/41 from this one.~~

**Handouts:**

- Linear algebra. (Lecture 1.)
- Subspaces, affine sets, convex sets, and cones. (Lecture 1.)
- Dimension, polyhedra, and faces. (Lecture 1.)
- An iteration of the simplex algorithm, and the simplex algorithm itself. (Lecture 4.)
- Handling upper bounds in the simplex algorithm. (Lecture 5.)
- The dual simplex algorithm. (Lecture 6.)
- Extreme points and extreme rays of polyhedra. (Lecture 8.)
- An example of Dantzig-Wolfe decomposition. (Lecture 9.)

**Papers and resources:**

- An electronic copy of the textbook for the second half of the course is available for free through the library: S. Wright, Primal-dual Interior Point Methods, SIAM, 1997. Note: more generally, SIAM e-books are free to RPI students through the library. In addition, SIAM membership is free for RPI students, and SIAM books are discounted 30% for members if you buy directly from SIAM.
- The book A First Course in Linear Optimization, Second Edition, by Jon Lee, Reex Press, 2013/4/5, is available online.
- The book Convex Optimization, by Boyd and Vandenberghe, contains a wealth of material on SDP, SOCP, and conic programming.
- *Introduction to Stochastic Programming* by Birge and Louveaux is available online via the RPI library.
- A computational study of Dantzig-Wolfe decomposition, the doctoral thesis of James Tebboth.
- Myths and counterexamples in linear programming (and other areas of optimization). This site shows that you have to be careful about your assumptions when you state some things that are "obvious" in linear programming.
- Issue 87 of the Mathematical Optimization Society newsletter Optima, discussing the geometry of polyhedra and simplex pivoting rules.

John Mitchell's homepage. Mathematics Course Materials.