
The Many Faces of Duality

This article is based on a special meeting of the Mathematical Programming Study Group which was held to celebrate the work of Steven Vajda.

Duality is an important principle in Mathematical Programming. In the opening talk, Paul Williams (University of Southampton) explored different forms of duality in various areas of mathematics.

Duality can be regarded as a mapping between objects X and Y. The fundamental properties of duality are:

  • reflexivity: the dual of the dual brings you back to where you started;
  • relations in X are reflected in relations in the dual of X.

Examples of duality include:

  • the complement in set theory (see the sketch after this list);
  • negation in logic;
  • the swapping of faces and vertices in convex polyhedra: the dual of the cube is the octahedron; the dual of the dodecahedron is the icosahedron; while the tetrahedron is its own dual;
  • the swapping of regions and vertices in planar graphs.
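
To make the fundamental properties concrete for the first of these examples, here is a short sketch in standard set-theoretic notation (the notation is supplied here, not taken from the talk). Writing A^c for the complement of A within a universe X:

\[
  (A^{c})^{c} = A
  \qquad\text{(the dual of the dual returns the original set),}
\]
\[
  A \subseteq B \iff B^{c} \subseteq A^{c}
  \qquad\text{(inclusions in } X \text{ are reflected, with their direction reversed),}
\]
\[
  (A \cup B)^{c} = A^{c} \cap B^{c},
  \qquad
  (A \cap B)^{c} = A^{c} \cup B^{c}
  \qquad\text{(unions and intersections are exchanged).}
\]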

In Linear Programming (LP) we can define a dual problem for every problem by swapping rows and columns; right hand sides and cost coefficients; and reversing the signs of the inequalities and the direction of optimisation. As with other forms of duality, the dual of the dual is the original problem. However, LP duality has a much stronger property: if the original problem has a finite optimal solution, then so does the dual, and the two optimal objective values are equal. This leads on to the practical uses of LP duality: sensitivity analysis, the marginal values of goods, the economic costs of constraints and the proof of optimality.
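
For a problem written in the standard symmetric form, the construction just described gives the familiar primal and dual pair; the notation below is supplied here rather than taken from the talk:

\[
\begin{aligned}
\text{Primal:}\quad & \max\ c^{\mathsf T}x \quad \text{subject to } Ax \le b,\ x \ge 0,\\
\text{Dual:}\quad & \min\ b^{\mathsf T}y \quad \text{subject to } A^{\mathsf T}y \ge c,\ y \ge 0.
\end{aligned}
\]

The dual variables y are the marginal values (shadow prices) of the right hand sides b, which is where the economic interpretations mentioned above come from.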

Integer Programming (IP) presents more difficulties. With an LP problem the optimal solution lies at a vertex of the feasible region, on the boundary formed by the binding constraints. With an IP problem the optimal solution is an integer point which in general lies strictly inside that boundary.

Figure 1: Optimal Integer Solution Lies Strictly in Interior
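
A one-variable illustration of the same point (an example supplied here, not the one shown in the figure): in

\[
  \max\ x \quad\text{subject to}\quad 2x \le 5,\ x \ge 0,\ x \text{ integer},
\]

the continuous optimum x = 2.5 sits on the binding constraint, while the integer optimum x = 2 lies strictly inside it.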

Nonetheless there are theoretical results for the dual of an IP and these have economic implications for the allocation of fixed costs, project selection with limited capital, the tariff structures of public utilities, etc.

Chairing the meeting, Ailsa Land (London School of Economics) noted that everybody using LP in the UK was for many years entirely dependent on Vajda's books, and indeed was almost certainly taught by him, since he gave courses and one-off lectures in very many places.

Steven Vajda spoke next. He noted that there had been numerous occurrences of coincident but independent discovery in science and mathematics. The greatest was Leibniz's and Newton's discovery of the calculus. Fifty years ago, Dantzig and Kantorovich had separately discovered algorithms for solving LPs.

With a worked example, he compared Dantzig's Simplex method with Kantorovich's method, showing the similarities. Both involve looking at values in tableaux, checking a property to assess whether one is at the optimum and, if not, exchanging variables. However, while the optimal Simplex tableau yields dual values, these are not apparent with Kantorovich's method.

Jakob Krarup (DIKU, Copenhagen) explored pairs of problems which were duals in the sense that they were based on the same data and their optimal solutions were essentially the same. In the seventeenth century Fermat had posed this problem: given three points in the plane, find a fourth point such that the sum of the distances from it to the three points is minimised. Torricelli (inventor of the barometer) had solved this with an elegant geometric construction. In the eighteenth century, in The Lady's Diary or Woman's Almanack, Thomas Moss had posed the problem: given a triangle, circumscribe about it the largest possible equilateral triangle. The centre of this triangle is the Torricelli point which solves Fermat's problem.

Figure 2: Construction of the Torricelli Point
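
The talk's construction was geometric. As a numerical aside, Weiszfeld's iteration, a standard method not discussed at the meeting, locates the same point by repeatedly re-weighting the given points by the reciprocals of their distances from the current estimate; a minimal Python sketch follows (the function name and the sample triangle are illustrative only).

import math

def fermat_point(points, iterations=200):
    # Weiszfeld's iteration for the point minimising the sum of distances
    # to the given points (the Fermat/Torricelli point for three points).
    # Start from the centroid.
    x = sum(p[0] for p in points) / len(points)
    y = sum(p[1] for p in points) / len(points)
    for _ in range(iterations):
        num_x = num_y = denom = 0.0
        for (px, py) in points:
            d = math.hypot(x - px, y - py)
            if d == 0.0:          # degenerate case: iterate has hit a given point
                return (px, py)
            num_x += px / d
            num_y += py / d
            denom += 1.0 / d
        x, y = num_x / denom, num_y / denom
    return (x, y)

# Example triangle with all angles below 120 degrees, so the Torricelli
# point lies inside it and sees each side under an angle of 120 degrees.
print(fermat_point([(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]))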

Gautam Appa (London School of Economics) was concerned with duality in the residual problems left when all the integer variables in a zero-one MIP problem had been fixed at integer values. New and old results were derived by this approach for problems such as where to site an obnoxious facility and cutting a rectangle to minimise the number of defects. He showed that the main algorithm used in robust regression (regression which aims to avoid being driven by spurious outliers) does not necessarily give locally optimal solutions and that multiple optimal solutions may exist such that any data point could be an outlier.

The first stage in solving an IP problem is to find the continuous optimum, i.e. the optimum solution in which the decision variables are permitted to take non-integer values. The difference between this continuous optimum and the integer optimum is known as the duality gap. Martin Fieldhouse (Haverly Systems Europe) was considering a particular class of IP problems known as cutting stock problems. These are concerned with dividing up wide rolls of material (e.g. paper) into narrower widths. He asked what was the largest duality gap for cutting stock problems.
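
In symbols, writing z_LP for the optimal value of the continuous relaxation and z_IP for the optimal value over integer solutions of a minimisation problem such as cutting stock (the notation is supplied here, not taken from the article):

\[
  \text{gap} \;=\; z_{\mathrm{IP}} - z_{\mathrm{LP}} \;\ge\; 0 .
\]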

It had been thought that the largest duality gap was 1. This occurs, for instance, in making widths of 50, 38, 36, 28, 24 and 24 from material 100 wide. The continuous solution is:

0.5 * {50, 50}
0.5 * {38, 38, 24}
0.5 * {36, 36, 28}
0.5 * {28, 24, 24, 24}

with a total use of 2.0 rolls of material. The integer optimum is 3.0 rolls, e.g.:

1.0 * {50, 38} trim loss 12
1.0 * {36, 28, 24} trim loss 12
1.0 * {24} trim loss 76
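
Since the six widths sum to exactly 200, a two-roll integer solution would have to split them into two groups each fitting within the 100 width, and no such split exists. The following minimal Python sketch (supplied here, not part of the article) confirms this by enumeration, so three rolls are needed and the gap for this instance is indeed 1.

from itertools import combinations

widths = [50, 38, 36, 28, 24, 24]   # required pieces
roll = 100                          # width of the stock material

def fits_in_two_rolls(pieces, capacity):
    # Try every way of assigning pieces to the first roll;
    # the remaining pieces go on the second roll.
    for r in range(len(pieces) + 1):
        for first in combinations(range(len(pieces)), r):
            a = sum(pieces[i] for i in first)
            b = sum(pieces) - a
            if a <= capacity and b <= capacity:
                return True
    return False

print(fits_in_two_rolls(widths, roll))   # False: three rolls are needed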

He had found a problem with a duality gap of 1.0333, and this had since been increased to 1.0666, but it was not known whether this was the largest possible or, indeed, whether the worst-case gap was even finite. The reader is invited to find trim problems with duality gaps greater than 1.

As a theoretical physicist, Peter Landsberg (University of Southampton) found the LP community to be a group of optimisers, which reminded him of entropy maximisation. He therefore considered some properties which entropy functions may, or may not, possess. Typical among these are superadditivity (the entropy of a combined system A+B is at least the sum of the entropies of A and B) and concavity (as in OR). These were discussed and, as an extra, he used the laws of thermodynamics to "prove" that the arithmetic mean is at least as great as the geometric mean.
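
In symbols (standard statements, supplied here rather than quoted from the talk), the two properties are

\[
  S(A + B) \;\ge\; S(A) + S(B)
  \qquad\text{(superadditivity over combined systems),}
\]
\[
  S\bigl(\lambda x + (1-\lambda)y\bigr) \;\ge\; \lambda S(x) + (1-\lambda)S(y),
  \qquad 0 \le \lambda \le 1
  \qquad\text{(concavity),}
\]

and the inequality of means that the thermodynamic argument delivers is

\[
  \frac{a_{1} + a_{2} + \cdots + a_{n}}{n} \;\ge\; \sqrt[n]{a_{1}a_{2}\cdots a_{n}},
  \qquad a_{i} \ge 0 .
\]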

Mike Dempster (University of Essex) explored the extension of Mathematical Programming duality into Nonlinear Programming and Stochastic Programming, i.e. where the probabilistic response of the system was explicitly recognised. There were wide areas of application, including the management of financial portfolios, forestry, electricity generation and water resources. In each of these one could take actions now, but the response of the system depended on factors beyond one's control, e.g. financial markets, the weather and consumer actions.

The approach to solving such problems was to develop sub-problems in many parallel scenarios, ascribe a probability to each scenario and seek to find solutions which were good across all scenarios. However, the number of such scenarios grew exponentially with the number of time periods and sampling was essential to make the technique practicable. This led to algorithms based on progressive hedging against future uncertainty and to the concept of the shadow price of information.
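
A scenario-based formulation of the kind described can be sketched as follows, in standard two-stage notation supplied here rather than taken from the talk: choose here-and-now decisions x, then recourse decisions y_s in each scenario s, which occurs with probability p_s,

\[
  \min_{x,\;y_{s}} \; c^{\mathsf T}x + \sum_{s} p_{s}\, q_{s}^{\mathsf T} y_{s}
  \quad\text{subject to}\quad
  Ax = b, \qquad
  T_{s}x + W y_{s} = h_{s} \ \text{for every scenario } s, \qquad
  x \ge 0,\ y_{s} \ge 0 .
\]

With k possible outcomes per period over T periods there are k^T scenarios, which is why sampling becomes essential.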

At the culmination of the meeting Lyn Thomas, the President of the OR Society, presented Steven Vajda with the Companionship of OR in recognition of his outstanding contribution over 50 years.

After the meeting about 20 participants repaired to the Senior Common Room at the London School of Economics where a fine dinner was held in Steven Vajda's honour. Thanks go to all the participants and in particular to Susan Powell for organising the meeting and to Maurice Shutler for arranging for the use of the conference room at the Monopolies and Mergers Commission.

To find other articles, refer to the MP in Action page.
