Surrogate “Level-Based” Lagrangian Relaxation for Mixed-Integer Linear Programming

EasyChair Preprint no. 7595

25 pages. Date: March 17, 2022

Abstract

To efficiently solve MILP problems, a novel decomposition-and-coordination methodology based on Lagrangian Relaxation is developed. The success of the method relies on efficient optimization of the associated non-smooth dual function. To this end, the linear convergence rate (the best theoretically possible) inherent to Polyak's step-sizing formula is exploited without requiring the generally unknown optimal dual value. The key idea is to obtain "level" values, approaching the optimal dual value from above, through novel hyper-parameter-free "multiplier-convergence-feasibility" LP constraint-satisfaction problems; whenever such a problem is infeasible, the sought "level" estimate is inferred. Testing results for Generalized Assignment Problems (GAP) demonstrate that (1) "level" estimates decrease, and dual values increase, faster than with other existing methods, while the "level" values also provide tight upper bounds that quantify the quality of the Lagrangian multipliers; (2) the new method converges quickly and obtains high-quality feasible solutions, with globally optimal solutions obtained for the largest GAP instances; and (3) the new method is robust with respect to initial parameters such as step sizes and multipliers. Stochastic versions of Job-Shop Scheduling problems are also tested, demonstrating that the new method is at least two orders of magnitude faster than branch-and-cut.
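To illustrate the role a "level" value plays in Polyak's step-sizing formula, the following is a minimal sketch, not the paper's method: the toy concave function `q`, the fixed overestimate `level`, and all names are assumptions chosen for illustration, standing in for the dual function and the paper's inferred level estimates.

```python
# Toy concave "dual function": q(lam) = -|lam - 2|, optimum q* = 0 at lam = 2.
def q(lam):
    return -abs(lam - 2.0)

def supergradient(lam):
    # A supergradient of the concave q at lam (maximization convention).
    return -1.0 if lam > 2.0 else 1.0

level = 0.5   # assumed overestimate of the unknown optimal value q* = 0
lam = 10.0    # initial multiplier
best_q = q(lam)
for _ in range(50):
    g = supergradient(lam)
    # Polyak-style step with q* replaced by the "level": (level - q) / ||g||^2
    step = (level - q(lam)) / (g * g)
    lam += step * g
    best_q = max(best_q, q(lam))
```

With an exact level (`level == q*`) the iterates converge to the optimum; with an overestimate they settle in a band around it whose width is governed by the overestimation gap, which is why level estimates that decrease toward the optimal dual value matter.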

Keyphrases: Assignment Problems, discrete optimization, Job Shop Scheduling, Lagrangian relaxation, Mixed Integer Linear Programming, non-smooth optimization

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@Booklet{EasyChair:7595,
  author = {Mikhail Bragin},
  title = {Surrogate “Level-Based” Lagrangian Relaxation for Mixed-Integer Linear Programming},
  howpublished = {EasyChair Preprint no. 7595},
  year = {EasyChair, 2022}}