
Toward TransfORmers: Revolutionizing the Solution of Mixed Integer Programs with Transformers

Authors:
Joshua F. Cooper, Seung Jin Choi, I. Esra Buyuktahtakin
Keywords:
Artificial Intelligence (cs.AI), Machine Learning (cs.LG), Combinatorics (math.CO), Optimization and Control (math.OC), Machine Learning (stat.ML)
Journal:
--
Date:
2024-02-20
Abstract
In this study, we introduce an innovative deep learning framework that employs a transformer model to address the challenges of mixed-integer programs, focusing on the Capacitated Lot Sizing Problem (CLSP). To our knowledge, our approach is the first to use transformers to predict the binary variables of a mixed-integer programming (MIP) problem. Specifically, it harnesses the encoder-decoder transformer's ability to process sequential data, making it well suited to predicting the binary variables that indicate production setup decisions in each period of the CLSP. The problem is inherently dynamic, requiring sequential decision-making under constraints. We present an efficient algorithm in which CLSP solutions are learned through a transformer neural network. The proposed post-processed transformer algorithm surpasses the state-of-the-art solver CPLEX and a Long Short-Term Memory (LSTM) network in solution time, optimality gap, and percent infeasibility over the 240K benchmark CLSP instances tested. Once the ML model is trained, running inference on it, including post-processing, reduces the MIP to a linear program (LP). This turns the ML-based algorithm, combined with an LP solver, into a polynomial-time approximation algorithm for a well-known NP-hard problem, with almost perfect solution quality.
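To make the reduction concrete, here is a minimal sketch (not the authors' code) of the final step the abstract describes: once a trained model has predicted the binary setup decisions, fixing them in the CLSP leaves a linear program in the production and inventory variables, which an off-the-shelf LP solver handles in polynomial time. The horizon, demands, costs, capacity, and predicted setup vector below are all hypothetical placeholders.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical CLSP instance: 6 periods, constant capacity and costs.
T = 6
demand = np.array([40, 60, 30, 70, 50, 20], dtype=float)
capacity = 100.0                        # per-period production capacity
prod_cost, hold_cost, setup_cost = 2.0, 1.0, 50.0
y_pred = np.array([1, 1, 0, 1, 1, 0])   # binary setups, as a model might predict

# With y fixed, the CLSP is an LP in z = [x_1..x_T, I_1..I_T]
# (production quantities and end-of-period inventories).
c = np.concatenate([np.full(T, prod_cost), np.full(T, hold_cost)])

# Flow balance: I_{t-1} + x_t - I_t = d_t, with I_0 = 0.
A_eq = np.zeros((T, 2 * T))
for t in range(T):
    A_eq[t, t] = 1.0              # x_t
    A_eq[t, T + t] = -1.0         # -I_t
    if t > 0:
        A_eq[t, T + t - 1] = 1.0  # +I_{t-1}
b_eq = demand

# Capacity linked to the fixed setups: x_t <= capacity * y_t,
# so production is forbidden in periods predicted to have no setup.
A_ub = np.hstack([np.eye(T), np.zeros((T, T))])
b_ub = capacity * y_pred

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * (2 * T))

if res.success:
    # Setup costs are a constant once y is fixed; add them back.
    print("production plan:", res.x[:T].round(1))
    print("total cost:", res.fun + setup_cost * y_pred.sum())
else:
    # An infeasible LP signals a bad setup prediction; a post-processing
    # step (as in the paper) would repair y before re-solving.
    print("LP infeasible for the predicted setups")
```

Under these assumptions, a poor prediction affects only solution quality or feasibility, not tractability: the LP stays polynomial-time solvable, which is the source of the speedup over solving the full MIP with CPLEX.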
PDF: Toward TransfORmers: Revolutionizing the Solution of Mixed Integer Programs with Transformers.pdf