
COA-GPT: Generative Pre-trained Transformers for Accelerated Course of Action Development in Military Operations

Author:
Vinicius G. Goecks, Nicholas Waytowich
Keyword:
Computer Science, Artificial Intelligence, Artificial Intelligence (cs.AI), Computation and Language (cs.CL), Human-Computer Interaction (cs.HC), Machine Learning (cs.LG)
journal:
--
date:
2024-02-01
Abstract
The development of Courses of Action (COAs) in military operations is traditionally a time-consuming and intricate process. Addressing this challenge, this study introduces COA-GPT, a novel algorithm employing Large Language Models (LLMs) for rapid and efficient generation of valid COAs. COA-GPT incorporates military doctrine and domain expertise into LLMs through in-context learning, allowing commanders to input mission information (in both text and image formats) and receive strategically aligned COAs for review and approval. Uniquely, COA-GPT not only accelerates COA development, producing initial COAs within seconds, but also facilitates real-time refinement based on commander feedback. This work evaluates COA-GPT in a military-relevant scenario within a militarized version of the StarCraft II game, comparing its performance against state-of-the-art reinforcement learning algorithms. Our results demonstrate COA-GPT's superiority in generating strategically sound COAs more swiftly, with the added benefits of enhanced adaptability and alignment with commander intentions. COA-GPT's capability to rapidly adapt and update COAs during missions presents a transformative potential for military planning, particularly in addressing planning discrepancies and capitalizing on emergent windows of opportunity.
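
The abstract describes an in-context-learning loop: doctrine and mission information are supplied to an LLM as context, an initial COA is returned for commander review, and commander feedback is folded back in for refinement. The sketch below illustrates that general pattern only; the prompt wording, model name, and function names are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of an in-context-learning COA loop, assuming an OpenAI-style
# chat API. Prompt contents and model choice are hypothetical.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

DOCTRINE_CONTEXT = (
    "You are a military planning assistant. Follow the provided doctrine and "
    "the commander's intent when proposing Courses of Action (COAs)."
)


def generate_coa(mission_description: str, feedback: str | None = None) -> str:
    """Generate an initial COA, or refine one when commander feedback is given."""
    messages = [
        {"role": "system", "content": DOCTRINE_CONTEXT},
        {
            "role": "user",
            "content": f"Mission information:\n{mission_description}\n\nPropose a COA.",
        },
    ]
    if feedback:
        # Real-time refinement: commander feedback is appended to the context.
        messages.append(
            {"role": "user", "content": f"Commander feedback:\n{feedback}\n\nRevise the COA."}
        )
    response = client.chat.completions.create(model="gpt-4", messages=messages)
    return response.choices[0].message.content


# Usage: initial COA within seconds, then refinement after commander review.
mission = "Seize objective BRAVO while protecting the supply route on the east flank."
coa_v1 = generate_coa(mission)
coa_v2 = generate_coa(mission, feedback="Prioritize minimizing friendly aviation exposure.")
```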
PDF: COA-GPT: Generative Pre-trained Transformers for Accelerated Course of Action Development in Military Operations.pdf