Conference paper, 2023

Exploiting hidden structures in non-convex games for convergence to Nash equilibrium

Abstract

A wide array of modern machine learning applications – from adversarial models to multi-agent reinforcement learning – can be formulated as non-cooperative games whose Nash equilibria represent the system's desired operational states. Despite having a highly non-convex loss landscape, many cases of interest possess a latent convex structure that could potentially be leveraged to yield convergence to an equilibrium. Driven by this observation, our paper proposes a flexible first-order method that successfully exploits such "hidden structures" and achieves convergence under minimal assumptions on the transformation connecting the players' control variables to the game's latent, convex-structured layer. The proposed method – which we call preconditioned hidden gradient descent (PHGD) – hinges on a judiciously chosen gradient preconditioning scheme related to natural gradient methods. Importantly, we make no separability assumptions for the game's hidden structure, and we provide explicit convergence rate guarantees for both deterministic and stochastic environments.
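To make the abstract's idea concrete, here is a minimal numerical sketch, not the paper's exact algorithm: it assumes a two-player game whose losses are monotone only in hidden variables u = f(x) and v = f(y), takes f to be a hypothetical smooth invertible map, and preconditions each player's gradient with a natural-gradient-style metric JᵀJ built from the Jacobian of f. The hidden game, the map f, the coupling matrix A, the step size, and the iteration count are all illustrative assumptions.

```python
import numpy as np

# Sketch of a preconditioned hidden-gradient update (illustrative assumptions,
# not the authors' exact PHGD scheme). Two players control x and y; their
# losses are monotone only in the hidden variables u = f(x), v = f(y).

rng = np.random.default_rng(0)
d = 3
A = 0.5 * rng.standard_normal((d, d))    # coupling matrix of the hidden game (assumed)

def f(x):
    # Hidden-layer map (assumed): smooth, invertible, Jacobian bounded away from zero.
    return x + 0.5 * np.tanh(x)

def jac_f(x):
    # Jacobian of f; diagonal for this choice of map.
    return np.diag(1.0 + 0.5 * (1.0 - np.tanh(x) ** 2))

# Hidden game (assumed, strongly monotone):
#   L1(u, v) = 0.5||u||^2 + u^T A v   (player 1 minimizes over x),
#   L2(u, v) = 0.5||v||^2 - u^T A v   (player 2 minimizes over y),
# with Nash equilibrium at u = v = 0.
def grad_u(u, v):
    return u + A @ v

def grad_v(u, v):
    return v - A.T @ u

def phgd_step(x, y, eta=0.05, eps=1e-6):
    """One preconditioned hidden-gradient step for both players."""
    Jx, Jy = jac_f(x), jac_f(y)
    u, v = f(x), f(y)
    # Chain rule: the gradient w.r.t. the control variable is J^T times the hidden gradient.
    gx = Jx.T @ grad_u(u, v)
    gy = Jy.T @ grad_v(u, v)
    # Natural-gradient-style preconditioners (regularized for invertibility).
    Px = Jx.T @ Jx + eps * np.eye(d)
    Py = Jy.T @ Jy + eps * np.eye(d)
    x = x - eta * np.linalg.solve(Px, gx)
    y = y - eta * np.linalg.solve(Py, gy)
    return x, y

x, y = rng.standard_normal(d), rng.standard_normal(d)
for _ in range(500):
    x, y = phgd_step(x, y)
print("||f(x)||, ||f(y)|| near the hidden equilibrium:",
      np.linalg.norm(f(x)), np.linalg.norm(f(y)))
```

In this toy setting the preconditioner (JᵀJ)⁻¹ makes each step in the control variables act, to first order, like a plain gradient step in the hidden convex-structured layer, which is the intuition the abstract describes; the paper's actual preconditioning scheme and guarantees may differ.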
Main file: Main.pdf (5.91 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04312981, version 1 (28-11-2023)

Identifiers

  • HAL Id: hal-04312981, version 1

Cite

Iosif Sakos, Emmanouil-Vasileios Vlatakis-Gkaragkounis, Panayotis Mertikopoulos, Georgios Piliouras. Exploiting hidden structures in non-convex games for convergence to Nash equilibrium. NeurIPS 2023 - 37th Conference on Neural Information Processing Systems, Dec 2023, New Orleans (LA), United States. pp.1-32. ⟨hal-04312981⟩