
Unlearning regularization for Boltzmann Machines

Authors:
Enrico Ventura, Simona Cocco, Rémi Monasson, Francesco Zamponi
Keywords:
Condensed Matter, Disordered Systems and Neural Networks (cond-mat.dis-nn)
journal:
--
date:
2023-11-15
Abstract
Boltzmann Machines (BMs) are graphical models with interconnected binary units, employed for the unsupervised modeling of data distributions. Like all models, in the absence of proper regularization BMs may over-fit the training data and thus provide a poor estimate of the distribution that generated them. In this study, we introduce a regularization method for BMs that improves generalization by reducing the susceptibility of the model to rescaling of its parameters. The new technique shares formal similarities with the unlearning algorithm, an iterative procedure used to improve memory associativity in Hopfield-like neural networks. We test our unlearning regularization on synthetic data generated by two simple models, the Curie-Weiss ferromagnetic model and the Sherrington-Kirkpatrick spin glass model, and we show that it outperforms $L_p$-norm schemes. Finally, we discuss the role of parameter initialization.
PDF: Unlearning regularization for Boltzmann Machines.pdf