
Phase transitions in the mini-batch size for sparse and dense neural networks

Authors:
Raffaele Marino, Federico Ricci-Tersenghi
Keywords:
Condensed Matter, Disordered Systems and Neural Networks (cond-mat.dis-nn), Statistical Mechanics (cond-mat.stat-mech), Artificial Intelligence (cs.AI), Machine Learning (cs.LG)
Journal:
--
Date:
2023-05-09 16:00:00
Abstract
The use of mini-batches of data in training artificial neural networks is nowadays very common. Despite its broad usage, theories that quantitatively explain how large or small the optimal mini-batch size should be are missing. This work presents a systematic attempt at understanding the role of the mini-batch size in training two-layer neural networks. Working in the teacher-student scenario, with a sparse teacher, and focusing on tasks of different complexity, we quantify the effects of changing the mini-batch size $m$. We find that the generalization performance of the student often depends strongly on $m$ and may undergo sharp phase transitions at a critical value $m_c$, such that for $m<m_c$ the training process fails, while for $m>m_c$ the student learns the teacher perfectly or generalizes very well. Phase transitions are induced by collective phenomena first discovered in statistical mechanics and later observed in many fields of science. Finding a phase transition by varying the mini-batch size raises several important questions about the role of a hyperparameter that has been somewhat overlooked until now.
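
The teacher-student setup described in the abstract can be sketched in a few lines: a sparse two-layer teacher generates labels, and a student of the same architecture is trained with plain mini-batch SGD while the batch size $m$ is scanned. The snippet below is a minimal, illustrative sketch and not the authors' code; the sparsity level, the tanh activation, the fixed second layer, and all parameter values are assumptions made for the example.

```python
# Minimal sketch (not the authors' implementation): sparse two-layer teacher,
# student trained by mini-batch SGD, generalization error measured while the
# mini-batch size m is scanned. All hyperparameters below are illustrative.
import numpy as np

rng = np.random.default_rng(0)

N, hidden, sparsity = 100, 5, 0.1          # input size, hidden units, teacher sparsity (assumed)

# Sparse teacher: most first-layer weights are set to zero.
W_teacher = rng.standard_normal((hidden, N)) * (rng.random((hidden, N)) < sparsity)
v_teacher = np.ones(hidden)

def forward(W, v, X):
    """Two-layer network output; tanh activation used for simplicity."""
    return np.tanh(X @ W.T) @ v

def generalization_error(W, v, n_test=2000):
    """Mean squared error between student and teacher on fresh Gaussian inputs."""
    X = rng.standard_normal((n_test, N))
    return np.mean((forward(W, v, X) - forward(W_teacher, v_teacher, X)) ** 2)

def train_student(m, steps=20000, lr=0.05):
    """Plain SGD on the quadratic loss with mini-batches of size m."""
    W = rng.standard_normal((hidden, N)) * 0.1
    v = np.ones(hidden)                     # second layer kept fixed (an assumption of this sketch)
    for _ in range(steps):
        X = rng.standard_normal((m, N))     # fresh mini-batch of m examples
        y = forward(W_teacher, v_teacher, X)
        pre = X @ W.T                       # pre-activations, shape (m, hidden)
        err = np.tanh(pre) @ v - y          # residuals, shape (m,)
        # Gradient of 0.5 * mean(err^2) with respect to the first-layer weights
        grad = ((err[:, None] * v) * (1 - np.tanh(pre) ** 2)).T @ X / m
        W -= lr * grad
    return generalization_error(W, v)

# Scan the mini-batch size m and look for a sharp change in the test error.
for m in [1, 2, 5, 10, 20, 50, 100]:
    print(f"m = {m:4d}   test MSE = {train_student(m):.4f}")
```

In the scenario studied in the paper, one would look for a critical value $m_c$ below which the test error stays high and above which it drops sharply; the toy scan above only illustrates where such a measurement would be made, not the paper's actual experiments.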