High-dimensional manifold of solutions in neural networks: insights from statistical physics

Enrico M. Malatesta
Condensed Matter, Disordered Systems and Neural Networks (cond-mat.dis-nn), Machine Learning (cs.LG), Probability (math.PR), Statistics Theory (math.ST)
2023-09-16 16:00:00
In these pedagogic notes I review the statistical mechanics approach to neural networks, focusing on the paradigmatic example of the perceptron architecture with binary and continuous weights in the classification setting. I first review Gardner's approach based on the replica method and the derivation of the SAT/UNSAT transition in the storage setting. I then discuss recent works that unveiled how the zero-training-error configurations are geometrically arranged, and how this arrangement changes as the size of the training set increases. I also illustrate how different regions of the solution space can be explored analytically and how the landscape in the vicinity of a solution can be characterized. I give evidence that, in binary weight models, algorithmic hardness is a consequence of the disappearance of a clustered region of solutions that extends to very large distances. Finally, I demonstrate how the study of linear mode connectivity between solutions can give insights into the average shape of the solution manifold.
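As a rough numerical companion (not taken from the notes themselves), the sketch below probes the SAT/UNSAT transition of the continuous-weight perceptron by testing, via linear programming, whether P = αN random patterns with random ±1 labels can be stored; Gardner's replica calculation places the transition at α_c = 2 for continuous weights. It also evaluates the training error along the straight line between two distinct solutions of the same instance, a bare-bones version of the linear mode connectivity probe mentioned above. All function names, the box bound B, and the parameter values are illustrative choices, not anything defined in the paper.

```python
import numpy as np
from scipy.optimize import linprog

B = 1e3  # generous box bound on each weight; keeps every LP bounded

def find_solution(X, y, c):
    """Look for continuous weights w satisfying y_mu (w . x_mu) >= 1 for
    all mu; by rescaling w this is equivalent to the zero-margin storage
    problem y_mu (w . x_mu) > 0. The objective c only selects among the
    solutions (c = 0 is a pure feasibility test)."""
    res = linprog(c, A_ub=-(y[:, None] * X), b_ub=-np.ones(len(y)),
                  bounds=[(-B, B)] * X.shape[1], method="highs")
    return res.x if res.status == 0 else None  # None: instance is UNSAT

def train_error(w, X, y):
    """Fraction of patterns misclassified by the weight vector w."""
    return np.mean(np.sign(X @ w) != y)

rng = np.random.default_rng(0)
N = 200

# SAT/UNSAT transition: fraction of storable instances vs. alpha = P/N.
# The replica prediction for continuous weights is alpha_c = 2.
for alpha in (1.0, 1.5, 2.0, 2.5):
    P = int(alpha * N)
    n_sat = sum(
        find_solution(rng.standard_normal((P, N)),
                      rng.choice([-1.0, 1.0], size=P),
                      np.zeros(N)) is not None
        for _ in range(10))
    print(f"alpha = {alpha:.1f}: SAT fraction = {n_sat / 10:.1f}")

# Linear mode connectivity: training error along the segment between two
# solutions of one SAT instance, picked out by opposite random objectives.
P = N // 2
X = rng.standard_normal((P, N))
y = rng.choice([-1.0, 1.0], size=P)
g = rng.standard_normal(N)
w1, w2 = find_solution(X, y, g), find_solution(X, y, -g)
assert w1 is not None and w2 is not None  # alpha = 0.5 is deep in SAT phase
for t in np.linspace(0.0, 1.0, 5):
    print(f"t = {t:.2f}: train error = "
          f"{train_error((1 - t) * w1 + t * w2, X, y):.3f}")
```

Note that for this single-layer architecture the storage constraints are linear, so the solution space is convex and the interpolated error is trivially zero along the whole path; the nontrivial connectivity questions raised in the notes concern nonconvex settings, such as binary weights, where linear paths between solutions can cross error barriers.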
PDF: High-dimensional manifold of solutions in neural networks: insights from statistical physics.pdf