The Role of Permutation Invariance in Linear Mode Connectivity of Neural Networks

Rahim Entezari, Hanie Sedghi, Olga Saukh, Behnam Neyshabur

Research output: Contribution to conference › Paper › peer-review

Abstract

In this paper, we conjecture that if the permutation invariance of neural networks is taken into account, SGD solutions will likely have no barrier in the linear interpolation between them. Although it is a bold conjecture, we show how extensive empirical attempts fall short of refuting it. We further provide a preliminary theoretical result to support our conjecture. Our conjecture has implications for the lottery ticket hypothesis, distributed training, and ensemble methods.
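
The central quantity in the abstract is the barrier along the linear interpolation between two SGD solutions. The sketch below (a hypothetical Python/PyTorch helper, not the authors' code) illustrates one common way this is evaluated: the two weight vectors are mixed as (1 - alpha) * theta_a + alpha * theta_b for several values of alpha, and the barrier is the amount by which the worst interpolated loss exceeds the average of the two endpoint losses. The conjecture concerns this barrier after an appropriate permutation of one network's hidden units has been applied; the permutation search itself is omitted here, and the names make_model, loader, and loss_fn are assumptions supplied by the user.

```python
# Minimal sketch (not the authors' released code): estimating the loss barrier
# along the linear interpolation between two independently trained networks.
# `make_model`, `loader`, and `loss_fn` are assumed to be provided by the user,
# and both networks must share the same architecture.
import torch


def interpolate_state_dicts(sd_a, sd_b, alpha):
    """Return (1 - alpha) * sd_a + alpha * sd_b, parameter by parameter."""
    out = {}
    for key, value in sd_a.items():
        if torch.is_floating_point(value):
            out[key] = (1 - alpha) * value + alpha * sd_b[key]
        else:
            out[key] = value  # integer buffers (e.g. BatchNorm step counters)
    return out


@torch.no_grad()
def mean_loss(model, loader, loss_fn):
    """Average loss of `model` over all batches in `loader`."""
    model.eval()
    total, count = 0.0, 0
    for inputs, targets in loader:
        total += loss_fn(model(inputs), targets).item() * len(targets)
        count += len(targets)
    return total / count


@torch.no_grad()
def linear_barrier(model_a, model_b, make_model, loader, loss_fn, steps=11):
    """Highest interpolated loss minus the mean of the two endpoint losses."""
    sd_a, sd_b = model_a.state_dict(), model_b.state_dict()
    losses = []
    for alpha in torch.linspace(0.0, 1.0, steps):
        model = make_model()
        model.load_state_dict(interpolate_state_dicts(sd_a, sd_b, alpha.item()))
        losses.append(mean_loss(model, loader, loss_fn))
    return max(losses) - 0.5 * (losses[0] + losses[-1])
```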
Original language: English
Publication status: Accepted/In press - 25 Jan 2022
Event: 10th International Conference on Learning Representations (ICLR 2022) - Online (Virtual)
Duration: 25 Apr 2022 - 29 Apr 2022
https://iclr.cc/

Conference

Conference: 10th International Conference on Learning Representations
Abbreviated title: ICLR 2022
City: Virtual
Period: 25/04/22 - 29/04/22
Internet address: https://iclr.cc/

Keywords

  • cs.LG
