The Role of Permutation Invariance in Linear Mode Connectivity of Neural Networks

Rahim Entezari, Hanie Sedghi, Olga Saukh, Behnam Neyshabur

Research output: Contribution to conference › Poster › peer-review

Abstract

In this paper, we conjecture that if the permutation invariance of neural networks is taken into account, SGD solutions will likely have no barrier in the linear interpolation between them. Although it is a bold conjecture, we show how extensive empirical attempts fall short of refuting it. We further provide a preliminary theoretical result to support our conjecture. Our conjecture has implications for the lottery ticket hypothesis, distributed training, and ensemble methods.
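The abstract's central quantity is the loss barrier along the linear interpolation between two solutions. A minimal sketch of that computation, assuming a generic `loss_fn` placeholder (not from the paper) that maps a parameter vector to a scalar loss:

```python
import numpy as np

def loss_barrier(loss_fn, theta_a, theta_b, num_points=25):
    """Largest gap between the loss on the linear path between two
    parameter vectors and the straight line joining the endpoint losses.
    A barrier of (approximately) zero indicates linear mode connectivity.
    `loss_fn` is a hypothetical stand-in for evaluating a model."""
    alphas = np.linspace(0.0, 1.0, num_points)
    path_losses = np.array(
        [loss_fn((1 - a) * theta_a + a * theta_b) for a in alphas]
    )
    endpoint_line = (1 - alphas) * path_losses[0] + alphas * path_losses[-1]
    return float(np.max(path_losses - endpoint_line))

# Toy check with a convex quadratic loss, where the path never rises
# above the chord between the endpoints, so the barrier is zero.
quadratic = lambda theta: float(np.sum(theta ** 2))
theta_a = np.array([1.0, 0.0])
theta_b = np.array([0.0, 1.0])
barrier = loss_barrier(quadratic, theta_a, theta_b)
```

In the setting the abstract describes, one would additionally search over permutations of each layer's hidden units of one endpoint before interpolating, since permuted networks compute the same function.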
Original language: English
Publication status: Published - 7 Jul 2021
Event: Sparsity in Neural Networks: Advancing Understanding and Practice
Duration: 8 Jul 2021 - 9 Jul 2021
https://sites.google.com/view/sparsity-workshop-2021/home?authuser=0


Keywords

  • deep learning
  • loss landscape
  • optimization

