Rahim Entezari (CSH) will present an online talk within the seminar “Analysis of Complex Systems” on June 18, 2021, 3PM-4PM (CET) via Zoom.
If you would like to attend, please email email@example.com
Title: Understanding Neural Networks' Loss Landscape
Optimizing a neural network is often framed as finding a minimum in an objective landscape, so understanding the geometric properties of this landscape has emerged as an important goal. Related work shows that independently trained models are connected by a curve in weight space along which the loss remains low. This curve can be a straight line if the trained models reside in the same basin. Many hyperparameters affect where the endpoints land: for example, networks that share even a few epochs of their optimization trajectory converge into the same basin and are hence connected by a linear path of high accuracy. In line with this literature, this talk tries to understand how we can predict the geometry of the loss landscape. We also conjecture that if the permutation invariance of neural networks is taken into account, SGD solutions will likely have no barrier along the linear interpolation between them.
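The notion of a "barrier" along the linear interpolation between two solutions can be made concrete with a small sketch. The following is an illustrative toy example (not the speaker's code): it trains two independently initialized linear classifiers on the same data and measures the peak loss along the straight line between their weight vectors, relative to the endpoints. All names (`loss`, `train`, `w_a`, `w_b`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic binary-classification problem.
X = rng.normal(size=(200, 5))
true_w = rng.normal(size=5)
y = (X @ true_w > 0).astype(float)

def loss(w):
    """Logistic loss of a linear model with weights w."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    eps = 1e-9
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

def train(w, steps=500, lr=0.5):
    """Plain gradient descent from initialization w."""
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        grad = X.T @ (p - y) / len(y)
        w = w - lr * grad
    return w

# Two solutions trained from independent initializations.
w_a = train(rng.normal(size=5))
w_b = train(rng.normal(size=5))

# Loss along the linear path w(alpha) = (1 - alpha) * w_a + alpha * w_b.
alphas = np.linspace(0.0, 1.0, 11)
path_losses = [loss((1 - a) * w_a + a * w_b) for a in alphas]

# Barrier: peak loss on the path minus the mean loss of the endpoints.
barrier = max(path_losses) - 0.5 * (loss(w_a) + loss(w_b))
print(f"barrier = {barrier:.4f}")
```

Because this toy objective is convex, both runs end in the same basin and the barrier is essentially zero. For deep networks the same measurement typically yields a large barrier between independent SGD solutions, and the conjecture above is that the barrier vanishes once the hidden units of one network are suitably permuted to align with the other.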