Our current deep neural networks, e.g. large language models such as GPT-3, are highly inefficient, which makes them very expensive to train. As a consequence, standard black-box training practices such as hyperparameter tuning become prohibitively costly, and a principled understanding of these dynamical systems is needed. Empirically, we know that the power consumption of the human brain is far lower, so a more efficient solution ought to exist. To address these questions, I am interested in understanding and optimising neural network dynamics.

A duality connecting neural network and cosmological dynamics

Sven Krippendorf and Michael Spannowsky

We show that the dynamics of wide neural networks and of scalar fields in FLRW spacetimes are identical, providing a novel perspective on explaining neural networks and a way to simulate early Universe models.
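As background for the two dynamical systems this duality relates, the standard forms of their equations of motion can be sketched as follows; the precise dictionary between them is the subject of the paper, and the symbols below (kernel $\Theta$, learning rate $\eta$, Hubble parameter $H$) follow the usual conventions rather than the paper's specific notation:

```latex
% Gradient-flow dynamics of a wide neural network in the NTK regime:
% the outputs f_t on the training set relax linearly towards the labels y.
\frac{\mathrm{d} f_t}{\mathrm{d} t} = -\,\eta\, \Theta\, \left( f_t - y \right)

% Homogeneous scalar field in an FLRW background:
% Hubble friction 3H\dot\phi plays the role of a damping term.
\ddot{\phi} + 3 H \dot{\phi} + \frac{\mathrm{d} V}{\mathrm{d}\phi} = 0
```

Both are damped relaxation dynamics, which is the structural feature that makes a correspondence between them plausible.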

Towards a Phenomenological Understanding of Neural Networks: Data

Sam Tovey, Sven Krippendorf, Konstantin Nikolaou, Christian Holm

We introduce collective variables for neural network dynamics and demonstrate their successful application to data selection.
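A minimal sketch of the idea of collective variables: reduce the high-dimensional training state to a few scalar summaries. The specific choices below (trace and spectral entropy of a gradient Gram matrix for a toy linear model) are illustrative assumptions, not the paper's actual definitions:

```python
import numpy as np

# Toy model: f(x; w) = w @ x, so the per-sample parameter gradient is just x.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))  # 8 data points, 4 features

# Empirical NTK-style Gram matrix: K[i, j] = <grad_i, grad_j>.
K = X @ X.T

# Scalar collective variables summarising K (illustrative choices):
trace = np.trace(K)
eigvals = np.linalg.eigvalsh(K)
p = eigvals[eigvals > 1e-12]
p = p / p.sum()
entropy = -np.sum(p * np.log(p))  # entropy of K's normalised spectrum

print(f"trace = {trace:.3f}, entropy = {entropy:.3f}")
```

Tracking such scalars during training, or comparing them across candidate datasets, is one way summaries of this kind can inform data selection.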