**Date/Time:** April 9, 2024, 3:30 PM – 4:30 PM (refreshments will be provided in KTH B105)

**Location:** KTH B105

**Speaker:** Mihai Nica (Assistant Professor, Department of Mathematics and Statistics, University of Guelph)

**Title:** Neural Networks in the Limit of Infinite Depth-and-Width

**Abstract:** Neural networks have become so large that their behaviour can be well approximated by “infinite neural networks”, which are obtained by taking the limit as the number of neurons goes to infinity. However, there are many possible infinite limits one can take! For example, one well-known limit is the “neural tangent kernel” (NTK) limit, where the depth is held fixed and the layer width goes to infinity. In this talk, I will introduce an alternative infinite limit, the infinite depth-and-width limit, where the depth and the width are scaled to infinity simultaneously. This leads to exotic non-Gaussian distributions that are very different from NTK-type behaviour but match the output of finite neural networks more accurately. Some of this material is available as an online recording at https://youtu.be/93X0L1U5C0E and in the more recent work https://arxiv.org/abs/2302.09712.
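The contrast the abstract draws can be illustrated numerically. The sketch below (my own illustration, not code from the talk) samples the output of random He-initialized ReLU networks in two regimes: depth fixed while the width is large (the NTK-style setting, where outputs are approximately Gaussian) versus depth scaled together with width (the depth-and-width setting, where heavier, non-Gaussian tails tend to appear). The specific depths, widths, and sample counts are illustrative choices.

```python
import numpy as np

def random_relu_net_output(depth, width, rng):
    """Output of a random He-initialized ReLU network on a fixed unit-norm input."""
    x = np.ones(width) / np.sqrt(width)
    for _ in range(depth):
        # He initialization keeps the expected squared norm stable under ReLU.
        W = rng.normal(0.0, np.sqrt(2.0 / width), size=(width, width))
        x = np.maximum(W @ x, 0.0)
    w_out = rng.normal(0.0, np.sqrt(1.0 / width), size=width)
    return w_out @ x

def excess_kurtosis(s):
    """Excess kurtosis: ~0 for Gaussian samples, positive for heavy tails."""
    z = (s - s.mean()) / s.std()
    return (z ** 4).mean() - 3.0

rng = np.random.default_rng(0)
# Fixed shallow depth, large width: close to the Gaussian/NTK-type regime.
wide = np.array([random_relu_net_output(3, 256, rng) for _ in range(500)])
# Depth comparable to width: the depth-and-width regime.
deep = np.array([random_relu_net_output(64, 64, rng) for _ in range(500)])

print("wide excess kurtosis:", excess_kurtosis(wide))
print("deep excess kurtosis:", excess_kurtosis(deep))
```

Comparing the two excess-kurtosis values gives a quick empirical sense of how the joint depth-and-width scaling departs from Gaussian behaviour.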