IFDS is located in the Central and Pacific time zones. Please note the time zone when accessing a particular event.


Speaker: Greg Canal

Abstract: Neural Collapse is a recently discovered phenomenon in deep neural network training that describes class separation in the final network layers. When a classification network is trained past the point of zero training error, it has been observed that the penultimate layer activations collapse to their respective class means, the means themselves form a simplex equiangular tight frame, and the final layer linear classifiers align with each respective mean. Neural collapse has been demonstrated both empirically on deep networks and theoretically on simplified models, and has inspired new questions on generalization, robustness, and architecture design. In this talk I will review the original discovery of neural collapse, as well as recent literature that expands on related questions.
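As an informal sketch of the geometry described in the abstract (the notation below is illustrative and not taken from the announcement): let \mu_k denote the penultimate-layer class mean for class k = 1, \dots, K, let \mu_G denote the global mean, and let w_k denote the final-layer classifier for class k. The three collapse properties can then be summarized as

  \Sigma_W \to 0 \quad \text{(within-class activation variability vanishes)},

  \langle \tilde\mu_i, \tilde\mu_j \rangle = \frac{K\,\delta_{ij} - 1}{K - 1}, \quad \tilde\mu_k = \frac{\mu_k - \mu_G}{\lVert \mu_k - \mu_G \rVert} \quad \text{(the centered, normalized means form a simplex equiangular tight frame)},

  w_k \propto \tilde\mu_k \quad \text{(each classifier aligns with its class mean)}.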
