PyTorch BatchNorm2D Weights Explained

Understanding PyTorch BatchNorm2D and its weights

Javier


The Batch Normalization layer, first proposed in 2015 in the well-known paper Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift, was until very recently the most widely used normalization layer in deep neural networks. As a popular and well-established normalization technique, it has out-of-the-box implementations in every major deep learning framework: PyTorch, TensorFlow, MXNet…
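In PyTorch, this layer is exposed as `torch.nn.BatchNorm2d`. A minimal sketch of what its learnable weights look like (the per-channel scale `weight`, i.e. gamma, and shift `bias`, i.e. beta, plus the non-learnable running statistics):

```python
import torch
import torch.nn as nn

# BatchNorm2d normalizes over the channel dimension of (N, C, H, W) input:
# one learnable scale (weight, gamma) and shift (bias, beta) per channel.
bn = nn.BatchNorm2d(num_features=3)

print(bn.weight.shape)  # torch.Size([3]) -- gamma, initialized to ones
print(bn.bias.shape)    # torch.Size([3]) -- beta, initialized to zeros

# Running statistics, updated during training and used at eval time
# (buffers, not learnable parameters):
print(bn.running_mean.shape)  # torch.Size([3])
print(bn.running_var.shape)   # torch.Size([3])

x = torch.randn(8, 3, 32, 32)  # (N, C, H, W)
y = bn(x)
print(y.shape)  # torch.Size([8, 3, 32, 32]) -- shape is preserved
```

Note that `num_features` must match the number of input channels `C`; the layer keeps exactly one gamma, beta, running mean, and running variance per channel.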
