Publications of Torsten Hoefler
Elad Hoffer, Tal Ben-Nun, Itay Hubara, Niv Giladi, Torsten Hoefler, Daniel Soudry:
Increasing batch size through instance repetition improves generalization
(In The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Jun. 2020)

Abstract

Large-batch SGD is important for scaling training of deep neural networks. However, without fine-tuning hyperparameter schedules, the generalization of the model may be hampered. We propose to use batch augmentation: replicating instances of samples within the same batch with different data augmentations. Batch augmentation acts as a regularizer and an accelerator, increasing both generalization and performance scaling for a fixed budget of optimization steps. We analyze the effect of batch augmentation on gradient variance and show that it empirically improves convergence for a wide variety of networks and datasets. Our results show that batch augmentation reduces the number of necessary SGD updates to achieve the same accuracy as the state-of-the-art. Overall, this simple yet effective method enables faster training and better generalization by allowing more computational resources to be used concurrently.
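The batch augmentation described in the abstract can be sketched as a custom collate step that replicates every sample several times within the same batch, each copy passing through an independent random augmentation. Below is a minimal illustration, assuming PyTorch and torchvision; the names augment, batch_augment_collate, and num_repeats are illustrative and not taken from the paper's code.

    # Minimal sketch of batch augmentation (illustrative, not the paper's code).
    import torch
    from torchvision import transforms

    # Stochastic augmentation pipeline; each call yields a different random view.
    augment = transforms.Compose([
        transforms.RandomResizedCrop(32),
        transforms.RandomHorizontalFlip(),
        transforms.ToTensor(),
    ])

    def batch_augment_collate(samples, num_repeats=4):
        # Replicate each sample num_repeats times, each copy with its own
        # independent random augmentation, yielding a num_repeats-times larger batch.
        images, labels = [], []
        for img, label in samples:            # img is a PIL image, label an int
            for _ in range(num_repeats):
                images.append(augment(img))   # fresh random augmentation per copy
                labels.append(label)
        return torch.stack(images), torch.tensor(labels)

    # Usage with a standard DataLoader over a dataset that returns PIL images:
    # loader = torch.utils.data.DataLoader(dataset, batch_size=64,
    #                                      collate_fn=batch_augment_collate)

Under a fixed budget of optimization steps, each update then averages gradients over several augmented views of the same underlying samples, which is the variance-reduction effect the abstract refers to.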

BibTeX

@inproceedings{hoffer2020increasing,
  author={Elad Hoffer and Tal Ben-Nun and Itay Hubara and Niv Giladi and Torsten Hoefler and Daniel Soudry},
  title={{Increasing batch size through instance repetition improves generalization}},
  year={2020},
  month={Jun.},
  booktitle={The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  source={http://www.unixer.de/~htor/publications/},
}


© Torsten Hoefler