Dropout as data augmentation

Xavier Bouthillier, Kishore Konda, Pascal Vincent and Roland Memisevic
arXiv preprint arXiv:1506.08700, 2015

PDF | Code

Dropout is typically interpreted as bagging a large number of models that share parameters. We show that dropout in a network can also be interpreted as a form of data augmentation in the input space that requires no domain knowledge. We present an approach for projecting the dropout noise within a network back into the input space, thereby generating augmented versions of the training data, and we show that training a deterministic network on these augmented samples yields results similar to training with dropout. Finally, we propose a new dropout noise scheme based on our observations and show that it improves on standard dropout without adding significant computational cost.
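The core idea can be sketched in a few lines: sample a dropout mask for a hidden layer, then search for an input x_tilde whose clean hidden activation matches the noisy (dropped-out) activation of the original input. Below is a minimal illustration of that projection step, not the authors' code; the network, hyperparameters, and the name project_dropout_to_input are hypothetical choices for the sketch.

```python
# Sketch: project dropout noise in a hidden layer back into input space
# by finding x_tilde such that hidden(x_tilde) ~= mask * hidden(x).
# This is an illustrative reconstruction, not the paper's implementation.

import torch

torch.manual_seed(0)

d_in, d_hid = 20, 50
W = torch.randn(d_hid, d_in) * 0.1
b = torch.zeros(d_hid)

def hidden(x):
    """Clean hidden activation h(x) = ReLU(Wx + b)."""
    return torch.relu(x @ W.T + b)

def project_dropout_to_input(x, p=0.5, steps=200, lr=0.1):
    """Return an augmented input whose clean activation matches a
    dropped-out activation of x (hypothetical helper)."""
    mask = (torch.rand(d_hid) > p).float()   # Bernoulli dropout mask
    target = mask * hidden(x)                # noisy activation to match
    x_tilde = x.clone().requires_grad_(True)
    opt = torch.optim.SGD([x_tilde], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = ((hidden(x_tilde) - target) ** 2).mean()
        loss.backward()
        opt.step()
    return x_tilde.detach()

x = torch.randn(d_in)
x_aug = project_dropout_to_input(x)  # an "augmented" sample in input space
```

A deterministic network trained on such x_aug samples plays the role of the dropout-trained network in the paper's comparison; in this sketch the projection is done by plain gradient descent on the matching loss.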

BibTeX:

@article{bouthillier2015dropout,
    author = {Bouthillier, Xavier and Konda, Kishore and Vincent, Pascal and Memisevic, Roland},
    journal = {arXiv preprint arXiv:1506.08700},
    month = {June},
    title = {Dropout as data augmentation},
    url = {https://arxiv.org/pdf/1506.08700},
    year = {2015}
}