Friday, December 6 • 7:00pm - 11:59pm
Adaptive dropout for training deep neural networks


Recently, it was shown that by dropping out hidden activities with a probability of 0.5, deep neural networks can perform very well. We describe a model in which a binary belief network is overlaid on a neural network and is used to decrease the information content of its hidden units by selectively setting activities to zero. This "dropout network" can be trained jointly with the neural network by approximately computing local expectations of binary dropout variables, computing derivatives using back-propagation, and using stochastic gradient descent. Interestingly, experiments show that the learnt dropout network parameters recapitulate the neural network parameters, suggesting that a good dropout network regularizes activities according to magnitude. When evaluated on the MNIST and NORB datasets, we found that our method can be used to achieve lower classification error rates than other feature learning methods, including standard dropout, denoising auto-encoders, and restricted Boltzmann machines. For example, our model achieves 5.8% error on the NORB test set, which is better than state-of-the-art results obtained using convolutional architectures.
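The abstract's core idea, an overlaid binary belief network that selectively zeroes hidden activities, can be illustrated with a short sketch. Below is a minimal NumPy sketch of one such layer, assuming a ReLU hidden layer and illustrative parameter names (W, b for the neural-network layer; V, c for the overlaid dropout network); it is not the paper's exact formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def adaptive_dropout_layer(x, W, b, V, c, rng, train=True):
    """One hidden layer whose units are dropped with input-dependent probabilities.

    x    : (batch, n_in) input activities
    W, b : weights/bias of the neural-network layer (illustrative names)
    V, c : weights/bias of the overlaid dropout (belief) network (illustrative names)
    """
    h = np.maximum(0.0, x @ W + b)      # hidden activities (ReLU assumed)
    keep_prob = sigmoid(x @ V + c)      # per-unit keep probability from the belief network
    if train:
        # Sample a binary mask and selectively set activities to zero.
        mask = (rng.random(h.shape) < keep_prob).astype(h.dtype)
        return h * mask
    # At test time, scale by the local expectation of the binary dropout variables.
    return h * keep_prob

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
W, b = 0.1 * rng.standard_normal((8, 16)), np.zeros(16)
V, c = 0.1 * rng.standard_normal((8, 16)), np.zeros(16)
print(adaptive_dropout_layer(x, W, b, V, c, rng).shape)  # (4, 16)
```

At training time a binary mask is sampled per unit; at test time each activity is scaled by its keep probability, i.e. the local expectation of its binary dropout variable, which is what allows the two networks to be trained jointly with back-propagation and stochastic gradient descent.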


Friday December 6, 2013 7:00pm - 11:59pm PST
Harrah's Special Events Center, 2nd Floor
  Posters
  • Poster ID: Fri15
  • Location: Poster # Fri15