Abstract

Restricted Boltzmann machines were developed using binary stochastic hidden units. These can be generalized by replacing each binary unit by an infinite number of copies that all have the same weights but have progressively more negative biases. The learning and inference rules for these Stepped Sigmoid Units are unchanged. They can be approximated efficiently by noisy, rectified linear units. Compared with binary units, these units learn features that are better for object recognition on the NORB dataset and face verification on the Labeled Faces in the Wild dataset. Unlike binary units, rectified linear units preserve information about relative intensities as information travels through multiple layers of feature detectors.
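The abstract's key idea — that an infinite stack of tied-weight binary units with biases offset by -0.5, -1.5, -2.5, … sums to roughly log(1 + e^x), which in turn is well approximated by a noisy rectified linear unit — can be checked numerically. The sketch below is illustrative only (function names and the choice of 50 copies are my own); it treats sigmoid(x) as the variance of the added Gaussian noise, following the paper's N(0, sigmoid(x)) description:

```python
import math
import random

def softplus(x):
    """Smooth limit of the stepped-sigmoid sum: log(1 + e^x)."""
    return math.log1p(math.exp(x))

def stepped_sigmoid_sum(x, n_copies=50):
    """Total expected activity of n_copies binary units that share the
    same weights but have biases offset by -(i - 0.5) for the i-th copy.
    As n_copies grows, this approaches softplus(x) = log(1 + e^x)."""
    return sum(1.0 / (1.0 + math.exp(-(x - i + 0.5)))
               for i in range(1, n_copies + 1))

def nrelu(x, rng=random):
    """Noisy rectified linear unit: max(0, x + N(0, sigmoid(x))),
    the fast one-sample approximation to sampling all the copies.
    sigmoid(x) is taken as the noise *variance* here (an assumption)."""
    variance = 1.0 / (1.0 + math.exp(-x))
    return max(0.0, x + rng.gauss(0.0, math.sqrt(variance)))
```

For example, `stepped_sigmoid_sum(2.0)` is within about 0.005 of `softplus(2.0)`, and for larger inputs `max(0, x)` itself tracks the softplus closely, which is why the deterministic rectified linear unit works as a feature detector at test time.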

Keywords

Boltzmann machine; Binary number; Sigmoid function; Computer science; Object (grammar); Restricted Boltzmann machine; Artificial intelligence; Inference; Pattern recognition (psychology); Feature (linguistics); Cognitive neuroscience of visual object recognition; Algorithm; Mathematics; Computer vision; Deep learning; Arithmetic; Artificial neural network


Publication Info

Year
2010
Type
article
Pages
807-814
Citations
13197
Access
Closed


Cite This

Vinod Nair, Geoffrey E. Hinton (2010). Rectified Linear Units Improve Restricted Boltzmann Machines. International Conference on Machine Learning, 807-814.