A Fast Learning Algorithm for Deep Belief Nets
We show how to use “complementary priors” to eliminate the explaining-away effects that make inference difficult in densely connected belief nets that have many hidden layers. U...
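The truncated abstract above refers to a fast algorithm for deep belief nets. Below is a minimal sketch, under illustrative assumptions, of the kind of greedy layer-wise pretraining of stacked restricted Boltzmann machines (trained with one-step contrastive divergence) that this line of work popularized. The layer sizes, learning rate, data, and helper names (`train_rbm`, `greedy_pretrain`) are assumptions, not the paper's code.

```python
# Sketch: greedy layer-wise stacking of binary RBMs trained with CD-1.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, lr=0.05, epochs=10):
    """Train one binary RBM with CD-1; return (weights, hidden bias, visible bias)."""
    n_visible = data.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b_h, b_v = np.zeros(n_hidden), np.zeros(n_visible)
    for _ in range(epochs):
        v0 = data
        # Positive phase: sample hidden units given the data.
        p_h0 = sigmoid(v0 @ W + b_h)
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
        # Negative phase: one step of Gibbs sampling.
        p_v1 = sigmoid(h0 @ W.T + b_v)
        p_h1 = sigmoid(p_v1 @ W + b_h)
        # Approximate log-likelihood gradient (contrastive divergence).
        W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / len(data)
        b_h += lr * (p_h0 - p_h1).mean(axis=0)
        b_v += lr * (v0 - p_v1).mean(axis=0)
    return W, b_h, b_v

def greedy_pretrain(data, layer_sizes):
    """Stack RBMs: each layer is trained on the previous layer's hidden activations."""
    layers, x = [], data
    for n_hidden in layer_sizes:
        W, b_h, b_v = train_rbm(x, n_hidden)
        layers.append((W, b_h, b_v))
        x = sigmoid(x @ W + b_h)  # deterministic up-pass becomes the next layer's "data"
    return layers

# Toy usage: 200 random binary vectors, three hidden layers.
toy = (rng.random((200, 50)) < 0.3).astype(float)
dbn = greedy_pretrain(toy, layer_sizes=[40, 30, 20])
print([w.shape for w, _, _ in dbn])
```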
The Gibbs sampler, the algorithm of Metropolis and similar iterative simulation methods are potentially very helpful for summarizing multivariate distributions. Used naively, ho...
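As a concrete illustration of the iterative simulation this snippet describes, here is a minimal Gibbs sampler that summarizes a multivariate distribution, in this case a bivariate normal with correlation `rho`. Running several chains from over-dispersed starting points and discarding warm-up is the sort of careful (rather than naive) usage the abstract has in mind. All values are illustrative assumptions.

```python
# Sketch: Gibbs sampling a standard bivariate normal with correlation rho.
import numpy as np

rng = np.random.default_rng(1)
rho, n_iter = 0.9, 5000

def gibbs_chain(x0, y0):
    draws = np.empty((n_iter, 2))
    x, y = x0, y0
    for t in range(n_iter):
        # Full conditionals of a standard bivariate normal with correlation rho.
        x = rng.normal(rho * y, np.sqrt(1 - rho**2))
        y = rng.normal(rho * x, np.sqrt(1 - rho**2))
        draws[t] = (x, y)
    return draws

# Several chains from over-dispersed starts; discard the first half as warm-up.
chains = [gibbs_chain(x0, y0)[n_iter // 2:] for x0, y0 in [(-10, 10), (10, -10), (0, 0)]]
print(np.mean([c.mean(axis=0) for c in chains], axis=0))  # should be near (0, 0)
```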
We develop fast algorithms for estimation of generalized linear models with convex penalties. The models include linear regression, two-class logistic regression, and multinomial...
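Fast solvers for these penalized models are commonly built on cyclical coordinate descent with soft-thresholding. The sketch below shows that update for the lasso (l1-penalized linear regression) case only; it assumes centred responses and standardized predictor columns, and the data, `lam`, and iteration count are illustrative, not the authors' implementation.

```python
# Sketch: cyclical coordinate descent for the lasso with standardized predictors.
import numpy as np

def soft_threshold(z, lam):
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    n, p = X.shape
    beta = np.zeros(p)
    resid = y - X @ beta
    for _ in range(n_iter):
        for j in range(p):
            # Add feature j's current contribution back into the residual,
            # solve the one-dimensional problem, then subtract it again.
            resid += X[:, j] * beta[j]
            beta[j] = soft_threshold(X[:, j] @ resid / n, lam)  # unit-variance columns
            resid -= X[:, j] * beta[j]
    return beta

# Toy usage with a sparse true coefficient vector.
rng = np.random.default_rng(2)
X = rng.standard_normal((500, 20))
X = (X - X.mean(axis=0)) / X.std(axis=0)
beta_true = np.zeros(20)
beta_true[:3] = [3.0, -2.0, 1.5]
y = X @ beta_true + 0.5 * rng.standard_normal(500)
y = y - y.mean()
print(np.round(lasso_cd(X, y, lam=0.1), 2))  # first three coefficients stand out
```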
Each year, the American Cancer Society estimates the numbers of new cancer cases and deaths in the United States and compiles the most recent data on population-based c...
Summary: RAxML-VI-HPC (randomized axelerated maximum likelihood for high performance computing) is a sequential and parallel program for inference of large phylogenies ...
Training Deep Neural Networks is complicated by the fact that the distribution of each layer's inputs changes during training, as the parameters of the previous layers change. T...
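The normalization step this snippet describes standardizes each feature of a layer's input over the mini-batch and then rescales it with learned parameters. The sketch below shows only the training-time forward pass; running averages for inference and the backward pass are omitted, and all shapes are illustrative assumptions.

```python
# Sketch: per-feature mini-batch normalization with learned scale and shift.
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """x: (batch, features); gamma, beta: (features,) learned scale and shift."""
    mu = x.mean(axis=0)                     # per-feature mini-batch mean
    var = x.var(axis=0)                     # per-feature mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # normalized activations
    return gamma * x_hat + beta             # learned rescaling restores capacity

# Toy usage: a mini-batch of 32 examples with 8 features.
rng = np.random.default_rng(3)
x = 5.0 + 2.0 * rng.standard_normal((32, 8))
out = batch_norm_forward(x, gamma=np.ones(8), beta=np.zeros(8))
print(out.mean(axis=0).round(3), out.std(axis=0).round(3))  # ~0 and ~1 per feature
```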
Two extended basis sets (termed 5–31G and 6–31G) consisting of atomic orbitals expressed as fixed linear combinations of Gaussian functions are presented for the first row atoms...
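To make "atomic orbitals expressed as fixed linear combinations of Gaussian functions" concrete, the sketch below evaluates a contracted s-type basis function as a weighted sum of normalized Gaussian primitives. The exponents and contraction coefficients are placeholders for illustration only, not the published 5-31G or 6-31G values.

```python
# Sketch: a contracted s-type Gaussian basis function phi(r) = sum_i d_i * N_i * exp(-alpha_i * r^2).
import numpy as np

def contracted_s_orbital(r, exponents, coefficients):
    phi = 0.0
    for alpha, d in zip(exponents, coefficients):
        norm = (2.0 * alpha / np.pi) ** 0.75   # normalization of an s-type Gaussian primitive
        phi += d * norm * np.exp(-alpha * r**2)
    return phi

# Illustrative three-primitive contraction evaluated on a small radial grid (placeholder values).
alphas = [13.0, 2.0, 0.4]
coeffs = [0.15, 0.55, 0.45]
for r in (0.0, 0.5, 1.0, 2.0):
    print(r, round(contracted_s_orbital(r, alphas, coeffs), 4))
```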
The classification of diabetes mellitus and the tests used for its diagnosis were brought into order by the National Diabetes Data Group of the USA and the second World Health O...