Variational closed-form deep neural net inference

Research output: Contribution to journal › Journal article › Research › peer-review


We introduce a Bayesian construction for deep neural networks that is amenable to mean-field variational inference operating solely by closed-form update rules. Hence, it does not require any learning rate to be tuned manually. We show that, by this virtue, our model makes effective deep learning possible in three setups where conventional neural nets are known to perform suboptimally: i) online learning, ii) learning from small data, and iii) active learning. We compare our approach to earlier Bayesian neural network inference techniques, spanning from expectation propagation to gradient-based variational Bayes, as well as to deterministic neural nets with various activation functions. We observe our approach to improve on all these alternatives on two mainstream vision benchmarks and two medical data sets: diabetic retinopathy screening and exudate detection from eye fundus images.
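To make the core idea concrete, here is a minimal sketch (not the paper's model) of what "mean-field variational inference by closed-form update rules" means in the simplest conjugate case: coordinate-ascent VI (CAVI) for Bayesian linear regression with a factorized Gaussian posterior. All quantities (`alpha`, `beta`, the data, and the update formulas) are illustrative assumptions; the point is that every update is analytic, so no learning rate appears anywhere.

```python
import numpy as np

# Illustrative sketch, NOT the paper's method: closed-form mean-field CAVI
# for Bayesian linear regression. Posterior approximated as
# q(w) = prod_i N(w_i | m_i, s_i), with prior w_i ~ N(0, 1/alpha) and
# likelihood y ~ N(X w, 1/beta). Each coordinate update is exact.

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
alpha, beta = 1.0, 25.0                       # assumed prior / noise precisions
y = X @ w_true + rng.normal(scale=beta ** -0.5, size=n)

m = np.zeros(d)                               # variational means
s = np.ones(d)                                # variational variances
for _ in range(50):                           # CAVI sweeps until convergence
    for i in range(d):
        xi = X[:, i]
        resid = y - X @ m + xi * m[i]         # residual with feature i removed
        s[i] = 1.0 / (alpha + beta * xi @ xi)  # closed-form variance update
        m[i] = beta * s[i] * (xi @ resid)      # closed-form mean update

print(np.round(m, 2))                         # approximate posterior means
```

With enough data, the variational means `m` recover `w_true` closely; the same "no step size, only analytic updates" property is what the abstract claims to carry over to deep networks.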
Original language: English
Journal: Pattern Recognition Letters
Pages (from-to): 145-151
Publication status: Published - 1 Sept 2018
Externally published: Yes


