Bayesian Evidential Deep Learning with PAC Regularization

Manuel Haussmann, Sebastian Gerwinn, Melih Kandemir

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Abstract

We propose a novel method for closed-form predictive distribution modeling with neural nets. To quantify prediction uncertainty, we build on Evidential Deep Learning, which has been influential because it is simple to implement and gives closed-form access to predictive uncertainty. We employ it to model aleatoric uncertainty and extend it to also account for epistemic uncertainty by converting it into a Bayesian Neural Net. While extending its uncertainty quantification capabilities, we maintain its analytically accessible predictive distribution model by, for the first time, performing progressive moment matching for approximate weight marginalization. The resulting model introduces a number of hyperparameters too large for stable training. We overcome this drawback by deriving a vacuous PAC bound that comprises the marginal likelihood of the predictor and a complexity penalty. We observe on regression, classification, and out-of-domain detection benchmarks that our method improves model fit and uncertainty quantification.
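
The core computational step described above, propagating activation means and variances through a Bayesian layer in closed form rather than sampling weights, can be illustrated compactly. Below is a minimal NumPy/SciPy sketch of progressive moment matching, assuming fully factorized Gaussian weight posteriors and ReLU activations; the function names (`moment_linear`, `moment_relu`) and all shapes and parameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.stats import norm

def moment_linear(m_x, v_x, M_w, S_w):
    """Propagate input mean/variance through a linear layer whose weights
    have a factorized Gaussian posterior with mean M_w and variance S_w.
    Assumes inputs and weights are independent."""
    m_y = M_w @ m_x
    # Var[w x] = S_w (v_x + m_x^2) + M_w^2 v_x   (elementwise squares)
    v_y = S_w @ (v_x + m_x**2) + (M_w**2) @ v_x
    return m_y, v_y

def moment_relu(m, v):
    """Closed-form mean/variance of ReLU applied to a Gaussian input."""
    s = np.sqrt(v)
    alpha = m / s
    cdf, pdf = norm.cdf(alpha), norm.pdf(alpha)
    m_out = m * cdf + s * pdf
    ex2 = (m**2 + v) * cdf + m * s * pdf
    return m_out, np.maximum(ex2 - m_out**2, 1e-12)

# Toy forward pass: deterministic input, two stochastic layers.
rng = np.random.default_rng(0)
layers = [(4, 8), (8, 1)]
m, v = rng.normal(size=4), np.zeros(4)
for i, (d_in, d_out) in enumerate(layers):
    M = rng.normal(scale=0.3, size=(d_out, d_in))  # posterior weight means
    S = np.full((d_out, d_in), 0.01)               # posterior weight variances
    m, v = moment_linear(m, v, M, S)
    if i < len(layers) - 1:                        # ReLU on hidden layers only
        m, v = moment_relu(m, v)
print(m, v)  # closed-form predictive mean and variance, no weight sampling
```

This yields the predictive moments in a single deterministic pass, which is what keeps the predictive distribution analytically accessible after the Bayesian extension.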
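
For orientation on the regularizer, PAC-Bayes bounds generically trade an empirical fit term against a KL complexity penalty. The classical McAllester-style bound below shows this shape; the paper derives its own variant whose fit term is the marginal likelihood of the predictor, which is not reproduced here.

```latex
% McAllester-style PAC-Bayes bound for a loss in [0, 1]: with probability
% at least 1 - \delta over an i.i.d. sample of size N, simultaneously for
% all posteriors Q over hypotheses h (P is a data-independent prior):
\mathbb{E}_{h \sim Q}\!\left[ L(h) \right]
  \;\le\;
\mathbb{E}_{h \sim Q}\!\left[ \widehat{L}(h) \right]
  + \sqrt{ \frac{ \operatorname{KL}(Q \,\Vert\, P) + \ln \frac{2\sqrt{N}}{\delta} }{ 2N } }
```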
Original language: English
Title of host publication: Advances in Approximate Bayesian Inference (AABI)
Publication date: 2020
Publication status: Published - 2020

Bibliographical note

Presented at AABI 2020
