Applied Math Seminar: Jason Adams, Sandia
Event Description:
Title: Improving and Assessing the Quality of Uncertainty Quantification in Machine Learning
Abstract: Standard deep learning methods, which have been used to achieve impressive results on a variety of problems, produce point predictions but do not provide measures of uncertainty. From both a statistical and an ethical perspective, the lack of uncertainty quantification in deep learning models is a major issue. Recent work has focused on developing deep learning methods that include uncertainty estimates alongside point predictions. However, little attention has been given to the quality of the estimated uncertainties. We demonstrate that different models trained on the same data can produce vastly different uncertainties. We discuss the challenges of assessing the quality of uncertainty estimates and of comparing models in terms of their estimated uncertainties. We describe several recent developments in both assessing and improving the quality of uncertainty estimates, including a novel method for generating synthetic image data, an adjustment to variational autoencoders that enables uncertainty quality assessment, and conformal prediction methods.
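Of the techniques mentioned in the abstract, conformal prediction is a standard, model-agnostic way to turn point predictions into prediction intervals with finite-sample coverage guarantees. The sketch below illustrates split conformal prediction for regression on a toy problem; the model, data, and names are illustrative assumptions, not material from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "fitted model": the true relation is y = 2x + noise,
# and the model happens to predict 2x exactly.
def predict(x):
    return 2.0 * x

# Held-out calibration data (never used to fit the model).
x_cal = rng.uniform(0.0, 1.0, 200)
y_cal = 2.0 * x_cal + rng.normal(0.0, 0.1, 200)

# Nonconformity scores: absolute residuals on the calibration set.
scores = np.abs(y_cal - predict(x_cal))

# Conformal quantile for 90% coverage, with the standard
# finite-sample correction ceil((n+1)(1-alpha))/n.
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction interval for a new input: point prediction +/- q.
x_new = 0.5
interval = (predict(x_new) - q, predict(x_new) + q)
print(interval)
```

Under exchangeability of calibration and test points, intervals built this way contain the true response with probability at least 1 - alpha, regardless of how good the underlying model's point predictions are.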