15 research outputs found

    Quantifying Epistemic Uncertainty in Deep Learning

    Full text link
    Uncertainty quantification is at the core of the reliability and robustness of machine learning. In this paper, we provide a theoretical framework to dissect uncertainty in deep learning, especially its epistemic component, into procedural variability (from the training procedure) and data variability (from the training data), which is, to the best of our knowledge, the first such attempt in the literature. We then propose two approaches to estimate these uncertainties, one based on influence functions and one on batching. We demonstrate how our approaches overcome the computational difficulties of applying classical statistical methods. Experimental evaluations on multiple problem settings corroborate our theory and illustrate how our framework and estimation can provide direct guidance on modeling and data collection effort to improve deep learning performance.
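
    The batching estimator lends itself to a compact illustration. The sketch below is a generic batching-style estimate, not the paper's exact procedure; train_model is a hypothetical user-supplied callable assumed to return an object with a .predict method. It retrains on disjoint data batches and reads epistemic uncertainty off the spread of the resulting predictions.

        import numpy as np

        def batched_epistemic_uncertainty(X, y, x_query, train_model,
                                          n_batches=5, seed=0):
            """Generic batching sketch: the variance across models trained
            on disjoint batches proxies epistemic uncertainty (it mixes
            procedural variability from independent retraining with data
            variability from subsampling)."""
            rng = np.random.default_rng(seed)
            idx = rng.permutation(len(X))
            preds = []
            for batch in np.array_split(idx, n_batches):
                # Independent retraining on each disjoint batch.
                model = train_model(X[batch], y[batch])
                preds.append(model.predict(x_query))
            preds = np.stack(preds)
            return preds.mean(axis=0), preds.var(axis=0, ddof=1)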

    What's Behind the Mask: Estimating Uncertainty in Image-to-Image Problems

    Full text link
    Estimating uncertainty in image-to-image networks is an important task, particularly as such networks are being increasingly deployed in the biological and medical imaging realms. In this paper, we introduce a new approach to this problem based on masking. Given an existing image-to-image network, our approach computes a mask such that the distance between the masked reconstructed image and the masked true image is guaranteed to be less than a specified threshold, with high probability. The mask thus identifies the more certain regions of the reconstructed image. Our approach is agnostic to the underlying image-to-image network, and only requires triples of the input (degraded), reconstructed and true images for training. Furthermore, our method is agnostic to the distance metric used. As a result, one can use L_p-style distances or perceptual distances like LPIPS, which contrasts with interval-based approaches to uncertainty. Our theoretical guarantees derive from a conformal calibration procedure. We evaluate our mask-based approach to uncertainty on image colorization, image completion, and super-resolution tasks, demonstrating high-quality performance on each.
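
    The conformal calibration step behind such guarantees is easy to sketch. Below is a minimal split-conformal bound, under the generic assumption that one holds out calibration triples and records the distance between each masked reconstruction and masked ground truth; the masking strategy itself is method-specific and omitted. The distances may come from any metric, L_p or LPIPS.

        import numpy as np

        def conformal_bound(cal_distances, alpha=0.1):
            """Split-conformal upper bound: a fresh (exchangeable) test
            distance falls below the returned value with probability
            >= 1 - alpha."""
            n = len(cal_distances)
            # Finite-sample-corrected quantile level, clipped to 1.
            q = np.ceil((n + 1) * (1 - alpha)) / n
            return np.quantile(cal_distances, min(q, 1.0), method="higher")

        # Usage sketch: grow the mask while the calibrated bound on the
        # masked distance stays below the user-specified threshold tau.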

    The 40th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering

    Get PDF
    These proceedings aim to collect the ideas presented, discussed, and disputed at the 40th Workshop on Bayesian Inference and Maximum Entropy, MaxEnt 2021. Skilling and Knuth seek to rebuild the foundations of quantum mechanics from probability theory, and Caticha competes in that endeavour with a very different entropy-based approach. Costa connects entropy with general relativity, Pessoa reports new insights on ecology and Yousefi derives classical density functional theory, both through the maximum entropy principle. Von Toussaint, Preuss, Albert, Rath, Ranftl and Kvas report the latest developments in regression and surrogate-based inference with applications to optimization and inverse problems in plasma physics, biomechanics and geodesy. Van Soom presents new priors for phonetics, Stern et al. propose a new haphazard sampling method, and Kelter uncovers two measure-theoretic issues with hypothesis testing.

    Predicting Financial Markets using Text on the Web

    Get PDF

    Reassessing the Paradigms of Statistical Model-Building

    Get PDF
    Statistical model-building is the science of constructing models from data and from information about the data-generation process, with the aim of analysing those data and drawing inference from that analysis. Many statistical tasks are undertaken during this analysis; they include classification, forecasting, prediction and testing. Model-building has assumed substantial importance, as new technologies enable data on highly complex phenomena to be gathered in very large quantities. This creates a demand for more complex models, and requires the model-building process itself to be adaptive. The word “paradigm” refers to philosophies, frameworks and methodologies for developing and interpreting statistical models, in the context of data, and applying them for inference. In order to solve contemporary statistical problems it is often necessary to combine techniques from previously separate paradigms. The workshop addressed model-building paradigms that are at the frontiers of modern statistical research. It sought to create synergies by delineating the connections and collisions among different paradigms. It also endeavoured to shape the future evolution of paradigms.

    Untangling hotel industry’s inefficiency: An SFA approach applied to a renowned Portuguese hotel chain

    Get PDF
    The present paper explores the technical efficiency of four hotels from the Teixeira Duarte Group, a renowned Portuguese hotel chain. An efficiency ranking is established for these four hotel units located in Portugal using Stochastic Frontier Analysis. This methodology makes it possible to discriminate between measurement error and systematic inefficiencies in the estimation process, enabling investigation of the main causes of inefficiency. Several suggestions for efficiency improvement are made for each hotel studied.
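
    For context, the decomposition Stochastic Frontier Analysis performs can be stated compactly. The display below is the standard textbook stochastic frontier production model, not necessarily the paper's exact specification:

        % Standard stochastic frontier production model (textbook form).
        % v_i: symmetric measurement noise; u_i >= 0: systematic inefficiency.
        \ln y_i = \mathbf{x}_i^{\top}\boldsymbol{\beta} + v_i - u_i,
        \qquad v_i \sim \mathcal{N}(0, \sigma_v^2),
        \qquad u_i \sim \mathcal{N}^{+}(0, \sigma_u^2)

    Technical efficiency is then TE_i = exp(-u_i); separating v_i from u_i is precisely the discrimination between measurement error and systematic inefficiency that the abstract refers to.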

    Applications

    Get PDF
    Volume 3 describes how resource-aware machine learning methods and techniques are used to successfully solve real-world problems. The book provides numerous specific application examples: in health and medicine, for risk modelling, diagnosis, and treatment selection for diseases; in electronics, steel production and milling, for quality control during manufacturing processes; in traffic and logistics, for smart cities; and in mobile communications.