1,995 research outputs found

    Using a Bayesian averaging model for estimating the reliability of decisions in multimodal biometrics

    The issue of reliable authentication is of increasing importance in modern society. Corporations, businesses and individuals often wish to restrict access to logical or physical resources to those with the relevant privileges. A popular method of authentication is the use of biometric data, but the uncertainty that arises from the lack of uniqueness in biometrics has led to a great deal of effort being invested in multimodal biometrics. These multimodal biometric systems can give rise to large, distributed data sets that are used to decide the authenticity of a user. The Bayesian model averaging (BMA) methodology has been used to allow experts to evaluate the reliability of decisions made in data mining applications. The use of decision tree (DT) models within the BMA methodology gives experts additional information on how decisions are made. In this paper we discuss how DT models within the BMA methodology can be used for authentication in multimodal biometric systems.
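
    A minimal sketch of the general idea, assuming binary accept/reject labels coded as 0/1: Bayesian model averaging over decision trees is approximated here with a bootstrap ensemble of scikit-learn trees (the paper samples trees within a proper BMA scheme), and the spread of the averaged score is read as a reliability indicator. All names are illustrative.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.utils import resample

    def bma_reliability(X_train, y_train, X_query, n_models=100, seed=0):
        """Model-averaged accept probability and its spread for each query sample."""
        rng = np.random.RandomState(seed)
        per_model_probs = []
        for _ in range(n_models):
            # Stratified bootstrap stands in for posterior sampling of tree models.
            Xb, yb = resample(X_train, y_train, stratify=y_train, random_state=rng)
            tree = DecisionTreeClassifier(max_depth=4, random_state=rng)
            per_model_probs.append(tree.fit(Xb, yb).predict_proba(X_query)[:, 1])
        per_model_probs = np.array(per_model_probs)    # shape (n_models, n_queries)
        mean_prob = per_model_probs.mean(axis=0)       # model-averaged decision score
        spread = per_model_probs.std(axis=0)           # disagreement across models
        return mean_prob, spread

    Scores near 0 or 1 with a small spread correspond to decisions that can be treated as reliable; a large spread flags users whose authenticity may need another modality or manual review.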

    Prediction of survival probabilities with Bayesian Decision Trees

    Practitioners use Trauma and Injury Severity Score (TRISS) models for predicting the survival probability of an injured patient. The accuracy of TRISS predictions is acceptable for patients with up to three typical injuries, but unacceptable for patients with a larger number of injuries or with atypical injuries. Based on a regression model, the TRISS methodology does not provide the predictive density required for accurate assessment of risk. Moreover, the regression model is difficult to interpret. We therefore consider Bayesian inference for estimating the predictive distribution of survival. The inference is based on decision tree models which recursively split the data along explanatory variables, so practitioners can understand these models. We propose a Bayesian method for estimating the predictive density and show that it outperforms the TRISS method in terms of both goodness of fit and classification accuracy. The developed method has been made available for evaluation purposes as a stand-alone application.
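
    For reference, the TRISS model the paper benchmarks against is a logistic regression of survival on the Revised Trauma Score (RTS), the Injury Severity Score (ISS) and an age indicator; the sketch below shows only that functional form, with placeholder coefficients rather than the published TRISS values.

    import math

    def triss_survival_probability(rts, iss, age_index,
                                   b0=-1.2, b1=0.9, b2=-0.08, b3=-1.9):
        """Logistic TRISS form; age_index is 0 below the age cut-off, 1 above.

        Coefficients are placeholders: the published values differ for blunt
        and penetrating injury mechanisms.
        """
        b = b0 + b1 * rts + b2 * iss + b3 * age_index
        return 1.0 / (1.0 + math.exp(-b))

    The Bayesian decision-tree approach instead reports a full predictive distribution over survival rather than a single point estimate of this kind.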

    Nonlinear association structures in flexible Bayesian additive joint models

    Joint models of longitudinal and survival data have become an important tool for modeling associations between longitudinal biomarkers and event processes. Existing shared random effects models assume a linear association between the marker and the log-hazard, and this assumption usually remains unchecked. We present an extended framework of flexible additive joint models that allows the estimation of nonlinear, covariate-specific associations by making use of Bayesian P-splines. Our joint models are estimated in a Bayesian framework using structured additive predictors for all model components, allowing for great flexibility in the specification of smooth nonlinear, time-varying and random effects terms for the longitudinal submodel, the survival submodel and their association. The ability to capture truly linear and nonlinear associations is assessed in simulations and illustrated on the widely studied biomedical data on the rare fatal liver disease primary biliary cirrhosis. All methods are implemented in the R package bamlss to facilitate the application of this flexible joint model in practice.
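
    In one common shared-parameter notation (ours, not necessarily the paper's exact specification), the extension replaces the linear association term in the log-hazard with a smooth function estimated by a Bayesian P-spline:

    h_i(t) = \exp\{ \eta_i(t) + g(\mu_i(t)) \}

    where \mu_i(t) is the fitted marker trajectory from the longitudinal submodel, \eta_i(t) collects the baseline hazard and covariate effects, and g is represented with a P-spline basis under a smoothness prior; choosing g(x) = \alpha x recovers the usual linear shared random effects association.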

    Truth and Regret in Online Scheduling

    We consider a scheduling problem where a cloud service provider has multiple units of a resource available over time. Selfish clients submit jobs, each with an arrival time, deadline, length, and value. The service provider's goal is to implement a truthful online mechanism for scheduling jobs so as to maximize the social welfare of the schedule. Recent work shows that under a stochastic assumption on job arrivals, there is a single-parameter family of mechanisms that achieves near-optimal social welfare. We show that given any such family of near-optimal online mechanisms, there exists an online mechanism that in the worst case performs nearly as well as the best of the given mechanisms. Our mechanism is truthful whenever the mechanisms in the given family are truthful and prompt, and it achieves optimal (within constant factors) regret. We model the problem of competing against a family of online scheduling mechanisms as one of learning from expert advice. A primary challenge is that any scheduling decisions we make affect not only the payoff at the current step, but also the resource availability and payoffs in future steps. Furthermore, switching from one algorithm (a.k.a. expert) to another in an online fashion is challenging both because it requires synchronization with the state of the new algorithm and because it affects the incentive structure of the algorithms. We further show how to adapt our algorithm to a non-clairvoyant setting where job lengths are unknown until jobs are run to completion. Once again, in this setting, we obtain truthfulness along with asymptotically optimal regret (within poly-logarithmic factors).
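
    The expert-advice component can be pictured with the standard multiplicative-weights (Hedge) update. The sketch below shows only that generic update on per-step payoffs in [0, 1]; it ignores the state-synchronization and incentive issues the paper actually has to handle, and all names and the learning rate are illustrative.

    import numpy as np

    def hedge(payoffs, eta=0.1, seed=0):
        """Follow one expert per step; payoffs has shape (T, K) with values in [0, 1]."""
        rng = np.random.default_rng(seed)
        T, K = payoffs.shape
        weights = np.ones(K)
        total = 0.0
        for t in range(T):
            probs = weights / weights.sum()
            choice = rng.choice(K, p=probs)        # pick an expert to follow this step
            total += payoffs[t, choice]
            weights *= np.exp(eta * payoffs[t])    # reward experts that did well
        return total                               # regret vs. payoffs.sum(axis=0).max()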

    Inference for reaction networks using the Linear Noise Approximation

    We consider inference for the reaction rates in discretely observed networks such as those found in models for systems biology, population ecology and epidemics. Most such networks are neither slow enough nor small enough for inference via the true state-dependent Markov jump process to be feasible. Typically, inference is conducted by approximating the dynamics through an ordinary differential equation (ODE) or a stochastic differential equation (SDE). The former ignores the stochasticity in the true model and can lead to inaccurate inferences. The latter is more accurate but harder to implement, as the transition density of the SDE model is generally unknown. The Linear Noise Approximation (LNA) is a first-order Taylor expansion of the approximating SDE about a deterministic solution and can be viewed as a compromise between the ODE and SDE models. It is a stochastic model, but discrete-time transition probabilities for the LNA are available through the solution of a series of ordinary differential equations. We describe how a restarting LNA can be used efficiently to perform inference for a general class of reaction networks; evaluate the accuracy of such an approach; and show how and when this approach is either statistically or computationally more efficient than ODE or SDE methods. We apply the LNA to analyse Google Flu Trends data from the North and South Islands of New Zealand, and are able to obtain more accurate short-term forecasts of new flu cases than another recently proposed method, although at a greater computational cost.
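
    As a concrete picture of the ODEs involved (our toy example, not one of the paper's networks), the LNA mean and variance of a single-species immigration-death network can be integrated numerically; a Gaussian with that mean and variance then supplies the transition density between observations. Names and rate values below are illustrative.

    import numpy as np
    from scipy.integrate import solve_ivp

    def lna_rhs(t, y, c1, c2):
        """LNA mean/variance ODEs for immigration at rate c1, per-capita death at rate c2."""
        m, v = y
        hazards = np.array([c1, c2 * m])
        S = np.array([1.0, -1.0])                 # stoichiometry of birth and death
        drift = S @ hazards                       # dm/dt
        jac = -c2                                 # d(drift)/dm
        dv = 2.0 * jac * v + (S**2) @ hazards     # dV/dt = JV + VJ' + S diag(h) S'
        return [drift, dv]

    sol = solve_ivp(lna_rhs, (0.0, 10.0), [5.0, 0.0], args=(1.0, 0.1),
                    t_eval=np.linspace(0.0, 10.0, 101))
    mean_end, var_end = sol.y[:, -1]              # moments of the Gaussian at t = 10

    In the restarting variant used in the paper, these ODEs are re-initialised at each observation time, which keeps the Gaussian approximation locally accurate.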

    JBendge: An Object-Oriented System for Solving, Estimating and Selecting Nonlinear Dynamic Models

    We present an object-oriented software framework that allows users to specify, solve, and estimate nonlinear dynamic stochastic general equilibrium (DSGE) models. The implemented solution methods for finding the unknown policy function are the standard linearization around the deterministic steady state, and a function iterator using a multivariate global Chebyshev polynomial approximation with the Smolyak operator to overcome the curse of dimensionality. The operator is also useful for numerical integration and we use it for the integrals arising in rational expectations and in nonlinear state space filters. The estimation step is done by a parallel Metropolis-Hastings (MH) algorithm, using a linear or nonlinear filter. Implemented are the Kalman, Extended Kalman, Particle, Smolyak Kalman, Smolyak Sum, and Smolyak Kalman Particle filters. The MH sampling step can be interactively monitored and controlled by sequence and statistics plots. The number of parallel threads can be adjusted to benefit from multiprocessor environments. JBendge is based on the framework JStatCom, which provides a standardized application interface. All tasks are supported by an elaborate multi-threaded graphical user interface (GUI) with project management and data handling facilities.
    Keywords: Dynamic Stochastic General Equilibrium (DSGE) Models, Bayesian Time Series Econometrics, Java, Software Development
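
    A serial sketch of the random-walk Metropolis-Hastings step such an estimation stage is built around (JBendge itself is written in Java and parallelises this); here log_posterior is a placeholder for a filter-based log-likelihood plus the log-prior, and all names are illustrative.

    import numpy as np

    def random_walk_mh(log_posterior, theta0, n_draws=5000, step=0.1, seed=0):
        """Random-walk Metropolis-Hastings over the model parameters theta."""
        rng = np.random.default_rng(seed)
        theta = np.asarray(theta0, dtype=float)
        logp = log_posterior(theta)
        draws = []
        for _ in range(n_draws):
            proposal = theta + step * rng.standard_normal(theta.shape)
            logp_prop = log_posterior(proposal)
            if np.log(rng.random()) < logp_prop - logp:   # Metropolis accept/reject
                theta, logp = proposal, logp_prop
            draws.append(theta.copy())
        return np.array(draws)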