
    Bayesian Optimization for Probabilistic Programs

    We present the first general-purpose framework for marginal maximum a posteriori estimation of probabilistic program variables. By using a series of code transformations, the evidence of any probabilistic program, and therefore of any graphical model, can be optimized with respect to an arbitrary subset of its sampled variables. To carry out this optimization, we develop the first Bayesian optimization package to directly exploit the source code of its target, leading to innovations in problem-independent hyperpriors, unbounded optimization, and implicit constraint satisfaction, delivering significant performance improvements over prominent existing packages. We present applications of our method to a number of tasks, including engineering design and parameter optimization.
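    The marginal MAP problem the abstract describes can be illustrated on a toy model. This is only a sketch, not the paper's framework: the model (x ~ Normal(0, 1), y | x, θ ~ Normal(θ + x, 1)), the Monte Carlo evidence estimator, and the grid search standing in for Bayesian optimization are all illustrative assumptions.

    ```python
    import math
    import random

    random.seed(0)

    def log_marginal(theta, y, n_samples=4000):
        # Monte Carlo estimate of the evidence log p(y | theta), with the
        # nuisance variable x marginalized out by sampling from its prior.
        total = 0.0
        for _ in range(n_samples):
            x = random.gauss(0.0, 1.0)
            total += math.exp(-0.5 * (y - theta - x) ** 2) / math.sqrt(2 * math.pi)
        return math.log(total / n_samples)

    # Marginal MAP: maximize the evidence over theta only.  A BO package
    # would query this expensive objective adaptively; a grid suffices here.
    y_obs = 1.3
    thetas = [i * 0.1 for i in range(-20, 41)]
    best_theta = max(thetas, key=lambda t: log_marginal(t, y_obs))
    print(best_theta)
    ```

    For this toy model the marginal y | θ is Normal(θ, √2), so the estimate should land near θ = y_obs, which makes the sketch easy to sanity-check.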

    Dynamic modelling to describe the effect of plant extracts and customised starter culture on Staphylococcus aureus survival in goat's raw milk soft cheese

    This study characterises the effect of a customised starter culture (CSC) and plant extracts (lemon balm, sage, and spearmint) on Staphylococcus aureus (SA) and lactic acid bacteria (LAB) kinetics in goat's raw milk soft cheeses. Raw milk cheeses were produced with and without the CSC and plant extracts, and analysed for pH, SA, and LAB counts throughout ripening. The pH change over maturation was described by an empirical decay function. To assess the effect of each bio-preservative on SA, dynamic Bigelow-type models were adjusted, while their effect on LAB was evaluated by classical Huang models and dynamic Huang–Cardinal models. The models showed that the bio-preservatives decreased the time necessary for a one-log reduction but generally affected the cheese pH drop and SA decay rates (log Dref = 0.621–1.190 days; controls: 0.796–0.996 days). Spearmint and sage extracts affected the LAB specific growth rate (0.503 and 1.749 ln CFU/g day⁻¹; corresponding controls: 1.421 and 0.806 ln CFU/g day⁻¹), while lemon balm showed no impact (p > 0.05). The Huang–Cardinal models uncovered different optimum specific growth rates of indigenous LAB (1.560–1.705 ln CFU/g day⁻¹) and LAB of cheeses with CSC (0.979–1.198 ln CFU/g day⁻¹). The models produced validate the potential of the tested bio-preservatives to reduce SA, while identifying the impact of such strategies on the fermentation process.

    The authors are grateful to the Foundation for Science and Technology (FCT, Portugal) for financial support through national funds FCT/MCTES (PIDDAC) to CIMO (UIDB/00690/2020 and UIDP/00690/2020) and SusTEC (LA/P/0007/2020). They are also grateful to the EU PRIMA program and FCT for funding the ArtiSaneFood project (PRIMA/0001/2018).
This study was supported by FCT under the scope of the strategic funding of the UIDB/04469/2020 unit and the BioTecNorte operation (NORTE-01-0145-FEDER-000004) funded by the European Regional Development Fund under the scope of Norte2020 (Programa Operacional Regional do Norte). B.N. Silva acknowledges the financial support provided by FCT through the Ph.D. grant SFRH/BD/137801/2018.
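    The Bigelow-type models the abstract mentions reduce, in their simplest static form, to log-linear microbial decay, log10 N(t) = log10 N0 − t/D, where D is the decimal reduction time (the time for a one-log reduction the abstract refers to). A minimal fitting sketch follows; the times, counts, and resulting D value are synthetic illustrations, not the study's data.

    ```python
    # Hypothetical S. aureus counts (log10 CFU/g) over ripening, for illustration.
    times = [0, 2, 4, 6, 8]                 # days
    log_counts = [6.0, 5.1, 4.3, 3.4, 2.6]  # synthetic, roughly log-linear decay

    # Bigelow-type primary model: log10 N(t) = log10 N0 - t / D.
    # The least-squares slope of log-count versus time is -1/D.
    n = len(times)
    mean_t = sum(times) / n
    mean_y = sum(log_counts) / n
    slope = sum((t - mean_t) * (y - mean_y) for t, y in zip(times, log_counts)) \
            / sum((t - mean_t) ** 2 for t in times)
    D = -1.0 / slope  # decimal reduction time, in days
    print(round(D, 2))
    ```

    The study's dynamic version additionally lets D depend on the changing cheese pH over ripening; the static fit above is only the primary-model core.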

    Bayesian optimization of the PC algorithm for learning Gaussian Bayesian networks

    The PC algorithm is a popular method for learning the structure of Gaussian Bayesian networks. It carries out statistical tests to determine absent edges in the network. It is hence governed by two parameters: (i) the type of test, and (ii) its significance level. These parameters are usually set to values recommended by an expert. Nevertheless, such an approach can suffer from human bias, leading to suboptimal reconstruction results. In this paper we consider a more principled approach for choosing these parameters in an automatic way. For this we optimize a reconstruction score evaluated on a set of different Gaussian Bayesian networks. This objective is expensive to evaluate and lacks a closed-form expression, which means that Bayesian optimization (BO) is a natural choice. BO methods use a model to guide the search and are hence able to exploit smoothness properties of the objective surface. We show that the parameters found by a BO method outperform those found by a random search strategy and the expert recommendation. Importantly, we have found that an often overlooked statistical test provides the best overall reconstruction results.
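    The model-guided search BO performs over a parameter such as the significance level can be sketched in one dimension. Everything below is a stand-in, not the paper's setup: the `score` function mimics an expensive reconstruction score peaking at a hypothetical alpha of 0.01, and a tiny hand-rolled Gaussian process with an upper-confidence-bound acquisition replaces a full BO package.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def score(alpha):
        # Stand-in for the expensive reconstruction score of the PC algorithm
        # (e.g. a negative structural error averaged over networks).  This toy
        # surface peaks at alpha = 0.01 on a log10 scale, an assumed value.
        x = np.log10(alpha)
        return -(x + 2.0) ** 2

    def rbf(a, b, ls=0.7):
        # Squared-exponential kernel between two sets of 1-D points.
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

    # Search over log10(alpha) in [-4, -0.3], i.e. alpha in [1e-4, 0.5].
    X = list(rng.uniform(-4.0, -0.3, 3))   # small initial design
    Y = [score(10 ** x) for x in X]
    cand = np.linspace(-4.0, -0.3, 200)
    for _ in range(10):
        Xa, Ya = np.array(X), np.array(Y)
        K = rbf(Xa, Xa) + 1e-6 * np.eye(len(Xa))
        Kinv = np.linalg.inv(K)
        ks = rbf(cand, Xa)
        mu = ks @ Kinv @ Ya                              # GP posterior mean
        var = np.clip(1.0 - np.sum(ks @ Kinv * ks, axis=1), 1e-9, None)
        ucb = mu + 2.0 * np.sqrt(var)                    # acquisition function
        x_next = float(cand[np.argmax(ucb)])             # most promising alpha
        X.append(x_next)
        Y.append(score(10 ** x_next))

    best_alpha = 10 ** X[int(np.argmax(Y))]
    print(best_alpha)
    ```

    The acquisition trades off the posterior mean (exploitation) against posterior uncertainty (exploration), which is how BO exploits smoothness of the objective surface with only a handful of expensive evaluations.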