    On the usefulness of imprecise Bayesianism in chemical kinetics

    Bayesian methods are growing ever more popular in chemical kinetics. The reasons for this, and the general challenges related to kinetic parameter estimation, are briefly reviewed. Most authors content themselves with a single (usually uniform) prior distribution. The goal of this paper is to examine the serious issues this raises: the problem of confusing knowledge with ignorance and the problem of reparametrisation. The legitimacy of a probabilistic Ockham's razor is also called into question. A synthetic example involving two reaction models is used to illustrate why merging the parameter-space volume and the model accuracy into a single number might be unwise. Robust Bayesian analysis appears to be a simple and straightforward way to avoid the problems mentioned throughout this article.
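
    As a minimal sketch of the reparametrisation problem raised above (an illustration, not the paper's own example): a prior that is uniform in a rate constant k places almost all of its mass in the top decade when viewed on a log scale, so "non-informative" depends on the chosen parametrisation. The range and units below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # A prior that is uniform in the rate constant k over [1e2, 1e6]
    # (arbitrary units), as in many single-prior kinetics studies.
    k = rng.uniform(1e2, 1e6, size=100_000)

    # Viewed on a log10 scale, the same prior is far from uniform:
    # nearly all of its mass lies in the top decade.
    log10_k = np.log10(k)
    for lo in range(2, 6):
        frac = np.mean((log10_k >= lo) & (log10_k < lo + 1))
        print(f"decade [10^{lo}, 10^{lo + 1}): prior mass = {frac:.3f}")
    ```

    A robust (imprecise) Bayesian analysis would replace the single prior with a set of priors and report only those conclusions that hold across the whole set.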

    Numerical optimisation for model evaluation in combustion kinetics

    Numerical optimisation related to the estimation of kinetic parameters and model evaluation is playing an increasing role in combustion, as well as in other areas of applied energy research. The present work aims at presenting the current probability-based approaches, along with applications to real problems of combustion chemical kinetics. The main methods related to model and parameter evaluation have been explicated. An in-house program for the systematic adjustment of kinetic parameters to experimental measurements has been described and numerically validated. The GRI (Gas Research Institute) mechanism (version 3.0) has been shown to initially lead to results that are greatly at variance with experimental data concerning the combustion of CH3 and C2H6. A thorough optimisation of all parameters has been performed with respect to these profiles. A considerable improvement could be reached, and the new predictions appear to be compatible with the measurement uncertainties. It was also found that neither GRI 3.0 nor three other reaction mechanisms considered during the present work should be employed (without prior far-reaching optimisation) for numerical simulations of combustors and engines where CH3 and C2H6 play an important role. Overall, this study illustrates the link between optimisation methods and model evaluation in the field of combustion chemical kinetics.
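
    The in-house optimisation program itself is not reproduced here; the following is only a minimal sketch of the underlying idea of systematically adjusting kinetic parameters to measurements, fitting the Arrhenius parameters ln A and Ea to synthetic rate data by least squares. The data, starting values and noise level are invented for illustration.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    R = 8.314  # gas constant, J/(mol K)

    def arrhenius(T, ln_A, Ea):
        """ln k(T) = ln A - Ea / (R T)."""
        return ln_A - Ea / (R * T)

    # Synthetic "measurements": true ln A = 20, Ea = 150 kJ/mol, plus noise.
    T = np.linspace(1000.0, 2000.0, 25)
    rng = np.random.default_rng(1)
    ln_k_obs = arrhenius(T, 20.0, 150e3) + rng.normal(0.0, 0.1, T.size)

    # Least-squares adjustment of the kinetic parameters to the data.
    (ln_A_fit, Ea_fit), _ = curve_fit(arrhenius, T, ln_k_obs, p0=(15.0, 1e5))
    print(f"ln A = {ln_A_fit:.2f}, Ea = {Ea_fit / 1e3:.1f} kJ/mol")
    ```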

    A statistical and machine learning approach to the study of astrochemistry

    This thesis uses a variety of statistical and machine learning techniques to provide new insight into astrochemical processes. Astrochemistry is the study of chemistry in the universe. Due to the highly non-linear interplay of a variety of competing factors, it is often difficult to understand the impact of any individual parameter on the abundance of molecules of interest. For this reason, we present a number of techniques that provide such insight. Chapter 2 is a chemical modelling study that considers the sensitivity of a glycine chemical network to the addition of two H2-addition reactions across a number of physical environments. This work considers the concept of a "hydrogen economy" within the context of chemical reaction networks and demonstrates that H2 decreases the abundance of glycine, one of the simplest amino acids, as well as of its precursors. Chapter 3 presents a methodology that utilises the topology of a chemical network to accelerate the Bayesian inference problem by reducing the number of parameters to be inferred at once. Using a toy network, we demonstrate that a network can be simplified as well as split into smaller pieces for the inference problem. Chapter 4 considers how the dimensionality can be reduced further by exploiting the physics of the underlying chemical reaction mechanisms, recognising that the most pertinent reaction-rate parameter is the binding energy of the more mobile species. This significantly reduces the dimensionality of the problem to be solved. Chapter 5 builds on the work of Chapters 3 and 4: the MOPED algorithm is utilised to identify which species should be prioritised for detection in order to reduce the variance of the binding-energy posterior distributions. Chapter 6 introduces the use of machine learning interpretability to provide better insight into the relationships between the physical input parameters of a chemical code and the final abundances of various species. By identifying and quantifying the relative importance of the various parameters, we make qualitative comparisons to observations and demonstrate good agreement. Chapter 7 applies the methods of Chapters 4, 5 and 6 in light of new JWST observations; the relationship between binding energies and the abundances of species is also explored using machine learning interpretability techniques.
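
    As a generic sketch of the kind of interpretability technique described for Chapter 6 (not the thesis's actual code), permutation importance ranks the physical inputs of a fitted emulator by how much shuffling each one degrades the predictions. The toy "chemical code" and parameter names below are placeholders.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.inspection import permutation_importance

    rng = np.random.default_rng(2)

    # Toy stand-in for a chemical code: the abundance depends strongly on
    # density and temperature and only weakly on the third input.
    X = rng.uniform(size=(2000, 3))  # columns: density, temperature, zeta
    y = X[:, 0] * np.exp(-3.0 * X[:, 1]) + 0.01 * X[:, 2]

    model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
    result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
    for name, imp in zip(["density", "temperature", "zeta"],
                         result.importances_mean):
        print(f"{name:12s} importance = {imp:.3f}")
    ```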

    Representation learning with structured invariance

    Invariance is crucial for neural networks, enabling them to generalize effectively across variations of the input data by focusing on key attributes while filtering out irrelevant details. In this thesis, we study representation learning in neural networks through the lens of structured invariance. We start by studying the properties and limitations of the invariance that neural networks can learn from data. Next, we develop a method to extract the structure of the invariance learned by a neural network, providing a more nuanced analysis of the quality of learned invariance. In the following chapter, we focus on contrastive learning, demonstrating how more structured supervision results in representations of better quality. The final two chapters focus on practical aspects of representation learning with structured invariance in computer vision.
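
    As a minimal sketch (not the thesis's actual method) of how the invariance learned by an encoder can be quantified: compare the representations of an input and of a transformed copy of it. The encoder and the transformation below are placeholders.

    ```python
    import torch

    def invariance_score(encoder, x, transform):
        """Mean cosine similarity between representations of x and
        transform(x); 1.0 means fully invariant to the transformation."""
        with torch.no_grad():
            z1 = encoder(x).flatten(1)
            z2 = encoder(transform(x)).flatten(1)
        return torch.nn.functional.cosine_similarity(z1, z2, dim=1).mean().item()

    # Example with a toy encoder and horizontal flip as the transformation.
    encoder = torch.nn.Sequential(torch.nn.Flatten(),
                                  torch.nn.Linear(3 * 32 * 32, 64))
    x = torch.randn(8, 3, 32, 32)
    print(invariance_score(encoder, x, lambda t: torch.flip(t, dims=[-1])))
    ```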

    Second Generation General System Theory: Perspectives in Philosophy and Approaches in Complex Systems

    Following the classical work of Norbert Wiener, Ross Ashby, Ludwig von Bertalanffy and many others, the concept of System has been elaborated in different disciplinary fields, allowing interdisciplinary approaches in areas such as Physics, Biology, Chemistry, Cognitive Science, Economics, Engineering, Social Sciences, Mathematics, Medicine, Artificial Intelligence, and Philosophy. The new challenge of Complexity and Emergence has made the concept of System even more relevant to the study of problems with high contextuality. This Special Issue focuses on the nature of new problems arising from the study and modelling of complexity, on their possible common aspects, properties and approaches, already partially considered by different disciplines, as well as on new, possibly unitary, theoretical frameworks. It aims to introduce fresh impetus into systems research in cases where the detection and correction of mistakes require the development of new knowledge. This book contains contributions presenting new approaches and results, problems and proposals. The context is an interdisciplinary framework dealing, in order, with: electronic engineering problems; the problem of the observer; transdisciplinarity; problems of organised complexity; theoretical incompleteness; the design of digital systems in a user-centred way; reaction networks as a framework for systems modelling; the emergence of a stable system in reaction networks; emergence at the fundamental systems level; and the behavioural realization of memoryless functions.

    Development and application of a framework for model structure evaluation in environmental modelling

    In a fast-developing world with an ever-rising population, the pressures on our natural environment are continuously increasing, causing problems such as floods, water and air pollution, and droughts. Insight into the driving mechanisms behind these threats is essential in order to mitigate them properly. During the last decades, mathematical models have become an essential part of scientific research aimed at better understanding and predicting natural phenomena. Notwithstanding the diversity of currently existing models and modelling frameworks, the identification of the most appropriate model structure for a given problem remains a research challenge. The latter is the main focus of this dissertation, which aims to improve current practices of model structure comparison and evaluation. This is done by making individual model decisions more transparent and explicitly testable. A diagnostic framework is described, focusing on a flexible and open model structure definition and specifying the requirements for future model developments. Methods for model structure evaluation are documented, implemented, extended and applied to both respirometric and hydrological models. For the specific case of lumped hydrological models, the unity between apparently different models is illustrated. A schematic representation of these model structures provides a more transparent communication tool, while meeting the requirements of the diagnostic approach.
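
    For readers unfamiliar with the lumped hydrological models discussed above, the simplest such model structure is a single linear reservoir; the sketch below is purely illustrative of what a "model structure" is in this context and is not the dissertation's framework.

    ```python
    import numpy as np

    def linear_reservoir(precip, k=0.1, s0=0.0):
        """Simplest lumped hydrological structure: storage S drains as
        Q = k * S. Daily timestep; precipitation and discharge in mm/day."""
        s, q = s0, np.zeros_like(precip)
        for t, p in enumerate(precip):
            s += p        # rainfall enters the single storage
            q[t] = k * s  # outflow is proportional to storage
            s -= q[t]
        return q

    precip = np.array([0.0, 10.0, 5.0, 0.0, 0.0, 20.0, 0.0, 0.0])
    print(linear_reservoir(precip).round(2))
    ```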

    Optimising cost and availability estimates at the bidding stage of performance-based contracting

    Performance-Based Contracting (PBC), e.g. Contracting for Availability (CfA), has been extensively applied in many industry sectors such as defence, aerospace and railway. Under PBC, complex support activities (e.g. maintenance, training, etc.) are outsourced under mid- to long-term contracting arrangements in order to maintain a certain level of system performance (e.g. availability). However, building robust cost and availability estimates is particularly challenging at the bidding stage because there is a lack of methods and limited availability of data for analysis. Driven by this contextual challenge, this PhD aims to develop a process to simulate and optimise cost and availability estimates at the bidding stage of CfA. The research methodology follows a human-centred design approach, focusing on the end-user stakeholders. Interaction with seven manufacturing organisations involved in the bidding process of CfA made it possible to identify the state of practice and the industry needs, and a review of the literature on PBC and cost estimation identified the research gaps. A simulation model for cost and availability trade-off and estimation (CATECAB) has been developed to support cost engineers during bid preparation. In addition, a multi-objective genetic algorithm (EMOGA) has been developed and combined with CATECAB to build a cost and availability estimation and optimisation model (CAEOCAB). Techniques such as Monte Carlo simulation, bootstrap resampling, multi-regression analysis and genetic algorithms have been applied. This model is able to estimate the optimal investment in the attributes that impact the availability of the systems, according to total contract cost, availability and duration targets. The validation of the models is performed by means of four case studies with twenty-one CfA scenarios in the maritime and air domains. The outcomes indicate a reasonable accuracy for the estimates produced by the models, which has been considered suitable for the early stages of the bidding process.
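
    The CATECAB and CAEOCAB models themselves are not reproduced here; as a minimal sketch of the Monte Carlo ingredient, steady-state availability can be estimated from assumed failure and repair time distributions as uptime / (uptime + downtime). All distributions and parameters below are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000  # number of simulated failure/repair cycles

    # Assumed distributions (illustrative only): time between failures is
    # exponential, repair time is lognormal (both in hours).
    tbf = rng.exponential(scale=500.0, size=n)
    ttr = rng.lognormal(mean=2.0, sigma=0.5, size=n)

    # Steady-state availability = uptime / (uptime + downtime).
    availability = tbf.sum() / (tbf.sum() + ttr.sum())
    print(f"Estimated availability: {availability:.4f}")
    ```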

    PSA 2018

    These preprints were automatically compiled into a PDF from the collection of papers deposited in PhilSci-Archive in conjunction with the PSA 2018 meeting.

    Historical and Conceptual Foundations of Information Physics

    The main objective of this dissertation is to philosophically assess how the use of informational concepts in the field of classical thermostatistical physics has historically evolved from the late 1940s to the present day. I first analyze in depth the main notions that form the conceptual basis on which 'informational physics' historically unfolded, encompassing (i) different entropy, probability and information notions, (ii) their multiple interpretative variations, and (iii) the formal, numerical and semantic-interpretative relationships among them. I then assess the history of informational thermophysics during the second half of the twentieth century. First, I analyse the intellectual factors that gave rise to this current in the late forties (i.e., the popularization of Shannon's theory, interest in a naturalized epistemology of science, etc.), then study its consolidation in the Brillouinian and Jaynesian programs, and finally show how Carnap (1977) and his disciples criticized this tendency within the scientific community. Next, I evaluate how informational physics became a predominant intellectual current in the scientific community in the nineties, made possible by the convergence of Jaynesianism and Brillouinism in proposals such as those of Tribus and McIrvine (1971) or Bekenstein (1973), and by the application of algorithmic information theory to the thermophysical domain. As a sign of its radicality at this historical stage, I explore the main proposals to include information as part of our physical reality, such as Wheeler's (1990), Stonier's (1990) or Landauer's (1991), detailing the main philosophical arguments (e.g., Timpson, 2013; Lombardi et al. 2016a) against those inflationary attitudes towards information. Following this historical assessment, I systematically analyze whether the descriptive exploitation of informational concepts has historically contributed to providing us with knowledge of thermophysical reality via (i) explaining thermal processes such as the approach to equilibrium, (ii) advantageously predicting thermal phenomena, or (iii) enabling understanding of thermal properties such as thermodynamic entropy. I argue that these epistemic shortcomings make it impossible to draw justified ontological conclusions about the physical nature of information. In conclusion, I argue that the historical exploitation of informational concepts has not contributed significantly to the epistemic progress of thermophysics. This would lead to characterizing informational proposals as 'degenerate science' (à la Lakatos 1978a) with respect to classical thermostatistical physics, or as theoretically underdeveloped with respect to the study of the cognitive dynamics of scientists in this physical domain.
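
    For reference, the Landauer (1991) proposal mentioned above is usually summarized by the erasure bound, stated here as a worked figure (standard textbook values, not drawn from the dissertation):

    ```latex
    % Landauer's bound: erasing one bit dissipates at least k_B T ln 2.
    E_{\min} \geq k_B T \ln 2
            \approx (1.38 \times 10^{-23}\,\mathrm{J/K}) \times 300\,\mathrm{K} \times 0.693
            \approx 2.9 \times 10^{-21}\,\mathrm{J}
    \qquad \text{at } T = 300\,\mathrm{K}.
    ```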