
    Steps towards the development of an experimentally verified simulation of pool nucleate boiling on a silicon wafer with artificial sites

    Nucleate boiling is a very effective heat transfer cooling process, used in numerous industrial applications. Despite intensive research over decades, a reliable model of nucleate pool boiling is still not available. This paper presents a numerical and experimental investigation of nucleate boiling from artificial nucleation sites. The numerical investigation described in the first part of the paper is carried out by a hybrid mechanistic numerical code first developed at the University of Ljubljana to simulate the temperature field in a heated stainless steel plate with a large number of nucleation sites during pool boiling of water at atmospheric pressure. It is now being redeveloped to interpret experiments on pool boiling at artificial sites on a silicon plate and as a design tool to investigate different arrangements of sites to achieve high heat fluxes. The code combines full simulation of the temperature field in the solid wall with simplified models or correlations for processes in the liquid-vapour region. The current capabilities and limitations of the code are reviewed and improvements are discussed. Examples are given of the removal of computational constraints on the activation of sites in close proximity and improvements to the bubble growth model. Preliminary simulations are presented to compare the wall conditions to be used in the experiments on silicon at Edinburgh University with the conditions in current experiments on thin metal foils at Ljubljana. An experimental rig for boiling experiments with artificial cavities on a 0.38 mm thick silicon wafer immersed in FC-72, developed at Edinburgh University, is described in the second part of the paper.
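
    The abstract does not spell out the numerical scheme, but the "full simulation of the temperature field in the solid wall" component can be illustrated with a minimal explicit finite-difference sketch in Python; every quantity below (grid, material properties, heater and site source terms) is an illustrative assumption, not a parameter of the Ljubljana code or the Edinburgh experiments.

import numpy as np

# Rough sketch of the "solid wall" part only: explicit finite-difference
# conduction in a thin heated plate, with one nucleation site represented by a
# crude lumped heat sink while it is active.  All values are placeholders.

nx, ny = 60, 60                     # grid points
dx = 1e-4                           # 0.1 mm node spacing (about a 6 mm x 6 mm patch)
alpha = 9.0e-5                      # thermal diffusivity of silicon, m^2/s (approx.)
dt = 0.2 * dx**2 / alpha            # explicit scheme stable for alpha*dt/dx^2 <= 0.25
q_heater = 1.0e3                    # uniform heating source term, K/s (placeholder)
q_site = 5.0e4                      # local heat removal at an active site, K/s (placeholder)

T = np.full((nx, ny), 20.0)         # wall superheat field, K above saturation
site = (nx // 2, ny // 2)           # location of the artificial cavity

for step in range(5000):
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4.0 * T) / dx**2
    T += dt * (alpha * lap + q_heater)
    if T[site] > 5.0:               # stand-in for a simplified bubble-growth model:
        T[site] -= dt * q_site      # an active site removes latent heat locally
    T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = 20.0   # fixed-temperature edges

print("superheat at site: %.1f K, peak superheat: %.1f K" % (T[site], T.max()))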

    Bayesian inference of biochemical kinetic parameters using the linear noise approximation

    Background: Fluorescent and luminescent gene reporters allow us to dynamically quantify changes in molecular species concentration over time at the single-cell level. The mathematical modeling of their interaction through multivariate dynamical models requires the development of effective statistical methods to calibrate such models against available data. Given the prevalence of stochasticity and noise in biochemical systems, inference for stochastic models is of special interest. In this paper we present a simple and computationally efficient algorithm for the estimation of biochemical kinetic parameters from gene reporter data. Results: We use the linear noise approximation to model biochemical reactions through a stochastic dynamic model which essentially approximates a diffusion model by an ordinary differential equation model with an appropriately defined noise process. An explicit formula for the likelihood function can be derived, allowing for computationally efficient parameter estimation. The proposed algorithm is embedded in a Bayesian framework and inference is performed using Markov chain Monte Carlo. Conclusion: The major advantage of the method is that, in contrast to the more established diffusion-approximation-based methods, the computationally costly methods of data augmentation are not necessary. Our approach also allows for unobserved variables and measurement error. The application of the method to both simulated and experimental data shows that the proposed methodology provides a useful alternative to diffusion-approximation-based methods.
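
    A minimal Python sketch of this kind of inference for a one-species birth-death process: the LNA mean and variance ODEs are integrated, a Gaussian likelihood is evaluated, and a random-walk Metropolis sampler explores the posterior. For brevity the sketch treats observations as independent Gaussians rather than using the full joint Gaussian likelihood of the paper, and all rates, noise levels and priors are assumptions for illustration.

import numpy as np

# Toy LNA-based Bayesian inference for a birth-death process:
#   births at constant rate k, deaths at rate g*X.
# The LNA gives ODEs for the macroscopic mean m(t) and variance v(t):
#   dm/dt = k - g*m,    dv/dt = -2*g*v + k + g*m
# Observations are treated here as independent N(m(t), v(t) + s2_obs),
# which is a simplification of the paper's joint Gaussian likelihood.

rng = np.random.default_rng(0)

def lna_moments(k, g, m0, v0, times, dt=0.01):
    """Integrate the LNA mean/variance ODEs with forward Euler."""
    m, v, t = m0, v0, 0.0
    out_m, out_v = [], []
    for t_obs in times:
        while t < t_obs:
            m += dt * (k - g * m)
            v += dt * (-2 * g * v + k + g * m)
            t += dt
        out_m.append(m)
        out_v.append(v)
    return np.array(out_m), np.array(out_v)

def log_lik(k, g, y, times, s2_obs=4.0):
    m, v = lna_moments(k, g, m0=0.0, v0=0.0, times=times)
    var = v + s2_obs
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (y - m) ** 2 / var)

# synthetic "reporter" data generated from the model with k=10, g=0.5
times = np.linspace(0.5, 20.0, 40)
m_true, v_true = lna_moments(10.0, 0.5, 0.0, 0.0, times)
y = rng.normal(m_true, np.sqrt(v_true + 4.0))

# random-walk Metropolis on (log k, log g) with flat priors on the log scale
theta = np.log([5.0, 1.0])
ll = log_lik(*np.exp(theta), y, times)
samples = []
for it in range(5000):
    prop = theta + rng.normal(0.0, 0.1, size=2)
    ll_prop = log_lik(*np.exp(prop), y, times)
    if np.log(rng.uniform()) < ll_prop - ll:
        theta, ll = prop, ll_prop
    samples.append(np.exp(theta))

print("posterior mean (k, g):", np.mean(samples[2500:], axis=0))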

    A Rydberg Quantum Simulator

    Following Feynman, and as elaborated on by Lloyd, a universal quantum simulator (QS) is a controlled quantum device which reproduces the dynamics of any other many-particle quantum system with short-range interactions. These dynamics can refer to both coherent Hamiltonian and dissipative open-system evolution. We investigate how laser-excited Rydberg atoms in large-spacing optical or magnetic lattices can provide an efficient implementation of a universal QS for spin models involving (high-order) n-body interactions. This includes the simulation of Hamiltonians of exotic spin models involving n-particle constraints, such as the Kitaev toric code, color code, and lattice gauge theories with spin liquid phases. In addition, it provides the ingredients for dissipative preparation of entangled states based on engineering n-particle reservoir couplings. The key basic building blocks of our architecture are efficient and high-fidelity n-qubit entangling gates via auxiliary Rydberg atoms, including a possible dissipative time step via optical pumping. This makes it possible to mimic the time evolution of the system by a sequence of fast, parallel and high-fidelity n-particle coherent and dissipative Rydberg gates.
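
    As a toy illustration of the n-body stabilizers such gates are designed to realise, the following Python sketch builds a 4-body toric-code plaquette operator and applies one ideal "stabilizer pumping" step as an explicit projector; this is ordinary dense linear algebra on four qubits and does not model the Rydberg physics itself.

import numpy as np

# 4-body toric-code plaquette term A_p = X x X x X x X on four qubits, plus the
# effect of one ideal dissipative pumping step into its +1 eigenspace.  In the
# proposal this step is realised by an n-qubit Rydberg gate and optical pumping
# of an auxiliary atom, not by an explicit projector as written here.

X = np.array([[0, 1], [1, 0]], dtype=complex)

def kron_all(ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

A_p = kron_all([X, X, X, X])                  # 4-body plaquette operator, 16 x 16

psi = np.zeros(16, dtype=complex)
psi[0] = 1.0                                  # start in the product state |0000>

P_plus = (np.eye(16) + A_p) / 2               # projector onto the +1 eigenspace
psi = P_plus @ psi
psi /= np.linalg.norm(psi)

print("<A_p> after pumping:", np.real(psi.conj() @ A_p @ psi))     # -> 1.0
print("nonzero amplitudes :", np.flatnonzero(np.abs(psi) > 1e-9))  # |0000> and |1111>

    Starting from |0000>, one pumping step leaves the GHZ-like state (|0000> + |1111>)/sqrt(2), a +1 eigenstate of the plaquette operator, which is the kind of entangled state the engineered n-particle reservoir couplings are meant to prepare.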

    Measuring and Modeling Risk Using High-Frequency Data

    Measuring and modeling financial volatility is the key to derivative pricing, asset allocation and risk management. The recent availability of high-frequency data allows for refined methods in this field. In particular, more precise measures of daily or lower-frequency volatility can be obtained by summing over squared high-frequency returns. In turn, this so-called realized volatility can be used for more accurate model evaluation and for describing the dynamic and distributional structure of volatility. Moreover, non-parametric measures of systematic risk are attainable that can straightforwardly be used to model the commonly observed time variation in the betas. The discussion of these new measures and methods is accompanied by an empirical illustration using high-frequency data on IBM stock and the DJIA index.
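
    The realized-volatility and realized-beta measures referred to here reduce to simple sums over intraday returns; the sketch below shows the computation on simulated 5-minute returns (the factor loading, sampling frequency and noise levels are placeholders, not the IBM/DJIA data used in the paper).

import numpy as np

# Realized volatility and realized beta from intraday returns.
# The returns are simulated with a common market factor for illustration only.

rng = np.random.default_rng(1)
n_days, n_intra = 250, 78            # e.g. 78 five-minute returns per trading day

r_mkt = rng.normal(0.0, 0.001, size=(n_days, n_intra))
r_stock = 1.2 * r_mkt + rng.normal(0.0, 0.0015, size=(n_days, n_intra))

# realized variance: sum of squared intraday returns; realized volatility is its square root
rv_mkt = np.sum(r_mkt ** 2, axis=1)
rv_stock = np.sum(r_stock ** 2, axis=1)
realized_vol_stock = np.sqrt(rv_stock)

# realized beta: realized covariance with the market divided by realized market variance
rcov = np.sum(r_stock * r_mkt, axis=1)
realized_beta = rcov / rv_mkt

print("mean daily realized vol (stock):", realized_vol_stock.mean())
print("mean realized beta             :", realized_beta.mean())   # ~1.2 by construction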

    Asymmetry, realised volatility and stock return risk estimates

    In this paper we estimate minimum capital risk requirements for short and long positions with three investment horizons, using the traditional GARCH model and two other GARCH-type models that incorporate the possibility of asymmetric responses of volatility to price changes. We also address the problem that the GARCH model requires extremely high estimated persistence to generate the observed volatility patterns, by including realised volatility as an explanatory variable in the model's variance equation. The results suggest that the inclusion of realised volatility improves the forecasting ability of the GARCH model as well as its ability to calculate accurate minimum capital risk requirements, and makes it quite competitive compared with asymmetric conditional heteroscedastic models such as the GJR and the EGARCH.
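
    A sketch of the variance recursion with lagged realized volatility added as an explanatory variable, plus a crude Gaussian one-step capital requirement for long and short positions; the parameter values and the Gaussian quantile rule are illustrative assumptions, whereas the paper estimates the models and derives minimum capital risk requirements over several horizons.

import numpy as np

# GARCH(1,1) variance recursion augmented with lagged realized variance, and a
# crude VaR-style minimum capital requirement from the one-step-ahead forecast.
# Returns, the realized-variance proxy and all parameters are placeholders.

rng = np.random.default_rng(2)
T = 1000
ret = rng.normal(0.0, 0.01, size=T)          # placeholder daily returns
rv = ret ** 2                                 # placeholder realized-variance proxy

omega, alpha, beta, gamma = 1e-6, 0.05, 0.80, 0.10   # illustrative GARCH-X parameters

sigma2 = np.empty(T)
sigma2[0] = ret.var()
for t in range(1, T):
    sigma2[t] = (omega + alpha * ret[t - 1] ** 2
                 + beta * sigma2[t - 1] + gamma * rv[t - 1])

# one-step-ahead forecast and a simple Gaussian 99% capital requirement,
# expressed as a fraction of the position value, for long and short positions
sigma2_next = omega + alpha * ret[-1] ** 2 + beta * sigma2[-1] + gamma * rv[-1]
z99 = 2.326
mcrr_long = 1.0 - np.exp(-z99 * np.sqrt(sigma2_next))    # loss from a price fall
mcrr_short = np.exp(z99 * np.sqrt(sigma2_next)) - 1.0    # loss from a price rise

print("1-step variance forecast:", sigma2_next)
print("MCRR long / short       :", mcrr_long, mcrr_short)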

    Timing of Intervention Affects Brain Electrical Activity in Children Exposed to Severe Psychosocial Neglect

    Background: Early psychosocial deprivation has profound effects on brain activity in the young child. Previous reports have shown increased power in slow frequencies of the electroencephalogram (EEG), primarily in the theta band, and decreased power in higher alpha and beta band frequencies in infants and children who have experienced institutional care. Methodology/Principal Findings: We assessed the consequences of removing infants from institutions and placing them into a foster care intervention on brain electrical activity when the children were 8 years of age. We found the intervention was successful in increasing high-frequency EEG alpha power, with effects being most pronounced for children placed into foster care before 24 months of age. Conclusions/Significance: The fact that the effects on high-frequency EEG alpha power depend on the age of placement suggests a sensitive period after which brain activity in the face of severe psychosocial deprivation is less amenable to change.
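
    For readers unfamiliar with EEG band power, the sketch below shows one standard way to compute relative theta, alpha and beta power from a single channel using Welch's method; the signal, sampling rate and band edges are assumptions for illustration and are not taken from the study.

import numpy as np
from scipy.signal import welch

# Relative EEG band power from a single channel via Welch's power spectral
# density.  The signal is synthetic noise; band edges follow common conventions.

rng = np.random.default_rng(3)
fs = 250.0                                   # sampling rate, Hz (assumed)
x = rng.normal(size=int(fs * 60))            # one minute of placeholder "EEG"

freqs, psd = welch(x, fs=fs, nperseg=int(fs * 2))

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
broad = (freqs >= 1) & (freqs <= 45)
total = np.trapz(psd[broad], freqs[broad])   # broadband power for normalisation
for name, (lo, hi) in bands.items():
    sel = (freqs >= lo) & (freqs < hi)
    power = np.trapz(psd[sel], freqs[sel])
    print(f"{name:5s} relative power: {power / total:.3f}")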

    New insights into the classification and nomenclature of cortical GABAergic interneurons.

    A systematic classification and accepted nomenclature of neuron types is much needed but is currently lacking. This article describes a possible taxonomical solution for classifying GABAergic interneurons of the cerebral cortex based on a novel, web-based interactive system that allows experts to classify neurons with pre-determined criteria. Using Bayesian analysis and clustering algorithms on the resulting data, we investigated the suitability of several anatomical terms and neuron names for cortical GABAergic interneurons. Moreover, we show that supervised classification models could automatically categorize interneurons in agreement with experts' assignments. These results demonstrate a practical and objective approach to the naming, characterization and classification of neurons based on community consensus.
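
    The "supervised classification" step can be sketched as an ordinary classifier trained to reproduce expert-assigned labels from morphological features; everything below (features, labels, classifier choice) is a placeholder for illustration, not the paper's pipeline.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Train a classifier on morphometric features and measure how well it
# reproduces "expert" labels via cross-validation.  Data are random placeholders.

rng = np.random.default_rng(4)
n_cells, n_features = 200, 12                 # e.g. 12 morphometric descriptors per neuron
X = rng.normal(size=(n_cells, n_features))
y = rng.integers(0, 4, size=n_cells)          # 4 hypothetical interneuron types

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)     # agreement with the "expert" labels
print("cross-validated accuracy:", scores.mean())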