
    Bayesian inference for indirectly observed stochastic processes, applications to epidemic modelling

    Stochastic processes are mathematical objects that offer a probabilistic representation of how some quantities evolve in time. In this thesis we focus on estimating the trajectory and parameters of dynamical systems in cases where only indirect observations of the driving stochastic process are available. We first explored means of using weekly recorded numbers of cases of Influenza to capture how the frequency and nature of contacts made with infected individuals evolved in time. The latter was modelled with diffusions and can be used to quantify the impact of varying drivers of epidemics such as holidays, climate, or prevention interventions. Following this idea, we estimated how the frequency of condom use evolved during the intervention of the Gates Foundation against HIV in India. In this setting, the available estimates of the proportion of individuals infected with HIV were not only indirect but also very scarce observations, leading to specific difficulties. Finally, we developed a methodology for fractional Brownian motions (fBM), here a fractional stochastic volatility model, indirectly observed through market prices. The intractability of the likelihood function, requiring augmentation of the parameter space with the diffusion path, is ubiquitous in this thesis. We aimed for inference methods robust to refinements in time discretisations, made necessary to ensure the accuracy of Euler schemes. The particle marginal Metropolis-Hastings (PMMH) algorithm exhibits this mesh-free property. We propose the use of fast approximate filters as a pre-exploration tool to estimate the shape of the target density, for a quicker and more robust adaptation phase of the asymptotically exact algorithm. The fBM problem could not be treated with the PMMH, which required an alternative methodology based on reparameterisation and advanced Hamiltonian Monte Carlo techniques on the diffusion pathspace, which would also be applicable in the Markovian setting.
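    The Euler discretisations whose refinement the inference must be robust to can be illustrated with a minimal Euler-Maruyama scheme. This is a generic sketch, not the thesis's code; the Ornstein-Uhlenbeck drift and diffusion functions below are chosen purely as an example.

```python
import numpy as np

def euler_maruyama(mu, sigma, x0, T, n_steps, rng):
    """Simulate a scalar diffusion dX_t = mu(X_t) dt + sigma(X_t) dW_t
    on [0, T] with an Euler scheme using n_steps time steps."""
    dt = T / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))          # Brownian increment
        x[i + 1] = x[i] + mu(x[i]) * dt + sigma(x[i]) * dw
    return x

rng = np.random.default_rng(0)
# Illustrative mean-reverting (Ornstein-Uhlenbeck) path
path = euler_maruyama(lambda x: -0.5 * x, lambda x: 0.3, 1.0, 10.0, 1000, rng)
```

    Refining the mesh means increasing n_steps; a mesh-free sampler such as PMMH keeps its mixing properties as n_steps grows.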

    Alkali activation of waste materials: sustainability and innovation in processing traditional ceramics

    Environmental issues linked both to OPC production and to waste management have pushed researchers to find new solutions for the production of more eco-efficient binders. In this frame, alkali-activated materials are receiving increasing attention. They are obtained by reaction of an alkali metal source, generally sodium or potassium, with amorphous calcium-aluminosilicate precursors. More recently, the reuse of mining wastes has also been investigated, owing to the substantial production of sludges and muds which have no practical applications and must be landfilled. The aim of our research was to investigate the use of semi-crystalline/highly crystalline by-products in the production of alkali-activated materials. Thus, two different powders were used: an aluminosilicate mud, composed of quartz, feldspars, biotite and dolomite; and a carbonatic one, composed of calcite and small amounts of dolomite. Both powders were alkali-activated using a solution of NaOH and Na2SiO3. Pastes were produced by mixing the activating solution and the powder in different liquid/solid ratios and by investigating the use of waste glass powder as a further source of amorphous silica. Samples were oven-cured for 24 h at 60-80 °C and then cured in different environments (dry, humid and immersed in water) for a further 27 days before testing physical and mechanical properties. Very promising results were obtained in terms of compressive strength (about 30 MPa for the aluminosilicate sludge and up to 45 MPa for the carbonatic one), showing their potential as innovative building products.

    Development and Characterization of Auto-Locked Laser Systems and Applications to Photon Echo Lifetime Measurements

    We have developed and characterized a new class of vacuum-sealed, auto-locking diode laser systems with an auto-locking controller that allows these instruments to be operated with greater ease and control at desired wavelengths in the visible and near-infrared spectral range. These laser systems can be tuned and frequency stabilized with respect to atomic, molecular, and solid-state resonances without human intervention using a variety of control algorithms programmed into the same controller. We show that these lasers have exceptional long-term stability, with an Allan deviation (ADEV) floor of 2 × 10^{-12}, and a short-term linewidth of 200 kHz. These performance characteristics are related to reducing current noise and ensuring vacuum sealing. We demonstrate accurate measurements of gravitational acceleration at the level of a few parts-per-billion by incorporating the laser into an industrial gravimeter. We also realize the basis of a LIDAR transmitter that can potentially operate in a spectral range in which frequency references are not readily available. We have also developed a technique for precise measurements of atomic lifetimes using optical photon echoes. We report a measurement of 26.10(3) ns for the 5^2P_{3/2} excited state in ^{85}Rb vapour that has a statistical uncertainty of 0.11% in 4 hours of data acquisition. We show that the best statistical uncertainty that can be obtained with the current configuration is 0.013%, which has been exceeded by only one other lifetime measurement. An analysis of the technical limitations based on a simple model shows that these limitations can be overcome using a feedback loop with a reference interferometer. Our studies indicate that it should be possible to investigate systematic effects at the level of 0.03% in 10 minutes of data acquisition. Such an outcome could potentially result in the most accurate measurement of any atomic lifetime.
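    The stability figure quoted above rests on the Allan deviation. A minimal non-overlapping estimator for fractional-frequency data, shown only as a sketch of the statistic (not the instrument's actual analysis code), might be:

```python
import numpy as np

def allan_deviation(y, m):
    """Non-overlapping Allan deviation of fractional-frequency data y
    at averaging factor m (averaging time tau = m * tau0)."""
    n = len(y) // m
    means = y[:n * m].reshape(n, m).mean(axis=1)   # block averages over tau
    diffs = np.diff(means)                          # successive differences
    return np.sqrt(0.5 * np.mean(diffs ** 2))
```

    Plotting this over a range of m values reveals the noise floor: for a stable reference the curve flattens at the ADEV floor before drift takes over.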

    Extending mixed effects models for longitudinal data before and after treatment

    For the analysis of longitudinal biomedical data in which the timing of observations in each patient is irregular and in which there is substantial loss to follow-up, it is important that statistical models adequately describe both the patterns of variation within the data and any relationships between the variable of interest and time, clinical characteristics and response to treatment. We develop novel statistical models motivated by the analysis of pre- and post-treatment CD4 cell counts from HIV-infected patients, using the UK Register of Seroconverters and CASCADE datasets. The addition of stochastic process components, specifically Brownian motion, to standard linear mixed effects models has previously been shown to improve model fit for pre-treatment CD4 cell counts. We review and further develop computational techniques for such models, and also propose the use of a more general ‘fractional Brownian motion’ process in this setting. Residual diagnostic plots for such models, based on a marginal multivariate normal distribution, show very heavy tails, and we address this issue by further extending the model to allow between-patient differences in variability over time. It is known from the literature that response to treatment in HIV patients is dependent on their baseline CD4 level at initiation. In order to further investigate the factors that determine the characteristics of recovery in CD4 counts, we develop a framework for the combined modelling of pre- and post-treatment CD4 cell counts in which key features of the response to treatment for each patient are dependent on a latent variable representing the unobserved ‘true’ baseline value, conditioned on all pre-treatment data for each patient. We further develop the model structure to account for uncertainty in the exact time of seroconversion for each patient, by integration of the log-likelihood function over all possible dates.
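    Replacing Brownian motion with fractional Brownian motion changes only the process covariance inside the marginal multivariate normal. The sketch below builds one patient's marginal covariance under assumed variance components; the parameterisation and names are illustrative, not the thesis model itself.

```python
import numpy as np

def marginal_cov(times, var_int, var_bm, var_err, hurst=0.5):
    """Marginal covariance of one patient's observations under a model
    with a random intercept, a (fractional) Brownian motion process,
    and independent measurement error.  hurst=0.5 recovers standard
    Brownian motion, whose covariance is min(s, t)."""
    t = np.asarray(times, dtype=float)
    s, u = np.meshgrid(t, t)
    h2 = 2.0 * hurst
    # fBM covariance: 0.5 * (s^2H + t^2H - |s - t|^2H)
    bm = 0.5 * (s ** h2 + u ** h2 - np.abs(s - u) ** h2)
    return var_int + var_bm * bm + var_err * np.eye(len(t))
```

    The log-likelihood for one patient is then the multivariate normal density with this covariance, which is what residual diagnostics based on the marginal distribution are checked against.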

    Factors controlling the performance of horizontal flow roughing filters

    PhD Thesis. Horizontal Roughing Filtration (HRF) is a pretreatment method used to remove excess turbidity and suspended solids from surface water fed into Slow Sand Filtration units, as these can only operate satisfactorily when the concentration of suspended solids is below 25 mg/l. A critical review and discussion of current pretreatment methods, HRF research and important filtration variables are presented together with a review of mathematical models of sand and roughing filters based on clarification and trajectory theories. A detailed historical review of head-loss theories, their development and adoption in multimedia filtration is given. I. Preliminary results from studies on a small-scale HRF model suggested that: a laboratory-scale model must be over 1.2 m in length (1.6 m turned out to be acceptable); an outlet chamber should be provided; sampling must be carried out in a two-dimensional field; and intermittent sampling is adequate. One of the main objectives of this research was to identify the important variables affecting HRF among velocity, temperature, particle size, particle density, arrangement of the gravel bed (Coarse-Medium-Fine (LGF), Coarse/Fine-Fine-Coarse (SGF)), and the bed depth. II. Experiments were conducted on a 1.6 m scale filter model, using fractional factorial design to identify the main variables. These were found to be particle size, velocity, and temperature. III. Further runs, using a suspension of kaolin, produced results which, upon analysis for suspended solids, turbidity and particle count, revealed that the efficiency decreases with increasing temperature and velocity and increases with increasing particle size. IV. Concentration curves along the bed enabled the development of the removal rate equation, the definition of the operating parts of the filter at various stages of the filtration, and the detection of density currents. V.
    Efficiency variations with the amounts of accumulated solids were monitored and revealed three main trends: a) constant efficiency; b) gradually decreasing efficiency; c) increasing and then decreasing efficiency. VI. Tracer tests showed the presence of dead zones, and short-circuiting with either increased deposits or temperature. VII. Particle size analysis revealed that: a) the effect of velocity or temperature on the grade efficiency mainly concerns suspended particles smaller than 10 µm and 7 µm for LGF and SGF respectively; for particles of larger diameters, an unknown repulsion phenomenon increasing with temperature rise was observed; b) the main mechanisms responsible for particle removal are sedimentation and hydrodynamic forces. The Algerian Ministry of Higher Education
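    The abstract does not reproduce the removal rate equation it develops. As an illustrative stand-in only, the classic first-order (Iwasaki-type) form assumed in much of the filtration literature predicts exponential decay of suspended-solids concentration along the bed:

```python
import numpy as np

def concentration_profile(c0, lam, x):
    """First-order (Iwasaki-type) removal along the filter bed:
    dC/dx = -lam * C, so C(x) = c0 * exp(-lam * x).
    lam is the filter coefficient (1/m), x the depth into the bed (m)."""
    return c0 * np.exp(-lam * np.asarray(x, dtype=float))

def overall_efficiency(lam, length):
    """Fraction of suspended solids removed over a bed of given length."""
    return 1.0 - np.exp(-lam * length)
```

    Fitting lam to measured concentration curves at successive depths is one simple way to quantify how efficiency varies with velocity, temperature and particle size.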

    Change-point Problem and Regression: An Annotated Bibliography

    The problems of identifying changes at unknown times and of estimating the location of changes in stochastic processes are referred to as the change-point problem or, in the Eastern literature, as ‘disorder’. The change-point problem, first introduced in the quality control context, has since developed into a fundamental problem in the areas of statistical control theory, stationarity of a stochastic process, estimation of the current position of a time series, testing and estimation of change in the patterns of a regression model, and most recently in the comparison and matching of DNA sequences in microarray data analysis. Numerous methodological approaches have been implemented in examining change-point models. Maximum-likelihood estimation, Bayesian estimation, isotonic regression, piecewise regression, quasi-likelihood and non-parametric regression are among the methods which have been applied to resolving challenges in change-point problems. Grid-searching approaches have also been used to examine the change-point problem. Statistical analysis of change-point problems depends on the method of data collection. If the data collection is ongoing until some random time, then the appropriate statistical procedure is called sequential. If, however, a large finite set of data is collected with the purpose of determining if at least one change-point occurred, then this may be referred to as non-sequential. Not surprisingly, both the former and the latter have a rich literature with much of the earlier work focusing on sequential methods inspired by applications in quality control for industrial processes. In the regression literature, the change-point model is also referred to as two- or multiple-phase regression, switching regression, segmented regression, two-stage least squares (Shaban, 1980), or broken-line regression. The area of the change-point problem has been the subject of intensive research in the past half-century.
    The subject has evolved considerably and found applications in many different areas. It seems rather impossible to summarize all of the research carried out over the past 50 years on the change-point problem. We have therefore confined ourselves to those articles on change-point problems which pertain to regression. The important branch of sequential procedures in change-point problems has been left out entirely. We refer the readers to the seminal review papers by Lai (1995, 2001). The so-called structural change models, which occupy a considerable portion of the research in the area of change-point, particularly among econometricians, have not been fully considered. We refer the reader to Perron (2005) for an updated review in this area. Articles on change-point in time series are considered only if the methodologies presented in the paper pertain to regression analysis.
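    The grid-searching approach mentioned above can be made concrete with a toy case: a single mean shift in a non-sequential sample, located by minimising the residual sum of squares over all candidate split points. This is a deliberately minimal illustration, not any particular method from the bibliography.

```python
import numpy as np

def grid_changepoint(y):
    """Least-squares grid search for a single mean shift: for each
    candidate split k, fit a constant mean to y[:k] and y[k:] and keep
    the split minimising the total residual sum of squares."""
    y = np.asarray(y, dtype=float)
    best_k, best_rss = None, np.inf
    for k in range(1, len(y)):
        rss = (((y[:k] - y[:k].mean()) ** 2).sum()
               + ((y[k:] - y[k:].mean()) ** 2).sum())
        if rss < best_rss:
            best_k, best_rss = k, rss
    return best_k, best_rss
```

    The same exhaustive-search idea extends to two-phase (broken-line) regression by fitting a separate regression line to each segment instead of a constant mean.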

    Quantitative Mapping of Lung Ventilation Using Hyperpolarized Gas Magnetic Resonance Imaging

    The main objective of this project was to develop and implement techniques for high-resolution quantitative imaging of ventilation in lungs using hyperpolarized gas magnetic resonance imaging (MRI). Pulmonary ventilation is an important aspect of lung function and is frequently compromised, through several different mechanisms and to varying degrees, in the presence of certain lung conditions, such as chronic obstructive pulmonary disease. The primary focus of this development is on large mammalian species as a stepping stone towards translation to human subjects. The key deliverables of this project are a device for real-time mixing and delivery of hyperpolarized gases such as 3He and 129Xe in combination with O2, an MRI acquisition scheme for practical imaging of ventilation signal build-up in the lungs, and a robust mathematical model for estimation of regional fractional ventilation values at a high resolution. A theoretical framework for fractional gas replacement in the lungs is presented to describe MRI signal dynamics during continuous breathing of a mixture of hyperpolarized gases in the presence of several depolarization mechanisms. A hybrid ventilation and imaging acquisition scheme is proposed to acquire a series of images during short end-inspiratory breath-holds over several breaths. The sensitivity of the estimation algorithm is assessed with respect to noise, model uncertainty and acquisition parameters, and subsequently an optimal set of acquisition parameters is proposed to minimize the fractional ventilation estimation error. This framework is then augmented by an undersampled parallel MRI scheme to accelerate image acquisition to enable fractional ventilation imaging over the entire lung volume in a single pass. The image undersampling was also leveraged to minimize the coupling associated with signal buildup in the airways and the irreversible effect of RF pulses.
    The proposed technique was successfully implemented in pigs under mechanical ventilation, and preliminary measurements were performed in an adult human subject under voluntary breathing.
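    The fractional-gas-replacement idea can be caricatured with a toy recursion: each breath replaces a fraction r of the regional gas with fresh hyperpolarized gas, while the signal carried over from previous breaths is attenuated by depolarization. The thesis model is more detailed; the lumped decay and source terms below are illustrative assumptions only.

```python
def signal_buildup(r, n_breaths, decay=1.0, source=1.0):
    """Toy fractional-gas-replacement recursion: each breath replaces a
    fraction r of the regional gas with fresh gas of signal `source`,
    while the retained gas keeps a factor `decay` of its signal (RF and
    T1/O2 depolarization lumped into one number).  Returns the signal
    after each of n_breaths breaths."""
    s, out = 0.0, []
    for _ in range(n_breaths):
        s = (1.0 - r) * decay * s + r * source
        out.append(s)
    return out
```

    Fitting r voxel-by-voxel to the measured build-up curve is the essence of regional fractional ventilation estimation: well-ventilated regions (r near 1) saturate in one breath, poorly ventilated ones build up slowly.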

    A Study of the Use of Process Simulation and Pilot-Scale Verification Trials for the Design of Bioprocesses

    This thesis examines the use of process simulation tools and pilot-scale verification trials for the design of efficient bioprocesses. The use of process simulation tools requires the development of predictive, robust unit operation models, where the models are used for the calculation of mass and energy balances, and ultimately economic analysis and optimisation. Verification trials are employed to assess how the model compares to reality. Models describing key unit operations such as protein precipitation and centrifugation are often very simplistic and do not take into account the added complications that biological materials present, such as hindered settling at high solids concentrations in centrifuges and susceptibility to shear forces. The generation of useful engineering models, testing by comparison with real process data and their use in design are covered in this work. Two models have been developed in this thesis: batch protein precipitation and disc-stack centrifugation. The batch protein precipitation model calculates the enzyme and total protein solubilities upon precipitant addition, together with the precipitate phase particle size distribution, and enables the effects of precipitant concentration and batch ageing conditions to be predicted. A mass and activity balance is then completed around the unit operation. The disc-stack centrifuge model is capable of predicting the separation of a range of biological materials including whole yeast cell, cell debris and shear-sensitive precipitate particle suspensions. A centrifuge feedzone breakage model has also been developed, which accounts for the shear breakage of precipitate particle suspensions that occurs in the feedzone of the centrifuge. The capacity to predict the much finer particle size distribution which enters the active disc stack where particle separation occurs enables accurate predictions of separation performance to be made.
    The centrifuge model also enables mass and activity balances to be completed around the unit operation. The models have been linked together so that they predict mass balances around a complete process sequence for the isolation of an intracellular yeast enzyme. Pilot-scale process verification trials have been conducted for the process sequence. The simulations and experimental verification trials provide total protein concentration and ADH activity data for all streams throughout the process. In the small-scale trials DNA and cell debris concentration were also measured and simulated. Results show that the simulated results follow the trends of the experimental data extremely well. The utility of verification trials in indicating where further modelling is required, such as the centrifuge feedzone, is demonstrated. Process perturbation trials have been used to show that the models can be used outside their normal operating conditions. The models developed for the yeast ADH test bed have also been tested on a process for the isolation of β-galactosidase from Escherichia coli. Results have shown that only limited experimental data is required to calculate the parameters used in the models to effect an accurate simulation. This thesis demonstrates the use of a combination of modelling and experimental verification trials for the design of bioprocesses. Recommendations for future work are made in the areas of further model improvement and development of other unit operation models, investigation of simulation frameworks, incorporation of on-line control techniques and the use of "what-if" studies.
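    The abstract does not give the centrifuge model's equations. As an illustrative baseline only, the standard Sigma-theory quantities used throughout disc-stack centrifugation design (Stokes settling velocity and the equivalent settling area) can be sketched as follows; these are textbook relations, not the thesis's breakage-aware model.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def stokes_velocity(d, rho_p, rho_l, mu):
    """Stokes settling velocity (m/s) of a particle of diameter d (m)
    with density rho_p in a liquid of density rho_l and viscosity mu."""
    return (rho_p - rho_l) * G * d ** 2 / (18.0 * mu)

def disc_stack_sigma(n_discs, omega, r_outer, r_inner, theta):
    """Equivalent settling area Sigma (m^2) of a disc-stack centrifuge
    with n_discs discs, angular speed omega (rad/s), disc radii
    r_outer/r_inner (m) and half-cone angle theta (rad)."""
    return (2.0 * math.pi * n_discs * omega ** 2
            * (r_outer ** 3 - r_inner ** 3)) / (3.0 * G * math.tan(theta))
```

    In Sigma theory, particles whose settling velocity satisfies v_g * Sigma >= Q (the feed flowrate) are fully recovered; a feedzone breakage model matters precisely because shear shifts the particle size distribution, and hence v_g, downwards before the suspension reaches the disc stack.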

    A study of the ionic diffusion under the effect of electric field (computer simulation with reference to biological membrane)

    Biophysical studies of biological systems are far from conclusive, not only because this science is relatively recent, but also because of the lack of physical data; there are also many contradicting views among researchers, as well as poor theoretical interpretation of the reported experimental data. However, the advent of computer science, with its considerable storage capacity and computational speed, gives modelling techniques a great advantage and opens a real door to better understanding of complicated biological phenomena. The present thesis addressed the problem of ionic penetration through biological tissue under the effect of an external electric field (DC and AC). This was done by studying the diffusion coefficient D as an indicating parameter for such effects. The work was based on stochastic computer simulation of the problem, such that the tissue was considered as a matrix containing the elements under study. The size of the matrix was up to 30,000 x 30,000. A two-dimensional honeycomb cellular pattern was simulated, allowing a maximum of six possible element-to-element communications. The diffusants were allowed to diffuse under different electric field strengths in DC forward and opposite directions, and under AC fields of different frequencies. The effects of vacancy concentration and annealing time were tested in the absence of an electric field. Two different vacancy concentrations were studied under the effect of the electric field. First, 90% of the tissue was vacant and subjected to DC and AC fields as well as zero field. Second, 50% of the tissue was vacant and investigated under similar conditions. The results showed that for the 90% case, the penetration increased with increasing electric field strength, while in the 50% case the penetration increased with increasing current up to a point at which the diffusion was hindered.
    The DC results for forward current were compared to those for backward direct current, and the results showed that the backward direction hindered diffusion. The effect of alternating current showed that penetration was inversely proportional to the frequency, which agrees with the literature. Comparisons of the effects of sinusoidal and square waves were illustrated: square waves produced higher ionic penetration and diffusion coefficient values than sinusoidal ones. As the frequency of the alternating current was decreased, its effect on diffusion became closer to that of direct current. Despite the fact that the results obtained by simulation are in essence virtual and based on arbitrary units, the effects were clear and indicative.
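    The kind of lattice simulation described above can be illustrated with a toy version: a square-lattice random walk (rather than the thesis's honeycomb lattice) with an additive drift standing in crudely for a DC field, from which the diffusion coefficient is estimated via the mean squared displacement. All names and parameters here are illustrative.

```python
import numpy as np

def random_walk_msd(n_walkers, n_steps, bias=0.0, rng=None):
    """2D square-lattice random walk: each step moves each walker one
    unit along a random axis; `bias` adds a drift along +x as a crude
    stand-in for a DC field.  Returns the mean squared displacement
    (averaged over walkers) after each step."""
    rng = rng or np.random.default_rng(0)
    steps = rng.integers(0, 4, size=(n_steps, n_walkers))  # 0:+x 1:-x 2:+y 3:-y
    dx = np.where(steps == 0, 1, np.where(steps == 1, -1, 0)).astype(float)
    dy = np.where(steps == 2, 1, np.where(steps == 3, -1, 0)).astype(float)
    dx += bias                                   # field-induced drift
    x, y = dx.cumsum(axis=0), dy.cumsum(axis=0)  # trajectories
    return (x ** 2 + y ** 2).mean(axis=1)        # MSD per step

# In 2D, MSD = 4*D*t for free diffusion, so D ~ msd[-1] / (4 * n_steps).
msd = random_walk_msd(2000, 50)
```

    Blocking a fraction of lattice sites (the occupied, non-vacant cells) and rejecting moves into them would reproduce the vacancy-concentration effect studied in the thesis.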