
    Optical microsensor for counting food substance particles in lab-on-a-chips

    Integrated optical detection is considered an important operation in lab-on-a-chip devices. This paper presents an optical fiber-based microsensor capable of detecting food substance particles in a lab-on-a-chip. The system consists of a microcontroller and associated circuitry, a laser emitter, a laser receiver, fiber optic cables, a microfluidic chip, and the food substance samples to be tested. When particles flow through the microfluidic channel in the chip, the receiver’s output voltage varies as the particles block the passage of the laser beam. The changes in the collected signals are analyzed to count the number of particles. Experiments are conducted on several food substance samples, including talcum powder, ground ginger, and soy sauce. The experimental results are presented and discussed.
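
    A minimal sketch of the counting step described above: each particle transit produces a transient dip in the receiver voltage, so counting particles reduces to counting falling-edge threshold crossings. The threshold value and the sample trace are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def count_particles(voltage, threshold):
    """Count falling-edge crossings of `threshold` in a receiver voltage trace."""
    below = voltage < threshold            # True while a particle blocks the beam
    edges = np.diff(below.astype(int))     # +1 marks the start of each dip
    return int(np.sum(edges == 1))

# Example: a ~5 V baseline with three dips caused by passing particles.
trace = np.array([5.0, 5.0, 3.2, 5.0, 5.1, 2.9, 3.0, 5.0, 3.5, 5.0])
print(count_particles(trace, threshold=4.0))  # -> 3
```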

    Adaptive Step Size for Hybrid Monte Carlo Algorithm

    We implement an adaptive step size method for the Hybrid Monte Carlo algorithm. The adaptive step size is given by solving a symmetric error equation, and an integrator with such an adaptive step size is reversible. Although we observe appreciable variations of the step size, the overhead of the method exceeds its benefits. We propose an explanation for this phenomenon.
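
    For context, a minimal sketch of the baseline algorithm being modified: a standard Hybrid (Hamiltonian) Monte Carlo update with a fixed leapfrog step size. The paper's adaptive step size, obtained by solving a symmetric error equation so that the integrator stays reversible, is not reproduced here.

```python
import numpy as np

def hmc_step(q, log_prob_grad, log_prob, eps=0.1, n_steps=20, rng=np.random):
    """One HMC update with a fixed leapfrog step size `eps`."""
    p = rng.standard_normal(q.shape)              # fresh Gaussian momentum
    q_new, p_new = q.copy(), p.copy()
    p_new += 0.5 * eps * log_prob_grad(q_new)     # half kick
    for _ in range(n_steps - 1):
        q_new += eps * p_new                      # drift
        p_new += eps * log_prob_grad(q_new)       # full kick
    q_new += eps * p_new                          # final drift
    p_new += 0.5 * eps * log_prob_grad(q_new)     # final half kick
    # Metropolis accept/reject corrects the integration error exactly.
    dH = (log_prob(q) - log_prob(q_new)) + 0.5 * (p_new @ p_new - p @ p)
    return q_new if np.log(rng.uniform()) < -dH else q

# Example: sample from a standard 2-D Gaussian (log p = -q.q/2).
q = np.zeros(2)
for _ in range(100):
    q = hmc_step(q, log_prob_grad=lambda x: -x, log_prob=lambda x: -0.5 * x @ x)
```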

    Small area estimation strategy for the 2011 Census in England and Wales

    The use of model-based small area estimation for adjusting census results in the UK was first introduced in the 2001 Census. The aim was to obtain local-level population estimates by age-sex group, adjusted for the level of undercount, by combining results from the Census and the Census Coverage Survey. A similar approach was adopted for the 2011 Census but with new features, and this paper describes the work carried out to arrive at the chosen small area strategy. Simulation studies are used to investigate three proposed small area estimation methods: a local fixed effects model (the 2001 Census approach), a direct estimator, and a synthetic estimator. The results indicate that both the synthetic and the local fixed effects models constitute good options for producing accurate and reliable local authority population estimates. A proposal is made to implement a small area estimation procedure that accommodates both the synthetic and local fixed effects models, as in some selected areas with differing local authority under-coverage rates a local fixed effects model may perform best. We examine this strategy under real census conditions based on the final results from the 2011 Census.
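
    A toy illustration of the synthetic-estimator idea discussed above: coverage rates estimated from the Census Coverage Survey at a broad level (here, by age-sex group) are applied uniformly to each local authority's census count. All numbers are invented, and the actual 2011 estimation models are considerably more elaborate.

```python
# Observed census count per (local authority, age-sex group); invented data.
census_counts = {
    ("LA1", "M20-24"): 4800, ("LA1", "F20-24"): 5100,
    ("LA2", "M20-24"): 2900, ("LA2", "F20-24"): 3050,
}
# Assumed CCS-based coverage rates, estimated at the age-sex-group level only.
coverage_rate = {"M20-24": 0.91, "F20-24": 0.94}

# Synthetic estimate: inflate each local count by the group-level coverage.
synthetic_estimate = {
    key: count / coverage_rate[key[1]]
    for key, count in census_counts.items()
}
for key, est in synthetic_estimate.items():
    print(key, round(est))
```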

    Insensitivity of alkenone carbon isotopes to atmospheric CO2 at low to moderate CO2 levels

    Atmospheric pCO2 is a critical component of the global carbon system and is considered to be the major control of Earth’s past, present and future climate. Accurate and precise reconstructions of its concentration through geological time are, therefore, crucial to our understanding of the Earth system. Ice core records document pCO2 for the past 800 kyr, but at no point during this interval were CO2 levels higher than today. Interpretation of older pCO2 has been hampered by discrepancies during some time intervals between two of the main ocean-based proxy methods used to reconstruct pCO2: the carbon isotope fractionation that occurs during photosynthesis, as recorded by haptophyte biomarkers (alkenones), and the boron isotope composition (δ11B) of foraminifer shells. Here we present alkenone- and δ11B-based pCO2 reconstructions generated from the same samples from the Plio-Pleistocene at ODP Site 999 across a glacial-interglacial cycle. We find a muted response to pCO2 in the alkenone record compared to contemporaneous ice core and δ11B records, possibly caused by the physiology of CO2 uptake in the haptophytes. Our new understanding resolves some of the inconsistencies between the proxies and highlights that caution may be required when interpreting alkenone-based reconstructions at low pCO2 levels.
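
    For context, the standard diffusive fractionation model that underlies alkenone-based pCO2 reconstructions; it is not derived in this abstract and is stated here only to clarify what a physiological, rather than purely diffusive, control on CO2 uptake would undermine.

```latex
% Diffusive model for photosynthetic carbon isotope fractionation
% recorded by alkenones (standard proxy framework, not from this paper):
\[
  \varepsilon_p \;=\; \varepsilon_f \;-\; \frac{b}{[\mathrm{CO_2(aq)}]}
\]
% \varepsilon_f : fractionation during carbon fixation
% b             : lumped physiological term (growth rate, cell geometry)
% Under pure diffusion, \varepsilon_p depends strongly on [CO_2(aq)] when
% CO2 is low; the muted response reported above therefore points to active,
% physiologically controlled carbon uptake rather than passive diffusion.
```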

    Development of Decision Support Systems for Estimating Salinity Intrusion Effects due to Climate Change on the South Carolina and Georgia Coast

    2010 S.C. Water Resources Conference - Science and Policy Challenges for a Sustainable Future

    The impact of structural error on parameter constraint in a climate model

    Uncertainty in the simulation of the carbon cycle contributes significantly to uncertainty in projections of future climate change. We use observations of forest fraction to constrain carbon cycle and land surface input parameters of the global climate model FAMOUS, in the presence of an uncertain structural error. Using an ensemble of climate model runs to build a computationally cheap statistical proxy (emulator) of the climate model, we use history matching to rule out input parameter settings where the corresponding climate model output is judged sufficiently different from observations, even allowing for uncertainty. Regions of parameter space where FAMOUS best simulates the Amazon forest fraction are incompatible with the regions where FAMOUS best simulates other forests, indicating a structural error in the model. We use the emulator to simulate the forest fraction at the best set of parameters implied by matching the model to the Amazon, Central African, South East Asian, and North American forests in turn. We can find parameters that lead to a realistic forest fraction in the Amazon, but find that using the Amazon alone to tune the simulator would result in a significant overestimate of forest fraction in the other forests. Conversely, using the other forests to tune the simulator leads to a larger underestimate of the Amazon forest fraction. We use sensitivity analysis to find the parameters which have the most impact on simulator output and perform a history-matching exercise using credible estimates for simulator discrepancy and observational uncertainty terms. We are unable to constrain the parameters individually, but we rule out just under half of joint parameter space as being incompatible with forest observations. We discuss the possible sources of the discrepancy in the simulated Amazon, including missing processes in the land surface component and a bias in the climatology of the Amazon. This work was supported by the Joint UK BEIS/Defra Met Office Hadley Centre Climate Programme (GA01101). Doug McNeall was supported on secondment to Exeter University by the Met Office Academic Partnership (MOAP) for part of the work. Jonny Williams was supported by funding from Statoil ASA, Norway.
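
    A minimal sketch of the history-matching step described above, assuming the usual implausibility measure: a candidate parameter setting is ruled out when the emulator's prediction is too far from the observation once emulator, observational, and discrepancy uncertainties are accounted for. Variable names and numbers here are generic placeholders; the paper applies this to an emulator of FAMOUS forest fraction.

```python
import numpy as np

def implausibility(z, emulator_mean, emulator_var, obs_var, disc_var):
    """|observation - emulator mean|, standardised by all uncertainty terms."""
    return np.abs(z - emulator_mean) / np.sqrt(emulator_var + obs_var + disc_var)

# Candidate settings are ruled out when implausibility exceeds a cutoff
# (conventionally 3, by analogy with a three-sigma criterion).
I = implausibility(z=0.8, emulator_mean=0.45, emulator_var=0.002,
                   obs_var=0.003, disc_var=0.005)
print(I, I > 3.0)   # -> 3.5 True: this setting would be ruled out
```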

    Hadron Mass Predictions of the Valence Approximation to Lattice QCD

    We evaluate the infinite volume, continuum limits of eight hadron mass ratios predicted by lattice QCD with Wilson quarks in the valence (quenched) approximation. Each predicted ratio differs from the corresponding observed value by less than 6%.

    Graphical Reasoning in Compact Closed Categories for Quantum Computation

    Compact closed categories provide a foundational formalism for a variety of important domains, including quantum computation. These categories have a natural visualisation as a form of graphs. We present a formalism for equational reasoning about such graphs and develop this into a generic proof system with a fixed logical kernel for equational reasoning about compact closed categories. Automating this reasoning process is motivated by the slow and error-prone nature of manual graph manipulation. A salient feature of our system is that it provides a formal and declarative account of derived results that can include 'ellipses'-style notation. We illustrate the framework by instantiating it for a graphical language of quantum computation and show how this can be used to perform symbolic computation.

    Glueball calculations in large-N_c gauge theory

    We use the light-front Hamiltonian of transverse lattice gauge theory to compute from first principles the glueball spectrum and light-front wavefunctions in the leading order of the 1/N_c colour expansion. We find 0^{++}, 2^{++}, and 1^{+-} glueballs having masses consistent with N_c=3 data available from Euclidean lattice path integral methods. The wavefunctions exhibit a light-front constituent gluon structure.

    Psychometric Characteristics of the Connor-Davidson Resilience Scale (CD-RISC) in Postpartum Mothers with Histories of Childhood Maltreatment

    Background: There is increased awareness that resilience serves as a protective factor against adverse psychophysiological sequelae in the context of stress. However, there are few instruments to assess this construct in adult populations. The Connor-Davidson Resilience Scale (CD-RISC) has been developed to assess adaptation following stress exposure. While this instrument has previously demonstrated impressive reliability and construct validity, prior research has not supported the consistency of the originally described factor structure. There is also limited evidence regarding the measurement of resilience in the context of cumulative stress exposure. Objectives: This research explores the psychometric properties of the CD-RISC in mothers with childhood histories of maltreatment. Materials and Methods: Postpartum women who endorsed a history of childhood abuse or neglect (N = 141) completed the CD-RISC, the Childhood Trauma Questionnaire, and other surveys measuring positive and negative health and functioning. We calculated descriptive statistics with percentage counts and means as appropriate. Internal reliability was evaluated by Cronbach’s alpha and the calculation of item-to-total score correlations. Parallel analysis (PA) was utilized to derive the number of retained factors. Results: A recent parenting transition concomitant with a history of maltreatment was associated with lower CD-RISC scores. Internal reliability and concurrent validity analyses were satisfactory and consistent with predicted hypotheses. Exploratory factor analysis (EFA) supported a four-factor model of resilience in this population. Conclusions: This research offers further evidence of the reliability and validity of the CD-RISC. Further, the results of the EFA with parallel analysis offer an empirically driven derivation of factors for this population.
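
    A minimal sketch of the internal-reliability statistic reported above. For k items, Cronbach's alpha is k/(k-1) times (1 minus the sum of item variances over the variance of the total score). The response data below are invented Likert-style scores, not from the study.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total score
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Four respondents answering four items on a 1-5 scale (invented data).
responses = np.array([[3, 4, 3, 4], [2, 2, 3, 2], [4, 4, 4, 5], [1, 2, 1, 2]])
print(round(cronbach_alpha(responses), 3))
```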