
    Estimation of the basic reproductive number and mean serial interval of a novel pathogen in a small, well-observed discrete population

    BACKGROUND: Accurately assessing the transmissibility and serial interval of a novel human pathogen is a public health priority, so that the timing and required strength of interventions may be determined. Recent theoretical work has focused on making best use of data from the initial exponential phase of growth of incidence in large populations. METHODS: We measured generational transmissibility by the basic reproductive number R0 and the serial interval by its mean Tg. First, we constructed a simulation algorithm for case data arising from a small population of known size with R0 and Tg also known. We then developed an inferential model for the likelihood of these case data as a function of R0 and Tg. The model was designed to capture a) any signal of the serial interval distribution in the initial stochastic phase, b) the growth rate of the exponential phase, and c) the unique combination of R0 and Tg that generates a specific shape of peak incidence when the susceptible portion of a small population is depleted. FINDINGS: Extensive repeat simulation and parameter estimation revealed no bias in univariate estimates of either R0 or Tg. We were also able to estimate both R0 and Tg simultaneously. However, accurate final estimates could be obtained only much later in the outbreak. In particular, estimates of Tg were considerably less accurate in the bivariate case until the peak of incidence had passed. CONCLUSIONS: The basic reproductive number and mean serial interval can be estimated simultaneously in real time during an outbreak of an emerging pathogen. Repeated application of these methods to small-scale outbreaks at the start of an epidemic would permit accurate estimates of key parameters.
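    The inferential framework described above depends on simulating case data from a small closed population with known R0 and Tg. As an illustration only, the sketch below generates daily incidence with a simple discrete-time stochastic SIR model; the parameterisation (binomial infection and recovery draws, population size N) is an assumption for illustration and is not the authors' published algorithm.

```python
import numpy as np

def simulate_outbreak(N=100, R0=2.0, Tg=3.0, max_days=200, seed=None):
    """Discrete-time stochastic SIR sketch in a closed population of size N.

    Recovery probability is 1/Tg per day, so the mean generation interval is
    roughly Tg; per-contact transmission is scaled so each case causes on
    average R0 secondary infections in a fully susceptible population.
    """
    rng = np.random.default_rng(seed)
    gamma = 1.0 / Tg              # daily recovery probability
    beta = R0 * gamma / N         # daily per-pair transmission probability
    S, I = N - 1, 1
    incidence = [1]
    for _ in range(max_days):
        if I == 0:
            break
        p_inf = 1.0 - (1.0 - beta) ** I   # prob. a susceptible is infected today
        new_cases = rng.binomial(S, p_inf)
        recoveries = rng.binomial(I, gamma)
        S -= new_cases
        I += new_cases - recoveries
        incidence.append(new_cases)
    return incidence

# Example: daily case counts for one simulated outbreak in a population of 100
print(simulate_outbreak(N=100, R0=2.0, Tg=3.0, seed=1))
```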

    Assessment of Prediction Bias in Crown Biomass Equations for Important Conifer Species of the Inland Northwest

    Several crown biomass equations have been developed for local, regional, and national scale biomass estimation across the United States. The prediction equations most commonly used were developed by Brown (1978) and Jenkins et al. (2003). Because of the widespread application of these equations for managerial and scientific use in the inland northwest, USA, crown mass data for several important conifer species were collected and used to examine the direction and magnitude of the bias associated with predictions from the diameter-based equations of Brown (1978) and Jenkins et al. (2003). A total of 140 trees of 4 conifer species were sampled, yielding 725 individual unbiased estimates of total crown mass. Regression analyses were run on the differences between the crown mass estimates and the Brown (1978) and Jenkins et al. (2003) equation predictions to determine whether any bias was present. The regression analyses indicated that bias was present in both equation sets. Brown’s (1978) equations were found to over-predict the crown mass of ponderosa pine (Pinus ponderosa) and western larch (Larix occidentalis), and to under-predict the crown mass of Douglas-fir (Pseudotsuga menziesii) and lodgepole pine (Pinus contorta). Further, the magnitude of the bias was found to increase with diameter at breast height (DBH) for all species except western larch. The Jenkins et al. (2003) equations were found to over-predict the crown mass of Douglas-fir and western larch, but to under-predict for ponderosa pine, while no significant bias existed for lodgepole pine. Again, the magnitude of bias was found to increase with DBH. Bias correction models are presented which, if used within the inland northwest, could potentially increase the accuracy of these equations.
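    As an illustration of the bias assessment described above, the sketch below regresses the difference between observed and predicted crown mass on DBH; the arrays and the use of scipy.stats.linregress are assumptions for illustration, not the study's data or exact regression procedure.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical values: field-measured crown mass (kg), an equation's prediction
# (kg), and DBH (cm) for one species. These are placeholders, not study data.
observed = np.array([55.0, 80.0, 120.0, 160.0, 210.0])
predicted = np.array([60.0, 90.0, 135.0, 180.0, 240.0])
dbh = np.array([20.0, 25.0, 30.0, 35.0, 40.0])

# Bias = observed minus predicted; regress it on DBH to see whether the error
# is non-zero on average and whether it grows with tree size.
bias = observed - predicted
fit = linregress(dbh, bias)
print(f"slope = {fit.slope:.2f} kg/cm (p = {fit.pvalue:.3f}), "
      f"intercept = {fit.intercept:.2f} kg")
# A significantly negative slope would indicate over-prediction that grows
# with DBH, the pattern reported for some species in the study.
```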

    Predicting CYP3A-mediated midazolam metabolism in critically ill neonates, infants, children and adults with inflammation and organ failure

    Aims: Inflammation and organ failure have been reported to have an impact on cytochrome P450 (CYP) 3A-mediated clearance of midazolam in critically ill children. Our aim was to evaluate a previously developed population pharmacokinetic model both in critically ill children and in other populations, in order to allow the model to be used to guide dosing in clinical practice. Methods: The model was evaluated externally in 136 individuals, including (pre)term neonates, infants, children and adults (body weight 0.77-90 kg, C-reactive protein level 0.1-341 mg l-1 and 0-4 failing organs) using graphical and numerical diagnostics. Results: The pharmacokinetic model predicted midazolam clearance and plasma concentrations without bias in postoperative or critically ill paediatric patients and term neonates, but not in preterm neonates [median prediction error (MPE) 180%]. Conclusion: The recently published pharmacokinetic model for midazolam, quantifying the influence of maturation, inflammation and organ failure in children, yields unbiased clearance predictions and can therefore be used for dosing instructions in term neonates, children and adults with varying levels of critical illness, including healthy adults, but not for extrapolation to preterm neonates.
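    A minimal sketch of how a median prediction error could be computed when evaluating such a model is shown below; the percentage-error definition and the example concentrations are assumptions for illustration, since the abstract does not give the study's exact diagnostic formulas.

```python
import numpy as np

def median_prediction_error(observed, predicted):
    """Median prediction error (MPE) as a percentage.

    One common definition: median of (predicted - observed) / observed * 100.
    The definition used in the study may differ; this is an assumption.
    """
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.median((predicted - observed) / observed * 100.0))

# Hypothetical midazolam plasma concentrations (ng/ml): measured vs. predicted
measured = [120.0, 80.0, 200.0, 150.0]
model = [110.0, 95.0, 210.0, 140.0]
print(f"MPE = {median_prediction_error(measured, model):.1f}%")
```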

    Design and Implementation of Smart Sensors with Capabilities of Process Fault Detection and Variable Prediction

    A typical sensor consists of a sensing element and a transmitter. The major functions of a transmitter are limited to data acquisition and communication. Recently developed transmitters with ‘smart’ functions have focused on easy setup and maintenance of the transmitter itself, such as self-calibration and self-configuration. Recognizing the growing computational capabilities of the microcontroller units (MCUs) used in these transmitters and their underutilized computational resources, this thesis investigates the feasibility of adding functionality to a transmitter to make it ‘smart’ without modifying its footprint or adding supplementary hardware. Hence, a smart sensor is defined as sensing elements combined with a smart transmitter. The added functionality enhances a smart sensor with respect to process fault detection and variable prediction. The thesis starts with a literature review to identify the state of the art in this field and to determine potential industry needs for the added functionality. Particular attention has been paid to an existing commercial temperature transmitter, the NCS-TT105 from Microcyber Corporation. Its internal hardware architecture, software execution environment, and the additional computational resources available for accommodating new functions have been examined in detail. Furthermore, the algorithms for process fault detection and variable prediction have been examined from both theoretical and feasibility perspectives for incorporation onboard the NCS-TT105. An important body of the thesis is the implementation of the additional functions in the MCUs of the NCS-TT105 by allocating real-time execution of different tasks with assigned priorities in a real-time operating system (RTOS). The enhanced NCS-TT105 has gone through extensive evaluation on a physical process control test facility under various normal and fault conditions. The test results are satisfactory and the design specifications have been achieved. To the best knowledge of the author, this is the first time that process fault detection and variable prediction have been implemented directly onboard a commercial transmitter. The enhanced smart transmitter is capable of providing information on incipient faults in the process and on future changes of critical process variables. It is believed that this is an initial step towards the realization of distributed intelligence in process control, where important decisions regarding the process can be made at the sensor level.
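    As an illustration of the kind of lightweight fault-detection logic that could run within a transmitter's spare MCU cycles, the sketch below flags drift using an exponentially weighted moving average (EWMA) compared against a reference band; the class, parameter values and threshold are hypothetical and are not the algorithms implemented onboard the NCS-TT105.

```python
class DriftDetector:
    """Minimal EWMA-based drift detector: flags when the smoothed signal
    deviates from a reference value by more than a fixed limit.

    A sketch of lightweight logic suitable for an MCU; not the fault-detection
    scheme described in the thesis.
    """

    def __init__(self, reference, limit, alpha=0.2):
        self.reference = reference  # expected steady-state value
        self.limit = limit          # allowed deviation in engineering units
        self.alpha = alpha          # EWMA smoothing factor
        self.ewma = reference

    def update(self, x):
        # Smooth the raw reading, then compare it to the reference band
        self.ewma = self.alpha * x + (1 - self.alpha) * self.ewma
        return abs(self.ewma - self.reference) > self.limit

detector = DriftDetector(reference=25.0, limit=1.0)
readings = [25.1, 24.9, 25.2, 26.8, 28.5, 29.0]   # simulated temperature drift
print([detector.update(r) for r in readings])     # last two readings alarm
```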

    The source counts of submillimetre galaxies detected at 1.1 mm

    The source counts of galaxies discovered at sub-millimetre and millimetre wavelengths provide important information on the evolution of infrared-bright galaxies. We combine the data from six blank-field surveys carried out at 1.1 mm with AzTEC, totalling 1.6 square degrees in area with root-mean-square depths ranging from 0.4 to 1.7 mJy, and derive the strongest constraints to date on the 1.1 mm source counts at flux densities S(1100) = 1-12 mJy. Using additional data from the AzTEC Cluster Environment Survey to extend the counts to S(1100) ~ 20 mJy, we see tentative evidence for an enhancement relative to the exponential drop in the counts at S(1100) ~ 13 mJy and a smooth connection to the bright source counts at >20 mJy measured by the South Pole Telescope; this excess may be due to strong lensing effects. We compare these counts to predictions from several semi-analytical and phenomenological models and find that for most models the agreement is quite good at flux densities >4 mJy; however, we find significant discrepancies (>3σ) between the models and the observed 1.1 mm counts at lower flux densities, and none of them are consistent with the observed turnover in the Euclidean-normalised counts at S(1100) < 2 mJy. Our new results may therefore require modifications to existing evolutionary models for low-luminosity galaxies. Alternatively, the discrepancy between the measured counts at the faint end and the predictions from phenomenological models could arise from limited knowledge of the spectral energy distributions of faint galaxies in the local Universe. Comment: 16 pages, 3 figures, 4 tables; accepted for publication in MNRAS.
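    For readers unfamiliar with the quantity being constrained, the sketch below shows how Euclidean-normalised differential counts (S^2.5 dN/dS) could be tabulated from a catalogue of flux densities; the fluxes, bins and survey area are placeholder values, and real analyses also correct for completeness, flux boosting and the varying effective area with depth, which this sketch omits.

```python
import numpy as np

# Hypothetical 1.1 mm flux densities (mJy) and survey area, for illustration only
fluxes_mjy = np.array([1.2, 1.5, 2.3, 2.8, 3.6, 4.1, 5.5, 7.2, 9.8, 12.0])
area_deg2 = 1.6

# Differential counts dN/dS in each flux bin, then Euclidean normalisation
bins = np.array([1.0, 2.0, 4.0, 8.0, 16.0])       # bin edges in mJy
counts, _ = np.histogram(fluxes_mjy, bins=bins)
dS = np.diff(bins)                                 # bin widths (mJy)
S_mid = np.sqrt(bins[:-1] * bins[1:])              # geometric bin centres (mJy)
dn_ds = counts / dS / area_deg2                    # mJy^-1 deg^-2
euclid = dn_ds * S_mid**2.5                        # mJy^1.5 deg^-2
for s, n in zip(S_mid, euclid):
    print(f"S = {s:5.2f} mJy   S^2.5 dN/dS = {n:7.2f} mJy^1.5 deg^-2")
```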

    Genomic analysis for managing small and endangered populations: a case study in Tyrol Grey cattle.

    Analysis of genomic data is increasingly becoming part of the livestock industry, and the routine collection of genomic information would be an invaluable resource for effective management of breeding programs in small, endangered populations. The objective of the paper was to demonstrate how genomic data could be used to analyse (1) linkage disequilibrium (LD), LD decay and the effective population size (NeLD); (2) inbreeding level and effective population size (NeROH) based on runs of homozygosity (ROH); and (3) prediction of genomic breeding values (GEBV) using small within-breed and genomic information from other breeds. The Tyrol Grey population was used as an example, with the goal of highlighting the potential of genomic analyses for small breeds. In addition to our own results, we discuss the additional use of genomics to assess relatedness, admixture proportions, and inheritance of harmful variants. The example data set consisted of 218 Tyrol Grey bull genotypes, comprising all AI bulls available in the population. After standard quality control restrictions, 34,581 SNPs remained for the analysis. A separate quality control was applied to determine ROH levels based on Illumina GenCall and Illumina GenTrain scores, resulting in 211 bulls and 33,604 SNPs. LD was computed as the squared correlation coefficient between SNPs within a 10 megabase pair (Mb) region. ROHs were derived based on regions covering at least 4, 8, and 16 Mb, suggesting that animals had common ancestors approximately 12, 6, and 3 generations ago, respectively. The corresponding mean inbreeding coefficients (FROH) were 4.0% for 4 Mb, 2.9% for 8 Mb and 1.6% for 16 Mb runs. With an average generation interval of 5.66 years, estimated NeROH was 125 (NeROH > 16 Mb), 186 (NeROH > 8 Mb) and 370 (NeROH > 4 Mb), indicating strict avoidance of close inbreeding in the population. LD was used as an alternative method to infer the population history and Ne. The results show a continuous decrease in NeLD, to 780, 120, and 80 for 100, 10, and 5 generations ago, respectively. Genomic selection was developed for, and is working well in, large breeds. The same methodology was applied in Tyrol Grey cattle using different reference populations. Contrary to expectations, the accuracies of GEBVs with very small within-breed reference populations were very high, between 0.13 and 0.91 and between 0.12 and 0.63, when estimated breeding values and deregressed breeding values were used as pseudo-phenotypes, respectively. Subsequent analyses confirmed that the high accuracies were a consequence of the low reliabilities of pseudo-phenotypes in the validation set, and thus heavily influenced by parent averages. Multi-breed and across-breed reference sets gave inconsistent and lower accuracies. Genomic information may have a crucial role in the management of small breeds, even if its primary usage differs from that of large breeds. It allows relatedness between individuals and trends in inbreeding to be assessed, and decisions to be taken accordingly. These decisions would then be based on the real genome architecture rather than on conventional pedigree information, which can be missing or incomplete. We strongly suggest the routine genotyping of all individuals that belong to a small breed in order to facilitate the effective management of endangered livestock populations.
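    As an illustration of the ROH-based quantities described above, the sketch below computes F_ROH for different minimum ROH lengths together with the expected number of generations back to the common ancestor, using g = 100 / (2 L) with L in cM and the common approximation of 1 cM per Mb (which reproduces the roughly 12, 6 and 3 generations quoted for 4, 8 and 16 Mb runs); the ROH segments and the 2,500 Mb autosomal genome length are assumed values for illustration, not the study's data.

```python
# Hypothetical ROH segments (Mb) for one animal; not data from the study.
roh_segments_mb = [5.2, 8.7, 16.4, 4.3, 21.0]
autosomal_length_mb = 2500.0   # approximate bovine autosomal length (assumed)

def f_roh(segments_mb, min_length_mb, genome_mb=autosomal_length_mb):
    """F_ROH = total length of ROH above a minimum length threshold,
    divided by the autosomal genome length covered by SNPs."""
    return sum(s for s in segments_mb if s >= min_length_mb) / genome_mb

def generations_to_ancestor(min_length_mb):
    """Expected generations back to the common ancestor for ROH of a given
    minimum length, assuming roughly 1 cM per Mb: g = 100 / (2 * L_cM)."""
    return 100.0 / (2.0 * min_length_mb)

for threshold in (4, 8, 16):
    print(f"ROH > {threshold:2d} Mb: F_ROH = {f_roh(roh_segments_mb, threshold):.3f}, "
          f"~{generations_to_ancestor(threshold):.0f} generations to common ancestor")
```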