
    An archival case study: revisiting the life and political economy of Lauchlin Currie

    This paper forms part of a wider project to show the significance of archival material on distinguished economists, in this case Lauchlin Currie (1902-93), who studied and taught at Harvard before entering government service at the US Treasury and Federal Reserve Board as the intellectual leader of Roosevelt's New Deal (1934-39), then as FDR's White House economic adviser in peace and war (1939-45), and finally as a post-war development economist. It discusses the uses made of the written and oral material available when the author was writing his intellectual biography of Currie (Duke University Press, 1990) while Currie was still alive, and the significance of the material that has come to light since Currie's death.

    GCRF African SWIFT Testbed 1 Report

    This document describes the activities and outcomes of the GCRF African Science for Weather Information and Forecasting Techniques (SWIFT) Weather Forecasting Testbed 1. Testbed 1 was conducted in the first part of 2019 from an operational forecasting office at IMTR Nairobi, at the Kenya Meteorological Department (KMD); other centres were connected to the Testbed by video-conference. The Testbed was designed to support SWIFT's programme of research capability-building in the science of weather prediction. New forecasting and evaluation products were tested, and the outcomes of the Testbed will be used to steer the research and development of these tools, as well as to provide meteorological case studies and to stimulate new hypotheses. Successes of Testbed 1 include the real-time use of satellite-based nowcasting products (NWC SAF products), convection-permitting model ensembles from the UK Met Office, and systematic forecast evaluation. Testbed 1 also devised and refined an effective programme of work for operational synoptic forecasting, nowcasting and evaluation, which could form the basis for new Standard Operating Procedures.

    Should patients with abnormal liver function tests in primary care be tested for chronic viral hepatitis? A cost-minimisation analysis based on a comprehensively tested cohort

    Background: Liver function tests (LFTs) are ordered in large numbers in primary care, and the Birmingham and Lambeth Liver Evaluation Testing Strategies (BALLETS) study was set up to assess their usefulness in patients with no pre-existing or self-evident liver disease. All patients were tested for chronic viral hepatitis, thereby providing an opportunity to compare various strategies for detecting this serious but treatable disease. Methods: This study uses data from the BALLETS cohort to compare testing strategies for viral hepatitis in patients who had received an abnormal LFT result, with the aim of informing a strategy for identifying patients with chronic viral hepatitis. We used a cost-minimisation analysis to define a base case and then calculated the incremental cost per case detected for each alternative strategy. Results: Of the 1,236 study patients with an abnormal LFT, 13 had chronic viral hepatitis (nine hepatitis B and four hepatitis C). The strategy advocated by current guidelines (repeating the LFT and testing for specific disease only if it remained abnormal) was less efficient (more expensive per case detected) than simply testing all patients for viral hepatitis without repeating LFTs. A more selective strategy of testing only those patients born in countries where viral hepatitis is prevalent provided high efficiency with little loss of sensitivity. A notably raised alanine aminotransferase (ALT) level (greater than twice the upper limit of normal) on the initial test had high predictive value but was insensitive, missing half the cases of viral infection. Conclusions: Based on this analysis and on widely accepted clinical principles, a "fast and frugal" heuristic was produced to guide general practitioners in diagnosing viral hepatitis in asymptomatic patients with abnormal LFTs. It recommends testing all patients with a clear clinical indication of infection (e.g. evidence of intravenous drug use), then all patients who originated from countries where viral hepatitis is prevalent, and finally those with a notably raised ALT level (more than twice the upper limit of normal). Patients not picked up by this efficient algorithm had a risk of chronic viral hepatitis lower than that of the general population.
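The three-step triage described in the conclusions can be sketched as a short function. The predicate names, the threshold convention, and the example values below are illustrative choices, not taken from the study:

```python
def should_test_for_viral_hepatitis(clinical_indication, born_in_prevalent_country,
                                    alt, alt_upper_limit):
    """Sketch of the three-step 'fast and frugal' triage described above.

    Parameter names are illustrative; the only threshold follows the abstract:
    an ALT more than twice the upper limit of normal counts as notably raised.
    """
    if clinical_indication:          # e.g. evidence of intravenous drug use
        return True
    if born_in_prevalent_country:    # origin in a country where viral hepatitis is prevalent
        return True
    if alt > 2 * alt_upper_limit:    # notably raised ALT
        return True
    return False

# Abnormal LFT, no risk factors, ALT only 1.5x the upper limit: no viral test
print(should_test_for_viral_hepatitis(False, False, 60, 40))  # False
```

Checking the cheap, high-yield predicates first is what makes the heuristic "frugal": most patients are resolved before the laboratory threshold is ever consulted.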

    Simple, Fast and Accurate Implementation of the Diffusion Approximation Algorithm for Stochastic Ion Channels with Multiple States

    The phenomena that emerge from the interaction of the stochastic opening and closing of ion channels (channel noise) with non-linear neural dynamics are essential to our understanding of the operation of the nervous system. The effects that channel noise can have on neural dynamics are generally studied using numerical simulations of stochastic models. Algorithms based on discrete Markov chains (MC) seem to be the most reliable and trustworthy, but even optimized algorithms come with a non-negligible computational cost. Diffusion approximation (DA) methods use stochastic differential equations (SDEs) to approximate the behavior of a number of MCs, considerably speeding up simulation times. However, model comparisons have suggested that DA methods do not yield the same results as MC modeling in terms of channel noise statistics and effects on excitability. Recently, it was shown that the difference arose because the MCs were modeled with coupled activation subunits, while the DA was modeled with uncoupled activation subunits. Implementations of DA with coupled subunits, in the context of a specific kinetic scheme, yielded results similar to MC. However, it remained unclear how to generalize these implementations to different kinetic schemes, or whether they were faster than MC algorithms. Additionally, a steady-state approximation was used for the stochastic terms, which, as we show here, can introduce significant inaccuracies. We derive the SDE explicitly for any given ion channel kinetic scheme. The resulting generic equations are surprisingly simple and interpretable, allowing an easy and efficient DA implementation. The algorithm was tested in a voltage-clamp simulation and in two different current-clamp simulations, yielding the same results as MC modeling, and the simulation efficiency of this DA method was considerably superior to MC methods.
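As a rough illustration of the diffusion-approximation idea (a minimal two-state example, not the paper's general multi-state derivation), a population of N two-state channels can be integrated with Euler-Maruyama, keeping the noise term state-dependent rather than frozen at its steady-state value; the rate values are invented:

```python
import numpy as np

def simulate_open_fraction(alpha, beta, n_channels, dt, steps, rng=None):
    """Diffusion approximation for a two-state (closed <-> open) channel pool.

    Illustrative sketch under standard assumptions:
        dx = (alpha*(1-x) - beta*x) dt + sqrt((alpha*(1-x) + beta*x)/N) dW,
    where x is the fraction of open channels, integrated with Euler-Maruyama.
    """
    rng = rng or np.random.default_rng(0)
    x = alpha / (alpha + beta)          # start at the deterministic steady state
    trace = np.empty(steps)
    for i in range(steps):
        drift = alpha * (1.0 - x) - beta * x
        # state-dependent noise amplitude (no steady-state approximation)
        diffusion = np.sqrt(max(alpha * (1.0 - x) + beta * x, 0.0) / n_channels)
        x += drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal()
        x = min(max(x, 0.0), 1.0)       # clip to a valid fraction
        trace[i] = x
    return trace

trace = simulate_open_fraction(alpha=0.5, beta=0.5, n_channels=1000,
                               dt=0.01, steps=5000)
print(trace.mean())  # fluctuates around the steady state 0.5
```

One SDE per channel population replaces N individual Markov chains, which is the source of the speed-up the abstract reports.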

    Seatbelt use and risk of major injuries sustained by vehicle occupants during motor-vehicle crashes: A systematic review and meta-analysis of cohort studies

    Background: In 2004, a World Health Report on road safety called for enforcement of measures, such as seatbelt use, that are effective at minimizing the morbidity and mortality caused by road traffic accidents. However, injuries caused by seatbelt use have also been described. Over a decade after publication of that report, this study investigated the relationship between seatbelt use and major injuries in belted compared with unbelted passengers. Methods: Cohort studies published in English from 2005 to 2018 were retrieved from seven databases. Critical appraisal of studies was carried out using the Scottish Intercollegiate Guidelines Network (SIGN) checklist. Pooled risk of major injuries was assessed using the random-effects meta-analytic model. Heterogeneity was quantified using the I² and τ² statistics. Funnel plots and Egger's test were used to investigate publication bias. This review is registered in PROSPERO (CRD42015020309). Results: Eleven studies, all carried out in developed countries, were included. Overall, the risk of any major injury was significantly lower in belted passengers than in unbelted passengers (RR 0.47; 95% CI 0.29 to 0.80; I² = 99.7%; P < 0.001). When analysed by crash type, belt use significantly reduced the risk of any injury (RR 0.35; 95% CI 0.24 to 0.52). Seatbelt use reduced the risk of facial injuries (RR 0.56; 95% CI 0.37 to 0.84), abdominal injuries (RR 0.87; 95% CI 0.78 to 0.98) and spinal injuries (RR 0.56; 95% CI 0.37 to 0.84). However, we found no statistically significant difference in the risk of head injuries (RR 0.49; 95% CI 0.22 to 1.08), neck injuries (RR 0.69; 95% CI 0.07 to 6.44), thoracic injuries (RR 0.96; 95% CI 0.74 to 1.24), upper limb injuries (RR 1.05; 95% CI 0.83 to 1.34) or lower limb injuries (RR 0.77; 95% CI 0.58 to 1.04) between belted and unbelted passengers. Conclusion: In sum, the risk of most major road traffic injuries is lower in seatbelt users. Findings were inconclusive regarding seatbelt use and susceptibility to thoracic, head and neck injuries during road traffic accidents. Awareness should be raised about the dangers of inadequate seatbelt use, and future research should aim to assess the effects of seatbelt use on major injuries by crash type.
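The random-effects pooling named in the methods can be sketched with the standard DerSimonian-Laird estimator on log risk ratios. The study estimates below are made up for illustration; they are not the review's data:

```python
import math

def pool_random_effects(log_rrs, variances):
    """DerSimonian-Laird random-effects pooling of log risk ratios.

    Returns the pooled RR, its 95% confidence interval, and tau-squared
    (the estimated between-study variance).
    """
    w = [1.0 / v for v in variances]                             # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))  # Cochran's Q
    df = len(log_rrs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                                # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]               # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
    return math.exp(pooled), (math.exp(lo), math.exp(hi)), tau2

# Hypothetical log-RRs and within-study variances for three studies
rr, ci, tau2 = pool_random_effects([-0.7, -0.9, -0.4], [0.04, 0.06, 0.05])
```

Adding τ² to each study's variance widens the weights and the interval relative to a fixed-effect model, which matters here given the very high heterogeneity (I² = 99.7%) the review reports.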

    Patterns of polymorphism and linkage disequilibrium in cultivated barley

    We carried out a genome-wide analysis of polymorphism (4,596 SNP loci across 190 elite cultivated accessions) chosen to represent the available genetic variation in current elite North West European and North American barley germplasm. Population sub-structure, patterns of diversity and linkage disequilibrium varied considerably across the seven barley chromosomes. Gene-rich but rarely recombining haplotype blocks, which may represent up to 60% of the physical length of barley chromosomes, extended across the 'genetic centromeres'. By positioning 2,132 bi-parentally mapped SNP markers with minimum allele frequencies above 0.10 by association mapping, 87.3% were located to within 5 cM of their original genetic map position. We show that at this marker density, genetically diverse populations of relatively small size are sufficient to fine-map simple traits, provided the traits are not strongly stratified within the sample, fall outside the genetic centromeres, and population sub-structure is effectively controlled in the analysis. Our results have important implications for association mapping, positional cloning, physical mapping and practical plant breeding in barley and other major world cereals, including wheat and rye, that exhibit comparable genome and genetic features.

    A Fokker-Planck formalism for diffusion with finite increments and absorbing boundaries

    Gaussian white noise is frequently used to model fluctuations in physical systems. In Fokker-Planck theory, this leads to a vanishing probability density near the absorbing boundary of threshold models. Here we derive the boundary condition for the stationary density of a first-order stochastic differential equation driven by additive finite-grained Poisson noise and show that the response properties of threshold units are qualitatively altered. Applied to the integrate-and-fire neuron model, the response turns out to be instantaneous rather than exhibiting low-pass characteristics, highly non-linear, and asymmetric for excitation and inhibition. The novel mechanism is exhibited at the network level and is a generic property of pulse-coupled systems of threshold units.

    Localizing triplet periodicity in DNA and cDNA sequences

    Background: The protein-coding regions (coding exons) of a DNA sequence exhibit a triplet periodicity (TP) due to the fact that coding exons contain a series of three-nucleotide codons encoding specific amino acid residues. Such periodicity is usually not observed in introns and intergenic regions. If a DNA sequence is divided into small segments and a Fourier transform is applied to each segment, a strong peak at frequency 1/3 is typically observed in the spectrum of coding segments but not in non-coding regions. This property has been used to identify the locations of protein-coding genes in unannotated sequence. The method is fast and requires no training. However, the need to compute the Fourier transform across a segment (window) of arbitrary size limits the accuracy with which TP boundaries can be localized. Here, we report a technique that provides higher-resolution identification of these boundaries, and use it to explore the biological correlates of TP regions in the genome of the model organism C. elegans. Results: Using both simulated TP signals and the real C. elegans sequence F56F11 as an example, we demonstrate that: (1) the modified wavelet transform (MWT) can define the boundary of a TP region better than the conventional short-time Fourier transform (STFT); (2) the scale parameter (a) of the MWT determines the precision of TP boundary localization: larger values of a give sharper TP boundaries but a lower signal-to-noise ratio; (3) RNA splicing sites have weaker TP signals than coding regions; (4) TP signals in coding regions can be destroyed or recovered by frame-shift mutations; and (5) 6 bp periodicities in introns and intergenic regions can generate false positive signals, which can be removed with a 6 bp MWT. Conclusions: The MWT provides more precise TP boundaries than the STFT, and the boundaries can be further refined with a larger-scale MWT. Subtraction of 6 bp periodicity signals reduces the number of false positives. Experimentally introduced frame-shift mutations can recover TP signals that were lost through possible ancient frame-shifts. More importantly, the TP signal has the potential to be used to detect splice junctions in fully spliced mRNA sequences.
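The STFT baseline that the paper improves on can be sketched as a sliding-window power estimate at frequency 1/3. The binary per-base encoding and the window sizes below are illustrative choices, not the paper's settings:

```python
import cmath

def tp_power(seq, window=120, step=30):
    """Power of the 1/3-frequency Fourier component in sliding windows.

    Sketch of the basic triplet-periodicity detector: each nucleotide is
    mapped to a binary indicator per base, and the squared magnitudes of
    the four indicator coefficients at frequency 1/3 are summed.
    """
    powers = []
    for start in range(0, len(seq) - window + 1, step):
        w = seq[start:start + window]
        total = 0.0
        for base in "ACGT":
            # Fourier coefficient at frequency 1/3 cycles per base
            coef = sum(cmath.exp(-2j * cmath.pi * n / 3) * (b == base)
                       for n, b in enumerate(w))
            total += abs(coef) ** 2
        powers.append(total)
    return powers

coding_like = "ATG" * 40            # perfect 3-base repeat: strong 1/3 peak
random_like = "ATGCATTGCAGA" * 10   # no codon structure: weak 1/3 component
print(tp_power(coding_like)[0] > tp_power(random_like)[0])  # True
```

The fixed window is exactly the limitation the abstract describes: power is attributed to a whole window, so a boundary inside it is smeared over the window length, which is what motivates the wavelet-based refinement.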

    Engaging communication experts in a Delphi process to identify patient behaviors that could enhance communication in medical encounters

    Background: The communication literature currently focuses primarily on improving physicians' verbal and non-verbal behaviors during the medical interview. The Four Habits Model is a teaching and research framework for physician communication based on evidence linking specific communication behaviors with processes and outcomes of care. The Model conceptualizes basic communication tasks as "Habits" and describes the sequence of physician communication behaviors during the clinical encounter associated with improved outcomes. Using the Four Habits Model as a starting point, we asked communication experts to identify the verbal communication behaviors of patients that are important in outpatient encounters. Methods: We conducted a four-round Delphi process with 17 international experts in communication research, medical education, and health care delivery. All rounds were conducted via the internet. In round 1, experts reviewed a list of proposed patient verbal communication behaviors within the Four Habits Model framework, identified from a review of the communication literature. The experts could approve the proposed list, add new behaviors, or modify behaviors. In rounds 2, 3, and 4, they rated each behavior for its fit (agree or disagree) with a particular habit. After each round, we calculated the percent agreement for each behavior and provided these data in the next round. Behaviors receiving more than 70% of experts' votes (either agree or disagree) were considered to have achieved consensus. Results: Of the 14 originally proposed patient verbal communication behaviors, the experts modified all but 2, and they added 20 behaviors to the Model in round 1. In round 2, they were presented with 59 behaviors and 14 options to remove specific behaviors from rating. After 3 rounds of rating, the experts retained 22 behaviors. This set included behaviors such as asking questions, expressing preferences, and summarizing information. Conclusion: The process identified communication tasks and verbal communication behaviors for patients similar to those outlined for physicians in the Four Habits Model. This represents an important step in building a single model that can be applied to teaching patients and physicians the communication skills associated with improved satisfaction and positive outcomes of care.
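The 70% consensus rule described in the methods can be sketched as a per-round tally. The behavior names and votes below are invented for illustration; they are not the study's data:

```python
def delphi_consensus(votes, threshold=0.7):
    """Tally one Delphi rating round.

    A behavior reaches consensus when more than `threshold` of the experts
    vote the same way (all agree, or all disagree); otherwise it is carried
    forward to the next round.
    """
    results = {}
    for behavior, ballots in votes.items():
        agree = sum(1 for v in ballots if v) / len(ballots)
        if agree > threshold:
            results[behavior] = "consensus: fits habit"
        elif (1 - agree) > threshold:
            results[behavior] = "consensus: does not fit"
        else:
            results[behavior] = "no consensus: re-rate next round"
    return results

# 17 experts, as in the study; ballots are hypothetical
votes = {"asks questions": [True] * 15 + [False] * 2,
         "summarizes information": [True] * 9 + [False] * 8}
print(delphi_consensus(votes))
```

Feeding the computed percent agreement back to the panel before the next round is what distinguishes a Delphi process from a one-shot survey.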

    New algorithm improves fine structure of the barley consensus SNP map

    Background: The need to integrate information from multiple linkage maps is a long-standing problem in genetics. One way to visualize the complex ordinal relationships is with a directed graph, where each vertex in the graph is a bin of markers. When there are no ordering conflicts between the linkage maps, the result is a directed acyclic graph, or DAG, which can then be linearized to produce a consensus map. Results: New algorithms for the simplification and linearization of consensus graphs have been implemented as a package for the R computing environment called DAGGER. The simplified consensus graphs produced by DAGGER exactly capture the ordinal relationships present in a series of linkage maps. Using either linear or quadratic programming, DAGGER generates a consensus map with minimum error relative to the linkage maps while remaining ordinally consistent with them. Both linearization methods produce consensus maps that are compressed relative to the mean of the linkage maps. After rescaling, however, the consensus maps had higher accuracy (and higher marker density) than the individual linkage maps in genetic simulations. When applied to four barley linkage maps genotyped at nearly 3,000 SNP markers, DAGGER produced a consensus map with improved fine structure compared to the existing barley consensus SNP map. The root-mean-squared error between the linkage maps and the DAGGER map was 0.82 cM per marker interval, compared to 2.28 cM for the existing consensus map. Examination of the barley hardness locus at the 5HS telomere, for which there is a physical map, confirmed that the DAGGER output was more accurate for fine-structure analysis. Conclusions: The R package DAGGER is an effective, freely available resource for integrating the information from a set of consistent linkage maps.
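The ordinal part of this approach (merging the maps' orderings into a directed graph and linearizing it) can be sketched with a plain topological sort; DAGGER's linear/quadratic-programming step for assigning cM positions is not reproduced here, and the marker names are made up:

```python
from collections import defaultdict, deque

def consensus_order(linkage_maps):
    """Topologically sort markers whose order is implied by several maps.

    Sketch of the ordinal step only: each map contributes 'a precedes b'
    edges, and Kahn's algorithm linearizes the resulting graph. Raises
    ValueError if the maps conflict (the graph contains a cycle).
    """
    edges = defaultdict(set)
    markers = set()
    for m in linkage_maps:                 # each map is an ordered marker list
        markers.update(m)
        for a, b in zip(m, m[1:]):
            edges[a].add(b)                # a precedes b in some map
    indeg = {v: 0 for v in markers}
    for a in list(edges):
        for b in edges[a]:
            indeg[b] += 1
    queue = deque(sorted(v for v in markers if indeg[v] == 0))
    order = []
    while queue:
        v = queue.popleft()
        order.append(v)
        for b in sorted(edges[v]):
            indeg[b] -= 1
            if indeg[b] == 0:
                queue.append(b)
    if len(order) != len(markers):
        raise ValueError("ordering conflict between maps (cycle in graph)")
    return order

# Two consistent maps with partly overlapping markers
print(consensus_order([["m1", "m2", "m4"], ["m1", "m3", "m4"]]))
# → ['m1', 'm2', 'm3', 'm4']
```

When the maps are mutually consistent, any topological order is a valid consensus ordering; choosing distances between the ordered bins is the optimization problem the package solves.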