
    Assessing health systems for type 1 diabetes in sub-Saharan Africa: developing a 'Rapid Assessment Protocol for Insulin Access'

    BACKGROUND: In order to improve the health of people with Type 1 diabetes in developing countries, a clear analysis of the constraints to insulin access and diabetes care is needed. We developed a Rapid Assessment Protocol for Insulin Access, comprising a series of questionnaires as well as a protocol for gathering other data through site visits, discussions, and document reviews. METHODS: The Rapid Assessment Protocol for Insulin Access draws on the principles of Rapid Assessment Protocols that have been developed and implemented in several different areas. This protocol was adapted through a thorough literature review on diabetes, chronic condition management and medicine supply in developing countries. A visit to three countries in sub-Saharan Africa and meetings with different experts in the field of diabetes helped refine the questionnaires. Following their development, the questionnaires were tested with various people familiar with diabetes and/or healthcare in developing countries. The Protocol was piloted in Mozambique, then refined and taken through two further iterations in Zambia and Mali. Questionnaires were translated into local languages when necessary, with back translation to ensure precision. RESULTS: In each country the protocol was implemented in three areas: the capital city, a large urban centre and a predominantly rural area, with their respective surroundings. Interviews were carried out by local teams trained in how to use the tool. Data were then collected and entered into a database for analysis. CONCLUSION: The Rapid Assessment Protocol for Insulin Access was developed to provide a situational analysis of Type 1 diabetes in order to make recommendations to national Ministries of Health and Diabetes Associations. It provided valuable information on patients' access to insulin, syringes, monitoring and care.
It was thus able to sketch a picture of the health care system with regard to its ability to care for people with diabetes. In all countries where this tool was used, the involvement of local stakeholders resulted in the process acting as a catalyst in bringing diabetes to the attention of the health authorities.

    Predicting The Outcome of Marketing Negotiations: Role-Playing versus Unaided Opinions

    Role-playing and unaided opinions were used to forecast the outcome of three negotiations. Consistent with prior research, role-playing yielded more accurate predictions. In two studies on marketing negotiations, the predictions based on role-playing were correct in 53% of cases, while unaided opinions were correct in only 7%. Keywords: predicting, negotiations, marketing, role-playing, unaided opinion

    Alexander representation of tangles

    A tangle is an oriented 1-submanifold of the cylinder whose endpoints lie on the two disks in the boundary of the cylinder. Using an algebraic tool developed by Lescop, we extend the Burau representation of braids to a functor from the category of oriented tangles to the category of Z[t,t^{-1}]-modules. For (1,1)-tangles (i.e., tangles with one endpoint on each disk) this invariant coincides with the Alexander polynomial of the link obtained by taking the closure of the tangle. We use the notion of plat position of a tangle to give a constructive proof of invariance in this case. Comment: 13 pages, 5 figures
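As a concrete illustration of the classical relationship this abstract generalizes (this example is not from the paper itself), the Alexander polynomial of a knot can be recovered from the reduced Burau representation of a braid whose closure is that knot. The sketch below, using sympy, checks this for the trefoil, the closure of the braid sigma_1^3 in the braid group B_2:

```python
import sympy as sp

t = sp.symbols('t')

# Reduced Burau representation of the generator sigma_1 in B_2:
# a 1x1 matrix (-t).
burau_sigma1 = sp.Matrix([[-t]])

# The trefoil knot is the closure of the braid sigma_1^3.
beta = burau_sigma1**3

# For a braid in B_n, det(reduced Burau - I) equals, up to a unit in
# Z[t, t^{-1}], the Alexander polynomial of the closure multiplied by
# (1 + t + ... + t^{n-1}).
n = 2
numerator = (beta - sp.eye(1)).det()       # -(t^3 + 1)
denominator = sum(t**k for k in range(n))  # 1 + t
alexander = sp.cancel(numerator / denominator)

# Up to sign, this is t^2 - t + 1, the Alexander polynomial of the trefoil.
print(sp.expand(alexander))
```

The paper's functor extends this picture from braids to arbitrary oriented tangles, with (1,1)-tangles playing the role of the braid closures above.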

    Bayesian models for syndrome- and gene-specific probabilities of novel variant pathogenicity

    BACKGROUND: With the advent of affordable and comprehensive sequencing technologies, access to molecular genetics for clinical diagnostics and research applications is increasing. However, variant interpretation remains challenging, and tools that close the gap between data generation and data interpretation are urgently required. Here we present a transferable approach to help address the limitations in variant annotation. METHODS: We develop a network of Bayesian logistic regression models that integrate multiple lines of evidence to evaluate the probability that a rare variant is the cause of an individual's disease. We present models for genes causing inherited cardiac conditions, though the framework is transferable to other genes and syndromes. RESULTS: Our models report a probability of pathogenicity, rather than a categorisation into pathogenic or benign, which captures the inherent uncertainty of the prediction. We find that gene- and syndrome-specific models outperform genome-wide approaches, and that the integration of multiple lines of evidence performs better than individual predictors. The models are adaptable to incorporate new lines of evidence, and results can be combined with familial segregation data in a transparent and quantitative manner to further enhance predictions. Though the probability scale is continuous, and innately interpretable, performance summaries based on thresholds are useful for comparisons. Using a threshold probability of pathogenicity of 0.9, we obtain a positive predictive value of 0.999 and sensitivity of 0.76 for the classification of variants known to cause long QT syndrome over the three most important genes, which represents sufficient accuracy to inform clinical decision-making. A web tool APPRAISE [http://www.cardiodb.org/APPRAISE] provides access to these models and predictions. 
CONCLUSIONS: Our Bayesian framework provides a transparent, flexible and robust framework for the analysis and interpretation of rare genetic variants. Models tailored to specific genes outperform genome-wide approaches, and can be sufficiently accurate to inform clinical decision-making.
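To make the reporting style concrete: the models output a continuous probability of pathogenicity, which can then be thresholded (the abstract uses 0.9) for performance summaries. The minimal sketch below uses ordinary (non-Bayesian) scikit-learn logistic regression on synthetic features as a stand-in for the authors' Bayesian models; the feature names and data are purely illustrative, not from the paper:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "lines of evidence" for 200 variants, e.g. a conservation
# score, an in-silico damage score, and a rarity score (all made up here).
n = 200
X = rng.normal(size=(n, 3))
# In this toy setup, pathogenic variants (label 1) score higher overall.
y = (X.sum(axis=1) + rng.normal(scale=0.5, size=n) > 0).astype(int)

model = LogisticRegression().fit(X, y)

# The key design choice mirrored from the abstract: report a probability
# of pathogenicity rather than a hard pathogenic/benign call.
probs = model.predict_proba(X)[:, 1]
p = probs[0]
print(f"probability of pathogenicity for variant 0: {p:.3f}")

# Threshold-based summaries remain possible for comparisons, e.g. at 0.9.
called_pathogenic = probs >= 0.9
print(f"variants called pathogenic at 0.9: {called_pathogenic.sum()}")
```

Integrating multiple evidence sources into one calibrated probability, rather than stacking independent binary filters, is what lets uncertainty propagate into the final report.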

    On the exit statistics theorem of many particle quantum scattering

    We review the foundations of the scattering formalism for one-particle potential scattering and discuss the generalization to the simplest case of many non-interacting particles. We point out that the "straight path motion" of the particles, which is achieved in the scattering regime, is at the heart of the crossing statistics of surfaces, which should be thought of as detector surfaces. We sketch a proof of the relevant version of the many-particle flux-across-surfaces theorem and discuss what needs to be proven for the foundations of scattering theory in this context. Comment: 15 pages, 4 figures; to appear in the proceedings of the conference "Multiscale methods in Quantum Mechanics", Accademia dei Lincei, Rome, December 16-20, 200

    Streamlined islands and the English Channel megaflood hypothesis

    Recognising ice-age catastrophic megafloods is important because they had significant impact on large-scale drainage evolution and patterns of water and sediment movement to the oceans, and likely induced very rapid, short-term effects on climate. It has been previously proposed that a drainage system on the floor of the English Channel was initiated by catastrophic flooding in the Pleistocene but this suggestion has remained controversial. Here we examine this hypothesis through an analysis of key landform features. We use a new compilation of multi- and single-beam bathymetry together with sub-bottom profiler data to establish the internal structure, planform geometry and hence origin of a set of 36 mid-channel islands. Whilst there is evidence of modern-day surficial sediment processes, the majority of the islands can be clearly demonstrated to be formed of bedrock, and are hence erosional remnants rather than depositional features. The islands display classic lemniscate or tear-drop outlines, with elongated tips pointing downstream, typical of streamlined islands formed during high-magnitude water flow. The length-to-width ratio for the entire island population is 3.4 ± 1.3 and the degree-of-elongation or k-value is 3.7 ± 1.4. These values are comparable to streamlined islands in other proven Pleistocene catastrophic flood terrains and are distinctly different to values found in modern-day rivers. The island geometries show a correlation with bedrock type: with those carved from Upper Cretaceous chalk having larger length-to-width ratios (3.2 ± 1.3) than those carved into more mixed Paleogene terrigenous sandstones, siltstones and mudstones (3.0 ± 1.5). We attribute these differences to the former rock unit having a lower skin friction which allowed longer island growth to achieve minimum drag. The Paleogene islands, although less numerous than the Chalk islands, also assume more perfect lemniscate shapes. 
These lithologies therefore reached island equilibrium shape more quickly but were also susceptible to total erosion. Our observations support the hypothesis that the islands were initially carved by high-water-volume flows via a unique catastrophic drainage of a pro-glacial lake in the southern North Sea at the Dover Strait, rather than by fluvial erosion throughout the Pleistocene.
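The abstract does not define its degree-of-elongation metric; a common choice in the streamlined-landform literature is the lemniscate ratio k = pi * L^2 / (4 * A), where L is island length and A is planform area (k = 1 for a circle, larger values for more streamlined forms). Assuming that definition, a minimal sketch:

```python
import math

def lemniscate_k(length: float, area: float) -> float:
    """Lemniscate (degree-of-elongation) ratio k = pi * L^2 / (4 * A).

    k = 1 for a circular planform; higher values indicate a more
    elongated, streamlined outline. This is the standard lemniscate
    ratio, assumed (not stated) to match the abstract's k-value.
    """
    return math.pi * length**2 / (4.0 * area)

# Sanity check: a circular island (L = 2r, A = pi r^2) gives k = 1.
r = 5.0
k_circle = lemniscate_k(2 * r, math.pi * r**2)
print(k_circle)  # 1.0

# An island 3.4 times longer than wide, approximated as an ellipse
# (A = pi * (L/2) * (W/2)), gives k equal to its length-to-width ratio.
L, W = 3.4, 1.0
k_ellipse = lemniscate_k(L, math.pi * (L / 2) * (W / 2))
print(k_ellipse)  # 3.4
```

Under this definition, length-to-width ratio and k coincide for elliptical outlines, which is consistent with the abstract reporting similar magnitudes for the two measures (3.4 ± 1.3 and 3.7 ± 1.4).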

    Comment on "Conjectures on exact solution of three-dimensional (3D) simple orthorhombic Ising lattices" [arXiv:0705.1045]

    It is shown that a recent article by Z.-D. Zhang [arXiv:0705.1045] is in error and violates well-known theorems. Comment: LaTeX, 3 pages, no figures, submitted to Philosophical Magazine. Expanded version

    Listen to the market, hear the best policy decision, but don’t always choose it

    The updated version of this working paper (28 February 2019) is available in ORE at http://hdl.handle.net/10871/36180. Real-world policymakers want to extract investors' private information about a policy's likely effects by 'listening to' asset markets. However, this brings the risk that investors will profitably 'manipulate' prices to steer policy. We model the interaction between a policymaker and an informed (profit-seeking) investor who can buy/short-sell an asset from uninformed traders. We characterize when the investor's incentives do not align with the policymaker's, implying that to induce truth-telling behavior the policymaker must commit to sometimes ignoring the signal (as revealed by the investor's behavior driving the asset's price). This implies a commitment to executing the policy with a probability depending on the asset's price. We develop a taxonomy for the full set of relationships between private signals, asset values, and policymaker welfare, characterizing the optimal indirect mechanism for each case. We find that where the policymaker is ex-ante indifferent, she commits to sometimes/never executing after a bad signal, but always executes after a good signal. Generically, this 'listening' mechanism leads to higher (policymaker) welfare than ignoring the signals. We discuss real-world evidence, implications for legislative processes, and phenomena such as 'trial balloons' and 'committing political capital'.