
    Health state utility values for diabetic retinopathy: protocol for a systematic review and meta-analysis

    Background: People with diabetic retinopathy tend to have lower health-related quality of life than individuals without retinopathy. Screening and treatment strategies have been shown to be cost-effective. To reduce bias in cost-effectiveness estimates, systematic reviews of health state utility values (HSUVs) are crucial for health technology assessment and for the development of decision-analytic models. A review and synthesis of HSUVs across the stages of disease progression in diabetic retinopathy has not previously been conducted. Methods/Design: We will conduct a systematic review of the literature reporting HSUVs for people with diabetic retinopathy, according to current stage of disease progression and/or visual acuity. We will search Medline, EMBASE, Web of Science, the Cost-Effectiveness Analysis Registry, the Centre for Reviews and Dissemination Database, and EconLit to identify relevant English-language articles. Data will then be synthesized using a linear mixed-effects meta-regression. Additionally, reported disease severity classifications will be mapped to a four-level grading scale for diabetic retinopathy. Discussion: The systematic review and meta-analysis will provide important evidence for future model-based economic evaluations of technologies for diabetic retinopathy. The meta-regression will enable estimation of utility values at different disease stages for patients with particular characteristics, and will also highlight where study design and the HSUV instrument have influenced the reported utility values. We believe this protocol to be the first of its kind to be published.
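    The synthesis step described above pools study-level utility estimates while adjusting for disease grade. As a rough illustration (not the protocol's analysis, which specifies a linear mixed-effects model with study-level random effects), the fixed-effects core of such a meta-regression reduces to inverse-variance-weighted least squares. All numbers below are invented for the sketch.

    ```python
    import numpy as np

    # Hypothetical data: (retinopathy grade 1-4, mean HSUV, standard error)
    # per study arm -- illustrative values only, not from the review.
    studies = np.array([
        (1, 0.85, 0.02),
        (1, 0.82, 0.03),
        (2, 0.78, 0.02),
        (3, 0.70, 0.04),
        (4, 0.60, 0.03),
        (4, 0.63, 0.05),
    ])

    grade, utility, se = studies.T
    X = np.column_stack([np.ones_like(grade), grade])  # intercept + grade slope
    w = 1.0 / se**2                                    # inverse-variance weights

    # Weighted least squares: solve (X' W X) beta = X' W y
    WX = X * w[:, None]
    beta = np.linalg.solve(X.T @ WX, WX.T @ utility)
    intercept, slope = beta
    print(f"utility ~ {intercept:.3f} + {slope:.3f} * grade")
    ```

    A full mixed-effects fit would add a random intercept per study to account for between-study heterogeneity; the negative grade coefficient here mirrors the expected decline in utility with disease progression.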

    Perspectives on open access high resolution digital elevation models to produce global flood hazard layers

    Global flood hazard models have recently become a reality thanks to the release of open access global digital elevation models, the development of simplified and highly efficient flow algorithms, and the steady increase in computational power. In this commentary we argue that although the availability of open access global terrain data has been critical in enabling the development of such models, the relatively poor resolution and precision of these data now significantly limit our ability to estimate flood inundation and risk for the majority of the planet’s surface. The difficulty of deriving an accurate ‘bare-earth’ terrain model, due to the interaction of vegetation and urban structures with satellite-based remote sensors, means that global terrain data are often poorest in the areas where people and property (and thus vulnerability) are most concentrated. Furthermore, the current generation of open access global terrain models is over a decade old, and many large floodplains, particularly those in developing countries, have undergone significant change in this time. There is therefore a pressing need for a new generation of high resolution and high vertical precision open access global digital elevation models to allow significantly improved global flood hazard models to be developed.

    A high-resolution global flood hazard model

    Floods are a natural hazard that affect communities worldwide, but to date the vast majority of flood hazard research and mapping has been undertaken by wealthy developed nations. As populations and economies have grown across the developing world, so too has demand from governments, businesses, and NGOs for modeled flood hazard data in these data-scarce regions. We identify six key challenges faced when developing a flood hazard model that can be applied globally and present a framework methodology that leverages recent cross-disciplinary advances to tackle each challenge. The model produces return period flood hazard maps at ∼90 m resolution for the whole terrestrial land surface between 56°S and 60°N, and results are validated against high-resolution government flood hazard data sets from the UK and Canada. The global model is shown to capture between two thirds and three quarters of the area determined to be at risk in the benchmark data without generating excessive false positive predictions. When aggregated to ∼1 km, mean absolute error in flooded fraction falls to ∼5%. The full complexity global model contains an automatically parameterized subgrid channel network, and comparison to both a simplified 2-D only variant and an independently developed pan-European model shows the explicit inclusion of channels to be a critical contributor to improved model performance. While careful processing of existing global terrain data sets enables reasonable model performance in urban areas, adoption of forthcoming next-generation global terrain data sets will offer the best prospect for a step-change improvement in model performance.
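    The validation described above, capturing two thirds to three quarters of the benchmark flood extent without excessive false positives, is the kind of comparison typically expressed through hit rate and critical success index on binary flood maps. A minimal numpy sketch of those two metrics, using synthetic maps rather than the paper's data:

    ```python
    import numpy as np

    # Synthetic stand-ins for real flood extents: a "benchmark" binary map
    # and a "model" map that disagrees on ~10% of cells (XOR flips them).
    rng = np.random.default_rng(0)
    benchmark = rng.random((100, 100)) < 0.3
    model = benchmark ^ (rng.random((100, 100)) < 0.1)

    hits = np.sum(model & benchmark)          # wet in both maps
    misses = np.sum(~model & benchmark)       # wet in benchmark, dry in model
    false_alarms = np.sum(model & ~benchmark) # wet in model, dry in benchmark

    hit_rate = hits / (hits + misses)              # fraction of benchmark captured
    csi = hits / (hits + misses + false_alarms)    # also penalises false positives
    print(f"hit rate = {hit_rate:.2f}, CSI = {csi:.2f}")
    ```

    The CSI is always at most the hit rate, which is why a model can look good on hit rate alone while over-predicting; reporting both guards against that.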

    Intellectual ability in tuberous sclerosis complex correlates with predicted effects of mutations on TSC1 and TSC2 proteins

    BACKGROUND: Tuberous sclerosis complex is a multisystem genetic disease, caused by mutations in the TSC1 or TSC2 genes and associated with many features, including intellectual disability (ID). We examined psychometric profiles of patients with TSC1 or TSC2 mutations and tested whether different mutation types were associated with different degrees of intellectual ability. METHODS: One hundred subjects with known TSC1/TSC2 mutations were assessed using a range of IQ or developmental quotient (DQ) measures. Effects of mutations on the TSC1/TSC2 proteins were inferred from sequence data and published biochemical studies. RESULTS: Most individuals with TSC1 mutations fell on a normal distribution identical to that of the general population, with ∼10% showing profound ID. Of individuals with TSC2 mutations, 34% showed profound ID; the remainder showed a more variable IQ/DQ distribution, shifted to the left relative to TSC1 carriers and the general population. Truncating TSC1 mutations were all predicted to be subject to nonsense-mediated mRNA decay. Mutations predicted to result in unstable protein were associated with less severe effects on IQ/DQ. There was a statistically significant negative correlation between the length of the predicted aberrant C-terminal tails arising from frameshift mutations in TSC1 and IQ/DQ; for TSC2, a positive but not statistically significant correlation was observed. CONCLUSION: We propose a model where (i) IQ/DQ correlates inversely with predicted levels and/or deleterious biochemical effects of mutant TSC1 or TSC2 protein, and (ii) longer aberrant C-terminal tails arising from frameshift mutations are more detrimental for TSC1 and less so for TSC2. Predictions of the model require replication and biochemical testing.
    We thank the Tuberous Sclerosis Association, the Wales Gene Park, the National Research Foundation of South Africa and the Struengmann Fund for financial support. We thank Prof Chris Smith for helpful comments on the manuscript. This is the author accepted manuscript. The final version is available from the British Medical Journal via http://dx.doi.org/10.1136/jmedgenet-2015-10315
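    The central statistical claim above is a negative correlation between predicted aberrant C-terminal tail length and IQ/DQ for TSC1 frameshift mutations. As a purely illustrative sketch with invented numbers (the study's data are not reproduced here), that correlation is a straightforward Pearson coefficient:

    ```python
    import numpy as np

    # Hypothetical paired observations: predicted aberrant tail length
    # (residues) and the carrier's IQ/DQ score. Values are invented.
    tail_length = np.array([12, 30, 45, 60, 88, 110])
    iq_dq = np.array([95, 88, 80, 72, 60, 50])

    r = np.corrcoef(tail_length, iq_dq)[0, 1]  # Pearson correlation coefficient
    print(f"Pearson r = {r:.2f}")              # negative, as reported for TSC1
    ```

    With only six points, a real analysis would also report a significance test (e.g. via scipy.stats.pearsonr), as the paper does when contrasting the significant TSC1 result with the non-significant TSC2 one.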

    Estimates of present and future flood risk in the conterminous United States

    Past attempts to estimate rainfall-driven flood risk across the US have either had incomplete coverage or coarse resolution, or have used overly simplified models of the flooding process. In this paper, we use a new 30 m resolution model of the entire conterminous US with a 2D representation of flood physics to produce estimates of flood hazard that match, to within 90%, the skill of local models built with detailed data. These flood depths are combined with exposure datasets of commensurate resolution to calculate current and future flood risk. Our data show that the total US population exposed to serious flooding is 2.6–3.1 times higher than previous estimates, and that nearly 41 million Americans live within the 1% annual exceedance probability floodplain (compared with only 13 million when calculated using FEMA flood maps). We find that population and GDP growth alone are expected to lead to significant future increases in exposure, and this change may be exacerbated by climate change.
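    The exposure figure above (population within the 1% annual exceedance probability floodplain) comes from overlaying a gridded population layer on the modelled flood depth grid. A simplified sketch of that overlay, with synthetic grids and an illustrative depth threshold standing in for the paper's actual inputs:

    ```python
    import numpy as np

    # Synthetic stand-ins: people per grid cell, and modelled water depth (m)
    # for the 1% annual exceedance probability event. Both are invented.
    rng = np.random.default_rng(42)
    population = rng.integers(0, 50, size=(200, 200))
    flood_depth = rng.exponential(0.2, size=(200, 200))

    # Sum population in cells where depth exceeds a threshold; 0.3 m is an
    # illustrative cutoff for "serious flooding", not the paper's definition.
    exposed = population[flood_depth > 0.3].sum()
    total = population.sum()
    print(f"{exposed} of {total} people ({100 * exposed / total:.1f}%) exposed")
    ```

    In practice the two grids must share a projection and resolution (the paper's point about "exposure datasets of commensurate resolution"), since resampling either layer changes the exposure estimate.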

    Twitter Watch: Leveraging Social Media to Monitor and Predict Collective-Efficacy of Neighborhoods

    Sociologists associate the spatial variation of crime within an urban setting with the concept of collective efficacy. The collective efficacy of a neighborhood is defined as social cohesion among neighbors combined with their willingness to intervene on behalf of the common good. Sociologists measure collective efficacy by conducting survey studies designed to capture individuals' perception of their community. In this work, we employ the curated data from a survey study (ground truth) and examine the effectiveness of substituting costly survey questionnaires with proxies derived from social media. We enrich a corpus of tweets mentioning a local venue with several linguistic and topological features. We then propose a pairwise learning-to-rank model with the goal of identifying a ranking of neighborhoods similar to the ranking obtained from the ground-truth collective efficacy values. In our experiments, our generated ranking of neighborhoods achieves 0.77 Kendall tau-x ranking agreement with the ground-truth ranking. Overall, our results are up to 37% better than traditional baselines.
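    The 0.77 agreement figure above uses Kendall tau-x, the Emond and Mason extension of Kendall's tau that treats tied ranks as agreement rather than discarding them. A minimal numpy sketch of the metric (not the authors' implementation):

    ```python
    import numpy as np

    def tau_x(r1, r2):
        """Emond-Mason Kendall tau-x between two rank vectors.

        r1, r2 are rank vectors over the same n objects (lower value =
        ranked higher); ties are allowed and scored as agreement.
        Returns a value in [-1, 1].
        """
        r1, r2 = np.asarray(r1), np.asarray(r2)
        n = len(r1)
        # a[i, j] = +1 if object i is ranked ahead of or tied with j, else -1
        a = np.where(r1[:, None] <= r1[None, :], 1, -1)
        b = np.where(r2[:, None] <= r2[None, :], 1, -1)
        np.fill_diagonal(a, 0)
        np.fill_diagonal(b, 0)
        return (a * b).sum() / (n * (n - 1))

    print(tau_x([1, 2, 3, 4], [1, 2, 3, 4]))  # identical rankings -> 1.0
    print(tau_x([1, 2, 3, 4], [4, 3, 2, 1]))  # reversed rankings  -> -1.0
    ```

    Handling ties matters here because survey-derived collective efficacy scores can place several neighborhoods at the same rank; standard tau-b would treat those ties differently.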