96 research outputs found

    Lower edge of locked Main Himalayan Thrust unzipped by the 2015 Gorkha earthquake

    Large earthquakes are thought to release strain on previously locked faults. However, the details of how earthquakes initiate, grow and terminate in relation to pre-seismically locked and creeping patches are unclear (refs 1–4). The 2015 Mw 7.8 Gorkha, Nepal earthquake occurred close to Kathmandu in a region where the prior pattern of fault locking is well documented (ref. 5). Here we analyze this event using seismological records measured at teleseismic distances and Synthetic Aperture Radar imagery. We show that the earthquake originated northwest of Kathmandu within a cluster of background seismicity that fringes the bottom of the locked portion of the Main Himalayan Thrust fault (MHT). The rupture propagated eastwards for about 140 km, unzipping the lower edge of the locked portion of the fault. High-frequency seismic waves radiated continuously as the slip pulse propagated at about 2.8 km/s along this zone of presumably high and heterogeneous pre-seismic stress at the seismic–aseismic transition. Eastward unzipping of the fault resumed during the Mw 7.3 aftershock on May 12. The transfer of stress to neighbouring regions during the Gorkha earthquake should facilitate future rupture of the areas of the MHT adjacent to and up-dip of the Gorkha earthquake rupture. This is the author accepted manuscript. The final version is available from Nature Publishing Group via http://dx.doi.org/10.1038/ngeo251
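
    As a rough check using only the numbers quoted above, the ~140 km eastward rupture and the ~2.8 km/s slip-pulse speed imply a rupture duration on the order of 50 seconds. A minimal sketch of that arithmetic (an illustration, not code from the study):

```python
# Back-of-envelope check using only values quoted in the abstract.
rupture_length_km = 140.0   # eastward propagation distance
rupture_speed_km_s = 2.8    # slip-pulse propagation speed

duration_s = rupture_length_km / rupture_speed_km_s
print(f"Implied rupture duration: ~{duration_s:.0f} s")  # ~50 s
```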

    Plate-boundary deformation associated with the great Sumatra–Andaman earthquake

    The Sumatra–Andaman earthquake of 26 December 2004 is the first giant earthquake (moment magnitude Mw > 9.0) to have occurred since the advent of modern space-based geodesy and broadband seismology. It therefore provides an unprecedented opportunity to investigate the characteristics of one of these enormous and rare events. Here we report estimates of the ground displacement associated with this event, using near-field Global Positioning System (GPS) surveys in northwestern Sumatra combined with in situ and remote observations of the vertical motion of coral reefs. These data show that the earthquake was generated by rupture of the Sunda subduction megathrust over a distance of >1,500 kilometres and a width of <150 kilometres. Megathrust slip exceeded 20 metres offshore northern Sumatra, mostly at depths shallower than 30 kilometres. Comparison of the geodetically and seismically inferred slip distribution indicates that ~30 per cent additional fault slip accrued in the 1.5 months following the 500-second-long seismic rupture. Both seismic and aseismic slip before our re-occupation of GPS sites occurred on the shallow portion of the megathrust, where the large Aceh tsunami originated. Slip tapers off abruptly along strike beneath Simeulue Island at the southeastern edge of the rupture, where the earthquake nucleated and where an Mw = 7.2 earthquake occurred in late 2002. This edge also abuts the northern limit of slip in the 28 March 2005 Mw = 8.7 Nias–Simeulue earthquake.
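
    For orientation, the rupture dimensions quoted above can be turned into an order-of-magnitude moment estimate with the standard relations M0 = ÎŒAD and Mw = (2/3)·(log10 M0 − 9.1), with M0 in N·m. The sketch below assumes a rigidity of 30 GPa and an illustrative average slip of 10 m (the abstract quotes only a peak slip exceeding 20 m), so it is an illustration rather than a result from the paper:

```python
import math

mu = 3.0e10        # assumed crustal rigidity, Pa
length_m = 1.5e6   # ~1,500 km rupture length (from the abstract)
width_m = 1.5e5    # ~150 km rupture width (from the abstract)
avg_slip_m = 10.0  # assumed average slip; the abstract gives only a >20 m peak

m0 = mu * length_m * width_m * avg_slip_m       # scalar seismic moment, N*m
mw = (2.0 / 3.0) * (math.log10(m0) - 9.1)       # moment magnitude
print(f"M0 ~ {m0:.2e} N*m, Mw ~ {mw:.2f}")      # consistent with Mw > 9.0
```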

    Modeling the epidemiological impact of the UNAIDS 2025 targets to end AIDS as a public health threat by 2030

    BACKGROUND: UNAIDS has established new program targets for 2025 to achieve the goal of eliminating AIDS as a public health threat by 2030. This study reports on efforts to use mathematical models to estimate the impact of achieving those targets. METHODS AND FINDINGS: We simulated the impact of achieving the targets at country level using the Goals model, a mathematical simulation model of HIV epidemic dynamics that includes the impact of prevention and treatment interventions. For 77 high-burden countries, we fit the model to surveillance and survey data for 1970 to 2020 and then projected the impact of achieving the targets for the period 2019 to 2030. Results from these 77 countries were extrapolated to produce estimates for 96 others. Goals model results were checked by comparing against projections done with the Optima HIV model and the AIDS Epidemic Model (AEM) for selected countries. We included estimates of the impact of societal enablers (access to justice and law reform, stigma and discrimination elimination, and gender equality) and the impact of Coronavirus Disease 2019 (COVID-19). Results show that achieving the 2025 targets would reduce annual new HIV infections by 83% (71% to 86% across regions) and AIDS-related deaths by 78% (67% to 81% across regions) by 2025 compared to 2010. Lack of progress on societal enablers could endanger these achievements and result in as many as 2.6 million (44%) cumulative additional new HIV infections and 440,000 (54%) more AIDS-related deaths between 2020 and 2030 compared to full achievement of all targets. COVID-19-related disruptions could increase new HIV infections and AIDS-related deaths by 10% in the next 2 years, but targets could still be achieved by 2025. Study limitations include the reliance on self-reports for most data on behaviors, the use of intervention effect sizes from published studies that may overstate intervention impacts outside of controlled study settings, and the use of proxy countries to estimate the impact in countries with fewer than 4,000 annual HIV infections. CONCLUSIONS: The new targets for 2025 build on the progress made since 2010 and represent ambitious short-term goals. Achieving these targets would bring us close to the goals of reducing new HIV infections and AIDS-related deaths by 90% between 2010 and 2030. By 2025, global new infections and AIDS deaths would drop to 4.4 and 3.9 per 100,000 population, respectively, and the number of people living with HIV (PLHIV) would be declining. There would be 32 million people on treatment, and they would need continuing support for their lifetimes. Incidence for the total global population would be below 0.15% everywhere. The number of PLHIV would start declining by 2023.
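
    To make the headline percentages concrete, the sketch below applies the reported reductions (83% for new infections and 78% for AIDS-related deaths by 2025 versus 2010) to hypothetical 2010 baselines; the baseline counts are illustrative assumptions, not figures from the study:

```python
# Illustrative arithmetic only: baselines are assumed, reductions are from the abstract.
baseline_2010 = {"new_infections": 2_100_000, "aids_deaths": 1_300_000}  # assumed
reduction_by_2025 = {"new_infections": 0.83, "aids_deaths": 0.78}        # reported

for outcome, base in baseline_2010.items():
    remaining = base * (1.0 - reduction_by_2025[outcome])
    print(f"{outcome}: {base:,} in 2010 -> ~{remaining:,.0f} in 2025 "
          f"({reduction_by_2025[outcome]:.0%} reduction)")
```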

    Comparative Analysis of Pyrosequencing and a Phylogenetic Microarray for Exploring Microbial Community Structures in the Human Distal Intestine

    Background: Variations in the composition of the human intestinal microbiota are linked to diverse health conditions. High-throughput molecular technologies have recently elucidated microbial community structure at much higher resolution than was previously possible. Here we compare two such methods, pyrosequencing and a phylogenetic array, and evaluate classifications based on two variable 16S rRNA gene regions. Methods and Findings: Over 1.75 million amplicon sequences were generated from the V4 and V6 regions of 16S rRNA genes in bacterial DNA extracted from four fecal samples of elderly individuals. The phylotype richness for individual samples was 1,400–1,800 for V4 reads and 12,500 for V6 reads, with 5,200 unique phylotypes when V4 reads from all samples were combined. The RDP-classifier was more efficient for the V4 than for the far less conserved and shorter V6 region, but differences in community structure also affected efficiency. Even when analyzing only 20% of the reads, the majority of the microbial diversity was captured in the two samples tested. DNA from the four samples was hybridized against the Human Intestinal Tract (HIT) Chip, a phylogenetic microarray for community profiling. Comparison of clustering of genus counts from pyrosequencing and HITChip data revealed highly similar profiles. Furthermore, correlations of sequence abundance and hybridization signal intensities were very high for lower-order ranks, but lower at the family level, which was probably due to ambiguous taxonomic groupings. Conclusions: The RDP-classifier consistently assigned most V4 sequences from human intestinal samples down to the genus level with good accuracy and speed. This is the deepest sequencing of single gastrointestinal samples reported to date, but microbial richness levels have still not leveled out. The majority of this diversity can also be captured with a five-fold lower sampling depth. HITChip hybridizations and resulting community profiles correlate well with pyrosequencing-based compositions, especially for lower-order ranks, indicating high robustness of both approaches. However, incompatible grouping schemes make exact comparison difficult.
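
    The pyrosequencing-versus-HITChip comparison described above boils down to correlating genus-level abundance profiles obtained by the two methods. A minimal sketch of that kind of comparison, using made-up placeholder values rather than data from the study:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical genus-level profiles for one sample; a real analysis would use
# hundreds of genus-like groups per sample.
pyroseq_rel_abundance = np.array([0.30, 0.18, 0.09, 0.04, 0.02])  # placeholder
hitchip_signal = np.array([1850.0, 1220.0, 640.0, 310.0, 190.0])  # placeholder

rho, p_value = spearmanr(pyroseq_rel_abundance, hitchip_signal)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```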

    Epigenetics and developmental programming of welfare and production traits in farm animals

    The concept that postnatal health and development can be influenced by events that occur in utero originated from epidemiological studies in humans supported by numerous mechanistic (including epigenetic) studies in a variety of model species. Referred to as the ‘developmental origins of health and disease’ or ‘DOHaD’ hypothesis, this concept was, until quite recently, studied in large animals with a primarily biomedical focus. Attention has since turned towards traits of commercial importance in farm animals. Herein we review the evidence that prenatal risk factors, including suboptimal parental nutrition, gestational stress, exposure to environmental chemicals and advanced breeding technologies, can determine traits such as postnatal growth, feed efficiency, milk yield, carcass composition, animal welfare and reproductive potential. We consider the role of epigenetic and cytoplasmic mechanisms of inheritance, and discuss implications for livestock production and future research endeavours. We conclude that although the concept is proven for several traits, issues relating to effect size, and hence commercial importance, remain. Studies have also invariably been conducted under controlled experimental conditions, frequently assessing single risk factors, thereby limiting their translational value for livestock production. We propose concerted international research efforts that consider multiple, concurrent stressors to better represent the effects of contemporary animal production systems.

    Why Are Outcomes Different for Registry Patients Enrolled Prospectively and Retrospectively? Insights from the Global Anticoagulant Registry in the FIELD-Atrial Fibrillation (GARFIELD-AF).

    Background: Retrospective and prospective observational studies are designed to reflect real-world evidence on clinical practice, but can yield conflicting results. The GARFIELD-AF Registry includes both methods of enrolment and allows analysis of differences in patient characteristics and outcomes that may result. Methods and Results: Patients with atrial fibrillation (AF) and ≥1 risk factor for stroke at diagnosis of AF were recruited either retrospectively (n = 5069) or prospectively (n = 5501) from 19 countries and then followed prospectively. The retrospectively enrolled cohort comprised patients with established AF (for at least 6 and up to 24 months before enrolment), who were identified retrospectively (with baseline and partial follow-up data collected from medical records) and then followed prospectively for a further 0–18 months (such that the total time of follow-up was 24 months; data collection between Dec-2009 and Oct-2010). In the prospectively enrolled cohort, patients with newly diagnosed AF (≤6 weeks after diagnosis) were recruited between Mar-2010 and Oct-2011 and were followed for 24 months after enrolment. Differences between the cohorts were observed in clinical characteristics, including type of AF, stroke prevention strategies, and event rates. More patients in the retrospectively identified cohort received vitamin K antagonists (62.1% vs. 53.2%) and fewer received non-vitamin K oral anticoagulants (1.8% vs. 4.2%). All-cause mortality rates per 100 person-years during the prospective follow-up (from the first study visit up to 1 year) were significantly lower in the retrospectively than in the prospectively identified cohort (3.04 [95% CI 2.51 to 3.67] vs. 4.05 [95% CI 3.53 to 4.63]; p = 0.016). Conclusions: Interpretations of data from registries that aim to evaluate the characteristics and outcomes of patients with AF must take account of differences in registry design and the impact of recall bias and survivorship bias incurred with retrospective enrolment. Clinical Trial Registration: URL: http://www.clinicaltrials.gov. Unique identifier for GARFIELD-AF (NCT01090362).
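
    The mortality figures above are incidence rates per 100 person-years with 95% confidence intervals. A minimal sketch of that calculation using an exact Poisson (Garwood) interval and hypothetical counts, not the registry data:

```python
from scipy.stats import chi2

events = 150            # hypothetical number of deaths observed
person_years = 4900.0   # hypothetical total follow-up time

rate = 100.0 * events / person_years
lower = 100.0 * chi2.ppf(0.025, 2 * events) / (2 * person_years)
upper = 100.0 * chi2.ppf(0.975, 2 * (events + 1)) / (2 * person_years)
print(f"Rate: {rate:.2f} per 100 person-years (95% CI {lower:.2f} to {upper:.2f})")
```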

    Improved risk stratification of patients with atrial fibrillation: an integrated GARFIELD-AF tool for the prediction of mortality, stroke and bleed in patients with and without anticoagulation.

    OBJECTIVES: To provide an accurate, web-based tool for stratifying patients with atrial fibrillation to facilitate decisions on the potential benefits/risks of anticoagulation, based on mortality, stroke and bleeding risks. DESIGN: The new tool was developed using stepwise regression for all patients and then applied to lower-risk patients. C-statistics were compared with CHA2DS2-VASc using 30-fold cross-validation to control for overfitting. External validation was undertaken in an independent dataset, the Outcome Registry for Better Informed Treatment of Atrial Fibrillation (ORBIT-AF). PARTICIPANTS: Data from 39 898 patients enrolled in the prospective GARFIELD-AF registry provided the basis for deriving and validating an integrated risk tool to predict stroke risk, mortality and bleeding risk. RESULTS: The discriminatory value of the GARFIELD-AF risk model was superior to CHA2DS2-VASc for patients with or without anticoagulation. C-statistics (95% CI) for all-cause mortality, ischaemic stroke/systemic embolism and haemorrhagic stroke/major bleeding (treated patients) were: 0.77 (0.76 to 0.78), 0.69 (0.67 to 0.71) and 0.66 (0.62 to 0.69), respectively, for the GARFIELD-AF risk models, and 0.66 (0.64 to 0.67), 0.64 (0.61 to 0.66) and 0.64 (0.61 to 0.68), respectively, for CHA2DS2-VASc (or HAS-BLED for bleeding). In very low to low risk patients (CHA2DS2-VASc 0 or 1 (men) and 1 or 2 (women)), the CHA2DS2-VASc and HAS-BLED (for bleeding) scores offered weak discriminatory value for mortality, stroke/systemic embolism and major bleeding. C-statistics for the GARFIELD-AF risk tool were 0.69 (0.64 to 0.75), 0.65 (0.56 to 0.73) and 0.60 (0.47 to 0.73) for each end point, respectively, versus 0.50 (0.45 to 0.55), 0.59 (0.50 to 0.67) and 0.55 (0.53 to 0.56) for CHA2DS2-VASc (or HAS-BLED for bleeding). Upon validation in the ORBIT-AF population, C-statistics showed that the GARFIELD-AF risk tool was effective for predicting 1-year all-cause mortality using the full and simplified model for all-cause mortality: C-statistics 0.75 (0.73 to 0.77) and 0.75 (0.73 to 0.77), respectively, and for predicting any stroke or systemic embolism over 1 year, C-statistics 0.68 (0.62 to 0.74). CONCLUSIONS: Performance of the GARFIELD-AF risk tool was superior to CHA2DS2-VASc in predicting stroke and mortality and superior to HAS-BLED for bleeding, overall and in lower risk patients. The GARFIELD-AF tool has the potential for incorporation in routine electronic systems, and for the first time, permits simultaneous evaluation of ischaemic stroke, mortality and bleeding risks. CLINICAL TRIAL REGISTRATION: URL: http://www.clinicaltrials.gov. Unique identifier for GARFIELD-AF (NCT01090362) and for ORBIT-AF (NCT01165710).
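
    The C-statistics reported above measure discrimination; for a binary outcome at a fixed horizon the C-statistic is equivalent to the area under the ROC curve. A minimal sketch of such a comparison between two risk scores on simulated data (not the GARFIELD-AF or ORBIT-AF datasets):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1000
outcome = rng.binomial(1, 0.05, size=n)             # 1 = event within 1 year (simulated)
score_a = outcome * 0.8 + rng.normal(0.0, 1.0, n)   # more informative risk score
score_b = outcome * 0.3 + rng.normal(0.0, 1.0, n)   # less informative risk score

print("C-statistic, score A:", round(roc_auc_score(outcome, score_a), 3))
print("C-statistic, score B:", round(roc_auc_score(outcome, score_b), 3))
```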

    Two-year outcomes of patients with newly diagnosed atrial fibrillation: results from GARFIELD-AF.

    AIMS: The relationship between outcomes and time after diagnosis for patients with non-valvular atrial fibrillation (NVAF) is poorly defined, especially beyond the first year. METHODS AND RESULTS: GARFIELD-AF is an ongoing, global observational study of adults with newly diagnosed NVAF. Two-year outcomes of 17 162 patients prospectively enrolled in GARFIELD-AF were analysed in light of baseline characteristics, risk profiles for stroke/systemic embolism (SE), and antithrombotic therapy. The mean (standard deviation) age was 69.8 (11.4) years, 43.8% were women, and the mean CHA2DS2-VASc score was 3.3 (1.6); 60.8% of patients were prescribed anticoagulant therapy with/without antiplatelet (AP) therapy, 27.4% AP monotherapy, and 11.8% no antithrombotic therapy. At 2-year follow-up, all-cause mortality, stroke/SE, and major bleeding had occurred at a rate (95% confidence interval) of 3.83 (3.62; 4.05), 1.25 (1.13; 1.38), and 0.70 (0.62; 0.81) per 100 person-years, respectively. Rates for all three major events were highest during the first 4 months. Congestive heart failure, acute coronary syndromes, sudden/unwitnessed death, malignancy, respiratory failure, and infection/sepsis accounted for 65% of all known causes of death, whereas strokes accounted for <10%. Anticoagulant treatment was associated with a 35% lower risk of death. CONCLUSION: The most frequent of the three major outcome measures was death, whose most common causes are not known to be significantly influenced by anticoagulation. This suggests that a more comprehensive approach to the management of NVAF may be needed to improve outcomes. This could include, in addition to anticoagulation, interventions targeting modifiable, cause-specific risk factors for death. CLINICAL TRIAL REGISTRATION: http://www.clinicaltrials.gov. Unique identifier: NCT01090362.
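
    For reference, the CHA2DS2-VASc score mentioned above uses the standard published point scheme (one point each for heart failure, hypertension, diabetes, vascular disease, age 65–74 and female sex; two points each for age ≥75 and prior stroke/TIA). A minimal sketch of that scheme, independent of the GARFIELD-AF data:

```python
def cha2ds2_vasc(age, female, heart_failure, hypertension, diabetes,
                 prior_stroke_tia, vascular_disease):
    """Standard CHA2DS2-VASc point assignments (maximum score 9)."""
    score = 2 if age >= 75 else (1 if 65 <= age <= 74 else 0)  # age bands
    score += 1 if female else 0
    score += 1 if heart_failure else 0
    score += 1 if hypertension else 0
    score += 1 if diabetes else 0
    score += 2 if prior_stroke_tia else 0
    score += 1 if vascular_disease else 0
    return score

# Example: a 72-year-old woman with hypertension and diabetes scores 4.
print(cha2ds2_vasc(age=72, female=True, heart_failure=False, hypertension=True,
                   diabetes=True, prior_stroke_tia=False, vascular_disease=False))
```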