
    Global Cropland Area Database (GCAD) derived from Remote Sensing in Support of Food Security in the Twenty-first Century: Current Achievements and Future Possibilities

    The precise estimation of global agricultural croplands, including their extents, areas, geographic locations, crop types, cropping intensities, and watering methods (irrigated or rainfed; type of irrigation), provides a critical scientific basis for the development of water and food security policies (Thenkabail et al., 2012, 2011, 2010). By the year 2100, the global human population is expected to grow to 10.4 billion under the median fertility variant, or higher under constant or high fertility variants (Table 1), with over three quarters living in developing countries, in regions that already lack the capacity to produce enough food. With current agricultural practices, the increased demand for food and nutrition would require about 2 billion hectares of additional cropland, roughly twice the land area of the United States, and would lead to significant increases in greenhouse gas emissions (Tilman et al., 2011). For example, during 1960-2010 the world population more than doubled, from 3 billion to 7 billion. The nutritional demand of the population also grew swiftly during this period, from an average of about 2,000 calories per day per person in 1960 to nearly 3,000 calories per day per person in 2010.
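    The implied growth rates are easy to check. A minimal Python sketch, using only the figures quoted above (placeholder arithmetic, not data from the paper):

```python
# Implied average annual growth rates for the 1960-2010 figures quoted above:
# population 3 -> 7 billion, per-capita demand ~2,000 -> ~3,000 kcal/day.
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate."""
    return (end / start) ** (1 / years) - 1

pop_growth = cagr(3e9, 7e9, 50)      # ~1.7% per year
kcal_growth = cagr(2000, 3000, 50)   # ~0.8% per year

print(f"Population growth: {pop_growth:.2%} per year")
print(f"Per-capita caloric demand growth: {kcal_growth:.2%} per year")
```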

    NASA Making Earth System Data Records for Use in Research Environments (MEaSUREs) Global Food Security Support Analysis Data (GFSAD) Crop Mask 2010 Global 1 km V001

    The NASA Making Earth System Data Records for Use in Research Environments (MEaSUREs) Global Food Security Support Analysis Data (GFSAD) Crop Mask Global 1 kilometer (km) dataset was created using multiple inputs: remote sensing data such as Landsat, Advanced Very High Resolution Radiometer (AVHRR), Satellite Probatoire d'Observation de la Terre (SPOT) vegetation, and Moderate Resolution Imaging Spectroradiometer (MODIS) imagery; secondary elevation data; climate data (50-year precipitation and 20-year temperature records); reference sub-meter to 5-meter resolution ground data; and country statistics. The GFSAD1KCM provides the spatial distribution of a disaggregated five-class global cropland extent map derived for the nominal year 2010 at 1-km resolution, based on four major studies: Thenkabail et al. (2009a, 2011), Pittman et al. (2010), Yu et al. (2013), and Friedl et al. (2010). The GFSAD1KCM nominal 2010 product is based on data ranging from 2007 through 2012.
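    For readers who want to work with such a product, here is a minimal sketch of tabulating class counts from a classification raster with rasterio. The file name and the assumption that the five classes are stored as small integer codes are ours; consult the product documentation for the actual legend.

```python
# A minimal sketch, not the product's official API: count pixels per class
# code in a single-band classification raster.
import numpy as np
import rasterio

with rasterio.open("GFSAD1KCM.tif") as src:  # hypothetical file name
    band = src.read(1)  # single-band integer classification

codes, counts = np.unique(band, return_counts=True)
for code, count in zip(codes, counts):
    # For a nominal 1-km grid, each pixel covers roughly 1 km^2
    # (exact area depends on the projection).
    print(f"class {code}: {count} pixels (~{count} km^2)")
```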

    Impacts of Second-Generation Biofuel Feedstock Production in the Central U.S. on the Hydrologic Cycle and Global Warming Mitigation Potential

    Biofuel feedstocks provide a renewable energy source that can reduce fossil fuel emissions; however, if produced on a large scale they can also impact local to regional water and carbon budgets. Simulation results for 2005–2014 from a regional weather model adapted to simulate the growth of two perennial grass biofuel feedstocks suggest that replacing at least half the current annual cropland with these grasses would increase water use efficiency and drive greater rainfall downwind of the perturbed grid cells, but the increased evapotranspiration (ET) might switch the Mississippi River basin from a net warm-season surplus of water (precipitation minus ET) to a net deficit. While this scenario reduces the land required for biofuel feedstock production relative to current use for maize grain ethanol production, it offsets only approximately one decade of projected anthropogenic warming, and the increased water vapor results in greater atmospheric heat content.
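    The water budget in question reduces to precipitation minus evapotranspiration (P - ET) summed over the warm season. A minimal sketch with placeholder monthly totals (not model output):

```python
# Warm-season water budget: surplus = sum of (P - ET) over the season.
# The monthly totals below are placeholders, not results from the study.
import numpy as np

precip = np.array([110.0, 105.0, 90.0, 85.0, 80.0])  # May-Sep precipitation (mm)
et     = np.array([ 95.0, 120.0, 130.0, 115.0, 90.0])  # May-Sep ET (mm)

surplus = (precip - et).sum()
print(f"Warm-season P - ET: {surplus:+.0f} mm "
      f"({'surplus' if surplus >= 0 else 'deficit'})")
```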

    Chronic arthritis in children and adolescents in two Indian health service user populations

    BACKGROUND: High prevalence rates for rheumatoid arthritis, spondyloarthropathies, and systemic lupus erythematosus have been described in American Indian and Alaska Native adults. The impact of these diseases on American Indian children has not been investigated. METHODS: We used International Classification of Diseases-9 (ICD-9) codes to search two Indian Health Service (IHS) patient registration databases over the years 1998–2000, searching for individuals 19 years of age or younger with specific ICD-9-specified diagnoses. Crude estimates for disease prevalence were made based on the number of individuals identified with these diagnoses within the database. RESULTS: Rheumatoid arthritis (RA) / juvenile rheumatoid arthritis (JRA) was the most frequent diagnosis given. The prevalence rate for JRA in the Oklahoma City Area was estimated as 53 per 100,000 individuals at risk, while in the Billings Area, the estimated prevalence was nearly twice that, at 115 per 100,000. These rates are considerably higher than those reported in the most recent European studies. CONCLUSION: Chronic arthritis in childhood represents an important, though unrecognized, chronic health challenge within the American Indian population living in the United States.
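    Crude prevalence here is simply the number of identified cases divided by the population at risk, scaled per 100,000. A minimal sketch with hypothetical counts:

```python
# Crude prevalence per 100,000 individuals at risk. The counts below are
# hypothetical, not the study's data.
def crude_prevalence(cases: int, population_at_risk: int) -> float:
    return cases / population_at_risk * 100_000

print(f"JRA: {crude_prevalence(40, 75_000):.0f} per 100,000")  # -> 53
```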

    The PREDICT study uncovers three clinical courses of acutely decompensated cirrhosis that have distinct pathophysiology

    Background & Aims: Acute decompensation (AD) of cirrhosis is defined as the acute development of ascites, gastrointestinal hemorrhage, hepatic encephalopathy, infection or any combination thereof, requiring hospitalization. The presence of organ failure(s) in patients with AD defines acute-on-chronic liver failure (ACLF). The PREDICT study is a European, prospective, observational study, designed to characterize the clinical course of AD and to identify predictors of ACLF. Methods: A total of 1,071 patients with AD were enrolled. We collected detailed pre-specified information on the 3-month period prior to enrollment, and clinical and laboratory data at enrollment. Patients were then closely followed up for 3 months. Outcomes (liver transplantation and death) at 1 year were also recorded. Results: Three groups of patients were identified. Pre-ACLF patients (n = 218) developed ACLF and had 3-month and 1-year mortality rates of 53.7% and 67.4%, respectively. Unstable decompensated cirrhosis (UDC) patients (n = 233) required ≥1 readmission but did not develop ACLF and had mortality rates of 21.0% and 35.6%, respectively. Stable decompensated cirrhosis (SDC) patients (n = 620) were not readmitted, did not develop ACLF and had a 1-year mortality rate of only 9.5%. The 3 groups differed significantly regarding the grade and course of systemic inflammation (high-grade at enrollment with aggravation during follow-up in pre-ACLF; low-grade at enrollment with subsequent steady-course in UDC; and low-grade at enrollment with subsequent improvement in SDC) and the prevalence of surrogates of severe portal hypertension throughout the study (high in UDC vs. low in pre-ACLF and SDC). Conclusions: Acute decompensation without ACLF is a heterogeneous condition with 3 different clinical courses and 2 major pathophysiological mechanisms: systemic inflammation and portal hypertension. Predicting the development of ACLF remains a major future challenge. ClinicalTrials.gov number: NCT03056612. Lay summary: Herein, we describe, for the first time, 3 different clinical courses of acute decompensation (AD) of cirrhosis after hospital admission. The first clinical course includes patients who develop acute-on-chronic liver failure (ACLF) and have a high short-term risk of death – termed pre-ACLF. The second clinical course (unstable decompensated cirrhosis) includes patients requiring frequent hospitalizations unrelated to ACLF and is associated with a lower mortality risk than pre-ACLF. Finally, the third clinical course (stable decompensated cirrhosis), includes two-thirds of all patients admitted to hospital with AD – patients in this group rarely require hospital admission and have a much lower 1-year mortality risk

    Measuring Spatio-temporal Trends in Residential Landscape Irrigation Extent and Rate in Los Angeles, California Using SPOT-5 Satellite Imagery

    Irrigation is a large component of urban water budgets in semi-arid regions and is critical for the management of landscape vegetation and water resources. This is particularly true for Mediterranean climate cities such as Los Angeles, where water availability is limited during dry summers. These interactions were examined by using 10-m resolution satellite imagery and a database of monthly water use records for all residential water customers in Los Angeles in order to map vegetation greenness, the extent and distribution of irrigated areas, and irrigation rates. A water conservation ratio between rates of irrigation and vegetation water demand was calculated to assess over-irrigation. The analyses were conducted for the water years (WY) 2005–2007, which included wet, average, and dry extremes of annual rainfall. Although outdoor water usage was highest in the dry year, vegetation greenness could not be maintained as well as in wetter years, suggesting that the lower greenness was due to water stress. However, annual rainfall from WY 2005 to 2007 did not significantly influence the variability in the magnitude and spatial pattern of irrigation, with mean irrigation rates ranging only from 81 to 86 mm. The water conservation ratio showed that 7% of the postal carrier routes across the city were over-irrigated in the dry year, but 43% were over-irrigated in the wet year. This was largely because the climatic demand for water by vegetation decreased in wet years, but irrigation rates changed little from year to year. This overwatering can be addressed by water conservation, planning, and public education, especially during the current California drought. The approach demonstrated here should be transferable to other cities in semi-arid climates.
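    The water conservation ratio described above is the irrigation rate divided by the vegetation water demand, with values above 1 read as over-irrigation. A minimal sketch with hypothetical per-route values (the threshold of 1 is our assumption):

```python
# Conservation ratio = irrigation rate / vegetation water demand, per route.
# All values below are hypothetical, not the study's measurements.
import numpy as np

irrigation_mm = np.array([60.0, 95.0, 120.0, 81.0])  # per postal carrier route
demand_mm     = np.array([90.0, 90.0,  90.0, 90.0])  # climatic vegetation demand

ratio = irrigation_mm / demand_mm
over_irrigated = ratio > 1.0  # assumed over-irrigation threshold
print(f"{over_irrigated.mean():.0%} of routes over-irrigated")  # 50% in this toy case
```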

    Quality Measures for the Diagnosis and Non-Operative Management of Carpal Tunnel Syndrome in Occupational Settings

    Introduction: Providing higher quality medical care to workers with occupationally associated carpal tunnel syndrome (CTS) may reduce disability, facilitate return to work, and lower the associated costs. Although many workers’ compensation systems have adopted treatment guidelines to reduce the overuse of unnecessary care, limited attention has been paid to ensuring that the care workers do receive is high quality. Further, guidelines are not designed to enable objective assessments of quality of care. This study sought to develop quality measures for the diagnostic evaluation and non-operative management of CTS, including managing occupational activities and functional limitations. Methods: Using a variation of the well-established RAND/UCLA Appropriateness Method, we developed draft quality measures using guidelines and literature reviews. Next, in a two-round modified-Delphi process, a multidisciplinary panel of 11 U.S. experts in CTS rated the measures on validity and feasibility. Results: Of 40 draft measures, experts rated 31 (78%) valid and feasible. Nine measures pertained to diagnostic evaluation, such as assessing symptoms, signs, and risk factors. Eleven pertained to non-operative treatments, such as the use of splints, steroid injections, and medications. Eleven others addressed assessing the association between symptoms and work, managing occupational activities, and accommodating functional limitations. Conclusions: These measures will complement existing treatment guidelines by enabling providers, payers, policymakers, and researchers to assess quality of care for CTS in an objective, structured manner. Given the characteristics of previous measures developed with these methods, greater adherence to these measures will probably lead to improved patient outcomes at a population level.
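    In a RAND/UCLA-style process, each panelist rates a measure on a 1-9 scale and measures with high median ratings are retained. A minimal sketch of that aggregation; the exact retention threshold and any disagreement rule are assumptions, not the study's protocol:

```python
# Aggregate panel ratings for one draft measure: retain it if the median
# rating across panelists meets an assumed threshold.
from statistics import median

def retained(ratings: list[int], threshold: float = 7.0) -> bool:
    return median(ratings) >= threshold

panel = [8, 7, 9, 7, 6, 8, 7, 9, 8, 7, 6]  # hypothetical ratings, 11 experts
print("valid and feasible" if retained(panel) else "dropped")
```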

    Secukinumab, an Interleukin-17A Inhibitor, in Ankylosing Spondylitis

    Background Secukinumab is an anti–interleukin-17A monoclonal antibody that has been shown to control the symptoms of ankylosing spondylitis in a phase 2 trial. We conducted two phase 3 trials of secukinumab in patients with active ankylosing spondylitis. Methods In two double-blind trials, we randomly assigned patients to receive secukinumab or placebo. In MEASURE 1, a total of 371 patients received intravenous secukinumab (10 mg per kilogram of body weight) or matched placebo at weeks 0, 2, and 4, followed by subcutaneous secukinumab (150 mg or 75 mg) or matched placebo every 4 weeks starting at week 8. In MEASURE 2, a total of 219 patients received subcutaneous secukinumab (150 mg or 75 mg) or matched placebo at baseline; at weeks 1, 2, and 3; and every 4 weeks starting at week 4. At week 16, patients in the placebo group were randomly reassigned to subcutaneous secukinumab at a dose of 150 mg or 75 mg. The primary end point was the proportion of patients with at least 20% improvement in Assessment of Spondyloarthritis International Society (ASAS20) response criteria at week 16. Results In MEASURE 1, the ASAS20 response rates at week 16 were 61%, 60%, and 29% for subcutaneous secukinumab at doses of 150 mg and 75 mg and for placebo, respectively (P<0.001 for both comparisons with placebo); in MEASURE 2, the rates were 61%, 41%, and 28% for subcutaneous secukinumab at doses of 150 mg and 75 mg and for placebo, respectively (P<0.001 for the 150-mg dose and P=0.10 for the 75-mg dose). The significant improvements were sustained through 52 weeks. Infections, including candidiasis, were more common with secukinumab than with placebo during the placebo-controlled period of MEASURE 1. During the entire treatment period, pooled exposure-adjusted incidence rates of grade 3 or 4 neutropenia, candida infections, and Crohn’s disease were 0.7, 0.9, and 0.7 cases per 100 patient-years, respectively, in secukinumab-treated patients. Conclusions Secukinumab at a subcutaneous dose of 150 mg, with either subcutaneous or intravenous loading, provided significant reductions in the signs and symptoms of ankylosing spondylitis at week 16. Secukinumab at a subcutaneous dose of 75 mg resulted in significant improvement only with a higher intravenous loading dose. (Funded by Novartis Pharma; ClinicalTrials.gov numbers, NCT01358175 and NCT01649375.)
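    The reported comparisons are tests of response proportions between trial arms. A minimal sketch using a chi-square test on a 2x2 table, with counts back-calculated approximately from the published percentages and arm sizes (illustrative only, not the trial's statistical analysis):

```python
# Compare ASAS20 response rates between two arms with a chi-square test.
# Counts approximate MEASURE 2 (150 mg ~44/72 responders vs. placebo ~20/73),
# back-calculated from the published percentages for illustration only.
from scipy.stats import chi2_contingency

table = [[44, 72 - 44],   # 150-mg arm: responders, non-responders
         [20, 73 - 20]]   # placebo arm: responders, non-responders
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.4f}")
```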