
    Sampling Schemes for Policy Analyses Using Computer Simulation Experiments

    Evaluating the environmental and economic impacts of agricultural policies is not a simple task. A systematic approach to evaluation would include the effect of policy-dependent factors (such as tillage practices, crop rotations, and chemical use) as well as the effect of policy-independent covariates (such as weather, topography, and soil attributes) on response variables (such as the amount of soil eroded or chemical leached into the groundwater). For comparison purposes, the effects of these input combinations on the response variable would have to be assessed under competing policy scenarios. Because the number of input combinations is large in most problems, and because policies to be evaluated are often not in use at the time of the study, practitioners have resorted to simulation experiments to generate data. But generating data from simulation models is often costly and time consuming; thus, the number of input combinations that can be included in a study may be limited even in simulation experiments. In this paper, we discuss the problem of designing computer simulation experiments that require generating data for just a fraction of the possible input combinations. We propose an approach that is based on subsampling the 1992 National Resources Inventory (NRI) points. We illustrate the procedure by assessing soil erosion in a situation where there are observed data (reported by the Natural Resources Conservation Service (NRCS)) for comparison. Estimates for soil erosion obtained using the procedure we propose are in good agreement with NRCS-reported values.
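    The core idea, running the expensive simulator only on a weighted subsample of inventory points and scaling estimates back up with the survey weights, can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the synthetic NRI-like data, the column names, the stratified sampling rule, and the erosion_model() stub are all assumptions.

```python
# Hypothetical sketch: stratified subsampling of NRI-like points for a
# simulation experiment.  Column names and erosion_model() are illustrative
# assumptions, not the authors' actual data or model.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Synthetic stand-in for NRI points: each point has a survey weight (acres
# represented), a stratum (e.g. a soil/climate region), and covariates.
nri = pd.DataFrame({
    "stratum": rng.integers(0, 10, size=5000),
    "weight": rng.uniform(100, 1000, size=5000),
    "slope": rng.uniform(0, 15, size=5000),
    "rainfall": rng.uniform(300, 1200, size=5000),
})

def erosion_model(row, policy):
    """Placeholder for a costly simulation run under a given policy."""
    tillage_factor = 0.6 if policy == "conservation" else 1.0
    return tillage_factor * 0.05 * row.slope * row.rainfall / 100.0

def subsample_and_estimate(nri, policy, frac=0.05):
    # Sample a fixed fraction within each stratum, proportional to the
    # survey weight, so rare strata are still represented.
    sample = (nri.groupby("stratum", group_keys=False)
                 .apply(lambda g: g.sample(frac=frac, weights=g.weight,
                                           random_state=0)))
    # Run the expensive simulator only on the subsample.
    sample = sample.assign(erosion=sample.apply(erosion_model, axis=1,
                                                policy=policy))
    # A weighted mean scales the subsample back to the full inventory.
    return np.average(sample.erosion, weights=sample.weight)

for policy in ("baseline", "conservation"):
    print(policy, round(subsample_and_estimate(nri, policy), 2))
```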

    Confirmation of beach accretion by grain-size trend analysis: Camposoto beach, Cádiz, SW Spain

    Grain-size trend analysis (GSTA) is applied in an exploratory approach to characterize sediment transport on Camposoto beach (Cádiz, SW Spain). In May 2009 the mesotidal beach showed a well-developed swash bar on the upper foreshore, which was associated with fair-weather conditions prevailing just before and during the field survey. The results were tested by means of a spatial autocorrelation statistic (Moran's I). Two sedimentological trends were recognized, i.e. development towards finer, better sorted and more negatively skewed sediment (FB–), and towards finer, better sorted and less negatively or more positively skewed sediment (FB+). Both vector fields were compared with results obtained from more classical approaches (sand tracers, microtopography and current measurements). This revealed that both trends can be considered realistic, the FB+ trend being identified for the first time in a beach environment. The data demonstrate that, on the well-developed swash bar, sediment transported onshore becomes both finer and better sorted towards the coast. On the lower foreshore, which exhibits a steeper slope produced by breaking waves, the higher-energy processes winnow out finer particles and thereby produce negatively skewed grain-size distributions. The upper foreshore, which has a flatter and smoother slope, is controlled by lower-energy swash-backwash and overwash processes. As a result, the skewness of the grain-size distributions evolves towards less negative or more positive values. The skewness parameter appears to be distributed as a function of the beach slope and, thus, reflects variations in hydrodynamic energy. This has novel implications for coastal management.
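    A minimal sketch of the Gao–Collins style trend-vector computation that underlies GSTA is given below, using invented station data in phi units (so a "finer" sample has a larger mean and a "better sorted" sample a smaller sorting value). The station coordinates, the characteristic distance, and the omission of the Moran's I significance test are simplifications for illustration, not the authors' procedure.

```python
# Illustrative GSTA trend vectors for the FB- and FB+ cases, assuming a
# handful of hypothetical sampling stations.
import numpy as np

# (x, y, mean_phi, sorting_phi, skewness) per station
stations = np.array([
    [0.0,   0.0, 2.1, 0.55, -0.10],
    [50.0,  0.0, 2.3, 0.50, -0.15],
    [100.0, 0.0, 2.5, 0.45, -0.20],
    [0.0,  50.0, 2.2, 0.52, -0.05],
    [50.0, 50.0, 2.4, 0.48,  0.02],
])
d_cr = 60.0  # characteristic distance: only closer neighbours are compared

def trend_vectors(stations, case="FB-"):
    xy, mean, sort, skew = (stations[:, :2], stations[:, 2],
                            stations[:, 3], stations[:, 4])
    vectors = np.zeros_like(xy)
    for i in range(len(stations)):
        for j in range(len(stations)):
            if i == j:
                continue
            d = np.linalg.norm(xy[j] - xy[i])
            if d > d_cr:
                continue
            finer = mean[j] > mean[i]            # larger phi mean = finer
            better_sorted = sort[j] < sort[i]    # smaller sorting = better sorted
            skew_ok = (skew[j] < skew[i]) if case == "FB-" else (skew[j] > skew[i])
            if finer and better_sorted and skew_ok:
                vectors[i] += (xy[j] - xy[i]) / d  # unit vector towards j
    return vectors

print(trend_vectors(stations, "FB-"))
print(trend_vectors(stations, "FB+"))
```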

    Universal Definition of Loss to Follow-Up in HIV Treatment Programs: A Statistical Analysis of 111 Facilities in Africa, Asia, and Latin America

    Background: Although patient attrition is recognized as a threat to the long-term success of antiretroviral therapy programs worldwide, there is no universal definition for classifying patients as lost to follow-up (LTFU). We analyzed data from health facilities across Africa, Asia, and Latin America to empirically determine a standard LTFU definition. Methods and Findings: At a set "status classification" date, patients were categorized as either "active" or "LTFU" according to different intervals from time of last clinic encounter. For each threshold, we looked forward 365 d to assess the performance and accuracy of this initial classification. The best-performing definition for LTFU had the lowest proportion of patients misclassified as active or LTFU. Observational data from 111 health facilities - representing 180,718 patients from 19 countries - were included in this study. In the primary analysis, for which data from all facilities were pooled, an interval of 180 d (95% confidence interval [CI]: 173–181 d) since last patient encounter resulted in the fewest misclassifications (7.7%, 95% CI: 7.6%–7.8%). A secondary analysis that gave equal weight to cohorts and to regions generated a similar result (175 d); however, an alternate approach that used inverse weighting for cohorts based on variance and equal weighting for regions produced a slightly lower summary measure (150 d). When examined at the facility level, the best-performing definition varied from 58 to 383 d (mean = 150 d), but when a standard definition of 180 d was applied to each facility, only slight increases in misclassification (mean = 1.2%, 95% CI: 1.0%–1.5%) were observed. Using this definition, the proportion of patients classified as LTFU by facility ranged from 3.1% to 45.1% (mean = 19.9%, 95% CI: 19.1%–21.7%). Conclusions: Based on this evaluation, we recommend the adoption of ≥180 d since the last clinic visit as a standard LTFU definition. Such standardization is an important step to understanding the reasons that underlie patient attrition and establishing more reliable and comparable program evaluation worldwide.
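    The threshold search described above can be illustrated with a small simulation: classify each patient as LTFU when the time since the last visit exceeds a candidate threshold, look forward to see whether that call turns out to be wrong, and keep the threshold with the fewest misclassifications. The synthetic visit data and variable names below are assumptions for illustration, not the study's dataset.

```python
# Hedged sketch of the threshold-evaluation idea: a patient is called LTFU if
# the last visit is more than `threshold` days before the status date; the
# call is wrong if an "active" patient never returns within 365 d, or an
# "LTFU" patient does return.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
days_since_last_visit = rng.exponential(120, n).astype(int)   # before status date
# Synthetic look-forward outcome: the longer the gap, the less likely a return.
returns_within_365d = rng.random(n) < np.exp(-days_since_last_visit / 200)

def misclassification(threshold):
    classified_ltfu = days_since_last_visit > threshold
    wrong = ((classified_ltfu & returns_within_365d) |
             (~classified_ltfu & ~returns_within_365d))
    return wrong.mean()

thresholds = np.arange(30, 400, 5)
errors = [misclassification(t) for t in thresholds]
best = thresholds[int(np.argmin(errors))]
print(f"best-performing threshold on this synthetic cohort: {best} d")
```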

    Dietary calcium and zinc deficiency risks are decreasing but remain prevalent

    Globally, more than 800 million people are undernourished while >2 billion people have one or more chronic micronutrient deficiencies (MNDs). More than 6% of global mortality and morbidity burdens are associated with undernourishment and MNDs. Here we show that, in 2011, 3.5 and 1.1 billion people were at risk of calcium (Ca) and zinc (Zn) deficiency respectively due to inadequate dietary supply. The global mean dietary supply of Ca and Zn in 2011 was 684 ± 211 and 16 ± 3 mg capita⁻¹ d⁻¹ (±SD) respectively. Between 1992 and 2011, global risk of deficiency of Ca and Zn decreased from 76 to 51%, and 22 to 16%, respectively. Approximately 90% of those at risk of Ca and Zn deficiency in 2011 were in Africa and Asia. To our knowledge, these are the first global estimates of dietary Ca deficiency risks based on food supply. We conclude that continuing to reduce Ca and Zn deficiency risks through dietary diversification and food and agricultural interventions, including fortification, crop breeding and use of micronutrient fertilisers, will remain a significant challenge.
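    One way to picture a supply-based risk calculation of this kind is sketched below: assume per-capita supply follows a log-normal distribution around a reported mean and take the fraction of the population falling below an estimated average requirement as the share at risk. The coefficient of variation and the requirement values are assumptions, and the study builds its estimates from national supply data rather than the global mean, so the printed numbers are purely illustrative.

```python
# Illustrative EAR cut-point style risk estimate from mean supply data.
# The CV and requirement values are assumptions, not the paper's inputs.
from math import log, sqrt
from statistics import NormalDist

def risk_of_deficiency(mean_supply, ear, cv=0.25):
    # Log-normal parameters matching the given mean and coefficient of variation.
    sigma = sqrt(log(1 + cv**2))
    mu = log(mean_supply) - sigma**2 / 2
    # Share of the population whose supply falls below the requirement.
    return NormalDist(mu, sigma).cdf(log(ear))

# Global mean supplies quoted in the abstract (684 mg Ca, 16 mg Zn per capita
# per day) against hypothetical requirements; not a reproduction of the
# paper's country-level estimates.
print(f"Ca risk: {risk_of_deficiency(684, 800):.0%}")
print(f"Zn risk: {risk_of_deficiency(16, 10):.0%}")
```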

    Preoperative serum carcinoembryonic antigen, albumin and age are supplementary to UICC staging systems in predicting survival for colorectal cancer patients undergoing surgical treatment

    Background: The aim of this study was to determine the influence of prognostic factors, in addition to the UICC staging system, on cancer-specific and overall survival rates for patients with colorectal cancer (CRC) undergoing surgical treatment. Methods: Between January 1996 and December 2006, a total of 1367 CRC patients who underwent surgical treatment at Kaohsiung Medical University Hospital were analyzed. We retrospectively investigated the clinicopathologic features of these patients. All patients were followed up intensively, and their outcomes were investigated completely. Results: Of the 1367 CRC patients, 757 (55.4%) were male and 610 (44.6%) were female. The median follow-up period was 60 months (range, 3–132 months). A multivariate analysis identified low serum albumin level (P = 0.011), advanced UICC stage (P < 0.001), and high carcinoembryonic antigen (CEA) level (P < 0.001) as independent prognostic factors for cancer-specific survival. Likewise, a multivariate analysis showed that age over 65 years (P < 0.001), advanced UICC stage (P < 0.001), and high CEA level (P < 0.001) were independent prognostic factors for overall survival. Furthermore, combining UICC stage, serum CEA and albumin levels as predictors of cancer-specific survival showed that the more unfavorable prognostic factors involved, the poorer the cancer-specific survival rate. Similarly, combining UICC stage, age and serum CEA level as predictors of overall survival showed that the more unfavorable prognostic factors involved, the poorer the overall survival rate. Of these prognostic factors, preoperative serum CEA level was the only significant prognostic factor for patients with stage II and III CRC in both cancer-specific and overall survival. Conclusion: Preoperative serum albumin level, CEA level and age markedly affect the postoperative outcome of CRC patients undergoing surgical treatment. In addition to the conventional UICC staging system, it may be important to take these additional factors into account for CRC patients prior to surgical treatment.
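    The abstract does not name the survival model, but a Cox proportional hazards regression is the conventional choice for this kind of multivariate analysis; the sketch below uses synthetic data and the lifelines package, with covariate coding that is assumed rather than taken from the study.

```python
# Hypothetical Cox proportional hazards sketch with synthetic data; the
# covariates mirror the factors named in the abstract, but the coding,
# cut-offs, and survival times are invented.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 500
df = pd.DataFrame({
    "age_over_65": rng.integers(0, 2, n),
    "high_cea": rng.integers(0, 2, n),       # preoperative CEA above cut-off
    "low_albumin": rng.integers(0, 2, n),    # preoperative albumin below cut-off
    "uicc_stage": rng.integers(1, 5, n),     # stages I-IV coded 1-4
})
# Synthetic survival times that worsen with stage and high CEA.
hazard = 0.01 * np.exp(0.4 * df.uicc_stage + 0.5 * df.high_cea)
df["months"] = rng.exponential(1 / hazard)
df["event"] = (df.months < 120).astype(int)  # administrative censoring at 120 mo
df["months"] = df.months.clip(upper=120)

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="event")
cph.print_summary()  # hazard ratios and p-values per covariate
```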

    Projected Prevalence of Inadequate Nutrient Intakes in Europe

    Background: The purpose of this study was to analyze the prevalence of nutrient intake inadequacy in Europe, applying the Nordic Nutrition Recommendations in the context of the EURRECA Network of Excellence. Methods: Nutrient data were obtained from the European Nutrition and Health Report II. Nutritional surveys were included if they used a validated food frequency questionnaire or diet history, or a food diary/record with at least 7 days of records or an adjustment for intra-individual variability. The nutrients analyzed were: vitamin C, vitamin D, vitamin B-12, folic acid, calcium, iron, zinc, selenium, copper, and iodine. The estimated average requirement cut-point was applied to estimate inadequacy. The Nordic and Institute of Medicine nutrient recommendations were used as references. Results: The mean prevalence of inadequacy was below 11% for zinc, iron, and vitamin B-12 (only in the elderly); it was 11-20% for copper in adults and the elderly, for vitamin B-12 in adults, and for vitamin C in the elderly. The prevalence was above 20% for vitamin D, folic acid, calcium, selenium, and iodine in adults and the elderly, and for vitamin C in adults. Conclusions: Vitamin C, vitamin D, folic acid, calcium, selenium, and iodine were the nutrients showing the highest prevalence of inadequate intakes in Europe. Copyright (C) 2011 S. Karger AG, Basel.
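    The estimated average requirement (EAR) cut-point method mentioned above reduces, in its simplest form, to the proportion of usual intakes below the EAR. The sketch below illustrates this with an invented intake distribution and an assumed EAR value, and omits the adjustment for intra-individual variability that real survey data require.

```python
# Hedged illustration of the EAR cut-point calculation: prevalence of
# inadequacy = share of usual intakes below the EAR.  The intake distribution
# and EAR value are invented for the example.
import numpy as np

rng = np.random.default_rng(42)
usual_vitamin_c_intake = rng.lognormal(mean=np.log(75), sigma=0.4, size=2000)  # mg/d
EAR_VITAMIN_C = 60.0  # mg/d, an assumed reference value

prevalence_inadequate = np.mean(usual_vitamin_c_intake < EAR_VITAMIN_C)
print(f"estimated prevalence of inadequacy: {prevalence_inadequate:.1%}")
```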