
    How does a biopsy of endoscopically normal terminal ileum contribute to the diagnosis? Which patients should undergo biopsy?

    Background: Terminal ileum endoscopy and biopsy are the diagnostic tools for diseases affecting the ileum. However, abnormal histological findings can be present in an endoscopically normal terminal ileum. Objective: This study was performed to evaluate the histopathological results of biopsies from endoscopically normal terminal ileum and to determine which pre-procedure clinical and laboratory factors, if any, predict abnormal histopathological results. Methods: A total of 297 patients who underwent colonoscopy and terminal ileum biopsy and had either a normal terminal ileum or a few aphthous ulcers in the terminal ileum, together with completely normal colonic mucosa, were included in the study. The patients were divided into two groups: normal cases and cases with aphthous ulcers. Histopathological and pre-procedural laboratory results were analyzed according to the indication for the procedure. Results: The terminal ileum was endoscopically normal in 200 patients, and 97 patients had aphthous ulcers. Chronic ileitis was present in 5.5% of patients with an endoscopically normal terminal ileum and in 39.2% of patients with aphthous ulcers. In both groups, the highest rates of chronic ileitis were found in patients with known inflammatory bowel disease (IBD) (15.4% and 50%, respectively), anemia (9.5% and 43.5%, respectively), and chronic diarrhea together with abdominal pain (7.7% and 44.8%, respectively). The sensitivity of mean platelet volume for predicting chronic ileitis was 87% and the specificity was 45% at a cut-off value below 9.35 fL. Conclusion: In patients referred for anemia or for chronic diarrhea together with abdominal pain, both the frequency of aphthous ulcers detected by ileoscopy and the frequency of chronic ileitis detected histopathologically despite a normal-appearing ileum were elevated. Keywords: terminal ileum; ileoscopy; chronic ileitis; inflammatory bowel disease
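
    The reported mean platelet volume (MPV) cut-off effectively works as a binary test: a patient is flagged as likely to have chronic ileitis when MPV falls below 9.35 fL. The short sketch below shows how sensitivity and specificity are computed for such a threshold; the patient values in it are hypothetical, not the study's data.

```python
# Sketch: sensitivity and specificity of an MPV cut-off for chronic ileitis.
# The values below are hypothetical illustrations, not the study's data.
mpv_values = [8.1, 9.0, 9.6, 10.2, 8.7, 9.9, 8.4, 10.5]             # MPV in fL
has_ileitis = [True, True, False, False, True, False, True, False]  # biopsy result

CUTOFF = 9.35  # fL; a patient tests "positive" when MPV is below the cut-off

tp = sum(m < CUTOFF and d for m, d in zip(mpv_values, has_ileitis))
fn = sum(m >= CUTOFF and d for m, d in zip(mpv_values, has_ileitis))
tn = sum(m >= CUTOFF and not d for m, d in zip(mpv_values, has_ileitis))
fp = sum(m < CUTOFF and not d for m, d in zip(mpv_values, has_ileitis))

sensitivity = tp / (tp + fn)  # proportion of ileitis cases flagged by the cut-off
specificity = tn / (tn + fp)  # proportion of ileitis-free patients correctly cleared
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```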

    How Universal is the Relationship Between Remotely Sensed Vegetation Indices and Crop Leaf Area Index? A Global Assessment

    Leaf Area Index (LAI) is a key variable that bridges remote sensing observations to the quantification of agroecosystem processes. In this study, we assessed the universality of the relationships between crop LAI and remotely sensed Vegetation Indices (VIs). We first compiled a global dataset of 1459 in situ quality-controlled crop LAI measurements and collected Landsat satellite images to derive five different VIs: the Simple Ratio (SR), the Normalized Difference Vegetation Index (NDVI), two versions of the Enhanced Vegetation Index (EVI and EVI2), and the Green Chlorophyll Index (CIgreen). Based on this dataset, we developed global LAI-VI relationships for each crop type and VI using symbolic regression and the Theil-Sen (TS) robust estimator. Results suggest that the global LAI-VI relationships are statistically significant, crop-specific, and mostly non-linear. These relationships explain more than half of the total variance in ground LAI observations (R2 > 0.5) and provide LAI estimates with RMSE below 1.2 m2/m2. Among the five VIs, EVI/EVI2 are the most effective, and the crop-specific LAI-EVI and LAI-EVI2 relationships constructed by TS are robust when tested against three independent validation datasets of varied spatial scales. While the heterogeneity of agricultural landscapes leads to a diverse set of local LAI-VI relationships, the relationships provided here hold globally on an average basis, allowing the generation of large-scale, spatially explicit LAI maps. This study contributes to the operationalization of large-area crop modeling and, by extension, has relevance to both fundamental and applied agroecosystem research.
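
    As a rough illustration of the modeling approach, NDVI can be derived from Landsat red and near-infrared reflectance, and a Theil-Sen estimator can then be fit to paired LAI and VI observations. The sketch below uses scikit-learn's TheilSenRegressor on synthetic data; the log-linear functional form and all numbers are assumptions for illustration, not the paper's fitted relationships.

```python
# Sketch: fitting a robust LAI-VI relationship with the Theil-Sen estimator.
# Data are synthetic placeholders, not the paper's ground observations.
import numpy as np
from sklearn.linear_model import TheilSenRegressor

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

rng = np.random.default_rng(0)
red = rng.uniform(0.03, 0.10, 200)   # red surface reflectance
nir = rng.uniform(0.25, 0.55, 200)   # near-infrared surface reflectance
vi = ndvi(nir, red)

# Assume a non-linear (exponential-like) LAI-VI relation plus noise, fit on log(LAI).
lai = np.exp(2.5 * vi) - 0.8 + rng.normal(0, 0.2, vi.size)
lai = np.clip(lai, 0.05, None)

model = TheilSenRegressor(random_state=0)
model.fit(vi.reshape(-1, 1), np.log(lai))  # robust slope/intercept in log space

vi_new = np.array([[0.55], [0.75]])
lai_pred = np.exp(model.predict(vi_new))   # back-transform to LAI (m2/m2)
print(lai_pred)
```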

    The PREDICT study uncovers three clinical courses of acutely decompensated cirrhosis that have distinct pathophysiology

    Background & Aims: Acute decompensation (AD) of cirrhosis is defined as the acute development of ascites, gastrointestinal hemorrhage, hepatic encephalopathy, infection or any combination thereof, requiring hospitalization. The presence of organ failure(s) in patients with AD defines acute-on-chronic liver failure (ACLF). The PREDICT study is a European, prospective, observational study, designed to characterize the clinical course of AD and to identify predictors of ACLF. Methods: A total of 1,071 patients with AD were enrolled. We collected detailed pre-specified information on the 3-month period prior to enrollment, and clinical and laboratory data at enrollment. Patients were then closely followed up for 3 months. Outcomes (liver transplantation and death) at 1 year were also recorded. Results: Three groups of patients were identified. Pre-ACLF patients (n = 218) developed ACLF and had 3-month and 1-year mortality rates of 53.7% and 67.4%, respectively. Unstable decompensated cirrhosis (UDC) patients (n = 233) required ≥1 readmission but did not develop ACLF and had mortality rates of 21.0% and 35.6%, respectively. Stable decompensated cirrhosis (SDC) patients (n = 620) were not readmitted, did not develop ACLF and had a 1-year mortality rate of only 9.5%. The 3 groups differed significantly regarding the grade and course of systemic inflammation (high-grade at enrollment with aggravation during follow-up in pre-ACLF; low-grade at enrollment with subsequent steady-course in UDC; and low-grade at enrollment with subsequent improvement in SDC) and the prevalence of surrogates of severe portal hypertension throughout the study (high in UDC vs. low in pre-ACLF and SDC). Conclusions: Acute decompensation without ACLF is a heterogeneous condition with 3 different clinical courses and 2 major pathophysiological mechanisms: systemic inflammation and portal hypertension. Predicting the development of ACLF remains a major future challenge. ClinicalTrials.gov number: NCT03056612. Lay summary: Herein, we describe, for the first time, 3 different clinical courses of acute decompensation (AD) of cirrhosis after hospital admission. The first clinical course includes patients who develop acute-on-chronic liver failure (ACLF) and have a high short-term risk of death – termed pre-ACLF. The second clinical course (unstable decompensated cirrhosis) includes patients requiring frequent hospitalizations unrelated to ACLF and is associated with a lower mortality risk than pre-ACLF. Finally, the third clinical course (stable decompensated cirrhosis), includes two-thirds of all patients admitted to hospital with AD – patients in this group rarely require hospital admission and have a much lower 1-year mortality risk

    An international collaborative evaluation of central serous chorioretinopathy: different therapeutic approaches and review of literature. The European Vitreoretinal Society central serous chorioretinopathy study

    Purpose: To study and compare the efficacy of different therapeutic options for the treatment of central serous chorioretinopathy (CSCR). Methods: This is a nonrandomized, international multicentre study of 1719 patients (1861 eyes) diagnosed with CSCR from 63 centres (24 countries). Reported data included the method of treatment, the results of diagnostic examinations [fluorescein angiography and/or optical coherence tomography (OCT)], and best-corrected visual acuity (BCVA) before and after therapy. The mean duration of observation was 11 months, extended in a minority of cases to up to 7 years. The aim of this study is to evaluate the efficacy of the different therapeutic options for CSCR in terms of both visual (BCVA) and anatomical (OCT) improvement. Results: One thousand seven hundred nineteen patients (1861 eyes) diagnosed with CSCR were included. Treatments performed were nonsteroidal anti-inflammatory eye drops, laser photocoagulation, micropulse diode laser photocoagulation, photodynamic therapy (PDT; standard PDT, reduced-dose PDT, reduced-fluence PDT), intravitreal (IVT) anti-vascular endothelial growth factor (anti-VEGF) injection, observation and other treatments. The "others" category included combinations of the main treatments as well as a variety of further options such as eplerenone, spironolactone, acetazolamide, beta-blockers, anti-anxiety drugs, aspirin, folic acid, methotrexate, statins, Vitis vinifera extract medication and pars plana vitrectomy. The majority of the patients (77%) were men. Odds ratios (ORs) were calculated for partial or complete resolution of fluid on OCT with each treatment as compared with observation. In univariate analysis, the anatomical result (improvement in subretinal fluid on OCT at 1 month) was favoured by age <60 years (p < 0.005), no previous observation (p < 0.0002), duration of less than 3 months (p < 0.0001), absence of CSCR in the fellow eye (p = 0.04), leakage outside of the arcade (p = 0.05) and fluid height >500 μm (p = 0.03). The ORs for obtaining partial or complete resolution showed that anti-VEGF and eye drops were not statistically significant, whereas PDT (8.5), thermal laser (11.3) and micropulse laser (8.9) led to better anatomical results with less variability. In univariate analysis, the functional result at 1 month was favoured by a first episode (p = 0.04), height of subretinal fluid >500 μm (p < 0.0001) and a short duration of observation (p = 0.02). Finally, there was no statistically significant difference among the treatments at 12 months. Conclusion: Spontaneous resolution has been described in a high percentage of patients. Laser (micropulse and thermal) and PDT seem to lead to significant early anatomical improvement; however, there is little change beyond the first month of treatment. The real visual benefit needs further clarification.
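
    The treatment comparisons above are reported as odds ratios for achieving resolution relative to observation. A minimal sketch of how an odds ratio and its 95% confidence interval are obtained from a 2x2 table is shown below; the counts are hypothetical placeholders, not figures from the study.

```python
# Sketch: odds ratio (OR) and 95% CI for resolution with treatment vs. observation.
# Counts are hypothetical placeholders, not the study's data.
import math

resolved_treated, not_resolved_treated = 80, 20     # treated eyes
resolved_observed, not_resolved_observed = 40, 60   # observed (untreated) eyes

odds_ratio = (resolved_treated * not_resolved_observed) / (
    not_resolved_treated * resolved_observed
)

# Woolf's method: standard error of log(OR) from the four cell counts.
se_log_or = math.sqrt(
    1 / resolved_treated + 1 / not_resolved_treated
    + 1 / resolved_observed + 1 / not_resolved_observed
)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```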

    Secukinumab, an Interleukin-17A Inhibitor, in Ankylosing Spondylitis

    Background Secukinumab is an anti–interleukin-17A monoclonal antibody that has been shown to control the symptoms of ankylosing spondylitis in a phase 2 trial. We conducted two phase 3 trials of secukinumab in patients with active ankylosing spondylitis. Methods In two double-blind trials, we randomly assigned patients to receive secukinumab or placebo. In MEASURE 1, a total of 371 patients received intravenous secukinumab (10 mg per kilogram of body weight) or matched placebo at weeks 0, 2, and 4, followed by subcutaneous secukinumab (150 mg or 75 mg) or matched placebo every 4 weeks starting at week 8. In MEASURE 2, a total of 219 patients received subcutaneous secukinumab (150 mg or 75 mg) or matched placebo at baseline; at weeks 1, 2, and 3; and every 4 weeks starting at week 4. At week 16, patients in the placebo group were randomly reassigned to subcutaneous secukinumab at a dose of 150 mg or 75 mg. The primary end point was the proportion of patients with at least 20% improvement in Assessment of Spondyloarthritis International Society (ASAS20) response criteria at week 16. Results In MEASURE 1, the ASAS20 response rates at week 16 were 61%, 60%, and 29% for subcutaneous secukinumab at doses of 150 mg and 75 mg and for placebo, respectively (P<0.001 for both comparisons with placebo); in MEASURE 2, the rates were 61%, 41%, and 28% for subcutaneous secukinumab at doses of 150 mg and 75 mg and for placebo, respectively (P<0.001 for the 150-mg dose and P=0.10 for the 75-mg dose). The significant improvements were sustained through 52 weeks. Infections, including candidiasis, were more common with secukinumab than with placebo during the placebo-controlled period of MEASURE 1. During the entire treatment period, pooled exposure-adjusted incidence rates of grade 3 or 4 neutropenia, Candida infections, and Crohn’s disease were 0.7, 0.9, and 0.7 cases per 100 patient-years, respectively, in secukinumab-treated patients. Conclusions Secukinumab at a subcutaneous dose of 150 mg, with either subcutaneous or intravenous loading, provided significant reductions in the signs and symptoms of ankylosing spondylitis at week 16. Secukinumab at a subcutaneous dose of 75 mg resulted in significant improvement only with a higher intravenous loading dose. (Funded by Novartis Pharma; ClinicalTrials.gov numbers, NCT01358175 and NCT01649375.)
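
    The safety figures quoted above are exposure-adjusted incidence rates, i.e. the number of events divided by the total follow-up time rather than by the number of patients. A minimal sketch of that calculation, using hypothetical numbers rather than trial data, follows.

```python
# Sketch: exposure-adjusted incidence rate per 100 patient-years.
# Numbers are hypothetical, not taken from the MEASURE trials.
events = 9                    # e.g. cases of a given adverse event
total_patient_years = 1200.0  # summed follow-up time across all treated patients

rate_per_100_py = events / total_patient_years * 100
print(f"{rate_per_100_py:.2f} events per 100 patient-years")  # 0.75
```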

    Design and baseline characteristics of the finerenone in reducing cardiovascular mortality and morbidity in diabetic kidney disease trial

    Background: Among people with diabetes, those with kidney disease have exceptionally high rates of cardiovascular (CV) morbidity and mortality and of progression of their underlying kidney disease. Finerenone is a novel, nonsteroidal, selective mineralocorticoid receptor antagonist that has been shown to reduce albuminuria in type 2 diabetes (T2D) patients with chronic kidney disease (CKD) while carrying only a low risk of hyperkalemia. However, the effect of finerenone on CV and renal outcomes has not yet been investigated in long-term trials. Patients and Methods: The Finerenone in Reducing CV Mortality and Morbidity in Diabetic Kidney Disease (FIGARO-DKD) trial aims to assess the efficacy and safety of finerenone compared with placebo in reducing clinically important CV and renal outcomes in T2D patients with CKD. FIGARO-DKD is a randomized, double-blind, placebo-controlled, parallel-group, event-driven trial running in 47 countries with an expected duration of approximately 6 years. FIGARO-DKD randomized 7,437 patients with an estimated glomerular filtration rate >= 25 mL/min/1.73 m2 and albuminuria (urinary albumin-to-creatinine ratio >= 30 to <= 5,000 mg/g). The study has at least 90% power to detect a 20% reduction in the risk of the primary outcome (overall two-sided significance level alpha = 0.05), the composite of time to first occurrence of CV death, nonfatal myocardial infarction, nonfatal stroke, or hospitalization for heart failure. Conclusions: FIGARO-DKD will determine whether an optimally treated cohort of T2D patients with CKD at high risk of CV and renal events will experience cardiorenal benefits with the addition of finerenone to their treatment regimen. Trial Registration: EudraCT number: 2015-000950-39; ClinicalTrials.gov identifier: NCT02545049.
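
    The statement that the trial has at least 90% power to detect a 20% risk reduction at a two-sided alpha of 0.05 implies a required number of primary-outcome events. The back-of-the-envelope sketch below uses Schoenfeld's approximation for a 1:1 randomized time-to-event trial; it illustrates the reasoning only and is not the trial's actual power calculation.

```python
# Sketch: approximate number of events needed to detect a hazard ratio of 0.8
# (a 20% risk reduction) with 90% power at two-sided alpha = 0.05,
# using Schoenfeld's formula for a 1:1 randomized time-to-event trial.
# Illustrative only, not the FIGARO-DKD power calculation.
import math
from scipy.stats import norm

alpha, power, hazard_ratio = 0.05, 0.90, 0.80
z_alpha = norm.ppf(1 - alpha / 2)   # ~1.96
z_beta = norm.ppf(power)            # ~1.28

# With equal allocation, the allocation term p1*p2 equals 0.25, hence the factor 4.
required_events = 4 * (z_alpha + z_beta) ** 2 / math.log(hazard_ratio) ** 2
print(f"approximately {math.ceil(required_events)} primary-outcome events")
```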

    WATER-Model: An Optimal Allocation of Water Resources in Turkey, Syria and Iraq

    Political instability in several countries of the Middle East is overshadowing one of the biggest challenges of the coming century: water, a natural resource that is easily taken for granted but whose scarcity might lead to serious conflicts. This paper investigates an optimal water allocation for the Tigris and Euphrates river basin by introducing the WATER-Model. A series of scenarios is analyzed to examine the effects of different levels of cooperation on an optimal water allocation. Special emphasis is placed on the effects of filling new Turkish reservoirs, which can cause additional welfare losses if these actions are not coordinated on a basin-wide basis. Modeling results show that Turkey is the most efficient in its water usage. However, using the water for irrigation purposes in Turkey, instead of for the Iraqi or Syrian domestic and industrial sectors, decreases the overall welfare. The Euphrates basin in particular might thus incur losses of up to 33% due to such strategic behaviour. The predicted growth in water demand in the region will increase this water scarcity further. Minimum flow treaties between riparian countries, however, can help to increase the overall welfare and should therefore be fostered.
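
    The welfare comparisons above come from an optimization model that allocates river flow among competing uses under hydrological constraints. The toy linear program below sketches the general idea of maximizing basin-wide benefits subject to a minimum downstream flow; the benefit coefficients and flow figures are invented, and this is not the WATER-Model formulation itself.

```python
# Sketch: a toy basin-wide water allocation as a linear program.
# Benefit coefficients and flow numbers are invented for illustration;
# this is not the WATER-Model formulation.
from scipy.optimize import linprog

total_flow = 100.0           # available river flow (arbitrary units)
min_downstream_flow = 30.0   # e.g. a minimum-flow treaty obligation

# Decision variables: water to upstream irrigation, downstream domestic use,
# downstream industrial use. Marginal benefits per unit of water (made up).
benefits = [1.0, 1.8, 1.5]

# linprog minimizes, so negate benefits to maximize total welfare.
c = [-b for b in benefits]

# Total withdrawals cannot exceed the flow left after the minimum-flow release.
A_ub = [[1.0, 1.0, 1.0]]
b_ub = [total_flow - min_downstream_flow]

# Capacity limits on each use (also made up).
bounds = [(0, 50), (0, 25), (0, 20)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("allocation (irrigation, domestic, industrial):", res.x)
print("total welfare:", -res.fun)
```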