250 research outputs found

    A pilot study on palmyrah pinattu (dried fruit pulp) as an anti-diabetic food component

    The fruit pulp of palmyrah (Borassus flabellifer L.) has been shown to inhibit intestinal glucose uptake in mice; the active principle is a steroidal saponin, flabelliferin-II, which inhibits intestinal ATPase in mice at the 5 × 10⁻⁵ M level. Palmyrah fruit pulp (PFP) is widely used to manufacture many food products, including dried PFP (pinattu), which has been consumed in North-East Sri Lanka for centuries. The present study investigated whether PFP in the form of pinattu could reduce serum glucose levels in mild diabetic (Type-II) patients who were not on a drug regimen, with a view to developing pinattu as an anti-diabetic food component. Newly diagnosed, mild Type-II diabetic patients attending the diabetic clinic at the Family Practice Centre, University of Sri Jayewardenepura, Sri Lanka, were subjected to a glucose challenge (75 g/50 kg BW) after a 10-hour overnight fast, and their blood glucose levels were determined. On each patient's subsequent visit (3 days after the first), blood glucose was determined after administration of PFP in the form of pinattu (6 g/50 kg BW) or fibre (4 g/50 kg BW) extracted from PFP prior to the glucose challenge. A crossover design was employed, in which each patient served as his or her own control. In all mild diabetic patients treated with pinattu, there was a significant reduction (p < 0.01; by 15-48%) in blood glucose concentration after a glucose challenge. The results of the present study therefore suggest that pinattu (dried PFP) could be used as an anti-hyperglycemic agent. Keywords: Borassus flabellifer, flabelliferin, palmyrah fruit pulp, blood glucose, diabetes. International Journal of Biological and Chemical Sciences Vol. 1 (3) 2007: pp. 250-25

    Effectiveness of personal planning for residents re-entering the community during the facility initiative in Ontario

    This paper reports on the relocation of people with intellectual disabilities (ID) from large-scale provincially run institutions that took place in Ontario as part of the Facility Initiative. Three case studies were examined in order to report on this process as experienced by those who lived and worked through it. Specifically, the planning process conducted by the Ministry of Community and Social Services (MCSS) to assist each person with his/her transition to community living was examined using the current standard of practice in person-centered planning approaches. Effectiveness was evaluated as the ability to apply a person-centered approach across settings and people, as well as the factors that facilitated or hindered its application. Results show that, in general, the personal plans do not appear to reflect the pre-transition experience of the person. Also, the transitional planning process did not appear to be person-centered, nor did it facilitate further person-centered planning in the community.

    The Human Touch


    Progression of coronary artery calcification in conventional hemodialysis, nocturnal hemodialysis, and kidney transplantation

    Introduction: Cardiovascular disease is the leading cause of death in end-stage renal disease (ESRD) and is strongly associated with vascular calcification. An important driver of vascular calcification is high phosphate levels, which become lower when patients initiate nocturnal hemodialysis or receive a kidney transplant. However, it is unknown whether nocturnal hemodialysis or kidney transplantation mitigates vascular calcification. Therefore, we compared progression of coronary artery calcification (CAC) between patients treated with conventional hemodialysis, patients treated with nocturnal hemodialysis, and kidney transplant recipients. Methods: We measured CAC annually for up to 3 years in 114 patients with ESRD who were transplantation candidates: 32 who continued conventional hemodialysis, 34 who initiated nocturnal hemodialysis (≥4 × 8 hours/week), and 48 who received a kidney transplant. We compared CAC progression between groups as the difference in square-root-transformed volume scores per year (ΔCAC SQRV) using linear mixed models, with conventional hemodialysis as the reference category. Results: The mean age of the study population was 53 ± 13 years, 75 patients (66%) were male, and the median dialysis duration was 28 (IQR 12-56) months. The median CAC score at enrollment was 171 (IQR 10-647), which did not differ significantly between treatment groups (P = 0.83). Compared to conventional hemodialysis, CAC progression did not differ significantly for nocturnal hemodialysis (-0.10; 95% CI -0.77 to 0.57) or kidney transplantation (-0.33; 95% CI -0.96 to 0.29) in adjusted models. Conclusions: Nocturnal hemodialysis and kidney transplantation are not associated with significantly less CAC progression compared with conventional hemodialysis during up to 3 years of follow-up. Further studies are needed to confirm these findings, to determine which type of calcification is measured with CAC in end-stage renal disease, and whether that reflects cardiovascular risk.
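
    A minimal Python sketch of the kind of mixed-model comparison described above is given below, assuming a long-format data set with one row per patient per annual scan; the file name, column names, and grouping labels are illustrative, and the abstract does not state which software the authors used.

        # Sketch: compare CAC progression (square-root-transformed volume scores)
        # between treatment groups with a linear mixed model, as described above.
        # Column names (patient_id, years, cac_volume, group) are illustrative.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("cac_measurements.csv")      # one row per patient per visit
        df["cac_sqrt"] = np.sqrt(df["cac_volume"])    # square-root-transformed volume score
        df["group"] = pd.Categorical(
            df["group"],
            categories=["conventional_hd", "nocturnal_hd", "transplant"],  # reference first
        )

        # Random intercept and slope per patient; the time-by-group interaction terms
        # estimate the difference in progression (delta CAC SQRV per year) of each
        # group relative to conventional hemodialysis.
        model = smf.mixedlm(
            "cac_sqrt ~ years * group",
            data=df,
            groups=df["patient_id"],
            re_formula="~years",
        )
        result = model.fit()
        print(result.summary())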

    Reversibility of Frailty After Bridge-to-Transplant Ventricular Assist Device Implantation or Heart Transplantation.

    BACKGROUND: We recently reported that frailty is independently predictive of increased mortality in patients with advanced heart failure referred for heart transplantation (HTx). The aim of this study was to assess the impact of frailty on short-term outcomes after bridge-to-transplant ventricular assist device (BTT-VAD) implantation and/or HTx and to determine whether frailty is reversible after these procedures. METHODS: Between August 2013 and August 2016, 100 of 126 consecutive patients underwent frailty assessment using Fried's Frailty Phenotype before surgical intervention: 40 (21 nonfrail, 19 frail) underwent BTT-VAD implantation and 77 (60 nonfrail, 17 frail) underwent HTx, including 17 of the 40 BTT-VAD-supported patients. Postprocedural survival, intubation time, and intensive care unit and hospital lengths of stay were compared between frail and nonfrail groups. Twenty-six frail patients were reassessed at 2 months or longer postintervention. RESULTS: Frail patients had lower survival (63 ± 10% vs 94 ± 3% at 1 year, P = 0.012) and experienced significantly longer intensive care unit (11 vs 5 days, P = 0.002) and hospital (49 vs 25 days, P = 0.003) lengths of stay after surgical intervention compared with nonfrail patients. Twelve of 13 frail patients improved their frailty score after VAD implantation (4.0 ± 0.8 to 1.4 ± 1.1, P < 0.001), and 12 of 13 frail patients improved their frailty score after HTx (3.2 ± 0.4 to 0.9 ± 0.9, P < 0.001). Handgrip strength and depression improved postintervention; only a slight improvement in cognitive function was seen. CONCLUSIONS: Frail patients with advanced heart failure experience increased mortality and morbidity after surgical intervention with BTT-VAD or HTx. Among those who survive, frailty is partly or completely reversible, underscoring the importance of considering this factor as a dynamic rather than fixed entity.
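
    As context for the frailty scores reported above (a 0-5 scale, with frailty defined as three or more criteria met), a minimal Python sketch of how a Fried's Frailty Phenotype score could be tallied is shown below; the class and its comments are illustrative placeholders, not the study's actual assessment instrument.

        # Illustrative tally of Fried's Frailty Phenotype (0-5; frail if >= 3 criteria).
        # The sex- and BMI-specific grip-strength and walk-speed cutoffs of the published
        # phenotype are not reproduced here; each criterion is taken as already judged.
        from dataclasses import dataclass

        @dataclass
        class FriedAssessment:
            unintentional_weight_loss: bool   # e.g., >4.5 kg in the past year
            self_reported_exhaustion: bool
            weak_grip_strength: bool          # below sex/BMI-specific cutoff
            slow_walk_speed: bool             # below sex/height-specific cutoff
            low_physical_activity: bool

            def score(self) -> int:
                return sum([
                    self.unintentional_weight_loss,
                    self.self_reported_exhaustion,
                    self.weak_grip_strength,
                    self.slow_walk_speed,
                    self.low_physical_activity,
                ])

            def is_frail(self) -> bool:
                return self.score() >= 3

        # Example: a patient meeting four of the five criteria scores 4 and is frail.
        patient = FriedAssessment(True, True, True, True, False)
        print(patient.score(), patient.is_frail())   # 4 True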

    Can Campylobacter coli induce Guillain-Barré syndrome?

    Campylobacter jejuni enteritis is the most frequently identified infection preceding Guillain-Barré syndrome (GBS), and neural damage is thought to be induced through molecular mimicry between C. jejuni lipo-oligosaccharide (LOS) and human gangliosides. It has been questioned whether other Campylobacter species, including C. curvus, C. upsaliensis and C. coli, could be similarly involved. This is relevant because it would imply that bacterial factors considered important in the aetiology of GBS crossed species barriers. Two prior reports have appeared in which C. coli was putatively associated with a case of GBS.

    Scholars’ open debate paper on the World Health Organization ICD-11 gaming disorder proposal

    Concerns about problematic gaming behaviors deserve our full attention. However, we claim that it is far from clear that these problems can or should be attributed to a new disorder. The empirical basis for a Gaming Disorder proposal, such as the one in the new ICD-11, suffers from fundamental issues. Our main concerns are the low quality of the research base, the fact that the current operationalization leans too heavily on substance use and gambling criteria, and the lack of consensus on the symptomatology and assessment of problematic gaming. The act of formalizing this disorder, even as a proposal, has negative medical, scientific, public-health, societal, and human rights fallout that should be considered. Of particular concern are moral panics around the harm of video gaming: they might result in premature application of diagnosis in the medical community and the treatment of abundant false-positive cases, especially among children and adolescents. Second, research will be locked into a confirmatory approach, rather than an exploration of the boundaries of normal versus pathological. Third, the healthy majority of gamers will be affected negatively. We expect that the premature inclusion of Gaming Disorder as a diagnosis in ICD-11 will cause significant stigma to the millions of children who play video games as part of a normal, healthy life. At this point, suggesting formal diagnoses and categories is premature: the ICD-11 proposal for Gaming Disorder should be removed to avoid a waste of public health resources and to avoid causing harm to healthy video gamers around the world.

    Interlaboratory comparison of sample preparation methods, database expansions, and cutoff values for identification of yeasts by matrix-assisted laser desorption ionization-time of flight mass spectrometry using a yeast test panel

    An interlaboratory study using matrix-assisted laser desorption ionization–time of flight mass spectrometry (MALDI-TOF MS) for the identification of clinically important yeasts (n = 35) was performed at 11 clinical centers, one company, and one reference center using the Bruker Daltonics MALDI Biotyper system. The optimal cutoff for the MALDI-TOF MS score was investigated using receiver operating characteristic (ROC) curve analyses. The percentages of correct identifications were compared for different sample preparation methods and different databases. Logistic regression analysis was performed to analyze the association between the number of spectra in the database and the percentage of strains that were correctly identified. A total of 5,460 MALDI-TOF MS results were obtained. Using all results, the area under the ROC curve was 0.95 (95% confidence interval [CI], 0.94 to 0.96). With a sensitivity of 0.84 and a specificity of 0.97, a cutoff value of 1.7 was considered optimal. The overall percentage of correct identifications (formic acid-ethanol extraction method, score > 1.7) was 61.5% when the commercial Bruker Daltonics database (BDAL) was used, and it increased to 86.8% when an extended BDAL supplemented with a Centraalbureau voor Schimmelcultures (CBS)-KNAW Fungal Biodiversity Centre in-house database (BDAL + CBS in-house) was used. A greater number of main spectra (MSP) in the database was associated with a higher percentage of correct identifications (odds ratio [OR], 1.10; 95% CI, 1.05 to 1.15; P < 0.01). The results from the direct transfer method ranged from 0% to 82.9% correct identifications, with the results of the top four centers ranging from 71.4% to 82.9% correct identifications. This study supports the use of a cutoff value of 1.7 for the identification of yeasts by MALDI-TOF MS. The inclusion of enough isolates of the same species in the database can enhance the proportion of correctly identified strains. Further optimization of the preparation methods, especially of the direct transfer method, may contribute to improved diagnosis of yeast-related infections.
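
    The cutoff determination described above can be illustrated with a minimal Python sketch, assuming one row per identification with the Biotyper log score and a reference-method verdict; the file and column names are placeholders, and this shows only the general ROC procedure, not the study's actual analysis pipeline.

        # Sketch: choose a MALDI-TOF score cutoff from an ROC curve, in the spirit of
        # the study above. Assumed columns: score (Biotyper log score) and correct
        # (1/0, agreement with the reference identification).
        import numpy as np
        import pandas as pd
        from sklearn.metrics import roc_auc_score, roc_curve

        df = pd.read_csv("maldi_results.csv")
        fpr, tpr, thresholds = roc_curve(df["correct"], df["score"])
        auc = roc_auc_score(df["correct"], df["score"])

        # Youden's J picks the threshold maximizing sensitivity + specificity - 1.
        best = np.argmax(tpr - fpr)
        print(f"AUC = {auc:.2f}")
        print(f"cutoff ~ {thresholds[best]:.2f} "
              f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")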

    Coronary Artery Calcification in Hemodialysis and Peritoneal Dialysis

    Background: Vascular calcification is seen in most patients on dialysis and is strongly associated with cardiovascular mortality. Vascular calcification is promoted by phosphate, which generally reaches higher levels in hemodialysis than in peritoneal dialysis. However, whether vascular calcification develops less in peritoneal dialysis than in hemodialysis is currently unknown. Therefore, we compared coronary artery calcification (CAC), its progression, and calcification biomarkers between patients on hemodialysis and patients on peritoneal dialysis. Methods: We measured CAC in 134 patients who had been treated exclusively with hemodialysis (n = 94) or peritoneal dialysis (n = 40) and were transplantation candidates. In 57 of them (34 on hemodialysis and 23 on peritoneal dialysis), we also measured CAC progression annually for up to 3 years, as well as the inactive species of desphospho-uncarboxylated matrix Gla protein (dp-ucMGP), fetuin-A, and osteoprotegerin. We compared CAC cross-sectionally with Tobit regression. CAC progression was compared in two ways: with linear mixed models as the difference in square-root-transformed volume score per year (ΔCAC SQRV), and with Tobit mixed models. We adjusted for potential confounders. Results: In the cross-sectional cohort, CAC volume scores were 92 mm³ in hemodialysis and 492 mm³ in peritoneal dialysis (adjusted difference 436 mm³; 95% CI -47 to 919; p = 0.08). In the longitudinal cohort, peritoneal dialysis was associated with significantly more CAC progression defined as ΔCAC SQRV (adjusted difference 1.20; 95% CI 0.09 to 2.31; p = 0.03), but not in the Tobit mixed models (adjusted difference in CAC score increase per year 106 mm³; 95% CI -140 to 352; p = 0.40). Peritoneal dialysis was associated with higher osteoprotegerin (adjusted p = 0.02) but not with dp-ucMGP or fetuin-A. Conclusions: Peritoneal dialysis is not associated with less CAC or CAC progression than hemodialysis, and is perhaps associated with even more progression. This indicates that vascular calcification does not develop less in peritoneal dialysis than in hemodialysis.
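
    Because CAC volume scores pile up at zero, the cross-sectional comparison above uses Tobit (left-censored) regression. A minimal Python sketch of such a model, fitted by maximum likelihood, is given below; the data file, covariates, and starting values are assumptions, and the study does not state which implementation was used.

        # Sketch: left-censored (Tobit) regression of CAC volume on dialysis modality,
        # fitted by maximum likelihood. Assumed columns: cac_volume, pd_vs_hd (1 = PD,
        # 0 = HD), age; the actual model adjusted for additional confounders.
        import numpy as np
        import pandas as pd
        from scipy import optimize, stats

        df = pd.read_csv("cac_baseline.csv")
        y = df["cac_volume"].to_numpy(dtype=float)
        X = np.column_stack([np.ones(len(df)), df["pd_vs_hd"], df["age"]])

        def negloglik(params):
            beta, log_sigma = params[:-1], params[-1]
            sigma = np.exp(log_sigma)
            mu = X @ beta
            censored = y <= 0
            ll = np.where(
                censored,
                stats.norm.logcdf(-mu / sigma),                   # probability mass at the zero floor
                stats.norm.logpdf((y - mu) / sigma) - log_sigma,  # density of observed positive scores
            )
            return -ll.sum()

        start = np.append(np.zeros(X.shape[1]), np.log(y.std() + 1.0))
        fit = optimize.minimize(negloglik, start, method="BFGS")
        print("coefficients (intercept, PD vs HD, age):", fit.x[:-1])
        print("sigma:", np.exp(fit.x[-1]))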