
    Does the risk of childhood diabetes mellitus require revision of the guideline values for nitrate in drinking water?

    In recent years, several studies have addressed a possible relationship between nitrate exposure and childhood type 1 insulin-dependent diabetes mellitus. The present ecologic study describes a possible relation between the incidence of type 1 diabetes and nitrate levels in drinking water in The Netherlands, and evaluates whether the World Health Organization and European Commission standard for nitrate in drinking water (50 mg/L) is adequate to prevent risk of this disease. During 1993-1995 in The Netherlands, 1,104 cases of type 1 diabetes were diagnosed in children 0-14 years of age. We were able to use 1,064 of these cases, in a total population of 2,829,020 children, in this analysis. We classified mean nitrate levels in drinking water in 3,932 postal code areas in The Netherlands in 1991-1995 according to two exposure classifications. One was based on equal numbers of children exposed to different nitrate levels (0.25-2.08, 2.10-6.42, and 6.44-41.19 mg/L nitrate); the other was based on cut-off values of 10 and 25 mg/L nitrate. We determined standardized incidence ratios (SIRs) for type 1 diabetes in subgroups of the 2,829,020 children with respect to both nitrate exposure classifications, sex, and age, and compared these in univariate analysis using the chi-square test for trend. We compared incidence rate ratios (IRRs) by multivariate analysis in a Poisson regression model. We found an effect of increasing age on the incidence of type 1 diabetes, but no effect of sex or of nitrate concentration in drinking water under either exposure classification. For nitrate levels > 25 mg/L, an increased SIR and an increased IRR of 1.46 were observed; however, this increase was not statistically significant, probably because of the small number of cases (15 of 1,064). We concluded that there is no convincing evidence that nitrate in drinking water at current exposure levels is a risk factor for childhood type 1 diabetes mellitus in The Netherlands, although a threshold value > 25 mg/L for the occurrence of this disease cannot be excluded.
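
    The multivariate step described above (incidence rate ratios from a Poisson regression on aggregated incidence data, with the population at risk as an offset) can be sketched roughly as follows. The counts, strata and variable names are hypothetical placeholders, not the study data.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Hypothetical aggregated strata: cases, children at risk, nitrate tertile, age band
        df = pd.DataFrame({
            "cases":    [110, 100, 105, 230, 250, 255],
            "children": [470_000, 472_000, 465_000, 470_000, 473_000, 464_000],
            "nitrate":  ["0.25-2.08", "2.10-6.42", "6.44-41.19"] * 2,
            "age":      ["0-9"] * 3 + ["10-14"] * 3,
        })

        # Poisson regression with a log(population) offset; exponentiated coefficients are IRRs
        model = smf.glm(
            "cases ~ C(nitrate, Treatment('0.25-2.08')) + C(age)",
            data=df,
            family=sm.families.Poisson(),
            offset=np.log(df["children"]),
        ).fit()

        print(np.exp(model.params).round(2))   # IRRs relative to the reference categories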

    Nitrate contamination of drinking water: relationship with HPRT variant frequency in lymphocyte DNA and urinary excretion of N-nitrosamines.

    We studied peripheral lymphocyte HPRT variant frequency and endogenous nitrosation in human populations exposed to various nitrate levels in their drinking water. Four test populations of women volunteers were compared. The low and medium tap water nitrate exposure groups (14 and 21 subjects) used public water supplies with nitrate levels of 0.02 and 17.5 mg/l, respectively. The medium and high well water nitrate exposure groups (6 and 9 subjects) used private water wells with mean nitrate levels of 25 and 135 mg/l, respectively. Higher nitrate intake through drinking water resulted in a dose-dependent increase in 24-hr urinary nitrate excretion and in increased salivary nitrate and nitrite levels. The mean log variant frequency of peripheral lymphocytes was significantly higher in the medium well water exposure group than in the low and medium tap water exposure groups. An inverse correlation between peripheral lymphocyte labeling index and nitrate concentration of drinking water was observed. Analysis of N-nitrosamines in the urine of 22 subjects by gas chromatography-mass spectrometry revealed the presence of N-nitrosopyrrolidine in 18 subjects. Mutagenicity analysis showed that a small number of the well water samples were mutagenic in the Ames Salmonella typhimurium test after concentration over XAD-2 resin. In conclusion, consumption of drinking water, especially well water, with high nitrate levels may pose a genotoxic risk to humans, as indicated by increased HPRT variant frequencies and by endogenous formation of carcinogenic N-nitroso compounds from nitrate-derived nitrite.

    Persistence of anticancer activity in berry extracts after simulated gastrointestinal digestion and colonic fermentation

    Fruit and vegetable consumption is associated at the population level with a protective effect against colorectal cancer. Phenolic compounds, which are especially abundant in berries, are of interest because of their putative anticancer activity. After consumption, however, phenolic compounds are subject to digestive conditions within the gastrointestinal tract that alter their structures and potentially their function. Moreover, the majority of phenolic compounds are not efficiently absorbed in the small intestine, and a substantial portion passes into the colon. We characterized berry extracts (raspberries, strawberries, blackcurrants) produced by in vitro-simulated upper intestinal tract digestion and subsequent fecal fermentation. These extracts and selected individual colonic metabolites were then evaluated for their putative anticancer activities using in vitro models of colorectal cancer representing the key stages of initiation, promotion and invasion. Over a physiologically relevant dose range (0–50 µg/ml gallic acid equivalents), the digested and fermented extracts demonstrated significant anti-genotoxic, anti-mutagenic and anti-invasive activity on colonocytes. This work indicates that phenolic compounds from berries undergo considerable structural modification during passage through the gastrointestinal tract, but their breakdown products and metabolites retain biological activity and can modulate cellular processes associated with colon cancer.

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and the reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall’s tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6 to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by multiple grades of surgeons to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be used in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
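
    As a rough illustration of the ROC analysis mentioned above (quantifying how well an ordinal difficulty grade discriminates a dichotomised outcome such as conversion to open surgery), the sketch below uses simulated data; the grade-to-risk model and all numbers are assumptions, not CholeS results.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 1000
        grade = rng.integers(1, 6, size=n)               # simulated Nassar grade 1-5
        p_convert = 1 / (1 + np.exp(-(grade - 4.5)))     # assumed: higher grade, higher risk
        converted = rng.random(n) < p_convert            # simulated binary outcome

        # AUROC quantifies the predictive accuracy of the grade for the outcome
        print(f"AUROC = {roc_auc_score(converted, grade):.3f}")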

    Associations of familial risk factors with social fears and social phobia: evidence for the continuum hypothesis in social anxiety disorder?

    We examined parental psychopathology and family environment in subthreshold and DSM-IV threshold conditions of social anxiety disorder (SAD) in a representative cohort sample of 1,395 adolescents. Offspring and parental psychopathology were assessed using the DIA-X/M-CIDI; recalled parental rearing and family functioning were assessed via questionnaire. Diagnostic interviews in parents were supplemented by family history reports from offspring. The cumulative lifetime incidence was 23.07% for symptomatic SAD, and 18.38% and 7.41% for subthreshold and threshold SAD, respectively. The specific parent-to-offspring association for SAD occurred for threshold SAD only. For subthreshold and threshold SAD, similar associations were found with other parental anxiety disorders, depression and substance use disorders. Parental rearing behaviour, but not family functioning, was associated with offspring threshold SAD and, although less strongly and less consistently, also with subthreshold SAD. The results suggest a continuous, graded relationship between familial risk factors and offspring SAD. Parental psychopathology and negative parental rearing styles may be used to define high-risk groups and to assign individuals who already have subthreshold conditions of SAD to early intervention programs.

    Patient-directed self-management of pain (PaDSMaP) compared to treatment as usual following total knee replacement; a randomised controlled trial

    Background Self-administration of medicines by patients while in hospital is increasingly promoted, despite little evidence of the risks and benefits. Pain control after total knee replacement (TKR) is known to be poor. The aim of the study was to determine whether patients undergoing TKR who self-medicate their oral analgesics in the immediate post-operative period have better pain control than those who receive their pain control via nurse-led drug rounds (treatment as usual (TAU)). Methods A prospective, parallel-design, open-label, randomised controlled trial comparing pain control under patient-directed self-management of pain (PaDSMaP) with nurse control of oral analgesia (TAU) after TKR. Between July 2011 and March 2013, 144 self-medicating adults were recruited at a secondary care teaching hospital in the UK. TAU patients (n = 71) were given medications by a nurse after their TKR. PaDSMaP patients (n = 73) took oral medications for analgesia and co-morbidities after two 20-min training sessions reinforced with four booklets. The primary outcome was pain (100 mm visual analogue scale (VAS)) at 3 days following TKR surgery or at discharge (whichever came sooner). Seven patients did not undergo surgery for reasons unrelated to the study and were excluded from the intention-to-treat (ITT) analysis. Results The ITT analysis did not detect any significant differences between the two groups’ pain scores. A per-protocol (but underpowered) analysis of the 60% of patients able to self-medicate found reduced pain compared with the TAU group at day 3/discharge (VAS -9.9 mm, 95% CI -18.7 to -1.1). One patient in the self-medicating group over-medicated but suffered no harm. Conclusion Self-medicating patients did not have better (lower) pain scores than nurse-managed patients following TKR. This cohort of patients was elderly with multiple co-morbidities and may not be the ideal target group for self-medication.
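
    For readers unfamiliar with how a between-group estimate such as "VAS -9.9 mm, 95% CI -18.7 to -1.1" is obtained, the sketch below computes a mean difference with a pooled-variance 95% confidence interval on simulated VAS scores; the group sizes and values are illustrative assumptions, not trial data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        vas_self = rng.normal(35.0, 20.0, size=44)   # hypothetical per-protocol self-medicating scores
        vas_tau  = rng.normal(45.0, 20.0, size=71)   # hypothetical treatment-as-usual scores

        n1, n2 = len(vas_self), len(vas_tau)
        diff = vas_self.mean() - vas_tau.mean()
        dof = n1 + n2 - 2
        pooled_sd = np.sqrt(((n1 - 1) * vas_self.var(ddof=1) + (n2 - 1) * vas_tau.var(ddof=1)) / dof)
        se = pooled_sd * np.sqrt(1 / n1 + 1 / n2)
        half_width = stats.t.ppf(0.975, dof) * se

        print(f"Mean difference = {diff:.1f} mm "
              f"(95% CI {diff - half_width:.1f} to {diff + half_width:.1f})")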

    Structure of Spontaneous UP and DOWN Transitions Self-Organizing in a Cortical Network Model

    Synaptic plasticity is considered to play a crucial role in the experience-dependent self-organization of local cortical networks. In the absence of sensory stimuli, the cerebral cortex exhibits spontaneous membrane potential transitions between an UP and a DOWN state. To reveal how cortical networks develop spontaneous activity, or conversely, how spontaneous activity structures cortical networks, we analyze the self-organization of a recurrent network model of excitatory and inhibitory neurons, realistic enough to replicate UP–DOWN states, with spike-timing-dependent plasticity (STDP). The individual neurons in the self-organized network exhibit a variety of temporal patterns in the two-state transitions. In addition, the model develops a feed-forward network-like structure that produces a diverse repertoire of precise sequences of the UP state. Our model shows that the self-organized activity closely resembles the spontaneous activity of cortical networks if STDP is accompanied by the pruning of weak synapses. These results suggest that the two-state membrane potential transitions play an active role in structuring local cortical circuits.
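
    As a hedged sketch of the plasticity rule named above, the snippet below implements a generic pair-based STDP weight update followed by pruning of weak synapses; the amplitudes, time constants and pruning threshold are illustrative assumptions, not parameters from the model.

        import numpy as np

        A_PLUS, A_MINUS = 0.01, 0.012      # potentiation / depression amplitudes (assumed)
        TAU_PLUS, TAU_MINUS = 20.0, 20.0   # time constants in ms (assumed)

        def stdp_dw(dt_ms: float) -> float:
            """Weight change for a spike-time difference dt = t_post - t_pre."""
            if dt_ms > 0:      # pre fires before post: potentiation
                return A_PLUS * np.exp(-dt_ms / TAU_PLUS)
            else:              # post fires before pre: depression
                return -A_MINUS * np.exp(dt_ms / TAU_MINUS)

        # Example: update one synaptic weight, then prune it if it falls below a threshold
        w, w_min = 0.5, 0.05
        for dt in (+8.0, -15.0, -30.0):
            w = max(0.0, w + stdp_dw(dt))
        print(round(w, 4), "pruned" if w < w_min else "kept")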

    Population‐based cohort study of outcomes following cholecystectomy for benign gallbladder diseases

    Background The aim was to describe the management of benign gallbladder disease and to identify characteristics associated with all‐cause 30‐day readmissions and complications in a prospective population‐based cohort. Methods Data were collected on consecutive patients undergoing cholecystectomy in acute UK and Irish hospitals between 1 March and 1 May 2014. Potential explanatory variables influencing all‐cause 30‐day readmissions and complications were analysed by means of multilevel, multivariable logistic regression modelling using a two‐level hierarchical structure with patients (level 1) nested within hospitals (level 2). Results Data were collected on 8909 patients undergoing cholecystectomy in 167 hospitals. Some 1451 cholecystectomies (16·3 per cent) were performed as an emergency and 4165 (46·8 per cent) as elective operations; 3293 patients (37·0 per cent) had had at least one previous emergency admission but had surgery on a delayed basis. The readmission and complication rates at 30 days were 7·1 per cent (633 of 8909) and 10·8 per cent (962 of 8909) respectively. Both readmissions and complications were independently associated with increasing ASA fitness grade, duration of surgery, and increasing numbers of emergency admissions with gallbladder disease before cholecystectomy. No identifiable hospital characteristics were linked to readmissions or complications. Conclusion Readmissions and complications following cholecystectomy are common and associated with patient and disease characteristics.
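
    The two-level analysis described above (patients nested within hospitals, with a binary outcome) can be sketched roughly as follows, here using statsmodels' Bayesian mixed GLM as one possible tool; the simulated data and the variable names (asa, emergency, hospital, readmit) are placeholders, not the study dataset.

        import numpy as np
        import pandas as pd
        from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

        rng = np.random.default_rng(42)
        n, n_hosp = 2000, 40
        df = pd.DataFrame({
            "hospital":  rng.integers(0, n_hosp, size=n),   # level-2 grouping factor
            "asa":       rng.integers(1, 4, size=n),        # ASA fitness grade (assumed 1-3)
            "emergency": rng.integers(0, 3, size=n),        # prior emergency admissions (assumed)
        })
        hosp_effect = rng.normal(0, 0.3, size=n_hosp)[df["hospital"]]
        logit = -3.0 + 0.4 * df["asa"] + 0.3 * df["emergency"] + hosp_effect
        df["readmit"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

        # Level-1 fixed effects plus a level-2 (hospital) random intercept
        model = BinomialBayesMixedGLM.from_formula(
            "readmit ~ asa + emergency",
            {"hospital": "0 + C(hospital)"},
            df,
        )
        result = model.fit_vb()
        print(result.summary())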

    A review of zoonotic infection risks associated with the wild meat trade in Malaysia.

    The overhunting of wildlife for food and commercial gain presents a major threat to biodiversity in tropical forests and poses health risks to humans through contact with wild animals. Using a recent survey of wildlife offered at wild meat markets in Malaysia as a basis, we review the literature to determine the potential zoonotic infection risks from hunting, butchering and consuming the species offered. We also determine which taxa potentially host the highest number of pathogens and discuss the significant disease risks from traded wildlife, considering how cultural practices influence zoonotic transmission. We identify 51 zoonotic pathogens (16 viruses, 19 bacteria and 16 parasites) potentially hosted by wildlife and describe the associated human health risks. The Suidae and Cervidae families potentially host the highest numbers of pathogens. We conclude that there are substantial gaps in our knowledge of zoonotic pathogens and recommend performing microbial food safety risk assessments to assess the hazards of wild meat consumption. Overall, there may be considerable zoonotic risks to people involved in the hunting, butchering or consumption of wild meat in Southeast Asia, and these should be considered in public health strategies.