
    Correlations between Coffee Consumption and Metabolic Phenotypes, Plasma Folate, and Vitamin B12: NHANES 2003 to 2006.

    Metabolic syndrome (MetS) is prevalent not only among overweight and obese individuals but also among normal-weight individuals, in whom it is referred to as the metabolically unhealthy phenotype (MUHP). Conversely, some overweight/obese individuals are protected from MetS, a phenotype known as the metabolically healthy phenotype (MHP). Epidemiological studies indicate that coffee and micronutrients such as plasma folate or vitamin B12 (vit. B12) are inversely associated with MetS. However, correlations among coffee consumption, metabolic phenotypes, plasma folate, and vit. B12 remain unknown. Our objective was to investigate the correlations between coffee consumption, metabolic phenotypes, plasma folate, and vit. B12, as well as to understand associations between plasma folate, vit. B12, and metabolic phenotypes. Associations among coffee consumption, metabolic phenotypes, plasma folate, and vit. B12 were assessed in a cross-sectional study of 2201 participants, 18 years or older, from the 2003-2004 and 2005-2006 National Health and Nutrition Examination Surveys (NHANES). MUHP was classified as having > 3 metabolic abnormalities. Coffee consumption was not associated with metabolic phenotypes, but was negatively correlated with several metabolic variables, including BMI (p < 0.001). Plasma folate was positively associated with MUHP (p < 0.004), while vit. B12 was inversely associated with MUHP (p < 0.035). Our results suggest a potential protective impact of coffee on individual components of MetS and indicate a positive correlation between coffee consumption and MUHP among overweight individuals. Identifying possible dietary factors may provide practical and low-cost dietary intervention targets, specifically for early intervention. Larger randomized intervention studies and prospective longitudinal studies are required to further evaluate these associations.

    Exposure to Secondhand Smoke and the Development of Childhood Caries: NHANES (2011-2012)

    Dental caries continue to plague young children worldwide, with numerous adverse effects including pain, poor growth and development, decreased quality of life, and the potential for life-threatening secondary infections. Factors associated with the development of childhood caries are complex, as they relate to social, economic and/or cultural behaviors. Recent evidence has linked secondhand smoke to the development of childhood dental caries. The purpose of this study was to re-examine the association between the frequency and extent of exposure to secondhand smoke and the development of childhood caries in the United States. Cross-sectional data on 1,511 children aged 4 to 11 years from the U.S. National Health and Nutrition Examination Survey (NHANES) (2011-2012) were analyzed. Results indicate that children living in a home where one or more cigarettes were smoked inside per day were 1.59 times more likely to have caries compared to those who were not exposed to smoke inside the home (95% CI=1.02-2.47, p=0.041). Children without insurance were also at the highest risk for dental caries. However, those with Medicare/Medicaid, despite having government-mandated dental coverage, were also significantly affected and were 1.67 times more likely to have dental caries compared to those with private insurance (95% CI=1.08-2.58, p=0.021). Creative approaches to improving health outcomes of families should include education about the adverse effects of environmental tobacco smoke (ETS) exposure, providing families with low- or no-cost community smoking cessation programs, and reducing barriers to accessing preventive dental services for both children and their families.
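
    As a rough illustration of how an adjusted odds ratio like the 1.59 reported above can be estimated, the sketch below fits a logistic regression and exponentiates the coefficient of interest. The data frame, variable names, and covariate choice are hypothetical assumptions, and the NHANES complex survey design (weights, strata, clusters) is ignored here for simplicity.

    # Illustrative sketch, not the study's analysis code: odds ratio for caries
    # vs. in-home smoke exposure from a logistic regression. All data and
    # variable names below are hypothetical; NHANES survey weights are ignored.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    df = pd.DataFrame({
        "caries":       [1, 0, 1, 1, 0, 0, 1, 0],   # 1 = any dental caries
        "smoke_inside": [1, 0, 1, 1, 0, 0, 0, 1],   # 1 = >=1 cigarette/day smoked in home
        "age_years":    [5, 7, 6, 9, 4, 10, 8, 11],
    })

    X = sm.add_constant(df[["smoke_inside", "age_years"]])
    fit = sm.Logit(df["caries"], X).fit(disp=False)

    # Exponentiate the coefficient and its confidence bounds to get the OR and 95% CI.
    odds_ratio = np.exp(fit.params["smoke_inside"])
    ci_low, ci_high = np.exp(fit.conf_int().loc["smoke_inside"])
    print(f"OR for in-home smoke exposure: {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")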

    The impact of immediate breast reconstruction on the time to delivery of adjuvant therapy: the iBRA-2 study

    Background: Immediate breast reconstruction (IBR) is routinely offered to improve quality of life for women requiring mastectomy, but there are concerns that more complex surgery may delay adjuvant oncological treatments and compromise long-term outcomes. High-quality evidence is lacking. The iBRA-2 study aimed to investigate the impact of IBR on time to adjuvant therapy. Methods: Consecutive women undergoing mastectomy ± IBR for breast cancer between July and December 2016 were included. Patient demographics and operative, oncological and complication data were collected. Time from last definitive cancer surgery to first adjuvant treatment for patients undergoing mastectomy ± IBR was compared, and risk factors associated with delays were explored. Results: A total of 2540 patients were recruited from 76 centres; 1008 (39.7%) underwent IBR (implant-only [n = 675, 26.6%], pedicled flaps [n = 105, 4.1%] and free-flaps [n = 228, 8.9%]). Complications requiring re-admission or re-operation were significantly more common in patients undergoing IBR than in those receiving mastectomy alone. Adjuvant chemotherapy or radiotherapy was required by 1235 (48.6%) patients. No clinically significant differences were seen in time to adjuvant therapy between patient groups, but major complications, irrespective of the surgery received, were significantly associated with treatment delays. Conclusions: IBR does not result in clinically significant delays to adjuvant therapy, but post-operative complications are associated with treatment delays. Strategies to minimise complications, including careful patient selection, are required to improve outcomes for patients.

    Breast cancer management pathways during the COVID-19 pandemic: outcomes from the UK ‘Alert Level 4’ phase of the B-MaP-C study

    Abstract: Background: The B-MaP-C study aimed to determine alterations to breast cancer (BC) management during the peak transmission period of the UK COVID-19 pandemic and the potential impact of these treatment decisions. Methods: This was a national cohort study of patients with early BC undergoing multidisciplinary team (MDT)-guided treatment recommendations during the pandemic, designated ‘standard’ or ‘COVID-altered’, in the preoperative, operative and post-operative setting. Findings: Of 3776 patients (from 64 UK units) in the study, 2246 (59%) had ‘COVID-altered’ management. ‘Bridging’ endocrine therapy was used (n = 951) where theatre capacity was reduced. There was increasing access to COVID-19 low-risk theatres during the study period (59%). In line with national guidance, immediate breast reconstruction was avoided (n = 299). Where adjuvant chemotherapy was omitted (n = 81), the median benefit was only 3% (IQR 2–9%) using ‘NHS Predict’. There was rapid adoption of new evidence-based hypofractionated radiotherapy (n = 781, from 46 units). Only 14 patients (1%) tested positive for SARS-CoV-2 during their treatment journey. Conclusions: ‘COVID-altered’ management decisions were largely in line with pre-COVID evidence-based guidelines, implying that breast cancer survival outcomes are unlikely to be negatively impacted by the pandemic. However, in this study, the potential impact of delays to BC presentation or diagnosis remains unknown.

    EMPTY AND FILLED BOTTLE INSPECTION SYSTEM

    Automated visual inspection systems (AVIS) have a strong ability to support quality control in manufacturing industries by inspecting products automatically instead of relying on manual inspection. This paper presents methods for bottle inspection in manufacturing industries. It describes mechanisms for defect detection, top and bottom detection, cap placement, and fill-level inspection. For empty-bottle inspection, the image is processed by contrast enhancement and a circular Hough transform is then applied; the location and radius of the top and bottom of the bottle are analyzed. After the bottle is filled with liquid and the cap is placed, an edge detection method followed by horizontal line detection is used to identify whether the fill level and cap closure are appropriate. The presented bottle inspection system works with 100% accuracy under proper illumination conditions.
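
    The circular Hough transform step described above could look roughly like the following sketch. This is not the paper's implementation: the image file name, preprocessing choices, and all Hough parameters are assumptions that would need tuning for a real bottling-line camera.

    # Minimal sketch of empty-bottle top detection with a circular Hough transform.
    # File name and parameter values are illustrative assumptions, not the paper's code.
    import cv2
    import numpy as np

    img = cv2.imread("bottle_top.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input image

    # Contrast enhancement, then light smoothing before the Hough transform.
    enhanced = cv2.equalizeHist(img)
    blurred = cv2.medianBlur(enhanced, 5)

    # Detect circular bottle openings; dp, minDist, and radius bounds are guesses.
    circles = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
        param1=100, param2=40, minRadius=20, maxRadius=80,
    )

    if circles is None:
        print("no circular opening found - flag bottle for manual inspection")
    else:
        for x, y, r in np.round(circles[0]).astype(int):
            print(f"bottle opening at ({x}, {y}), radius {r} px")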

    Asian Americans & chronic kidney disease in a nationally representative cohort

    Abstract Background There is a paucity of specific data on early stages of chronic kidney disease (CKD) among Asian Americans (AAs). The objective of this study was to examine the independent association of Asian race/ethnicity and socio-demographic and co-morbidity factors with markers of early kidney damage, ascertained by albumin-to-creatinine ratio (ACR) levels, as well as kidney dysfunction, ascertained by estimated glomerular filtration rate (eGFR) levels, in a large cross-sectional sample of AAs enrolled in the National Health and Nutrition Examination Survey (NHANES). Methods Secondary data analyses were conducted on NHANES 2011–2014 data from a nationally representative sample of 5907 participants who were 18 years or older, US citizens, and of Asian or White race. NHANES data included race (Asian vs. White) as well as other socio-demographic information and comorbidities. Urine ACR categories and eGFR were used as indicators for CKD. Descriptive analyses using frequencies, means (standard deviations), and chi-square tests were conducted first; multivariable logistic regression serial adjustment models were then used to examine the associations of race/ethnicity, other socio-demographic factors (age, sex, education), and co-morbidities (obesity, diabetes, hypertension) with elevated ACR levels (A2 & A3 – CKD Stages 3 and 4–5, respectively) as well as reduced eGFR (G3a–G5 and G3b–G5 – CKD Stages 3–5). Results AAs were more likely than White participants to have ACR levels > 300 mg/g (A3) (adjusted OR (aOR) (95% CI) 2.77 (1.55, 4.97), p = 0.001). In contrast, adjusted analyses demonstrated that AAs were less likely to have reduced eGFR. Conclusions AAs had a higher risk of having ACR levels > 300 mg/g (A3) but a lower risk of having eGFR levels < 60 mL/min/1.73 m2 (G3a–G5). The findings support the need to address the gaps in knowledge regarding disparities in risk of early-stage CKD among AAs.
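
    For reference, the eGFR and ACR categories used as CKD indicators above follow the standard KDIGO cut-offs, which can be expressed as a simple mapping like the sketch below. The function names and example values are illustrative assumptions, not the study's code.

    # Standard KDIGO eGFR (G) and albuminuria (A) categories; illustrative helper,
    # not the study's analysis code.

    def egfr_category(egfr):
        """Map eGFR (mL/min/1.73 m2) to a KDIGO G category."""
        if egfr >= 90: return "G1"
        if egfr >= 60: return "G2"
        if egfr >= 45: return "G3a"
        if egfr >= 30: return "G3b"
        if egfr >= 15: return "G4"
        return "G5"

    def acr_category(acr_mg_g):
        """Map urine albumin-to-creatinine ratio (mg/g) to a KDIGO A category."""
        if acr_mg_g < 30: return "A1"     # normal to mildly increased
        if acr_mg_g <= 300: return "A2"   # moderately increased
        return "A3"                       # severely increased (> 300 mg/g)

    # Example: eGFR 52 and ACR 410 correspond to G3a / A3.
    print(egfr_category(52), acr_category(410))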

    Feasibility and outcomes of an out-of-school and home-based obesity prevention pilot study for rural children on an American Indian reservation

    Abstract Background Children living in rural areas are at higher risk for obesity compared to urban children, and Native American (NA) children have the highest prevalence of overweight/obesity among all races. Out-of-school programs (OOSPs) are a promising setting to improve children’s health. Parents are important in supporting their child’s obesity-related behaviors, yet it remains unclear what combination and dose of parent engagement strategies are feasible and optimal. This study’s primary objective was to assess the feasibility of an OOSP and home-based obesity prevention intervention for rural NA and non-NA children. Methods This was an 11-week, two-group, randomized feasibility study. Participants were children and their parents at one OOSP on a rural American Indian reservation. Children aged 6–9 were randomized to receive the Generations Health (GH) intervention or a comparison condition. The GH group received daily activities focused on physical activity (PA), nutrition, sleep, and reducing TV/screen time, and frequently engaged parents. The comparison group received usual OOSP activities. To assess intervention feasibility, we measured recruitment and participation rates and program satisfaction. We assessed pre- to posttest changes in body composition, PA and sleep patterns, dietary intake and Healthy Eating Index-2010 (HEI-2010) scores, TV/screen time, and nutrition knowledge. We report recruitment and participation rates as percentages and participants’ program satisfaction as means. Two-tailed paired t tests and 95% confidence intervals were used to detect changes in behavioral and health outcome variables. Results Forty-six children met age eligibility criteria; following screening, 52% (24/46) met the inclusion criteria and 96% (23/24) were randomized to the study. Overall, 91% of the children participated in the intervention and 100% participated in at least some of the posttest assessments. Parents reported high program satisfaction (mean rating of 4 on a 1–5 scale). Our outcome measure for child adiposity, zBMI, was reduced by 0.15 in the GH group but increased by 0.13 in the comparison condition. Meaningful changes were evident for total kilocalories, HEI-2010 scores, PA, TV/screen time, and nutrition knowledge. Conclusions High recruitment, participation, and program satisfaction, together with positive health and behavioral outcomes at 11 weeks, provide encouraging indications of the feasibility and potential effectiveness of the intervention. Trial registration ISRCTN2427424
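
    The pre- to posttest comparison described above (a two-tailed paired t test with a 95% confidence interval) can be sketched as follows. The zBMI values are invented for demonstration and are not the study's data.

    # Illustrative paired t test on made-up pre/post zBMI values; not the study's data or code.
    import numpy as np
    from scipy import stats

    pre_zbmi  = np.array([1.8, 1.2, 0.9, 2.1, 1.5, 1.0, 1.7, 1.3])
    post_zbmi = np.array([1.6, 1.1, 0.9, 1.9, 1.4, 0.8, 1.6, 1.2])

    diff = post_zbmi - pre_zbmi
    t_stat, p_value = stats.ttest_rel(post_zbmi, pre_zbmi)

    # 95% confidence interval for the mean change (t distribution, n - 1 degrees of freedom).
    mean_change = diff.mean()
    ci_low, ci_high = stats.t.interval(0.95, len(diff) - 1, loc=mean_change, scale=stats.sem(diff))

    print(f"mean zBMI change = {mean_change:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f}), p = {p_value:.3f}")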

    Understanding the Biological Basis of Glioblastoma Patient-Derived Spheroids

    BACKGROUND/AIM: Resistance to glioblastoma (GB) therapy is attributed to the presence of glioblastoma stem cells (GSC). Here, we defined the behavior of GSC as it pertains to proliferation, migration, and angiogenesis. MATERIALS AND METHODS: Human-derived GSC were isolated and cultured from GB patient tumors. Xenograft GSC were extracted from xenograft tumors, and spheroids were created and compared with human GSC spheroids by flow cytometry and by migration, proliferation, and angiogenesis assays. Expression of Oct3/4, Sox2, GFAP, and Ku80 was assessed by immunoanalysis. RESULTS: The xenograft model showed the formation of two different tumors with distinct characteristics. Tumors that formed at 2 weeks were less aggressive, with well-defined margins, whereas tumors that formed at 5 months were diffuse and aggressive. Expression of Oct3/4 and Sox2 was positive in both human and xenograft GSC. Positive Ku80 expression in xenograft GSC confirmed their human origin. Human and xenograft GSC migrated vigorously in collagen and Matrigel, respectively. Xenograft GSC displayed a higher rate of migration and invasion than human GSC. CONCLUSION: Human GSC were more aggressive in growth and proliferation than xenograft GSC, while xenograft GSC showed increased invasion and migration compared to human GSC. A simple in vitro spheroid system for GSC provides a superior platform for the development of precision medicine in the treatment of GB.

    A comparison of thick-film microscopy, rapid diagnostic test, and polymerase chain reaction for accurate diagnosis of Plasmodium falciparum malaria

    Abstract Background Accurate diagnosis of malaria is important for effective disease management and control. In Cameroon, presumptive clinical diagnosis, thick-film microscopy (TFM), and rapid diagnostic tests (RDT) are commonly used to diagnose cases of Plasmodium falciparum malaria. However, these methods lack the sensitivity to detect low parasitaemia. Polymerase chain reaction (PCR), on the other hand, enhances the detection of sub-microscopic parasitaemia, making it a much-needed tool for epidemiological surveys, mass screening, and the assessment of interventions for malaria elimination. Therefore, this study sought to determine the frequency of cases missed by traditional methods but detected by PCR. Methods Blood samples, collected from 551 febrile Cameroonian patients between February 2014 and February 2015, were tested for P. falciparum by microscopy, RDT, and PCR. The hospital records of participants were reviewed to obtain data on the clinical diagnosis made by the health care worker. Results The prevalence of malaria by microscopy, RDT, and PCR was 31%, 45%, and 54%, respectively. However, of the 92% of participants clinically diagnosed with malaria by the health care worker, 38% were malaria-negative by PCR. PCR detected 23% and 12% more malaria infections than microscopy and RDT, respectively. A total of 128 (23%) individuals in the study population had sub-microscopic infections. The sensitivity of microscopy, RDT, and clinical diagnosis was 57%, 78%, and 100%, respectively; the specificity was 99%, 94%, and 17%; the positive predictive values were 99%, 94%, and 59%; and the negative predictive values were 66%, 78%, and 100%. Thus, 41% of the participants clinically diagnosed as having malaria had fever caused by other pathogens. Conclusions Malaria diagnostic methods such as TFM and RDT missed 12–23% of malaria cases detected by PCR. Therefore, traditional diagnostic approaches (TFM, RDT, and clinical diagnosis) are not adequate when accurate epidemiological data are needed for monitoring malaria control and elimination interventions.
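
    For context, the sensitivity, specificity, PPV, and NPV figures quoted above are derived from a 2x2 cross-tabulation of each test against the PCR reference standard. The sketch below shows that arithmetic on invented counts; it is not the study's data or code.

    # Diagnostic accuracy metrics from a 2x2 table (PCR taken as the reference standard).
    # The counts are invented for illustration only.

    def diagnostic_metrics(tp, fp, fn, tn):
        """Compute sensitivity, specificity, PPV, and NPV from a confusion matrix."""
        return {
            "sensitivity": tp / (tp + fn),  # true positives among all PCR-positive cases
            "specificity": tn / (tn + fp),  # true negatives among all PCR-negative cases
            "ppv":         tp / (tp + fp),  # probability a positive test is a true case
            "npv":         tn / (tn + fn),  # probability a negative test is truly negative
        }

    # Hypothetical counts for an index test vs. PCR.
    for name, value in diagnostic_metrics(tp=90, fp=10, fn=20, tn=80).items():
        print(f"{name}: {value:.0%}")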