Predictors of poor sleep quality among head and neck cancer patients
Objectives/Hypothesis: The objective of this study was to determine the predictors of sleep quality among head and neck cancer patients 1 year after diagnosis. Study Design: This was a prospective, multisite cohort study of head and neck cancer patients (N = 457). Methods: Patients were surveyed at baseline and 1 year after diagnosis. Chart audits were also conducted. The dependent variable was a self-assessed sleep score 1 year after diagnosis. The independent variables were a 1-year pain score, xerostomia, treatment received (radiation, chemotherapy, and/or surgery), presence of a feeding tube and/or tracheotomy, tumor site and stage, comorbidities, depression, smoking, problem drinking, age, and sex. Results: Both baseline (67.1) and 1-year postdiagnosis (69.3) sleep scores were slightly lower than population means (72). Multivariate analyses showed that pain, xerostomia, depression, presence of a tracheotomy tube, comorbidities, and younger age were statistically significant predictors of poor sleep 1 year after diagnosis of head and neck cancer (P < .05). Smoking, problem drinking, and female sex were marginally significant (P < .09). Type of treatment (surgery, radiation and/or chemotherapy), primary tumor site, and cancer stage were not significantly associated with 1-year sleep scores. Conclusions: Many factors adversely affecting sleep in head and neck cancer patients are potentially modifiable and appear to contribute to decreased quality of life. Strategies to reduce pain, xerostomia, depression, smoking, and problem drinking may be warranted, not only for their own inherent value, but also for improvement of sleep and the enhancement of quality of life. Laryngoscope, 2010. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/75789/1/20924_ftp.pd
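As a rough illustration of the multivariate analysis described in this abstract (not the authors' actual code), the sketch below fits an ordinary least squares model of the 1-year sleep score on the reported predictors. The data file and column names are hypothetical.

```python
# Hypothetical sketch of a multivariate model for 1-year sleep scores.
# The data file and column names (sleep_1yr, pain_1yr, ...) are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hnc_sleep_cohort.csv")  # hypothetical cohort data

model = smf.ols(
    "sleep_1yr ~ pain_1yr + xerostomia + depression + tracheotomy"
    " + comorbidities + age + sex + smoking + problem_drinking",
    data=df,
).fit()

print(model.summary())  # coefficients and p-values for each predictor
```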
Salivary Gland Sparing and Improved Target Irradiation by Conformal and Intensity Modulated Irradiation of Head and Neck Cancer
The goals of this study were to facilitate sparing of the major salivary glands while adequately treating tumor targets in patients requiring comprehensive bilateral neck irradiation (RT), and to assess the potential for improved xerostomia. Since 1994 techniques of target irradiation and locoregional tumor control with conformal and intensity modulated radiation therapy (IMRT) have been developed. In patients treated with these modalities, the salivary flow rates before and periodically after RT have been measured selectively from each major salivary gland and the residual flows correlated with glands’ dose volume histograms (DVHs). In addition, subjective xerostomia questionnaires have been developed and validated. The pattern of locoregional recurrence has been examined from computed tomography (CT) scans at the time of recurrence, transferring the recurrence volumes to the planning CT scans, and regenerating the dose distributions at the recurrence sites. Treatment plans for target coverage and dose homogeneity using static, multisegmental IMRT were found to be significantly better than standard RT plans. In addition, significant parotid gland sparing was achieved in the conformal plans. The relationships among dose, irradiated volume, and the residual saliva flow rates from the parotid glands were characterized by dose and volume thresholds. A mean radiation dose of 26 Gy was found to be the threshold for preserved stimulated saliva flow. Xerostomia questionnaire scores suggested that xerostomia was significantly reduced in patients irradiated with bilateral neck, parotid-sparing RT, compared to patients with similar tumors treated with standard RT. Examination of locoregional tumor recurrence patterns revealed that the large majority of recurrences occurred inside targets, in areas that had been judged to be at high risk and that had received RT doses according to the perceived risk. Tangible gains in salivary gland sparing and target coverage are being achieved, and an improvement in some measures of quality of life is suggested by our findings. Additional reduction of xerostomia may be achieved by further sparing of the salivary glands and the non-involved oral cavity. A mean parotid gland dose of ≤ 26 Gy should be a planning objective if significant parotid function preservation is desired. The pattern of recurrence suggests that careful escalation of the dose to areas judged to be at highest risk may improve tumor control. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/41298/1/268_2003_Article_7105.pd
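To make the 26 Gy planning objective concrete, the short sketch below computes a mean parotid dose from a differential dose-volume histogram and checks it against that threshold. The DVH values are invented for illustration and are not taken from the study.

```python
# Illustrative mean-dose check against the 26 Gy parotid objective.
# dose_bins are bin-centre doses (Gy); volumes are the fraction of the
# gland receiving each dose (a differential DVH). Values are made up.
dose_bins = [5, 15, 25, 35, 45]
volumes = [0.30, 0.30, 0.20, 0.15, 0.05]

mean_dose = sum(d * v for d, v in zip(dose_bins, volumes)) / sum(volumes)
print(f"Mean parotid dose: {mean_dose:.1f} Gy")

if mean_dose <= 26:
    print("Meets the <= 26 Gy objective for preserved stimulated saliva flow")
else:
    print("Exceeds the objective; further sparing may be needed")
```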
Effects of Two Commercial Electronic Prescribing Systems on Prescribing Error Rates in Hospital In-Patients: A Before and After Study
In a before-and-after study, Johanna Westbrook and colleagues evaluate the change in prescribing error rates after the introduction of two commercial electronic prescribing systems in two Australian hospitals
Predicting youth participation in urban agriculture in Malaysia: insights from the theory of planned behavior and the functional approach to volunteer motivation
This study examines factors associated with the decision of Malaysian youth to participate in a voluntary urban agriculture program. Urban agriculture has generated significant interest in developing countries to address concerns over food security, growing urbanization, and employment. While abundant data show that attracting young people to traditional agriculture has become a challenge for many countries, few empirical studies have been conducted on youth motivation to participate in urban agriculture programs, particularly in non-Western settings. Drawing on the theory of planned behavior and the functional approach to volunteer motivation, we surveyed 890 students from a public university in Malaysia about their intention to join a new urban agriculture program. Hierarchical regression findings indicated that the strongest predictor of participation was students’ attitude toward urban agriculture, followed by subjective norms, career motives, and perceived barriers to participation. The findings may provide useful information to university program planners in Malaysia for identifying mechanisms to encourage future student involvement in the program.
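The following is a minimal sketch of the hierarchical (blockwise) regression approach mentioned above, inspecting the change in explained variance as predictor blocks are added. The survey file and variable names are hypothetical.

```python
# Hypothetical hierarchical regression: predictor blocks entered stepwise,
# with the change in R-squared reported at each step.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ua_intention_survey.csv")  # hypothetical survey data

blocks = [
    "attitude",
    "attitude + subjective_norms",
    "attitude + subjective_norms + career_motives + perceived_barriers",
]

prev_r2 = 0.0
for rhs in blocks:
    fit = smf.ols(f"intention ~ {rhs}", data=df).fit()
    print(f"{rhs}: R2 = {fit.rsquared:.3f} (change = {fit.rsquared - prev_r2:.3f})")
    prev_r2 = fit.rsquared
```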
Exposure assessment of process-related contaminants in food by biomarker monitoring
Exposure assessment is a fundamental part of the risk assessment paradigm, but can often present a number of challenges and uncertainties. This is especially the case for process contaminants formed during the processing (e.g. heating) of food, since they are in part highly reactive and/or volatile, thus making exposure assessment based on analysing their content in food unreliable. New approaches are therefore required to accurately assess consumer exposure and thus better inform the risk assessment. Such novel approaches may include the use of biomarkers, physiologically based kinetic (PBK) modelling-facilitated reverse dosimetry, and/or duplicate diet studies. This review focuses on the state of the art with respect to the use of biomarkers of exposure for the process contaminants acrylamide, 3-MCPD esters, glycidyl esters, furan and acrolein. From the overview presented, it becomes clear that the field of assessing human exposure to process-related contaminants in food by biomarker monitoring is promising and strongly developing. The current state of the art as well as the existing data gaps and challenges for the future were defined. They include (1) using PBK modelling and duplicate diet studies to establish, preferably in humans, correlations between external exposure and biomarkers; (2) elucidation of the possible endogenous formation of the process-related contaminants and the resulting biomarker levels; (3) the influence of inter-individual variations and how to include that in the biomarker-based exposure predictions; (4) the correction for confounding factors; (5) the value of the different biomarkers in relation to exposure scenarios and risk assessment, and (6) the possibilities of novel methodologies. In spite of these challenges, it can be concluded that biomarker-based exposure assessment provides a unique opportunity to more accurately assess consumer exposure to process-related contaminants in food and thus to better inform risk assessment.
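As a deliberately simplified illustration of the reverse-dosimetry idea (not a PBK model), the sketch below back-calculates a daily external dose from a urinary biomarker level, assuming a fixed molar excretion fraction. All parameter values are assumptions made for the example.

```python
# Toy reverse dosimetry: back-calculate daily intake of a parent compound
# from a urinary metabolite, assuming a fixed molar excretion fraction.
# All numbers below are illustrative assumptions, not study data.

def external_dose_ug_per_kg(metabolite_ug_per_day, excreted_fraction,
                            body_weight_kg, mw_parent, mw_metabolite):
    """Convert excreted metabolite mass back to parent-compound intake."""
    parent_ug = (metabolite_ug_per_day / excreted_fraction) * (mw_parent / mw_metabolite)
    return parent_ug / body_weight_kg

dose = external_dose_ug_per_kg(
    metabolite_ug_per_day=30.0,   # hypothetical 24-h urinary metabolite amount
    excreted_fraction=0.5,        # assumed fraction of dose excreted as this metabolite
    body_weight_kg=70.0,
    mw_parent=71.1,               # acrylamide, g/mol
    mw_metabolite=234.3,          # assumed metabolite molecular weight, g/mol
)
print(f"Estimated intake: {dose:.2f} ug/kg bw per day")
```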
Antimicrobial biocompatible bioscaffolds for orthopaedic implants
Nearly 1.5 million patients in the USA suffer from ailments requiring bone grafts and hip and other joint replacements. Infections following internal fixation in orthopaedic trauma can cause osteomyelitis in 22-66% of cases and, if uncontrolled, the mortality rate can be as high as 2%. We characterize a procedure for the synthesis of antimicrobial and biocompatible poly-l-lactic acid (PLLA) and poly-ethyleneglycol (PEG) bioscaffolds designed to degrade and absorb at a controlled rate. The bioscaffold architecture aims to provide a suitable substrate for the controlled release of silver nanoparticles (SNPs) to reduce bacterial growth and to aid the proliferation of human adipose-derived stem cells (hASCs) for tissue-engineering applications. The fabricated bioscaffolds were characterized by scanning electron microscopy (SEM), which showed that adding increasing concentrations of SNPs results in the formation of dendritic porous channels perpendicular to the axis of precipitation. The antimicrobial properties of these porous bioscaffolds were tested according to a modified ISO 22196 standard across varying concentrations of biomass-mediated SNPs to determine an efficacious antimicrobial concentration. The bioscaffolds reduced Staphylococcus aureus and Escherichia coli viable colony-forming units by 98.85% and 99.9%, respectively, at an antimicrobial SNP concentration of 2000 ppm. Human ASCs were seeded on bioscaffolds and cultured in vitro for 20 days to study the effect of SNP concentration on cell viability. SEM analysis and the metabolic activity-based fluorescent dye AlamarBlue® demonstrated the growth of cells on the efficacious antimicrobial bioscaffolds. Silver leached in vitro, quantified by inductively coupled plasma optical emission spectroscopy (ICP-OES), proved non-cytotoxic when tested against hASCs, as evaluated by MTT assay.
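For context on the reported reductions in viable counts, here is a short sketch of how percent and log10 reductions are computed from control and treated colony counts, in the spirit of an ISO 22196-style comparison. The CFU values are invented for illustration.

```python
# Illustrative reduction calculation from viable counts (CFU/ml).
# The counts are made up and do not reproduce the study's data.
import math

def reduction(control_cfu, treated_cfu):
    """Return (percent reduction, log10 reduction) of treated vs. control."""
    pct = 100.0 * (control_cfu - treated_cfu) / control_cfu
    log10_red = math.log10(control_cfu / treated_cfu)
    return pct, log10_red

pct, logr = reduction(control_cfu=2.0e6, treated_cfu=2.3e4)
print(f"Percent reduction: {pct:.2f}%  (log10 reduction: {logr:.2f})")
```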
Does Quitting Smoking Make a Difference Among Newly Diagnosed Head and Neck Cancer Patients?
INTRODUCTION: To determine if smoking after a cancer diagnosis makes a difference in mortality among newly diagnosed head and neck cancer patients.
METHODS: Longitudinal data were collected from newly diagnosed head and neck cancer patients with a median follow-up time of 1627 days (N = 590). Mortality was censored at 8 years or September 1, 2011, whichever came first. Based on smoking status, all patients were categorized into four groups: continuing smokers, quitters, former smokers, and never-smokers. A broad range of covariates was included in the analyses. Kaplan-Meier curves and bivariate and multivariate Cox proportional hazards models were constructed.
RESULTS: Eight-year overall mortality and cancer-specific mortality were 40.5% (239/590) and 25.4% (150/590), respectively. Smoking status after a cancer diagnosis predicted overall mortality and cancer-specific mortality. Compared to never-smokers, continuing smokers had the highest hazard ratio (HR) of dying from all causes (HR = 2.71, 95% confidence interval [CI] = 1.48-4.98). Those who smoked at diagnosis but quit and did not relapse (quitters) had a lower hazard ratio of dying (HR = 2.38, 95% CI = 1.29-4.36), and those who were former smokers at diagnosis with no relapse after diagnosis (former smokers) had the lowest hazard ratio of dying from all causes (HR = 1.68, 95% CI = 1.12-2.56). Similarly, quitters had a higher hazard ratio of dying from cancer-specific causes (HR = 2.38, 95% CI = 1.13-5.01) than never-smokers, similar to that of continuing smokers (HR = 2.07, 95% CI = 0.96-4.47), followed by former smokers (HR = 1.70, 95% CI = 1.00-2.89).
CONCLUSIONS: Compared to never-smokers, continuing smokers have the highest HR of overall mortality, followed by quitters and former smokers, which indicates that smoking cessation, even after a cancer diagnosis, may reduce overall mortality among newly diagnosed head and neck cancer patients. Health care providers should consider incorporating smoking cessation interventions into standard cancer treatment to improve survival among this population.
IMPLICATIONS: Using prospective observational longitudinal data from 590 head and neck cancer patients, this study showed that continuing smokers have the highest overall mortality relative to never-smokers, which indicates that smoking cessation, even after a cancer diagnosis, may have beneficial effects on long-term overall mortality. Health care providers should consider incorporating smoking cessation interventions into standard cancer treatment to improve survival among this population.
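A minimal sketch of the kind of survival analysis described in the Methods, using the lifelines package: smoking status entered as a categorical predictor in a Cox proportional hazards model with never-smokers as the reference. The data file, column names, and the small covariate set are hypothetical; the actual analysis adjusted for a much broader range of covariates.

```python
# Hypothetical Cox proportional hazards sketch: smoking status vs. overall
# mortality. Data file and column names are illustrative assumptions.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("hnc_smoking_cohort.csv")

# Dummy-code smoking status; drop "never" so never-smokers are the reference.
df = pd.get_dummies(df, columns=["smoking_status"])
df = df.drop(columns=["smoking_status_never"])

cph = CoxPHFitter()
cph.fit(
    df[["followup_days", "died", "age",
        "smoking_status_continuing", "smoking_status_quitter",
        "smoking_status_former"]],
    duration_col="followup_days",
    event_col="died",
)
cph.print_summary()  # hazard ratios relative to never-smokers
```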
Socioeconomic and Other Demographic Disparities Predicting Survival among Head and Neck Cancer Patients
BACKGROUND: The Institute of Medicine (IOM) report, Unequal Treatment, which defines disparities as racially based, indicates that disparities in cancer diagnosis and treatment are less clear. While a number of studies have acknowledged cancer disparities, they have limitations, including their retrospective nature, small sample sizes, inability to control for covariates, and measurement errors.
OBJECTIVE: The purpose of this study was to examine disparities as predictors of survival among newly diagnosed head and neck cancer patients recruited from 3 hospitals in Michigan, USA, while controlling for a number of covariates (health behaviors, medical comorbidities, and treatment modality).
METHODS: Longitudinal data were collected from newly diagnosed head and neck cancer patients (N = 634). The independent variables were median household income, education, race, age, sex, and marital status. The outcome variables were overall, cancer-specific, and disease-free survival censored at 5 years. Kaplan-Meier curves and univariate and multivariate Cox proportional hazards models were performed to examine demographic disparities in relation to survival.
RESULTS: Five-year overall, cancer-specific, and disease-free survival were 65.4% (407/622), 76.4% (487/622), and 67.0% (427/622), respectively. Lower income (HR, 1.5; 95% CI, 1.1-2.0 for overall survival; HR, 1.4; 95% CI, 1.0-1.9 for cancer-specific survival), high school education or less (HR, 1.4; 95% CI, 1.1-1.9 for overall survival; HR, 1.4; 95% CI, 1.1-1.9 for cancer-specific survival), and older age in decades (HR, 1.4; 95% CI, 1.2-1.7 for overall survival; HR, 1.2; 95% CI, 1.1-1.4 for cancer-specific survival) decreased both overall and disease-free survival rates. A high school education or less (HR, 1.4; 95% CI, 1.0-2.1) and advanced age (HR, 1.3; 95% CI, 1.1-1.6) were significant independent predictors of poor cancer-specific survival.
CONCLUSION: Low income, low education, and advanced age predicted poor survival while controlling for a number of covariates (health behaviors, medical comorbidities, and treatment modality). Recommendations from the Institute of Medicine's report to reduce disparities need to be implemented in treating head and neck cancer patients.
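To illustrate the Kaplan-Meier portion of an analysis like the one described above, the sketch below compares overall survival between lower- and higher-income groups and applies a log-rank test. The data file, column names, and income cut-point are hypothetical.

```python
# Hypothetical Kaplan-Meier comparison of overall survival by income group.
# Data file, column names, and the income threshold are assumptions.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("hnc_disparities_cohort.csv")
low = df["household_income"] < 30000  # hypothetical cut-point

kmf = KaplanMeierFitter()
for label, grp in [("lower income", df[low]), ("higher income", df[~low])]:
    kmf.fit(grp["followup_years"], event_observed=grp["died"], label=label)
    surv_5yr = float(kmf.survival_function_at_times(5.0).iloc[0])
    print(f"{label}: 5-year overall survival = {surv_5yr:.3f}")

result = logrank_test(
    df[low]["followup_years"], df[~low]["followup_years"],
    event_observed_A=df[low]["died"], event_observed_B=df[~low]["died"],
)
print(f"log-rank p-value: {result.p_value:.4f}")
```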
- …