Evaluation of X-ray fluorescence (XRF) lead detection method for candy
The harmful effects of childhood exposure to environmental lead continue to be a major health concern. Due to the significance of this hazard, a Healthy People 2010 objective was set to reduce all young children's blood lead levels to less than 10 micrograms per deciliter. Identification and removal of lead-contaminated candies is an integral part of the primary prevention of lead poisoning in children.
This research examined the efficacy of a protocol to use a portable XRF to screen candies for lead contamination. Method Detection Limits (MDLs) of 5.45 ppm and 7.05 ppm were determined in the Bulk Sample and Plastics Modes, respectively, using 45 fortified analytical samples with a candy matrix. Results also indicated that the XRF-determined concentrations were significantly different from the actual concentrations, as determined via Graphite Furnace Atomic Absorption Spectrometry (GFAAS). Regression analysis established predictive relationships between XRF data and the actual concentration of lead in candy.
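As context for how MDLs of this kind are typically derived, the EPA-style procedure multiplies the one-tailed 99% Student's t value by the standard deviation of replicate fortified measurements. A minimal sketch, using hypothetical XRF readings rather than the study's data:

```python
import statistics

def method_detection_limit(replicates, t_value):
    """EPA-style MDL: one-tailed 99% Student's t (for n-1 degrees of
    freedom) multiplied by the standard deviation of replicate
    measurements of a fortified sample."""
    return t_value * statistics.stdev(replicates)

# Hypothetical XRF lead readings (ppm) from seven fortified candy samples
readings = [5.1, 4.8, 5.4, 5.0, 4.6, 5.3, 4.9]
t_99 = 3.143  # one-tailed t(0.99) for 6 degrees of freedom
print(round(method_detection_limit(readings, t_99), 2))
```

The study used 45 fortified samples, so the appropriate t value and pooling strategy would differ; this sketch only illustrates the shape of the calculation.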
The XRF's current ability to screen candies may be limited to wrappers and highly contaminated samples, as candies typically contain lead at concentrations below the MDL. Future research should be done to improve the sensitivity of the XRF, in conjunction with collaborative efforts to gather and disseminate information on the dangers of lead-contaminated candies.
A Randomized Controlled Trial Comparing Contrast Baths to Cryotherapy in Patients with a Wrist Fracture
Cryotherapy and contrast baths are common modalities used to treat edema. Although many hand therapists report using contrast baths with their patients, there is still little evidence of their effectiveness. A randomized controlled trial with a repeated-measures design was employed to compare the effectiveness of contrast baths to cryotherapy. Participants were recruited after a period of post-fracture immobilization. Blinded assessors obtained measurements of range of motion, pain, and edema in the 1st, 2nd, and 4th weeks. Participants were randomly assigned to use either cryotherapy or contrast baths at home during the 4-week study period. The participants were asked to keep a home program log recording the number of times they were able to complete their assigned modality. At the final measurement appointment, participants also completed a survey indicating their satisfaction with their assigned home program. The results from the ANOVA indicated statistical significance for all measurements (p < .05) except palm circumference and volumeter. The home program logs indicated that adherence for the contrast baths group had a mean of 2.03 (SD = 0.76) times per day and the cryotherapy group had a mean of 1.75 (SD = 0.98) times per day. The researchers also noted a trend that the contrast baths group may have a larger effect in the first two weeks of treatment, while the cryotherapy group may show more gradual improvement throughout the four weeks. Overall, the two groups had similar home program adherence, and participants reported similar experiences.
Best timing for measuring orthostatic vital signs?
Review of: Juraschek SP, Daya N, Rawlings AM, et al. Association of history of dizziness and long-term adverse outcomes with early vs later orthostatic hypotension assessment times in middle-aged adults. JAMA Intern Med. 2017;177:1316-1323. We typically take a blood pressure within 3 minutes of a patient rising from a supine to a standing position. But is that too long? PRACTICE CHANGER: Measure orthostatic vital signs within 1 minute of standing to most accurately correlate dizziness with long-term adverse outcomes. STRENGTH OF RECOMMENDATION: B: Based on a single, high-quality, prospective cohort study with patient-oriented outcomes and good follow-up. Deborah Phipps, MD; Erik Butler, DO; Anne Mounsey, MD; Michael M. Dickman, DO; David Bury, DO; Ashley Smith, MD, MBA; Nick Bennett, DO, MBA; Ben Arthur, MD, MBA; Bob Marshall, MD, MPH, MISM; University of North Carolina at Chapel Hill (Drs. Phipps, Butler, and Mounsey); Madigan Army Medical Center, Gig Harbor, Wash (Drs. Dickman, Bury, Smith, Bennett, Arthur, and Marshall).
Outcomes of a randomized controlled trial assessing a smartphone Application to reduce unmet needs among people diagnosed with CancEr (ACE)
© 2019 The Authors. Cancer Medicine published by John Wiley & Sons Ltd. Background: Smartphone technology represents an opportunity to deliver practical solutions for people affected by cancer at a scale that was previously unimaginable, such as information, appointment monitoring, and improved access to cancer support services. This study aimed to determine whether a smartphone application (app) reduced unmet needs among people newly diagnosed with cancer. Methods: A single-blind, multisite randomized controlled trial was conducted to determine the impact of an app-based, 4-month intervention. Newly diagnosed cancer patients were approached at three health service treatment clinics. Results: Eighty-two people were randomized (intervention, n = 43; control, n = 39); average age was 59.5 years (SD: 12.9); 71% were female; 67% were married or in a de facto relationship. At baseline, there were no differences in participants' characteristics between the groups. No significant effects in reducing unmet needs were demonstrated at the end of the intervention (4 months) or at 12-month follow-up. Overall, 94% used the app in weeks 1-4, which decreased to 41% in weeks 13-16. Mean app use time per participant: Cancer Information, 6.9 (SD: 18.9) minutes; Appointment Schedule, 5.1 (SD: 9.6) minutes; Cancer Services, 1.5 (SD: 6.8) minutes; Hospital Navigation, 1.4 (SD: 2.8) minutes. Conclusions: Despite consumer involvement in the design of this smartphone technology, the app did not reduce unmet needs. This may have been due to the study being underpowered. To contribute to a meaningful understanding and improved implementation of smartphone technology to support people affected by cancer, practical considerations, such as recruitment issues and access to, and confidence with, apps, need to be considered. Australian New Zealand Clinical Trials Registration (ACTRN) Trial Registration: 12616001251415; WEF 7/9/2016
Risk Factors for SARS-CoV-2 Infection Among US Healthcare Personnel, May-December 2020
To determine risk factors for coronavirus disease (COVID-19) among US healthcare personnel (HCP), we conducted a case-control analysis. We collected data about activities outside the workplace and COVID-19 patient care activities from HCP with positive severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) test results (cases) and from HCP with negative test results (controls) in healthcare facilities in 5 US states. We used conditional logistic regression to calculate adjusted matched odds ratios and 95% CIs for exposures. Among 345 cases and 622 controls, factors associated with risk were having close contact with persons with COVID-19 outside the workplace, having close contact with COVID-19 patients in the workplace, and assisting COVID-19 patients with activities of daily living. Protecting HCP from COVID-19 may require interventions that reduce their exposures outside the workplace and improve their ability to more safely assist COVID-19 patients with activities of daily living.
Adding 6 months of androgen deprivation therapy to postoperative radiotherapy for prostate cancer: a comparison of short-course versus no androgen deprivation therapy in the RADICALS-HD randomised controlled trial
Background
Previous evidence indicates that adjuvant, short-course androgen deprivation therapy (ADT) improves metastasis-free survival when given with primary radiotherapy for intermediate-risk and high-risk localised prostate cancer. However, the value of ADT with postoperative radiotherapy after radical prostatectomy is unclear.
Methods
RADICALS-HD was an international randomised controlled trial to test the efficacy of ADT used in combination with postoperative radiotherapy for prostate cancer. Key eligibility criteria were indication for radiotherapy after radical prostatectomy for prostate cancer, prostate-specific antigen less than 5 ng/mL, absence of metastatic disease, and written consent. Participants were randomly assigned (1:1) to radiotherapy alone (no ADT) or radiotherapy with 6 months of ADT (short-course ADT), using monthly subcutaneous gonadotropin-releasing hormone analogue injections, daily oral bicalutamide monotherapy 150 mg, or monthly subcutaneous degarelix. Randomisation was done centrally through minimisation with a random element, stratified by Gleason score, positive margins, radiotherapy timing, planned radiotherapy schedule, and planned type of ADT, in a computerised system. The allocated treatment was not masked. The primary outcome measure was metastasis-free survival, defined as distant metastasis arising from prostate cancer or death from any cause. Standard survival analysis methods were used, accounting for randomisation stratification factors. The trial had 80% power with two-sided α of 5% to detect an absolute increase in 10-year metastasis-free survival from 80% to 86% (hazard ratio [HR] 0·67). Analyses followed the intention-to-treat principle. The trial is registered with the ISRCTN registry, ISRCTN40814031, and ClinicalTrials.gov, NCT00541047.
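The power statement above (80% power, two-sided α of 5%, target HR 0.67) maps onto the standard Schoenfeld approximation for the number of events a log-rank comparison needs. A minimal sketch, assuming 1:1 allocation; this is an illustration of the general formula, not the trial's actual design calculation:

```python
import math

def schoenfeld_events(hr, z_alpha=1.959964, z_beta=0.841621):
    """Approximate events required to detect hazard ratio `hr` with a
    two-sided log-rank test and equal allocation (Schoenfeld, 1983):
    d = 4 * (z_alpha + z_beta)^2 / (ln hr)^2."""
    return math.ceil(4 * (z_alpha + z_beta) ** 2 / math.log(hr) ** 2)

# Defaults correspond to two-sided alpha = 0.05 and 80% power
print(schoenfeld_events(0.67))  # roughly 196 events
```

The trial observed 268 metastasis-free survival events, comfortably above this back-of-the-envelope requirement for the targeted effect size.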
Findings
Between Nov 22, 2007, and June 29, 2015, 1480 patients (median age 66 years [IQR 61–69]) were randomly assigned to receive no ADT (n=737) or short-course ADT (n=743) in addition to postoperative radiotherapy at 121 centres in Canada, Denmark, Ireland, and the UK. With a median follow-up of 9·0 years (IQR 7·1–10·1), metastasis-free survival events were reported for 268 participants (142 in the no ADT group and 126 in the short-course ADT group; HR 0·886 [95% CI 0·688–1·140], p=0·35). 10-year metastasis-free survival was 79·2% (95% CI 75·4–82·5) in the no ADT group and 80·4% (76·6–83·6) in the short-course ADT group. Toxicity of grade 3 or higher was reported for 121 (17%) of 737 participants in the no ADT group and 100 (14%) of 743 in the short-course ADT group (p=0·15), with no treatment-related deaths.
Interpretation
Metastatic disease is uncommon following postoperative prostate-bed radiotherapy after radical prostatectomy. Adding 6 months of ADT to this radiotherapy did not improve metastasis-free survival compared with no ADT. These findings do not support the use of short-course ADT with postoperative radiotherapy in this patient population.
Duration of androgen deprivation therapy with postoperative radiotherapy for prostate cancer: a comparison of long-course versus short-course androgen deprivation therapy in the RADICALS-HD randomised trial
Background
Previous evidence supports androgen deprivation therapy (ADT) with primary radiotherapy as initial treatment for intermediate-risk and high-risk localised prostate cancer. However, the use and optimal duration of ADT with postoperative radiotherapy after radical prostatectomy remains uncertain.
Methods
RADICALS-HD was a randomised controlled trial of ADT duration within the RADICALS protocol. Here, we report on the comparison of short-course versus long-course ADT. Key eligibility criteria were indication for radiotherapy after previous radical prostatectomy for prostate cancer, prostate-specific antigen less than 5 ng/mL, absence of metastatic disease, and written consent. Participants were randomly assigned (1:1) to add 6 months of ADT (short-course ADT) or 24 months of ADT (long-course ADT) to radiotherapy, using subcutaneous gonadotrophin-releasing hormone analogue (monthly in the short-course ADT group and 3-monthly in the long-course ADT group), daily oral bicalutamide monotherapy 150 mg, or monthly subcutaneous degarelix. Randomisation was done centrally through minimisation with a random element, stratified by Gleason score, positive margins, radiotherapy timing, planned radiotherapy schedule, and planned type of ADT, in a computerised system. The allocated treatment was not masked. The primary outcome measure was metastasis-free survival, defined as metastasis arising from prostate cancer or death from any cause. The comparison had more than 80% power with two-sided α of 5% to detect an absolute increase in 10-year metastasis-free survival from 75% to 81% (hazard ratio [HR] 0·72). Standard time-to-event analyses were used. Analyses followed the intention-to-treat principle. The trial is registered with the ISRCTN registry, ISRCTN40814031, and ClinicalTrials.gov, NCT00541047.
Findings
Between Jan 30, 2008, and July 7, 2015, 1523 patients (median age 65 years, IQR 60–69) were randomly assigned to receive short-course ADT (n=761) or long-course ADT (n=762) in addition to postoperative radiotherapy at 138 centres in Canada, Denmark, Ireland, and the UK. With a median follow-up of 8·9 years (7·0–10·0), 313 metastasis-free survival events were reported overall (174 in the short-course ADT group and 139 in the long-course ADT group; HR 0·773 [95% CI 0·612–0·975]; p=0·029). 10-year metastasis-free survival was 71·9% (95% CI 67·6–75·7) in the short-course ADT group and 78·1% (74·2–81·5) in the long-course ADT group. Toxicity of grade 3 or higher was reported for 105 (14%) of 753 participants in the short-course ADT group and 142 (19%) of 757 participants in the long-course ADT group (p=0·025), with no treatment-related deaths.
Interpretation
Compared with adding 6 months of ADT, adding 24 months of ADT improved metastasis-free survival in people receiving postoperative radiotherapy. For individuals who can accept the additional duration of adverse effects, long-course ADT should be offered with postoperative radiotherapy.
Funding
Cancer Research UK, UK Research and Innovation (formerly Medical Research Council), and Canadian Cancer Society.
Colostrum management practices and effects on colostrum quality on commercial northern Victorian dairy farms
© 2016 Ashley James Phipps
The majority of Australian dairy enterprises are self-replacing herds that aim to produce and rear healthy heifer calves that will thrive and go on to achieve their full genetic potential in the dairy system. One of the major factors affecting the ability to achieve this goal is the immune status of neonatal calves.
Neonatal calves are born hypo-gammaglobulinemic or agammaglobulinemic which requires them to consume and absorb adequate colostral immunoglobulin (Ig) mass during the first 24 hours of life. It is common practice for producers to supplement their replacement heifer calves with colostrum during this period, rather than solely relying on the calves to suckle from the dam.
One of the major challenges for dairy producers is to produce, harvest, store, and feed high-quality colostrum to their replacement heifer calves. Many studies have identified a number of factors that can influence the quality of colostrum in terms of Ig content and bacterial contamination. Despite this, there is limited Australian data on colostrum management, hygiene, and quality.
The objectives of the research project were to: 1) survey northern-Victorian dairy farms about individual farm details, colostrum management, and general calf-rearing practices; 2) assess the impact of various risk factors on colostrum quality and the percentage of colostrum samples that meet industry standards for Ig content, as measured by a Brix refractometer, on northern-Victorian dairy farms; 3) evaluate the range and variation of colostrum quality on northern-Victorian dairy farms in terms of bacterial contamination (total plate count, TPC, and total coliform count, TCC) and Ig concentration (estimated by Brix refractometer reading); 4) compare colostrum quality on northern-Victorian dairy farms with industry standards for bacterial contamination and Ig concentration; and 5) develop guidelines for dairy farmers on how to produce, manage, and feed high-quality colostrum to replacement heifers.
The research project was divided into three studies, outlined below:
The first study, titled “A survey of colostrum management and dairy calf rearing practices on northern-Victorian dairy herds” reports the results of a questionnaire developed to obtain information about each individual farm, colostrum management practices, and general calf rearing practices carried out on farm. The questionnaire was sent to all commercial dairy farming clients of Rochester Veterinary Practice, a large mixed-practice in northern Victoria, between June 2013 and September 2013.
A total of 58 of 150 (39%) questionnaires were returned. Many of the dairy producers in this survey were not meeting Australian dairy industry recommendations in terms of colostrum management and calf husbandry. The key areas for improvement included 1) delayed removal of the calf from the dam on 89.7% of herds, 2) relying on suckling of the dam to gain sufficient colostrum on 25.7% of herds, 3) failure to routinely supplement calves with colostrum on 25.7% of herds, 4) pooling of colostrum from different dams on 10.3% of herds, 5) failing to assess colostrum quality, or relying upon visual assessment only, on 5.2% and 65.5% of herds, respectively, 6) storage of colostrum for extended periods of time at ambient temperatures (average of 6.14 days), 7) dehorning/disbudding dairy calves later than the recommended age on a number of herds (average 6.31 weeks, maximum 20 weeks), 8) weaning dairy calves based on age alone on 34.0% of herds, 9) failing to routinely isolate sick calves in the calf-rearing facilities on 60.4% of farms, and 10) failing to allow the calf access to fresh water from birth (average age of access: 4.7 days).
The second study, titled “Factors associated with colostrum quality and volume on four northern-Victorian dairy herds”, was carried out on four dairy farms located near Rochester, Victoria. Four hundred and forty-two fresh colostrum samples were collected from dairy cows of varying breed and parity that calved between June 2013 and November 2013. Colostrum quality risk factors were recorded and colostral IgG concentration was estimated using a commercially available Brix refractometer. High-quality colostrum was considered to have a Brix refractometer reading of ≥22%. Factors associated with colostrum Brix refractometer reading were determined using multivariable logistic regression.
Thirty-nine percent of the colostrum samples were classified as high-quality (Brix refractometer reading ≥22%). The strongest predictor of colostrum quality was time between calving and colostrum harvesting. Cows milked within 12 hours of calving were six times more likely to produce high-quality colostrum compared with cows milked ≥12 hours after calving (OR=6.30, CI=3.53-11.22, p<0.001). Cows in fourth lactation or greater were approximately twice as likely (OR=1.86, CI=0.95-3.62, p=0.001) to produce high-quality colostrum compared with first-lactation cows. The odds of high-quality colostrum were nearly four times greater (OR=3.72, CI=2.11-6.53, p<0.001) if the calf was not allowed to suckle from the dam prior to colostrum harvesting. Cows that had not leaked colostrum prior to calving were more than three times more likely to produce high-quality colostrum (OR=3.17, CI=1.34-7.41, p=0.008).
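Odds ratios of the kind reported above come from 2x2 exposure-outcome tables (the study's adjusted estimates used multivariable logistic regression, which this sketch does not reproduce). A minimal unadjusted calculation with Woolf confidence limits, using hypothetical counts rather than the study's data:

```python
import math

def odds_ratio(a, b, c, d):
    """Unadjusted OR from a 2x2 table:
    a = exposed & high-quality,   b = exposed & low-quality,
    c = unexposed & high-quality, d = unexposed & low-quality."""
    return (a * d) / (b * c)

def or_ci95(a, b, c, d):
    """Woolf 95% CI: +/- 1.96 standard errors on the log-odds scale."""
    est = odds_ratio(a, b, c, d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return est, math.exp(math.log(est) - 1.96 * se), math.exp(math.log(est) + 1.96 * se)

# Hypothetical counts: cows milked within 12 h vs later
print(odds_ratio(20, 10, 5, 10))  # (20*10)/(10*5) = 4.0
```

An OR whose confidence interval excludes 1 is conventionally read as statistically significant at the 5% level, which is why intervals like 3.53-11.22 above support the association.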
The third study titled, “Survey of bovine colostrum quality and hygiene on northern Victorian dairy herds”, was carried out on 24 dairy farms located near Rochester, Victoria. Two hundred and forty colostrum samples were collected, between July 2014 and August 2014. A short questionnaire on general practices was completed with the farm owner or employees responsible for feeding replacement heifer calves colostrum on the initial visit. Each farm harvested and stored first milking colostrum under routine farm conditions. Colostrum samples were collected just prior to feeding to calves, a Brix refractometer reading was taken and the sample was then frozen. Information regarding individual samples was also collected. Samples were submitted to Gribbles Veterinary Pathology for bacteriological assessment.
Fifty-eight percent of colostrum samples met the recommended industry standard of a total plate count (TPC) of <100,000 cfu/mL, while 94% met the recommended industry standard of a total coliform count (TCC) of <10,000 cfu/mL. When TPC, TCC, and Brix refractometer reading were considered together, only 23% of samples met all three standards. This demonstrates that a large number of calves are at risk of receiving poor-quality colostrum and were potentially exposed to colostrum containing high bacterial loads, which is expected to interfere with the acquisition of passive transfer of immunity and negatively impact calf health. Further investigation is required to identify the farm-specific factors that may influence the level of bacterial contamination of colostrum. Recommendations arising from this study include refrigeration of excess colostrum shortly (within 1 hour) after collection and thorough disinfection of the calf-feeding apparatus (e.g. with chlorhexidine gluconate or bleach) before it is used to feed colostrum to calves.
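The combined screen described above (all three thresholds must pass) can be expressed as a simple conjunction; the function name and threshold values below restate the industry standards quoted in this study:

```python
def meets_standards(tpc_cfu_ml, tcc_cfu_ml, brix_pct):
    """True only if a colostrum sample meets all three industry
    thresholds used in this study: total plate count <100,000 cfu/mL,
    total coliform count <10,000 cfu/mL, Brix reading >=22%."""
    return (tpc_cfu_ml < 100_000
            and tcc_cfu_ml < 10_000
            and brix_pct >= 22.0)

# A sample can pass both bacterial standards yet still fail overall
# if its Brix reading indicates low Ig content:
print(meets_standards(50_000, 5_000, 18.0))  # False
```

Because each threshold filters out a different subset of samples, the pass rate of the conjunction (23% here) can be far lower than any single criterion's pass rate (58% and 94%).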
The following recommendations are made based on the present research and the most recent recommendations that have been identified in the literature:
Colostrum harvesting and management:
- Colostrum should be harvested from cows within 12 hours of calving, ideally as soon as possible after parturition
- Teats of freshly calved cows should be washed and towelled dry prior to harvesting colostrum
- Colostrum should be harvested into a sanitised collecting vessel
- Testing colostrum quality is recommended for all colostrum prior to feeding to replacement heifer calves
- Visual appraisal of colostrum quality and reliance upon other cow factors (e.g. age) is unreliable
- On-farm testing of colostrum can be easily undertaken using a Brix refractometer
- Fresh colostrum that is considered high quality (Brix refractometer reading of ≥ 22%) should be immediately fed to calves or stored appropriately
- Calf feeding apparatus should be disinfected before using them to feed colostrum to calves
- Colostrum that is not fed to calves shortly after collection should be cooled rapidly, within 1 hour of harvesting, under refrigeration in sanitised, covered 1-2 L plastic containers, or frozen in 2-4 L clean containers or zip-lock bags (lay bags flat in the freezer)
- Colostrum stored under refrigeration without a preservative such as potassium sorbate should be fed to calves within 1 to 2 days
- Colostrum stored under refrigeration with a preservative such as potassium sorbate may be kept for up to 4 days before feeding to calves
- Colostrum that has been frozen should be thawed in a water bath, with temperatures not exceeding 60˚C
- On farms that have pasteurisation units available, colostrum should be pasteurised within 2 hours of collection at 60˚C for 60 minutes, then either fed to calves within 2 hours or cooled and stored appropriately
Timing of feeding colostrum to calves
- Remove calves from the dam within 2 hours of birth, prior to the calf suckling, and supplement the calves with an adequate volume of colostrum (3-4 L), containing 100 to 200 g of immunoglobulins, within the first 6 hours of life
- …