Does Diet Self Efficacy and Stress Affect Body Composition in College Students?
Body composition is influenced by many variables, including stress and nutrition, which in turn is affected by a person's belief in his or her ability to manage a diet even in the face of obstacles (Nastaskin, 2015). PURPOSE: This study examined the influence of college students' dietary self-efficacy and responses to stress on body weight and body fat percentage from their freshman to senior year. METHODS: Fourteen participants (11W/3M, 18.1 ± 0.4 yrs, 165.3 ± 7.7 cm, 64.9 ± 14.2 kg at Year 1) underwent whole-body dual-energy x-ray absorptiometry (DEXA, Hologic W). They also completed 2 questionnaires: 1) the Diet Self-Efficacy scale (Knäuper, 2013), which assesses self-efficacy in the face of three factors that could negatively impact diet: high caloric food temptation (HCF), social/internal factors (SIF), and negative emotional events (NEE) (0-100 range for each score), and 2) the Vanderbilt Responses to Stress - Peer Stress College, a 57-question survey measuring coping and involuntary stress responses to specific situations (0-50 range). All assessments were completed annually from the students' freshman to senior year. Data were analyzed using Pearson correlation. RESULTS: Overall, participants gained 5.1 ± 5.7 kg (6.6 ± 8.1%) of body weight and 0.5 ± 4.0% of body fat over the 4 years. At Year 1, diet self-efficacy scores were moderate (HCF 47.8 ± 22.1, SIF 56.7 ± 21.7, NEE 64.9 ± 22.0). Over the four years, there was a strong negative correlation between NEE and body weight in 3 participants (r = -0.98, r = -0.96, r = -0.86), indicating that these participants had a lower body weight when they were better able to resist eating temptations during negative emotional events. Also, SIF trended toward a significant inverse relationship with body fat percentage (p = 0.07). Stress scores were inversely related to body fat percentage in the majority of participants, with the strongest correlation at r = -0.96. CONCLUSION: Nutritional self-efficacy could influence weight changes in college students. However, any influence is highly individualized. Given the limited number of participants in our study, it is too early to make generalized statements.
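As a minimal sketch of the Pearson correlation analysis described above (the four annual values are made-up placeholders, not the study's data), the r between one participant's NEE scores and body weight across the four years could be computed as follows:

```python
# Minimal sketch of the Pearson correlation analysis described above.
# The four annual values are made-up placeholders, not study data.
from scipy.stats import pearsonr

nee_scores = [65, 58, 52, 70]               # NEE diet self-efficacy score, Years 1-4
body_weight_kg = [64.0, 66.5, 69.0, 63.5]   # body weight at the same time points

r, p = pearsonr(nee_scores, body_weight_kg)
print(f"r = {r:.2f}, p = {p:.3f}")  # a strongly negative r mirrors the reported pattern
```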
Fitness Correlates to Firefighter Job Tasks
Firefighters focus on rescuing and responding to emergencies and fires (Antolini, 2015). The demands of firefighting include both aerobic and anaerobic fitness, along with muscular strength, endurance, explosive power, and reaction time (Xu, 2020). PURPOSE: The purpose of the study was to determine the relationship between fitness assessments and job task simulations in firefighter cadets. METHODS: Twenty-one firefighter academy students performed fitness assessments and job task simulations on different days. Fitness assessments included the vertical jump, lateral medicine ball throw, push-up, horizontal row, and 300-yard shuttle run. Job task simulations were conducted in a sequential format as a physical agility course consisting of equipment carry, stair climb, ladder carry and raise, bear crawl, kneeling hose drag, over-shoulder hose drag, tire strike, hose deploy, victim drag, and charged line. Pearson r correlation analyses were conducted to determine relationships between all fitness assessment variables and time to complete the job task simulations. RESULTS: Positive correlations were found between the 300-yard shuttle run time and stair climb (r = .495, p = .023), ladder carry and raise (r = .433, p = .050), bear crawl (r = .516, p = .017), over-shoulder hose drag (r = .486, p = .030), tire strike (r = .656, p = .002), hose deploy (r = .486, p = .030), and victim drag (r = .686, p < .001). Negative correlations existed between the vertical jump and stair climb (r = .511, p = .018), ladder carry and raise (r = .439, p = .047), kneeling hose drag (r = .560, p = .008), hose deploy (r = .458, p = .042), and charged line (r = .645, p = .002). Negative correlations were found between the lateral medicine ball throw (right) and equipment carry (r = .529, p = .014), stair climb (r = .481, p = .027), ladder carry and raise (r = .489, p = .025), kneeling hose drag (r = .498, p = .021), and charged line (r = .486, p = .030). For the left side of the lateral medicine ball throw, negative correlations existed with stair climb (r = .465, p = .034), ladder carry and raise (r = .445, p = .043), kneeling hose drag (r = .508, p = .019), and charged line (r = .471, p = .036). Negative correlations were found between the push-up and stair climb (r = .616, p = .003), ladder carry and raise (r = .608, p = .003), bear crawl (r = .571, p = .007), kneeling hose drag (r = .594, p = .005), over-shoulder hose drag (r = .629, p = .003), hose deploy (r = .539, p = .014), victim drag (r = .587, p = .006), and charged line (r = .511, p = .021). Finally, a negative correlation was evident between the horizontal row and over-shoulder hose drag (r = .487, p = .029). CONCLUSION: Job task simulation scores are highly associated with a number of fitness assessments. Firefighters and academy instructors should focus on improving fitness, especially power, agility, and muscular endurance, to improve performance on specific job tasks.
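A minimal sketch of the same type of analysis, pairing fitness test scores with a job-task completion time; the column names and values below are hypothetical placeholders, not the cadets' data:

```python
# Sketch of correlating fitness assessments with a job-task completion time.
# Column names and values are hypothetical placeholders, not the cadets' data.
import pandas as pd
from scipy.stats import pearsonr

df = pd.DataFrame({
    "shuttle_300yd_s":  [62.1, 58.4, 70.2, 65.0, 60.3],   # 300-yard shuttle run time (s)
    "vertical_jump_cm": [55.0, 61.0, 47.0, 52.0, 58.0],   # vertical jump height (cm)
    "stair_climb_s":    [34.2, 31.8, 40.1, 36.5, 33.0],   # stair climb time (s)
})

# Correlate each fitness assessment with the stair-climb time.
for fitness in ("shuttle_300yd_s", "vertical_jump_cm"):
    r, p = pearsonr(df[fitness], df["stair_climb_s"])
    print(f"{fitness} vs stair_climb_s: r = {r:.3f}, p = {p:.3f}")
```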
Colorectal Cancer Stage at Diagnosis Before vs During the COVID-19 Pandemic in Italy
IMPORTANCE Delays in screening programs and the reluctance of patients to seek medical attention because of the outbreak of SARS-CoV-2 could be associated with the risk of more advanced colorectal cancers at diagnosis.
OBJECTIVE To evaluate whether the SARS-CoV-2 pandemic was associated with more advanced oncologic stage and change in clinical presentation for patients with colorectal cancer.
DESIGN, SETTING, AND PARTICIPANTS This retrospective, multicenter cohort study included all 17 938 adult patients who underwent surgery for colorectal cancer from March 1, 2020, to December 31, 2021 (pandemic period), and from January 1, 2018, to February 29, 2020 (prepandemic period), in 81 participating centers in Italy, including tertiary centers and community hospitals. Follow-up was 30 days from surgery.
EXPOSURES Any type of surgical procedure for colorectal cancer, including explorative surgery, palliative procedures, and atypical or segmental resections.
MAIN OUTCOMES AND MEASURES The primary outcome was advanced stage of colorectal cancer at diagnosis. Secondary outcomes were distant metastasis, T4 stage, aggressive biology (defined as cancer with at least 1 of the following characteristics: signet ring cells, mucinous tumor, budding, lymphovascular invasion, perineural invasion, and lymphangitis), stenotic lesion, emergency surgery, and palliative surgery. The independent association between the pandemic period and the outcomes was assessed using multivariate random-effects logistic regression, with hospital as the cluster variable.
RESULTS A total of 17 938 patients (10 007 men [55.8%]; mean [SD] age, 70.6 [12.2] years) underwent surgery for colorectal cancer: 7796 (43.5%) during the pandemic period and 10 142 (56.5%) during the prepandemic period. Logistic regression indicated that the pandemic period was significantly associated with an increased rate of advanced-stage colorectal cancer (odds ratio [OR], 1.07; 95% CI, 1.01-1.13; P = .03), aggressive biology (OR, 1.32; 95% CI, 1.15-1.53; P < .001), and stenotic lesions (OR, 1.15; 95% CI, 1.01-1.31; P = .03).
CONCLUSIONS AND RELEVANCE This cohort study suggests a significant association between the SARS-CoV-2 pandemic and the risk of a more advanced oncologic stage at diagnosis among patients undergoing surgery for colorectal cancer and might indicate a potential reduction of survival for these patients.
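As a small worked check on the reported effect sizes, the log-odds coefficient and standard error implied by an odds ratio and its 95% CI can be recovered with basic arithmetic; the sketch below uses the advanced-stage result (OR 1.07; 95% CI, 1.01-1.13) from the abstract:

```python
# Recover the log-odds coefficient and its standard error from a reported
# odds ratio and 95% CI (advanced-stage colorectal cancer: OR 1.07, 95% CI 1.01-1.13).
import math

or_point, ci_low, ci_high = 1.07, 1.01, 1.13

beta = math.log(or_point)                                  # coefficient on the log-odds scale
se = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)   # SE implied by the Wald 95% CI
z = beta / se                                              # Wald z statistic

print(f"beta = {beta:.4f}, SE = {se:.4f}, z = {z:.2f}")
# The published OR and CI are rounded, so z (about 2.4) only approximately
# matches the reported P = .03.
```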
Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study
Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world.
Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231.
Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001).
Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication.
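The crude (unadjusted) incidence and odds ratios by HDI group can be reproduced from the counts reported above; the sketch below is plain arithmetic on the published figures, not the Bayesian multilevel model behind the adjusted estimate:

```python
# Crude SSI incidence and unadjusted odds ratios from the counts reported above.
# This is plain arithmetic on the published figures, not the Bayesian multilevel
# model behind the adjusted estimate (aOR 1.60 for low- vs high-HDI countries).
groups = {                      # (patients with SSI, total patients)
    "high HDI":   (691, 7339),
    "middle HDI": (549, 3918),
    "low HDI":    (298, 1282),
}

ssi_high, n_high = groups["high HDI"]
odds_high = ssi_high / (n_high - ssi_high)   # reference odds (high-HDI group)

for name, (ssi, n) in groups.items():
    incidence = 100 * ssi / n
    crude_or = (ssi / (n - ssi)) / odds_high
    print(f"{name}: SSI incidence {incidence:.1f}%, crude OR vs high HDI {crude_or:.2f}")
```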
A multi-element psychosocial intervention for early psychosis (GET UP PIANO TRIAL) conducted in a catchment area of 10 million inhabitants: study protocol for a pragmatic cluster randomized controlled trial
Multi-element interventions for first-episode psychosis (FEP) are promising, but have mostly been conducted in non-epidemiologically representative samples, thereby raising the risk of underestimating the complexities involved in treating FEP in 'real-world' services.
Exploring the Composition of Egyptian Faience
Egyptian faience, a revolutionary innovation in ancient ceramics, was used for crafting various objects, including amulets, vessels, ornaments, and funerary figurines such as shabtis. Despite extensive research, many aspects of ancient shabti production technology, chemistry, and mineralogy remain relatively understudied. The fragments examined here date from the 21st to the 22nd Dynasty and belong to a recovered 19th-century private collection; their origin is tentatively identified as the middle Nile valley, in the Luxor area. Our study focused on this modest yet compositionally interesting small collection of shabti fragments to provide information on the components of the glaze and of the shabti core. We found that the core is a quartz and K-feldspar silt blended with an organic component made of plastic resins and vegetable fibres soaked with natron. The shabti figurines, after being modelled, dried, and covered with coloured glaze, were subjected to a firing process. Sodium metasilicate and sulphate compounds formed where the glaze contacted the silica matrix, creating a shell that holds the fragile inner matrix together. The pigments dissolved in the sodic glaze glass, produced from quartz, K-feldspars, and natron frit, are mainly manganese (Mn) and copper (Cu) compounds. A Cu2O/CaO ratio greater than 5 produces a blue colour; below 5, the glaze is green. In some cases, Mg and As may have been added to produce a darker brown and an intense blue, respectively. Reaction minerals provided information on the high-temperature firing process that rapidly vitrified the glaze. These minerals index the firing temperature of a sodic glaze, which reached up to a maximum of 1050 °C.
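The colour rule reported above lends itself to a compact illustration; the helper below is a hypothetical sketch of the Cu2O/CaO threshold, not an analytical tool from the study, and the boundary case of a ratio exactly equal to 5 is not specified in the abstract:

```python
# Hypothetical sketch of the reported glaze-colour rule: a Cu2O/CaO ratio above 5
# gives a blue glaze, below 5 a green one. Inputs are oxide contents from analysis.
def glaze_colour(cu2o: float, cao: float) -> str:
    if cao <= 0:
        raise ValueError("CaO content must be positive to form the ratio")
    ratio = cu2o / cao
    # The abstract does not state how a ratio of exactly 5 should be classified.
    return "blue" if ratio > 5 else "green"

print(glaze_colour(cu2o=6.2, cao=1.0))  # ratio 6.2 -> blue
print(glaze_colour(cu2o=3.0, cao=1.5))  # ratio 2.0 -> green
```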
Feasibility and effectiveness of a disease and care management model in the primary health care system for patients with heart failure and diabetes (Project Leonardo)
Marco Matteo Ciccone1, Ambrogio Aquilino2, Francesca Cortese1, Pietro Scicchitano1, Marco Sassara1, Ernesto Mola3, Rodolfo Rollo4, Pasquale Caldarola5, Francesco Giorgino6, Vincenzo Pomo2, Francesco Bux2
1Section of Cardiovascular Disease, Department of Emergency and Organ Transplantation, School of Medicine, University of Bari, Bari, Italy; 2Agenzia Regionale Sanitaria – Regione Puglia (ARES), Apulia, Italy; 3ASL, Lecce, Italy; 4ASL, Brindisi, Italy; 5Cardiologia, Ospedale "Sarcone", Terlizzi, Italy; 6Section of Endocrinology, Department of Emergency and Organ Transplantation, School of Medicine, University of Bari, Bari, Italy
Purpose: Project Leonardo was a feasibility study to evaluate the impact of a disease and care management (D&CM) model and of the introduction of "care manager" nurses, trained in this specialized role, into the primary health care system.
Patients and methods: Thirty care managers were placed into the offices of 83 general practitioners and family physicians in the Apulia Region of Italy with the purpose of creating a strong cooperative and collaborative "team" consisting of physicians, care managers, specialists, and patients. The central aim of the health team collaboration was to empower 1,160 patients living with cardiovascular disease (CVD), diabetes, heart failure, and/or at risk of cardiovascular disease (CVD risk) to take a more active role in their health. With the support of dedicated software for data collection and care management decision making, Project Leonardo implemented guidelines and recommendations for each condition aimed at improving patient health outcomes and promoting appropriate resource utilization.
Results: Results show that Leonardo was feasible and highly effective in increasing patient health knowledge, self-management skills, and readiness to make changes in health behaviors. Patient skill-building and ongoing monitoring by the health care team of diagnostic tests, services, and treatment paths helped promote confidence and enhance the safety of chronic patient management at home.
Conclusion: Physicians, care managers, and patients showed unanimous agreement regarding the positive impact on patient health and self-management, and attributed the outcomes to the strong "partnership" between the care manager and the patient and the collaboration between the physician and the care manager. Future studies should consider the possibility of incorporating a patient empowerment model that considers the patient as the most important member of the health team and care managers as key health care collaborators able to enhance and support the services provided to patients by physicians in the primary health care system.
Keywords: partnerships, health team, patient empowerment, care coordination
Chaetomorpha linum in the bioremediation of aquaculture wastewater: Optimization of nutrient removal efficiency at the laboratory scale
Marine pollution from aquaculture wastewater is a widespread and increasing ecological problem. Algae, with their ability to remove surplus nutrients from wastewater, are a good tool for achieving more sustainable aquaculture. In this study, the capability of different biomasses of Chaetomorpha linum and Cladophora prolifera for the bioremediation of nutrient-rich (ammonium, nitrate and phosphate) seawater was compared. The results suggest that 10 g L−1 C. linum is an excellent candidate for aquaculture wastewater bioremediation. However, the bioremediation efficiency of C. linum was significantly affected by seasonality, with the greatest performance in nutrient removal exhibited by algae harvested in summer. C. linum harvested in winter and acclimated to lab conditions for two months significantly improved the removal efficiency of both ammonium and nitrate, while worsening that of phosphate. Irrespective of season and acclimation, the simultaneous presence of ammonium and nitrate in seawater strongly inhibited nitrate removal. Thus, we propose the use of a two-step system, tested at the laboratory scale, in which nutrient-enriched seawater can pass through two different algal ponds. C. linum was able to achieve almost complete removal of ammonium in 24 h in the first step, while the second step improved both nitrate and phosphate removal efficiency.
The two-step system is an effective innovation for the use of algae in bioremediation of aquaculture wastewaters.
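As a minimal sketch of the removal-efficiency bookkeeping behind the two-step scheme described above (all concentrations below are hypothetical placeholders, not the measured values):

```python
# Sketch of the removal-efficiency calculation behind the two-step scheme above.
# All concentrations are hypothetical placeholders (arbitrary units).
def removal_efficiency(c_in: float, c_out: float) -> float:
    """Percentage of a nutrient removed between inlet and outlet."""
    return 100 * (c_in - c_out) / c_in

# Step 1: ammonium is almost completely stripped; nitrate changes little.
nh4_in, nh4_after_step1 = 100.0, 2.0
no3_in, no3_after_step1 = 50.0, 45.0
# Step 2: with ammonium largely gone, nitrate removal improves.
no3_after_step2 = 12.0

print(f"NH4+ removal after step 1: {removal_efficiency(nh4_in, nh4_after_step1):.0f}%")
print(f"NO3- removal after step 1: {removal_efficiency(no3_in, no3_after_step1):.0f}%")
print(f"NO3- removal after step 2: {removal_efficiency(no3_in, no3_after_step2):.0f}%")
```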
MDCT in acute ischaemic left colitis: a pictorial essay
The pathogenesis of acute ischaemic colitis depends on two different forms of vascular colonic insult: occlusive injury and non-occlusive injury. Clinically, ischaemic colitis may be classified into two major forms: mild (non-gangrenous) and acute fulminant (gangrenous). The classic presentation is abdominal pain, diarrhoea and/or rectal bleeding, but this presentation is nonspecific and highly variable, so the diagnosis usually depends on clinical suspicion supported by serologic and colonoscopic findings. Imaging methods have a role in diagnosing ischaemic colitis. While plain radiography and ultrasound can orient the diagnosis, CT can define the morphofunctional alterations that discriminate non-occlusive from occlusive forms and, in most cases, estimate the timing of ischaemic damage. The purpose of this review is to define the role of CT in the early identification of pathological findings and in charting the evolution of colonic ischaemic lesions, in order to plan the correct therapeutic approach and guide the decision between medical and surgical treatment.