Trace metals in soils and several brassicaceae plant species from serpentine sites of Serbia
Serpentine soils from 16 sampling points in Serbia, together with the roots and shoots of eight Brassicaceae species (Aethionema saxatile, Alyssum montanum, Alyssum repens, Cardamine plumieri, Erysimum linariifolium, Erysimum carniolicum, Isatis tinctoria, and Rorippa lippizensis), were analyzed for their concentrations of P, K, Fe, Ca, Mg, Ni, Zn, Mn, Cu, Cr, Cd, and Pb. Most of the soil samples were typical of ultramafic sites, with low concentrations of P, K and Ca and high concentrations of Mg, Fe, Ni and Zn. The Ca/Mg ratio was <1 in most soil samples and Brassicaceae plants; only in A. montanum, A. repens, E. linariifolium and R. lippizensis was the Ca/Mg ratio >1. The levels of P, K, Fe and Zn were high, Mn and Cu occurred in low amounts, whereas Cr, Cd, Co and Pb were present only in trace amounts. In the roots and shoots of A. montanum and A. repens the measured Ni concentrations were 657 mg kg−1 and 676 mg kg−1 respectively, the first time such high Ni concentrations have been reported in these two species.
Can the conventional cytology technique be sufficient in a center lacking ROSE?: Retrospective study during the COVID-19 pandemic
While rapid on-site evaluation (ROSE) is considered an additional tool to optimize the yield of tissue acquisition during EUS-guided FNA of the gastrointestinal tract (1)(2), it is not always available while these procedures are performed. We reviewed twenty-seven EUS-guided FNA procedures performed at our institution, Tripoli Central Hospital, under general working restrictions imposed by local COVID-19 prevention protocols. Tissue adequacy of approximately 92.6% was achieved despite the lack of ROSE, which is comparable to ROSE-based tissue acquisition results. This small retrospective chart review illustrates that optimal tissue adequacy during EUS-guided FNA of the upper gastrointestinal tract can be achieved in a suboptimal hospital setting, lacking ROSE and relying merely on visual inspection of the specimens by the performing physician, and examines its effects on diagnosis.
Non-surface mass balance of glaciers in Iceland
Non-surface mass balance is non-negligible for glaciers in Iceland. Several Icelandic glaciers lie in the neo-volcanic zone, where a combination of geothermal activity, volcanic eruptions and a geothermal heat flux much higher than the global average leads to basal melting close to 150 mm w.e. a−1 for the Mýrdalsjökull ice cap and 75 mm w.e. a−1 for the largest ice cap, Vatnajökull. Energy dissipation in the flow of water and ice is also rather large for the high-precipitation, temperate glaciers of Iceland, resulting in internal and basal melting of 20–150 mm w.e. a−1. The total non-surface melting of glaciers in Iceland in 1995–2019 was 45–375 mm w.e. a−1 on average for the main ice caps, and was largest for Mýrdalsjökull, the south side of Vatnajökull and Eyjafjallajökull. Geothermal melting, volcanic eruptions and the energy dissipation in the flow of water and ice, as well as calving, all contribute, and these components should therefore be considered in mass-balance studies. For comparison, the average mass balance of glaciers in Iceland since 1995 is −500 to −1500 mm w.e. a−1. The non-surface mass balance corresponds to a total runoff contribution of 2.1 km3 a−1 of water from Iceland.

Financial support for lidar mapping of glaciers in Iceland in 2008–2012 was provided by the Icelandic Research Fund (163391-052), the Landsvirkjun (National Power Company of Iceland) Research Fund, the Icelandic Road Administration, the Reykjavík Energy Environmental and Energy Research Fund, the National Land Survey of Iceland, the Klima- og Luftgruppen (KoL) research fund of the Nordic Council of Ministers, and the Vatnajökull National Park. The acquisition of the Hofsjökull 2013 DEM was funded by AlpS GmbH and the University of Innsbruck. The acquisition of the Langjökull 2013 DEM was funded by NERC grant IG 13/12 and the DEM was provided by Ian Willis at the Scott Polar Research Institute. The work on estimating geothermal and volcanic power is based on funding from many sources, including the Research Fund of the University of Iceland, ISAVIA (the Icelandic Aviation Service), the Icelandic Road Administration and Landsvirkjun; logistical support has been provided by the Iceland Glaciological Society.
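The runoff figure follows from a simple unit conversion: a melt rate in mm w.e. a−1 spread over an ice-cap area gives a water volume per year. A minimal sketch, using the basal melt rates quoted in the abstract; the ice-cap areas are approximate assumptions, not values from this record:

```python
# Convert a melt rate (mm w.e. per year) over an area (km^2) to km^3 of
# water per year: depth in km = mm / 1e6, so volume = area * depth.

def runoff_km3_per_a(area_km2, melt_mm_we):
    return area_km2 * melt_mm_we / 1e6

# Basal melt rates from the abstract; areas are approximate assumptions.
ice_caps = {
    "Vatnajokull": (7700, 75),    # ~75 mm w.e. a^-1 basal melting
    "Myrdalsjokull": (520, 150),  # ~150 mm w.e. a^-1 basal melting
}

basal = sum(runoff_km3_per_a(a, m) for a, m in ice_caps.values())
# Basal geothermal melting of these two ice caps alone contributes a few
# tenths of a km^3 a^-1; the 2.1 km^3 a^-1 total quoted in the abstract
# also includes energy dissipation, eruptions and calving for all glaciers.
```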
The impact of the COVID-19 pandemic on the delivery of primary percutaneous coronary intervention in STEMI
Objectives: The clinical environment has been forced to adapt to the unprecedented challenges posed by the COVID-19 pandemic. Intensive care facilities were expanded in anticipation of the pandemic, with consequences including severe delays to elective procedures. Emergent procedures such as Percutaneous Coronary Intervention (PCI) in acute myocardial infarction (AMI), in which delays in timely delivery have well-established adverse prognostic effects, must also be examined in the context of the changes in procedure and public behaviour associated with the pandemic. The aim of this single-centre retrospective cohort study was to determine whether door-to-balloon (D2B) times in PCI for ST-Elevation Myocardial Infarction (STEMI) during the United Kingdom's first wave of the COVID-19 pandemic differed from those of pre-COVID-19 populations. Methods: Data were extracted from our single-centre PCI database for all patients who underwent primary PCI (pPCI) for STEMI. The reference (pre-COVID-19) cohort was collected over the period 01-03-2019 to 31-05-2019 and the exposure (COVID-19) cohort over the period 01-03-2020 to 31-05-2020. Baseline patient characteristics for both populations were extracted. The primary outcome measure was D2B time. Secondary outcome measures included: time from symptom onset to call for help, transfer time to first hospital, transfer time from non-PCI to PCI centre, time from call for help to arrival at the PCI centre, time to table, and symptom-onset-to-balloon time. Categorical and continuous variables were assessed with chi-squared and Mann-Whitney U analysis respectively. Procedural times were calculated and compared in the context of the heterogeneity findings. Results: Four baseline patient characteristics were unbalanced between populations with statistical significance (P<0.05).
The pre-COVID-19 cohort was more likely to have suffered out-of-hospital cardiac arrest (OHCA) and to have left circumflex disease, whereas the first-wave cohort was more likely to have been investigated with left ventriculography and to be of Afro-Caribbean origin. No statistically significant difference in in-hospital procedural times was found, with D2B, call-to-balloon (C2B) and onset-to-balloon (O2B) times comparable between groups. Pre-hospital delays were the greatest contributors to missed target times: the first-wave group had a significantly longer time from symptom onset to call for help (control: 31 mins, IQR [82.5] vs first wave: 60 mins, IQR [90.0], P=0.001) and time from call for help to arrival at the PCI hospital (control: 72 mins, IQR [23] vs first wave: 80 mins, IQR [66.5], P=0.042). Conclusion: Enhanced infection prevention and control procedures introduced in light of the COVID-19 pandemic did not impede the delivery of pPCI in our single-centre cohort. The public health impact of the pandemic was demonstrated, with times significantly affected by patient-related delays. The recovery of public engagement with emergency medical services must become the focus of public health initiatives as we emerge from the height of the COVID-19 disease burden in the UK.
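The procedural times above were compared with Mann-Whitney U analysis, which would typically be run through a statistics package (e.g. `scipy.stats.mannwhitneyu`). To illustrate what that statistic actually is, here is a minimal from-scratch sketch computing U with midranks for ties; the door-to-balloon times are invented, and no p-value is derived:

```python
# Minimal Mann-Whitney U statistic (no p-value), using midranks for ties.

def ranks(values):
    """1-based midranks: tied values share the average of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def mann_whitney_u(x, y):
    all_r = ranks(list(x) + list(y))
    r_x = sum(all_r[: len(x)])               # rank sum of the first sample
    u_x = r_x - len(x) * (len(x) + 1) / 2
    u_y = len(x) * len(y) - u_x
    return min(u_x, u_y)

# Hypothetical door-to-balloon times in minutes (not study data):
pre_covid = [55, 62, 70, 48, 66]
first_wave = [60, 75, 68, 90, 72]
u = mann_whitney_u(pre_covid, first_wave)
```

Complete separation of the two samples gives U = 0; heavily overlapping samples give U near len(x) * len(y) / 2.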
An experimental investigation of the effect of age and sex/gender on pain sensitivity in healthy human participants
© 2018 El-Tumi et al. Background: Ageing is associated with alterations in the structure and function of somatosensory tissue that can affect pain perception. The aim of this study was to investigate the relationship between age and pain sensitivity responses to noxious thermal and mechanical stimuli in healthy adults. Methods: Fifty-six unpaid volunteers (28 women) aged between 20 and 55 years were categorised by age into one of seven groups. The following measurements were taken: thermal detection thresholds, heat pain threshold and tolerance using a TSA-II NeuroSensory Analyzer; pressure pain threshold using a handheld electronic pressure algometer; and cold pressor pain threshold, tolerance, intensity and unpleasantness. Results: There was a positive correlation between heat pain tolerance and age (r = 0.228, P = 0.046), but no statistically significant differences between age groups for cold or warm detection thresholds, or for heat pain threshold or tolerance. Forward regression found increasing age to be a predictor of increased pressure pain threshold (B = 0.378, P = 0.002), and sex/gender to be a predictor of cold pressor pain tolerance, with women having lower tolerance than men (B = −0.332, P = 0.006). Conclusion: The findings of this experimental study provide further evidence that pressure pain threshold increases with age and that women have lower thresholds and tolerances to innocuous and noxious thermal stimuli. Significance: The findings demonstrate that variation in pain sensitivity response to experimental stimuli in adults depends on stimulus modality, age and sex/gender.
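The age association above is reported as a Pearson correlation (r = 0.228). For readers unfamiliar with the statistic, a minimal from-first-principles sketch; the age and tolerance values below are invented illustrative data, not the study's:

```python
# Pearson's r: covariance of the two variables divided by the product
# of their standard deviations (here via sums of squared deviations).
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

ages = [20, 25, 30, 35, 40, 45, 50, 55]                        # hypothetical
heat_tol_c = [46.1, 46.0, 46.4, 46.3, 46.8, 46.5, 47.0, 47.1]  # hypothetical
r = pearson_r(ages, heat_tol_c)  # positive, mirroring the reported trend
```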
Lava field evolution and emplacement dynamics of the 2014–2015 basaltic fissure eruption at Holuhraun, Iceland
The 6-month-long eruption at Holuhraun (August 2014–February 2015) in the Bárðarbunga-Veiðivötn volcanic system was the largest effusive eruption in Iceland since the 1783–1784 CE Laki eruption. The lava flow field covered ~84 km2 and has an estimated bulk (i.e., including vesicles) volume of ~1.44 km3. The eruption had an average discharge rate of ~90 m3/s, making it the longest effusive eruption in modern times to sustain such a high average flux. The first phase of the eruption (August 31, 2014 to mid-October 2014) had a discharge rate of ~350 to 100 m3/s and was typified by lava transport via open channels and the formation of four lava flows, no. 1–4, which were emplaced side by side. The eruption began on a 1.8 km long fissure, feeding partly incandescent sheets of slabby pāhoehoe up to 500 m wide. By the following day lava transport had become confined to open channels and the dominant lava morphology changed to rubbly pāhoehoe and 'a'ā. The latter became the dominant morphology of lava flows no. 1–8. The second phase of the eruption (mid-October to the end of November) had a discharge of ~100–50 m3/s. During this time the lava transport system changed, via the formation of a <1 km2 lava pond ~1 km east of the vent. The pond most likely formed in a topographic low created by the pre-existing Holuhraun and the new Holuhraun lava flow fields. This pond became the main point of lava distribution, controlling the emplacement of subsequent flows (i.e. no. 5–8). Towards the end of this phase inflation plateaus developed in lava flow no. 1. These inflation plateaus were the surface manifestation of a growing lava tube system, which formed as lava ponded in the open lava channels, creating sufficient lavastatic pressure in the fluid lava to lift the roof of the lava channels. This allowed new lava into the previously active lava channel, lifting the channel roof via inflation. The final (third) phase, lasting from December to the end of February 2015, had a mean discharge rate of ~50 m3/s. In this phase lava transport was mainly confined to lava tubes within lava flows no. 1–2, which fed breakouts that resurfaced >19 km2 of the flow field. The primary lava morphology from this phase was spiny pāhoehoe, which was superimposed on the 'a'ā lava flows no. 1–3 and extended the entire length of the flow field (i.e. 17 km). This made the 2014–2015 Holuhraun a paired flow field, in which both lava morphologies had similar lengths. We suggest that the similar length is a consequence of the pāhoehoe being fed from the tube system utilizing the existing 'a'ā lava channels, its extent thereby being controlled by the initial length of the 'a'ā flows.

The work was financed with crisis response funding from the Icelandic Government, along with European Community's Seventh Framework Programme Grant No. 308377 (Project FUTUREVOLC) and the Icelandic Research Fund, Rannís, Grant of Excellence No. 152266-052 (Project EMMIRS). Furthermore, Vinur Vatnajökuls are thanked for support.
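The quoted figures are internally consistent: an average discharge of ~90 m3/s sustained over the roughly 180-day eruption yields about 1.4 km3, close to the reported ~1.44 km3 bulk volume. A quick arithmetic check:

```python
# Mean discharge (m^3/s) times duration (days) converted to km^3.
# ~180 days approximates the 31 Aug 2014 to late Feb 2015 span.

SECONDS_PER_DAY = 86_400

def erupted_volume_km3(discharge_m3_s, days):
    return discharge_m3_s * days * SECONDS_PER_DAY / 1e9

v = erupted_volume_km3(90, 180)
# ~1.4 km^3, close to the ~1.44 km^3 bulk volume reported; bulk volume
# includes vesicles, so exact agreement is not expected.
```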
Mortality from gastrointestinal congenital anomalies at 264 hospitals in 74 low-income, middle-income, and high-income countries: a multicentre, international, prospective cohort study
Background: Congenital anomalies are the fifth leading cause of mortality in children younger than 5 years globally. Many gastrointestinal congenital anomalies are fatal without timely access to neonatal surgical care, but few studies have been done on these conditions in low-income and middle-income countries (LMICs). We compared outcomes of the seven most common gastrointestinal congenital anomalies in low-income, middle-income, and high-income countries globally, and identified factors associated with mortality. // Methods: We did a multicentre, international prospective cohort study of patients younger than 16 years, presenting to hospital for the first time with oesophageal atresia, congenital diaphragmatic hernia, intestinal atresia, gastroschisis, exomphalos, anorectal malformation, and Hirschsprung's disease. Consecutive patients were recruited for a minimum of 1 month between October, 2018, and April, 2019. We collected data on patient demographics, clinical status, interventions, and outcomes using the REDCap platform. Patients were followed up for 30 days after primary intervention, or 30 days after admission if they did not receive an intervention. The primary outcome was all-cause, in-hospital mortality for all conditions combined and each condition individually, stratified by country income status. We did a complete case analysis. // Findings: We included 3849 patients with 3975 study conditions (560 with oesophageal atresia, 448 with congenital diaphragmatic hernia, 681 with intestinal atresia, 453 with gastroschisis, 325 with exomphalos, 991 with anorectal malformation, and 517 with Hirschsprung's disease) from 264 hospitals (89 in high-income countries, 166 in middle-income countries, and nine in low-income countries) in 74 countries. Of the 3849 patients, 2231 (58·0%) were male. Median gestational age at birth was 38 weeks (IQR 36–39) and median bodyweight at presentation was 2·8 kg (2·3–3·3).
Mortality among all patients was 37 (39·8%) of 93 in low-income countries, 583 (20·4%) of 2860 in middle-income countries, and 50 (5·6%) of 896 in high-income countries (p<0·0001 between all country income groups). Gastroschisis had the greatest difference in mortality between country income strata (nine [90·0%] of ten in low-income countries, 97 [31·9%] of 304 in middle-income countries, and two [1·4%] of 139 in high-income countries; p<0·0001 between all country income groups). Factors significantly associated with higher mortality for all patients combined included country income status (low-income vs high-income countries, risk ratio 2·78 [95% CI 1·88–4·11], p<0·0001; middle-income vs high-income countries, 2·11 [1·59–2·79], p<0·0001), sepsis at presentation (1·20 [1·04–1·40], p=0·016), higher American Society of Anesthesiologists (ASA) score at primary intervention (ASA 4–5 vs ASA 1–2, 1·82 [1·40–2·35], p<0·0001; ASA 3 vs ASA 1–2, 1·58 [1·30–1·92], p<0·0001), surgical safety checklist not used (1·39 [1·02–1·90], p=0·035), and ventilation or parenteral nutrition unavailable when needed (ventilation 1·96 [1·41–2·71], p=0·0001; parenteral nutrition 1·35 [1·05–1·74], p=0·018). Administration of parenteral nutrition (0·61 [0·47–0·79], p=0·0002) and use of a peripherally inserted central catheter (0·65 [0·50–0·86], p=0·0024) or percutaneous central line (0·69 [0·48–1·00], p=0·049) were associated with lower mortality. // Interpretation: Unacceptable differences in mortality exist for gastrointestinal congenital anomalies between low-income, middle-income, and high-income countries. Improving access to quality neonatal surgical care in LMICs will be vital to achieve Sustainable Development Goal 3.2 of ending preventable deaths in neonates and children younger than 5 years by 2030.
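The headline risk ratios above are model-adjusted. As a sanity check on the raw counts, a crude risk ratio can be computed directly from the mortality figures quoted (37/93 deaths in low-income vs 50/896 in high-income countries); the crude value is much larger than the adjusted RR of 2·78, a reminder that the reported estimates account for other factors:

```python
# Crude (unadjusted) risk ratio straight from the quoted counts.

def risk_ratio(deaths_a, n_a, deaths_b, n_b):
    return (deaths_a / n_a) / (deaths_b / n_b)

rr_crude = risk_ratio(37, 93, 50, 896)
# rr_crude is roughly 7.1, far above the adjusted RR of 2.78, because
# the adjusted model controls for factors such as sepsis and ASA score.
```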
Increasing frailty is associated with higher prevalence and reduced recognition of delirium in older hospitalised inpatients: results of a multi-centre study
Purpose: Delirium is a neuropsychiatric disorder delineated by an acute change in cognition, attention, and consciousness. It is common, particularly in older adults, but poorly recognised. Frailty is the accumulation of deficits conferring an increased risk of adverse outcomes. We set out to determine how severity of frailty, as measured using the Clinical Frailty Scale (CFS), affected delirium rates and recognition in hospitalised older people in the United Kingdom. Methods: Adults over 65 years were included in an observational multi-centre audit across UK hospitals, comprising two prospective rounds and one retrospective note review. CFS score, delirium status, and 30-day outcomes were recorded. Results: The overall prevalence of delirium was 16.3% (n = 483). Patients with delirium were more frail than patients without delirium (median CFS 6 vs 4). The risk of delirium was greater with increasing frailty [OR 2.9 (1.8–4.6) in CFS 4 vs 1–3; OR 12.4 (6.2–24.5) in CFS 8 vs 1–3]. Higher CFS was associated with reduced recognition of delirium (OR 0.7 (0.3–1.9) in CFS 4 compared with 0.2 (0.1–0.7) in CFS 8). Both risks were independent of age and dementia. Conclusion: We have demonstrated an incremental increase in the risk of delirium with increasing frailty. This has important clinical implications, suggesting that frailty may provide a more nuanced measure of vulnerability to delirium and poor outcomes. However, the most frail patients are the least likely to have their delirium diagnosed, and there is a significant lack of research into the underlying pathophysiology of both of these common geriatric syndromes.