482 research outputs found

    Mollusk carbonate thermal behaviour and its implications in understanding prehistoric fire events in shell middens

    Archaeological shell middens are particularly important for reconstructing prehistoric human subsistence strategies. However, very little is known about shellfish processing, especially when related to the use of fire for dietary and disposal purposes. To shed light on prehistoric food processing techniques, an experimental study was undertaken on modern gastropod shells (Phorcus lineatus). The shells were exposed to high temperatures (200-700 °C) to investigate subsequent mineralogical and macro- and microstructural changes. Afterwards, this three-pronged approach was applied to archaeological shells from Haua Fteah cave, Libya (Phorcus turbinatus) and from shell midden sites in the United Arab Emirates (Anadara uropigimelana and Terebralia palustris) to determine exposure temperatures. Results indicated that shells from the Haua Fteah were exposed to high temperatures (600-700 °C) during the Mesolithic period (c. 12.7-9 ka), whereas specimens from the Neolithic period (c. 8.5-5.4 ka) were mainly exposed to lower temperatures (300-500 °C). The thermally induced changes in A. uropigimelana and T. palustris shells from the South East Arabian archaeological sites were similar to those seen in Phorcus spp., suggesting a broad applicability of the experimental results at an interspecific level. Although heat significantly altered the appearance and mineralogy of the shells, AMS 14C ages obtained on burnt shells fit within the expected age ranges for their associated archaeological contexts, indicating that robust radiocarbon ages may still be obtained from burnt shells. Our study indicates that the combination of microstructural and mineralogical observations can provide important information to infer shellfish processing strategies in prehistoric cultures and their change through time. Funding for this study was kindly provided by the EU within the framework (FP7) of the Marie Curie International Training Network ARAMACC (604802) to SM and by the Alexander von Humboldt Postdoctoral Fellowship (1151310) and McKenzie Postdoctoral Fellowship to AP. The Haua Fteah excavations were undertaken with the permission of the Libyan Department of Antiquities and with funding to GB from the Society for Libyan Studies and from the European Research Council (Advanced Investigator Grant 230421), whose support is also gratefully acknowledged.

    Year-round shellfish exploitation in the Levant and implications for Upper Palaeolithic hunter-gatherer subsistence

    Recent studies have shown that the use of aquatic resources has greater antiquity in hominin diets than previously thought. At present, it is unclear when hominins started to habitually consume marine resources. This study examines shellfish exploitation from a behavioural ecology perspective, addressing how and when past hunter-gatherers from the Levant used coastal resources for subsistence purposes. We investigate the seasonality of shellfish exploitation in the Levantine Upper Palaeolithic through oxygen isotope analysis on shells of the intertidal rocky shore mollusc Phorcus (Osilinus) turbinatus from the key site Ksâr ‘Akil (Lebanon). At this rockshelter, multi-layered archaeological deposits contained remains of both marine and terrestrial molluscs in relatively large quantities, which were consumed and used as tools and ornaments by the occupants of the site. Our results indicate that at the start of the Initial Upper Palaeolithic (IUP), there is no evidence for shellfish consumption. Humans started to take fresh shellfish to the rockshelter from the second half of the IUP onward, albeit in low quantities. During the Early Upper Palaeolithic (EUP) shellfish exploitation became increasingly frequent. Oxygen isotope data show that shellfish exploitation was practised in every season throughout most of the Upper Palaeolithic (UP), with an emphasis on the colder months. This suggests that coastal resources had a central role in early UP foraging strategies, rather than a seasonally restricted supplementary one. Year-round shellfish gathering, in turn, suggests that humans occupied the rockshelter at different times of the year, although not necessarily continuously. Our oxygen isotope data are complemented with broader-scale exploitation patterns of faunal resources, both vertebrate and invertebrate, at the site. The inclusion of coastal marine resources signifies a diversification of the human diet from the EUP onward, which is also observed in foraging practices linked to the exploitation of terrestrial fauna. Funding was provided by the H2020 Marie Skłodowska-Curie fellowship “EU-BEADS” (project number 656325) and the Max Planck Society.

    Changing patterns of eastern Mediterranean shellfish exploitation in the Late Glacial and Early Holocene: Oxygen isotope evidence from gastropod shells in Epipaleolithic to Neolithic human occupation layers at the Haua Fteah cave, Libya

    The seasonal pattern of shellfish foraging at the archaeological site of Haua Fteah in the Gebel Akhdar, Libya was investigated from the Epipaleolithic to the Neolithic via oxygen isotope (δ18O) analyses of the topshell Phorcus (Osilinus) turbinatus. To validate this species as a faithful year-round palaeoenvironmental recorder, the intra-annual variability of δ18O in modern shells and sea water was analysed and compared with measured sea surface temperature (SST). The shells were found to be good candidates for seasonal shellfish foraging studies as they preserve nearly the complete annual SST cycle in their shell δ18O with minimal slowing or stoppage of growth. During the terminal Pleistocene Early Epipaleolithic (locally known as the Oranian, with modelled dates of 17.2-12.5 ka at 2σ probability; Douka et al., 2014), analysis of archaeological specimens indicates that shellfish were foraged year-round. This complements other evidence from the archaeological record showing that the cave was more intensively occupied in this period than before or afterwards. This finding is significant because the Oranian was the coldest and driest phase of the last glacial cycle in the Gebel Akhdar, adding weight to the theory that the Gebel Akhdar may have served as a refugium for humans in North Africa during times of global climatic extremes. Mollusc exploitation in the latest Pleistocene and Early Holocene, during the Late Epipaleolithic (locally known as the Capsian, c. 12.7 to 9 ka) and the Neolithic (c. 8.5 to 5.4 ka), occurred predominantly during winter. Other evidence from these archaeological phases shows that hunting activities occurred during the warmer months. Therefore, the timing of Holocene shellfish exploitation in the Gebel Akhdar may have been influenced by the seasonal availability of other resources at these times, and shellfish were possibly used as a dietary supplement when other foods were less abundant.
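    The seasonality inferences in the two studies above rest on the inverse relationship between water temperature and the δ18O of shell carbonate precipitated in equilibrium with seawater: shell δ18O falls by roughly 0.2‰ for every 1 °C of warming, so a linear approximation of the form T (°C) ≈ a − b(δ18Oshell − δ18Owater), with b on the order of 4-5 °C per ‰, converts measured shell values into sea surface temperatures. The coefficients a and b depend on shell mineralogy and the calibration adopted; the values quoted here are indicative only and are not those used in the studies. Sampling sequentially along the shell's growth direction therefore yields a seasonal temperature curve, and the δ18O of the final growth increment, read against that curve, indicates the season in which the animal was collected.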

    Contemporary Presentation and Management of Valvular Heart Disease The EURObservational Research Programme Valvular Heart Disease II Survey

    Background: Valvular heart disease (VHD) is an important cause of mortality and morbidity and has been subject to important changes in management. The VHD II survey was designed by the EURObservational Research Programme of the European Society of Cardiology to analyze actual management of VHD and to compare practice with guidelines. Methods: Patients with severe native VHD or previous valvular intervention were enrolled prospectively across 28 countries over a 3-month period in 2017. Indications for intervention were considered concordant if the intervention was performed or scheduled in symptomatic patients, corresponding to Class I recommendations specified in the 2012 European Society of Cardiology and in the 2014 American Heart Association/American College of Cardiology VHD guidelines. Results: A total of 7247 patients (4483 hospitalized, 2764 outpatients) were included in 222 centers. Median age was 71 years (interquartile range, 62-80 years); 1917 patients (26.5%) were >= 80 years; and 3416 were female (47.1%). Severe native VHD was present in 5219 patients (72.0%): aortic stenosis in 2152 (41.2% of native VHD), aortic regurgitation in 279 (5.3%), mitral stenosis in 234 (4.5%), mitral regurgitation in 1114 (21.3%; primary in 746 and secondary in 368), multiple left-sided VHD in 1297 (24.9%), and right-sided VHD in 143 (2.7%). Two thousand twenty-eight patients (28.0%) had undergone previous valvular intervention. Intervention was performed in 37.0% and scheduled in 26.8% of patients with native VHD. The decision for intervention was concordant with Class I recommendations in symptomatic patients with severe single left-sided native VHD in 79.4% (95% CI, 77.1-81.6) for aortic stenosis, 77.6% (95% CI, 69.9-84.0) for aortic regurgitation, 68.5% (95% CI, 60.8-75.4) for mitral stenosis, and 71.0% (95% CI, 66.4-75.3) for primary mitral regurgitation. Valvular interventions were performed in 2150 patients during the survey; of them, 47.8% of patients with single left-sided native VHD were in New York Heart Association class III or IV. Transcatheter procedures were performed in 38.7% of patients with aortic stenosis and 16.7% of those with mitral regurgitation. Conclusions: Despite good concordance between Class I recommendations and practice in patients with aortic VHD, the suboptimal concordance in mitral VHD and late referral for valvular interventions suggest the need to further improve guideline implementation.

    Cardiac rehabilitation to improve health-related quality of life following trans-catheter aortic valve implantation: a randomised controlled feasibility study: RECOVER-TAVI Pilot, ORCA 4, for the Optimal Restoration of Cardiac Activity Group

    Objectives: Transcatheter aortic valve implantation (TAVI) is often undertaken in the oldest, frailest cohort of patients undergoing cardiac interventions. We plan to investigate the potential benefit of cardiac rehabilitation (CR) in this vulnerable population. Design: We undertook a pilot randomised trial of CR following TAVI to inform the feasibility and design of a future randomised clinical trial (RCT). Participants: We screened patients undergoing TAVI at a single institution between June 2016 and February 2017. Interventions: Participants were randomised post-TAVI to standard of care (control group) or standard of care plus exercise-based CR (intervention group). Outcomes: We assessed recruitment and attrition rates, uptake of CR, and explored changes in the 6-min walk test, Nottingham Activities of Daily Living, Fried and Edmonton Frailty scores and the Hospital Anxiety and Depression Score, from baseline (30 days post-TAVI) to 3 and 6 months post-randomisation. We also undertook a parallel study to assess the use of the Kansas City Cardiomyopathy Questionnaire (KCCQ) in the post-TAVI population. Results: Of 82 patients screened, 52 met the inclusion criteria and 27 were recruited (3 patients/month). In the intervention group, 10/13 (77%) completed the prescribed course of 6 sessions of CR (mean number of sessions attended 7.5, SD 4.25) over 6 weeks. At 6 months, all participants were retained for follow-up. There was apparent improvement in outcome scores at 3 and 6 months in both the control and CR groups. There were no recorded adverse events associated with the CR intervention. The KCCQ was well accepted in 38 post-TAVI patients: mean summary score 72.6 (SD 22.6). Conclusions: We have demonstrated the feasibility of recruiting post-TAVI patients into a randomised trial of CR. We will use the findings of this pilot trial to design a fully powered multicentre RCT to inform the provision of CR and support guideline development to optimise health-related quality of life outcomes in this vulnerable population. Trial registration: ClinicalTrials.gov identifier NCT02921880, retrospectively registered 3 October 2016.

    Lumbar Plexopathy Caused by Metastatic Tumor, Which Was Mistaken for Postoperative Femoral Neuropathy

    Surgical excision was performed on a 30-year-old woman with a painful mass on her left thigh. The pathologic findings of the mass indicated fibromatosis. After the operation, she complained of allodynia and spontaneous pain at the operation site and in the ipsilateral lower leg. We treated her for presumed postoperative femoral neuropathy, but her symptoms were aggravated. We then found a large liposarcoma in her left iliopsoas muscle that compressed the lumbar plexus. In conclusion, the cause of her pain was lumbar plexopathy related to a mass in the left iliopsoas muscle. Prompt diagnosis of acute neuropathic pain after an operation is important, and management must be based on the exact cause.

    Long-term virological outcome in children on antiretroviral therapy in the UK and Ireland

    Objective: To assess factors at the start of antiretroviral therapy (ART) associated with long-term virological response in children. Design: Multicentre national cohort. Methods: Factors associated with viral load below 400 copies/ml by 12 months and virologic failure among children starting 3/4-drug ART in the UK/Irish Collaborative HIV Paediatric Study were assessed using Poisson models. Results: Nine hundred and ninety-seven children started ART at a median age of 7.7 years (interquartile range 2.9-11.7), 251 (25%) below 3 years: 411 (41%) with efavirenz and two nucleoside reverse transcriptase inhibitors (EFV+2NRTIs), 264 (26%) with nevirapine and two NRTIs (NVP+2NRTIs), 119 (12%; 106 NVP, 13 EFV) with a non-nucleoside reverse transcriptase inhibitor and three NRTIs (NNRTI+3NRTIs), and 203 (20%) with boosted protease inhibitor-based regimens. Median follow-up after ART initiation was 5.7 (3.0-8.8) years. Viral load was less than 400 copies/ml by 12 months in 92% [95% confidence interval (CI) 91-94%] of the children. Time to suppression was similar across regimens (P=0.10), but faster over calendar time, with older age and lower baseline viral load. Three hundred and thirty-nine (34%) children experienced virological failure. Although progression to failure varied by regimen (P<0.001) and was fastest for NVP+2NRTIs regimens, risk after 2 years on therapy was similar for EFV+2NRTIs and NVP+2NRTIs, and lowest for NNRTI+3NRTIs regimens (P-interaction=0.03). Older age, earlier calendar periods and maternal ART exposure were associated with increased failure risk. Early treatment discontinuation for toxicity occurred more frequently for NVP-based regimens, but 5-year cumulative incidence was similar: 6.1% (95% CI 3.9-8.9%) NVP, 8.3% (95% CI 5.6-11.6%) EFV, and 9.8% (95% CI 5.7-15.3%) protease inhibitor-based regimens (P=0.48). Conclusion: Viral load suppression by 12 months was high with all regimens. NVP+3NRTIs regimens were particularly efficacious in the longer term and may be a good alternative to protease inhibitor-based ART in young children.

    Carbon isotope signatures from land snail shells: Implications for palaeovegetation reconstruction in the eastern Mediterranean

    In this study we compare carbon isotope values in modern Helix melanostoma shell carbonate (δ13Cshell) from the Gebel al-Akhdar region of Libya with carbon isotope values in H. melanostoma body tissue (δ13Cbody), local vegetation (δ13Cplant) and soil (δ13Csoil). All vegetation in the study area followed the C3 photosynthetic pathway. However, the δ13Cplant values of different species formed two distinct isotopic groups. This is best explained by differences in water use efficiency, with arid-adapted species having significantly more positive δ13Cplant values than less water-efficient species. The ranges and means of δ13Cbody and δ13Cplant were statistically indistinguishable from one another, suggesting that δ13Cbody was primarily a function of local vegetation composition. H. melanostoma δ13Cshell reflected the δ13Cplant of local vegetation with a positive offset between body/diet and shell of 14.5 ± 1.4‰. Therefore, in the Gebel al-Akhdar, where only C3 plants are present, higher mean δ13Cshell values likely reflect greater abundances of water-efficient C3 plants in the snail's diet and therefore in the landscape, whilst lower mean δ13Cshell values likely reflect the consumption of less water-efficient C3 plants. The distribution of these plants is in turn affected by environmental factors such as rainfall. These findings can be applied to archaeological and geological shell deposits to reconstruct late Pleistocene to Holocene vegetation change in the southeast Mediterranean.
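    As a worked illustration of the reconstruction step above (the shell values below are hypothetical; only the 14.5 ± 1.4‰ diet-to-shell offset comes from the study): an archaeological shell measuring δ13Cshell = −10‰ implies a mean dietary value of about −10‰ − 14.5‰ = −24.5‰, towards the more positive, water-efficient end of the C3 plant range, whereas a shell at −14‰ implies a diet near −28.5‰, consistent with less water-efficient C3 vegetation. In both cases the ±1.4‰ uncertainty on the offset propagates directly into the reconstructed dietary value.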

    Contemporary Management of Severe Symptomatic Aortic Stenosis

    There were gaps between guidelines and practice when surgery was the only treatment for aortic stenosis (AS). This study analyzed the decision to intervene in patients with severe AS in the EORP VHD (EURObservational Research Programme Valvular Heart Disease) II survey. Among 2,152 patients with severe AS, 1,271 patients with high-gradient AS who were symptomatic fulfilled a Class I recommendation for intervention according to the 2012 European Society of Cardiology guidelines; the primary end point was the decision for intervention. A decision not to intervene was taken in 262 patients (20.6%). In multivariate analysis, the decision not to intervene was associated with older age (odds ratio [OR]: 1.34 per 10-year increase; 95% CI: 1.11 to 1.61; P = 0.002), New York Heart Association functional classes I and II versus III (OR: 1.63; 95% CI: 1.16 to 2.30; P = 0.005), higher age-adjusted Charlson comorbidity index (OR: 1.09 per 1-point increase; 95% CI: 1.01 to 1.17; P = 0.03), and a lower transaortic mean gradient (OR: 0.81 per 10-mm Hg decrease; 95% CI: 0.71 to 0.92; P < 0.001). During the study period, 346 patients (40.2%, median age 84 years, median EuroSCORE II [European System for Cardiac Operative Risk Evaluation II] 3.1%) underwent transcatheter intervention and 515 (59.8%, median age 69 years, median EuroSCORE II 1.5%) underwent surgery. A decision not to intervene versus intervention was associated with lower 6-month survival (87.4%; 95% CI: 82.0 to 91.3 vs 94.6%; 95% CI: 92.8 to 95.9; P < 0.001). A decision not to intervene was taken in 1 in 5 patients with severe symptomatic AS despite a Class I recommendation for intervention, and the decision was particularly associated with older age and combined comorbidities. Transcatheter intervention was extensively used in octogenarians.

    TCRβ sequencing reveals spatial and temporal evolution of clonal CD4 T cell responses in a breach of tolerance model of inflammatory arthritis

    Effective tolerogenic intervention in rheumatoid arthritis (RA) will rely upon understanding the evolution of articular antigen-specific CD4 T cell responses. TCR clonality of endogenous CD4 T cell infiltrates in early inflammatory arthritis was assessed to monitor evolution of the TCR repertoire in the inflamed joint and associated lymph node (LN). Mouse models of antigen-induced breach of self-tolerance and chronic polyarthritis were used to recapitulate the early and late phases of RA. The infiltrating endogenous, antigen-experienced CD4 T cells in inflamed joints and LNs were analysed using flow cytometry and TCRβ sequencing. TCR repertoires from inflamed late-phase LNs displayed increased clonality and diversity compared to early-phase LNs, while inflamed joints remained similar over time. Repertoires from late-phase LNs accumulated clones with a diverse range of TRBV genes, while inflamed joints at both phases contained clones expressing similar TRBV genes. Repertoires from LNs and joints at the late phase displayed reduced CDR3β sequence overlap compared to the early disease phase; however, the most abundant clones in LNs accumulated in the joint at the later phase. The results indicate that CD4 T cell repertoire clonality and diversity broaden with the progression of inflammatory arthritis and are first reflected in LNs before being mirrored in the joint. These observations imply that antigen-specific tolerogenic therapies could be more effective if targeted at earlier phases of disease, when CD4 T cell clonality is least diverse.