21 research outputs found

    A unique case of esophageal perforation caused by prickly pears

    Get PDF
    This is a report of a previously healthy 20-year-old male presenting with the sensation of a foreign object stuck in the throat and difficulty speaking after the ingestion of 2 prickly pears. Tests were performed, confirming an esophageal perforation, which was managed medically. The patient was discharged after 7 days in hospital with no complications. Peer-reviewed.

    Post-tonsillectomy bleeding in Malta

    Get PDF
    Patients presenting with post-tonsillectomy haemorrhage are frequently readmitted to hospital, usually for observation, but surgical intervention may sometimes be necessary. A 10-year retrospective analysis of 3553 patients who underwent tonsillectomy, with or without adenoidectomy, at Mater Dei Hospital was carried out. Clinical notes were used to determine the post-tonsillectomy haemorrhage rate and its relationship to the use of antibiotics. Peer-reviewed.

    Probing the Complex and Variable X-ray Absorption of Markarian 6 with XMM-Newton

    Full text link
    We report on an X-ray observation of the Seyfert 1.5 galaxy Mrk 6 obtained with the EPIC instruments onboard XMM-Newton. Archival BeppoSAX PDS data from 18-120 keV were also used to constrain the underlying hard power-law continuum. The results from our spectral analyses generally favor a double partial-covering model, although other spectral models, such as absorption by a mixture of partially ionized and neutral gas, cannot be firmly ruled out. Our best-fitting model consists of a power law with a photon index of 1.81+/-0.20 and partial covering with large column densities up to 10^23 cm^-2. We also detect a narrow emission line, consistent with Fe Kalpha fluorescence, at 6.45+/-0.04 keV with an equivalent width of ~93+/-25 eV. Joint analyses of XMM-Newton, ASCA, and BeppoSAX data further provide evidence for both spectral variability (a factor of ~2 change in absorbing column) and absorption-corrected flux variations (by ~60%) during the ~4-year period probed by the observations. Comment: 7 pages, 2 figures. Accepted for publication in the Astronomical Journal
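    A double partial-covering model of the kind described has a standard generic form: a power law seen through two absorbers, each covering only a fraction of the source. The sketch below gives that generic parameterisation, not the exact model configuration used in the paper:

    ```latex
    % Power law of photon index \Gamma viewed through two partial absorbers,
    % each with column density N_{H,i} and covering fraction f_i;
    % \sigma(E) is the photoelectric absorption cross-section.
    F(E) = A\, E^{-\Gamma} \prod_{i=1}^{2}
           \Bigl[\, f_i\, e^{-N_{H,i}\,\sigma(E)} + (1 - f_i) \,\Bigr]
    ```

    In the fit quoted above, the photon index is Gamma ~ 1.81 and the column densities reach ~10^23 cm^-2.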

    The On-Site Analysis of the Cherenkov Telescope Array

    Get PDF
    The Cherenkov Telescope Array (CTA) observatory will be one of the largest ground-based very high-energy gamma-ray observatories. The On-Site Analysis will be the first CTA scientific analysis of data acquired from the array of telescopes, at both the northern and southern sites. The On-Site Analysis will have two pipelines: the Level-A pipeline (also known as Real-Time Analysis, RTA) and the Level-B pipeline. The RTA performs data quality monitoring and must be able to issue automated alerts on variable and transient astrophysical sources within 30 seconds of the last acquired Cherenkov event that contributes to the alert, with a sensitivity no more than a factor of 3 worse than that of the final pipeline. The Level-B Analysis has a tighter sensitivity requirement (no more than a factor of 2 worse than the final pipeline), and its results should be available within 10 hours of data acquisition; this analysis could therefore be performed at the end of an observation or the next morning. The latency (in particular for the RTA) and sensitivity requirements are challenging because of the large data rate, a few GByte/s. The remote connection to the CTA candidate sites, with rather limited network bandwidth, makes the size of the exported data extremely critical and rules out any real-time processing of the data away from the telescope sites. For these reasons the analysis will be performed on-site, with infrastructure co-located with the telescopes, limited electrical power availability and a reduced possibility of human intervention. This means, for example, that the on-site hardware infrastructure should have low power consumption.
    A substantial effort towards the optimization of high-throughput computing services is envisioned, to provide hardware and software solutions with high throughput and low power consumption at low cost. Comment: In Proceedings of the 34th International Cosmic Ray Conference (ICRC2015), The Hague, The Netherlands. All CTA contributions at arXiv:1508.0589
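    To make the latency and data-rate numbers concrete, a back-of-envelope sketch of the volumes involved. The 5 GB/s rate and the 8-hour night are illustrative assumptions within the "few GByte/s" range quoted above, not official CTA figures:

    ```python
    # Back-of-envelope data-volume budget for the CTA On-Site Analysis.
    RATE_GBPS = 5.0        # assumed camera-data rate, GB/s (illustrative)
    RTA_LATENCY_S = 30     # RTA must alert within 30 s of the last event
    NIGHT_HOURS = 8        # assumed length of an observing night (illustrative)

    def volume_gb(rate_gbps: float, seconds: float) -> float:
        """Data volume in GB accumulated at `rate_gbps` over `seconds`."""
        return rate_gbps * seconds

    # Data the RTA must keep pace with inside its 30 s latency window:
    window_gb = volume_gb(RATE_GBPS, RTA_LATENCY_S)                 # 150 GB
    # Raw data produced over one night, which Level-B must digest
    # within its 10-hour budget:
    night_tb = volume_gb(RATE_GBPS, NIGHT_HOURS * 3600) / 1000.0    # 144 TB
    print(window_gb, night_tb)
    ```

    Even under these rough assumptions, a single night's raw data far exceeds what a limited site uplink could export, which is why the abstract argues for on-site processing.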

    A prototype for the real-time analysis of the Cherenkov Telescope Array

    Full text link
    The Cherenkov Telescope Array (CTA) observatory will be one of the biggest ground-based very-high-energy (VHE) γ-ray observatories. CTA will achieve a factor of 10 improvement in sensitivity, from some tens of GeV to beyond 100 TeV, with respect to existing telescopes. The CTA observatory will be capable of issuing alerts on variable and transient sources to maximize the scientific return. To capture these phenomena during their evolution, and for effective communication to the astrophysical community, speed is crucial. This requires a system with a reliable automated trigger that can issue alerts immediately upon detection of γ-ray flares. This will be accomplished by means of a Real-Time Analysis (RTA) pipeline, a key system of the CTA observatory. The latency and sensitivity requirements of the alarm system pose a challenge because of the anticipated large data rate, between 0.5 and 8 GB/s. As a consequence, substantial efforts toward the optimization of a high-throughput computing service are envisioned. For these reasons our working group has started the development of a prototype of the Real-Time Analysis pipeline. The main goals of this prototype are to test: (i) a set of frameworks and design patterns useful for the inter-process communication between software processes running in memory; (ii) the sustainability of the foreseen CTA data rate in terms of data throughput with different hardware (e.g. accelerators) and software configurations; (iii) the reuse of non-real-time algorithms, or how much we need to simplify algorithms to be compliant with CTA requirements; and (iv) interface issues between the different CTA systems. In this work we focus on goals (i) and (ii).
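    Goal (ii), sustaining the data rate between communicating processes, can be illustrated with a minimal producer/consumer throughput benchmark. This sketch uses a plain `multiprocessing.Queue` rather than any of the frameworks the prototype actually evaluates, and the event size and count are arbitrary assumptions:

    ```python
    # Minimal inter-process throughput sketch: one producer hands fixed-size
    # "camera event" buffers to a consumer and we measure the achieved rate.
    # Not the CTA prototype itself; a toy baseline for comparison.
    import time
    from multiprocessing import Process, Queue

    def producer(q, n_events, event_size):
        payload = bytes(event_size)      # dummy serialized event
        for _ in range(n_events):
            q.put(payload)               # hand one event to the consumer
        q.put(None)                      # end-of-stream marker

    def run_benchmark(n_events=1000, event_size=64 * 1024):
        """Return (events received, throughput in MB/s) for one pair."""
        q = Queue()
        p = Process(target=producer, args=(q, n_events, event_size))
        t0 = time.perf_counter()
        p.start()
        received = 0
        while q.get() is not None:       # drain until the marker arrives
            received += 1
        p.join()
        dt = time.perf_counter() - t0
        return received, received * event_size / dt / 1e6

    if __name__ == "__main__":
        n, mbps = run_benchmark()
        print(f"{n} events, {mbps:.1f} MB/s")
    ```

    Comparing such a baseline against shared-memory or accelerator-backed transports is one way to quantify whether a given configuration can sustain the 0.5-8 GB/s regime the abstract cites.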

    Associations between depressive symptoms and disease progression in older patients with chronic kidney disease: results of the EQUAL study

    Get PDF
    Background: Depressive symptoms are associated with adverse clinical outcomes in patients with end-stage kidney disease; however, only a few small studies have examined this association in patients with earlier phases of chronic kidney disease (CKD). We studied associations between baseline depressive symptoms and clinical outcomes in older patients with advanced CKD and examined whether these associations differed by sex. Methods: CKD patients (≥65 years; estimated glomerular filtration rate ≤20 mL/min/1.73 m²) were included from a European multicentre prospective cohort between 2012 and 2019. Depressive symptoms were measured by the five-item Mental Health Inventory (cut-off ≤70; 0-100 scale). Cox proportional hazards analysis was used to study associations between depressive symptoms and time to dialysis initiation, all-cause mortality and these outcomes combined. A joint model was used to study the association between depressive symptoms and kidney function over time. Analyses were adjusted for potential baseline confounders. Results: Overall kidney function decline in the 1326 patients was -0.12 mL/min/1.73 m²/month. A total of 515 patients showed depressive symptoms. No significant association was found between depressive symptoms and kidney function over time (P = 0.08). Unlike women, men with depressive symptoms had an increased mortality rate compared with those without symptoms [adjusted hazard ratio 1.41 (95% confidence interval 1.03-1.93)]. Depressive symptoms were not significantly associated with a higher hazard of dialysis initiation, or with the combined outcome (i.e. dialysis initiation and all-cause mortality). Conclusions: There was no significant association between depressive symptoms at baseline and decline in kidney function over time in older patients with advanced CKD. Depressive symptoms at baseline were associated with a higher mortality rate in men.

    Data Acquisition System of the CLOUD Experiment at CERN

    No full text
    The Cosmics Leaving OUtdoor Droplets (CLOUD) experiment at the European Organization for Nuclear Research (CERN) is investigating the nucleation and growth of aerosol particles under atmospheric conditions and their activation into cloud droplets. The experiment comprises an ultraclean 26 m³ chamber and its associated systems (the CLOUD facility) together with a suite of around 50 advanced instruments attached to the chamber via sampling probes to analyze its contents. The set of instruments changes for each experimental campaign according to the scientific goals. The central function of the CLOUD DAQ (data acquisition) system is to combine the data from these autonomous and inhomogeneous instruments into a single, integrated CLOUD experiment database. The DAQ system needs to be highly adaptable to allow a fast setup over a single installation week at the start of each campaign when the instruments are brought to CERN and installed at the CLOUD chamber. Each campaign requires high flexibility and fast response to changes in instrument configuration or experimental parameters. The experiments require online monitoring of the physical and chemical measurements with delays of only a few seconds. In addition, the raw data, the monitoring databases, and the processed data must be archived and provided to the international collaboration for both real-time and later analyses. We will describe the various components of the CLOUD DAQ and computing infrastructure, together with the reasons for the chosen solutions
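    The central integration task described, combining autonomous instrument streams into one time-ordered database feed, can be caricatured as a timestamp merge. The instrument names, fields and values below are invented for illustration; real CLOUD records are far richer:

    ```python
    # Toy sketch of the DAQ integration step: merge per-instrument record
    # streams (each already sorted by timestamp) into one time-ordered
    # stream tagged with the source instrument.
    import heapq

    def merge_streams(streams):
        """streams: dict mapping instrument name -> list of
        (timestamp, value) pairs, each list sorted by timestamp.
        Returns one time-ordered list of (timestamp, instrument, value)."""
        tagged = [[(t, name, v) for t, v in records]
                  for name, records in streams.items()]
        return list(heapq.merge(*tagged))

    # Hypothetical readings from two instruments:
    merged = merge_streams({
        "CPC": [(0.0, 1200), (2.0, 1180)],   # particle counts
        "SO2": [(1.0, 0.4)],                 # trace-gas mixing ratio
    })
    print(merged)  # time-ordered, source-tagged records
    ```

    A real DAQ would do this merge continuously over network streams and write the result to the experiment database, but the ordering-and-tagging step is the same in spirit.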

    Impact of personalized diet and probiotic supplementation on inflammation, nutritional parameters and intestinal microbiota - The "RISTOMED project": Randomized controlled trial in healthy older people

    Get PDF
    Objectives: To assess the impact of a personalized diet, with or without the addition of the VSL#3 preparation, on biomarkers of inflammation, nutrition, oxidative stress and intestinal microbiota in 62 healthy persons aged 65-85 years. Design: Open-label, randomized, multicenter study. Primary endpoint: high-sensitivity C-reactive protein. Setting: Community. Interventions: Eight weeks of web-based dietary advice (RISTOMED platform), alone or with supplementation of VSL#3 (2 capsules per day). The RISTOMED diet was optimized to reduce inflammation and oxidative stress. Measurements: Blood and stool samples were collected on days 1 and 56. Results: Diet alone reduced ESR (p = 0.02) and plasma levels of cholesterol (p < 0.01) and glucose (p = 0.03). Addition of VSL#3 reduced ESR (p = 0.05) and improved folate (p = 0.007), vitamin B12 (p = 0.001) and homocysteine (p < 0.001) plasma levels. Neither intervention demonstrated any further effects on inflammation. Subgroup analysis identified 40 participants without signs of low-grade inflammation (hsCRP < 3 mg/l, subgroup 1) and 21 participants with low-grade inflammation at baseline (hsCRP ≥ 3 mg/l, subgroup 2). In subgroup 2, addition of VSL#3 increased bifidobacteria (p = 0.005) in more participants and improved both folate (p = 0.015) and vitamin B12 (p = 0.035) levels compared with subgroup 1. The increases were positively correlated with the change in bifidobacteria concentration for both folate (p = 0.023) and vitamin B12 (p = 0.001). As expected, the change in homocysteine correlated negatively with the changes in folate (r = -0.629, p = 0.002) and vitamin B12 (r = -0.482, p = 0.026). Conclusions: Addition of VSL#3 increased bifidobacteria and supported adequate folate and vitamin B12 concentrations in subjects with low-grade inflammation. The decrease in homocysteine with VSL#3 was clinically relevant, suggesting protective potential for aging-associated conditions, e.g. cardiovascular or neurological diseases.
