1,496 research outputs found

    Electron cloud in the CERN accelerators (PS, SPS, LHC)

    Several indicators have pointed to the presence of an Electron Cloud (EC) in some of the CERN accelerators when operating with closely spaced bunched beams. In particular, spurious signals on the pick-ups used for beam detection, pressure rise and beam instabilities were observed at the Proton Synchrotron (PS) during the last stage of preparation of the beams for the Large Hadron Collider (LHC), as well as at the Super Proton Synchrotron (SPS). Since the LHC started operation in 2009, typical electron cloud phenomena have also appeared in this machine when running with trains of closely packed bunches (i.e. with spacings below 150 ns). Besides the above-mentioned indicators, other typical signatures were seen in this machine (due to its operation mode and/or more refined detection possibilities), such as heat load in the cold dipoles, bunch-dependent emittance growth and degraded lifetime in store, and a bunch-by-bunch stable phase shift to compensate for the energy loss due to the electron cloud. An overview of the electron cloud status in the different CERN machines (PS, SPS, LHC) is presented in this paper, with special emphasis on the dangers for future operation with more intense beams and the countermeasures necessary to mitigate or suppress the effect.
    Comment: 8 pages, contribution to the Joint INFN-CERN-EuCARD-AccNet Workshop on Electron-Cloud Effects: ECLOUD'12; 5-9 Jun 2012, La Biodola, Isola d'Elba, Italy

    PyECLOUD and build-up simulations at CERN

    PyECLOUD is a newly developed code for the simulation of electron cloud (EC) build-up in particle accelerators. Almost entirely written in Python, it is mostly based on the physical models already used in the ECLOUD code but, thanks to the implementation of new optimized algorithms, it exhibits significantly improved accuracy, speed, reliability and flexibility. These new features of PyECLOUD have already been broadly exploited to study EC observations in the Large Hadron Collider (LHC) and its injector chain, as well as for the extrapolation to high-luminosity upgrade scenarios.
    Comment: 6 pages, contribution to the Joint INFN-CERN-EuCARD-AccNet Workshop on Electron-Cloud Effects: ECLOUD'12; 5-9 Jun 2012, La Biodola, Isola d'Elba, Italy
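    The multipacting mechanism such build-up codes simulate can be illustrated with a deliberately simplified sketch, not PyECLOUD's actual algorithm: each bunch passage accelerates cloud electrons into the chamber wall, and an effective secondary-emission yield above unity multiplies them until space charge limits further growth. All parameter values here (`delta_eff`, `n_seed`, `n_sat`) are illustrative assumptions.

    ```python
    # Toy electron-cloud build-up model: illustrates the multipacting
    # mechanism only; real codes track macroparticle dynamics, chamber
    # geometry and measured secondary-emission curves.

    def build_up(n_bunches, delta_eff=1.3, n_seed=1e6, n_sat=1e9):
        """Electron count after each bunch passage of a train.

        delta_eff : effective multiplication factor per bunch passage
        n_seed    : seed electrons (photoemission, residual-gas ionisation)
        n_sat     : space-charge saturation level
        """
        n = n_seed
        history = []
        for _ in range(n_bunches):
            n = min(n * delta_eff, n_sat)  # exponential growth until saturation
            history.append(n)
        return history

    densities = build_up(72)  # e.g. a 72-bunch batch
    print(f"final electron count: {densities[-1]:.2e}")
    ```

    With these numbers the cloud saturates within one batch, which is why trains of closely spaced bunches are the critical regime mentioned above.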

    Enhanced genotypability for a more accurate variant calling in targeted resequencing

    The analysis of Next-Generation Sequencing (NGS) data for the identification of DNA genetic variants presents several bioinformatics challenges. The main requirements of the analysis are accuracy and reproducibility of results, as their clinical interpretation may be influenced by many variables, from sample processing to the bioinformatics algorithms adopted. Targeted resequencing, whose aim is the enrichment of genomic regions to identify genetic variants possibly associated with clinical diseases, bases the quality of its data on the depth and uniformity of coverage, to differentiate between true and false positive findings. Many variant callers have been developed to reach the best accuracy given these metrics, but they cannot work in regions of the genome where short reads cannot align uniquely (uncallable regions). Misalignment of reads on the reference genome can arise when reads are too short to span repetitive regions, causing the software to assign a low quality score to the read pairs of the same fragment. A limitation of this process is that variant callers are unable to call variants in these regions unless the quality of one of the two read mates can be increased. Moreover, current metrics cannot define these regions accurately and fail to report this information to the end user. For this reason, a more accurate metric is needed to clearly report the uncallable genomic regions, with the prospect of improving the data analysis so that they can be investigated. This work aimed to improve the callability (genotypability) of the target regions for a more accurate data analysis and to provide high-quality variant calling. Different experiments were conducted to prove the relevance of genotypability for the evaluation of targeted resequencing performance.
Firstly, this metric showed that increasing the depth of sequencing to rescue variants is not necessary at thresholds where genotypability reaches saturation (70X). To improve this metric and to evaluate the accuracy and reproducibility of results on different enrichment technologies for WES sample processing, genotypability was evaluated on four exome platforms using three different DNA fragment lengths (short: ~200 bp, medium: ~350 bp, long: ~500 bp). Results showed that mapping quality could be successfully increased on all platforms by extending the fragment, hence increasing the distance between the read pairs. The genotypability of many genes, including several associated with a clinical phenotype, improved substantially. Moreover, longer libraries increased uniformity of coverage for platforms that had not been completely optimized for short fragments, further improving their genotypability. Given the relevance of the quality of the data obtained, especially from the extension of short fragments to medium ones, a deeper investigation was performed to identify a potential threshold of fragment length above which the improvement in genotypability was significant. On the enrichment platform producing the highest enrichment uniformity (Twist), fragments above 230 bp yielded a meaningful improvement in genotypability (almost 1%) and a high uniformity of coverage of the target. Interestingly, the extension of the DNA fragment showed a greater influence on genotypability than uniformity of coverage alone. The enhancement of genotypability for a more accurate bioinformatics analysis of the target regions enabled, at limited cost (less sequencing), the investigation of regions of the genome previously defined as uncallable by current NGS methodologies.
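The idea of genotypability as a per-position metric can be sketched as the fraction of target positions where both read depth and mapping quality clear a caller's thresholds; deep but poorly mapped positions (e.g. in repeats) remain uncallable. The thresholds below (`min_dp=10`, `min_mapq=20`) are illustrative assumptions, not the thesis's actual cut-offs.

```python
# Hedged sketch of a genotypability calculation: a position counts as
# callable only if it is covered deeply enough AND its reads map with
# sufficient confidence (low MAPQ marks ambiguous, repeat-like regions).

def genotypability(depths, mapqs, min_dp=10, min_mapq=20):
    """Fraction of target positions that a variant caller could genotype.

    depths : per-position read depth over the target
    mapqs  : per-position mean mapping quality
    """
    callable_pos = sum(
        1 for dp, mq in zip(depths, mapqs) if dp >= min_dp and mq >= min_mapq
    )
    return callable_pos / len(depths)

# Positions 3 and 4 are deep but poorly mapped (e.g. a repeat): uncallable;
# position 6 is well mapped but too shallow.
depths = [35, 42, 50, 48, 12, 8]
mapqs = [60, 60, 0, 5, 60, 60]
print(genotypability(depths, mapqs))  # 3 of 6 positions callable -> 0.5
```

This also shows why longer fragments help: raising the mapping quality of repeat-spanning positions converts them to callable without any extra depth.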

    Mass Spectrometric Proteomics

    As suggested by the title of this Special Issue, liquid chromatography-mass spectrometry plays a pivotal role in the field of proteomics. Indeed, the research and review articles published in the Issue clearly show how the data produced by this sophisticated methodology may promote impressive advancements in this area. Among the topics discussed in the Issue, a few concern the development of new procedures for optimizing the experimental conditions to be applied for the identification of proteins present in complex mixtures. Other applications described in these articles show the huge potential of these strategies in the protein profiling of organs, and range from the study of post-translational tissue modifications to the investigation of the molecular mechanisms behind human disorders and the identification of potential biomarkers of these diseases.

    Hybrid Design Lab as a space for discussion and exchange


    Benchmarking headtail with electron cloud instabilities observed in the LHC

    After a successful scrubbing run at the beginning of 2011, the LHC can presently be operated with high-intensity proton beams with 50 ns bunch spacing. However, strong electron cloud effects were observed during machine studies with the nominal beam with 25 ns bunch spacing. In particular, fast transverse instabilities were observed when attempting to inject trains of 48 bunches into the LHC for the first time. An analysis of the turn-by-turn, bunch-by-bunch data from the transverse damper pick-ups during these injection studies is presented, showing a clear signature of the electron cloud effect. These experimental observations are reproduced using numerical simulations: the electron distribution before each bunch passage is generated with PyECLOUD and used as input for a set of HEADTAIL simulations. This paper describes the simulation method as well as the sensitivity of the results to the initial conditions of the electron build-up. The potential of this type of simulation, as well as its clear limitations, is discussed.
    Comment: 7 pages, contribution to the Joint INFN-CERN-EuCARD-AccNet Workshop on Electron-Cloud Effects: ECLOUD'12; 5-9 Jun 2012, La Biodola, Isola d'Elba, Italy
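    The two-stage workflow described in the abstract, a build-up code producing the electron distribution before each bunch passage, which then drives a tracking code, can be sketched as a simple coupling loop. The function names and toy models below are placeholders, not the actual PyECLOUD or HEADTAIL APIs.

    ```python
    # Minimal sketch of coupling a build-up simulation to a tracking
    # simulation, bunch passage by bunch passage. Real codes exchange full
    # macroparticle distributions, not a single density number.

    def simulate_train(n_bunches, generate_ecloud, track_bunch, bunch_state):
        """Advance a bunch train through successive electron-cloud kicks."""
        for i in range(n_bunches):
            e_dist = generate_ecloud(i)  # cloud state before passage i (build-up stage)
            bunch_state = track_bunch(bunch_state, e_dist)  # tracking stage
        return bunch_state

    # Toy stand-ins: the cloud density grows along the train, and each
    # passage inflates the bunch's transverse emittance proportionally.
    rho = lambda i: 1e11 * 1.1 ** i
    grow = lambda eps, e: eps * (1 + 1e-13 * e)
    print(simulate_train(48, rho, grow, 1.0))
    ```

    Separating the two stages like this is also what makes the results sensitive to the build-up initial conditions, as the abstract notes: whatever `generate_ecloud` assumes is inherited by every tracking step.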

    Lynn University: 60 Years in 30 Minutes

    Lynn University celebrates 60 years since its founding as Marymount College in 1962. Join Amy Filiatreau, Lynn University Library Director, and Lea Iadarola, Archivist & Records Manager, as they take you through how Lynn transformed from a two-year all-female Catholic school into one of the most innovative and diverse universities in America.

    You Have a Milestone to Celebrate. Now What?

    The 2019-2020 academic year marked the Lynn University Conservatory of Music's 20th anniversary. The archivist and music librarian combined their individual strengths to create a special tribute to this milestone. Our contribution took two formats: a documentary-style video and a historical timeline. The idea of producing a video was born from the idea of conducting an oral history project. We hoped to produce a commemorative video that told the unique story of the Conservatory's inception and evolution over two decades, and the characters involved. But first, we had to secure a budget to enlist two film studies students to help us shoot the interviews and edit the video. Then, we interviewed the past and current presidents of the university, deans of the Conservatory, and faculty who were essential in developing the two pillars of the program: the chamber music and orchestra programs. And, because you can't have a video about a conservatory of music without music, we commissioned a musical work from a composition student to use in the background. Because the Archives is home to diverse historical artifacts, such as news clippings and archival photos, we utilized them to create an interactive historical timeline to visually show the Conservatory's development. We chose Knight Lab's TimelineJS, as it was easy to use, simple to integrate into our institutional repository, and visually appealing. These efforts later set the stage for the university-wide celebration and promotion of the anniversary. We consulted with the Marketing department to adhere to university branding guidelines. The video and timeline were showcased on social media, such as YouTube, and shown at concerts and fundraising events. In this presentation, we will discuss how, with a limited budget and staff, we conducted oral history interviews with the idea of using segments to produce a commemorative video. We will also describe the steps we took to create a digital, interactive historical timeline using an open-source tool. Participants will learn the basics of producing a video based on interviews and how to lead a multi-departmental anniversary project.