Electron cloud in the CERN accelerators (PS, SPS, LHC)
Several indicators have pointed to the presence of an Electron Cloud (EC) in
some of the CERN accelerators, when operating with closely spaced bunched
beams. In particular, spurious signals on the pick-ups used for beam detection,
pressure rise and beam instabilities were observed at the Proton Synchrotron
(PS) during the last stage of preparation of the beams for the Large Hadron
Collider (LHC), as well as at the Super Proton Synchrotron (SPS). Since the LHC
started operation in 2009, typical electron cloud phenomena have also appeared
in this machine when running with trains of closely packed bunches (i.e. with
spacings below 150 ns). Besides the above-mentioned indicators, other typical
signatures were seen in this machine (due to its operation mode and/or more
refined detection possibilities), such as heat load in the cold dipoles,
bunch-dependent emittance growth, degraded lifetime in store, and a
bunch-by-bunch stable phase shift to compensate for the energy loss due to the
electron cloud. An overview of the electron cloud status in the different CERN
machines (PS, SPS, LHC) will be presented in this paper, with a special
emphasis on the dangers for future operation with more intense beams and the
necessary countermeasures to mitigate or suppress the effect.
Comment: 8 pages, contribution to the Joint INFN-CERN-EuCARD-AccNet Workshop on Electron-Cloud Effects: ECLOUD'12; 5-9 Jun 2012, La Biodola, Isola d'Elba, Italy
PyECLOUD and build-up simulations at CERN
PyECLOUD is a newly developed code for the simulation of the electron cloud
(EC) build-up in particle accelerators. Almost entirely written in Python, it
is mostly based on the physical models already used in the ECLOUD code but,
thanks to the implementation of new optimized algorithms, it exhibits
significantly improved accuracy, speed, reliability and flexibility. These new
features of PyECLOUD have already been broadly exploited to study EC
observations in the Large Hadron Collider (LHC) and its injector chain, as well
as for extrapolations to high-luminosity upgrade scenarios.
Comment: 6 pages, contribution to the Joint INFN-CERN-EuCARD-AccNet Workshop on Electron-Cloud Effects: ECLOUD'12; 5-9 Jun 2012, La Biodola, Isola d'Elba, Italy
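To illustrate the kind of bunch-by-bunch build-up behaviour that such codes simulate, the following minimal Python sketch implements a highly simplified zero-dimensional multiplication map: an effective gain per bunch passage driven by the secondary emission yield (SEY), seeded by primary electrons and limited by a saturation level. All names and parameter values are illustrative placeholders and do not correspond to the actual PyECLOUD models or interfaces.

import numpy as np

def build_up(n_bunches, sey=1.5, sey_threshold=1.3, gain_per_unit_sey=0.8,
             primaries_per_passage=1e6, saturation=1e9):
    """Return the electron population after each bunch passage.

    Toy 0-D model: all parameters are illustrative placeholders,
    not measured machine values and not the PyECLOUD physics.
    """
    # Effective multiplication per bunch passage; > 1 above the multipacting threshold.
    gain = 1.0 + gain_per_unit_sey * (sey - sey_threshold)
    n = 0.0
    history = []
    for _ in range(n_bunches):
        n = n * gain + primaries_per_passage   # secondaries plus newly seeded primaries
        n = min(n, saturation)                 # electron space charge limits the growth
        history.append(n)
    return np.array(history)

if __name__ == "__main__":
    for sey in (1.1, 1.3, 1.5):
        n_final = build_up(72, sey=sey)[-1]
        print(f"SEY = {sey}: electrons after 72 bunch passages ~ {n_final:.2e}")

Running the sketch shows the qualitative behaviour discussed in the build-up literature: below the SEY threshold the population settles at a low equilibrium set by the primary electrons, while above it the cloud grows along the bunch train until saturation.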
Synchronous Phase Shift at LHC
The electron cloud in vacuum pipes of accelerators of positively charged
particle beams causes a beam energy loss that can be estimated from the
synchronous phase. Measurements with beams of 75 ns, 50 ns, and 25 ns
bunch spacing in the LHC for some fills in 2010 and 2011 show that the average
energy loss depends on the total beam intensity in the ring. Later measurements
during the scrubbing run with 50 ns beams show the reduction of the electron
cloud due to scrubbing. Finally, measurements of the individual bunch phase
give us information about the electron cloud build-up inside the batch and from
batch to batch.
Comment: Presented at ECLOUD'12: Joint INFN-CERN-EuCARD-AccNet Workshop on Electron-Cloud Effects, La Biodola, Isola d'Elba, Italy, 5-9 June 2012
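For context, a commonly used relation connects the measured synchronous-phase shift to the beam energy loss, assuming a single-harmonic RF system and a small shift. The symbols below (RF voltage V_RF, unperturbed synchronous phase phi_s0, revolution frequency f_rev, bunch population N_b, number of bunches n_b) are standard accelerator quantities introduced here for illustration; they are not defined in the abstract.

% Hedged sketch (not taken from the paper): energy loss inferred from the
% synchronous-phase shift, single-harmonic RF, small-shift approximation.
\begin{align}
  U_{\mathrm{loss}} &= e\,V_{\mathrm{RF}}\,\sin\phi_s
      && \text{(energy lost per proton per turn, restored by the RF)}\\
  \Delta U_{\mathrm{EC}} &\simeq e\,V_{\mathrm{RF}}\,\cos\phi_{s,0}\,\Delta\phi_s
      && \text{(extra loss corresponding to a measured shift } \Delta\phi_s\text{)}\\
  P_{\mathrm{EC}} &\simeq f_{\mathrm{rev}}\, n_b\, N_b\, \Delta U_{\mathrm{EC}}
      && \text{(total power deposited by } n_b \text{ bunches of } N_b \text{ protons)}
\end{align}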
Benchmarking HEADTAIL with electron cloud instabilities observed in the LHC
After a successful scrubbing run at the beginning of 2011, the LHC can
presently be operated with high-intensity proton beams with 50 ns bunch spacing.
However, strong electron cloud effects were observed during machine studies
with the nominal beam with 25 ns bunch spacing. In particular, fast transverse
instabilities were observed when attempting to inject trains of 48 bunches into
the LHC for the first time. An analysis of the turn-by-turn, bunch-by-bunch data
from the transverse damper pick-ups during these injection studies is
presented, showing a clear signature of the electron cloud effect. These
experimental observations are reproduced using numerical simulations: the
electron distribution before each bunch passage is generated with PyECLOUD and
used as input for a set of HEADTAIL simulations. This paper describes the
simulation method as well as the sensitivity of the results to the initial
conditions for the electron build-up. The potential of this type of simulation,
as well as its clear limitations, is discussed.
Comment: 7 pages, contribution to the Joint INFN-CERN-EuCARD-AccNet Workshop on Electron-Cloud Effects: ECLOUD'12; 5-9 Jun 2012, La Biodola, Isola d'Elba, Italy
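The two-stage workflow described above (a build-up code produces the electron distribution seen by each bunch, which then seeds an instability simulation for that bunch) can be sketched in Python as below. The function names and the physics inside them are deliberately oversimplified placeholders, not the real PyECLOUD or HEADTAIL interfaces.

import numpy as np

def simulate_build_up(n_bunches, rng):
    """Placeholder for the build-up stage: return one electron (x, y) cloud
    per bunch passage. A real build-up code would track macroparticles in the
    beam and chamber fields, with secondary emission at the walls."""
    clouds = []
    for k in range(n_bunches):
        growth = 1.0 - np.exp(-k / 10.0)                 # cloud grows along the train
        n_electrons = int(100 + 900 * growth)
        clouds.append(rng.normal(scale=1e-3, size=(n_electrons, 2)))  # metres
    return clouds

def simulate_instability(cloud, n_turns=100):
    """Placeholder for the instability stage: return a crude activity figure
    proportional to the electron density sampled by the bunch."""
    return len(cloud) * n_turns * 1e-6

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clouds = simulate_build_up(n_bunches=48, rng=rng)      # stage 1: build-up
    activity = [simulate_instability(c) for c in clouds]   # stage 2: per-bunch dynamics
    print("bunch with largest simulated activity:", int(np.argmax(activity)))

The point of the sketch is only the data flow: the output of the build-up stage is evaluated bunch by bunch and handed to the instability stage as its initial condition, which is why the simulated instability grows towards the tail of the injected train.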
Triple peptide vaccination as consolidation treatment in women affected by ovarian and breast cancer: clinical and immunological data of a phase I/II clinical trial
Vaccination with priming and expansion of tumour
reacting T cells is an important therapeutic option to be used
in combination with novel checkpoint inhibitors to increase
the specificity of the T cell infiltrate and the efficacy of the
treatment. In this phase I/II study, 14 high-risk disease-free
ovarian (OC) and breast cancer (BC) patients after completion
of standard therapies were vaccinated with MUC1, ErbB2
and carcinoembryonic antigen (CEA) HLA-A2+-restricted
peptides and Montanide. Patients received six doses of vaccine, one every two
weeks, and a recall dose after 3 months. ECOG grade 2 toxicity was observed at
the injection site. Eight out of 14 patients showed specific CD8+ T cells
against at least one antigen. None of the 4 patients vaccinated for
compassionate use showed CD8 activation. An OC patient who suffered a lymph
nodal recurrence showed specific anti-ErbB2 CD8+ T cells in the bulky aortic
lymph nodes, suggesting homing of the activated T cells. Results confirm that
the peptide vaccination strategy is feasible, safe and well tolerated. In
particular, OC patients appear to show a higher response rate than BC patients.
Vaccination generates a long-lasting immune response, which is strongly
enhanced by recall administrations. The clinical outcome of patients enrolled
in the trial appears favourable, with no deaths registered after a minimum
follow-up of 8 years. These promising data, in line with the results of similar
studies, together with the high patient compliance observed and the favourable
toxicity profile, support future trials of peptide vaccination in clinically
disease-free patients who have completed standard treatments.
Enhanced genotypability for a more accurate variant calling in targeted resequencing
The analysis of Next-Generation Sequencing (NGS) data for the identification of DNA genetic variants presents several bioinformatics challenges. The main requirements of the analysis are the accuracy and reproducibility of results, as their clinical interpretation may be influenced by many variables, from sample processing to the adopted bioinformatics algorithms. Targeted resequencing, whose aim is the enrichment of genomic regions to identify genetic variants possibly associated with clinical diseases, bases the quality of its data on the depth and uniformity of coverage to differentiate between true and false positive findings. Many variant callers have been developed to reach the best accuracy with respect to these metrics, but they cannot work in regions of the genome where short reads do not align uniquely (uncallable regions). Misalignment of reads on the reference genome can arise when reads are too short to span repetitive regions, causing the software to assign a low quality score to the read pairs of the same fragment. As a consequence, variant callers are unable to call variants in these regions unless the quality of one of the two read mates increases. Moreover, current metrics do not define these regions accurately and fail to report this information to the end user. For this reason, a more accurate metric is needed to clearly report the uncallable genomic regions, with the prospect of improving the data analysis so that they can be investigated.
This work aimed to improve the callability (genotypability) of the target regions for a more accurate data analysis and to provide high-quality variant calling. Different experiments were conducted to prove the relevance of genotypability for the evaluation of targeted resequencing performance. Firstly, this metric showed that increasing the depth of sequencing to rescue variants is not necessary at thresholds where genotypability reaches saturation (70X). To improve this metric and to evaluate the accuracy and reproducibility of results on different enrichment technologies for WES sample processing, genotypability was evaluated on four exome platforms using three different DNA fragment lengths (short: ~200 bp, medium: ~350 bp, long: ~500 bp). Results showed that mapping quality could successfully increase on all platforms when extending the fragment, hence increasing the distance between the read pairs. The genotypability of many genes, including several associated with a clinical phenotype, could strongly improve. Moreover, longer libraries increased the uniformity of coverage for platforms not completely optimized for short fragments, further improving their genotypability. Given the relevance of the quality of the data obtained, especially from the extension of short fragments to medium ones, a deeper investigation was performed to identify a potential threshold of fragment length above which the improvement in genotypability was significant. On the enrichment platform producing the highest enrichment uniformity (Twist), fragments above 230 bp yielded a meaningful improvement of genotypability (almost 1%) and a high uniformity of coverage of the target. Interestingly, the extension of the DNA fragment showed a greater influence on genotypability than uniformity of coverage alone.
The enhancement of genotypability for a more accurate bioinformatics analysis of the target regions enabled, at limited cost (less sequencing), the investigation of regions of the genome previously defined as uncallable by current NGS methodologies.
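As a rough illustration of what a genotypability-style metric measures, the following Python sketch uses pysam to compute the fraction of a target interval covered by at least a minimum number of reads whose mapping quality exceeds a threshold. The thresholds, file name and coordinates are hypothetical placeholders and are not the definitions or parameters used in this work.

import pysam  # assumes a coordinate-sorted, indexed BAM file

def genotypability(bam_path, contig, start, end, min_mapq=20, min_depth=10):
    """Fraction of target bases covered by >= min_depth reads with MAPQ >= min_mapq.

    Illustrative definition of a 'genotypable' base; thresholds are placeholders.
    """
    depth = [0] * (end - start)
    with pysam.AlignmentFile(bam_path, "rb") as bam:
        for read in bam.fetch(contig, start, end):
            if read.is_unmapped or read.mapping_quality < min_mapq:
                continue
            lo = max(read.reference_start, start)
            hi = min(read.reference_end or read.reference_start, end)
            for pos in range(lo, hi):
                depth[pos - start] += 1   # count only confidently mapped reads
    callable_bases = sum(d >= min_depth for d in depth)
    return callable_bases / (end - start)

if __name__ == "__main__":
    # Hypothetical sample and target interval, for illustration only.
    frac = genotypability("sample.bam", "chr1", 1_000_000, 1_010_000)
    print(f"genotypable fraction of target: {frac:.3f}")

Because reads with low mapping quality are excluded before the depth test, a region can be deeply sequenced yet still score poorly on this metric, which is the distinction between raw coverage and genotypability emphasized above.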
Mass Spectrometric Proteomics
As suggested by the title of this Special Issue, liquid chromatography-mass spectrometry plays a pivotal role in the field of proteomics. Indeed, the research and review articles published in the Issue clearly show how the data produced by this sophisticated methodology may promote impressive advancements in this area. Among the topics discussed in the Issue, a few concern the development of new procedures for the optimization of the experimental conditions to be applied for the identification of proteins present in complex mixtures. Other applications described in these articles show the huge potential of these strategies in the protein profiling of organs and tissues, ranging from the study of post-translational modifications to the investigation of the molecular mechanisms behind human disorders and the identification of potential biomarkers of these diseases.
Progress with the Upgrade of the SPS for the HL-LHC Era
The demanding beam performance requirements of the High Luminosity (HL-) LHC
project translate into a set of requirements and upgrade paths for the LHC
injector complex. In this paper the performance requirements for the SPS and
the known limitations are reviewed in the light of the 2012 operational
experience. The various SPS upgrades in progress and still under consideration
are described, in addition to the machine studies and simulations performed in
2012. The expected machine performance reach is estimated on the basis of the
present knowledge, and the remaining decisions that still need to be made
concerning upgrade options are detailed.
Comment: 3 p. Presented at the 4th International Particle Accelerator Conference (IPAC 2013)
Lynn University: 60 Years in 30 Minutes
Lynn University celebrates 60 years since it was founded as Marymount College in 1962. Join Amy Filiatreau, Lynn University Library Director, and Lea Iadarola, Archivist & Records Manager, as they take you through how Lynn transformed from a two-year all-female Catholic school into one of the most innovative and diverse universities in America.
