Pseudorapidity Distribution of Charged Particles in \bar{p}p Collisions at \sqrt{s} = 630 GeV
Using a silicon vertex detector, we measure the charged-particle pseudorapidity distribution over the range 1.5 to 5.5 using data collected from \bar{p}p collisions at \sqrt{s} = 630 GeV. With a data sample of 3 million events, we deduce a result with an overall normalization uncertainty of 5% and typical bin-to-bin errors of a few percent. We compare our result to the measurement of UA5 and to the distribution generated by the Lund Monte Carlo with default settings. This is only the second measurement at this level of precision, and only the second measurement for pseudorapidity greater than 3.
Comment: 9 pages, 5 figures, LaTeX format. For ps file see http://hep1.physics.wayne.edu/harr/harr.html. Submitted to Physics Letters
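For reference, pseudorapidity is defined from a track's polar angle \theta (relative to the beam axis) as \eta = -ln tan(\theta/2). A minimal sketch of that definition (the helper name is ours, not from the paper):

```python
import math

def pseudorapidity(theta):
    """eta = -ln(tan(theta/2)), for polar angle theta in radians."""
    return -math.log(math.tan(theta / 2.0))

# A track perpendicular to the beam axis has eta ~ 0; the measured range
# 1.5 <= eta <= 5.5 corresponds to increasingly forward (small-angle) tracks.
print(pseudorapidity(math.pi / 2))  # ~0
```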
On the hierarchical classification of G Protein-Coupled Receptors
Motivation: G protein-coupled receptors (GPCRs) play an important role in many physiological systems by transducing an extracellular signal into an intracellular response. Over 50% of all marketed drugs are targeted towards a GPCR. There is considerable interest in developing an algorithm that could effectively predict the function of a GPCR from its primary sequence. Such an algorithm is useful not only in identifying novel GPCR sequences but in characterizing the interrelationships between known GPCRs.
Results: An alignment-free approach to GPCR classification has been developed using techniques drawn from data mining and proteochemometrics. A dataset of over 8000 sequences was constructed to train the algorithm. This represents one of the largest GPCR datasets currently available. A predictive algorithm was developed based upon the simplest reasonable numerical representation of the protein's physicochemical properties. A selective top-down approach was developed, which used a hierarchical classifier to assign sequences to subdivisions within the GPCR hierarchy. The predictive performance of the algorithm was assessed against several standard data mining classifiers and further validated against Support Vector Machine-based GPCR prediction servers. The selective top-down approach achieves significantly higher accuracy than standard data mining methods in almost all cases.
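The top-down idea above can be sketched in a few lines: each internal node of the class hierarchy holds its own classifier, and a sequence, encoded as a numeric physicochemical descriptor vector, is routed level by level until a leaf label is reached. Everything below (the nearest-centroid node classifier, the family names, and the two-dimensional toy "descriptors") is illustrative, not the paper's actual dataset or learner:

```python
import numpy as np

rng = np.random.default_rng(0)

class NearestCentroid:
    """Toy per-node classifier: assign to the closest class centroid."""
    def fit(self, X, y):
        self.centroids = {c: X[np.array(y) == c].mean(axis=0) for c in set(y)}
        return self
    def predict(self, x):
        return min(self.centroids, key=lambda c: np.linalg.norm(x - self.centroids[c]))

# Hypothetical descriptor vectors: two families, FamilyA with two subfamilies.
X_a1 = rng.normal([4, 0], 0.3, (20, 2))
X_a2 = rng.normal([4, 4], 0.3, (20, 2))
X_b  = rng.normal([-4, 0], 0.3, (20, 2))

# One classifier per internal node of the hierarchy.
root = NearestCentroid().fit(np.vstack([X_a1, X_a2, X_b]),
                             ["FamilyA"] * 40 + ["FamilyB"] * 20)
family_a = NearestCentroid().fit(np.vstack([X_a1, X_a2]),
                                 ["SubA1"] * 20 + ["SubA2"] * 20)
classifiers = {"root": root, "FamilyA": family_a}

def predict_top_down(x, node="root"):
    """Descend the hierarchy, choosing one child per level, until a leaf."""
    while node in classifiers:
        node = classifiers[node].predict(x)
    return node

print(predict_top_down(np.array([4.1, 3.9])))  # routed root -> FamilyA -> SubA2
```

The selective aspect of the paper's approach (training each node only on the sequences relevant to it) corresponds here to `family_a` being fit on FamilyA members alone.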
Accumulation of driver and passenger mutations during tumor progression
Major efforts to sequence cancer genomes are now occurring throughout the
world. Though the emerging data from these studies are illuminating, their
reconciliation with epidemiologic and clinical observations poses a major
challenge. In the current study, we provide a novel mathematical model that
begins to address this challenge. We model tumors as a discrete time branching
process that starts with a single driver mutation and proceeds as each new
driver mutation leads to a slightly increased rate of clonal expansion. Using
the model, we observe tremendous variation in the rate of tumor development -
providing an understanding of the heterogeneity in tumor sizes and development
times that have been observed by epidemiologists and clinicians. Furthermore,
the model provides a simple formula for the number of driver mutations as a
function of the total number of mutations in the tumor. Finally, when applied
to recent experimental data, the model allows us to calculate, for the first
time, the actual selective advantage provided by typical somatic mutations in
human tumors in situ. This selective advantage is surprisingly small, 0.005 +-
0.0005, and has major implications for experimental cancer research.
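The branching-process picture can be sketched as a simple simulation. The parameters below (mutation probability, starting clone size, generation count) are illustrative and are not the paper's calibration; only the order of magnitude of the selective advantage s is taken from the abstract. Each generation, a cell either divides into two daughters or dies, and each additional driver mutation slightly tilts the odds toward division:

```python
import random

random.seed(1)

s = 0.005  # selective advantage per driver (order of magnitude from the abstract)
u = 1e-2   # driver mutation probability per daughter cell (illustrative only)

def step(pop):
    """pop maps driver count -> number of cells; advance one generation.
    A cell with k drivers divides with probability (1+s)^k / ((1+s)^k + 1)
    (slightly above 1/2), otherwise it dies."""
    new = {}
    for k, n in pop.items():
        p_divide = (1 + s) ** k / ((1 + s) ** k + 1)
        for _ in range(n):
            if random.random() < p_divide:
                for _ in range(2):  # two daughters; each may gain a new driver
                    kk = k + (random.random() < u)
                    new[kk] = new.get(kk, 0) + 1
    return new

pop = {1: 1000}  # start from a clone carrying a single driver mutation
for _ in range(50):
    pop = step(pop)
print(sum(pop.values()), max(pop))  # total cells, most drivers in any cell
```

Because each driver changes the division probability only slightly, growth is slow and highly variable across runs, which is the heterogeneity in tumor development times the abstract refers to.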
Exact solution of a two-type branching process: Clone size distribution in cell division kinetics
We study a two-type branching process which provides excellent description of
experimental data on cell dynamics in skin tissue (Clayton et al., 2007). The
model involves only a single type of progenitor cell, and does not require
support from a self-renewed population of stem cells. The progenitor cells
divide and may differentiate into post-mitotic cells. We derive an exact
solution of this model in terms of generating functions for the total number of
cells, and for the number of cells of different types. We also deduce large
time asymptotic behaviors drawing on our exact results, and on an independent
diffusion approximation.
Comment: 16 pages
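The two-type process can be sketched as a simulation: a progenitor cell P divides into PP, PD, or DD, where D is a post-mitotic differentiated cell. The probabilities below are illustrative (chosen so the progenitor population is critical, i.e. conserved in the mean), not the values fit to the Clayton et al. data:

```python
import random

random.seed(2)

p_pp, p_pd, p_dd = 0.25, 0.50, 0.25  # P -> PP, P -> PD, P -> DD (illustrative)

def clone_size(generations=20):
    """Return (progenitors, differentiated) counts of one clone after
    the given number of synchronous division rounds."""
    prog, diff = 1, 0
    for _ in range(generations):
        new_prog = 0
        for _ in range(prog):
            r = random.random()
            if r < p_pp:
                new_prog += 2          # symmetric renewal
            elif r < p_pp + p_pd:
                new_prog += 1          # asymmetric division
                diff += 1
            else:
                diff += 2              # symmetric differentiation
        prog = new_prog
        if prog == 0:                  # clone has lost all progenitors
            break
    return prog, diff

sizes = [sum(clone_size()) for _ in range(2000)]
print(sum(sizes) / len(sizes))  # mean total clone size
```

The exact generating-function solution in the paper describes the full distribution of these clone sizes; the simulation above only samples from it.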
Atomic mass dependence of \Xi^- and \overline{\Xi}^+ production in central 250 GeV \pi^- nucleon interactions
We present the first measurement of the atomic mass dependence of central
\Xi^- and \overline{\Xi}^+ production. It is measured using a sample of 22,459
\Xi^-'s and \overline{\Xi}^+'s produced in collisions between a 250 GeV \pi^-
beam and targets of beryllium, aluminum, copper, and tungsten. The relative
cross sections are fit to the two-parameter function \sigma_0 A^\alpha, where A
is the atomic mass. We measure \alpha = 0.924+-0.020+-0.025, for Feynman-x in
the range -0.09 < x_F < 0.15.
Comment: 10 pages, revtex, 2 figures, submitted to Phys. Rev.
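A fit of the form \sigma = \sigma_0 A^\alpha is a straight line in log-log space. The sketch below uses synthetic cross sections (generated with \alpha = 0.924 and an arbitrary \sigma_0 = 2.0, not the experiment's data) for the four target materials named in the abstract:

```python
import numpy as np

# Atomic masses of the targets: beryllium, aluminum, copper, tungsten.
A = np.array([9.012, 26.98, 63.55, 183.84])
sigma = 2.0 * A ** 0.924  # synthetic "measurements" with sigma0 = 2.0

# log(sigma) = log(sigma0) + alpha * log(A): a linear fit recovers both.
alpha, log_sigma0 = np.polyfit(np.log(A), np.log(sigma), 1)
print(alpha, np.exp(log_sigma0))  # recovers 0.924 and 2.0
```

With real data one would weight the fit by the measurement errors, which is where the quoted statistical uncertainty on \alpha comes from.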
Interpol: An R package for preprocessing of protein sequences
Background
Most machine learning techniques currently applied in the literature require a fixed dimensionality of input data. However, this requirement is frequently violated by real input data, such as DNA and protein sequences, which often differ in length due to insertions and deletions. It is also notable that performance in classification and regression is often improved by numerical encoding of amino acids, compared to the commonly used sparse encoding.
Results
The software "Interpol" encodes amino acid sequences as numerical descriptor vectors using a database of currently 532 descriptors (mainly from AAindex), and normalizes sequences to uniform length with one of five linear or non-linear interpolation algorithms. Interpol is distributed as an open-source, platform-independent R package. It is typically used for preprocessing of amino acid sequences for classification or regression.
Conclusions
The functionality of Interpol widens the spectrum of machine learning methods that can be applied to biological sequences, and it will in many cases improve their performance in classification and regression.
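The core preprocessing idea can be sketched outside R as well. Below, each amino acid is mapped to a single number on a made-up hydrophobicity-like scale (a stand-in for one AAindex descriptor, not real AAindex values), and the resulting signal is linearly interpolated onto a fixed length, so sequences of different lengths become comparable feature vectors:

```python
import numpy as np

# Toy per-residue descriptor values (illustrative, not from AAindex).
toy_scale = {"A": 1.8, "G": -0.4, "L": 3.8, "K": -3.9, "S": -0.8}

def encode_fixed_length(seq, n=10):
    """Encode a sequence numerically, then resample it to length n
    by linear interpolation over a normalized [0, 1] position axis."""
    values = np.array([toy_scale[aa] for aa in seq])
    old_x = np.linspace(0, 1, len(values))
    new_x = np.linspace(0, 1, n)
    return np.interp(new_x, old_x, values)

v1 = encode_fixed_length("AGLKS")       # length 5  -> 10
v2 = encode_fixed_length("AAGGLLKKSS")  # length 10 -> 10
print(v1.shape, v2.shape)               # both (10,)
```

Interpol additionally offers non-linear interpolation schemes and hundreds of descriptors; the sketch shows only the simplest (linear, single-descriptor) case.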
Identifying Mendelian disease genes with the Variant Effect Scoring Tool
Background
Whole exome sequencing studies identify hundreds to thousands of rare protein coding variants of ambiguous significance for human health. Computational tools are needed to accelerate the identification of specific variants and genes that contribute to human disease.
Results
We have developed the Variant Effect Scoring Tool (VEST), a supervised machine learning-based classifier, to prioritize rare missense variants with likely involvement in human disease. The VEST classifier training set comprised ~45,000 disease mutations from the latest Human Gene Mutation Database release and another ~45,000 high-frequency (allele frequency > 1%) putatively neutral missense variants from the Exome Sequencing Project. VEST outperforms some of the most popular methods for prioritizing missense variants in carefully designed holdout benchmarking experiments (VEST ROC AUC = 0.91, PolyPhen2 ROC AUC = 0.86, SIFT4.0 ROC AUC = 0.84). VEST estimates variant score p-values against a null distribution of VEST scores for neutral variants not included in the VEST training set. These p-values can be aggregated at the gene level across multiple disease exomes to rank genes for probable disease involvement. We tested the ability of an aggregate VEST gene score to identify candidate Mendelian disease genes, based on whole-exome sequencing of a small number of disease cases. We used whole-exome data for two Mendelian disorders for which the causal gene is known. Considering only genes that contained variants in all cases, the VEST gene score ranked dihydroorotate dehydrogenase (DHODH) number 2 of 2253 genes in four cases of Miller syndrome, and myosin-3 (MYH3) number 2 of 2313 genes in three cases of Freeman-Sheldon syndrome.
Conclusions
Our results demonstrate the potential power gain of aggregating bioinformatics variant scores into gene-level scores and the general utility of bioinformatics in assisting the search for disease genes in large-scale exome sequencing studies.
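One standard way to aggregate per-variant p-values into a gene-level score is Fisher's method, used below purely as an illustration; the paper's actual aggregation statistic may differ, and the input p-values are made up:

```python
import math

def fisher_combine(pvals):
    """Fisher's method: X = -2 * sum(ln p) ~ chi^2 with 2k degrees of freedom.
    For integer k the chi^2 survival function has a closed form:
    P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!"""
    x = -2.0 * sum(math.log(p) for p in pvals)
    k = len(pvals)
    return math.exp(-x / 2) * sum((x / 2) ** i / math.factorial(i) for i in range(k))

# Hypothetical per-case variant p-values for three genes across three exomes.
gene_pvals = {
    "GENE_A": [0.001, 0.004, 0.002],  # consistently small across cases
    "GENE_B": [0.30, 0.80, 0.50],
    "GENE_C": [0.05, 0.60, 0.10],
}
ranked = sorted(gene_pvals, key=lambda g: fisher_combine(gene_pvals[g]))
print(ranked)  # GENE_A ranks first
```

A gene whose variants score small p-values in every affected case, like DHODH in the Miller syndrome exomes, rises to the top of such a ranking even when no single variant is conclusive on its own.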
Construction and Performance of Large-Area Triple-GEM Prototypes for Future Upgrades of the CMS Forward Muon System
At present, part of the forward RPC muon system of the CMS detector at the
CERN LHC remains uninstrumented in the high-\eta region. An international
collaboration is investigating the possibility of covering the 1.6 < |\eta| <
2.4 region of the muon endcaps with large-area triple-GEM detectors. Given
their good spatial resolution, high rate capability, and radiation hardness,
these micro-pattern gas detectors are an appealing option for simultaneously
enhancing muon tracking and triggering capabilities in a future upgrade of the
CMS detector. A general overview of this feasibility study will be presented.
The design and construction of small (10\times10 cm2) and full-size trapezoidal
(1\times0.5 m2) triple-GEM prototypes will be described. During detector
assembly, different techniques for stretching the GEM foils were tested.
Results from measurements with x-rays and from test beam campaigns at the CERN
SPS will be shown for the small and large prototypes. Preliminary simulation
studies on the expected muon reconstruction and trigger performances of this
proposed upgraded muon system will be reported.
Comment: 7 pages, 25 figures, submitted for publication in conference record of the 2011 IEEE Nuclear Science Symposium, Valencia, Spain
An overview of the design, construction and performance of large area triple-GEM prototypes for future upgrades of the CMS forward muon system
GEM detectors are used in high energy physics experiments given their good spatial resolution, high rate capability and radiation hardness. An international collaboration is investigating the possibility of covering the 1.6 < |\eta| < 2.4 region of the CMS muon endcaps with large-area triple-GEM detectors. The CMS high-\eta area is currently not fully instrumented; only Cathode Strip Chambers (CSCs) are installed. The vacant area presents an opportunity for a detector technology able to cope with the harsh radiation environment; these micropattern gas detectors are an appealing option to simultaneously enhance muon tracking and triggering capabilities in a future upgrade of the CMS detector. A general overview of this feasibility study is presented. Design and construction of small (10cm x 10cm) and full-size trapezoidal (1m x 0.5m) triple-GEM prototypes is described. Results from measurements with x-rays and from test beam campaigns at the CERN SPS are shown for the small and large prototypes. Preliminary simulation studies on the expected muon reconstruction and trigger performances of this proposed upgraded muon system are reported.
A Search for Dark Higgs Bosons
Recent astrophysical and terrestrial experiments have motivated the proposal
of a dark sector with GeV-scale gauge boson force carriers and new Higgs
bosons. We present a search for a dark Higgs boson using 516 fb-1 of data
collected with the BABAR detector. We do not observe a significant signal and
we set 90% confidence level upper limits on the product of the Standard
Model-dark sector mixing angle and the dark sector coupling constant.
Comment: 7 pages, 5 postscript figures, published version with improved plots for b/w printing
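For a flavor of how a 90% confidence level upper limit arises in a counting experiment, the sketch below computes the classical frequentist Poisson upper limit: for n observed events (with no background, an idealization that does not reflect the BABAR analysis), the limit \mu solves P(k <= n | \mu) = 0.10:

```python
import math

def poisson_cdf(n, mu):
    """P(k <= n) for a Poisson with mean mu."""
    return math.exp(-mu) * sum(mu ** k / math.factorial(k) for k in range(n + 1))

def upper_limit_90(n, lo=0.0, hi=50.0):
    """Bisect for the mean mu at which observing <= n events
    has probability exactly 0.10 (the 90% CL upper limit)."""
    for _ in range(100):
        mid = (lo + hi) / 2
        if poisson_cdf(n, mid) > 0.10:
            lo = mid  # cdf too large -> mu must be bigger
        else:
            hi = mid
    return (lo + hi) / 2

print(round(upper_limit_90(0), 3))  # -ln(0.10) ~ 2.303 expected events
```

Real searches like this one fold in backgrounds, efficiencies, and systematic uncertainties before translating an event-count limit into a limit on the mixing-angle-times-coupling product.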