
    Can Doubly Strange Dibaryon Resonances be Discovered at RHIC?

    The baryon-baryon continuum invariant mass spectrum generated from relativistic nucleus-nucleus collision data may reveal the existence of doubly-strange dibaryons that are not stable against strong decay, if they lie within a few MeV of threshold. Furthermore, since the dominant component of these states is a superposition of two color-octet clusters, which can be produced intermediately in a color-deconfined quark-gluon plasma (QGP), an enhanced production of dibaryon resonances could be a signal of QGP formation. A total of eight doubly-strange dibaryon states are considered for experimental search using the STAR detector (Solenoidal Tracker at RHIC) at the new Relativistic Heavy Ion Collider (RHIC). These states may decay to Lambda-Lambda and/or proton-Cascade-minus, depending on the resonance energy. STAR's large acceptance, precision tracking and vertex reconstruction capabilities, and large data volume capacity make it an ideal instrument for such a search. Detector performance and analysis sensitivity are studied as a function of resonance production rate and width for one particular dibaryon, which can decay strongly to proton-Cascade-minus but not to Lambda-Lambda. Results indicate that such resonances may be discovered using STAR if the resonance production rates are comparable to coalescence model predictions for dibaryon bound states. Comment: 28 pages, 5 figures, revised version
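    The pair-mass search described above rests on a simple kinematic quantity: the invariant mass of a candidate daughter pair, computed from the two reconstructed 4-momenta. A minimal sketch follows; the particle masses are standard values for the proton and the Cascade-minus (Xi-), and the 3-momenta are arbitrary illustrative numbers, not STAR data.

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass of a two-particle system from lab-frame
    4-momenta (E, px, py, pz) in GeV."""
    E = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    m2 = E * E - (px * px + py * py + pz * pz)
    return math.sqrt(max(m2, 0.0))

def four_momentum(mass, px, py, pz):
    """Build an on-shell 4-momentum from mass and 3-momentum (GeV)."""
    E = math.sqrt(mass**2 + px**2 + py**2 + pz**2)
    return (E, px, py, pz)

# Hypothetical daughter pair: a proton (0.938 GeV) and a Xi- (1.322 GeV)
proton = four_momentum(0.938, 0.4, 0.1, 1.0)
xi_minus = four_momentum(1.322, -0.3, 0.2, 0.8)
m = invariant_mass(proton, xi_minus)  # pair mass in GeV
```

    A resonance would appear as an excess of such pair masses within a few MeV of the proton-Cascade-minus threshold (about 2.26 GeV), on top of the combinatorial continuum.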

    The age of data-driven proteomics : how machine learning enables novel workflows

    A lot of energy in the field of proteomics is dedicated to the application of challenging experimental workflows, which include metaproteomics, proteogenomics, data-independent acquisition (DIA), non-specific proteolysis, immunopeptidomics, and open modification searches. These workflows are all challenging because of ambiguity in the identification stage: they either expand the search space and thus increase the ambiguity of identifications, or, in the case of DIA, they generate data that is inherently more ambiguous. In this context, machine learning-based predictive models are now generating considerable excitement in the field of proteomics, because these models hold great potential to drastically reduce the ambiguity in the identification process of the above-mentioned workflows. Indeed, the field has already produced classical machine learning and deep learning models to predict almost every aspect of a liquid chromatography-mass spectrometry (LC-MS) experiment. Yet despite all the excitement, thorough integration of predictive models in these challenging LC-MS workflows is still limited, and further improvements to the modeling and validation procedures can still be made. In this viewpoint we therefore point out highly promising recent machine learning developments in proteomics, alongside some of the remaining challenges.

    Deriving High-Precision Radial Velocities

    This chapter briefly describes the key aspects behind the derivation of precise radial velocities. I start by defining radial velocity precision in the context of astrophysics in general and exoplanet searches in particular. Next I discuss the different basic elements that constitute a spectrograph, and how these elements and overall technical choices impact the derived radial velocity precision. Then I go on to discuss the different wavelength calibration and radial velocity calculation techniques, and how these are intimately related to the spectrograph's properties. I conclude by presenting some interesting examples of planets detected through radial velocity, and some of the new-generation instruments that will push the precision limit further. Comment: Lecture presented at the IVth Azores International Advanced School in Space Sciences on "Asteroseismology and Exoplanets: Listening to the Stars and Searching for New Worlds" (arXiv:1709.00645), which took place in Horta, Azores Islands, Portugal in July 201
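    At its core, a radial velocity measurement converts a tiny Doppler shift of a spectral line into a velocity via v = c · Δλ/λ_rest (valid for v << c, the regime of exoplanet work). A minimal sketch, using the sodium D2 line at 588.995 nm purely as an illustrative rest wavelength:

```python
C = 299792.458  # speed of light, km/s

def radial_velocity(lambda_obs, lambda_rest):
    """Non-relativistic Doppler estimate v = c * (lambda_obs - lambda_rest)
    / lambda_rest, in km/s, for wavelengths in any common unit."""
    return C * (lambda_obs - lambda_rest) / lambda_rest

# A Jupiter-like planet induces a stellar reflex motion of ~12.5 m/s;
# the corresponding wavelength shift of a single line is minuscule:
lambda_rest = 588.995        # nm
v = 0.0125                   # km/s
delta = lambda_rest * v / C  # shift in nm, on the order of 1e-5 nm
```

    Shifts this small are far below the width of a single line, which is why precise work averages over thousands of lines and demands exquisite wavelength calibration and spectrograph stability.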

    Virtual Astronomy, Information Technology, and the New Scientific Methodology

    All sciences, including astronomy, are now entering the era of information abundance. The exponentially increasing volume and complexity of modern data sets promise to transform the scientific practice, but also pose a number of common technological challenges. The Virtual Observatory concept is the astronomical community's response to these challenges: it aims to harness the progress in information technology in the service of astronomy, and at the same time provide a valuable testbed for information technology and applied computer science. Challenges broadly fall into two categories: data handling (or "data farming"), including issues such as archives, intelligent storage, databases, interoperability, fast networks, etc.; and data mining, data understanding, and knowledge discovery, which include issues such as automated clustering and classification, multivariate correlation searches, pattern recognition, visualization in high-dimensional parameter spaces, etc., as well as various applications of machine learning in these contexts. Such techniques are forming a methodological foundation for science with massive and complex data sets in general, and are likely to have a much broader impact on modern society, commerce, the information economy, security, etc. There is a powerful emerging synergy between computationally enabled science and science-driven computing, which will drive progress in science, scholarship, and many other venues in the 21st century.

    Evolutionary descent of prion genes from a ZIP metal ion transport ancestor

    In the more than 20 years since its discovery, both the phylogenetic origin and cellular function of the prion protein (PrP) have remained enigmatic. Insights into the function of PrP may be obtained through a characterization of its molecular neighborhood. Quantitative interactome data revealed the spatial proximity of a subset of metal ion transporters of the ZIP family to mammalian prion proteins. A subsequent bioinformatic analysis revealed the presence of a prion-like protein sequence within the N-terminal, extracellular domain of a phylogenetic branch of ZIPs. Additional structural threading and ortholog sequence alignment analyses consolidated the conclusion that the prion protein gene family is phylogenetically derived from a ZIP-like ancestor molecule. Our data explain structural and functional features found within mammalian prion proteins as elements of an ancient involvement in the transmembrane transport of divalent cations. The connection to ZIP proteins is expected to open new avenues to elucidate the biology of the prion protein in health and disease.

    Exploration of Parameter Spaces in a Virtual Observatory

    Like every other field of intellectual endeavor, astronomy is being revolutionised by the advances in information technology. There is an ongoing exponential growth in the volume, quality, and complexity of astronomical data sets, mainly through large digital sky surveys and archives. The Virtual Observatory (VO) concept represents a scientific and technological framework needed to cope with this data flood. Systematic exploration of the observable parameter spaces, covered by large digital sky surveys spanning a range of wavelengths, will be one of the primary modes of research with a VO. This is where the truly new discoveries will be made, and new insights gained about the already known astronomical objects and phenomena. We review some of the methodological challenges posed by the analysis of large and complex data sets expected in VO-based research. The challenges are driven both by the size and the complexity of the data sets (billions of data vectors in parameter spaces of tens or hundreds of dimensions), by the heterogeneity of the data and measurement errors, including differences in basic survey parameters for the federated data sets (e.g., in the positional accuracy and resolution, wavelength coverage, time baseline, etc.), various selection effects, as well as the intrinsic clustering properties (functional form, topology) of the data distributions in the parameter spaces of observed attributes. Answering these challenges will require substantial collaborative efforts and partnerships between astronomers, computer scientists, and statisticians. Comment: Invited review, 10 pages, LaTeX file with 4 eps figures, style files included. To appear in Proc. SPIE, v. 4477 (2001)
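    The abstract singles out automated clustering of data vectors in high-dimensional parameter spaces as a core challenge. As a toy stand-in (not any specific VO algorithm), here is a minimal k-means in pure Python; the point coordinates, the naive first-k initialization, and the fixed iteration count are all illustrative assumptions, and real survey pipelines use far more careful seeding and scaling:

```python
import math

def kmeans(points, k, iters=20):
    """Plain k-means over d-dimensional tuples: assign each point to its
    nearest centroid, then move each centroid to its cluster mean."""
    centroids = [p for p in points[:k]]  # naive init: first k points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[j].append(p)
        for i, members in enumerate(clusters):
            if members:  # keep the old centroid if a cluster empties
                centroids[i] = tuple(sum(x) / len(members)
                                     for x in zip(*members))
    return centroids, clusters

# Two well-separated groups of objects in a 2-D parameter space:
points = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
          (10.0, 10.0), (10.1, 10.0), (10.0, 10.1)]
centroids, clusters = kmeans(points, 2)  # recovers the two groups
```

    The VO-scale problem differs from this sketch in exactly the ways the abstract lists: billions of points, hundreds of dimensions, heterogeneous errors, and selection effects, which is why it demands dedicated methodology rather than textbook algorithms.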

    Dark Matter and Fundamental Physics with the Cherenkov Telescope Array

    The Cherenkov Telescope Array (CTA) is a project for a next-generation observatory for very high energy (GeV-TeV) ground-based gamma-ray astronomy, currently in its design phase, and foreseen to be operative a few years from now. Several tens of telescopes of 2-3 different sizes, distributed over a large area, will allow for a sensitivity about a factor of 10 better than current instruments such as H.E.S.S., MAGIC and VERITAS, an energy coverage from a few tens of GeV to several tens of TeV, and a field of view of up to 10 deg. In the following study, we investigate the prospects for CTA to study several science questions that influence our current knowledge of fundamental physics. Based on conservative assumptions for the performance of the different CTA telescope configurations, we employ a Monte Carlo based approach to evaluate the prospects for detection. First, we discuss CTA prospects for cold dark matter searches, following different observational strategies: in dwarf satellite galaxies of the Milky Way, in the region close to the Galactic Centre, and in clusters of galaxies. The possible search for spatial signatures, facilitated by the larger field of view of CTA, is also discussed. Next we consider searches for axion-like particles which, besides being possible candidates for dark matter, may also explain the unexpectedly low absorption by extragalactic background light of gamma rays from very distant blazars. Simulated light curves of flaring sources are also used to determine the sensitivity to violations of Lorentz invariance through detection of the possible delay between the arrival times of photons at different energies. Finally, we mention searches for other exotic physics with CTA. Comment: 31 pages, accepted for publication in Astroparticle Physics
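    The Lorentz-invariance test mentioned above exploits a simple scaling: under a linear Lorentz-violating dispersion relation, the arrival-time delay between two photon energies grows as Δt ≈ (ΔE / E_QG) · (D / c). The sketch below is an order-of-magnitude illustration only; the choice of the Planck energy as the quantum-gravity scale E_QG, the omission of the proper cosmological redshift integral, and the blazar distance are all assumptions, not CTA results.

```python
C = 2.998e8      # speed of light, m/s
MPC = 3.086e22   # metres per megaparsec
E_QG = 1.22e19   # GeV; Planck energy, assumed linear-LIV scale

def liv_delay(delta_E_GeV, distance_Mpc):
    """Arrival-time delay in seconds between photons differing in energy
    by delta_E (GeV) after a light-travel distance (Mpc), to first order."""
    return (delta_E_GeV / E_QG) * (distance_Mpc * MPC / C)

# A 1 TeV photon versus a low-energy one from a blazar ~500 Mpc away:
dt = liv_delay(1000.0, 500.0)  # a delay of order seconds
```

    Delays of a few seconds against minute-scale flare variability are what make rapid TeV flares from distant blazars such sensitive probes, provided intrinsic source delays can be disentangled.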

    A Summary from the Theorist's Point of View

    1. Introduction - 2. Astrophysics and Cosmology - 3. Neutrino Oscillations - 4. Higgs and New Physics Searches - 5. Flavour Physics and CP Violation - 6. QCD - 7. Heavy Ion Collisions - 8. Outlook. Comment: 10 pages, summary talk given at the IVth Rencontres du Vietnam on "Physics at Extreme Energies", Hanoi, July 200