A time-domain veto for binary inspirals search
We describe a test to distinguish actual gravitational-wave signals from binary inspirals from spurious noise triggers. The test operates in the time domain, and considers the time evolution of the correlator and its statistical distribution. It can separate true events from noise events that share the same signal-to-noise ratio and chi-square frequency distribution. A similar test has been applied to S1 LIGO data.
Adaptive spectral identification techniques in presence of undetected non linearities
The standard procedure for detecting gravitational-wave signals from coalescing binaries is based on Wiener filtering with an appropriate bank of template filters. This is the optimal procedure under the hypothesis of additive, Gaussian, stationary noise. We study the possibility of improving the detection efficiency with a class of adaptive spectral identification techniques, analyzing their effect in the presence of non-stationarities and undetected nonlinearities in the noise.

Comment: 4 pages, 2 figures, uses ws-procs9x6.cls. Proceedings of "Non linear physics: theory and experiment. II", Gallipoli (Lecce), 200
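As a rough illustration of the Wiener (matched) filtering mentioned above, the following Python toy (a schematic sketch, not the authors' code; the signal, noise model and normalization are all hypothetical) correlates data against a noise-weighted template in the frequency domain and locates the peak of the resulting SNR-like time series:

```python
import numpy as np

def matched_filter_snr(data, template, psd):
    """Frequency-domain matched filter: correlate data against one
    template, weighting by the inverse noise power spectral density.
    Returns a (schematically normalized) SNR time series."""
    n = len(data)
    d_f = np.fft.rfft(data)
    t_f = np.fft.rfft(template, n)
    # Noise-weighted cross-correlation of data and template at every time shift
    corr = np.fft.irfft(d_f * np.conj(t_f) / psd, n)
    # Template normalization <t, t> under the same noise weighting
    sigma2 = np.sum(np.abs(t_f) ** 2 / psd) / n
    return corr / np.sqrt(sigma2)

# Toy usage: white noise (flat psd) plus a scaled, windowed sinusoidal "chirp"
rng = np.random.default_rng(0)
n = 4096
template = np.sin(2 * np.pi * 30 * np.arange(256) / 1024) * np.hanning(256)
data = rng.normal(size=n)
data[1000:1256] += 10.0 * template / np.linalg.norm(template)
psd = np.ones(n // 2 + 1)
snr = matched_filter_snr(data, template, psd)
print(int(np.argmax(np.abs(snr))))  # peak index, near the injection at sample 1000
```

In a real search this single correlation is repeated over a whole bank of templates covering the binary parameter space, and the maximum over the bank is taken.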
DiBELLA: Distributed long read to long read alignment
We present a parallel algorithm and scalable implementation for genome analysis, specifically the problem of finding overlaps and alignments for data from "third generation" long read sequencers [29]. While long sequences of DNA offer enormous advantages for biological analysis and insight, current long read sequencing instruments have high error rates and therefore require different approaches to analysis than their short read counterparts. Our work focuses on an efficient distributed-memory parallelization of an accurate single-node algorithm for overlapping and aligning long reads. We achieve scalability of this irregular algorithm by addressing the competing issues of increasing parallelism, minimizing communication, constraining the memory footprint, and ensuring good load balance. The resulting application, diBELLA, is the first distributed-memory overlapper and aligner specifically designed for long reads and parallel scalability. We describe and present analyses of high-level design trade-offs and conduct an extensive empirical analysis that compares performance characteristics across state-of-the-art HPC systems as well as a commercial cloud architecture, highlighting the advantages of state-of-the-art network technologies
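To give a flavor of overlap detection, the seeding stage common to such pipelines can be sketched with a shared-k-mer index. This is a generic single-node illustration (function names and thresholds are hypothetical), not diBELLA's distributed algorithm:

```python
from collections import defaultdict

def candidate_overlaps(reads, k=5, min_shared=2):
    """Report read pairs sharing at least `min_shared` k-mers --
    the seeding stage that precedes full alignment."""
    index = defaultdict(set)          # k-mer -> set of read ids containing it
    for rid, seq in enumerate(reads):
        for i in range(len(seq) - k + 1):
            index[seq[i:i + k]].add(rid)
    shared = defaultdict(int)         # (rid_a, rid_b) -> shared k-mer count
    for rids in index.values():
        rids = sorted(rids)
        for a in range(len(rids)):
            for b in range(a + 1, len(rids)):
                shared[(rids[a], rids[b])] += 1
    return {pair for pair, c in shared.items() if c >= min_shared}

reads = ["ACGTACGTGG", "TACGTGGACC", "TTTTTTTTTT"]
print(candidate_overlaps(reads))  # {(0, 1)} -- reads 0 and 1 share k-mers
```

In a distributed-memory setting, the k-mer index itself must be partitioned across nodes, which is where the communication and load-balance issues discussed in the abstract arise.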
Low-latency analysis pipeline for compact binary coalescences in the advanced gravitational wave detector era
The multi-band template analysis (MBTA) pipeline is a low-latency coincident analysis pipeline for the detection of gravitational waves (GWs) from compact binary coalescences. MBTA runs with a low computational cost, and can identify candidate GW events online with a sub-minute latency. The low computational running cost of MBTA also makes it useful for data quality studies. Events detected by MBTA online can be used to alert astronomical partners for electromagnetic follow-up. We outline the current status of MBTA and give details of recent pipeline upgrades and validation tests that were performed in preparation for the first advanced detector observing period. The MBTA pipeline is ready for the outset of the advanced detector era and the exciting prospects it will bring.

Comment: 18 pages, 10 figures
Principal techniques and instruments for three-dimensional surveying in archaeology
The growing use of 3D acquisition and modeling techniques in archaeology is due principally to (i) their capacity to survey archaeological artifacts with high precision using a non-contact approach and (ii) the possibility of creating 3D digital models useful for data analysis, simulation and preservation. These benefits oblige the contemporary archaeologist to acquire a better understanding of 3D acquisition and modeling principles and practice, and make it necessary to adopt a common language shared by experts in 3D data management and archaeologists, so that each can understand the other's requirements and share the goals of the project. In this article the authors propose a concise but thorough explanation of the working principles of active and passive 3D acquisition techniques. For each one, the instruments and methodologies are described, pointing out the pros and cons of every technique. In conclusion, a sensor fusion approach is presented as a promising way to increase instrument performance while at the same time improving the quality of the 3D acquisition and modeling results. A final multi-resolution application, the 3D modeling of the Pompeii Forum, closes the article
Maintainability improvement using allocation methods for railway systems
An optimal maintenance policy is an essential requirement for many industrial products in order to save resources and to minimize operational costs and system downtime. Some maintenance strategies (e.g. corrective, preventive and condition-based maintenance: CM, PM and CBM) are illustrated in the first part of the paper. The paper focuses on maintainability allocation techniques: four procedures are analyzed (the failure rate-based allocation method; the trade-off of failure rate and design feature-based allocation method; fuzzy maintainability allocation based on interval analysis; and the time characteristic-based MA model). Since the traditional procedures suffer from several drawbacks, attention is focused on the time characteristic-based method, which turned out to be the best and most complete procedure because it overcomes the limitations of the other methods. The last part of the paper proposes a case study analyzed using the techniques implemented in the MA optimal method. Two different cases are studied, differing in the target Mean Time To Repair: initially the requirement is allowed to vary within a range, then it is fixed at 6 hours
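One simple failure rate-based allocation scheme can illustrate how a system-level MTTR target is pushed down to subsystems. This is a schematic sketch under assumed numbers, not the paper's exact procedure: baseline repair-time estimates are scaled by a common factor so that the failure-rate-weighted mean (the system MTTR) meets the target:

```python
def allocate_mttr(failure_rates, baseline_times, target):
    """Scale baseline subsystem repair-time estimates so that the
    system MTTR -- the failure-rate-weighted mean of subsystem MTTRs --
    equals `target`. A minimal scheme; real allocation methods also
    weigh design features, accessibility, etc."""
    weighted = sum(l * t for l, t in zip(failure_rates, baseline_times))
    scale = target * sum(failure_rates) / weighted
    return [t * scale for t in baseline_times]

lam = [2e-4, 5e-4, 1e-4]      # subsystem failure rates, per hour (hypothetical)
t0 = [4.0, 10.0, 2.0]         # baseline repair-time estimates, hours (hypothetical)
alloc = allocate_mttr(lam, t0, target=6.0)
sys_mttr = sum(l * t for l, t in zip(lam, alloc)) / sum(lam)
print([round(t, 2) for t in alloc], round(sys_mttr, 2))  # [3.2, 8.0, 1.6] 6.0
```

Subsystems that fail more often dominate the weighted mean, so their allocated repair times constrain the system MTTR the most, which matches the intuition behind failure rate-based allocation.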
Observation of charge-density-wave excitations in manganites
In the optical conductivity of four different manganites with commensurate charge order (CO), strong peaks appear in the meV range below the ordering temperature T_{CO}. They are similar to those reported for one-dimensional charge density waves (CDW) and are assigned to pinned phasons. The peaks and their overtones allow one to obtain, for La_{1-n/8}Ca_{n/8}MnO_3 with n = 5, 6, the electron-phonon coupling, the effective mass of the CO system, and its contribution to the dielectric constant. These results support a description of the CO in La-Ca manganites in terms of moderately weak coupling and of the CDW theory.

Comment: To be published in Phys. Rev. Lett.