
    BlackMax: A black-hole event generator with rotation, recoil, split branes and brane tension

    We present a comprehensive black-hole event generator, BlackMax, which simulates the experimental signatures of microscopic and Planckian black-hole production and evolution at the LHC in the context of brane world models with low-scale quantum gravity. The generator is based on phenomenologically realistic models free of serious problems that plague low-scale gravity, thus offering more realistic predictions for hadron-hadron colliders. The generator includes all of the black-hole graybody factors known to date and incorporates the effects of black-hole rotation, splitting between the fermions, non-zero brane tension and black-hole recoil due to Hawking radiation (although not all simultaneously). The generator can be interfaced with Herwig and Pythia.
    Comment: 32 pages, 61 figures, webpage http://www-pnp.physics.ox.ac.uk/~issever/BlackMax/blackmax.htm
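    For orientation only, the following is a toy sketch of the produce-then-evaporate loop that such a generator implements. It is not BlackMax code or its interface; the fundamental scale, production threshold, and emission sampling are illustrative assumptions (a real generator weights each emission by the relevant graybody factor and handles rotation, recoil, and brane effects).

```python
# Toy sketch of microscopic black-hole production and Hawking evaporation.
# NOT BlackMax: all numbers and sampling choices below are assumptions.
import numpy as np

M_D = 1.0e3          # assumed fundamental (low-scale) gravity scale, GeV
M_MIN = 2.0 * M_D    # assumed threshold for forming a semiclassical black hole
rng = np.random.default_rng(0)

def generate_event(sqrt_s_hat):
    """Return a list of Hawking-emission energies for one parton collision."""
    if sqrt_s_hat < M_MIN:
        return []                         # below threshold: no black hole formed
    mass, quanta = sqrt_s_hat, []
    while True:
        e = rng.uniform(0.1, 0.5) * mass  # crude emission energy draw
        if mass - e < M_D:                # stop once the mass nears the Planck scale
            break
        quanta.append(e)
        mass -= e
    quanta.append(mass)                   # terminal decay of the remnant
    return quanta

# Example: a 5 TeV parton-parton collision (energies in GeV).
print(generate_event(5000.0))
```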

    Translational Invariance and the Anisotropy of the Cosmic Microwave Background

    Primordial quantum fluctuations produced by inflation are conventionally assumed to be statistically homogeneous, a consequence of translational invariance. In this paper we quantify the potentially observable effects of a small violation of translational invariance during inflation, as characterized by the presence of a preferred point, line, or plane. We explore the imprint such a violation would leave on the cosmic microwave background anisotropy, and provide explicit formulas for the expected amplitudes of the spherical-harmonic coefficients.
    Comment: Notation improved
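    As a general illustration (a standard statement, not the paper's explicit formulas): statistically homogeneous and isotropic fluctuations give a diagonal covariance for the spherical-harmonic coefficients, while a preferred point, line, or plane induces off-diagonal correlations.

```latex
% Standard statement, not the paper's specific result: statistical
% homogeneity/isotropy makes the covariance diagonal; a violation adds
% off-diagonal terms \Delta_{\ell m,\ell' m'} whose form the paper derives.
\langle a_{\ell m}\, a^{*}_{\ell' m'} \rangle
  = C_{\ell}\, \delta_{\ell \ell'}\, \delta_{m m'}
  + \Delta_{\ell m,\, \ell' m'} ,
\qquad
\Delta_{\ell m,\, \ell' m'} = 0 \ \text{when translational invariance holds.}
```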

    An unsupervised automated paradigm for artifact removal from electrodermal activity in an uncontrolled clinical setting

    Objective. Electrodermal activity (EDA) reflects sympathetic nervous system activity through sweating-related changes in skin conductance and could be used in clinical settings in which patients cannot self-report pain, such as during surgery or when in a coma. To enable EDA data to be used robustly in clinical settings, we need to develop artifact detection and removal frameworks that can handle the types of interference experienced in clinical settings while salvaging as much useful information as possible. Approach. In this study, we collected EDA data from 70 subjects while they were undergoing surgery in the operating room. We then built a fully automated artifact removal framework to remove the heavy artifacts that resulted from the use of surgical electrocautery during the surgery and compared it to two existing state-of-the-art methods for artifact removal from EDA data. This automated framework consisted of first applying three unsupervised machine learning methods for anomaly detection, and then customizing, for each data instance, the threshold used to separate artifact by taking advantage of the statistical properties of the artifact in that instance. We also created simulated surgical data by introducing artifacts into cleaned surgical data and measured the performance of all three methods in removing them. Main results. Our method achieved the highest overall accuracy and precision and lowest overall error on simulated data. One of the other methods prioritized high sensitivity while sacrificing specificity and precision, while the other had low sensitivity, high error, and left behind several artifacts. These results were qualitatively similar between the simulated data instances and operating room data instances. Significance. Our framework allows for robust removal of heavy artifact from EDA data in clinical settings such as surgery, which is the first step to enable clinical integration of EDA as part of standard monitoring.
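    The pipeline itself is not reproduced here; the following is a minimal sketch of the general idea described above (an ensemble of unsupervised anomaly detectors plus a per-record threshold derived from the record's own statistics). The detector choices, features, and parameters are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of ensemble unsupervised artifact detection for EDA;
# not the authors' framework. Detector choices and thresholds are assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.neighbors import LocalOutlierFactor
from sklearn.svm import OneClassSVM

def artifact_mask(eda):
    """Flag artifact samples in a 1-D EDA recording (microsiemens)."""
    # Simple per-sample features: amplitude and first-difference magnitude.
    diff = np.abs(np.diff(eda, prepend=eda[0]))
    X = np.column_stack([eda, diff])

    # Three unsupervised detectors vote on anomalies (-1 = anomaly).
    votes = np.zeros(len(eda))
    votes += IsolationForest(contamination=0.05, random_state=0).fit_predict(X) == -1
    votes += LocalOutlierFactor(n_neighbors=50).fit_predict(X) == -1
    votes += OneClassSVM(nu=0.05, gamma="scale").fit_predict(X) == -1

    # Per-record threshold: only keep flagged samples whose slope also exceeds
    # a robust statistic (median + k * MAD) computed from this record itself.
    mad = np.median(np.abs(diff - np.median(diff))) + 1e-9
    extreme = diff > np.median(diff) + 6.0 * mad
    return (votes >= 2) & extreme
```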

    Research Cloud Data Communities

    Big Data, big science, the data deluge: these are topics we are hearing about more and more in our research pursuits. Then, through media hype, comes cloud computing, the saviour that is going to resolve our Big Data issues. However, it is difficult to pinpoint exactly what researchers can actually do with data and with clouds, how they can actually solve their Big Data problems, and how they get help in using these relatively new tools and infrastructure. Since the beginning of 2012, the NeCTAR Research Cloud has been running at the University of Melbourne, attracting over 1,650 users from around the country. This has not only provided an unprecedented opportunity for researchers to employ clouds in their research, but it has also given us an opportunity to clearly understand how researchers can more easily solve their Big Data problems. The cloud is now used daily, from running web servers and blog sites, through to hosting virtual laboratories that can automatically create hundreds of servers depending on research demand. Of course, it has also helped us understand that infrastructure isn't everything. There are many other skillsets needed to help researchers from the multitude of disciplines use the cloud effectively. How can we solve Big Data problems on cloud infrastructure? One of the key aspects is communities based on research platforms: research is built on collaboration, connection and community, and researchers employ platforms daily, whether bio-imaging platforms, computational platforms or cloud platforms (like Dropbox). There are some important features that enabled this to work. Firstly, the barriers to collaboration are lowered, allowing communities to access infrastructure that can be instantly built anywhere from completely open to completely closed, all managed securely through (nationally) standardised interfaces. Secondly, it is free and easy to build servers and infrastructure, and it is also cheap to fail, allowing for experimentation not only at the code level, but at the server or infrastructure level as well. Thirdly, this (virtual) infrastructure can be shared with collaborators, moving the practice of collaboration from sharing papers and code to sharing servers, pre-configured and ready to go. And finally, the underlying infrastructure is built with Big Data in mind, co-located with major data storage infrastructure and high-performance computers, and interconnected nationally with high-speed networks to research instruments. The research cloud is fundamentally new in that it easily allows communities of researchers, often connected by common geography (research precincts), discipline or long-term established collaborations, to build open, collaborative platforms. These open, sharable, and repeatable platforms encourage coordinated use and development, evolving toward common community-oriented methods for Big Data access and data manipulation. In this paper we discuss in detail the critical ingredients in successfully establishing these communities, as well as some outcomes arising from these communities and their collaboration-enabling platforms. We consider astronomy as an exemplar of a research field that has already looked to the cloud as a solution to the ensuing data tsunami.

    Iterative Approximate Consensus in the presence of Byzantine Link Failures

    This paper explores the problem of reaching approximate consensus in synchronous point-to-point networks, where each directed link of the underlying communication graph represents a communication channel between a pair of nodes. We adopt the transient Byzantine link failure model [15, 16], in which an omniscient adversary controls a subset of the directed communication links, but the nodes are assumed to be fault-free. Recent work has addressed the problem of reaching approximate consensus in incomplete graphs with Byzantine nodes using a restricted class of iterative algorithms that maintain only a small amount of memory across iterations [22, 21, 23, 12]. However, to the best of our knowledge, we are the first to consider approximate consensus in the presence of Byzantine links. We extend our past work that provided an exact characterization of the graphs in which the iterative approximate consensus problem is solvable in the presence of Byzantine node failures [22, 21]. In particular, we prove a tight necessary and sufficient condition on the underlying communication graph for the existence of iterative approximate consensus algorithms under the transient Byzantine link failure model. The condition answers (part of) the open problem stated in [16].
    Comment: arXiv admin note: text overlap with arXiv:1202.609
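    The restricted class of iterative algorithms referenced above is typically of the following form (a generic sketch for illustration, not the specific algorithm or graph condition proved in the paper): in each synchronous round a node trims the extreme received values and moves to a convex combination of the rest. The fault bound f and the equal-weight averaging rule are assumptions.

```python
# Generic sketch of one round of trimmed-mean iterative approximate consensus;
# illustrative only. The bound f and equal weights are assumptions.
def consensus_round(own_value, received, f):
    """One synchronous round at a single node.

    own_value : this node's current state (float)
    received  : values received on in-links this round (list of float)
    f         : maximum number of incoming links that may be faulty
    """
    vals = sorted(received)
    # Drop the f smallest and f largest received values so that a value
    # injected on a faulty incoming link cannot pull the state outside the
    # range spanned by correctly delivered values.
    trimmed = vals[f:len(vals) - f] if len(vals) > 2 * f else []
    # Convex combination of the surviving values and the node's own state.
    pool = trimmed + [own_value]
    return sum(pool) / len(pool)
```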

    Precise Particle Tracking Against a Complicated Background: Polynomial Fitting with Gaussian Weight

    We present a new particle tracking software algorithm designed to accurately track the motion of low-contrast particles against a background with large variations in light levels. The method is based on a polynomial fit of the intensity around each feature point, weighted by a Gaussian function of the distance from the centre, and is especially suitable for tracking endogenous particles in the cell, imaged with bright field, phase contrast or fluorescence optical microscopy. Furthermore, the method can simultaneously track particles of all different sizes, and allows significant freedom in their shape. The algorithm is evaluated using the quantitative measures of accuracy and precision of previous authors, using simulated images at variable signal-to-noise ratios. To these we add a new test of the error due to a non-uniform background. Finally, the tracking of particles in real cell images is demonstrated. The method is made freely available for non-commercial use as a software package with a graphical user interface, which can be run within the Matlab programming environment.
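    A minimal NumPy sketch of the core idea follows (a Gaussian-weighted quadratic fit to the local intensity, with the sub-pixel centre taken from the fit's extremum). This is not the authors' Matlab package; the window size, weighting width, and quadratic model are assumptions.

```python
# Sketch of Gaussian-weighted polynomial refinement of a candidate feature
# point; illustrative only, not the published software.
import numpy as np

def refine_centre(img, x0, y0, half=5, sigma=2.0):
    """Refine an integer-pixel guess (x0, y0) to sub-pixel precision."""
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    patch = img[y0 - half:y0 + half + 1, x0 - half:x0 + half + 1].astype(float)
    w = np.exp(-(xs**2 + ys**2) / (2 * sigma**2)).ravel()  # Gaussian weights
    sw = np.sqrt(w)                                         # sqrt weights for lstsq

    # Weighted least-squares fit of I(x, y) ~ a + b*x + c*y + d*x^2 + e*x*y + f*y^2.
    A = np.column_stack([np.ones(xs.size), xs.ravel(), ys.ravel(),
                         xs.ravel()**2, xs.ravel() * ys.ravel(), ys.ravel()**2])
    coef, *_ = np.linalg.lstsq(A * sw[:, None], patch.ravel() * sw, rcond=None)
    a, b, c, d, e, f = coef

    # The extremum of the fitted quadratic surface gives the sub-pixel offset.
    dx, dy = np.linalg.solve(np.array([[2 * d, e], [e, 2 * f]]), [-b, -c])
    return x0 + dx, y0 + dy
```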

    Some Systematics of the Coupling Constant Dependence of N=4 Yang-Mills

    The operator, O_\tau, that generates infinitesimal changes of the coupling constant in N=4 Yang-Mills sits in the same supermultiplet as the superconformal currents. We show how superconformal current Ward identities determine a class of terms in the operator product expansion of O_\tau with any other operator. In certain cases, this leads to constraints on the coupling dependence of correlation functions in N=4 Yang-Mills. As an application, we demonstrate the exact non-renormalization of two- and certain three-point correlation functions of BPS operators.
    Comment: 56 pages, LaTeX; amended and expanded arguments, added reference
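    As a hedged reminder of the standard fact the abstract relies on (normalization and contact-term conventions are assumptions): saying that O_\tau generates infinitesimal changes of the coupling \tau means that differentiating a correlator with respect to \tau inserts the integrated operator.

```latex
% Schematic only; the overall normalization and contact terms depend on conventions.
\frac{\partial}{\partial \tau}
  \big\langle \mathcal{O}_{1}(x_{1}) \cdots \mathcal{O}_{n}(x_{n}) \big\rangle
  \;\propto\;
  \int \! \mathrm{d}^{4}x \;
  \big\langle \mathcal{O}_{\tau}(x)\, \mathcal{O}_{1}(x_{1}) \cdots \mathcal{O}_{n}(x_{n}) \big\rangle .
```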

    Developing a multiple-document-processing performance assessment for epistemic literacy

    The LAK15 theme “shifts the focus from data to impact”, noting the potential for Learning Analytics based on existing technologies to have scalable impact on learning for people of all ages. For such demand and potential in scalability to be met, the challenge of addressing higher-order thinking skills must also be met. This paper discusses one such approach: the creation of an analytic and task model to probe epistemic cognition in complex literacy tasks. The research uses existing technologies in novel ways to build a conceptually grounded model of trace indicators for epistemic commitments in information-seeking behaviors. We argue that such an evidence-centered approach is fundamental to realizing the potential of analytics, which should maintain a strong association with learning theory.

    Acoustic cues to tonal contrasts in Mandarin: Implications for cochlear implants

    The present study systematically manipulated three acoustic cues (fundamental frequency (f0), amplitude envelope, and duration) to investigate their contributions to tonal contrasts in Mandarin. Simplified stimuli with all possible combinations of these three cues were presented for identification to eight normal-hearing listeners, all native speakers of Mandarin from Taiwan. The f0 information was conveyed either by an f0-controlled sawtooth carrier or by a modulated noise, so as to compare the performance achievable with a clear indication of voice f0 against what is possible with purely temporal coding of f0. Tone recognition performance with explicit f0 was much better than that with any combination of the other acoustic cues (consistently greater than 90% correct, compared to 33%-65%; chance is 25%). In the absence of explicit f0, the temporal coding of f0 and the amplitude envelope both contributed somewhat to tone recognition, while duration had only a marginal effect. Performance based on these secondary cues varied greatly across listeners. These results explain the relatively poor perception of tone in cochlear implant users, given that cochlear implants currently provide only weak cues to f0, so that users must rely upon purely temporal (and secondary) features for the perception of tone. (c) 2008 Acoustical Society of America
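    For illustration, here is a sketch of the two f0-presentation conditions described above (an f0-controlled sawtooth carrier versus noise amplitude-modulated at f0). The sampling rate, duration, and f0 contour are assumptions, not the study's stimulus parameters.

```python
# Sketch of explicit-f0 vs. temporally coded f0 stimuli; parameters are assumed.
import numpy as np

fs = 16000                            # sampling rate (Hz), assumed
t = np.arange(0, 0.4, 1 / fs)         # 400 ms token, assumed
f0 = np.linspace(120, 200, t.size)    # rising contour, roughly Tone-2-like (assumed)

# Explicit-f0 condition: sawtooth whose instantaneous frequency follows f0.
phase = 2 * np.pi * np.cumsum(f0) / fs
sawtooth = 2 * ((phase / (2 * np.pi)) % 1.0) - 1.0

# Temporal-coding condition: broadband noise amplitude-modulated at f0,
# so pitch must be recovered from the modulation rate alone.
rng = np.random.default_rng(0)
noise = rng.standard_normal(t.size)
modulated_noise = noise * (0.5 * (1 + np.sin(phase)))
```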