87 research outputs found

    Revealing Fundamental Physics from the Daya Bay Neutrino Experiment using Deep Neural Networks

    Experiments in particle physics produce enormous quantities of data that must be analyzed and interpreted by teams of physicists. This analysis is often exploratory, where scientists are unable to enumerate the possible types of signal prior to performing the experiment. Thus, tools for summarizing, clustering, visualizing and classifying high-dimensional data are essential. In this work, we show that meaningful physical content can be revealed by transforming the raw data into a learned high-level representation using deep neural networks, with measurements taken at the Daya Bay Neutrino Experiment as a case study. We further show how convolutional deep neural networks can provide an effective classification filter with greater than 97% accuracy across different classes of physics events, significantly better than other machine learning approaches.
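    The classification idea in the abstract above can be illustrated with a minimal sketch: apply a learned 2D filter to a detector "image" (for instance a map of PMT charges), pool the response, and threshold it. Everything here is a toy stand-in in pure Python; the filter, data, and threshold are invented for illustration, not the experiment's actual network.

    ```python
    # Hypothetical sketch of CNN-style event classification: convolve a
    # detector "image" with a filter, globally pool, and threshold.

    def conv2d(image, kernel):
        """Valid 2D convolution (no padding, stride 1) on lists of lists."""
        kh, kw = len(kernel), len(kernel[0])
        h, w = len(image), len(image[0])
        out = []
        for i in range(h - kh + 1):
            row = []
            for j in range(w - kw + 1):
                row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                               for di in range(kh) for dj in range(kw)))
            out.append(row)
        return out

    def classify(image, kernel, threshold):
        """Label an event 'signal' if the pooled filter response exceeds threshold."""
        response = conv2d(image, kernel)
        pooled = max(max(row) for row in response)  # global max pooling
        return "signal" if pooled > threshold else "background"

    # A compact charge blob (signal-like) versus a flat map (background-like).
    blob = [[0, 0, 0, 0],
            [0, 5, 5, 0],
            [0, 5, 5, 0],
            [0, 0, 0, 0]]
    flat = [[1, 1, 1, 1]] * 4
    blob_kernel = [[1, 1], [1, 1]]  # responds strongly to 2x2 charge clusters
    ```

    In a real network the kernel weights would be learned from labeled events rather than hand-set, and many such filters would be stacked in layers.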

    The Athena Data Dictionary and Description Language

    Athena is the ATLAS off-line software framework, based upon the GAUDI architecture from LHCb. As part of ATLAS' continuing efforts to enhance and customise the architecture to meet our needs, we have developed a data object description tool suite and service for Athena. The aim is to provide a set of tools to describe, manage, integrate and use the Event Data Model at a design level according to the concepts of the Athena framework (use of patterns, relationships, ...). Moreover, to ensure stability and reusability this must be fully independent from the implementation details. After an extensive investigation into the many options, we have developed a language grammar based upon a description language (IDL, ODL) to provide support for object integration in Athena. We have then developed a compiler front end based upon this language grammar, JavaCC, and a Java Reflection API-like interface. We have then used these tools to develop several compiler back ends which meet specific needs in ATLAS such as automatic generation of object converters, and data object scripting interfaces. We present here details of our work and experience to date on the Athena Definition Language and Athena Data Dictionary. Comment: 4 pages, 2 figures.
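    The front-end/back-end split described above can be sketched in miniature: parse an IDL/ODL-style class description into a reflection-like model, then let a back end consume that model to emit converter stubs. The tiny grammar and output format below are invented for illustration; the actual ADL grammar is not shown in the abstract.

    ```python
    import re

    # Hypothetical compiler-front-end sketch: an invented IDL-like grammar
    # is parsed into a reflection-style model, which a back end consumes.

    CLASS_RE = re.compile(r"class\s+(\w+)\s*\{([^}]*)\}")
    FIELD_RE = re.compile(r"(\w+)\s+(\w+)\s*;")

    def parse(source):
        """Return {class_name: [(type, field_name), ...]} for each class."""
        model = {}
        for name, body in CLASS_RE.findall(source):
            model[name] = FIELD_RE.findall(body)
        return model

    def emit_converter(model, cls):
        """Back-end sketch: generate a line-per-field textual 'converter'."""
        lines = [f"convert {cls}:"]
        for ftype, fname in model[cls]:
            lines.append(f"  write {ftype} {fname}")
        return "\n".join(lines)

    adl = "class Track { double pt; int charge; }"
    model = parse(adl)
    ```

    The point of the split is that several back ends (converters, scripting interfaces, documentation) can share the one parsed model, which is what makes the description independent of any single implementation.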

    GMA Instrumentation of the Athena Framework using NetLogger

    Grid applications are, by their nature, wide-area distributed applications. This WAN aspect of Grid applications makes the use of conventional monitoring and instrumentation tools (such as top, gprof, LSF Monitor, etc.) impractical for verification that the application is running correctly and efficiently. To be effective, monitoring data must be "end-to-end", meaning that all components between the Grid application endpoints must be monitored. Instrumented applications can generate a large amount of monitoring data, so typically the instrumentation is off by default. For jobs running on a Grid, there needs to be a general mechanism to remotely activate the instrumentation in running jobs. The NetLogger Toolkit Activation Service provides this mechanism. To demonstrate this, we have instrumented the ATLAS Athena Framework with NetLogger to generate monitoring events. We then use a GMA-based activation service to control NetLogger's trigger mechanism. The NetLogger trigger mechanism allows one to easily start, stop, or change the logging level of a running program by modifying a trigger file. We present here details of the design of the NetLogger implementation of the GMA-based activation service and the instrumentation service for Athena. We also describe how this activation service allows us to non-intrusively collect and visualize the ATLAS Athena Framework monitoring data.
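    The trigger-file mechanism described above can be sketched minimally: a running job re-reads a small file and adjusts its logging level to match, so an operator can raise, lower, or silence instrumentation without restarting the job. The file format and function names here are assumptions for illustration, not NetLogger's actual API.

    ```python
    import logging
    import tempfile

    # Hypothetical trigger-file sketch: the job polls a small file and sets
    # its logging level accordingly; "OFF" silences instrumentation entirely.

    def apply_trigger(logger, trigger_path):
        """Set the logger's level from the trigger file; 'OFF' silences it."""
        with open(trigger_path) as f:
            level = f.read().strip().upper()
        if level == "OFF":
            logger.setLevel(logging.CRITICAL + 1)  # above every real level
        else:
            logger.setLevel(getattr(logging, level, logging.INFO))

    log = logging.getLogger("athena.job")

    # Simulate an operator editing the trigger file while the job is running.
    with tempfile.NamedTemporaryFile("w", suffix=".trig", delete=False) as f:
        f.write("DEBUG")
        trigger_path = f.name

    apply_trigger(log, trigger_path)
    level_after_debug = log.level

    with open(trigger_path, "w") as f:
        f.write("OFF")
    apply_trigger(log, trigger_path)
    level_after_off = log.level
    ```

    A real activation service would poll (or be pushed) the trigger state over the Grid monitoring infrastructure rather than a local file, but the off-by-default, remotely-activated pattern is the same.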

    Optimization of Software on High Performance Computing Platforms for the LUX-ZEPLIN Dark Matter Experiment

    High Energy Physics experiments like the LUX-ZEPLIN dark matter experiment face unique challenges when running their computation on High Performance Computing resources. In this paper, we describe some strategies to optimize memory usage of simulation codes with the help of profiling tools. We employed this approach and achieved memory reduction of 10-30%. While this has been performed in the context of the LZ experiment, it has wider applicability to other HEP experimental codes that face these challenges on modern computer architectures. Comment: Contribution to Proceedings of CHEP 2019, Nov 4-8, Adelaide, Australia.
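    The profiling-driven workflow described above can be illustrated with Python's standard-library `tracemalloc` standing in for the (unnamed) native profilers: measure an allocation hot spot, apply a fix, and compare peak usage. The "simulation step" below is a toy stand-in, not LZ code.

    ```python
    import tracemalloc

    # Toy illustration of profile-then-optimize: the naive version
    # materialises a large intermediate list; the lean version streams it.

    def simulation_step_naive(n):
        # Builds a full intermediate list before reducing it.
        hits = [i * 0.5 for i in range(n)]
        return sum(hits)

    def simulation_step_lean(n):
        # Generator avoids materialising the intermediate list.
        return sum(i * 0.5 for i in range(n))

    def peak_memory(fn, n):
        """Peak bytes allocated while running fn(n), per tracemalloc."""
        tracemalloc.start()
        fn(n)
        _, peak = tracemalloc.get_traced_memory()
        tracemalloc.stop()
        return peak

    naive_peak = peak_memory(simulation_step_naive, 100_000)
    lean_peak = peak_memory(simulation_step_lean, 100_000)
    ```

    The same measure/fix/re-measure loop applies whatever the profiler: the profiler localizes the allocation hot spot, and the fix is verified by comparing peaks rather than assumed.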

    Biases in probabilistic category learning in relation to social anxiety.

    Instrumental learning paradigms are rarely employed to investigate the mechanisms underlying acquired fear responses in social anxiety. Here, we adapted a probabilistic category learning paradigm to assess information processing biases as a function of the degree of social anxiety traits in a sample of healthy individuals without a diagnosis of social phobia. Participants were presented with three pairs of neutral faces with differing probabilistic accuracy contingencies (A/B: 80/20, C/D: 70/30, E/F: 60/40). Upon making their choice, negative and positive feedback was conveyed using angry and happy faces, respectively. The highly socially anxious group showed a strong tendency to be more accurate at learning the probability contingency associated with the most ambiguous stimulus pair (E/F: 60/40). Moreover, when pairing the most positively reinforced stimulus or the most negatively reinforced stimulus with all the other stimuli in a test phase, the highly socially anxious group avoided the most negatively reinforced stimulus significantly more than the control group. The results are discussed with reference to avoidance learning and hypersensitivity to negative socially evaluative information associated with social anxiety.
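    The feedback schedule described above can be simulated in a few lines: for each pair, choosing the "better" stimulus is positively reinforced with the stated probability (A/B: 80/20, C/D: 70/30, E/F: 60/40). This just illustrates the contingencies; it is not the study's code, and the names are invented.

    ```python
    import random

    # Toy simulation of the probabilistic feedback contingencies.
    CONTINGENCIES = {"AB": 0.8, "CD": 0.7, "EF": 0.6}

    def feedback(pair, chose_better, rng):
        """Return 'happy' or 'angry' face feedback for one trial."""
        p_reward = CONTINGENCIES[pair]
        if not chose_better:
            p_reward = 1.0 - p_reward
        return "happy" if rng.random() < p_reward else "angry"

    def run_block(pair, n_trials, rng):
        """Fraction of positive feedback when always picking the better stimulus."""
        wins = sum(feedback(pair, True, rng) == "happy" for _ in range(n_trials))
        return wins / n_trials

    rng = random.Random(0)
    rates = {pair: run_block(pair, 10_000, rng) for pair in CONTINGENCIES}
    ```

    The simulation makes the "ambiguity" gradient concrete: even an optimal chooser is rewarded only ~60% of the time on E/F, which is why that pair is the hardest contingency to learn.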

    The Long-Baseline Neutrino Experiment: Exploring Fundamental Symmetries of the Universe

    The preponderance of matter over antimatter in the early Universe, the dynamics of the supernova bursts that produced the heavy elements necessary for life and whether protons eventually decay --- these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our Universe, its current state and its eventual fate. The Long-Baseline Neutrino Experiment (LBNE) represents an extensively developed plan for a world-class experiment dedicated to addressing these questions. LBNE is conceived around three central components: (1) a new, high-intensity neutrino source generated from a megawatt-class proton accelerator at Fermi National Accelerator Laboratory, (2) a near neutrino detector just downstream of the source, and (3) a massive liquid argon time-projection chamber deployed as a far detector deep underground at the Sanford Underground Research Facility. This facility, located at the site of the former Homestake Mine in Lead, South Dakota, is approximately 1,300 km from the neutrino source at Fermilab -- a distance (baseline) that delivers optimal sensitivity to neutrino charge-parity symmetry violation and mass ordering effects. This ambitious yet cost-effective design incorporates scalability and flexibility and can accommodate a variety of upgrades and contributions. With its exceptional combination of experimental configuration, technical capabilities, and potential for transformative discoveries, LBNE promises to be a vital facility for the field of particle physics worldwide, providing physicists from around the globe with opportunities to collaborate in a twenty- to thirty-year program of exciting science. In this document we provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess. Comment: Major update of previous version. This is the reference document for the LBNE science program and current status. Chapters 1, 3, and 9 provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess. 288 pages, 116 figures.

    Dual Electron Spectrometer for Magnetospheric Multiscale Mission: Results of the Comprehensive Tests of the Engineering Test Unit

    The Magnetospheric Multiscale mission (MMS) is designed to study fundamental phenomena in space plasma physics such as magnetic reconnection. The mission consists of four spacecraft, equipped with identical scientific payloads, allowing for the first measurements of fast dynamics in the critical electron diffusion region where magnetic reconnection occurs and charged particles are demagnetized. The MMS orbit is optimized to ensure the spacecraft spend extended periods of time in locations where reconnection is known to occur: at the dayside magnetopause and in the magnetotail. In order to resolve fine structures of the three dimensional electron distributions in the diffusion region (reconnection site), the Fast Plasma Investigation's (FPI) Dual Electron Spectrometer (DES) is designed to measure three dimensional electron velocity distributions with an extremely high time resolution of 30 ms. In order to achieve this unprecedented sampling rate, four dual spectrometers, each sampling 180 x 45 degree sections of the sky, are installed on each spacecraft. We present results of the comprehensive tests performed on the DES Engineering & Test Unit (ETU). These include the main parameters of the spectrometer, such as energy resolution, angular acceptance, and geometric factor, along with their variations over the 16 pixels spanning the 180-degree top-hat Electrostatic Analyzer (ESA) field of view and over the energy of the test beam. A newly developed method for precisely defining the operational space of the instrument is presented as well. This allows optimization of the trade-off between pixel-to-pixel crosstalk and uniformity of the main spectrometer parameters.
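    A back-of-envelope check of the sampling geometry described above: each of the four dual spectrometers contributes two 180 x 45 degree views, and in a simplified flat (polar x azimuth) bookkeeping, eight such views tile the full 360 degrees of azimuth, giving whole-sky coverage each 30 ms sample. The numbers come from the abstract; the tiling interpretation is an assumption made for this sketch.

    ```python
    # Simplified sky-coverage arithmetic for the DES sampling scheme.
    HEADS = 4 * 2                 # four dual spectrometers, two heads each (assumed)
    POLAR_SPAN_DEG = 180.0        # each head spans the full polar range
    AZIMUTH_SPAN_DEG = 45.0       # azimuthal slice per head
    SAMPLE_PERIOD_S = 0.030       # one full 3D distribution every 30 ms

    azimuth_covered = HEADS * AZIMUTH_SPAN_DEG   # degrees of azimuth tiled
    full_sky = azimuth_covered >= 360.0 and POLAR_SPAN_DEG >= 180.0
    distributions_per_second = 1.0 / SAMPLE_PERIOD_S
    ```

    Under these assumptions the instrument suite returns roughly 33 full electron distributions per second per spacecraft, which is what makes resolving the electron diffusion region feasible.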

    Examining the Link Between Domestic Violence Victimization and Loneliness in a Dutch Community Sample: A Comparison Between Victims and Nonvictims by Type D Personality

    The current study investigated whether differences in loneliness scores between individuals with a distressed personality type (type D personality) and subjects without such a personality varied by domestic violence victimization. Participants (N = 625) were recruited by random sampling from the Municipal Basic Administration of the Dutch city of ‘s-Hertogenbosch and were invited to fill out a set of questionnaires on health status. For this study, only ratings for domestic violence victimization, type D personality, feelings of loneliness, and demographics were used. Statistical analyses yielded main effects on loneliness for both type D personality and history of domestic violence victimization. Above and beyond these main effects, their interaction was significantly associated with loneliness as well. However, this result seemed to apply to emotional loneliness in particular. Findings were discussed in light of previous research and study limitations.

    From sea monsters to charismatic megafauna: changes in perception and use of large marine animals

    Marine megafauna has always elicited contrasting feelings. In the past, large marine animals were often depicted as fantastic mythological creatures and dangerous monsters, while also arousing human curiosity. Marine megafauna has been a valuable resource to exploit, leading to the collapse of populations and local extinctions. In addition, some species have been perceived as competitors of fishers for marine resources and were often actively culled. Since the 1970s, there has been a change in the perception and use of megafauna. The growth of marine tourism, increasingly oriented towards the observation of wildlife, has driven a shift from extractive to non-extractive use, supporting the conservation of at least some species of marine megafauna. In this paper, we review and compare the changes in the perception and use of three megafaunal groups, cetaceans, elasmobranchs and groupers, with a special focus on European cultures. We highlight the main drivers and the timing of these changes, compare different taxonomic groups and species, and highlight the implications for management and conservation. One of the main drivers of the shift in perception, shared by all three groups of megafauna, has been a general increase in curiosity towards wildlife, stimulated inter alia by documentaries (from the early 1970s onwards), and also promoted by easy access to scuba diving. At the same time, environmental campaigns have been developed to raise public awareness regarding marine wildlife, especially cetaceans, a process greatly facilitated by the rise of the Internet and the World Wide Web. Currently, all three groups (cetaceans, elasmobranchs and groupers) may represent valuable resources for ecotourism. Strikingly, the economic value of live specimens may exceed their value for human consumption. A further change in perception involving all three groups is related to a growing understanding and appreciation of their key ecological role.
    The shift from extractive to non-extractive use has the potential for promoting species conservation and local economic growth. However, the change in use may not benefit the original stakeholders (e.g. fishers or whalers) and there may therefore be a case for providing compensation for disadvantaged stakeholders. Moreover, it is increasingly clear that even non-extractive use may have a negative impact on marine megafauna; therefore, regulations are needed. Funding: SFRH/BPD/102494/2014, UID/MAR/04292/2019, IS1403.