    A Mechanism-Based Explanation of the Institutionalization of Semantic Technologies in the Financial Industry

    This paper explains how the financial industry is addressing its data, risk-management, and associated vocabulary problems using semantic technologies. It is the first to examine this phenomenon and to identify the social and institutional mechanisms being applied to socially construct a standard common vocabulary using ontology-based models. This standardized ontology-based common vocabulary will underpin the design of the next generation of semantically enabled information systems (IS) for the financial industry. The mechanisms helping to institutionalize this common vocabulary are identified through a longitudinal case study whose embedded units of analysis focus on central agents of change: the Enterprise Data Management Council and the Object Management Group. This has important implications for society, as semantically enabled IS are intended, for example, to give stakeholders such as regulators better transparency over systemic risks to national and international financial systems, thereby mitigating or avoiding future financial crises.

    Sensitivity of the IceCube Detector to Astrophysical Sources of High Energy Muon Neutrinos

    We present the results of a Monte Carlo study of the sensitivity of the planned IceCube detector to predicted fluxes of muon neutrinos at TeV to PeV energies. A complete simulation of the detector and data analysis is used to study the detector's capability to search for muon neutrinos from sources such as active galaxies and gamma-ray bursts. We study the effective area and the angular resolution of the detector as a function of muon energy and angle of incidence. We present detailed calculations of the sensitivity of the detector to both diffuse and point-like neutrino emission, including an assessment of the sensitivity to neutrinos detected in coincidence with gamma-ray burst observations. After three years of data taking, IceCube will be able to detect a point-source flux of E^2 dN/dE = 7×10^-9 GeV cm^-2 s^-1 at 5-sigma significance or, in the absence of a signal, place a 90% c.l. limit at E^2 dN/dE = 2×10^-9 GeV cm^-2 s^-1. A diffuse E^-2 flux would be detectable at a minimum strength of E^2 dN/dE = 1×10^-8 GeV cm^-2 s^-1 sr^-1. A gamma-ray burst model following the formulation of Waxman and Bahcall would result in a 5-sigma effect after the observation of 200 bursts in coincidence with satellite observations of the gamma rays.
    Comment: 33 pages, 13 figures, 6 tables
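    To make the quoted numbers concrete, the sketch below folds an E^-2 point-source flux through an assumed energy-independent effective area and livetime to estimate an expected event count. This is a minimal sketch: the effective-area value, energy range, and function name are placeholders, not IceCube's actual detector response.

    ```python
    # Hedged sketch: expected events from an E^-2 flux through a toy,
    # energy-independent effective area. All numbers are placeholders.
    import numpy as np

    def expected_events(phi0, area_m2, livetime_s, e_min=1e3, e_max=1e6, n=1000):
        """Expected events for dN/dE = phi0 * E^-2 [GeV^-1 cm^-2 s^-1]."""
        e = np.logspace(np.log10(e_min), np.log10(e_max), n)  # energy grid, GeV
        dnde = phi0 * e**-2.0                                 # differential flux
        # Trapezoid rule for the energy integral of the flux.
        fluence = np.sum(0.5 * (dnde[1:] + dnde[:-1]) * np.diff(e))
        return fluence * area_m2 * 1e4 * livetime_s           # m^2 -> cm^2

    # E^2 dN/dE = 7e-9 GeV cm^-2 s^-1 (the quoted 5-sigma point-source flux),
    # folded through a toy 1 m^2 effective area over three years of livetime.
    n_sig = expected_events(7e-9, area_m2=1.0, livetime_s=3 * 3.15e7)
    print(f"~{n_sig:.1f} expected signal events")
    ```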

    On the selection of AGN neutrino source candidates for a source stacking analysis with neutrino telescopes

    The sensitivity of a search for sources of TeV neutrinos can be improved by grouping potential sources into generic classes, a procedure known as source stacking. In this paper, we define catalogs of Active Galactic Nuclei (AGN) and use them to perform a source stacking analysis. The grouping of AGN into classes is done in two steps: first, AGN classes are defined; then, the sources to be stacked are selected assuming that a potential neutrino flux is linearly correlated with the photon luminosity in a certain energy band (radio, IR, optical, keV, GeV, TeV). Lacking secure detailed knowledge of neutrino production in AGN, this correlation is motivated by hadronic AGN models, as briefly reviewed in this paper. The source stacking search for neutrinos from generic AGN classes is illustrated using the data collected by the AMANDA-II high-energy neutrino detector during the year 2000. No significant excess was found for any of the suggested groups.
    Comment: 43 pages, 12 figures, accepted by Astroparticle Physics
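    As a rough sketch of the weighting assumption described above, the snippet below assigns each candidate source a relative weight proportional to its photon flux in a chosen band. The function name and catalog values are hypothetical; a real stacked search would fold such weights into its likelihood or on-source event expectation.

    ```python
    # Hedged sketch of the selection/weighting step: weights proportional
    # to photon flux, under the stated assumption that the neutrino flux
    # scales linearly with it. Values below are invented for illustration.
    import numpy as np

    def stacking_weights(photon_flux):
        """Relative source weights, assuming neutrino flux tracks photon flux."""
        flux = np.asarray(photon_flux, dtype=float)
        return flux / flux.sum()

    # Toy radio-band fluxes (arbitrary units) for five candidate AGN.
    weights = stacking_weights([12.0, 3.5, 0.8, 7.1, 1.6])
    # Each source's expected signal is scaled by its weight before the
    # weighted sum is compared against the background expectation.
    print(weights)
    ```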

    Use of SMS texts for facilitating access to online alcohol interventions: a feasibility study

    A41 Use of SMS texts for facilitating access to online alcohol interventions: a feasibility study. In: Addiction Science & Clinical Practice 2017, 12(Suppl 1): A41

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for simulating a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented as an end-to-end set of GPU-optimized algorithms. The algorithms are written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speedup of four orders of magnitude over the equivalent CPU version: simulating the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
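    As a rough illustration of the Numba workflow the abstract describes (a plain Python function compiled into a CUDA kernel and launched with one thread per readout channel), here is a minimal sketch. The kernel body, the toy attenuation model, and all names are invented for illustration; they are not taken from the DUNE simulator.

    ```python
    # Hedged sketch: a Python function compiled to a CUDA kernel with Numba,
    # one GPU thread per pixel. The physics model here is a placeholder.
    import math
    import numpy as np
    from numba import cuda

    @cuda.jit
    def induced_current(charge, drift, current, tau):
        """Toy kernel: each thread computes the current on one pixel."""
        i = cuda.grid(1)                  # global thread index
        if i < charge.size:
            # Exponential attenuation with drift distance -- illustrative only.
            current[i] = charge[i] * math.exp(-drift[i] / tau)

    n_pixels = 1000
    charge = np.random.rand(n_pixels).astype(np.float32)
    drift = np.random.rand(n_pixels).astype(np.float32)
    current = np.zeros(n_pixels, dtype=np.float32)

    # Launch configuration: enough blocks of 128 threads to cover all pixels.
    threads = 128
    blocks = (n_pixels + threads - 1) // threads
    induced_current[blocks, threads](charge, drift, current, np.float32(0.5))
    ```

    Because every pixel is independent, the kernel parallelizes trivially, which is the property the abstract credits for the large speedup on many-channel pixelated readouts.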

    Prospective memory and ageing paradox with event-based tasks : A study of young, young-old, and old-old participants

    Research on ageing and prospective memory (remembering to do something in the future) has produced paradoxical findings, whereby older adults are often impaired in the laboratory but perform significantly better than younger adults in naturalistic settings. Nevertheless, very few studies have examined prospective memory both in and outside the laboratory using the same sample of young and old participants. Moreover, most naturalistic studies have used time-based tasks, and it is unclear whether the prospective memory and ageing paradox extends to event-based tasks. In this study, 72 young (18–30 years), 79 young-old (61–70 years), and 72 old-old (71–80 years) participants completed several event-based tasks in and outside the laboratory. Results showed that the ageing paradox does exist for event-based tasks but manifests itself differently than in time-based tasks. Younger adults outperformed old-old participants in two laboratory event-based tasks, but there were no age effects for a naturalistic task completed at home (remembering to write the date and time in the upper left corner of a questionnaire). The young and old-old also did not differ in remembering to retrieve a wristwatch from a pocket at the end of the laboratory session. This indicates that the paradox may be due to differences in ongoing task demands in the lab and everyday life, rather than to location per se. The findings call for a concentrated effort towards a theory of cognitive ageing that identifies the variables that do, or do not, account for this paradox.