
    Proposed method for searches of gravitational waves from PKS 2155-304 and other blazar flares

    We propose to search for gravitational waves from PKS 2155-304 as well as other blazars. PKS 2155-304 emitted a long-duration energetic flare in July 2006, with a total isotropic equivalent energy released in TeV gamma rays of approximately 10^{45} ergs. Any possible gravitational wave signals associated with this outburst should be seen by gravitational wave detectors at the same time as the electromagnetic signal. During this flare, the two LIGO interferometers at Hanford and the GEO detector were in operation and collecting data. For this search we will use the data from multiple gravitational wave detectors. The method we use for this purpose is a coherent network analysis algorithm called RIDGE. To estimate the sensitivity of the search, we perform numerical simulations. The sensitivity to estimated gravitational wave energy at the source is about 2.5 × 10^{55} ergs for a detection probability of 20%. For this search, an end-to-end analysis pipeline has been developed, which takes into account the motion of the source across the sky.
    Comment: 10 pages, 7 figures. Contribution to the 12th Gravitational Wave Data Analysis Workshop. Submitted to Classical and Quantum Gravity. Changes in response to referee comments.
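    The abstract's sensitivity figure (a source energy at which 20% of injected signals are detected) comes from numerical simulations. A minimal toy sketch of that idea, assuming a simple matched-filter statistic on Gaussian noise (the actual RIDGE pipeline is a far more involved coherent network analysis; every name and number here is illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def detection_probability(amplitude, threshold=4.0, n_trials=2000, n_samples=128):
        """Monte Carlo estimate of detection probability for injections
        of a known sinusoidal template at the given amplitude."""
        template = np.sin(2 * np.pi * np.arange(n_samples) / 16.0)
        norm = np.sqrt(np.dot(template, template))
        detections = 0
        for _ in range(n_trials):
            data = rng.standard_normal(n_samples) + amplitude * template
            # Matched-filter-like statistic against the known template
            stat = np.dot(data, template) / norm
            if stat > threshold:
                detections += 1
        return detections / n_trials

    # Sweep injection amplitudes to trace out the sensitivity curve;
    # the amplitude where this crosses the target probability (e.g. 20%)
    # plays the role of the quoted energy sensitivity.
    for amp in [0.2, 0.4, 0.6, 0.8, 1.0]:
        print(f"amplitude {amp:.1f}: detection probability {detection_probability(amp):.2f}")
    ```

    In the real search the injected quantity would be gravitational wave energy at the source (propagated through detector responses and sky motion), not a raw amplitude, but the injection-and-threshold structure is the same.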

    The STRESS Method for Boundary-point Performance Analysis of End-to-end Multicast Timer-Suppression Mechanisms

    Evaluation of Internet protocols usually uses random scenarios or scenarios based on designers' intuition. Such an approach may be useful for average-case analysis but does not cover boundary-point (worst- or best-case) scenarios. To synthesize boundary-point scenarios, a more systematic approach is needed. In this paper, we present a method for automatic synthesis of worst- and best-case scenarios for protocol boundary-point evaluation. Our method uses a fault-oriented test generation (FOTG) algorithm for searching the protocol and system state space to synthesize these scenarios. The algorithm is based on a global finite state machine (FSM) model. We extend the algorithm with timing semantics to handle end-to-end delays and address performance criteria. We introduce the notion of a virtual LAN to represent delays of the underlying multicast distribution tree. The algorithms used in our method perform implicit backward search using branch-and-bound techniques, starting from given target events. This aims to reduce the search complexity drastically. As a case study, we use our method to evaluate variants of the timer suppression mechanism, used in various multicast protocols, with respect to two performance criteria: overhead of response messages and response time. Simulation results for reliable multicast protocols show that our method provides a scalable way to synthesize worst-case scenarios automatically. Results obtained using stress scenarios differ dramatically from those obtained through average-case analyses. We hope our method will serve as a model for applying systematic scenario generation to other multicast protocols.
    Comment: 24 pages, 10 figures, IEEE/ACM Transactions on Networking (ToN), to appear.
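    The timer suppression mechanism the case study stresses can be sketched in a few lines. This is a hypothetical toy version of the paper's virtual-LAN model (uniform pairwise delay, exhaustive rather than branch-and-bound search; all names are illustrative): each receiver sets a suppression timer and stays silent if another receiver's response arrives before its own timer fires. The worst-case "stress" scenario is the delay that maximizes response overhead.

    ```python
    def count_responses(timers, delay):
        """Count responders given per-receiver timer values and a uniform
        one-way delay between receivers (the virtual-LAN abstraction).
        Receiver i is suppressed if some other response reaches it first."""
        responders = []
        for i, t in enumerate(timers):
            suppressed = any(other_t + delay <= t
                             for j, other_t in enumerate(timers) if j != i)
            if not suppressed:
                responders.append(i)
        return len(responders)

    def worst_case_delay(timers, candidate_delays):
        """Exhaustively pick the delay that maximizes response overhead,
        standing in for the paper's implicit backward branch-and-bound search."""
        return max(candidate_delays, key=lambda d: count_responses(timers, d))

    timers = [1, 2, 3, 4]
    # A large delay defeats suppression entirely: every receiver responds.
    print(count_responses(timers, delay=10))  # -> 4
    # A small delay lets the earliest timer suppress everyone else.
    print(count_responses(timers, delay=0))   # -> 1
    ```

    The gap between those two outputs is exactly the average-case vs. boundary-point discrepancy the abstract reports: random delay choices rarely land on the scenario where suppression fails completely.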

    Z39.50 broadcast searching and Z-server response times: perspectives from CC-interop

    This paper begins by briefly outlining the evolution of Z39.50 and current trends, including the work of the JISC CC-interop project. The research crux of the paper is an investigation into Z39.50 server (Z-server) response times in a broadcast (parallel) searching environment. Customised software was configured to broadcast a search to all test Z-servers once an hour, for eleven weeks, and the results were logged for analysis. Most Z-servers responded rapidly. 'Network congestion' and local OPAC usage were not found to significantly influence Z-server performance. Response time issues encountered by implementers may instead result from non-response by the Z-server and how Z-client software deals with this. The influence of 'quick and dirty' Z39.50 implementations is also identified as a potential cause of slow broadcast searching. The paper indicates various areas for further research, including setting shorter time-outs and greater end-user behavioural research to ascertain user requirements in this area. The influence of more complex (e.g. Boolean) searches on response times, and of suboptimal Z39.50 implementations, is also highlighted for further study. This paper informs the LIS research community and has practical implications for those establishing Z39.50-based distributed systems, as well as those in the Web Services community. The paper challenges the popular LIS opinion that Z39.50 is inherently sluggish and thus unsuitable for the demands of the modern user.
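    The broadcast-searching pattern the paper studies, and the time-out recommendation it motivates, can be sketched as: query every server in parallel and bound the wait per search, so one non-responding Z-server cannot stall the whole result set. A minimal sketch, assuming a stand-in `fake_z_server_search` in place of a real Z39.50 client call (server names and latencies here are invented):

    ```python
    import concurrent.futures
    import time

    def fake_z_server_search(server, seconds):
        """Simulated Z-server search: network + OPAC latency, then a result."""
        time.sleep(seconds)
        return (server, "hits")

    def broadcast_search(latencies, timeout=1.0):
        """Broadcast the search to all servers in parallel, collecting
        results and recording servers that miss the time-out."""
        results, timed_out = {}, []
        with concurrent.futures.ThreadPoolExecutor() as pool:
            futures = {pool.submit(fake_z_server_search, s, lat): s
                       for s, lat in latencies.items()}
            for fut, server in futures.items():
                try:
                    name, hits = fut.result(timeout=timeout)
                    results[name] = hits
                except concurrent.futures.TimeoutError:
                    timed_out.append(server)
        return results, timed_out

    latencies = {"fast-opac": 0.05, "slow-opac": 1.2}
    results, timed_out = broadcast_search(latencies, timeout=0.3)
    print(sorted(results))  # -> ['fast-opac']
    print(timed_out)        # -> ['slow-opac']
    ```

    A shorter time-out, as the paper suggests exploring, trades completeness of the merged result set for a bounded worst-case response time seen by the end user.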

    Book Review: Deane-Peter Baker, Tayloring Reformed Epistemology: Charles Taylor, Alvin Plantinga and the De Jure Challenge to Christian Belief, London, SCM Press, 2007

    Is Christian belief justified, rational or warranted? The search for a response is the heart of the de jure challenge. The very search itself can be perilous, as it can fall into reduction, apologetics or even 'lite' analytic, doxastic practices. The reductive temptation is to end up answering the de facto question of whether belief in God is true. The fall into apologetics, another attempt to avoid the de jure challenge, is to spend more time defending one's own position to the detriment of developing an engaging epistemological and theological imagination. Lastly, the fall into 'lite' analytic and doxastic practice is, for example, to utilise analogies and hypothetical creations with a 'thin' perceptual practice. Observant of these temptations, Deane-Peter Baker invites the reader on a journey into the world of Reformed epistemology. He does this in two major parts.