
    Multibaseline gravitational wave radiometry

    We present a statistic for the detection of stochastic gravitational wave backgrounds (SGWBs) using radiometry with a network of multiple baselines. We also quantitatively compare the sensitivities of existing baselines and their network to SGWBs. We assess how the measurement accuracy of signal parameters, e.g., the sky position of a localized source, can improve when using a network of baselines, as compared to any of the single participating baselines. The search statistic itself is derived from the likelihood ratio of the cross correlation of the data across all possible baselines in a detector network and is optimal in Gaussian noise. Specifically, it is the likelihood ratio maximized over the strength of the SGWB, and is called the maximized-likelihood ratio (MLR). One of the main advantages of using the MLR over past search strategies for inferring the presence or absence of a signal is that the former does not require the deconvolution of the cross correlation statistic. Therefore, it does not suffer from errors inherent to the deconvolution procedure and is especially useful for detecting weak sources. In the limit of a single baseline, it reduces to the detection statistic studied by Ballmer [Class. Quant. Grav. 23, S179 (2006)] and Mitra et al. [Phys. Rev. D 77, 042002 (2008)]. Unlike past studies, here the MLR statistic enables us to compare quantitatively the performance of a variety of baselines searching for a SGWB signal in (simulated) data. Although we use simulated noise and SGWB signals for making these comparisons, our method can be straightforwardly applied to real data. Comment: 17 pages and 19 figures
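    The structure of such a statistic can be sketched schematically. The notation below is illustrative, not taken from the paper: for a cross-correlation data vector X, an assumed signal shape s, an overall SGWB amplitude ε, and noise covariance Γ of the cross correlation, a Gaussian-noise log-likelihood ratio is quadratic in ε, so the maximization over the signal strength has a closed form:

```latex
\ln \Lambda(\epsilon) \;=\; \epsilon\,\langle s, X\rangle \;-\; \tfrac{1}{2}\,\epsilon^{2}\,\langle s, \Gamma s\rangle ,
\qquad
\ln \Lambda_{\mathrm{MLR}} \;=\; \max_{\epsilon \ge 0}\,\ln \Lambda(\epsilon)
\;=\; \frac{\max\!\big(\langle s, X\rangle,\,0\big)^{2}}{2\,\langle s, \Gamma s\rangle} .
```

    Note that evaluating this expression uses X directly, which is consistent with the abstract's point that no deconvolution of the cross correlation is needed.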

    Symmetry reduction and heuristic search for error detection in model checking

    The state explosion problem is the main limitation of model checking. Symmetries in the system being verified can be exploited in order to avoid this problem by defining an equivalence (symmetry) relation on the states of the system, which induces a semantically equivalent quotient system of smaller size. On the other hand, heuristic search algorithms can be applied to improve the bug-finding capabilities of model checking. Such algorithms use heuristic functions to guide the exploration. Best-first search is used to accelerate the search, while A* guarantees optimal error trails when combined with admissible estimates. We analyze some aspects of combining both approaches, concentrating on the problem of finding the optimal path to the equivalence class of a given error state. Experimental results evaluate our approach
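    The combination described, A* exploring only canonical representatives of symmetry classes, can be sketched in a few lines. This is a toy illustration under assumed conventions, not the paper's implementation: symmetry here is permutation of identical processes, so sorting a state tuple yields its canonical representative, and the "error trail" is a path to any member of the error state's equivalence class.

```python
import heapq

def canonical(state):
    # Canonical representative of a symmetry class: with full process
    # symmetry (permutation of identical components), sorting suffices.
    return tuple(sorted(state))

def a_star_to_error(start, successors, is_error, h):
    """A* over the symmetry quotient. h must be admissible (it never
    overestimates the distance to the nearest error class) for the
    returned error trail to be of optimal length."""
    open_heap = [(h(canonical(start)), 0, canonical(start), [start])]
    closed = {}  # best g-value seen per canonical representative
    while open_heap:
        f, g, state, path = heapq.heappop(open_heap)
        if is_error(state):
            return path
        if closed.get(state, float("inf")) <= g:
            continue  # already expanded this class at least as cheaply
        closed[state] = g
        for nxt in successors(state):
            rep = canonical(nxt)
            heapq.heappush(open_heap, (g + 1 + h(rep), g + 1, rep, path + [nxt]))
    return None

# Toy symmetric system (illustrative only): two interchangeable counters,
# each transition increments one of them; the "error" is a counter reaching 3.
def successors(state):
    return [tuple(v + 1 if i == j else v for j, v in enumerate(state))
            for i in range(len(state))]

def is_error(state):
    return max(state) >= 3

def h(state):
    return max(0, 3 - max(state))  # admissible: at least this many steps remain
```

    Running `a_star_to_error((0, 0), successors, is_error, h)` returns a shortest trail of three transitions, even though the search only ever expands one representative per symmetry class.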

    Assessment of the minimalist approach to computer user documentation

    The minimalist approach (Carroll, 1990a) advocates the development of a radically different type of manual when compared to a conventional one. For example, the manual should proceed almost directly to procedural skills development rather than building a conceptual model first. It ought to focus on authentic tasks practised in context, as opposed to mock exercises and isolated practice. In addition, it should stimulate users to exploit their knowledge and thinking, as opposed to imposing the writer's view and discussing everything that users should see or know.

    In the first part of the paper the construction of a tutorial based on the minimalist principles is described. A parallel is drawn with constructivism, with which minimalism shares important notions of instruction. In the second part, an experiment is described in which the minimal manual was tested against a conventional one. The outcome favoured the new manual. For example, minimal manual users completed about 50% more tasks successfully on a performance test and displayed significantly more self-reliance (e.g. more self-initiated error-recoveries, and fewer manual consultations)

    Movement around real and virtual cluttered environments

    Two experiments investigated participants' ability to search for targets in a cluttered small-scale space. The first experiment was conducted in the real world with two field of view conditions (full vs. restricted), and participants found the task trivial to perform in both. The second experiment used the same search task but was conducted in a desktop virtual environment (VE), and investigated two movement interfaces and two visual scene conditions. Participants restricted to forward-only movement performed the search task quicker and more efficiently (visiting fewer targets) than those who used an interface that allowed more flexible movement (forward, backward, left, right, and diagonal). Also, participants using a high fidelity visual scene performed the task significantly quicker and more efficiently than those who used a low fidelity scene. The performance differences between all the conditions decreased with practice, but the performance of the best VE group approached that of the real-world participants. These results indicate the importance of using high fidelity scenes in VEs, and suggest that the use of a simple control system is sufficient for maintaining one's spatial orientation during searching

    Ariadne: An interface to support collaborative database browsing: Technical Report CSEG/3/1995

    This paper outlines issues in the learning of information searching skills. We report on our observations of the learning of browsing skills and the subsequent iterative development and testing of the Ariadne system – intended to investigate and support the collaborative learning of search skills. A key part of this support is a mechanism for recording an interaction history and providing students with a visualisation of that history that they can reflect and comment upon
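    The core of such a mechanism is a simple append-only event log with a replayable view. The sketch below is purely illustrative, with all names and fields assumed rather than taken from the Ariadne system, which is not specified at this level in the abstract:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class HistoryEvent:
    # Hypothetical fields: who acted, what kind of action, and its payload.
    actor: str       # e.g. a student's name
    action: str      # e.g. "query", "follow-link", "annotate"
    detail: str
    timestamp: datetime = field(default_factory=datetime.now)

@dataclass
class InteractionHistory:
    events: List[HistoryEvent] = field(default_factory=list)

    def record(self, actor, action, detail):
        # Append-only: the history is never edited, only extended.
        self.events.append(HistoryEvent(actor, action, detail))

    def visualise(self):
        # One line per event, in chronological order, so a group of
        # students can review and comment on what was tried and when.
        return "\n".join(f"{e.actor}: {e.action} -> {e.detail}"
                         for e in self.events)
```

    A textual rendering is of course far cruder than a graphical history visualisation; the point of the sketch is only the record-then-replay structure.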

    Classical Optimizers for Noisy Intermediate-Scale Quantum Devices

    We present a collection of optimizers tuned for usage on Noisy Intermediate-Scale Quantum (NISQ) devices. Optimizers have a range of applications in quantum computing, including the Variational Quantum Eigensolver (VQE) and Quantum Approximate Optimization Algorithm (QAOA). They are also used for calibration tasks, hyperparameter tuning, machine learning, etc. We analyze the efficiency and effectiveness of different optimizers in a VQE case study. VQE is a hybrid algorithm, with a classical minimizer step driving the next evaluation on the quantum processor. While most results to date have concentrated on tuning the quantum VQE circuit, we show that, in the presence of quantum noise, the classical minimizer step needs to be carefully chosen to obtain correct results. We explore state-of-the-art gradient-free optimizers capable of handling noisy, black-box cost functions and stress-test them using a quantum circuit simulation environment with noise injection capabilities on individual gates. Our results indicate that specifically tuned optimizers are crucial to obtaining valid science results on NISQ hardware, and will likely remain necessary even for future fault-tolerant circuits
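    The shape of the problem, minimizing a noisy black-box cost with a gradient-free method, can be illustrated without any quantum hardware. The sketch below uses SPSA (simultaneous perturbation stochastic approximation), one well-known optimizer in this family; the noisy cost function is a stand-in for a shot-noise-corrupted expectation value, and all parameter values are illustrative assumptions, not settings from the paper:

```python
import math
import random

def spsa_minimize(cost, x0, iterations=200, a=0.2, c=0.1, seed=0):
    """Minimal SPSA: estimates the gradient from just two noisy cost
    evaluations per iteration, regardless of the parameter count."""
    rng = random.Random(seed)
    x = list(x0)
    for k in range(1, iterations + 1):
        ak = a / k ** 0.602   # standard decaying gain schedules (Spall)
        ck = c / k ** 0.101
        delta = [rng.choice((-1.0, 1.0)) for _ in x]  # random +/-1 perturbation
        plus = cost([xi + ck * di for xi, di in zip(x, delta)])
        minus = cost([xi - ck * di for xi, di in zip(x, delta)])
        ghat = [(plus - minus) / (2.0 * ck * di) for di in delta]
        x = [xi - ak * gi for xi, gi in zip(x, ghat)]
    return x

def noisy_cost(theta, rng=random.Random(1)):
    # Smooth landscape with its minimum at theta = (pi/2, ..., pi/2),
    # plus Gaussian "shot noise" mimicking a finite-shot measurement.
    clean = sum(1.0 - math.sin(t) for t in theta)
    return clean + rng.gauss(0.0, 0.05)
```

    Running `spsa_minimize(noisy_cost, [0.3, 0.3])` drives both parameters toward pi/2 despite the noise; a finite-difference gradient descent with a small step, by contrast, would amplify that same noise, which is the qualitative point the abstract makes about choosing the classical minimizer carefully.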