17,100 research outputs found

    Using Pinch Gloves(TM) for both Natural and Abstract Interaction Techniques in Virtual Environments

    Usable three-dimensional (3D) interaction techniques are difficult to design, implement, and evaluate. One reason for this is a poor understanding of the advantages and disadvantages of the wide range of 3D input devices, and of the mapping between input devices and interaction techniques. We present an analysis of Pinch Gloves™ and their use as input devices for virtual environments (VEs). We have developed a number of novel and usable interaction techniques for VEs using the gloves, including a menu system, a technique for text input, and a two-handed navigation technique. User studies have indicated the usability and utility of these techniques.
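    As a hedged illustration of how pinch contacts can drive both natural and abstract techniques, the sketch below maps closed finger-contact chords to commands. The chord names and command bindings are hypothetical and are not the glove API or the specific techniques implemented in the paper.

```python
# Hypothetical sketch: mapping pinch-glove finger contacts to commands.
# Chord names and actions are illustrative only, not the paper's design.

PINCH_CHORDS = {
    frozenset({"R_thumb", "R_index"}): "menu_select",          # pinch to confirm
    frozenset({"R_thumb", "R_middle"}): "menu_next_item",      # cycle menu entries
    frozenset({"L_thumb", "L_index"}): "menu_open",            # bring up the menu
    frozenset({"L_thumb", "R_thumb"}): "navigate_two_handed",  # two-handed travel mode
}

def handle_pinch_event(contacts):
    """Return the command bound to the currently closed contact set, if any."""
    return PINCH_CHORDS.get(frozenset(contacts), "no_op")

if __name__ == "__main__":
    print(handle_pinch_event({"R_thumb", "R_index"}))   # -> menu_select
    print(handle_pinch_event({"R_index", "R_middle"}))  # -> no_op (unbound chord)
```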

    Theory and Calibration of Swap Market Models

    This paper introduces a general framework for market models, named the Market Model Approach, through the concept of admissible sets of forward swap rates spanning a given tenor structure. We relate this concept to results in graph theory by showing that a set is admissible if and only if the associated graph is a tree. This connection enables us to enumerate all admissible models for a given tenor structure. Three main classes are identified within this framework, corresponding to the co-terminal, co-initial, and co-sliding models. We prove that the LIBOR market model is the only admissible model of the co-sliding type. By focusing on the co-terminal model in a lognormal setting, we develop and compare several approximating analytical formulae for caplets, while swaptions can be priced by a simple Black-type formula. A novel calibration technique is introduced to allow simultaneous calibration to caplet and swaption prices. Empirical calibration of the co-terminal model is shown to be faster, more robust and more efficient than the same procedure applied to the LIBOR market model. We then argue that the co-terminal approach is the simplest and most convenient market model for pricing and hedging a large variety of exotic interest-rate derivatives.
    Keywords: Swap Market Model, Cap, Swaption, Calibration, Graph Theory
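    The tree characterization of admissibility lends itself to a short check. The sketch below assumes each forward swap rate is identified with the pair of tenor dates (start, end) of its underlying swap and tests whether those edges form a tree on the tenor structure (connected and acyclic); the function name and examples are illustrative, not taken from the paper.

```python
# Hedged sketch: test whether a set of forward swap rates is admissible in the
# sense described above, i.e. whether the graph whose vertices are the tenor
# dates T_0..T_n and whose edges are the swaps' (start, end) pairs is a tree.

def is_admissible(n_dates, swaps):
    """swaps: list of (i, j) tenor-date index pairs, one per forward swap rate."""
    # A tree on n_dates vertices has exactly n_dates - 1 edges.
    if len(swaps) != n_dates - 1:
        return False
    parent = list(range(n_dates))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for i, j in swaps:
        ri, rj = find(i), find(j)
        if ri == rj:          # this edge would close a cycle
            return False
        parent[ri] = rj
    return True               # n-1 edges and no cycle => spanning tree

# Co-terminal family on dates T_0..T_3: every swap ends at T_3 (a star, hence a tree).
print(is_admissible(4, [(0, 3), (1, 3), (2, 3)]))   # True
# Swaps whose date pairs close a cycle are not admissible.
print(is_admissible(4, [(0, 1), (1, 2), (0, 2)]))   # False
```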

    TARGET: Rapid Capture of Process Knowledge

    TARGET (Task Analysis/Rule Generation Tool) represents a new breed of tool that blends graphical process-flow modeling capabilities with the function of a top-down reporting facility. Since NASA personnel frequently perform tasks that are primarily procedural in nature, TARGET models mission or task procedures and generates hierarchical reports as part of the process capture and analysis effort. Historically, capturing knowledge has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent the expert's knowledge in a useful form. Although much research has been devoted to methodologies and computer software that aid in the capture and representation of some types of knowledge, procedural knowledge has received relatively little attention. In essence, TARGET is one of the first tools of its kind, commercial or institutional, designed to support this type of knowledge capture. This paper will describe the design and development of TARGET for the acquisition and representation of procedural knowledge. The strategies employed by TARGET to support use by knowledge engineers, subject matter experts, programmers, and managers will be discussed, including the method by which the tool employs its graphical user interface to generate a task hierarchy report. Next, the approach to generating production rules for incorporation into a CLIPS-based expert system will be elaborated. TARGET also permits experts to describe procedural tasks visually, providing a common medium for knowledge refinement by the expert community and the knowledge engineer, thereby making knowledge consensus possible. The paper briefly touches on the verification and validation issues facing the CLIPS rule-generation aspects of TARGET. A description of efforts to support TARGET's interoperability across PCs, Macintoshes, and UNIX workstations concludes the paper.
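    To give a rough sense of the rule-generation step, the toy sketch below turns an ordered list of procedure steps into chained production rules in CLIPS syntax. The fact and rule names are hypothetical; TARGET's actual rule format is not described in this abstract.

```python
# Hypothetical sketch: emitting chained CLIPS production rules from an ordered
# procedure. The fact/rule naming scheme is illustrative only and is not
# TARGET's actual output format.

def procedure_to_clips(procedure_name, steps):
    rules = []
    for i, step in enumerate(steps):
        # Each rule fires when the previous step completes (or the procedure starts).
        trigger = (f"(procedure-started {procedure_name})" if i == 0
                   else f"(step-completed {steps[i - 1]})")
        rules.append(
            f"(defrule {procedure_name}-{i:02d}-{step}\n"
            f"   {trigger}\n"
            f"   =>\n"
            f"   (assert (step-active {step})))"
        )
    return "\n\n".join(rules)

print(procedure_to_clips("power-up", ["check-breakers", "enable-bus", "verify-telemetry"]))
```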

    An Attempt to Probe the Radio Jet Collimation Regions in NGC 4278, NGC 4374 (M84), and NGC 6166

    NRAO Very Long Baseline Array (VLBA) observations of NGC 4278, NGC 4374 (M84), NGC 6166, and M87 (NGC 4486) have been made at 43 GHz in an effort to image the jet collimation region. This is the first attempt to image the first three sources at 43 GHz using Very Long Baseline Interferometry (VLBI) techniques. These three sources were chosen because their estimated black hole masses and distances imply Schwarzschild radii with large angular sizes, giving hope that the jet collimation regions could be studied. Phase referencing was utilized for the three sources because of their expected low flux densities. M87 was chosen as the calibrator for NGC 4374 because it satisfied the phase-referencing requirements: it is close to the source and sufficiently strong. Having observed M87 for a long integration time, we detected its sub-parsec jet, allowing us to confirm previous high-resolution observations by Junor, Biretta & Livio, which indicated a wide opening angle near the base of the jet. Phase referencing successfully improved our image sensitivity, yielding detections and providing accurate positions for NGC 4278, NGC 4374, and NGC 6166. These sources are point-dominated but show suggestions of extended structure in the direction of the large-scale jets. However, higher sensitivity will be required to study their sub-parsec jet structure.
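    The selection criterion mentioned above, a Schwarzschild radius that subtends a large angle, follows from a simple formula, theta = 2GM/(c^2 D). The sketch below evaluates it for M87 using present-day estimates (M ≈ 6.5×10^9 solar masses, D ≈ 16.8 Mpc) purely as a familiar illustration; the mass and distance estimates adopted for the other sources are not given in this abstract.

```python
# Hedged sketch: angular size subtended by a black hole's Schwarzschild radius,
# theta = 2 G M / (c^2 D). The M87 values are modern estimates used only as an
# illustration, not the numbers adopted in the paper.
import math

G = 6.674e-11          # m^3 kg^-1 s^-2
C = 2.998e8            # m s^-1
M_SUN = 1.989e30       # kg
PC = 3.086e16          # m

def schwarzschild_angle_microarcsec(mass_msun, distance_mpc):
    r_s = 2 * G * mass_msun * M_SUN / C**2        # Schwarzschild radius in metres
    d = distance_mpc * 1e6 * PC                   # distance in metres
    return math.degrees(r_s / d) * 3600 * 1e6     # radians -> microarcseconds

# M87 (illustrative): roughly 7-8 microarcseconds per Schwarzschild radius.
print(f"{schwarzschild_angle_microarcsec(6.5e9, 16.8):.1f} uas")
```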

    The Discovery of Extended Thermal X-ray Emission from PKS 2152-699: Evidence for a `Jet-cloud' Interaction

    A Chandra ACIS-S observation of PKS 2152-699 reveals thermal emission from a diffuse region around the core and a hotspot located 10" northeast of the core. This is the first detection of thermal X-ray radiation on kiloparsec scales from an extragalactic radio source. Two other hotspots, located 47" north-northeast and 26" southwest of the core, were also detected. Using a Raymond-Smith model, the first hotspot can be characterized by a thermal plasma temperature of 2.6×10^6 K and an electron number density of 0.17 cm^-3. These values correspond to a cooling time of about 1.6×10^7 yr. In addition, an emission line from the hotspot, possibly Fe XXV, was detected at a rest wavelength of 10.04 Å. The thermal X-ray emission from the first hotspot is offset from the radio emission but is coincident with optical filaments detected with broadband filters of HST/WFPC2. The best explanation for the X-ray, radio, and optical emission is that of a `jet-cloud' interaction. The diffuse emission around the nucleus of PKS 2152-699 can be modeled as a thermal plasma with a temperature of 1.2×10^7 K and a luminosity of 1.8×10^41 erg s^-1. This emission appears to be asymmetric, with a small extension toward Hotspot A, similar to a jet. An optical hotspot (EELR) is seen less than an arcsecond away from this extension in the direction of the core. This indicates that the extension may be caused by the jet interacting with an inner ISM cloud, but entrainment of hot gas is unavoidable. Future observations are discussed.
    Comment: To appear in the Astrophysical Journal; 21 pages, 5 PostScript figures, 1 table, AASTeX v. 5.
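    For orientation, the kiloparsec scales quoted above follow from the angular offsets and the source distance. The sketch below assumes a redshift of roughly z ≈ 0.028 for PKS 2152-699 and a simple Hubble-law distance with H0 = 70 km/s/Mpc; these are illustrative assumptions, not values taken from the paper.

```python
# Hedged sketch: convert the quoted angular offsets to projected physical
# distances, assuming z ~ 0.028 for PKS 2152-699 and a pure Hubble-law
# distance (H0 = 70 km/s/Mpc). Small-angle and low-z approximations are used.
C_KMS = 2.998e5          # speed of light, km/s
H0 = 70.0                # km/s/Mpc (assumed)
Z = 0.028                # approximate redshift (assumed)
ARCSEC_PER_RAD = 206265.0

distance_mpc = C_KMS * Z / H0                         # ~120 Mpc
kpc_per_arcsec = distance_mpc * 1e3 / ARCSEC_PER_RAD  # ~0.58 kpc per arcsec

for offset in (10.0, 26.0, 47.0):                     # hotspot offsets quoted above
    print(f'{offset:4.0f}" -> {offset * kpc_per_arcsec:5.1f} kpc projected')
```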

    Bayes factors for peri-null hypotheses


    A Critical Evaluation of the FBST ev for Bayesian Hypothesis Testing

    The “Full Bayesian Significance Test e-value”, henceforth FBST ev, has received increasing attention across a range of disciplines including psychology. We show that the FBST ev leads to four problems: (1) the FBST ev cannot quantify evidence in favor of a null hypothesis and therefore also cannot discriminate “evidence of absence” from “absence of evidence”; (2) the FBST ev is susceptible to sampling to a foregone conclusion; (3) the FBST ev violates the principle of predictive irrelevance, such that it is affected by data that are equally likely to occur under the null hypothesis and the alternative hypothesis; (4) the FBST ev suffers from the Jeffreys-Lindley paradox in that it does not include a correction for selection. These problems also plague the frequentist p-value. We conclude that although the FBST ev may be an improvement over the p-value, it does not provide a reasonable measure of evidence against the null hypothesis.
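    To make the quantity under discussion concrete: for a point null θ = 0 and a unimodal, symmetric posterior, the FBST e-value reduces to a tail-area computation. The sketch below works out the normal-posterior special case as a minimal illustration, not the general definition over arbitrary posteriors; note how closely it mirrors a two-sided posterior tail probability, which echoes the point that these problems also plague the p-value.

```python
# Minimal sketch of the FBST e-value for H0: theta = 0 under a normal
# posterior theta | data ~ N(m, s^2). The "tangential" set is
# T = {theta : p(theta | data) > p(0 | data)} = {theta : |theta - m| < |m|},
# so ev = 1 - Pr(T | data) = 2 * Phi(-|m| / s).
from math import erf, sqrt

def normal_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def fbst_ev_normal(posterior_mean, posterior_sd):
    """e-value against theta = 0 for a N(mean, sd^2) posterior."""
    return 2.0 * normal_cdf(-abs(posterior_mean) / posterior_sd)

print(fbst_ev_normal(0.0, 1.0))   # 1.0: no evidence against H0, yet ev cannot
                                  # quantify evidence *for* H0 (problem 1 above)
print(fbst_ev_normal(2.0, 1.0))   # ~0.046: small ev, evidence against H0
```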