
    Substructure Discovery Using Minimum Description Length and Background Knowledge

    The ability to identify interesting and repetitive substructures is an essential component to discovering knowledge in structural data. We describe a new version of our SUBDUE substructure discovery system based on the minimum description length principle. The SUBDUE system discovers substructures that compress the original data and represent structural concepts in the data. By replacing previously-discovered substructures in the data, multiple passes of SUBDUE produce a hierarchical description of the structural regularities in the data. SUBDUE uses a computationally-bounded inexact graph match that identifies similar, but not identical, instances of a substructure and finds an approximate measure of closeness of two substructures when under computational constraints. In addition to the minimum description length principle, other background knowledge can be used by SUBDUE to guide the search towards more appropriate substructures. Experiments in a variety of domains demonstrate SUBDUE's ability to find substructures capable of compressing the original data and to discover structural concepts important to the domain. Description of Online Appendix: This is a compressed tar file containing the SUBDUE discovery system, written in C. The program accepts as input databases represented in graph form, and will output discovered substructures with their corresponding value. Comment: See http://www.jair.org/ for an online appendix and other files accompanying this article
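
    To make the compression criterion concrete, here is a toy Python sketch of MDL-style substructure scoring. The value formula DL(G) / (DL(S) + DL(G|S)) follows the idea described above, but the bit-count encoding, the graph representation, and all numbers below are illustrative assumptions, not the released SUBDUE C code.

```python
import math

def description_length(num_vertices, num_edges, label_count):
    """Crude graph description length in bits: vertex labels plus an
    edge list whose endpoints are addressed with log2(num_vertices) bits.
    (Toy stand-in for SUBDUE's more careful MDL encoding.)"""
    vertex_bits = num_vertices * math.log2(max(label_count, 2))
    edge_bits = num_edges * (2 * math.log2(max(num_vertices, 2)) +
                             math.log2(max(label_count, 2)))
    return vertex_bits + edge_bits

def compression_value(graph, substructure, instances):
    """MDL value of a substructure: DL(G) / (DL(S) + DL(G|S)), where
    G|S is G with every instance collapsed to a single placeholder
    vertex. Graphs are (num_vertices, num_edges, label_count) triples."""
    gv, ge, gl = graph
    sv, se, sl = substructure
    # Each instance removes (sv - 1) vertices and se internal edges;
    # boundary edges are ignored in this toy encoding.
    cv = gv - instances * (sv - 1)
    ce = ge - instances * se
    dl_g = description_length(gv, ge, gl)
    dl_s = description_length(sv, se, sl)
    dl_g_given_s = description_length(cv, ce, gl + 1)
    return dl_g / (dl_s + dl_g_given_s)

# Example: a 100-vertex, 300-edge graph with 5 labels, and a triangle
# substructure appearing 20 times.  A value above 1 means it compresses.
print(compression_value((100, 300, 5), (3, 3, 2), 20))
```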

    Beetle fauna of the island of Tobago, Trinidad and Tobago, West Indies

    Tobago is a biologically rich but poorly investigated island. In this paper we report the occurrence of 672 species of beetles representing 69 families. Of these, only 95 had been previously reported from the island.

    Measuring eccentricity in binary black-hole initial data

    Initial data for evolving black-hole binaries can be constructed via many techniques, and can represent a wide range of physical scenarios. However, because of the way that different schemes parameterize the physical aspects of a configuration, it is not always clear what a given set of initial data actually represents. This is especially important for quasiequilibrium data constructed using the conformal thin-sandwich approach. Most initial-data studies have focused on identifying data sets that represent binaries in quasi-circular orbits. In this paper, we consider initial-data sets representing equal-mass black-hole binaries in eccentric orbits. We will show that effective-potential techniques can be used to calibrate initial data for black-hole binaries in eccentric orbits. We will also examine several different approaches, including post-Newtonian diagnostics, for measuring the eccentricity of an orbit. Finally, we propose the use of the "Komar-mass difference" as a useful, invariant means of parameterizing the eccentricity of relativistic orbits. Comment: 12 pages, 11 figures, submitted to Physical Review D, revtex
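
    As a rough illustration of one orbital-eccentricity diagnostic (a Newtonian-style estimate from oscillations in the coordinate separation, not any specific estimator compared in the paper), consider the following Python sketch; the trajectory in the example is synthetic.

```python
import numpy as np

def eccentricity_from_separation(t, r):
    """Rough eccentricity estimate from the coordinate separation r(t)
    of an inspiralling binary: fit and remove the slow secular inspiral,
    then read the eccentricity off the residual oscillation amplitude,
    e ~ delta_r / r_mean.  A generic diagnostic for illustration only."""
    trend = np.polyval(np.polyfit(t, r, 2), t)   # secular inspiral
    residual = r - trend                          # orbital-timescale wiggle
    return (residual.max() - residual.min()) / (2.0 * trend.mean())

# Example: a mildly eccentric orbit with a slow inspiral superposed.
t = np.linspace(0.0, 200.0, 2000)
r = (10.0 - 0.005 * t) * (1.0 + 0.05 * np.cos(0.5 * t))
print(eccentricity_from_separation(t, r))   # roughly 0.05
```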

    Gravitational waves from nonspinning black hole-neutron star binaries: dependence on equations of state

    We report results of a numerical-relativity simulation for the merger of a black hole-neutron star binary with a variety of equations of state (EOSs) modeled by piecewise polytropes. We focus in particular on the dependence of the gravitational waveform at the merger stage on the EOSs. The initial conditions are computed in the moving-puncture framework, assuming that the black hole is nonspinning and the neutron star has an irrotational velocity field. For a small mass ratio of the binary (e.g., M_BH/M_NS = 2, where M_BH and M_NS are the masses of the black hole and neutron star, respectively), the neutron star is tidally disrupted before it is swallowed by the black hole irrespective of the EOS. Especially for less-compact neutron stars, the tidal disruption occurs at a more distant orbit. The tidal disruption is reflected in a cutoff frequency of the gravitational-wave spectrum, above which the spectrum amplitude decreases exponentially. A clear relation is found between the cutoff frequency of the gravitational-wave spectrum and the compactness of the neutron star. This relation also depends weakly on the stiffness of the EOS in the core region of the neutron star, suggesting that not only the compactness but also the EOS at high density is reflected in gravitational waveforms. The mass of the disk formed after the merger shows a similar correlation with the EOS, whereas the spin of the remnant black hole depends primarily on the mass ratio of the binary, and only weakly on the EOS. Properties of the remnant disks are also analyzed. Comment: 27 pages, 21 figures; erratum added on Aug 5, 201
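
    To make the EOS parameterization concrete, here is a minimal Python sketch of a piecewise-polytropic pressure law with pressure continuity enforced at the dividing densities; the exponents, dividing density, and K0 below are illustrative placeholders, not the EOS parameters surveyed in the paper.

```python
def piecewise_polytrope(rho, rho_bounds, gammas, K0):
    """Pressure from a piecewise-polytropic EOS, P = K_i * rho**Gamma_i
    on each density interval, with the K_i chosen so that P is continuous
    at the dividing densities rho_bounds.  Placeholder units and values."""
    Ks = [K0]
    for i, rb in enumerate(rho_bounds):
        # Continuity at rb:  K_i * rb**G_i  =  K_{i+1} * rb**G_{i+1}
        Ks.append(Ks[i] * rb ** (gammas[i] - gammas[i + 1]))
    # Pick the piece whose density interval contains rho.
    piece = sum(1 for rb in rho_bounds if rho >= rb)
    return Ks[piece] * rho ** gammas[piece]

# Example: two pieces joined at rho = 5e14 g/cm^3 (illustrative numbers).
print(piecewise_polytrope(1.0e15, [5.0e14], [1.6, 3.0], 3.0e4))
```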

    Cognitive consequences of clumsy automation on high workload, high consequence human performance

    The growth of computational power has fueled attempts to automate more of the human role in complex problem solving domains, especially those where system faults have high consequences and where periods of high workload may saturate the performance capacity of human operators. Examples of these domains include flightdecks, space stations, air traffic control, nuclear power operation, ground satellite control rooms, and surgical operating rooms. Automation efforts may have unanticipated effects on human performance, particularly if they increase the workload at peak workload times or change the practitioners' strategies for coping with workload. Smooth and effective changes in automation require a detailed understanding of the cognitive tasks confronting the user; this has been called user-centered automation. The introduction of a new computerized technology in a group of hospital operating rooms used for heart surgery was observed. The study revealed how automation, especially 'clumsy automation', affects practitioner work patterns and suggests that clumsy automation constrains users in specific and significant ways. Users tailor both the new system and their tasks in order to accommodate the needs of process and production. The study of this tailoring may prove a powerful tool for exposing previously hidden patterns of user data processing, integration, and decision making which may, in turn, be useful in the design of more effective human-machine systems.

    Fragmentation of Nuclei at Intermediate and High Energies in Modified Cascade Model

    The process of nuclear multifragmentation, together with the evaporation and fission channels of the disintegration of excited remnants in nucleus-nucleus collisions, has been implemented using percolation theory and the intranuclear cascade model. Colliding nuclei are treated as face-centered-cubic lattices with nucleons occupying the nodes of the lattice. The site-bond percolation model is used. The code can be applied to the calculation of the fragmentation of nuclei in spallation and multifragmentation reactions. Comment: 19 pages, 10 figures
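
    As a generic illustration of the site-bond percolation ingredient (on a simple cubic lattice for brevity, whereas the model above places nucleons on a face-centered-cubic lattice), here is a minimal Python sketch of cluster identification with union-find; all parameters are illustrative.

```python
import random

def site_bond_clusters(L, p_site, p_bond, seed=0):
    """Site-bond percolation on a simple cubic L x L x L lattice: sites
    are occupied with probability p_site, bonds between occupied nearest
    neighbours are kept with probability p_bond, and clusters ("fragments")
    are found with union-find.  Returns cluster sizes, largest first."""
    rng = random.Random(seed)
    sites = {(x, y, z) for x in range(L) for y in range(L) for z in range(L)
             if rng.random() < p_site}
    parent = {s: s for s in sites}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for (x, y, z) in sites:
        for dx, dy, dz in ((1, 0, 0), (0, 1, 0), (0, 0, 1)):
            nb = (x + dx, y + dy, z + dz)
            if nb in sites and rng.random() < p_bond:
                union((x, y, z), nb)

    sizes = {}
    for s in sites:
        r = find(s)
        sizes[r] = sizes.get(r, 0) + 1
    return sorted(sizes.values(), reverse=True)

# Example: fragment-size distribution of a small "nucleus".
print(site_bond_clusters(L=6, p_site=0.8, p_bond=0.5))
```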

    From one solution of a 3-satisfiability formula to a solution cluster: Frozen variables and entropy

    A solution to a 3-satisfiability (3-SAT) formula can be expanded into a cluster, all other solutions of which are reachable from this one through a sequence of single-spin flips. Some variables in the solution cluster are frozen to the same spin values by one of two different mechanisms: frozen-core formation and long-range frustrations. While frozen cores are identified by a local whitening algorithm, long-range frustrations are very difficult to trace, and they make an entropic belief-propagation (BP) algorithm fail to converge. For BP to reach a fixed point, the spin values of a tiny fraction of variables (chosen according to the whitening algorithm) are externally fixed during the iteration. From the calculated entropy values, we infer that, for a large random 3-SAT formula with constraint density close to the satisfiability threshold, the solutions obtained by the survey-propagation or the walksat algorithm belong neither to the most dominating clusters of the formula nor to the most abundant clusters. This work indicates that a single solution cluster of a random 3-SAT formula may have further community structures. Comment: 13 pages, 6 figures. Final version as published in PR
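
    For readers unfamiliar with whitening, here is a rough Python sketch of the idea behind a local whitening procedure: variables that can be marked unconstrained ("white") are removed iteratively, and whatever survives is the frozen core of the solution's cluster. The data representation and the tiny example formula are assumptions for illustration, not the paper's implementation.

```python
def whiten(clauses, assignment):
    """Whitening of a SAT solution.  A variable becomes white if every
    clause it appears in is either satisfied by some other non-white
    literal or already contains a white variable.  Clauses are lists of
    nonzero ints (DIMACS-style literals); assignment maps var -> bool.
    Returns the variables that were never whitened (the frozen core)."""
    white = set()
    changed = True
    while changed:
        changed = False
        for v in assignment:
            if v in white:
                continue
            releasable = True
            for clause in clauses:
                if v not in [abs(l) for l in clause]:
                    continue
                satisfied_elsewhere = any(
                    abs(l) != v and abs(l) not in white and
                    assignment[abs(l)] == (l > 0)
                    for l in clause)
                has_white = any(abs(l) in white for l in clause if abs(l) != v)
                if not (satisfied_elsewhere or has_white):
                    releasable = False
                    break
            if releasable:
                white.add(v)
                changed = True
    return sorted(set(assignment) - white)

# Example: (x1 or x2 or x3) and (not x1 or x2 or not x3)
clauses = [[1, 2, 3], [-1, 2, -3]]
assignment = {1: True, 2: True, 3: False}
print(whiten(clauses, assignment))   # empty list: nothing is frozen here
```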

    Black hole evolution by spectral methods

    Current methods of evolving a spacetime containing one or more black holes are plagued by instabilities that prohibit long-term evolution. Some of these instabilities may be due to the numerical method used, traditionally finite differencing. In this paper, we explore the use of a pseudospectral collocation (PSC) method for the evolution of a spherically symmetric black hole spacetime in one dimension using a hyperbolic formulation of Einstein's equations. We demonstrate that our PSC method is able to evolve a spherically symmetric black hole spacetime forever without enforcing constraints, even if we add dynamics via a Klein-Gordon scalar field. We find that, in contrast to finite-differencing methods, black hole excision is a trivial operation using PSC applied to a hyperbolic formulation of Einstein's equations. We discuss the extension of this method to three spatial dimensions. Comment: 20 pages, 17 figures, submitted to PR
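
    As a self-contained illustration of the pseudospectral collocation ingredient (the standard Chebyshev differentiation matrix, not the paper's evolution code for Einstein's equations), a short Python/NumPy sketch:

```python
import numpy as np

def cheb(N):
    """Chebyshev collocation points x and differentiation matrix D on
    [-1, 1] (Trefethen's classic construction).  Derivatives of grid
    data u are then simply D @ u, with spectral accuracy for smooth u."""
    if N == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))          # diagonal via negative row sums
    return D, x

# Accuracy check: differentiate exp(x) at only 17 collocation points.
D, x = cheb(16)
print(np.max(np.abs(D @ np.exp(x) - np.exp(x))))   # tiny: spectral accuracy
```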

    Conformal thin-sandwich puncture initial data for boosted black holes

    We apply the puncture approach to conformal thin-sandwich black-hole initial data. We solve numerically the conformal thin-sandwich puncture (CTSP) equations for a single black hole with non-zero linear momentum. We show that conformally flat solutions for a boosted black hole have the same maximum gravitational radiation content as the corresponding Bowen-York solution in the conformal transverse-traceless decomposition. We find that the physical properties of these data are independent of the free slicing parameter. Comment: 12 pages, 11 figures
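
    For reference, the Bowen-York data used for comparison above are the familiar analytic solution of the momentum constraint for a single boosted hole in the conformally flat, transverse-traceless setting. With P^i the linear momentum, n^i the unit radial vector from the puncture, and r the coordinate distance, the conformal extrinsic curvature reads:

```latex
\bar{A}^{ij}_{\mathrm{BY}} \;=\; \frac{3}{2r^{2}}
\left[\, P^{i} n^{j} + P^{j} n^{i}
      - \left(\delta^{ij} - n^{i} n^{j}\right) P_{k} n^{k} \,\right]
```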

    Ranking Templates for Linear Loops

    We present a new method for the constraint-based synthesis of termination arguments for linear loop programs based on linear ranking templates. Linear ranking templates are parametrized, well-founded relations such that an assignment to the parameters gives rise to a ranking function. This approach generalizes existing methods and enables us to use templates for many different ranking functions with affine-linear components. We discuss templates for multiphase, piecewise, and lexicographic ranking functions. Because these ranking templates require both strict and non-strict inequalities, we use Motzkin's Transposition Theorem instead of Farkas' Lemma to transform the generated ∃∀-constraint into an ∃-constraint. Comment: TACAS 201
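
    To see the ∃∀ shape of the synthesis problem on the simplest possible template, here is a Python sketch that states it directly for a toy loop and hands the quantified constraint to the z3 SMT solver (an assumption for illustration; the paper instead eliminates the universal quantifier via Motzkin's Transposition Theorem):

```python
# Toy loop:  while x > 0: x = x - 1
# Find an affine ranking template f(x) = a*x + b such that, for all x,
#   x > 0  implies  f(x) >= 0  and  f(x) - f(x - 1) >= 1
# (bounded from below and strictly decreasing by at least 1 per step).
from z3 import Real, Solver, ForAll, Implies, And, sat

a, b, x = Real('a'), Real('b'), Real('x')
f = lambda v: a * v + b

s = Solver()
s.add(ForAll([x], Implies(x > 0, And(f(x) >= 0, f(x) - f(x - 1) >= 1))))

if s.check() == sat:
    m = s.model()
    print('ranking function: f(x) =', m[a], '* x +', m[b])
```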