    A survey on algorithmic aspects of modular decomposition

    Modular decomposition is a technique that applies to graphs but is not restricted to them. The notion of a module appears naturally in the proofs of many graph-theoretic theorems, and computing the modular decomposition tree is an important preprocessing step for solving a large number of combinatorial optimization problems. Since the first polynomial-time algorithm in the early 1970s, the algorithmics of modular decomposition has developed considerably. This paper surveys the ideas and techniques that arose from this line of research.
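
    For context, a module of a graph G is a vertex set M such that every vertex outside M is adjacent either to all of M or to none of it; the modular decomposition tree nests such sets recursively. The snippet below is only a minimal sketch of that definition (the membership test, not the decomposition algorithms the survey covers), assuming a plain adjacency-set representation.

```python
# Minimal sketch: test whether a vertex set is a module of a graph.
# `graph` maps each vertex to the set of its neighbours (assumed representation).

def is_module(graph, candidate):
    """Return True if every vertex outside `candidate` is adjacent
    either to all of `candidate` or to none of it."""
    candidate = set(candidate)
    for v in graph:
        if v in candidate:
            continue
        seen = graph[v] & candidate          # neighbours of v inside the candidate set
        if seen and seen != candidate:       # v distinguishes two candidate vertices
            return False
    return True

# Example on the path a-b-c-d: {b, c} is not a module (a sees b but not c),
# while every singleton and the whole vertex set trivially are.
path = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
print(is_module(path, {"b", "c"}))   # False
print(is_module(path, {"a"}))        # True
```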

    Generalizing input-driven languages: theoretical and practical benefits

    Regular languages (RL) are the simplest family in Chomsky's hierarchy. Thanks to their simplicity, they enjoy various nice algebraic and logic properties that have been successfully exploited in many application fields. Practically all of their related problems are decidable, so that they support automatic verification algorithms; also, they can be recognized in real time. Context-free languages (CFL) are another major family, well suited to formalize programming, natural, and many other classes of languages; their increased generative power w.r.t. RL, however, causes the loss of several closure properties and of the decidability of important problems; furthermore, they need complex parsing algorithms. Thus, various subclasses thereof have been defined with different goals, spanning from efficient, deterministic parsing to closure properties, logic characterization, and automatic verification techniques. Among CFL subclasses, so-called structured ones, i.e., those where the typical tree structure is visible in the sentences, exhibit many of the algebraic and logic properties of RL, whereas deterministic CFL have been thoroughly exploited in compiler construction and other application fields. After surveying and comparing the main properties of these various language families, we return to operator precedence languages (OPL), an old family through which R. Floyd pioneered deterministic parsing, and we show that they offer unexpected properties in two fields so far investigated in totally independent ways: they enable parsing parallelization more effectively than traditional sequential parsers, and they exhibit the same algebraic and logic properties so far obtained only for less expressive language families.
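
    As a small illustration of the "input-driven" idea the title refers to, and which OPL generalize, the sketch below recognizes well-nested words over an alphabet partitioned into call, return, and internal symbols: the class of the input symbol alone dictates the stack operation. The toy alphabet and its partition are assumptions made for the example, not taken from the paper, and a full input-driven (visibly pushdown) automaton would also carry a finite-state control; the point here is only the stack discipline.

```python
# Sketch of an input-driven recognizer: the input symbol's class alone decides
# whether the stack is pushed, popped, or left untouched.
# The alphabet partition below is an assumption made up for the illustration.

CALLS = {"("}        # push symbols
RETURNS = {")"}      # pop symbols
INTERNALS = {"a"}    # symbols that do not touch the stack

def accepts(word):
    """Accept words whose calls and returns are properly nested."""
    stack = []
    for symbol in word:
        if symbol in CALLS:
            stack.append(symbol)            # push, driven by the input symbol
        elif symbol in RETURNS:
            if not stack:                   # unmatched return
                return False
            stack.pop()                     # pop, driven by the input symbol
        elif symbol in INTERNALS:
            pass                            # internal symbol: no stack action
        else:
            return False                    # symbol outside the alphabet
    return not stack                        # every call must be matched

print(accepts("(a(a)a)"))   # True: well nested
print(accepts("(a"))        # False: pending call
```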

    Automated Protein Structure Classification: A Survey

    Classification of proteins based on their structure provides a valuable resource for studying protein structure, function and evolutionary relationships. With the rapidly increasing number of known protein structures, manual and semi-automatic classification is becoming ever more difficult and prohibitively slow. Therefore, there is a growing need for automated, accurate and efficient classification methods that can generate classification databases or increase the speed and accuracy of semi-automatic techniques. Recognizing this need, several automated classification methods have been developed. In this survey, we overview recent developments in this area. We classify the different methods based on their characteristics and compare their methodology, accuracy and efficiency. We then present a few open problems and explain future directions.

    Comment: 14 pages, Technical Report CSRG-589, University of Toronto

    Cubical Cohomology Ring of 3D Photographs

    Cohomology and the cohomology ring of three-dimensional (3D) objects are topological invariants that characterize holes and their relations. The cohomology ring has traditionally been computed on simplicial complexes. Cubical complexes, however, deal directly with the voxels of a 3D image, so no additional triangulation is necessary, which facilitates efficient algorithms for computing topological invariants in the image context. In this paper, we present formulas to compute the cohomology ring of 3D cubical complexes directly, without any additional triangulation. Starting from a cubical complex Q that represents a 3D binary-valued digital picture whose foreground has one connected component, we first compute the cohomological information on the boundary of the object, ∂Q, by an incremental technique; then, using a face reduction algorithm, we compute it on the whole object; finally, applying the aforementioned formulas, the cohomology ring is computed from this information.
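
    To make the "no additional triangulation" point concrete, the sketch below builds the elementary cubes of a cubical complex directly from a 3D binary voxel array: each foreground voxel contributes a unit cube together with all of its faces, i.e., products of degenerate and non-degenerate unit intervals. This is only the complex-construction step, under an assumed NumPy encoding of the picture; the incremental boundary computation, face reduction, and cohomology ring formulas of the paper are not reproduced here.

```python
import numpy as np
from itertools import product

def cubical_complex(volume):
    """Collect the elementary cubes of the cubical complex of a binary 3D image.

    Each cube is a triple of per-axis intervals (lo, hi); degenerate intervals
    (lo == hi) lower the dimension.  A foreground voxel at (i, j, k) contributes
    the unit cube [i,i+1] x [j,j+1] x [k,k+1] and all 26 of its proper faces.
    """
    cubes = set()
    for i, j, k in zip(*np.nonzero(volume)):
        axes = []
        for c in (int(i), int(j), int(k)):
            # per axis: the full unit interval and its two degenerate endpoints
            axes.append([(c, c + 1), (c, c), (c + 1, c + 1)])
        cubes.update(product(*axes))
    return cubes

def dimension(cube):
    # dimension = number of non-degenerate intervals
    return sum(lo != hi for lo, hi in cube)

# Tiny example: one foreground voxel -> 1 cube, 6 squares, 12 edges, 8 vertices.
vol = np.zeros((2, 2, 2), dtype=np.uint8)
vol[0, 0, 0] = 1
cells = cubical_complex(vol)
print({d: sum(dimension(c) == d for c in cells) for d in range(4)})
# {0: 8, 1: 12, 2: 6, 3: 1}
```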

    Variety and the evolution of refinery processing

    Evolutionary theories of economic development stress the role of variety as both a determinant and a result of growth. In this paper we develop a measure of variety based on Weitzman's maximum likelihood procedure. The measure rests on the distances between products and indicates the degree of differentiation within a product population. We propose a generic method that groups products with very similar characteristic values before randomly choosing the product models used to compute Weitzman's measure. We apply the variety measure to the process characteristics of oil refining. The results obtained for this technology show classic evolutionary specialization patterns that can be understood on the basis of niche theory: changes in variety are related to changes in the range of services the technology can deliver, a range that plays a role similar to the size of the habitat of a biological species.

    Keywords: technological evolution; refinery processes; niche theory; Weitzman measure
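
    For readers unfamiliar with it, Weitzman's diversity measure can be computed recursively from pairwise distances: the diversity of a set is the maximum, over its elements, of that element's distance to the rest of the set plus the diversity of the rest. The snippet below is a naive sketch of that recursion on a made-up distance matrix; it is exponential in the set size, hence only usable on small samples of product models, consistent with the grouping-and-sampling step described above.

```python
from functools import lru_cache

# Hypothetical symmetric distance matrix between 4 product models (made-up numbers).
DIST = [
    [0.0, 2.0, 4.0, 5.0],
    [2.0, 0.0, 3.0, 4.0],
    [4.0, 3.0, 0.0, 1.0],
    [5.0, 4.0, 1.0, 0.0],
]

def weitzman_diversity(items=frozenset(range(len(DIST)))):
    """Weitzman's recursion: V(S) = max over i in S of d(i, S \\ {i}) + V(S \\ {i})."""
    @lru_cache(maxsize=None)
    def V(s):
        if len(s) <= 1:
            return 0.0                         # a single item has no diversity
        best = 0.0
        for i in s:
            rest = s - {i}
            d_i = min(DIST[i][j] for j in rest)   # distance from i to the rest of the set
            best = max(best, d_i + V(rest))
        return best
    return V(frozenset(items))

print(weitzman_diversity())   # total diversity of the four-model population
```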