A survey on algorithmic aspects of modular decomposition
The modular decomposition is a technique that applies to, but is not
restricted to, graphs. The notion of a module arises naturally in the proofs
of many graph-theoretic theorems. Computing the modular decomposition tree is
an important preprocessing step for solving a large number of combinatorial
optimization problems. Since the first polynomial-time algorithm in the early
1970s, the algorithmics of modular decomposition have developed considerably.
This paper surveys the ideas and techniques that arose from this line of
research.
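The central notion is easy to state: a module of a graph is a vertex set M such that every vertex outside M is adjacent either to all of M or to none of it. As a minimal illustration (a brute-force sketch, not one of the surveyed linear-time algorithms):

```python
from itertools import combinations

def is_module(adj, m):
    """A set m is a module if every vertex outside m is adjacent
    either to all of m or to none of m."""
    m = set(m)
    for v in set(adj) - m:
        neighbors_in_m = adj[v] & m
        if neighbors_in_m and neighbors_in_m != m:
            return False
    return True

def all_modules_brute_force(adj):
    """Enumerate all non-empty modules of a small graph by brute force.
    (Real algorithms run in linear time; this exponential check is for
    illustration only.)"""
    vertices = list(adj)
    modules = []
    for k in range(1, len(vertices) + 1):
        for subset in combinations(vertices, k):
            if is_module(adj, subset):
                modules.append(frozenset(subset))
    return modules

# The path a-b-c-d is prime: its only modules are the four singletons
# and the full vertex set (e.g. {a,b} fails because c sees b but not a).
adj = {'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b', 'd'}, 'd': {'c'}}
mods = all_modules_brute_force(adj)
```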
Generalizing input-driven languages: theoretical and practical benefits
Regular languages (RL) are the simplest family in Chomsky's hierarchy. Thanks
to their simplicity they enjoy various nice algebraic and logic properties that
have been successfully exploited in many application fields. Practically all of
their related problems are decidable, so that they support automatic
verification algorithms. Also, they can be recognized in real-time.
Context-free languages (CFL) are another major family well-suited to
formalize programming, natural, and many other classes of languages; their
increased generative power w.r.t. RL, however, causes the loss of several
closure properties and of the decidability of important problems; furthermore
they need complex parsing algorithms. Thus, various subclasses thereof have
been defined with different goals, spanning from efficient, deterministic
parsing to closure properties, logic characterization and automatic
verification techniques.
Among CFL subclasses, so-called structured ones, i.e., those where the
typical tree-structure is visible in the sentences, exhibit many of the
algebraic and logic properties of RL, whereas deterministic CFL have been
thoroughly exploited in compiler construction and other application fields.
After surveying and comparing the main properties of those various language
families, we go back to operator precedence languages (OPL), an old family
through which R. Floyd pioneered deterministic parsing, and we show that they
offer unexpected properties in two fields so far investigated in totally
independent ways: they enable parsing parallelization in a more effective way
than traditional sequential parsers, and exhibit the same algebraic and logic
properties so far obtained only for less expressive language families.
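The idea that terminal symbols and their precedences alone can drive deterministic parsing can be illustrated with precedence climbing, a compact relative of Floyd's operator-precedence technique (a sketch over a hypothetical toy grammar, not the algorithms studied in the paper):

```python
import re

# Hypothetical toy grammar: binary + and * over integers; * binds tighter.
PREC = {'+': 1, '*': 2}

def tokenize(s):
    return re.findall(r'\d+|[+*()]', s)

def parse(tokens, min_prec=1):
    """Precedence climbing: the precedence of the next operator alone
    decides whether to keep extending the current subexpression."""
    lhs = atom(tokens)
    while tokens and tokens[0] in PREC and PREC[tokens[0]] >= min_prec:
        op = tokens.pop(0)
        rhs = parse(tokens, PREC[op] + 1)  # left-associative operators
        lhs = (op, lhs, rhs)
    return lhs

def atom(tokens):
    tok = tokens.pop(0)
    if tok == '(':
        node = parse(tokens)
        tokens.pop(0)  # consume the closing ')'
        return node
    return int(tok)

tree = parse(tokenize("1+2*3"))
# Yields ('+', 1, ('*', 2, 3)): '*' binds tighter than '+'.
```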
Automated Protein Structure Classification: A Survey
Classification of proteins based on their structure provides a valuable
resource for studying protein structure, function and evolutionary
relationships. With the rapidly increasing number of known protein structures,
manual and semi-automatic classification is becoming ever more difficult and
prohibitively slow. Therefore, there is a growing need for automated, accurate
and efficient classification methods to generate classification databases or
increase the speed and accuracy of semi-automatic techniques. Recognizing this
need, several automated classification methods have been developed. In this
survey, we overview recent developments in this area. We classify different
methods based on their characteristics and compare their methodology, accuracy
and efficiency. We then present a few open problems and explain future
directions.
Comment: 14 pages, Technical Report CSRG-589, University of Toronto
Pattern matching : a sheaf-theoretic approach
A general theory of pattern matching is presented by adopting an extensional, geometric view of patterns. The extension of the matching relation consists of the occurrences of all possible patterns in a particular target. The geometry of the pattern describes the structure of the pattern and the spatial relationships among its parts. The extension and the geometry, when combined, produce a structure called a sheaf. Sheaf theory is a well-developed branch of mathematics which studies the global consequences of locally defined properties. For pattern matching, an occurrence of a pattern, a global property of the pattern, is obtained by gluing together occurrences of parts of the pattern, which are locally defined properties.
A sheaf-theoretic view of pattern matching provides a uniform treatment of pattern matching on any kind of data structure: strings, trees, graphs, hypergraphs, and so on. Such a parametric description is achieved by using the language of category theory, a highly abstract description of commonly occurring structures and relationships in mathematics.
A generalized version of the Knuth-Morris-Pratt pattern matching algorithm is derived by gradually converting the extensional description of pattern matching as a sheaf into an intensional description. The algorithm results from a synergy of four very general program synthesis/transformation techniques: (1) divide and conquer: exploit the sheaf condition; assemble a full match by gluing together partial matches; (2) finite differencing: collect and update partial matches incrementally while traversing the target; (3) backtracking: instead of saving all partial matches, save just one; when this partial match cannot be extended, fall back to another; (4) partial evaluation: precompute pattern-based (and therefore constant) computations. The derivation is carried out in a general framework using Grothendieck topologies.
By appropriately instantiating the underlying data structures and topologies, the same scheme yields matching algorithms for patterns with variables and for multiple patterns. Slight variations of the derivation result in Earley's algorithm for context-free parsing, and in Waltz filtering, a relaxation algorithm for providing 3-D interpretations of 2-D images. Other applications of a geometric view of patterns are briefly considered: rewrites, parallel algorithms, induction, and computability.
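The base instance of that derivation, classic string KMP, can be written directly; the precomputed failure table corresponds to the partial-evaluation step, and the single saved partial match to the backtracking step:

```python
def kmp_occurrences(pattern, target):
    """Classic Knuth-Morris-Pratt string matching: partial matches are
    maintained incrementally (finite differencing), and only one partial
    match is kept, extended or shrunk via the failure table."""
    # fail[i] = length of the longest proper border of pattern[:i+1],
    # precomputed from the pattern alone (partial evaluation).
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the target, extending the one saved partial match of length k.
    occurrences, k = [], 0
    for i, c in enumerate(target):
        while k and c != pattern[k]:
            k = fail[k - 1]   # fall back to a shorter partial match
        if c == pattern[k]:
            k += 1
        if k == len(pattern):
            occurrences.append(i - k + 1)
            k = fail[k - 1]
    return occurrences
```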
Cubical Cohomology Ring of 3D Photographs
The cohomology and the cohomology ring of three-dimensional (3D) objects are
topological invariants that characterize holes and their relations. The
cohomology ring has traditionally been computed on simplicial complexes.
Cubical complexes, however, deal directly with the voxels of 3D images, so no
additional triangulation is necessary; this facilitates efficient algorithms
for computing topological invariants in the image context. In this paper, we
present formulas to compute the cohomology ring of 3D cubical complexes
directly, without any additional triangulation. Starting from a cubical
complex that represents a 3D binary-valued digital picture whose foreground
has one connected component, we first compute the cohomological information
on the boundary of the object by an incremental technique; then, using a face
reduction algorithm, we compute it on the whole object; finally, applying the
aforementioned formulas, the cohomology ring is computed from this
information.
Variety and the evolution of refinery processing
Evolutionary theories of economic development stress the role of variety as both a determinant and a result of growth. In this paper we develop a measure of variety based on Weitzman's maximum likelihood procedure. This measure is based on the distance between products and indicates the degree of differentiation of a product population. We propose a generic method that makes it possible to group products with very similar characteristic values before randomly choosing the product models used to calculate Weitzman's measure. We apply the variety measure to the process characteristics of oil refining. The results obtained for this technology show classic evolutionary specialization patterns that can be understood on the basis of niche theory. Here, changes in variety are related to changes in the range of services the technology considered can deliver, a range which plays a role similar to that of the size of the habitat of a biological species.
Keywords: technological evolution; refinery processes; niche theory; Weitzman measure
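One standard recursive formulation of Weitzman's diversity measure, V(S) = max over i in S of [V(S \ {i}) + d(i, S \ {i})], can be sketched directly (an assumption for illustration; the paper's exact estimation procedure may differ, and the product data here is hypothetical):

```python
def weitzman_diversity(items, dist):
    """Weitzman's recursive diversity measure (sketch):
    V(S) = max_{i in S} [ V(S \ {i}) + d(i, S \ {i}) ],
    where d(i, T) is the distance from i to its nearest neighbour in T
    and V of a singleton is 0. Exponential in |S|; fine for small sets."""
    memo = {}

    def v(s):
        if len(s) == 1:
            return 0.0
        if s in memo:
            return memo[s]
        best = 0.0
        for i in s:
            rest = s - {i}
            best = max(best, v(rest) + min(dist(i, j) for j in rest))
        memo[s] = best
        return best

    return v(frozenset(items))

# Hypothetical product models as points on a line; distance = absolute gap.
points = {'p1': 0.0, 'p2': 1.0, 'p3': 5.0}
d = lambda a, b: abs(points[a] - points[b])
variety = weitzman_diversity(points, d)
# Removing p2 leaves V({p1,p3}) = 5 plus d(p2, {p1,p3}) = 1, so V = 6.
```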