5,469 research outputs found

    The Six concerns for Separation of Concerns

    Despite common agreement on the necessity of applying the separation of concerns (SOC) principle, there is not yet a consensus on its key issues. How concerns are separated usually depends on the adopted programming paradigm, the applied method, or even the programming language. This paper presents the so-called six ‘C’ properties, which can be applied as a guideline for defining and evaluating approaches that adopt the SOC principle.
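    A minimal sketch of the principle itself may help ground the discussion (the six ‘C’ properties are the paper's own contribution and are not reproduced here). The Haskell below separates two concerns, validation logic and presentation; all names are hypothetical and not from the paper.

        -- Hypothetical sketch of separation of concerns: the validation
        -- concern (pure logic) is kept apart from the presentation
        -- concern (IO), so each can change independently.
        validateAge :: Int -> Either String Int
        validateAge n
          | n < 0     = Left "age must be non-negative"
          | n > 150   = Left "age is implausibly large"
          | otherwise = Right n

        -- The presentation concern knows nothing about the rules above.
        report :: Either String Int -> IO ()
        report (Left err) = putStrLn ("rejected: " ++ err)
        report (Right n)  = putStrLn ("accepted: " ++ show n)

        main :: IO ()
        main = mapM_ (report . validateAge) [30, -5, 200]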

    Complexity, BioComplexity, the Connectionist Conjecture and Ontology of Complexity

    This paper develops and integrates major ideas and concepts on complexity and biocomplexity - the connectionist conjecture, the universal ontology of complexity, the irreducible complexity of totality and inherent randomness, the perpetual evolution of information, the emergence of criticality, and the equivalence of symmetry and complexity. The paper introduces the Connectionist Conjecture, which states that the one and only representation of Totality is the connectionist one, i.e. in terms of nodes and edges. It also introduces the idea of a Universal Ontology of Complexity and develops concepts in that direction, along with ideas on the perpetual evolution of information and the irreducibility and computability of totality, all in the context of the Connectionist Conjecture. The paper indicates that control and communication are the prime functionals responsible for the symmetry and complexity of complex phenomena. It takes the stand that the phenomenon of life (including its evolution) is probably the nearest to what we can describe with the term “complexity”, and assumes that signalling and communication within the living world, and of the living world with the environment, create the connectionist structure of biocomplexity. With life and its evolution as the substrate, the paper develops ideas towards the ontology of complexity, introduces new complexity-theoretic interpretations of fundamental biomolecular parameters, and develops ideas on a methodology for determining the complexity of “true” complex phenomena.
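    In data-structure terms, the conjecture's claim that every representation of Totality is connectionist, i.e. in terms of nodes and edges, amounts to saying that it is a graph. A minimal illustrative sketch follows; the names are hypothetical, not the paper's.

        -- A bare-bones graph: the "nodes and edges" representation the
        -- Connectionist Conjecture takes as universal. Illustrative only.
        type Node = String

        data Graph = Graph
          { nodes :: [Node]
          , edges :: [(Node, Node)]  -- a signalling/communication link
          } deriving Show

        -- Hypothetical example in the spirit of the paper's view that
        -- signalling within the living world and with the environment
        -- creates the connectionist structure of biocomplexity.
        biocomplexity :: Graph
        biocomplexity = Graph
          { nodes = ["cell", "signal", "environment"]
          , edges = [("cell", "signal"), ("signal", "environment")]
          }

        main :: IO ()
        main = print biocomplexity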

    Higher-Order Termination: from Kruskal to Computability

    Termination is a major question in both logic and computer science. In logic, termination is at the heart of proof theory, where it is usually called strong normalization (of cut elimination). In computer science, termination has always been an important issue for showing programs correct. In the early days of logic, strong normalization was usually shown by assigning ordinals to expressions in such a way that eliminating a cut would yield an expression with a smaller ordinal. In the early days of verification, computer scientists used similar ideas, interpreting the arguments of a program call by a natural number, such as their size. Showing that the size of the arguments decreases at each recursive call gives a termination proof of the program, which is however rather weak, since it can only yield quite small ordinals. In the sixties, Tait invented a new method for showing cut elimination of natural deduction, based on a predicate over the set of terms such that membership of an expression in the predicate implies the strong normalization property for that expression. Because the predicate is defined by induction on types, or even as a fixpoint, this method can yield much larger ordinals. Later generalized by Girard under the name of reducibility or computability candidates, it proved very effective in showing the strong normalization property of typed lambda-calculi.
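    The size-based method described above can be made concrete with a small sketch (illustrative, not from the paper): assign each call the natural number length xs + length ys and observe that it strictly decreases at every recursive call, which rules out any infinite chain of calls.

        -- Termination by a decreasing natural-number measure: each
        -- recursive call of `merge` shrinks `length xs + length ys`
        -- by one, so the recursion must terminate.
        merge :: [Int] -> [Int] -> [Int]
        merge [] ys = ys
        merge xs [] = xs
        merge (x:xs) (y:ys)
          | x <= y    = x : merge xs (y:ys)  -- measure decreases by 1
          | otherwise = y : merge (x:xs) ys  -- measure decreases by 1

        main :: IO ()
        main = print (merge [1,3,5] [2,4,6])

    Functions whose termination needs more than a single size measure, such as the Ackermann function, mark the limit of this simple method and motivate the stronger computability-predicate technique the abstract goes on to describe.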

    Advanced Cyberinfrastructure for Science, Engineering, and Public Policy

    Progress in many domains increasingly benefits from our ability to view the systems through a computational lens, i.e., using computational abstractions of the domains, and from our ability to acquire, share, integrate, and analyze disparate types of data. These advances would not be possible without the advanced data and computational cyberinfrastructure and tools for data capture, integration, analysis, modeling, and simulation. However, despite, and perhaps because of, advances in "big data" technologies for data acquisition, management, and analytics, the other largely manual and labor-intensive aspects of the decision-making process, e.g., formulating questions, designing studies, organizing, curating, connecting, correlating and integrating cross-domain data, drawing inferences, and interpreting results, have become the rate-limiting steps to progress. Advancing the capability and capacity for evidence-based improvements in science, engineering, and public policy requires support for (1) computational abstractions of the relevant domains, coupled with computational methods and tools for their analysis, synthesis, simulation, visualization, sharing, and integration; (2) cognitive tools that leverage and extend the reach of human intellect and partner with humans on all aspects of the activity; (3) nimble and trustworthy data cyberinfrastructures that connect and manage a variety of instruments, multiple interrelated data types and associated metadata, data representations, processes, protocols, and workflows, and that enforce applicable security and data access and use policies; and (4) organizational and social structures and processes for collaborative and coordinated activity across disciplinary and institutional boundaries.
    Comment: A Computing Community Consortium (CCC) white paper, 9 pages. arXiv admin note: text overlap with arXiv:1604.0200

    A Primer on the Tools and Concepts of Computable Economics

    Computability theory came into being as a result of Hilbert's attempts to meet Brouwer's challenges, mounted from an intuitionistic and constructive standpoint, to formalism as a foundation for mathematical practice. Viewed this way, constructive mathematics should be one vision of computability theory. However, there are fundamental differences between computability theory and constructive mathematics: the Church-Turing thesis is a disciplining criterion in the former but not in the latter; and classical logic, particularly the law of the excluded middle, is not accepted in the latter but freely invoked in the former, especially in proving universal negative propositions. In computable economics an eclectic approach is adopted, where the main criterion is numerical content for economic entities. In this sense both the computable and the constructive traditions are freely and indiscriminately invoked and utilised in the formalization of economic entities. Some of the mathematical methods and concepts of computable economics are surveyed in a pedagogical mode. The context is that of a digital economy embedded in an information society.

    A Sound and Complete Projection for Global Types

    Multiparty session types is a typing discipline used to write specifications, known as global types, for branching and recursive message-passing systems. A necessary operation on global types is projection to abstractions of local behaviour, called local types. Typically, this is a computable partial function that, given a global type and a role, erases all details irrelevant to this role. Computable projection functions in the literature are either unsound or too restrictive when dealing with recursion and branching. Recent work has taken a more general approach, defining projection as a coinductive, but not computable, relation. Our work defines a new computable projection function that is sound and complete with respect to its coinductive counterpart and, hence, equally expressive. All results have been mechanised in the Coq proof assistant.
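    A deliberately simplified sketch of what a projection function computes may be useful. The syntax below omits recursion and branching, the very features for which the literature's projections become unsound or too restrictive, and is hypothetical rather than the paper's definition.

        -- Global types record interactions between roles; projection
        -- onto a role keeps only what that role can observe. Heavily
        -- simplified: no recursion, no branching, no merge operator.
        type Role  = String
        type Label = String

        data Global = GMsg Role Role Label Global  -- sender, receiver
                    | GEnd

        data Local = LSend Role Label Local
                   | LRecv Role Label Local
                   | LEnd
                   deriving Show

        -- Partial by design: in a fuller language with branching,
        -- a Nothing would flag an unprojectable global type.
        project :: Global -> Role -> Maybe Local
        project GEnd _ = Just LEnd
        project (GMsg p q l g) r
          | r == p    = LSend q l <$> project g r
          | r == q    = LRecv p l <$> project g r
          | otherwise = project g r  -- interaction invisible to r: erase

        main :: IO ()
        main =
          let g = GMsg "client" "server" "request"
                    (GMsg "server" "client" "reply" GEnd)
          in mapM_ (print . project g) ["client", "server", "logger"]

    Projecting onto a role that never appears (here "logger") simply erases everything, yielding LEnd; the subtleties the paper addresses arise when recursion and branching interact with such erasure.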