    High-fidelity metaprogramming with separator syntax trees

    Many metaprogramming tasks, such as refactorings, automated bug fixing, or large-scale software renovation, require high-fidelity source code transformations: transformations which preserve comments and layout as much as possible. Abstract syntax trees (ASTs) typically abstract from such details, and hence would require pretty printing, destroying the original program layout. Concrete syntax trees (CSTs) preserve all layout information, but transformation systems or parsers that support CSTs are rare and can be cumbersome to use. In this paper we present separator syntax trees (SSTs), a lightweight syntax tree format that sits between ASTs and CSTs in terms of the amount of information it preserves. SSTs extend ASTs by recording the textual layout information separating AST nodes. This information can be used to reconstruct the textual code after parsing, but can largely be ignored when implementing high-fidelity transformations. We have implemented SSTs in Rascal, and we show how they enable the concise definition of high-fidelity source code transformations using a simple refactoring for C++ as an example.
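    The central idea, keeping the literal separator text (whitespace, commas, comments) between AST children so the original source can be reproduced after a transformation, can be sketched in a few lines. The Python below is a minimal illustration with made-up node types (Var, Call); it is not the paper's Rascal implementation.

```python
# Minimal sketch of a separator-syntax-tree-like node: an AST node that also
# stores the separator text found between its children in the original source.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Var:
    name: str

    def render(self) -> str:
        return self.name


@dataclass
class Call:
    func: str
    args: List[Var]
    seps: List[str] = field(default_factory=list)  # text between consecutive args

    def render(self) -> str:
        # Interleave rendered children with their recorded separator text,
        # so comments and layout survive a transformation that ignores `seps`.
        parts: List[str] = []
        for i, arg in enumerate(self.args):
            parts.append(arg.render())
            if i < len(self.seps):
                parts.append(self.seps[i])
        return f"{self.func}({''.join(parts)})"


call = Call("f", [Var("a"), Var("b")], seps=[",  /* keep me */ "])
print(call.render())  # f(a,  /* keep me */ b)
```

    A transformation that only renames `Var` nodes never has to touch `seps`, yet `render` still reproduces the original layout, which is the high-fidelity property the abstract describes.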

    Concrete Syntax with Black Box Parsers

    Context: Meta programming consists in large part of matching, analyzing, and transforming syntax trees. Many meta programming systems process abstract syntax trees, but this requires intimate knowledge of the structure of the data type describing the abstract syntax. As a result, meta programming is error-prone, and meta programs are not resilient to evolution of the structure of such ASTs, requiring invasive, fault-prone changes to these programs.
    Inquiry: Concrete syntax patterns alleviate this problem by allowing the meta programmer to match and create syntax trees using the actual syntax of the object language. Systems supporting concrete syntax patterns, however, require a concrete grammar of the object language in their own formalism. Creating such grammars is a costly and error-prone process, especially for realistic languages such as Java and C++.
    Approach: In this paper we present Concretely, a technique to extend meta programming systems with pluggable concrete syntax patterns, based on external, black box parsers. We illustrate Concretely in the context of Rascal, an open-source meta programming system and language workbench, and show how to reuse existing parsers for Java, JavaScript, and C++. Furthermore, we propose Tympanic, a DSL to declaratively map external AST structures to Rascal's internal data structures. Tympanic allows implementors of Concretely to solve the impedance mismatch between object-oriented class hierarchies in Java and Rascal's algebraic data types. Both the algebraic data type and the AST marshalling code are automatically generated.
    Knowledge: The conceptual architecture of Concretely and Tympanic supports the reuse of pre-existing, external parsers, and their AST representation, in meta programming systems that feature concrete syntax patterns for matching and constructing syntax trees. As such, this opens up concrete syntax pattern matching for a host of realistic languages for which writing a grammar from scratch is time-consuming and error-prone, but for which industry-strength parsers exist in the wild.
    Grounding: We evaluate Concretely in terms of source lines of code (SLOC), relative to the size of the AST data type and marshalling code. We show that for real programming languages such as C++ and Java, adding support for concrete syntax patterns takes an effort only on the order of dozens of SLOC. Similarly, we evaluate Tympanic in terms of SLOC, showing an order-of-magnitude reduction in SLOC compared to manual implementation of the AST data types and marshalling code.
    Importance: Meta programming has applications in reverse engineering, reengineering, source code analysis, static analysis, software renovation, domain-specific language engineering, and many others. Processing of syntax trees is central to all of these tasks. Concrete syntax patterns improve the practice of constructing meta programs. The combination of Concretely and Tympanic has the potential to make concrete syntax patterns available with very little effort, thereby improving and promoting the application of meta programming in the general software engineering context.
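    To make the Context concrete: matching trees through their abstract structure means hard-coding node types and field names. The sketch below is my own illustration using Python's standard ast module (not Rascal or Concretely); it renames a called function the abstract-syntax way. A concrete syntax pattern would instead be written in the object language itself, roughly matching old_name(<args>) and rewriting it to new_name(<args>), without knowing that calls live in ast.Call nodes with a .func field.

```python
# Abstract-syntax-level rewriting with Python's stdlib `ast` module: the meta
# program must know the tree shape (ast.Call, .func, ast.Name, .id) in detail.
import ast

source = "x = old_name(1, 2)\nprint(old_name(3))"

tree = ast.parse(source)
for node in ast.walk(tree):
    if (isinstance(node, ast.Call)
            and isinstance(node.func, ast.Name)
            and node.func.id == "old_name"):
        node.func.id = "new_name"

print(ast.unparse(tree))  # requires Python 3.9+
# x = new_name(1, 2)
# print(new_name(3))
```

    Note also that ast.unparse pretty-prints and drops the original layout, which is exactly the fidelity problem the separator syntax tree paper above addresses.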

    Operational Risk Management: An analysis of FSA Final Notices

    In the last two decades, financial markets have been marked by large-scale financial failures due to incompetence and fraud, such as Barings, Daiwa, Allied Irish Banks, UBS, Société Générale, and more recently JP Morgan. While previous research has focussed on market and credit risk, and where it has addressed operational risk it has concentrated on the market reaction to operational losses, the current research addresses the root of the problem. The current research explores the final notices of over £1,000,000 issued by the FSA (now the FCA and PRA) from 2008 to 2013 in order to find a common theme in the operational risks of the financial institutions. In order to categorise the final notices, a framework by McConnell (2008) has been adopted. With the help of this framework the risks are more accurately distinguished as belonging to either the incident, individual, institution, or industry area. The data analysis identifies the frequently breached principles outlined in the FSA's 'Principles for Businesses'. The results from this allow the accurate categorisation of each case, and allow us to see where financial institutions most often commit operational risk breaches. The findings indicate that the vast majority of operational risks occur on an institutional level; however, the findings also show that no cases were attributed solely to the incident or industry area. Furthermore, two subcategories have been created in order to place the cases more accurately; these subcategories combine already existing areas: the individual and institution area, and the industry and institution area. This increases the tally for the institution area, and results in only 3 cases involving no institutional factors whatsoever. Based on these results, suggestions are made to minimise the frequency and severity of operational risks, the first being the adoption of a new regulatory body, which has already happened. Further suggestions include adopting an approach to the risk management of financial institutions similar to Dodd-Frank Act stress testing, or applying the Enterprise-Wide Risk Management (EWRM) framework in order to stop risks at an earlier stage. However, it is important to consider the definition of risk, as risk is not necessarily a bad thing, especially for financial institutions, which must take some risks in order to remain competitive and profitable; the suggestions are therefore recommendations on determining which risks are good and which are bad.

    Thermal decoupling and the smallest subhalo mass in dark matter models with Sommerfeld-enhanced annihilation rates

    We consider dark matter consisting of weakly interacting massive particles (WIMPs) and revisit in detail its thermal evolution in the early universe, with a particular focus on models where the annihilation rate is enhanced by the Sommerfeld effect. After chemical decoupling, or freeze-out, dark matter no longer annihilates but is still kept in local thermal equilibrium due to scattering events with the much more abundant standard model particles. During kinetic decoupling, even these processes cease to be effective, which eventually sets the scale for a small-scale cutoff in the matter density fluctuations. Afterwards, the WIMP temperature decreases more quickly than the heat bath temperature, which causes dark matter to reenter an era of annihilation if the cross section is enhanced by the Sommerfeld effect. Here, we give a detailed and self-consistent description of these effects. As an application, we consider the phenomenology of simple leptophilic models that have been discussed in the literature and find that the relic abundance can be affected by as much as two orders of magnitude or more. We also compute the mass of the smallest dark matter subhalos in these models and find it to be in the range of about 10^{-10} to 10 solar masses; even much larger cutoff values are possible if the WIMPs couple to force carriers lighter than about 100 MeV. We point out that a precise determination of the cutoff mass allows one to infer new limits on the model parameters, in particular from gamma-ray observations of galaxy clusters, that are highly complementary to existing constraints from g-2 or beam dump experiments.
    Comment: minor changes to match published version.
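    For orientation only, and not quoted from the paper: the textbook Coulomb-limit s-wave Sommerfeld factor and the standard post-kinetic-decoupling temperature scaling make the abstract's mechanism explicit. Because S(v) grows as 1/v while the WIMP velocity keeps dropping faster than the bath temperature, the annihilation rate can become efficient again after kinetic decoupling.

```latex
% Coulomb-limit (light-mediator) s-wave Sommerfeld enhancement of the annihilation
% cross section, for coupling strength \alpha and relative velocity v:
S(v) \;=\; \frac{\pi\alpha/v}{1 - e^{-\pi\alpha/v}}
  \;\longrightarrow\; \frac{\pi\alpha}{v} \quad \text{for } v \ll \alpha .

% After kinetic decoupling, WIMP momenta redshift like those of a free
% non-relativistic gas, so the WIMP temperature falls faster than the heat bath:
T_\chi \propto a^{-2}, \qquad T \propto a^{-1}.
```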

    Is dark matter with long-range interactions a solution to all small-scale problems of ΛCDM cosmology?

    The cold dark matter (DM) paradigm describes the large-scale structure of the universe remarkably well. However, there exists some tension with the observed abundances and internal density structures of both field dwarf galaxies and galactic satellites. Here, we demonstrate that a simple class of DM models may offer a viable solution to all of these problems simultaneously. Their key phenomenological properties are velocity-dependent self-interactions mediated by a light vector messenger and thermal production with much later kinetic decoupling than in the standard case.
    Comment: revtex4; 6 pages, 3 figures; minor changes to match published version.
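    As a hedged aside (a standard textbook form, not taken from this paper): a light vector messenger of mass m_V coupling to DM with dark fine-structure constant α_χ induces a Yukawa potential between DM particles, which is the usual origin of velocity-dependent self-scattering in such models.

```latex
% Yukawa potential between dark matter particles induced by a light vector
% mediator (sign depends on the particle/antiparticle combination):
V(r) \;=\; \pm\, \frac{\alpha_\chi}{r}\, e^{-m_V r}.
```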