
    BCFA: Bespoke Control Flow Analysis for CFA at Scale

    Many data-driven software engineering tasks, such as discovering programming patterns and mining API specifications, perform source code analysis over control flow graphs (CFGs) at scale. Analyzing millions of CFGs can be expensive, and the performance of the analysis depends heavily on the underlying CFG traversal strategy. State-of-the-art analysis frameworks use a fixed traversal strategy. We argue that a single traversal strategy does not fit all kinds of analyses and CFGs, and we propose bespoke control flow analysis (BCFA). Given a control flow analysis (CFA) and a large number of CFGs, BCFA selects the most efficient traversal strategy for each CFG. BCFA extracts a set of properties of the CFA by analyzing its code and combines them with properties of the CFG, such as branching factor and cyclicity, to select the optimal traversal strategy. We have implemented BCFA in Boa and evaluated it using a set of representative static analyses that mainly involve traversing CFGs, on two large datasets containing 287 thousand and 162 million CFGs. Our results show that BCFA can speed up large-scale analyses by 1%-28%. Further, BCFA has low overhead (less than 0.2%) and a low misprediction rate (less than 0.01%).
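    To make the per-CFG selection idea concrete, here is a minimal sketch of a traversal-strategy selector. The strategy names, property fields, and heuristic thresholds are illustrative assumptions, not the actual rules implemented in Boa's BCFA.

```python
# Illustrative sketch only: names and thresholds are hypothetical, not Boa's BCFA rules.
from dataclasses import dataclass

@dataclass
class CFGProperties:
    node_count: int
    edge_count: int
    has_cycles: bool          # cyclicity of the graph

@dataclass
class CFAProperties:
    is_order_sensitive: bool  # does the analysis depend on visit order?
    needs_fixpoint: bool      # must the traversal iterate to a fixed point?

def branching_factor(cfg: CFGProperties) -> float:
    """Average out-degree of the CFG."""
    return cfg.edge_count / max(cfg.node_count, 1)

def select_traversal(cfg: CFGProperties, cfa: CFAProperties) -> str:
    """Pick a traversal strategy per CFG instead of one fixed strategy for all."""
    if not cfa.is_order_sensitive:
        return "any-order"              # a single pass in arbitrary order suffices
    if cfa.needs_fixpoint and cfg.has_cycles:
        return "worklist"               # iterate until the analysis facts stabilize
    if branching_factor(cfg) > 1.5:
        return "reverse-post-order"     # suits forward analyses on branchy graphs
    return "post-order"

# Example: a branchy, acyclic CFG analyzed by an order-sensitive, single-pass CFA.
print(select_traversal(CFGProperties(120, 200, False),
                       CFAProperties(True, False)))   # -> "reverse-post-order"
```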

    Timed Specification For Web Services Compatibility Analysis

    Web services are becoming one of the main technologies for designing and building complex inter-enterprise business applications. Usually, a business application cannot be fulfilled by a single Web service but by coordinating a set of them. In particular, compatibility analysis is one of the important investigations when performing such a coordination. Two Web services are said to be compatible if they can interact correctly. In the literature, the proposed frameworks for checking service compatibility rely on the supported sequences of messages. However, the interaction of services also depends on other properties, such as the exchanged data flow, so considering only supported sequences of messages is insufficient. Temporal constraints are another property on which service interaction can rely. In this paper, we focus on the compatibility analysis of Web services with respect to (1) their supported sequences of messages, (2) the exchanged data flow, (3) constraints related to the exchanged data flow, and (4) temporal requirements. Based on these properties, we study three compatibility classes: (i) absolute compatibility, (ii) likely compatibility, and (iii) absolute incompatibility
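    As a rough illustration of how a three-way verdict over message sequences and time windows might look, here is a minimal sketch. The data model, field names, and classification logic are hypothetical simplifications, not the paper's formal definitions.

```python
# Hypothetical sketch: a three-way compatibility verdict over timed message sequences.
from dataclasses import dataclass
from typing import List

@dataclass
class TimedMessage:
    direction: str      # "send" or "receive"
    name: str           # message name (standing in for its data flow)
    earliest: float     # temporal constraint: start of the allowed time window
    latest: float       # temporal constraint: end of the allowed time window

def check_compatibility(a: List[TimedMessage], b: List[TimedMessage]) -> str:
    """Return 'absolute', 'likely', or 'incompatible' for two timed conversations."""
    if len(a) != len(b):
        return "incompatible"
    fully_overlapping = True
    for ma, mb in zip(a, b):
        # Messages must complement each other: a send must meet a receive of the same name.
        if ma.name != mb.name or ma.direction == mb.direction:
            return "incompatible"
        # Temporal requirement: the two time windows must intersect.
        if ma.earliest > mb.latest or mb.earliest > ma.latest:
            return "incompatible"
        # Partially overlapping windows mean the interaction may or may not succeed.
        if ma.earliest != mb.earliest or ma.latest != mb.latest:
            fully_overlapping = False
    return "absolute" if fully_overlapping else "likely"

client = [TimedMessage("send", "order", 0, 5), TimedMessage("receive", "invoice", 5, 20)]
shop   = [TimedMessage("receive", "order", 0, 10), TimedMessage("send", "invoice", 8, 15)]
print(check_compatibility(client, shop))   # -> "likely"
```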

    Reynolds-Averaged Turbulence Modeling Using Type I and Type II Machine Learning Frameworks with Deep Learning

    Deep learning (DL)-based Reynolds stress models, with their capability to leverage large amounts of data, can be used to close the Reynolds-averaged Navier-Stokes (RANS) equations. Type I and Type II machine learning (ML) frameworks are studied to investigate data and flow feature requirements when training DL-based Reynolds stress models. The paper presents a method, flow features coverage mapping (FFCM), to quantify the physics coverage of DL-based closures; it can be used to examine the sufficiency of training data points as well as of input flow features for data-driven turbulence models. Three case studies are formulated to demonstrate the properties of Type I and Type II ML. The first case indicates that errors of RANS equations with DL-based Reynolds stress trained by Type I ML accumulate over simulation time when the training data do not sufficiently cover transient details. The second case uses Type I ML to show that DL can infer the time history of flow transients from data sampled at various times. The case study also shows that the necessary and sufficient flow features of DL-based closures are first-order spatial derivatives of the velocity fields. The last case demonstrates the limitation of Type II ML for unsteady flow simulation: Type II ML requires initial conditions to be sufficiently close to the reference data before the reference data can be used to improve the RANS simulation
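    A minimal sketch of such a closure is shown below, assuming (as the abstract states) that first-order velocity derivatives are the input flow features. The network architecture and sizes are illustrative choices, not the paper's model.

```python
# Minimal sketch of a DL-based Reynolds-stress closure; architecture is illustrative.
import torch
import torch.nn as nn

class ReynoldsStressNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Input: 9 components of the velocity-gradient tensor du_i/dx_j.
        # Output: 6 unique components of the symmetric Reynolds-stress tensor.
        self.net = nn.Sequential(
            nn.Linear(9, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 6),
        )

    def forward(self, grad_u: torch.Tensor) -> torch.Tensor:
        return self.net(grad_u)

model = ReynoldsStressNet()
grad_u = torch.randn(32, 9)   # a batch of velocity-gradient samples
tau = model(grad_u)           # predicted Reynolds stresses, shape (32, 6)
print(tau.shape)
```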

    Needs and challenges for assessing the environmental impacts of engineered nanomaterials (ENMs).

    The potential environmental impact of nanomaterials is a critical concern, and the ability to assess these potential impacts is a top priority for the progress of sustainable nanotechnology. Risk assessment tools are needed to enable decision makers to rapidly assess the potential risks posed by engineered nanomaterials (ENMs), particularly when confronted by the reality of limited hazard or exposure data. In this review, we examine a range of available risk assessment frameworks, considering the contexts in which different stakeholders may need to assess the potential environmental impacts of ENMs. Assessment frameworks and tools that are suitable for the different decision analysis scenarios are then identified. In addition, we identify the gaps that currently exist between the needs of decision makers, for a range of decision scenarios, and the abilities of present frameworks and tools to meet those needs

    Modeling solid state stability for speciation: a ten-year long study

    Speciation studies are based on fundamental models that relate the properties of biomimetic coordination compounds to the stability of the complexes. In addition to the classic approach based on solution studies, solid state properties have recently been proposed as supporting tools to understand the bioavailability of the involved metal. A ten-year long systematic study of several different complexes of imidazole-substituted ligands with transition metal ions led our group to define a model based on experimental evidence. This model proved to be a useful tool for predicting the stability of such coordination complexes and is based on the behavior induced under thermal stress. Several different solid state complexes were characterized by Thermally Induced Evolved Gas Analysis by Mass Spectrometry (TI-EGA-MS). This hyphenated technique provides fundamental information to determine solid state properties and to create a model that relates stability to coordination. In this research, the model resulting from our ten-year long systematic study of complexes of transition metal ions with imidazole-substituted ligands is described. To add information systematically, new complexes of Cu(II), Zn(II), or Cd(II) with 2-propyl-4,5-imidazoledicarboxylic acid were precipitated, characterized, and studied by TI-EGA-MS. The hyphenated approach was applied to enrich the information related to the thermally induced steps, to confirm the supposed decomposition mechanism, and to determine the thermal stability of the studied complexes. The results again support the theory that only two main, characteristic, and common thermally induced decomposition behaviors are shared by the imidazole-substituted complexes studied by our group. These two behaviors can be considered typical trends, and the model allowed us to predict coordination behavior and to provide speciation information