41 research outputs found

    Lightweight Call-Graph Construction for Multilingual Software Analysis

    Analysis of multilingual codebases is a topic of increasing importance. In prior work, we have proposed the MLSA (MultiLingual Software Analysis) architecture, an approach to the lightweight analysis of multilingual codebases, and have shown how it can be used to address the challenge of constructing a single call graph from multilingual software with mutual calls. This paper addresses the challenge of constructing monolingual call graphs in a lightweight manner (consistent with the objective of MLSA) that nonetheless yields sufficient information for resolving language interoperability calls. A novel approach is proposed that leverages information from a compiler-generated AST to provide the necessary call-graph quality, while the extraction program itself is written using an Island Grammar that parses the AST, providing the necessary lightweight aspect. Performance results are presented for a C/C++ implementation of the approach, PAIGE (Parsing AST using Island Grammar Call Graph Emitter), showing that despite its lightweight nature it outperforms Doxygen, is robust to changes in the (Clang) AST, and is not restricted to C/C++. Comment: 10 pages
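
    As a rough, minimal sketch (not the PAIGE tool itself, whose grammar and output format are not given in this abstract), the following Python fragment illustrates the island-grammar idea under the assumption that call edges are recovered from the textual output of clang -Xclang -ast-dump -fsyntax-only: only the node kinds of interest (FunctionDecl definitions and the DeclRefExpr references produced by call expressions) are matched, and all other dump text is skipped as "water".

        import re
        import sys
        from collections import defaultdict

        # "Islands" of interest in the textual AST dump; everything else is ignored.
        FUNC_DECL = re.compile(r"FunctionDecl .* (\w+) '")              # function definitions
        CALL_REF = re.compile(r"DeclRefExpr .* Function .* '(\w+)' '")  # callee references

        def extract_call_edges(ast_dump_lines):
            """Return {caller: set(callees)} from a textual Clang AST dump."""
            edges = defaultdict(set)
            current = None                        # function whose subtree we are inside
            for line in ast_dump_lines:
                m = FUNC_DECL.search(line)
                if m:
                    current = m.group(1)          # entered a new function "island"
                    continue
                m = CALL_REF.search(line)
                if m and current is not None:
                    edges[current].add(m.group(1))  # record a caller -> callee edge
            return edges

        if __name__ == "__main__":
            for caller, callees in extract_call_edges(sys.stdin).items():
                for callee in sorted(callees):
                    print(caller, "->", callee)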

    Big data analytics correlation taxonomy

    Big data analytics (BDA) is an increasingly popular research area for both organisations and academia because of its usefulness in facilitating human understanding and communication. In the literature, researchers have focused on classifying big data according to data type, data security, or level of difficulty, and many research papers reveal a lack of evidence on how big data analytics methods relate to their associated techniques in real-world use. As a result, many organisations are still struggling to realise the actual value of big data analytics methods and their associated techniques. This paper therefore gives a design-research account of a step towards understanding the relation between analytical methods and their associated techniques. It attempts to clarify this uncertainty and to distinguish analytics methods from techniques by giving clear definitions of each method and its associated techniques, which are later integrated into a new correlation taxonomy based on the research approaches. The primary outcome of this research is, for the first time, a correlation taxonomy that combines the analytics methods used for big data with their recommended techniques and that is applicable across various sectors. The investigation was carried out by studying descriptive articles on big data analytics methods and their associated techniques in different industries.

    Modeling and formal verification of probabilistic reconfigurable systems

    In this thesis, we propose a new approach for the formal modeling and verification of adaptive probabilistic systems. Dynamically reconfigurable systems are the trend in future technological systems, such as flight control systems, vehicle electronic systems, and manufacturing systems. In order to meet user and environmental requirements, such a dynamically reconfigurable system has to actively adjust its configuration at run-time by modifying its components and connections whenever changes are detected in the internal or external execution environment. On the other hand, since the behavior of the system is unpredictable, these changes may violate constraints on memory usage, required energy, and real-time behavior. They might also make the system's functions unavailable for some time and cause potential harm to human life or to large financial investments. Thus, updating a system with any new configuration requires that the post-reconfiguration system fully satisfies the related constraints. We introduce the GR-TNCES formalism for the optimal functional and temporal specification of probabilistic reconfigurable systems under resource constraints. It enables the optimal specification of the probabilistic, energy, and memory constraints of such a system. To formally verify the correctness and safety of such a probabilistic system specification, and the non-violation of its properties, an automatic transformation from GR-TNCES models into PRISM models is introduced. Moreover, a new logic, XCTL, is also proposed to formally verify reconfigurable systems; it enables the formal certification of incomplete and reconfigurable systems. A new version of the software ZIZO is also proposed to model, simulate, and verify such GR-TNCES models. To prove its relevance, the latter was applied to case studies: it was used to model and simulate the behavior of the IPv4 protocol to prevent violations of energy and memory resources, and to optimize the energy consumption of an automotive skid conveyor.

    This thesis presents a new approach to the formal modeling and verification of dynamically reconfigurable systems. Dynamically reconfigurable systems are found in many current and future applications, such as flight control systems, vehicle electronics, and manufacturing systems, and they exhibit probabilistic, adaptive behavior. To continuously meet user and environmental requirements, such a system has to actively adapt its configuration at run-time by modifying its components, the connections between components, and its data (adaptive) as soon as changes are detected in the internal or external execution environment (probabilistic). These adaptations must not violate constraints on memory usage, required energy, or existing real-time conditions. An unchecked reconfiguration could make the system's functions unavailable for some time, endanger human life, or cause large financial damage. Updating a system with a new configuration therefore requires that the reconfigured system fully complies with the associated constraints. To check this, this thesis proposes the GR-TNCES formalism, an extension of Petri nets, for the optimal functional and temporal specification of probabilistic reconfigurable systems under resource constraints. The resulting models are verified by probabilistic model checking, for which the established tool PRISM is well suited. To enable this verification, the thesis describes a procedure for transforming GR-TNCES models into PRISM models. A newly introduced logic (XCTL) additionally allows a simple description of the properties to be checked. These steps were implemented in a software environment for automated design, simulation, and formal verification (via an automatic transformation to PRISM). A case study demonstrates the application of the approach.
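
    As a toy illustration only (this is not the ZIZO tool or the actual GR-TNCES-to-PRISM transformation; the states, probabilities, and property below are invented for the example), the following Python sketch shows the kind of PRISM DTMC module such a translation could emit, with states standing for configurations and probabilistic transitions for reconfiguration outcomes.

        def emit_prism_dtmc(transitions, init=0, n_states=3):
            """transitions: {source_state: [(probability, target_state), ...]}"""
            lines = ["dtmc", "", "module reconfig",
                     "  s : [0..%d] init %d;" % (n_states - 1, init)]
            for src, branches in sorted(transitions.items()):
                rhs = " + ".join("%s:(s'=%d)" % (p, dst) for p, dst in branches)
                lines.append("  [] s=%d -> %s;" % (src, rhs))
            lines.append("endmodule")
            return "\n".join(lines)

        # Invented example: from state 0 a reconfiguration succeeds with probability 0.9;
        # state 1 is a safe configuration, state 2 violates a resource constraint.
        toy = {
            0: [(0.9, 1), (0.1, 2)],
            1: [(1.0, 1)],
            2: [(1.0, 2)],
        }
        print(emit_prism_dtmc(toy))
        # A PCTL property such as  P>=0.9 [ F s=1 ]  could then be checked in PRISM.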

    Exploring Model-to-Model Transformations for RIA Architectures by means of a Systematic Mapping Study

    This study focuses on model-to-model (M2M) transformations, as part of the Model-Driven Development (MDD) approach, for Rich Internet Applications (RIA). The main aim of this study is to identify fields that require further contributions and/or research opportunities in this context. We applied mapping study techniques, since they use the same basic methodology as reviews but are more general and aimed at discovering research trends, allowing us to identify gaps in the literature. From an initial set of 132 papers, we first selected 30; then, following experts' suggestions, we added 3 more, so that 33 research papers were considered in total. The analysis led to several observations, among the most important of which are: the large number of newly proposed methods, the scarcity of rigorous and formal validation of such methods, the problem of the portability of Platform Independent Models (PIM), and the low number of tools available for MDD.

    An Update on Effort Estimation in Agile Software Development: A Systematic Literature Review

    Software developers require effective effort estimation models to facilitate project planning. Although Usman et al. systematically reviewed and synthesized the effort estimation models and practices for Agile Software Development (ASD) in 2014, new evidence may provide new perspectives for researchers and practitioners. This article presents a systematic literature review that updates the Usman et al. study from 2014 to 2020 by analyzing the data extracted from 73 new papers. This analysis allowed us to identify six agile methods: Scrum, Extreme Programming, and four others, in all of which expert-based estimation methods continue to play an important role. This is particularly the case for Planning Poker, which is very closely related to the most frequently used size metric (story points) and to the way in which software requirements are specified in ASD. There is also a remarkable trend toward studying techniques based on the intensive use of data. In this respect, although most of the data originate from single-company datasets, there is a significant increase in the use of cross-company data. With regard to cost factors, we applied the thematic analysis method: the use of team and project factors appears to be more frequent than the consideration of more technical factors, in accordance with agile principles. Finally, although accuracy is still a challenge, we found that improvements have been made. On the one hand, an increasing number of papers showed acceptable accuracy values, although many continued to report inadequate results. On the other hand, almost 29% of the papers that reported the accuracy metric used also reflected aspects concerning the validation of the models, and 18% reported the effect size when comparing models. This work was supported by the Spanish Ministry of Science, Innovation and Universities through the Adapt@Cloud Project under Grant TIN2017-84550-R. Fernández-Diego, M.; Méndez, E. R.; González-Ladrón-De-Guevara, F.; Abrahao Gonzales, S. M.; Insfran, E. (2020). An Update on Effort Estimation in Agile Software Development: A Systematic Literature Review. IEEE Access, 8, 166768-166800. https://doi.org/10.1109/ACCESS.2020.3021664
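
    To make the accuracy discussion concrete, the sketch below computes two measures commonly reported in the effort-estimation literature, MMRE (mean magnitude of relative error) and Pred(25); the choice of metrics and all numbers here are illustrative assumptions and are not taken from the reviewed papers.

        def mmre(actual, estimated):
            """Mean Magnitude of Relative Error: mean of |actual - estimate| / actual."""
            return sum(abs(a - e) / a for a, e in zip(actual, estimated)) / len(actual)

        def pred(actual, estimated, level=0.25):
            """Pred(25): fraction of estimates within 25% of the actual effort."""
            hits = sum(1 for a, e in zip(actual, estimated) if abs(a - e) / a <= level)
            return hits / len(actual)

        actual = [10, 20, 40, 80]      # actual effort, e.g. person-days (made-up values)
        estimated = [12, 18, 50, 70]   # estimates, e.g. from Planning Poker (made-up values)
        print("MMRE = %.2f, Pred(25) = %.2f"
              % (mmre(actual, estimated), pred(actual, estimated)))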

    Exploring Model-to-Model Transformations for RIA Architectures by means of a Systematic Mapping Study

    This study focuses on model-to-model (M2M) transformations, as part of the Model-Driven Development (MDD) approach, for Rich Internet Applications (RIA). The main aim of this study is to identify fields that require further contributions and/or research opportunities in this context.

    Exchange Communication Point Modeling in the context of the Enterprise Architecture

    It is important to understand the performance and operation of an Internet Exchange Point in order to improve its management and to reduce the costs associated with its implementation and with the information shared. Enterprise Architecture supports the design of systems according to the business domain processes, the network infrastructure, and all the different applications running on it. Existing Enterprise Architecture modelling languages provide only a general concept of a network and do not represent specific information such as the protocols used, the internet protocols, or the network addresses used for sharing information. This paper proposes a set of new concepts and attributes for the technology layer of the reference language (ArchiMate) to enhance the representation and management of the network infrastructure. The ArchiMate language extensions are then used to model two case studies of Internet Exchange Point implementation in the Portuguese Public Administration. It was possible to compute which services would be impacted in the event of a failure.
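
    Purely as a hypothetical illustration (the abstract does not list the proposed concepts, so the attribute names below, such as network_address and protocol, are assumptions), a toy data model of the kind of technology-layer information the extension aims to capture might look as follows in Python:

        from dataclasses import dataclass, field

        @dataclass
        class PeeringInterface:
            network_address: str   # address used on the exchange fabric (hypothetical attribute)
            protocol: str          # exchange/routing protocol in use (hypothetical attribute)

        @dataclass
        class ExchangeMember:
            name: str
            interfaces: list = field(default_factory=list)

        # Toy model of two members whose shared services could then be analysed
        # for failure impact at the Enterprise Architecture level.
        members = [
            ExchangeMember("AgencyA", [PeeringInterface("192.0.2.10", "BGP")]),
            ExchangeMember("AgencyB", [PeeringInterface("192.0.2.20", "BGP")]),
        ]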

    Scientific History of Incipit in the period 2010-2016

    A record of the scientific and technical activity of the Institute of Heritage Sciences (Incipit) of the CSIC, based in Santiago de Compostela, from its creation in 2010 until 2016. It presents the Incipit's mission and research lines, which are centred mainly on the study of heritagization processes and of the social valorisation of cultural heritage, carried out from a transdisciplinary perspective. It lists the publications, research projects, public science activities, communication events, and outreach products that its research staff have produced over these years. Contents: general introduction to the Incipit; presentation of the research line Cultural Heritage Studies, with the sub-themes Landscape Archaeology and Cultural Landscapes, Heritagization Processes: Memory, Power and Ethnicity, Socioeconomics of Cultural Heritage, Archaeology of the Contemporary Past, and Material Culture and Formalization Processes of Cultural Heritage; scientific contributions; transfer of knowledge; international activities; other activities and results; scientific dissemination.