
    Formal verification of Storm topologies through D-VerT

    Get PDF
    Data-intensive applications (DIAs) based on so-called Big Data technologies are nowadays a common solution adopted by IT companies to face their growing computational needs. The need for highly reliable applications able to handle huge amounts of data and the availability of infrastructures for distributed computing rapidly led industries to develop frameworks for streaming and big-data processing, like Apache Storm and Spark. The definition of methodologies and principles for good software design is, therefore, fundamental to support the development of DIAs. This paper presents an approach for non-functional analysis of DIAs through D-VerT, a tool for the architectural assessment of Storm applications. The verification is based on a translation of Storm topologies into the CLTLoc metric temporal logic. It allows the designer of a Storm application to check for the existence of components that cannot process their workload in a timely manner, typically due to an incorrect design of the topology.

    A Model-Driven Approach for the Formal Verification of Storm-Based Streaming Applications

    Get PDF
    Data-intensive applications (DIAs) based on so-called Big Data technologies are nowadays a common solution adopted by IT companies to face their growing computational needs. The need for highly reliable applications able to handle huge amounts of data and the availability of infrastructures for distributed computing rapidly led industries to develop frameworks for streaming and big-data processing, like Apache Storm and Spark. The definition of methodologies and principles for good software design is, therefore, fundamental to support the development of DIAs. This paper presents an approach for non-functional analysis of DIAs through D-VerT, a tool for the architectural assessment of Storm applications. The verification is based on a translation of Storm topologies into the CLTLoc metric temporal logic. It allows the designer of a Storm application to check for the existence of components that cannot process their workload in a timely manner, typically due to an incorrect design of the topology.
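
    To make the kind of design flaw targeted by D-VerT concrete, the following Python sketch checks a toy sufficient-throughput condition on a hypothetical topology: a bolt is flagged when its incoming tuple rate exceeds the aggregate service rate of its executors. This is only an illustrative back-of-envelope check, not the CLTLoc-based encoding that D-VerT actually performs; all component names and rates are made up.

        # Illustrative only: a back-of-envelope throughput check for a Storm-like
        # topology, NOT the CLTLoc-based verification performed by D-VerT.
        # All component names and figures below are hypothetical.

        topology = {
            # component: (parallelism, tuples processed per second per executor)
            "splitter": (2, 500.0),
            "counter":  (1, 300.0),
        }
        input_rate = {  # tuples per second arriving at each bolt
            "splitter": 800.0,
            "counter":  900.0,
        }

        def overloaded_bolts(topology, input_rate):
            """Return bolts whose executors cannot keep up with their input rate."""
            bad = []
            for bolt, (parallelism, service_rate) in topology.items():
                capacity = parallelism * service_rate
                if input_rate[bolt] > capacity:
                    bad.append((bolt, input_rate[bolt], capacity))
            return bad

        for bolt, arrival, capacity in overloaded_bolts(topology, input_rate):
            print(f"{bolt}: arrival {arrival}/s exceeds capacity {capacity}/s -> queue grows unboundedly")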

    Comparative Analysis of Process Mining Tools

    Get PDF
    In the current context of availability of large amounts of data (Big Data), the value underlying the data frequently goes unexploited. However, several tools make it possible to extract knowledge from data. Among other uses, this knowledge can help improve processes or detect failures during their execution. This work compares several process mining (PM) tools, using different techniques. For each tool, the best scenario for process discovery is identified and the respective results are evaluated. The results showed that Disco is the simplest and most intuitive tool to use. Along with ProM, it also allows a complete analysis without requiring theoretical knowledge of PM or programming. PM4Py, on the other hand, is a free framework that allows extensive customization of all functionalities, so it is well suited to professionals with PM knowledge who need a more tailored implementation or integration with other applications. From a cost perspective, both PM4Py and ProM are free. The use of PM4Py can be complemented by ProM for compliance verification.
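
    As a brief illustration of the kind of analysis these tools support, the sketch below uses the pm4py simplified interface to discover a Petri net from an event log and replay the log against it. The log file name is a placeholder, and the exact function names may differ between pm4py versions.

        # Minimal pm4py sketch: discover a Petri net from an event log and check
        # its fitness. Calls reflect the pm4py 2.x simplified interface; the log
        # file name is a hypothetical placeholder.
        import pm4py

        log = pm4py.read_xes("running_example.xes")

        # Process discovery with the Inductive Miner
        net, initial_marking, final_marking = pm4py.discover_petri_net_inductive(log)

        # Conformance checking via token-based replay
        fitness = pm4py.fitness_token_based_replay(log, net, initial_marking, final_marking)
        print(fitness)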

    Model-driven Engineering IDE for Quality Assessment of Data-intensive Applications

    Full text link
    This article introduces a model-driven engineering (MDE) integrated development environment (IDE) for Data-Intensive Cloud Applications (DIA) with iterative quality enhancements. As part of the H2020 DICE project (ICT-9-2014, id 644869), a framework is being constructed that is composed of a set of tools developed to support a new MDE methodology. One of these tools is the IDE, which acts as the front-end of the methodology and plays a pivotal role in integrating the other tools of the framework. The IDE enables designers to specify everything from the overall architectural structure of the application, together with its properties and QoS/QoD annotations, down to the deployment model. Besides the designer, administrators, quality assurance engineers, or software architects may also run the design and analysis tools and examine their output in order to assess the quality of the DIA in an iterative process.

    Analysis of the Security of BB84 by Model Checking

    Full text link
    Quantum cryptography, or quantum key distribution (QKD), is a technique that allows the secure distribution of a bit string used as a key in cryptographic protocols. When it was noted that quantum computers could break public-key cryptosystems based on number theory, extensive studies were undertaken on QKD. Based on quantum mechanics, QKD offers unconditionally secure communication. The progress of research in this field now allows the anticipation that QKD will be available outside of laboratories within the next few years. Efforts are being made to improve the performance and reliability of the implemented technologies, but several challenges remain despite this substantial progress. For example, the question of how to test QKD apparatuses has not yet received enough attention. These devices are becoming complex and demand a significant verification effort. In this paper we are interested in an approach based on probabilistic model checking for studying quantum information. Specifically, we use the PRISM tool to analyze the security of the BB84 protocol, focusing on the specific security property of eavesdropping detection. We show that this property is affected by the parameters of the quantum channel and the power of the eavesdropper.
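
    A rough intuition for the eavesdropping-detection property can be obtained without PRISM: under an intercept-resend attack, each intercepted qubit that lands in the publicly compared sample exposes the eavesdropper with probability 1/4, so detection improves with the sample size and with the fraction of qubits intercepted. The Python sketch below computes the resulting detection probability; it is a back-of-envelope illustration, not the probabilistic model checked in the paper.

        # Back-of-envelope illustration of BB84 eavesdropping detection (not the
        # PRISM model used in the paper). Each intercepted qubit in the compared
        # sample reveals the eavesdropper with probability 1/4.

        def detection_probability(sample_size: int, intercept_fraction: float) -> float:
            """Probability that at least one compared bit exposes the eavesdropper."""
            p_escape_per_bit = 1.0 - 0.25 * intercept_fraction
            return 1.0 - p_escape_per_bit ** sample_size

        for n in (10, 50, 200):
            print(n, round(detection_probability(n, intercept_fraction=1.0), 4))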

    Overview on agent-based social modelling and the use of formal languages

    Get PDF
    Transdisciplinary Models and Applications investigates a variety of programming languages used in validating and verifying models in order to assist in their eventual implementation. This book explores different methods of evaluating and formalizing simulation models, enabling computer and industrial engineers, mathematicians, and students working with computer simulations to thoroughly understand the progression from simulation to product, improving the overall effectiveness of modeling systems.

    Km4City Ontology Building vs Data Harvesting and Cleaning for Smart-city Services

    Get PDF
    Presently, a very large number of public and private data sets are available from local governments. In most cases, they are not semantically interoperable, and a huge human effort would be needed to create integrated ontologies and a knowledge base for a smart city. No smart-city ontology is yet standardized, and a lot of research work is needed to identify models that can easily support data reconciliation and the management of complexity, and that allow reasoning over the data. In this paper, a system is proposed for the ingestion and reconciliation of data on smart-city aspects such as the road graph, services available on the roads, and traffic sensors. The system manages a large volume of data coming from a variety of sources, considering both static and dynamic data. These data are mapped to a smart-city ontology, called Km4City (Knowledge Model for City), and stored into an RDF store, where they are available to applications via SPARQL queries, enabling new services to be offered to users through specific applications of public administrations and enterprises. The paper presents the process adopted to produce the ontology, the big-data architecture for feeding the knowledge base from open and private data, and the mechanisms adopted for data verification, reconciliation, and validation. Some examples of possible uses of the resulting coherent big-data knowledge base, accessible from the RDF store and related services, are also offered. The article also presents the work performed on reconciliation algorithms and their comparative assessment and selection.
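
    The sketch below shows how an application might consume such a knowledge base through a SPARQL endpoint, using the standard SPARQLWrapper Python client. The endpoint URL and the vocabulary terms in the query are hypothetical placeholders, not the actual Km4City ontology.

        # Minimal sketch of querying a Km4City-style RDF store over SPARQL.
        # Endpoint URL and property IRIs are hypothetical placeholders.
        from SPARQLWrapper import SPARQLWrapper, JSON

        endpoint = "http://example.org/km4city/sparql"   # hypothetical endpoint
        sparql = SPARQLWrapper(endpoint)
        sparql.setReturnFormat(JSON)
        sparql.setQuery("""
            SELECT ?service ?name WHERE {
                ?service a <http://example.org/km4city#Service> ;
                         <http://example.org/km4city#name> ?name .
            } LIMIT 10
        """)

        results = sparql.query().convert()
        for row in results["results"]["bindings"]:
            print(row["name"]["value"])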