
    Formal Verification of Probabilistic SystemC Models with Statistical Model Checking

    Transaction-level modeling with SystemC has been very successful in describing the behavior of embedded systems by providing high-level executable models, many of which have inherent probabilistic behaviors, e.g., random data and unreliable components. It is thus crucial to have both quantitative and qualitative analysis of the probabilities of system properties. Such analysis can be conducted by constructing a formal model of the system under verification and using Probabilistic Model Checking (PMC). However, this method is infeasible for large systems due to state space explosion. In this article, we demonstrate the successful use of Statistical Model Checking (SMC) to carry out such analysis directly from large SystemC models, allowing designers to express a wide range of useful properties. The first contribution of this work is a framework to verify properties expressed in Bounded Linear Temporal Logic (BLTL) for SystemC models with both timed and probabilistic characteristics. Second, the framework allows users to expose a rich set of user-code primitives as atomic propositions in BLTL. Moreover, users can define their own fine-grained time resolution rather than the boundary of clock cycles in the SystemC simulation. The third contribution is an implementation of a statistical model checker. It contains an automatic monitor generator for producing execution traces of the model-under-verification (MUV), a mechanism for automatically instrumenting the MUV, and the interaction with statistical model checking algorithms. (Comment: Journal of Software: Evolution and Process, Wiley, 2017. arXiv admin note: substantial text overlap with arXiv:1507.0818)
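    The article's toolchain is not reproduced here; purely as a minimal sketch of the statistical side of SMC, the Python snippet below estimates the probability of a bounded "eventually" (BLTL-style) property from independently simulated traces, using a Chernoff-Hoeffding sample size. The simulate function, the delivered flag and the toy lossy-channel model are hypothetical and not taken from the article.

```python
import math
import random

def chernoff_sample_size(epsilon, delta):
    """Number of independent traces so the estimate is within epsilon of the
    true probability with confidence 1 - delta (Chernoff-Hoeffding bound)."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * epsilon ** 2))

def bounded_eventually(trace, prop, bound):
    """BLTL-style check: does `prop` hold in some state within the first `bound` steps?"""
    return any(prop(state) for state in trace[:bound])

def estimate_probability(simulate, prop, bound, epsilon=0.01, delta=0.05):
    """Estimate P(F<=bound prop) from independent simulated traces.
    `simulate()` must return one finite execution trace (a list of states)."""
    n = chernoff_sample_size(epsilon, delta)
    successes = sum(bounded_eventually(simulate(), prop, bound) for _ in range(n))
    return successes / n

if __name__ == "__main__":
    # Toy model: a lossy channel that delivers with probability 0.9 per step.
    simulate = lambda: [{"delivered": random.random() < 0.9} for _ in range(20)]
    p = estimate_probability(simulate, lambda s: s["delivered"], bound=5)
    print(f"Estimated P(F<=5 delivered) ~ {p:.3f}")
```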

    Towards Combining Model Checking and Proof Checking

    Model checking and automated theorem proving are two pillars of formal verification methods. This paper investigates model checking from an automated theorem proving perspective, aiming to combine the expressiveness of automated theorem proving with the complete automation of model checking. It focuses on the verification of temporal logic properties of Kripke models. The main contributions are: (1) introducing an extended computation tree logic that allows polyadic predicate symbols; (2) designing a proof system for this logic, taking Kripke models as parameters; (3) developing a proof search algorithm for this system and a new automated theorem prover to implement it. The verification process of the new prover is completely automatic and produces either a counterexample when the property does not hold or a certificate when it does. The experimental results compare well to existing state-of-the-art tools on some benchmarks, and the efficiency is illustrated by application to an air traffic control problem.
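    For orientation, the sketch below shows the classic fixed-point labelling approach to CTL model checking on an explicit Kripke structure; it is a textbook illustration only and does not reproduce the paper's extended polyadic logic, proof system or certificate generation.

```python
# Minimal CTL labelling on an explicit Kripke structure (fixed-point style).
# States, transitions and the atomic proposition 'p' below are invented.

def sat_ex(states, succ, phi_set):
    """States with some successor satisfying phi (EX phi)."""
    return {s for s in states if succ[s] & phi_set}

def sat_eu(states, succ, phi_set, psi_set):
    """Least fixed point for E[phi U psi]."""
    result = set(psi_set)
    changed = True
    while changed:
        new = {s for s in phi_set if succ[s] & result} - result
        changed = bool(new)
        result |= new
    return result

def sat_eg(states, succ, phi_set):
    """Greatest fixed point for EG phi."""
    result = set(phi_set)
    changed = True
    while changed:
        drop = {s for s in result if not (succ[s] & result)}
        changed = bool(drop)
        result -= drop
    return result

# Tiny example: three states in a ring, 'p' holds in states 1 and 2.
states = {0, 1, 2}
succ = {0: {1}, 1: {2}, 2: {0}}
p = {1, 2}
print(sat_eu(states, succ, states, p))  # E[true U p]: every state reaches p -> {0, 1, 2}
print(sat_eg(states, succ, p))          # EG p: every p-path eventually leaves p -> set()
```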

    Maravedí: A Secure and Practical Protocol to Trade Risk for Instantaneous Finality

    The efficiency of blockchain systems is often compared to popular credit card networks with respect to the transactions-per-second rate. This seems to be an unfair comparison, since these networks do not complete a transaction from beginning to end; rather, they buy the risk and settle it much later. Typically, transactions have only two players, the payer and the payee, and the settlement of a transaction takes time since it depends on basic properties of the consensus protocol. In practice, the payee very often needs to wait for confirmation in order to ship the traded goods. Alternatively, the payee, or merchant, can ship the goods trusting that the transaction will be confirmed. Our contribution, the Maravedí Protocol, introduces a third player to minimize the risk that the payee is left without payment, even without confirmation by the consensus layer. The main idea is that the third player works similarly to a credit card company: it buys the risk from the merchant at a small discount and pays the merchant instantaneously via a payment-channel-like protocol. In parallel, the third player receives the regular payment transaction from the payer, which is settled on the chain after the time required by the consensus layer. Moreover, the on-chain transaction pays the full amount, allowing the third player to cash in the discount. Hence, on the merchant's side, our protocol puts forth instantaneous finality in a way that is, to the best of our knowledge, novel.
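    The protocol itself is not spelled out in the abstract; purely to illustrate the accounting it describes (instant off-chain payout minus a discount, later full on-chain settlement to the intermediary), here is a toy sketch. The party names, balances and the flat 2% discount are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Party:
    name: str
    balance: float = 0.0

def instant_payout(merchant: Party, intermediary: Party, amount: float, discount: float):
    """Off-chain, payment-channel-style payout: the merchant is paid immediately."""
    payout = amount * (1.0 - discount)
    intermediary.balance -= payout
    merchant.balance += payout

def onchain_settlement(payer: Party, intermediary: Party, amount: float):
    """Later, the payer's full on-chain transaction settles to the intermediary."""
    payer.balance -= amount
    intermediary.balance += amount

payer = Party("payer", 100.0)
merchant = Party("merchant")
intermediary = Party("intermediary", 50.0)

instant_payout(merchant, intermediary, amount=100.0, discount=0.02)
onchain_settlement(payer, intermediary, amount=100.0)
print(merchant.balance, intermediary.balance)  # 98.0 and 52.0: the 2.0 discount is the intermediary's fee
```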

    Constructing and restraining the societies of surveillance: Accountability, from the rise of intelligence services to the expansion of personal data networks in Spain and Brazil (1975-2020)

    The objective of this study is to examine the development of socio-technical accountability mechanisms in order to: a) preserve and increase the autonomy of individuals subjected to surveillance, and b) redress the asymmetry of power between those who watch and those who are watched. To do so, we address two surveillance realms: intelligence services and personal data networks. The cases studied are Spain and Brazil, from the beginning of the political transitions in the 1970s (in the realm of intelligence) and from the expansion of Internet digital networks in the 1990s (in the realm of personal data) to the present time. The examination of accountability thus comprises a holistic evolution of institutions, regulations, market strategies, as well as resistance tactics. The conclusion summarizes the accountability mechanisms and proposes universal principles to improve the legitimacy of authority in surveillance and in politics in a broad sense.

    Model Checking and Model-Based Testing : Improving Their Feasibility by Lazy Techniques, Parallelization, and Other Optimizations

    This thesis focuses on the lightweight formal method of model-based testing for checking safety properties, and derives a new and more feasible approach. For liveness properties, dynamic testing is impossible, so feasibility is increased by specializing in an important class of properties, livelock freedom, and deriving a more feasible model checking algorithm for it. All of the mentioned improvements are substantiated by experiments.
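    The thesis's own optimized algorithm is not reproduced here; as a rough, textbook-style illustration of what checking livelock freedom can mean, the sketch below treats a livelock as a reachable cycle that uses no "progress" transition and looks for one by depth-first search. The state space, transition relation and progress labelling are assumptions for illustration.

```python
def has_livelock(init, transitions):
    """transitions: dict state -> list of (next_state, is_progress)."""
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {}

    def dfs(s):
        colour[s] = GREY
        for t, is_progress in transitions.get(s, []):
            if is_progress:
                continue  # only follow non-progress steps when hunting cycles
            c = colour.get(t, WHITE)
            if c == GREY:
                return True  # cycle of non-progress transitions found
            if c == WHITE and dfs(t):
                return True
        colour[s] = BLACK
        return False

    # Collect every state reachable from init (via any transition), then try each as a cycle root.
    reachable, stack = {init}, [init]
    while stack:
        s = stack.pop()
        for t, _ in transitions.get(s, []):
            if t not in reachable:
                reachable.add(t)
                stack.append(t)
    return any(colour.get(s, WHITE) == WHITE and dfs(s) for s in reachable)

# Example: state 'b' loops on itself without making progress -> livelock.
print(has_livelock("a", {"a": [("b", True)], "b": [("b", False)]}))  # True
```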

    Study, analysis and implementation of an enterprise mobility management system

    Enterprise mobility management (EMM) has recently become a hot topic for business organizations. Enterprises have seen how the introduction of tablets and smartphones into corporate work has, on the one hand, revolutionized the way some business is done, while on the other hand it has uncovered serious deficiencies in security and access control. This document was created in response to the imperative need to manage and remotely control devices, the applications they run and the content they have access to. The initial objectives were: to study the main characteristics of enterprise mobility and why it needs to be managed; to analyse business mobility requirements and define an Enterprise Mobility Management strategy; and to compare, select and implement an Enterprise Mobility Management System and evaluate whether it satisfies the needs identified. The work followed a phased methodology. First, intensive fieldwork was carried out to analyse mobility requirements in a real case study. Next, a solution based on an Enterprise Mobility Management System (EMMS) was designed to fulfil all the needs identified. Finally, building on these studies, a laboratory was set up to put into practice all the management and control techniques studied. At the end of the project, the following conclusions were reached. Companies have much to gain from learning how to carry out their activities in a mobility ecosystem; rather than resisting this phenomenon, it is better to adopt a constructive and collaborative mindset. To define the EMM program, it is therefore necessary to analyse the company and its business, with business units telling IT how they consume, create, collaborate and communicate in their activities so that IT can respond to their mobility needs. Managing a heterogeneous device platform is a major management, maintenance and support challenge. It is therefore essential to identify the Enterprise Mobility Management System that best fits the requirements, if necessary by making a thorough comparison of the different options on the market. EMM solutions must be able to keep the device and its contents under control during the whole life cycle, from delivery to withdrawal. To do so, EMMS capabilities must include Mobile Device Management, Mobile Application Management and Mobile Content Management. By themselves, these systems improve management and security, but they do not deliver gains in productivity and efficiency until they are integrated with corporate services. This integration is the true leitmotiv of this type of solution, and it is also the most difficult part, because many of the systems currently deployed in companies were implemented without taking into account their use under conditions other than the traditional PC.

    Integració de sistemes SCADA de codi obert en una aplicació web GIS

    This thesis explores the integration of Supervisory Control And Data Acquisition (SCADA) and Geographic Information Systems (GIS) technologies within a web server, using open source software, for a water supply company. The motivation for this research comes from the interest in studying the feasibility of implementing open source technologies instead of proprietary ones, since the latter entail high implementation and maintenance costs, as well as dependence on specific suppliers. The motivation for using web-based software is easy access for the rest of the users, who can interact from the same office network or from the field. The objectives of this thesis were twofold. The first was the development of the web server, which integrates the functionalities of purely SCADA web servers, such as system monitoring and control, historical data visualization and statistical analysis, together with purely GIS functionalities, such as the use of maps, layers and geospatial information. The second was to describe the operational structure of the company, highlight the technologies used and explore opportunities for improvement. The results of this investigation have been positive: the web server was successfully developed and effectively integrates SCADA and GIS technologies. The server shows that there is no need for multiple web servers to accomplish the various specific tasks, and the use of open source technologies was not an impediment during development. In addition, an architecture has been proposed for the company that includes a technological renewal based on the principles of IoT communication and open source hardware and software. The conclusions drawn from the work are that open source technologies present a viable alternative for integrating GIS and SCADA systems. The development of the web server with open source technologies shows that they can be merged effectively in certain cases, such as a water supply company or similar companies such as electricity companies, ISPs, gas suppliers or logistics companies. The combination of GIS and SCADA technologies offers good compatibility and numerous advantages. A second conclusion is the high flexibility offered by technologies such as Python, which provides numerous libraries with a wide range of possibilities. On the other hand, one of the most notable drawbacks of the web server is its low performance, since it needs much more powerful hardware; this is due to the number of requests required to update the layers in real time, which demands high computational power. This part requires further optimization work, which could not be done in this project. Finally, the thesis presents different options for future studies and research in the field of web-based SCADA systems and GIS-SCADA systems.
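    The thesis only states that the server was built with open source Python tooling; as an illustrative sketch of how live SCADA readings might be served as a GIS layer for a web client to poll, the snippet below uses Flask and GeoJSON. The framework choice, endpoint path and station data are assumptions, not details from the thesis.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical latest readings keyed by station id: (lon, lat, flow in m3/h).
readings = {
    "pump_station_1": (2.083, 41.222, 118.4),
    "reservoir_2":    (2.091, 41.230,  63.0),
}

@app.route("/api/layers/flow")
def flow_layer():
    """Return the current readings as a GeoJSON FeatureCollection for a map layer."""
    features = [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": {"station": name, "flow_m3h": flow},
        }
        for name, (lon, lat, flow) in readings.items()
    ]
    return jsonify({"type": "FeatureCollection", "features": features})

if __name__ == "__main__":
    app.run(port=8000)
```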

    Sistema automàtic de fotomosaic subaquàtic per a l’avaluació de la comunitat bentònica utilitzant el vehicle d’operació remota crawler a l’observatori OBSEA

    The continuing need to study the oceans has led to the development of new marine technologies for the acquisition of real-time, multi-parametric oceanographic, biological and biogeochemical data. So far, the assessment of benthic populations and biodiversity studies have relied on fixed underwater observation platforms and static cameras with a limited field of view. Because these provide only a small representation of reality, new mobile platforms coupled to the fixed observatories are being developed. This work is based on the OBSEA cabled observatory, which is part of the European Multidisciplinary Seafloor and water column Observatory (EMSO), and on a Remotely Operated Vehicle (ROV), the underwater Crawler, a modified version of the "Wally" platform series. The new ROV is easily deployable for monitoring benthic communities, among other biological indices, at depths of up to 50 m. Here we present the Crawler components and the image processing algorithms applied to the HD camera embedded in a glass sphere at the front of the vehicle. For context, a 360° camera with a 180° tilt provides a panoramic field of view (FOV). We present the results obtained through back-and-forth video transects around the OBSEA platform. The control of the Crawler camera has been enhanced with an automatic underwater photomosaic system, with the objective of creating a mobile monitoring system. Moreover, we detail the results of the camera calibration tests performed in air, pool and sea scenarios, with corrections for the position of the camera and for problems with reflectance and light scattering. In addition, we explain the selection of the area of interest and how a perspective transformation has been applied according to projection theory. Finally, we present an analysis of images through spatially heterogeneous gradients with the aim of scaling the local data to larger areas.
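    The perspective-transformation step mentioned above can be illustrated with a common open-source approach: fit a homography from the four corners of the selected area of interest to a rectangle and warp the frame. OpenCV, the corner coordinates and the output size below are assumptions for illustration, not values from the thesis.

```python
import cv2
import numpy as np

def rectify_roi(frame, src_corners, out_w=800, out_h=600):
    """Warp the quadrilateral `src_corners` (4 pixel coordinates, ordered
    TL, TR, BR, BL) onto a flat out_w x out_h rectangle (top-down view)."""
    src = np.float32(src_corners)
    dst = np.float32([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]])
    H = cv2.getPerspectiveTransform(src, dst)   # 3x3 homography
    return cv2.warpPerspective(frame, H, (out_w, out_h))

if __name__ == "__main__":
    frame = cv2.imread("frame.png")             # hypothetical input frame
    corners = [(210, 140), (1080, 160), (1150, 700), (150, 680)]
    top_down = rectify_roi(frame, corners)
    cv2.imwrite("roi_rectified.png", top_down)
```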