A fast and accurate per-cell dynamic IR-drop estimation method for at-speed scan test pattern validation
ITC 2012: IEEE International Test Conference, 5-8 Nov. 2012, Anaheim, CA, USA. As operating frequencies increase and supply voltages shrink in nano-scale designs, vulnerability to IR-drop-induced yield loss has become increasingly apparent. It is therefore necessary to account for the delay increase caused by IR-drop during at-speed scan testing. Precise IR-drop analysis, however, is highly time-consuming. This paper addresses the issue with a novel per-cell dynamic IR-drop estimation method. Instead of performing full IR-drop analysis for each pattern one by one, the proposed method combines a global cycle-average power profile for each pattern with dynamic IR-drop profiles for a few representative patterns, effectively reducing total computation time. Experimental results on benchmark circuits demonstrate that the proposed method achieves both high accuracy and high time-efficiency.
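The core idea can be illustrated with a minimal sketch, under the assumption that a pattern's per-cell IR-drop scales roughly with its cycle-average power relative to a representative pattern whose dynamic IR-drop was fully analyzed. All names and numbers below are invented for illustration, not taken from the paper.

```python
# Hypothetical sketch: rescale a representative pattern's per-cell dynamic
# IR-drop profile by the target pattern's relative cycle-average power,
# avoiding a full IR-drop simulation for every pattern.

def estimate_ir_drop(rep_profile, rep_avg_power, pattern_avg_power):
    """Scale a representative per-cell IR-drop profile (mV per cell)
    by the target pattern's cycle-average power ratio."""
    scale = pattern_avg_power / rep_avg_power
    return {cell: drop * scale for cell, drop in rep_profile.items()}

# Representative pattern: fully simulated per-cell IR-drop in mV.
rep = {"U1": 40.0, "U2": 25.0, "U3": 10.0}

# A new pattern draws 20% more cycle-average power than the representative.
est = estimate_ir_drop(rep, rep_avg_power=100.0, pattern_avg_power=120.0)
print(est)  # each cell's drop scaled by 1.2
```

In practice the method would pick several representative patterns and interpolate between their profiles; this sketch shows only the single-representative scaling step.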
A Capture-Safe Test Generation Scheme for At-Speed Scan Testing
Capture-safety, defined as the avoidance of any timing error caused by unduly high launch switching activity in capture mode during at-speed scan testing, is critical for avoiding test-induced yield loss. Although point techniques are available for reducing capture IR-drop, complete capture-safe test generation flows are lacking. This paper addresses the problem by proposing a novel and practical capture-safe test generation scheme featuring (1) reliable capture-safety checking and (2) effective capture-safety improvement that combines X-bit identification and X-filling with low-launch-switching-activity test generation. The scheme is compatible with existing ATPG flows and achieves capture-safety with no changes to the circuit-under-test or the clocking scheme. 2008 13th European Test Symposium, 25-29 May 2008, Verbania, Italy.
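To make the X-filling step concrete, here is a sketch of one generic, textbook X-filling heuristic (adjacent fill), in which each don't-care bit copies its nearest preceding care bit to reduce transitions and hence switching activity. This is an illustrative stand-in, not the paper's exact algorithm.

```python
# Illustrative adjacent-fill heuristic: each 'X' in a test cube copies the
# last specified care bit, which tends to reduce launch switching activity.

def adjacent_fill(cube):
    """Fill 'X' bits with the last specified value (default '0')."""
    out, last = [], "0"
    for bit in cube:
        if bit == "X":
            out.append(last)
        else:
            out.append(bit)
            last = bit
    return "".join(out)

def transitions(bits):
    """Count adjacent 0<->1 transitions, a proxy for switching activity."""
    return sum(a != b for a, b in zip(bits, bits[1:]))

cube = "1XX0XX1X"
filled = adjacent_fill(cube)
print(filled, transitions(filled))  # "11100011", 2 transitions
```

A random fill of the same cube would typically produce more transitions, which is why low-power fills like this are used when capture IR-drop is a concern.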
Conformance Testing as Falsification for Cyber-Physical Systems
In Model-Based Design of Cyber-Physical Systems (CPS), it is often desirable
to develop several models of varying fidelity. Models of different fidelity
levels can enable mathematical analysis of the model, control synthesis, and
faster simulation. Furthermore, when (automatically or manually) transitioning
from a model to its implementation on an actual computational platform, two
different versions of the same system are again being developed. In all these
cases, it is necessary to define a rigorous notion of conformance between
different models and between models and their implementations. This paper
argues that conformance should be a measure of distance between systems.
Although a range of theoretical distance notions exists, no way to compute such
distances for industrial-size systems and models has been proposed yet.
This paper addresses exactly this problem. A universal notion of conformance as
closeness between systems is rigorously defined, and evidence is presented that
it implies a number of other application-dependent conformance notions. An
algorithm for detecting that two systems are not conformant is then proposed,
which uses existing proven tools. A method is also proposed to measure the
degree of conformance between two systems. The results are demonstrated on a
range of models.
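A toy sketch of the "conformance as closeness" idea: two sampled output traces are considered close if every point of each trace is matched by a point of the other within a time tolerance tau and a value tolerance eps. This is a simplified illustration of closeness-style conformance notions, not the paper's precise definition.

```python
# Symmetric (tau, eps)-closeness check between two sampled traces,
# each given as a list of (time, value) pairs. Illustrative only.

def close(trace_a, trace_b, tau, eps):
    """True iff every sample of each trace has a counterpart in the other
    within time distance tau and value distance eps."""
    def covered(src, dst):
        return all(
            any(abs(t - u) <= tau and abs(x - y) <= eps for u, y in dst)
            for t, x in src
        )
    return covered(trace_a, trace_b) and covered(trace_b, trace_a)

# Two models of the same system, sampled at slightly different times/values.
high_fidelity = [(0.0, 0.0), (1.0, 0.9), (2.0, 1.8)]
low_fidelity  = [(0.1, 0.1), (1.1, 1.0), (2.1, 1.7)]

print(close(high_fidelity, low_fidelity, tau=0.2, eps=0.2))    # True
print(close(high_fidelity, low_fidelity, tau=0.05, eps=0.05))  # False
```

Shrinking tau and eps until the check fails gives a crude measure of the degree of conformance between the two traces, mirroring the falsification view: a pair (tau, eps) for which the check fails is a witness of non-conformance.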
Formal Approaches to Control System Security: From Static Analysis to Runtime Enforcement
With the advent of Industry 4.0, industrial facilities and critical infrastructures are transforming into an ecosystem of heterogeneous physical and cyber components, such as programmable logic controllers, increasingly interconnected and therefore exposed to cyber-physical attacks, i.e., security breaches in cyberspace that may adversely affect the physical processes underlying industrial control systems. The main contributions of this thesis follow two research strands that address the security concerns of industrial control systems via formal methodologies. As our first contribution, we propose a formal approach based on model checking and statistical model checking, within the MODEST TOOLSET, to analyse the impact of attacks targeting nontrivial control systems equipped with an intrusion detection system (IDS) capable of detecting and mitigating attacks. Our goal is to evaluate the impact of cyber-physical attacks, i.e., attacks targeting sensors and/or actuators of the system with potential consequences on the safety of the inner physical process. Our security analysis estimates both the physical impact of the attacks and the performance of the IDS. As our second contribution, we propose a formal approach based on runtime enforcement to ensure specification compliance in networks of controllers, possibly compromised by colluding malware that may tamper with actuator commands, sensor readings, and inter-controller communications. Our approach relies on an ad-hoc sub-class of Ligatti et al.'s edit automata to enforce controllers represented in Hennessy and Regan's Timed Process Language. We define a synthesis algorithm that, given an alphabet P of observable actions and a timed correctness property e, returns a monitor that enforces the property e during the execution of any (potentially corrupted) controller with alphabet P.
Our monitors correct and suppress incorrect actions coming from corrupted controllers and emit actions in full autonomy when the controller under scrutiny is not able to do so in a correct manner. Besides classical requirements, such as transparency and soundness, the proposed enforcement enjoys deadlock- and diverge-freedom of monitored controllers, together with compositionality when dealing with networks of controllers. Finally, we test the proposed enforcement mechanism on a non-trivial case study, taken from the context of industrial water treatment systems, in which the controllers are injected with different malware pursuing different malicious goals.
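The suppress-and-correct behaviour of an edit-automaton-style monitor can be sketched in a few lines. The safe-action set, the action names, and the safe default below are all invented for the example; real edit automata also insert actions and track timed state, which this sketch omits.

```python
# Illustrative edit-automaton-style monitor: actions from a (possibly
# compromised) controller are suppressed when they fall outside a simple
# safety property, and a safe default action is emitted in their place,
# so the emitted trace always complies with the property.

SAFE = {"open_valve", "close_valve", "idle"}  # hypothetical safe alphabet

def enforce(actions, default="idle"):
    """Suppress unsafe actions and substitute a safe default."""
    emitted = []
    for a in actions:
        emitted.append(a if a in SAFE else default)  # suppress + correct
    return emitted

# A corrupted controller interleaves a malicious command.
trace = ["open_valve", "drain_tank_fast", "close_valve"]
print(enforce(trace))  # ['open_valve', 'idle', 'close_valve']
```

Transparency is visible in the sketch: a fully correct trace passes through unchanged, while only the offending actions of a corrupted trace are edited.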
Laser Scanner Technology
Laser scanning technology plays an important role in the science and engineering arena. The aim of scanning is usually to create a digital version of an object's surface. Multiple scans are sometimes performed with multiple cameras to capture all sides of the scene under study. Optical tests are typically used to demonstrate the power of laser scanning technology in modern industry and in research laboratories. This book describes recent contributions of laser scanning technology in different areas around the world. The main topics covered in this volume include full-body scanning, traffic management, 3D survey processes, bridge monitoring, scan tracking, human sensing, three-dimensional modelling, glacier monitoring, and the digitization of heritage monuments.
Digital workflows for the management of existing structures in the pre- and post-earthquake phases: BIM, CDE, drones, laser-scanning and AI
The BIM methodology, developed in America in the 1970s, revolutionized the construction industry by introducing the principles of innovation and digitalization into project management, in a productive sector too tied to traditional practices. The many digital processes developed since then have largely concerned the design of new buildings and are mainly linked to the discipline of construction management. Early experiments conducted over time have shown that extending this methodology to existing buildings entails many difficulties.
Against this background, this thesis focuses on the management of structures in the pre- and post-earthquake phases, with the goal of developing digital processes based on innovative technologies applied to both ordinary and historic buildings.
The first workflow developed, relating to the pre-earthquake phase, is named scan-to-FEM and aims to specialize the classic scan-to-BIM process for structural engineering, covering all steps from the digital survey of the building with photogrammetry and laser scanning through to structural analysis and the assessment of safety against seismic actions.
The post-earthquake management processes instead focus on estimating the safety of the structure and defining intervention strategies, and are based on the analysis of the intrinsic characteristics of the structure and of the damage induced by seismic events. The entire process of assessing the operational level of a building has therefore been revised in light of modern digital technologies. In detail, Convolutional Neural Networks (CNNs) were developed for crack detection and for extracting the numerical information associated with cracks, which is then managed through BIM models. Crack patterns were digitized through the introduction of a new "crack" BIM object (currently not codified in the IFC standard), to which a set of parameters was added, partly evaluated with the CNNs and partly qualitative.
During the development of these processes, new ad hoc tools were developed for the management of existing buildings. In particular, specifications were defined for digital damage data sheets and for the creation of the new "crack" BIM object.
Thanks to these technological developments, the processes for managing damaged buildings were applied to the digitization of the historic church of San Pietro in Vinculis, damaged by seismic events, demonstrating the greatest benefits in terms of time reduction and resource savings.