
    Implementation of a piezo-diagnostics approach for damage detection based on PCA in a linux-based embedded platform

    The implementation of damage-detection methods for continuously assessing structural integrity calls for systems that balance storage capacity, memory, computational complexity, and processing time. In this sense, embedded hardware platforms are a promising technology for developing integrated solutions in Structural Health Monitoring. This paper presents the design, testing, and specifications of a standalone inspection prototype that combines the piezo-diagnostics principle, statistical processing via Principal Component Analysis (PCA), and embedded systems. The equipment is a piezoelectric active system able to detect defects in structures, using a PCA-based algorithm embedded in the Odroid-U3 ARM Linux platform. It operates by applying guided waves at one side of the structure by means of piezoelectric devices in actuation mode and recording the wave response at another side of the structure with the same kind of piezoelectric devices in sensor mode. Based on the nominal (damage-free) guided-wave response, represented by a PCA statistical model, the system detects damage between the actuated and sensed points through the squared prediction error (the Q statistic). The system performance was evaluated on a pipe test bench where two kinds of damage were studied: first, a mass was added to the pipe surface, and then leaks were provoked in the pipe structure with a drill tool. The experiments were conducted on two lab structures: (i) a meter-long carbon-steel pipe section and (ii) a pipe loop structure. The wave response was recorded between the instrumented points under two conditions: (i) the pipe in nominal condition, with several repetitions used to build the nominal statistical model, and (ii) the pipe with damage (added mass or leak). Damage conditions were graphically recognized through the Q-statistic chart. The feasibility of implementing an automated real-time diagnostic system with minimal processing resources and hardware flexibility is thus demonstrated.
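    A rough illustration of the detection step described above, in Python: fit a PCA model to baseline (undamaged) guided-wave responses and score a new response by its squared prediction error (Q statistic). The function names, scaling choices, and threshold rule are assumptions for the sketch, not the paper's implementation.

    ```python
    # Hedged sketch: PCA baseline model plus Q-statistic damage indicator.
    import numpy as np

    def build_pca_model(baseline, n_components):
        """Fit a PCA model on nominal (undamaged) responses (rows of `baseline`)."""
        mean = baseline.mean(axis=0)
        std = baseline.std(axis=0) + 1e-12        # avoid division by zero
        scaled = (baseline - mean) / std
        _, _, vt = np.linalg.svd(scaled, full_matrices=False)
        p = vt[:n_components].T                   # loading matrix
        return mean, std, p

    def q_statistic(signal, mean, std, p):
        """Squared prediction error of a new response w.r.t. the PCA model."""
        x = (signal - mean) / std
        residual = x - p @ (p.T @ x)              # part not captured by the model
        return float(residual @ residual)
    ```

    In use, one would flag damage when Q exceeds a threshold estimated from the nominal repetitions, for instance a high percentile of the baseline Q values.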

    Formal and Informal Methods for Multi-Core Design Space Exploration

    We propose a tool-supported methodology for design-space exploration for embedded systems. It provides means to define high-level models of applications and multi-processor architectures and to evaluate the performance of different deployment (mapping, scheduling) strategies while taking uncertainty into account. We argue that this extension of the scope of formal verification is important for the viability of the domain.
    Comment: In Proceedings QAPL 2014, arXiv:1406.156
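    To make the evaluation loop concrete, the following Python sketch enumerates task-to-processor mappings and scores each one by expected makespan under uncertain execution times. The cost model, distributions, and names are invented for illustration; they are not the tool or formalism of the paper.

    ```python
    # Hedged sketch: exhaustive mapping exploration with Monte Carlo sampling
    # over uncertain task execution times (uniform distribution is an assumption).
    import itertools, random

    def makespan(mapping, exec_time, n_procs):
        """Load-based makespan: sum of task times assigned to each processor."""
        load = [0.0] * n_procs
        for task, proc in enumerate(mapping):
            load[proc] += exec_time[task]
        return max(load)

    def explore(n_tasks, n_procs, samples=100):
        best, best_score = None, float("inf")
        for mapping in itertools.product(range(n_procs), repeat=n_tasks):
            score = sum(
                makespan(mapping, [random.uniform(1, 2) for _ in range(n_tasks)], n_procs)
                for _ in range(samples)
            ) / samples
            if score < best_score:
                best, best_score = mapping, score
        return best, best_score

    print(explore(n_tasks=4, n_procs=2))
    ```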

    Applications of Statistical Physics to the Social and Economic Sciences

    This thesis applies statistical physics concepts and methods to quantitatively analyze socioeconomic systems. For each system we combine theoretical models and empirical data analysis in order to better understand the real-world system in relation to the complex interactions between the underlying human agents. The thesis is separated into three parts: (i) response dynamics in financial markets, (ii) dynamics of career trajectories, and (iii) a stochastic opinion model with quenched disorder. In Part I we quantify the response of U.S. markets to financial shocks, which perturb markets and trigger “herding behavior” among traders. We use concepts from earthquake physics to quantify the decay of volatility shocks after the “main shock.” We also find, surprisingly, that we can make quantitative statements even before the main shock. In order to analyze market behavior before as well as after “anticipated news,” we use Federal Reserve interest-rate announcements, which are regular events that are also scheduled in advance. In Part II we analyze the statistical physics of career longevity. We construct a stochastic model for career progress with two main ingredients: (a) random forward progress in the career and (b) random termination of the career. We incorporate the rich-get-richer (Matthew) effect into ingredient (a), meaning that it is easier to move forward in the career the farther along one already is. We verify the model predictions by analyzing data on 400,000 scientific careers and 20,000 professional sports careers. Our model highlights the importance of early career development, showing that many careers are stunted by the relative disadvantage associated with inexperience. In Part III we analyze a stochastic two-state spin model which represents a system of voters embedded on a network. We investigate the role in consensus formation of “zealots,” agents with time-independent opinion. Our main result is the unexpected finding that it is the number, and not the density, of zealots that determines the steady-state opinion polarization. We compare our findings with results for United States Presidential elections.
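    A loose simulation of the two-ingredient career model of Part II may help fix ideas: advancement becomes more likely the farther along a career is, and the career terminates when advancement fails. The hazard form and parameter values below are invented for illustration and are not the thesis's exact model.

    ```python
    # Hedged sketch: careers advance with a position-dependent probability
    # (Matthew effect) and terminate otherwise; longevity is the step count.
    import random

    def simulate_career(alpha=0.4, max_steps=100_000):
        x = 1
        for step in range(max_steps):
            progress_prob = 1 - (1 + x) ** (-alpha)   # rises with position x
            if random.random() < progress_prob:
                x += 1                                 # (a) random forward progress
            else:
                return step                            # (b) random termination
        return max_steps

    longevities = [simulate_career() for _ in range(10_000)]
    print("mean longevity:", sum(longevities) / len(longevities))
    ```

    Early positions carry the highest termination hazard in this sketch, which reproduces the qualitative point that many careers are stunted by inexperience.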

    Input variable selection in time-critical knowledge integration applications: A review, analysis, and recommendation paper

    The purpose of this research is twofold: first, to undertake a thorough appraisal of existing Input Variable Selection (IVS) methods within the context of time-critical and computation-resource-limited dimensionality reduction problems; second, to demonstrate improvements to, and the application of, a recently proposed time-critical sensitivity analysis method called EventTracker in an environmental-science industrial use case, namely sub-surface drilling. Producing time-critical, accurate knowledge about the state of a system (effect) under computational and data-acquisition (cause) constraints is a major challenge, especially if the knowledge required is critical to the system operation, where the safety of operators or the integrity of costly equipment is at stake. Understanding and interpreting a chain of interrelated events, predicted or unpredicted, that may or may not result in a specific state of the system is the core challenge of this research. The main objective is then to identify which set of input data signals has a significant impact on the set of system state information (i.e., output). Through a cause-effect analysis technique, the proposed method supports the filtering of unsolicited data that can otherwise clog up the communication and computational capabilities of a standard supervisory control and data acquisition system. The paper analyzes the performance of input variable selection techniques from a series of perspectives. It then expands the categorization and assessment of sensitivity analysis methods in a structured framework that takes into account the relationship between inputs and outputs, the nature of their time series, and the computational effort required. The outcome of this analysis is that established methods have limited suitability for time-critical variable selection applications. By way of a geological drilling monitoring scenario, the suitability of the proposed EventTracker sensitivity analysis method for high-volume and time-critical input variable selection problems is demonstrated.
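    The event-based filtering idea can be sketched schematically: mark "events" as significant changes in each signal, then score an input by how often its events are followed by an output event within a short window. This is only one plausible reading of the EventTracker approach; the threshold, window, and names are assumptions.

    ```python
    # Hedged sketch: event-coincidence score for ranking input variables.
    import numpy as np

    def events(series, threshold):
        """Boolean array: True where the series changes by more than `threshold`."""
        return np.abs(np.diff(series)) > threshold

    def event_sensitivity(inp, out, threshold=1.0, window=3):
        e_in, e_out = events(inp, threshold), events(out, threshold)
        hits = sum(e_out[t:t + window].any() for t in np.flatnonzero(e_in))
        return hits / max(e_in.sum(), 1)   # fraction of input events echoed in the output
    ```

    Inputs with scores near zero would be candidates for filtering out before they clog the communication channel.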

    Embedded model discrepancy: A case study of Zika modeling

    Mathematical models of epidemiological systems enable investigation of, and predictions about, potential disease outbreaks. However, commonly used models are often highly simplified representations of incredibly complex systems. Because of these simplifications, the model output, say new cases of a disease over time or when an epidemic will occur, may be inconsistent with available data. In this case, we must improve the model, especially if we plan to make decisions based on it that could affect human health and safety, but direct improvements are often beyond our reach. In this work, we explore this problem through a case study of the Zika outbreak in Brazil in 2016. We propose an embedded discrepancy operator: a modification to the model equations that requires modest information about the system and is calibrated by all relevant data. We show that the enriched model demonstrates greatly increased consistency with real data. Moreover, the method is general enough to apply readily to many other mathematical models in epidemiology.
    Comment: 9 pages, 7 figures
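    To illustrate what an embedded discrepancy operator can look like, the sketch below adds calibratable linear terms inside a plain SIR model; the coefficients d1 and d2 would be calibrated against case data. The SIR form and all values are illustrative stand-ins, not the paper's Zika model or operator.

    ```python
    # Hedged sketch: SIR dynamics with an embedded (in-equation) discrepancy term.
    from scipy.integrate import solve_ivp

    def sir_with_discrepancy(t, y, beta, gamma, d1, d2):
        s, i, r = y
        ds = -beta * s * i + d1 * s             # discrepancy term d1*s inside the equation
        di = beta * s * i - gamma * i + d2 * i  # discrepancy term d2*i inside the equation
        dr = gamma * i
        return [ds, di, dr]

    sol = solve_ivp(sir_with_discrepancy, (0, 120), [0.99, 0.01, 0.0],
                    args=(0.3, 0.1, -0.001, 0.002))
    print(sol.y[:, -1])                         # final (S, I, R) fractions
    ```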

    Quantitative Verification: Formal Guarantees for Timeliness, Reliability and Performance

    Computerised systems appear in almost all aspects of our daily lives, often in safety-critical scenarios such as embedded control systems in cars and aircraft or medical devices such as pacemakers and sensors. We are thus increasingly reliant on these systems working correctly, despite their often operating in unpredictable or unreliable environments. Designers of such devices need ways to guarantee that they will operate in a reliable and efficient manner. Quantitative verification is a technique for analysing quantitative aspects of a system's design, such as timeliness, reliability or performance. It applies formal methods, based on a rigorous analysis of a mathematical model of the system, to automatically prove certain precisely specified properties, e.g. "the airbag will always deploy within 20 milliseconds after a crash" or "the probability of both sensors failing simultaneously is less than 0.001". The ability to formally guarantee quantitative properties of this kind is beneficial across a wide range of application domains. For example, in safety-critical systems, it may be essential to establish credible bounds on the probability with which certain failures or combinations of failures can occur. In embedded control systems, it is often important to comply with strict constraints on timing or resources. More generally, being able to derive guarantees on precisely specified levels of performance or efficiency is a valuable tool in the design of, for example, wireless networking protocols, robotic systems or power management algorithms, to name but a few. This report gives a short introduction to quantitative verification, focusing in particular on a widely used technique called model checking, and on its generalisation to the analysis of quantitative aspects of a system such as timing, probabilistic behaviour or resource usage. The intended audience is industrial designers and developers of systems such as those highlighted above who could benefit from the application of quantitative verification, but who lack expertise in formal verification or modelling.
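    For a discrete-time Markov chain, guarantees like the sensor-failure bound quoted above reduce to computing a reachability probability. The sketch below does so by fixed-point iteration on a small invented chain; production model checkers handle far richer models and property languages.

    ```python
    # Hedged sketch: probability of eventually reaching the "failed" state
    # in a 4-state discrete-time Markov chain (all numbers invented).
    import numpy as np

    # States: 0 = ok, 1 = degraded, 2 = failed (absorbing), 3 = safe shutdown (absorbing)
    P = np.array([[0.95, 0.04, 0.00, 0.01],
                  [0.20, 0.74, 0.05, 0.01],
                  [0.00, 0.00, 1.00, 0.00],
                  [0.00, 0.00, 0.00, 1.00]])

    x = np.zeros(4)
    x[2] = 1.0                     # indicator of the target (absorbing) state
    for _ in range(10_000):        # iterate x <- P x until convergence
        x = P @ x
    print("P(eventually failed | start ok):", x[0])
    ```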

    Ethernet - a survey on its fields of application

    During the last decades, Ethernet has progressively become the most widely used local area networking (LAN) technology. Apart from LAN installations, Ethernet has also become attractive for many other fields of application, ranging from industry to avionics, telecommunication, and multimedia. The expanded application of this technology is mainly due to its significant assets, such as reduced cost, backward compatibility, flexibility, and expandability. However, this trend raises some problems concerning the services of the protocol and the requirements of each application, so specific adaptations prove essential to integrate this communication technology into each field of application. Our primary objective is to show how Ethernet has been enhanced to comply with the specific requirements of several application fields, particularly in transport, embedded and multimedia contexts. The paper first describes the common Ethernet LAN technology and highlights its main features. It then reviews the most important application-specific Ethernet versions with respect to each field's requirements. Finally, we compare these different fields of application, focusing in particular on the fundamental concepts and the quality-of-service capabilities of each proposal.

    An embedded system for evoked biopotential acquisition and processing

    This work presents an autonomous embedded system for evoked biopotential acquisition and processing. The system is versatile and can be used in different evoked-potential scenarios, such as medical equipment or brain-computer interfaces, fulfilling the strict real-time constraints they impose. The embedded system is based on an ARM9 processor with the capability to port a real-time operating system. First, a benchmark of the Windows CE operating system running on the embedded platform is presented in order to establish its real-time capability as a whole. Finally, a brain-computer interface based on visual evoked potentials is implemented. Results of this application, which recovers visual evoked potentials using two techniques (the fast Fourier transform and stimulus-locked inter-trace correlation), are also presented.
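    As a concrete reading of the FFT-based technique mentioned above, this sketch averages stimulus-locked epochs and picks the candidate flicker frequency with the strongest spectral peak. The sampling rate, epoch layout, and candidate frequencies are assumptions, not the paper's settings.

    ```python
    # Hedged sketch: frequency detection for a visual-evoked-potential BCI.
    import numpy as np

    def detect_ssvep(epochs, fs, candidate_freqs):
        """epochs: (n_trials, n_samples) array of stimulus-locked EEG segments."""
        avg = epochs.mean(axis=0)                 # averaging boosts the evoked response
        spectrum = np.abs(np.fft.rfft(avg))
        freqs = np.fft.rfftfreq(avg.size, d=1.0 / fs)
        powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in candidate_freqs]
        return candidate_freqs[int(np.argmax(powers))]

    # e.g. detect_ssvep(epochs, fs=256, candidate_freqs=[8.0, 13.0])
    ```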

    Requirements to Testing of Power System Services Provided by DER Units

    The present report forms Project Deliverable ‘D 2.2’ of the DERlab NoE project, supported by the EC under Contract No. SES6-CT-518299 NoE DERlab. The document discusses the power system services that may be provided by DER units and the related methods for testing the services actually provided, both at component level and at system level.