    Analysis of the impact of data compression on condition monitoring algorithms for ball screws

    The overall equipment effectiveness (OEE) is a management ratio to evaluate the added value of machine tools. Unplanned machine downtime reduces the operational availability and therefore the OEE; increased machine costs are the consequence. An important cause of unplanned machine downtime is the total failure of ball screws of the feed axes due to wear. Monitoring the condition of ball screws is therefore important. Common concepts rely on high-frequency acceleration sensors from external control systems to detect a change in the condition. For trend and detailed damage analysis, large amounts of data are generated and stored over a long time period (>5 years), resulting in corresponding data storage costs. Additional axes or machine tools increase the data volume further, adding to the total storage costs. To minimize these costs, data compression or source coding has to be applied. To achieve maximum compression ratios, lossy coding algorithms have to be used, which introduce distortion into a signal. In this work, the influence of lossy coding algorithms on a condition monitoring algorithm (CMA) using acceleration signals is investigated. The CMA is based on principal component analysis and uses 17 features, such as the standard deviation, to predict the preload condition of a ball screw. It is shown that bit rate reduction through lossy compression algorithms is possible without affecting the condition monitoring, as long as the compression algorithm is known. In contrast, an unknown compression algorithm reduces the classification accuracy of condition monitoring by about 20% when coding with a quantizer resolution of 4 bit/sample.
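    A minimal sketch of the kind of experiment described above, assuming a uniform scalar quantizer and a stand-in feature set: it codes a test signal at 4 bit/sample and compares a few statistical features before and after lossy coding. The signal, the quantizer design, and the feature names are illustrative assumptions, not the paper's exact 17-feature pipeline.

```python
# Illustrative sketch (not the paper's exact pipeline): quantize an
# acceleration-like signal with a uniform 4 bit/sample quantizer and
# compare statistical features computed before and after lossy coding.
import numpy as np

def uniform_quantize(x, bits=4):
    """Uniform scalar quantizer with 2**bits levels over the signal range."""
    levels = 2 ** bits
    lo, hi = x.min(), x.max()
    step = (hi - lo) / (levels - 1)
    return np.round((x - lo) / step) * step + lo

def features(x):
    """A few statistical features of the kind used for condition monitoring
    (the paper uses 17; these four are illustrative assumptions)."""
    return {
        "std": np.std(x),
        "rms": np.sqrt(np.mean(x ** 2)),
        "kurtosis": np.mean((x - x.mean()) ** 4) / np.var(x) ** 2,
        "peak": np.max(np.abs(x)),
    }

rng = np.random.default_rng(0)
signal = rng.normal(size=10_000)          # stand-in for an acceleration signal
coded = uniform_quantize(signal, bits=4)  # 4 bit/sample as in the study

f_orig, f_coded = features(signal), features(coded)
for name in f_orig:
    print(f"{name}: original={f_orig[name]:.4f} coded={f_coded[name]:.4f}")
```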

    Dynamic Quantization using Spike Generation Mechanisms

    This paper introduces a neuro-inspired coding/decoding mechanism for a constant real value by using a Spike Generation Mechanism (SGM) and a combination of two Spike Interpretation Mechanisms (SIM). One of the most efficient and widely used SGMs to encode a real value is the Leaky Integrate-and-Fire (LIF) model, which produces a spike train. The duration of the spike train is bounded by a given time constraint. Seeking a simple solution for how to interpret the spike train and reconstruct the input value, we combine two different kinds of SIMs, the time-SIM and the rate-SIM. The time-SIM allows a high-quality interpretation of the neural code, and the rate-SIM allows a simple decoding mechanism by counting the spikes. The resulting coding/decoding process, called the Dual-SIM Quantizer (Dual-SIMQ), is a non-uniform quantizer. It is shown that it coincides with a uniform scalar quantizer under certain assumptions. Finally, it is also shown that the time constraint can be used to automatically control the reconstruction accuracy of this time-dependent quantizer.
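    A simplified sketch of the mechanism described above: a leaky integrate-and-fire neuron encodes a constant value as a spike train over a bounded time window, and the value is reconstructed either by counting spikes (rate-SIM) or from a measured inter-spike interval (time-SIM). The parameter values and the decoding formulas below are illustrative assumptions, not the paper's exact Dual-SIMQ construction.

```python
# Simplified LIF spike coding of a constant input with two decoders:
# a rate-based one (count the spikes) and a time-based one (invert the
# analytic first-passage time from one measured inter-spike interval).
import numpy as np

TAU, THETA, T_MAX, DT = 0.02, 1.0, 0.5, 1e-5  # membrane constant, threshold,
                                              # time constraint, Euler step

def lif_encode(value):
    """Encode a constant input as spike times of an LIF neuron simulated
    over the time constraint T_MAX (spike and reset at threshold)."""
    v, spikes = 0.0, []
    for step in range(int(T_MAX / DT)):
        v += DT * (-v / TAU + value)
        if v >= THETA:                # threshold crossing -> spike, reset
            spikes.append(step * DT)
            v = 0.0
    return spikes

def invert_isi(isi):
    """Invert V(t) = I*tau*(1 - exp(-t/tau)) = theta for the input I."""
    return THETA / (TAU * (1 - np.exp(-isi / TAU)))

def decode_rate(spikes):
    """Rate-SIM: coarse interval estimate from the spike count alone."""
    return invert_isi(T_MAX / len(spikes)) if len(spikes) >= 2 else 0.0

def decode_time(spikes):
    """Time-SIM: use one exactly measured inter-spike interval."""
    return invert_isi(spikes[1] - spikes[0]) if len(spikes) >= 2 else 0.0

x = 75.0                              # constant real value to encode
spikes = lif_encode(x)
print(len(spikes), decode_rate(spikes), decode_time(spikes))
```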

    Computer Simulation of a New Redundancy Reduction Technique

    The compression of information into smaller and smaller bandwidths has been a constant struggle against an expanding demand. Since the modulation of a carrier necessarily produces sidebands, signals cannot be sent in zero bandwidth. For some time, attempts were made to beat nature, until the natural limits of noise and bandwidth were established. Men like Hartley, Nyquist, and Shannon set a sound scientific base and, as a result, rules and limits have been developed. But this struggle continues with ever more sophisticated demands. As requirements for larger amounts of scientific data arise, methods for utilizing the available means of data transmission efficiently are needed. For many years, communication engineers have discussed the advantages of a simple and reliable means of transmitting, recording, and processing all of the data. Although these engineers contemplated such means, they concentrated on designing for wider bandwidths and more power--the brute-force way to transfer an ever-increasing mass of data in a given period of time. Eventually, realizing that this effort was futile, communication engineers used the knowledge gained from their past efforts and applied it to design a way of transmitting only the significant data or information. The methods devised are called data compression methods. Data compression is a technique to reduce the bandwidth needed to transmit a given amount of information in a given time, or to reduce the time needed to transmit a given bandwidth signal. Such compression must eliminate redundancies so that only those values which are essential to the faithful reproduction of the input signal (relative to some error criterion) are transmitted. The performance enhancement of a basic data acquisition system by incorporation of data compression can be manifested in a variety of ways, depending, in part, on the manner in which the data compressor is utilized in the system and the performance desired. As indicated in Figure 1, the engineer has the option of incorporating data compression into either the transmitter or the receiver portion of the system. Four basic categories of data handling come under this definition: parameter extraction, adaptive sampling, redundancy reduction, and encoding.
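    Of these four categories, redundancy reduction is the easiest to make concrete. The sketch below shows one classical variant, a zero-order predictor: a sample is transmitted only when it deviates from the last transmitted value by more than an error tolerance, and the receiver holds the last received value in between. The function names and tolerance value are illustrative.

```python
# Zero-order-predictor redundancy reduction: transmit a sample only when
# it differs from the last transmitted value by more than a tolerance.
import numpy as np

def zero_order_compress(samples, tol):
    """Keep (index, value) pairs only for non-redundant samples."""
    kept = [(0, samples[0])]                 # always send the first sample
    for i, s in enumerate(samples[1:], start=1):
        if abs(s - kept[-1][1]) > tol:       # significant change -> transmit
            kept.append((i, s))
    return kept

def zero_order_reconstruct(kept, n):
    """Hold the last received value until the next transmitted sample."""
    out = np.empty(n)
    next_idx = [i for i, _ in kept[1:]] + [n]
    for (i, v), nxt in zip(kept, next_idx):
        out[i:nxt] = v
    return out

t = np.linspace(0, 1, 1000)
data = np.sin(2 * np.pi * 2 * t)             # slowly varying test signal
kept = zero_order_compress(data, tol=0.05)
recon = zero_order_reconstruct(kept, len(data))
print(f"sent {len(kept)} of {len(data)} samples, "
      f"max error {np.max(np.abs(recon - data)):.3f}")
```

    By construction the reconstruction error never exceeds the tolerance, which is exactly the "error criterion" the passage refers to; the price is irregular transmission times, so each kept value must carry its sample index.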

    Radio Frequency Interference Impact Assessment on Global Navigation Satellite Systems

    The Institute for the Protection and Security of the Citizen of the EC Joint Research Centre (IPSC-JRC) has been mandated to perform a study on the Radio Frequency (RF) threat against telecommunications and ICT control systems. This study is divided into two parts. The first part concerns the assessment of high energy radio frequency (HERF) threats, where the focus is on the generation of electromagnetic pulses (EMP), the development of corresponding devices and the possible impact on ICT and power distribution systems. The second part of the study concerns radio frequency interference (RFI) with regard to global navigation satellite systems (GNSS). This document contributes to the second part and contains a detailed literature study disclosing the weaknesses of GNSS systems. Whereas the HERF analysis only concerns intentional interference issues, this study on GNSS also takes into account unintentional interference, enlarging the spectrum of plausible interference scenarios.

    dtControl: Decision Tree Learning Algorithms for Controller Representation

    Decision tree learning is a popular classification technique most commonly used in machine learning applications. Recent work has shown that decision trees can be used to represent provably correct controllers concisely. Compared to representations using lookup tables or binary decision diagrams, decision trees are smaller and more explainable. We present dtControl, an easily extensible tool for representing memoryless controllers as decision trees. We give a comprehensive evaluation of various decision tree learning algorithms applied to 10 case studies arising out of correct-by-construction controller synthesis. These algorithms include two new techniques: one for using arbitrary linear binary classifiers in the decision tree learning, and one novel approach for determinizing controllers during the decision tree construction. In particular, the latter turns out to be extremely efficient, yielding decision trees with a single-digit number of decision nodes on 5 of the case studies.