
    Entropy-based algorithms for signal processing

    Entropy, a key concept of information theory, is one of the most important research topics in computer science. Entropy coding informs us of the formal limits of today’s storage and communication infrastructure. Over the last few years, entropy has become an important trade-off measure in signal processing. Entropy measures have been used especially in image and video processing, by exploiting sparsity, and can help solve several of the issues we currently face. As the volume of data produced daily increases rapidly, more effective approaches to encoding and compressing big data are required. In this sense, applications of entropy coding can improve image and video coding, imaging, quality assessment of agricultural products, and product inspection through more effective coding approaches. In light of these and many other challenges, the Special Issue “Entropy-Based Algorithms for Signal Processing” has been dedicated to addressing the current status, challenges, and future research priorities for entropy in signal processing.

    A novel method of frequency band selection for squared envelope analysis for fault diagnosing of rolling element bearings in a locomotive powertrain

    The development of diagnostics for rolling element bearings (REBs) in recent years has made it possible to detect bearing faults in real time. Squared envelope analysis (SEA), in which the statistical characteristics of the squared envelope (SE) or the squared envelope spectrum (SES) are analysed, is widely recognized as both an effective and the simplest such method. The most critical step of SEA is finding an optimal frequency band that contains the maximum defect information. The most commonly used approaches for selecting this band are derived from measuring the kurtosis of the SE or the SES. However, most methods fail to cope with the interference of a single impulse or a few impulses in the corresponding domain. A new method, called “PMFSgram”, is proposed in this paper: it calculates the kurtosis of the SES only in ranges centred at the first two harmonics, each with a span of three times the modulation frequency, rather than over the entire SES of the band-filtered signal. This makes it possible to avoid most of the interference from a small number of undesired impulses in the SES. PMFSgram uses several bandwidths, from 1.5 times to 4.5 times the fault frequency, with the same number of centre frequencies for each bandwidth. This frequency setting makes it possible to select an optimal frequency band containing most of the useful information with less noise. The performance of the new method is verified on synthesized signals and measured vibration data.
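The restricted-kurtosis criterion described in the abstract can be sketched roughly as follows. This is an illustrative reconstruction, not the paper’s code: the function and parameter names are ours, and a generic Butterworth band-pass filter stands in for whatever filter bank the authors use.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from scipy.stats import kurtosis

def restricted_ses_kurtosis(x, fs, fault_freq, f_center, bandwidth):
    """Sketch of the PMFSgram band criterion: kurtosis of the squared
    envelope spectrum (SES) evaluated only in windows centred at the first
    two fault-frequency harmonics, each spanning 3x the fault frequency,
    instead of over the entire SES of the band-filtered signal."""
    # Band-pass filter the raw signal around the candidate centre frequency.
    lo = max(f_center - bandwidth / 2, 1.0)
    hi = min(f_center + bandwidth / 2, fs / 2 - 1.0)
    b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
    xf = filtfilt(b, a, x)

    # Squared envelope and its spectrum (DC removed).
    se = np.abs(hilbert(xf)) ** 2
    ses = np.abs(np.fft.rfft(se - se.mean()))
    freqs = np.fft.rfftfreq(len(se), d=1.0 / fs)

    # Keep only windows of width 3 * fault_freq centred at harmonics 1 and 2.
    mask = np.zeros(freqs.shape, dtype=bool)
    for k in (1, 2):
        mask |= np.abs(freqs - k * fault_freq) <= 1.5 * fault_freq

    # Kurtosis of the restricted SES (Pearson definition, no -3 offset).
    return kurtosis(ses[mask], fisher=False)
```

In a full PMFSgram, this criterion would be evaluated over a grid of centre frequencies at each of several bandwidths (1.5 to 4.5 times the fault frequency, per the abstract), and the band with the highest value selected for envelope analysis.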

    A Digital Triplet for Utilizing Offline Environments to Train Condition Monitoring Systems for Rolling Element Bearings

    Manufacturing competitiveness is related to making a quality product while incurring the lowest costs. Unexpected downtime caused by equipment failure negatively impacts manufacturing competitiveness due to the ensuing defects and delays. Manufacturers have adopted condition monitoring (CM) techniques to reduce unexpected downtime and to augment maintenance strategies. CM adoption has transitioned maintenance from Breakdown Maintenance (BM) to Condition-Based Maintenance (CbM), which anticipates impending failures and schedules maintenance actions before equipment fails. CbM is the umbrella term for maintenance strategies that use condition monitoring, such as Preventive Maintenance (PM) and Predictive Maintenance (PdM). Preventive Maintenance involves periodic checks based on either time or sensory input. Predictive Maintenance utilizes continuous or periodic sensory inputs to determine the machine health state and predict equipment failure. The overall goal of this work is to improve bearing diagnostic and prognostic predictions of equipment health by utilizing surrogate systems to generate failure data that represents production equipment failure, thereby providing training data for condition monitoring solutions without waiting for real-world failure data. This research addresses the challenge of obtaining failure data for CM systems by incorporating a third system into monitoring strategies, creating a Digital Triplet (DTr) for condition monitoring that increases the amount of data available to it. Bearings are a critical component in rotational manufacturing systems, with wide application in industries outside of manufacturing, such as energy and defense. The DTr system considers three components: the physical, surrogate, and digital systems. The physical system represents the real-world application in production that cannot be allowed to fail.
The surrogate system represents a physical component in a test system in an offline environment, where data are generated to fill gaps left by data unavailable from the real-world system. The digital system is the CM system, which provides maintenance recommendations based on the data ingested from the real-world and surrogate systems. In pursuing the research goal, a comprehensive bearing dataset detailing four failure modes under different collection operating parameters was created. Collections occurred under different operating conditions: speed-varying, load-varying, and steady-state. Different frequency and time measures were used to analyze and identify differentiating criteria between the failure classes across the operating conditions. These empirical observations were recreated using simulations to filter out potential outliers. The outputs of the physical model were combined with knowledge from the empirical observations to create “spectral deltas”, which augment existing bearing data and create new failure data with frequency criteria similar to the original data. Primary verification occurred on a laboratory bearing test stand, and a conjecture is offered on how to scale to a larger system by analyzing one from a local manufacturer. Analysis of machine learning diagnosis and prognosis models shows that the original and augmented bearing data can complement each other during model training. Subsequent data substitution verifies that bearing data collected under different operating conditions and at different sizes can be substituted between systems. Ostensibly, the full formulation of the digital triplet system is that bearing data generated at a smaller size can be scaled to train predictive failure models for larger bearing sizes.
Future work should consider implementing this method for systems other than bearings, such as gears; for non-rotational equipment, such as pumps; or for larger complex systems, such as computer numerically controlled machine tools or car engines. The method should not be restricted to mechanical systems and could be applied to electrical systems, such as batteries. Furthermore, an investigation should consider data-driven approximations of specific bearing characteristics related to the stiffness and damping parameters needed in modeling. A final consideration is further investigation into the scalability quantities within the data and how to track these changes through different system levels.

    Advances in Condition Monitoring, Optimization and Control for Complex Industrial Processes

    The book collects 25 papers from the Special Issue “Advances in Condition Monitoring, Optimization and Control for Complex Industrial Processes”, highlighting recent research trends in complex industrial processes. It aims to stimulate the research field and to benefit readers from both academic institutes and industrial sectors.

    Of evolution, information, vitalism and entropy: reflections of the history of science and epistemology in the works of Balzac, Zola, Queneau, and Houellebecq

    This dissertation proposes the application of rarely used epistemological and scientific lenses to the works of four authors spanning two centuries: Honoré de Balzac, Émile Zola, Raymond Queneau, and Michel Houellebecq. Each of these novelists engaged closely with questions of science and epistemology, yet each approached that engagement from a different scientific perspective and epistemological moment. In Balzac’s La Peau de chagrin, the limits of determinism and the experimental method tend to demonstrate that there remains an inscrutable yet guided excess in the interactions between the protagonist Raphaël and his enchanted skin. This speaks to an embodiment of the esprit préscientifique, a framework that minimizes the utility of scientific practice in favor of the unresolved mystery of vitalism. With Zola comes a move away from undefinable mystery to a construction of the novel consistent with Claude Bernard’s deterministic experimental medicine. Yet Zola’s Roman expérimental project is only partially executed, in that the Newtonian framework underlying Bernard’s method yields to contrary evidence in Zola’s text of entropy, error, and loss of information consistent with the field of thermodynamics. In Queneau’s texts, Zola’s interest in current science not only remains but is updated to reflect the massive upheaval in scientific thought that took place in the last half of the nineteenth and early part of the twentieth centuries. If Queneau’s texts explicitly mention advances like relativity, however, they often do so in a humorously dismissive manner that values pre-entropic and even early geometric constructs like perpetual motion machines and squared circles. Queneau’s apparent return to the pre-scientific ultimately yields to Houellebecq’s textual abyss.
For Houellebecq, science is not only to be embraced in its entropic and relativistic constructs; it is these very constructs, and the style typically used to present them, that serve as a reminder of the abjection, decay, and hopelessness of human existence. Gone is the mystery of life in its totality. In its place remain humans acting as a series of particles mechanically obeying deterministic laws. The parenthesis that opened with Balzac’s positive coding of pre-scientific thought closes with Houellebecq’s negative coding of modern scientific theory.

    Psychopolitical Anaphylaxis

    The great acceleration that has become known as the Anthropocene has brought with it destructive consequences that threaten to give rise to a dangerous and potentially explosive convergent reaching of limits, not just climatically or biospherically, but psychosocially. This convergence demands a new kind of thinking and a reconsideration of fundamental philosophical, political, and economic theory, especially in light of the age of computational capitalism, in order to prevent this convergence from becoming absolutely catastrophic. The French philosopher Bernard Stiegler argued that the basis for such a reconsideration must be, in a very general way, the thought of entropy. Psychopolitical Anaphylaxis examines, draws on, and dialogues with Stiegler’s work, and aims to take steps towards this new kind of thinking. Borrowing also from Georges Canguilhem and Peter Sloterdijk, among others, it argues as well for an immunological perspective that sees psychopolitical convulsions as a kind of anaphylactic shock that threatens to prove fatal. The paradox that must ultimately be confronted in the Anthropocene, conceived as an Entropocene, is the contradiction between the urgent need for a global emergency procedure and the equally necessary task of finding the time to carefully rethink our way beyond this anaphylaxis. The task of thinking today must be to inhabit this paradox and make it the basis of a new dynamic.

    Negentropy Spectrum Decomposition and Its Application in Compound Fault Diagnosis of Rolling Bearing

    Rolling bearings often suffer from compound faults in practice. Compared with a single fault, a compound fault contains multiple fault features that are coupled together, making it difficult to detect and extract all of them with traditional methods such as Hilbert envelope demodulation, wavelet transform, and empirical mode decomposition (EMD). In order to realize compound fault diagnosis of rolling bearings and improve diagnostic accuracy, we developed negentropy spectrum decomposition (NSD), which is based on the fast empirical wavelet transform (FEWT) and spectral negentropy, with cyclic extraction as the extraction scheme. An infogram is constructed by combining FEWT with spectral negentropy to select the best band center and bandwidth for band-pass filtering. The filtered signal is then treated as a new measured signal, and FEWT combined with spectral negentropy is applied to filter it again. This operation is repeated, achieving cyclic extraction, until the signal no longer contains obvious fault features. After obtaining the envelopes of all extracted components, compound fault diagnosis of rolling bearings can be realized. Analysis of simulated and experimental signals shows that the method can diagnose compound faults of rolling bearings, verifying its feasibility and effectiveness. The proposed method can detect and extract all the fault features of a compound fault completely, making it more reliable for compound fault diagnosis; it therefore has practical significance in rolling bearing compound fault diagnosis.
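As a rough illustration of the band-selection criterion behind such negentropy infograms, the spectral negentropy of a candidate band’s squared envelope can be computed as below. This is a generic sketch under our own naming, using a plain Butterworth band-pass filter rather than the paper’s FEWT decomposition:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def envelope_negentropy(x, fs, f_center, bandwidth):
    """Illustrative sketch: negentropy of the squared-envelope energy
    distribution of a band-pass-filtered signal. A flat (stationary)
    envelope gives a value near 0; an impulsive, fault-like envelope
    concentrates energy in time and gives a larger value."""
    # Band-pass filter around the candidate band center.
    lo = max(f_center - bandwidth / 2, 1.0)
    hi = min(f_center + bandwidth / 2, fs / 2 - 1.0)
    b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
    xf = filtfilt(b, a, x)

    se = np.abs(hilbert(xf)) ** 2   # squared envelope (instantaneous energy)
    p = se / se.mean()              # energy normalised to unit mean
    # Negentropy of the energy distribution; p*log(p) -> 0 as p -> 0.
    terms = np.zeros_like(p)
    pos = p > 0
    terms[pos] = p[pos] * np.log(p[pos])
    return float(terms.mean())
```

In a cyclic extraction loop of the kind the abstract describes, one would scan candidate band centers and bandwidths, filter in the band maximising this criterion, demodulate and record that component’s envelope, then repeat the scan on the filtered signal until no band shows a clearly elevated value.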