The design of an effective sensor fusion model for condition monitoring systems of turning processes
High energy prices and increasing requirements for product quality at low cost have created an urgent need to implement new technologies in current automated manufacturing environments. Condition monitoring systems for manufacturing processes have been recognised in recent years as one of the essential technologies that provide a competitive advantage in many manufacturing environments. This research aims to develop an effective sensor fusion model for turning processes for the detection of tool wear. Multiple sensors combined with a novelty detection algorithm and Learning Vector Quantisation (LVQ) neural networks are used in this research to detect tool wear and provide diagnostic and prognostic information. A novel approach, termed ASPST (Automated Sensor and Signal Processing Selection System for Turning), is used to select the most appropriate sensors and signal processing methods. The aim is to reduce the number of sensors needed in the overall system and hence its cost. The ASPST approach is based on simplifying complex sensory signals into a group of Sensory Characteristic Features (SCFs) and evaluating the sensitivity of these SCFs in detecting tool wear. A wide range of sensory signals (cutting forces, strain, acceleration, acoustic emission and sound) and signal processing methods is also implemented to verify the capability of the approach. A cost reduction method is also implemented, based on eliminating the least utilised sensor, in an attempt to reduce the overall cost of the system without sacrificing the capability of the condition monitoring system. The experimental results show that the suggested approach provides a responsive and effective solution for monitoring tool wear in turning with reduced time and cost.
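The LVQ component mentioned above can be illustrated with a minimal, self-contained sketch. This is not the thesis's implementation; it is a generic LVQ1 classifier in NumPy applied to hypothetical two-dimensional SCF vectors labelled with tool states, and all names and data here are illustrative.

```python
import numpy as np

def train_lvq(X, y, protos_per_class=1, lr=0.1, epochs=50, seed=0):
    """LVQ1: move the nearest prototype toward a same-class sample
    and away from a different-class sample."""
    rng = np.random.default_rng(seed)
    protos, labels = [], []
    for c in np.unique(y):
        # Initialise prototypes from random training samples of each class.
        idx = rng.choice(np.flatnonzero(y == c), protos_per_class, replace=False)
        protos.append(X[idx])
        labels += [c] * protos_per_class
    protos = np.vstack(protos).astype(float)
    labels = np.array(labels)
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            j = np.argmin(np.linalg.norm(protos - X[i], axis=1))
            step = lr * (X[i] - protos[j])
            protos[j] += step if labels[j] == y[i] else -step
    return protos, labels

def predict_lvq(protos, labels, X):
    """Assign each sample the label of its nearest prototype."""
    d = np.linalg.norm(X[:, None, :] - protos[None, :, :], axis=2)
    return labels[np.argmin(d, axis=1)]
```

In a tool-wear setting, each row of `X` would be an SCF vector and `y` the tool state (e.g. 0 = sharp, 1 = worn); the learned prototypes then act as compact class templates.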
Early Prediction of Diabetes Using Deep Learning Convolution Neural Network and Harris Hawks Optimization
Owing to the gravity of diabetes, even minimal early-stage symptoms of diabetic failure must be forecast. An instantaneous, anticipatory prediction system must therefore be developed to avert serious medical complications. Data gathered from the Pima Indian Diabetes dataset are synthesised through a deep learning approach that extracts features describing diabetic levels. Metadata are used to enhance the recognition of the deeply learned features. The distinct details retrieved by integrated machine and computing technology include glucose level, health information, age, insulin level, etc. Using the efficacious Harris Hawks Optimization Algorithm (HOA), the contribution of insignificant data to the diabetic diagnostic process is minimised in the analysis. Diabetic disease is then categorised with a Deep Learning Convolution Neural Network (DLCNN) using the chosen diabetic characteristics. The output of the developed process is measured on the basis of test results in terms of error rate, sensitivity, specificity and accuracy.
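Harris Hawks Optimization is a population-based metaheuristic in which candidate solutions ("hawks") besiege the best solution found so far (the "rabbit"). The sketch below is a heavily simplified, illustrative version of one besiege-style update, not the paper's implementation: the decaying escape energy `E` and jump strength `J` follow the commonly cited formulation, but the full algorithm's multiple exploration and exploitation phases are omitted.

```python
import numpy as np

def hho_minimize(f, dim, n_hawks=20, iters=100, lb=-5.0, ub=5.0, seed=0):
    """Simplified Harris Hawks-style search: each iteration, hawks move
    toward the best solution found so far, with a decaying escape-energy
    term shrinking the step size over time."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_hawks, dim))          # hawk positions
    fit = np.apply_along_axis(f, 1, X)
    best = X[np.argmin(fit)].copy()                  # the "rabbit"
    for t in range(iters):
        # Escape energy decays linearly from |E| <= 2 to 0.
        E = 2 * (1 - t / iters) * (2 * rng.random(n_hawks) - 1)
        J = 2 * (1 - rng.random((n_hawks, 1)))       # jump strength in (0, 2]
        X = best - E[:, None] * np.abs(J * best - X)
        X = np.clip(X, lb, ub)
        fit = np.apply_along_axis(f, 1, X)
        if fit.min() < f(best):
            best = X[np.argmin(fit)].copy()
    return best, f(best)
```

In a feature-selection use such as the paper's, `f` would score a candidate feature subset (e.g. by classifier error), so minimising it suppresses insignificant features.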
Proceedings of the Third International Workshop on Neural Networks and Fuzzy Logic, volume 2
Papers presented at the Neural Networks and Fuzzy Logic Workshop, sponsored by the National Aeronautics and Space Administration and cosponsored by the University of Houston-Clear Lake, held 1-3 Jun. 1992 at the Lyndon B. Johnson Space Center in Houston, Texas, are included. During the three days approximately 50 papers were presented. Technical topics addressed included adaptive systems; learning algorithms; network architectures; vision; robotics; neurobiological connections; speech recognition and synthesis; fuzzy set theory and applications; control and dynamics processing; space applications; fuzzy logic and neural network computers; approximate reasoning; and multiobject decision making.
Comparing spatial features of urban housing markets:
Various location-specific attributes contribute to the spatial dynamics of housing markets. This effect may partly be of a qualitative and discontinuous nature, which causes market segmentation into submarkets. The question, however, is whether the most relevant partitioning criteria are directly related to the transaction price or to other socioeconomic, demographic and physical features of the location. Two neural network techniques are used for analysing statistical house price data from Amsterdam and Helsinki. The analytic hierarchy process is used as a supporting technique. With these techniques it is possible to analyse various dimensions of housing submarket formation. The findings show that, while the price and demand factors have increased in importance, supply factors still prevail as key criteria in both cases. The outcome also indicates that the housing market structure of Amsterdam is more fragmented than that of Helsinki, and that the main discriminating housing market features, and the ways they have changed over time, are somewhat different.
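The analytic hierarchy process used as a supporting technique derives priority weights for criteria from a pairwise-comparison matrix, conventionally via its principal eigenvector. A minimal sketch, with a hypothetical comparison of three location attributes (the judgement values are invented, not taken from the study):

```python
import numpy as np

def ahp_weights(M, iters=1000, tol=1e-12):
    """Priority weights from an AHP pairwise-comparison matrix,
    computed by power iteration toward the principal eigenvector."""
    M = np.asarray(M, dtype=float)
    w = np.full(M.shape[0], 1.0 / M.shape[0])
    for _ in range(iters):
        w_new = M @ w
        w_new /= w_new.sum()          # normalise so weights sum to 1
        if np.abs(w_new - w).max() < tol:
            break
        w = w_new
    return w_new

# Hypothetical pairwise judgements for three location attributes,
# e.g. price level vs. accessibility vs. amenities (invented numbers):
M = [[1, 3, 5],
     [1 / 3, 1, 3],
     [1 / 5, 1 / 3, 1]]
weights = ahp_weights(M)  # largest weight goes to the first attribute
```

Entry `M[i][j]` states how much more important criterion `i` is than criterion `j`; reciprocal symmetry (`M[j][i] = 1 / M[i][j]`) keeps the judgements consistent.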
Investigation into the effect of fixturing systems on the design of condition monitoring for machining operations
Global market competition has drawn manufacturers' attention to automated manufacturing processes using condition monitoring systems. These systems have been used for improving product quality, eliminating inspection, and enhancing manufacturing productivity. Fixtures are essential devices in machining processes for holding the tool or workpiece, and the stability of the cutting tool is directly influenced by them. Therefore, tool and fixturing faults play an important part in the inaccuracy of machining processes, causing deterioration of surface roughness. For the above-mentioned reasons, and given the limited work in this domain, this thesis develops an experimental investigation to evaluate the effect of fixturing quality on the design of condition monitoring systems. The proposed monitoring system implements multiple sensors and signal processing methods able to analyse the sensory information and make an appropriate decision. Therefore, several sensors, namely force, vibration, acoustic emission, eddy current, power, strain and sound, are combined with a newly suggested approach, named Taylor's Equation Induced Pattern (TIP), and neural networks to detect tool wear and tool breakage. The thesis also evaluates the monitoring system to provide valuable data showing the effect of fixturing quality. Surface roughness of the workpiece has been measured and compared with the sensitivity of the monitoring system, which reflects the state of tool and fixturing conditions.
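The TIP approach itself is specific to the thesis, but it builds on the classical Taylor tool life equation, V * T^n = C, which relates cutting speed V to tool life T through empirical constants n and C. A small worked example with illustrative constant values (not taken from the thesis):

```python
def taylor_tool_life(V, n, C):
    """Taylor's tool life equation V * T**n = C, solved for tool life T
    (minutes) at cutting speed V (m/min); n and C are empirical
    constants for a given tool/workpiece combination."""
    return (C / V) ** (1.0 / n)

# Illustrative constants: n = 0.25, C = 300 (invented for the example).
life = taylor_tool_life(150.0, 0.25, 300.0)  # (300/150)**4 = 16.0 minutes
```

The exponent 1/n makes tool life very sensitive to cutting speed: doubling V here cuts the predicted life by a factor of sixteen, which is why wear monitoring matters most at high speeds.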
A survey on machine learning for recurring concept drifting data streams
The problem of concept drift has gained a lot of attention in recent years. This aspect is key in many domains exhibiting non-stationary behaviour as well as cyclic patterns and structural breaks affecting their generative processes. In this survey, we review the relevant literature on dealing with regime changes in the behaviour of continuous data streams. The study starts with a general introduction to the field of data stream learning, describing recent works on passive or active mechanisms to detect or adapt to concept drift, frequent challenges in this area, and related performance metrics. Then, different supervised and unsupervised approaches such as online ensembles, meta-learning and model-based clustering that can be used to deal with seasonalities in a data stream are covered. The aim is to point out new research trends and give future research directions on the usage of machine learning techniques for data streams which can help in the event of shifts and recurrences in continuous learning scenarios in near real-time.
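Active drift-detection mechanisms of the kind surveyed here are often illustrated by the Drift Detection Method (DDM), which monitors a classifier's error rate and signals a warning or drift when it rises significantly above its historical minimum. The following is a simplified sketch using the common 2-sigma warning / 3-sigma drift convention; it is illustrative, not any specific surveyed implementation.

```python
import math

class DDM:
    """Simplified Drift Detection Method: monitor a stream of 0/1
    prediction errors and flag drift when the running error rate rises
    significantly above its historical minimum."""

    def __init__(self, warm_up=30):
        self.warm_up = warm_up
        self.n = 0
        self.p = 0.0               # running error rate
        self.p_min = float("inf")  # lowest p + s seen, split into p ...
        self.s_min = float("inf")  # ... and the std. dev. at that point

    def update(self, error):
        """error: 1 if the base classifier misclassified, else 0."""
        self.n += 1
        self.p += (error - self.p) / self.n            # incremental mean
        s = math.sqrt(self.p * (1 - self.p) / self.n)  # binomial std. dev.
        if self.n <= self.warm_up:
            return "stable"
        if self.p + s < self.p_min + self.s_min:
            self.p_min, self.s_min = self.p, s
        if self.p + s > self.p_min + 3 * self.s_min:
            return "drift"
        if self.p + s > self.p_min + 2 * self.s_min:
            return "warning"
        return "stable"
```

In a recurring-concept setting, a detector like this is typically paired with a model store: on "drift" the learner either retrains or retrieves a previously saved model matching the re-emerging concept.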
The Shallow and the Deep: A biased introduction to neural networks and old school machine learning
The Shallow and the Deep is a collection of lecture notes that offers an accessible introduction to neural networks and machine learning in general. However, it was clear from the beginning that these notes would not be able to cover this rapidly changing and growing field in its entirety. The focus lies on classical machine learning techniques, with a bias towards classification and regression. Other learning paradigms and many recent developments in, for instance, Deep Learning are not addressed or only briefly touched upon. Biehl argues that having a solid knowledge of the foundations of the field is essential, especially for anyone who wants to explore the world of machine learning with an ambition that goes beyond the application of some software package to some data set. Therefore, The Shallow and the Deep places emphasis on fundamental concepts and theoretical background. This also involves delving into the history and pre-history of neural networks, where the foundations for most of the recent developments were laid. These notes aim to demystify machine learning and neural networks without losing the appreciation for their impressive power and versatility.
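As an example of the "old school" foundations the notes emphasise, Rosenblatt's perceptron, the classic linear classifier at the root of neural-network pre-history, fits in a few lines. A minimal sketch, illustrative rather than taken from the book:

```python
def perceptron(X, y, lr=0.1, epochs=20):
    """Rosenblatt's perceptron: on each misclassified (or zero-margin)
    sample, nudge the weights toward the correct side."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):              # labels yi are -1 or +1
            score = sum(wj * xj for wj, xj in zip(w, xi)) + b
            if yi * score <= 0:               # wrong side or on the boundary
                w = [wj + lr * yi * xj for wj, xj in zip(w, xi)]
                b += lr * yi
    return w, b
```

On linearly separable data the perceptron convergence theorem guarantees it stops making mistakes; on non-separable data it never settles, a limitation that motivated much of the later development the notes recount.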