
    Deep Neural Networks for the Recognition and Classification of Heart Murmurs Using Neuromorphic Auditory Sensors

    Auscultation is one of the most widely used techniques for detecting cardiovascular diseases, which are among the main causes of death in the world. Heart murmurs are the most common abnormal finding when a patient visits the physician for auscultation. These heart sounds can either be innocent, which are harmless, or abnormal, which may be a sign of a more serious heart condition. However, the accuracy of primary care physicians and expert cardiologists when auscultating is not high enough to avoid most type-I errors (healthy patients sent for an echocardiogram) and type-II errors (pathological patients sent home without medication or treatment). In this paper, the authors present a novel convolutional neural network-based tool for distinguishing between healthy people and pathological patients using a neuromorphic auditory sensor for FPGA that is able to decompose the audio into frequency bands in real time. For this purpose, different networks have been trained with the heart murmur information contained in heart sound recordings obtained from nine different heart sound databases sourced from multiple research groups. These samples are segmented and preprocessed using the neuromorphic auditory sensor to decompose their audio information into frequency bands and, after that, sonogram images of the same size are generated. These images have been used to train and test different convolutional neural network architectures. The best results have been obtained with a modified version of the AlexNet model, achieving 97% accuracy (specificity: 95.12%, sensitivity: 93.20%, PhysioNet/CinC Challenge 2016 score: 0.9416). This tool could aid cardiologists and primary care physicians in the auscultation process, improving decision making and reducing type-I and type-II errors.
    Ministerio de Economía y Competitividad TEC2016-77785-
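    The sketch below is a minimal Python/PyTorch illustration of the kind of small convolutional classifier that could be trained on fixed-size sonogram images; the input shape (1x64x64), the layer widths and the two-class output are placeholder assumptions, not the modified AlexNet reported above.

```python
# Minimal sketch (PyTorch): a small CNN classifying fixed-size sonogram images
# (frequency bands x time frames) as healthy vs. pathological.
# Input shape and layer widths are illustrative, not the paper's architecture.
import torch
import torch.nn as nn

class SonogramCNN(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                  # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                  # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Usage: a batch of 8 single-channel 64x64 sonograms.
model = SonogramCNN()
print(model(torch.randn(8, 1, 64, 64)).shape)  # torch.Size([8, 2])
```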

    NAVIS: Neuromorphic Auditory VISualizer Tool

    This software provides a set of utilities that perform the first post-processing layer on the information produced by neuromorphic auditory sensors (NAS). The NAS used implements a cascade filter architecture in FPGA, imitating the behavior of the basilar membrane and inner hair cells, and works with the sound information decomposed into its frequency components as spike streams. The well-known neuromorphic hardware interface Address-Event-Representation (AER) is used to propagate auditory information out of the NAS, emulating the auditory vestibular nerve. Using the information packetized into aedat files, which are generated through the jAER software plus an AER-to-USB computer interface, NAVIS implements a set of graphs that allow the auditory information to be represented as cochleograms, histograms, sonograms, etc. It can also split the auditory information into different sets depending on the activity level of the spike streams. The main contribution of this software tool is that it allows complex audio post-processing treatments and representations, which is a novelty for spike-based systems in the neuromorphic community, and it will help neuromorphic engineers build training sets for spiking neural networks (SNNs).
    Ministerio de Economía y Competitividad TEC2012-37868-C04-0
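    As a rough illustration of one of these representations, the Python sketch below bins NAS spike events (timestamp, channel address) into a sonogram-like matrix of spike counts; the synthetic event arrays and the 10 ms bin width are assumptions, and no aedat/jAER parsing is shown.

```python
# Minimal sketch: turn NAS spike events (timestamp in microseconds, channel
# address) into a sonogram-like matrix of spike counts per channel and time bin.
# The synthetic events and the 10 ms bin width are illustrative assumptions;
# real data would come from an aedat file produced through jAER.
import numpy as np

def spike_sonogram(timestamps_us, addresses, n_channels, bin_us=10_000):
    t = np.asarray(timestamps_us) - np.min(timestamps_us)
    bins = (t // bin_us).astype(int)
    sono = np.zeros((n_channels, bins.max() + 1), dtype=np.int32)
    np.add.at(sono, (np.asarray(addresses), bins), 1)
    return sono  # rows: frequency channels, columns: time bins

# Usage with synthetic events: a 64-channel NAS and 1 s of random activity.
rng = np.random.default_rng(0)
ts = np.sort(rng.integers(0, 1_000_000, size=5000))
addr = rng.integers(0, 64, size=5000)
print(spike_sonogram(ts, addr, n_channels=64).shape)  # e.g. (64, 100)
```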

    Stereo Matching in Address-Event-Representation (AER) Bio-Inspired Binocular Systems in a Field-Programmable Gate Array (FPGA)

    In stereo-vision processing, the image-matching step is essential to the results, although it involves a very high computational cost. Moreover, the more information is processed, the more time is spent by the matching algorithm, and the more inefficient it is. Spike-based processing is a relatively new approach that implements processing methods by manipulating spikes one by one at the time they are transmitted, as the brain does. The mammalian nervous system can solve much more complex problems, such as visual recognition, by manipulating neuron spikes. The spike-based philosophy for visual information processing based on the neuro-inspired address-event-representation (AER) is currently achieving very high performance. The aim of this work was to study the viability of a matching mechanism in stereo-vision systems, using AER codification and its implementation in a field-programmable gate array (FPGA). Some studies have been carried out before in AER systems with data monitored on a computer; however, this kind of mechanism had not been implemented directly in hardware. To this end, an epipolar geometry basis applied to AER systems was studied and implemented, together with other restrictions, in order to achieve good results in a real-time scenario. The results and conclusions are shown, and the viability of the implementation is proven.
    Ministerio de Economía y Competitividad TEC2016-77785-
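    The following Python sketch illustrates the basic idea of event matching under an epipolar constraint for a rectified binocular pair: candidate right-sensor events must lie on the same row, share polarity, arrive within a short time window and fall inside a bounded disparity range. The event layout and all thresholds are illustrative assumptions, not the FPGA implementation.

```python
# Minimal sketch: match events from two rectified event sensors using the
# epipolar constraint (same row after rectification), temporal coincidence and
# a bounded disparity. Event layout (t_us, x, y, polarity) and all thresholds
# are illustrative assumptions.
import numpy as np

def match_events(left, right, max_dt_us=1000, max_disparity=30):
    """left/right: arrays of shape (N, 4) with columns (t_us, x, y, pol)."""
    matches = []
    for t, x, y, pol in left:
        disp = x - right[:, 1]
        cand = right[(right[:, 2] == y) &                    # same epipolar row
                     (right[:, 3] == pol) &                  # same polarity
                     (np.abs(right[:, 0] - t) <= max_dt_us) &
                     (disp >= 0) & (disp <= max_disparity)]
        if len(cand):
            best = cand[np.argmin(np.abs(cand[:, 0] - t))]   # closest in time
            matches.append((int(x), int(y), int(x - best[1])))  # x, y, disparity
    return matches

# Usage with two synthetic events on the same row and with the same polarity.
left = np.array([[1000, 40, 10, 1]])
right = np.array([[1200, 25, 10, 1]])
print(match_events(left, right))  # [(40, 10, 15)]
```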

    A Sensor Fusion Horse Gait Classification by a Spiking Neural Network on SpiNNaker

    The study and monitoring of wildlife behavior has always been a subject of great interest. Although many systems can track animal positions using GPS, behavior classification is not a common task. For this work, a multi-sensory wearable device has been designed and implemented to be used in the Doñana National Park in order to control and monitor wild and semi-wild animals. The data obtained with these sensors are processed using a Spiking Neural Network (SNN) with Address-Event-Representation (AER) coding and classified into a set of fixed activity behaviors. This work presents the full infrastructure deployed in Doñana to collect the data, the wearable device, the SNN implementation on SpiNNaker and the classification results.
    Ministerio de Economía y Competitividad TEC2012-37868-C04-02
    Junta de Andalucía P12-TIC-130
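    As a simple illustration of how sensor readings can be turned into spike-based input for an SNN, the sketch below rate-codes a window of sensor samples into Poisson spike counts; the 0-200 Hz scaling and the 100 ms window are assumptions, and the actual SpiNNaker network is not reproduced here.

```python
# Minimal sketch: rate-code a window of inertial sensor samples into Poisson
# spike counts, the kind of AER-style input a spiking neural network could
# consume. The 0-200 Hz scaling and the 100 ms window are illustrative
# assumptions; the SpiNNaker network itself is not reproduced here.
import numpy as np

def rate_code(samples, max_rate_hz=200.0, window_s=0.1, rng=None):
    rng = rng or np.random.default_rng()
    s = np.asarray(samples, dtype=float)
    s = (s - s.min()) / (np.ptp(s) + 1e-9)          # normalise to [0, 1]
    return rng.poisson(s * max_rate_hz * window_s)  # spikes per channel

print(rate_code([0.1, 2.5, -9.8]))  # e.g. spike counts for three axes
```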

    Wearable Fall Detector Using Recurrent Neural Networks

    Falls have become a relevant public health issue due to their high prevalence and negative effects on elderly people. Wearable fall detector devices allow the implementation of continuous and ubiquitous monitoring systems. The ability to analyze temporal signals with low energy consumption is one of the most relevant characteristics of these devices. Recurrent neural networks (RNNs) have demonstrated great accuracy in problems that require analyzing sequential inputs. However, achieving appropriate response times on low-power microcontrollers remains a difficult task due to their limited hardware resources. This work shows a feasibility study on using RNN-based deep learning models to detect both falls and fall risks in real time using accelerometer signals. The effectiveness of four different architectures was analyzed using the SisFall dataset at different frequencies. The resulting models were integrated into two different embedded systems to analyze the execution times and changes in model effectiveness. Finally, a study of power consumption was carried out. A sensitivity of 88.2% and a specificity of 96.4% were obtained. The simplest models reached inference times lower than 34 ms, which implies the capability to detect fall events in real time with high energy efficiency. This suggests that RNN models provide an effective method that can be implemented in low-power microcontrollers for the creation of autonomous wearable real-time fall detection systems.
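    A minimal PyTorch sketch of this kind of model is given below: a single-layer GRU over a window of triaxial accelerometer samples followed by a linear fall/no-fall head. The window length, hidden size and single recurrent layer are placeholders rather than any of the four architectures evaluated in the paper.

```python
# Minimal sketch (PyTorch): a small GRU that classifies a window of triaxial
# accelerometer samples as fall vs. activity of daily living. Window length,
# hidden size and the single recurrent layer are illustrative placeholders.
import torch
import torch.nn as nn

class FallGRU(nn.Module):
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.gru = nn.GRU(input_size=3, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)

    def forward(self, x):           # x: (batch, time, 3 accelerometer axes)
        _, h = self.gru(x)          # h: (num_layers, batch, hidden)
        return self.head(h[-1])

model = FallGRU()
print(model(torch.randn(4, 200, 3)).shape)  # torch.Size([4, 2])
```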

    A Microcontroller Based System for Controlling Patient Respiratory Guidelines

    The need to improve the non-invasive acquisition and monitoring of a patient's breathing rate parameters arises from (1) the large number of breathing problems our society suffers, (2) the problems that could be solved, and (3) the methods used so far. Non-specific machines are usually used to carry out these measurements, or the number of inhalations and exhalations within a particular timeframe is simply counted. These methods lack effectiveness and precision, which affects the ability to reach a good diagnosis. This proposal focuses on developing a technology composed of a mechanism and a user application that allows doctors to obtain the breathing rate parameters in a comfortable and concise way. In addition, such parameters are stored in a database for later consultation and for the patients' medical history. For this, the current approach takes into account the needs, capacities, expectations and motivations of the users, which have been compiled by means of open interviews, forum discussions, surveys and application use. In addition, an empirical evaluation has been conducted with a set of volunteers. Results indicate that the proposed technology may reduce cost and improve the reliability of the diagnosis.
    Ministerio de Economía y Competitividad TIN2016-76956-C3-2-R
    Ministerio de Economía y Competitividad TIN2015-71938-RED
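    As a rough sketch of the core measurement, the Python snippet below estimates breaths per minute from a respiration signal by counting inhalation peaks; the 10 Hz sampling rate and the minimum peak spacing are assumptions about the sensor, not the device's actual firmware.

```python
# Minimal sketch: estimate breaths per minute from a respiration signal by
# counting inhalation peaks. The 10 Hz sampling rate and the 1.5 s minimum
# spacing between peaks are illustrative assumptions about the sensor.
import numpy as np
from scipy.signal import find_peaks

def breaths_per_minute(signal, fs_hz=10.0):
    peaks, _ = find_peaks(signal, distance=int(1.5 * fs_hz))
    duration_min = len(signal) / fs_hz / 60.0
    return len(peaks) / duration_min

# Usage: 60 s of a synthetic signal breathing at ~15 breaths per minute.
t = np.arange(0, 60, 1 / 10.0)
print(round(breaths_per_minute(np.sin(2 * np.pi * 0.25 * t))))  # ~15
```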

    Event-based Row-by-Row Multi-convolution engine for Dynamic-Vision Feature Extraction on FPGA

    Neural network algorithms are commonly used to recognize patterns from different data sources such as audio or vision. In image recognition, Convolutional Neural Networks are one of the most effective techniques due to the high accuracy they achieve. This kind of algorithm requires billions of addition and multiplication operations over all the pixels of an image. However, it is possible to reduce the number of operations using computer vision techniques other than frame-based ones, e.g. neuromorphic frame-free techniques. There exist many neuromorphic vision sensors that detect pixels whose luminosity has changed. In this study, an event-based convolution engine for FPGA is presented. This engine models an array of leaky integrate-and-fire neurons. It is able to apply different kernel sizes, from 1x1 to 7x7, which are computed row by row, with a maximum of 64 different convolution kernels. The design presented is able to process 64 feature maps of 7x7 with a latency of 8.98 s.
    Ministerio de Economía y Competitividad TEC2016-77785-
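    The Python sketch below illustrates the event-driven convolution idea in software: each incoming event adds the kernel to the membrane potentials around its address, a time-proportional leak is applied, and neurons crossing the threshold emit output events and reset. The kernel, threshold and leak rate are illustrative values, not the row-by-row FPGA datapath.

```python
# Minimal sketch: event-driven convolution with leaky integrate-and-fire
# neurons. Each incoming event integrates the kernel around its address;
# neurons crossing the threshold emit an output event and reset. Threshold
# and leak values are illustrative, not the FPGA design's parameters.
import numpy as np

class EventLIFConv:
    def __init__(self, height, width, kernel, threshold=1.0, leak_per_us=1e-4):
        self.v = np.zeros((height, width))   # membrane potentials
        self.kernel = kernel
        self.threshold = threshold
        self.leak_per_us = leak_per_us
        self.last_t = 0

    def process(self, t_us, x, y):
        # Leak all neurons in proportion to elapsed time, then integrate.
        self.v *= max(0.0, 1.0 - self.leak_per_us * (t_us - self.last_t))
        self.last_t = t_us
        k = self.kernel.shape[0] // 2
        h, w = self.v.shape
        y0, y1 = max(0, y - k), min(h, y + k + 1)
        x0, x1 = max(0, x - k), min(w, x + k + 1)
        self.v[y0:y1, x0:x1] += self.kernel[y0 - (y - k):y1 - (y - k),
                                            x0 - (x - k):x1 - (x - k)]
        fired = np.argwhere(self.v >= self.threshold)      # output events
        self.v[self.v >= self.threshold] = 0.0             # reset fired neurons
        return [(t_us, int(cx), int(cy)) for cy, cx in fired]

# Usage: a 3x3 averaging kernel on a 128x128 neuron array.
engine = EventLIFConv(128, 128, np.full((3, 3), 0.4))
print(engine.process(t_us=10, x=64, y=64))  # no spikes yet (v = 0.4 < 1.0)
```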

    Embedded neural network for real-time animal behavior classification

    Recent biological studies have focused on understanding animal interactions and welfare. To help biologists obtain information about animals' behavior, resources like wireless sensor networks are needed. Moreover, large amounts of collected data have to be processed off-line in order to classify different behaviors. Recent research projects have focused on designing monitoring systems capable of measuring some of the animals' parameters in order to recognize and monitor their gaits or behaviors. However, network unreliability and high power consumption have limited their applicability. In this work, we present an animal behavior recognition, classification and monitoring system based on a wireless sensor network and a smart collar device, equipped with inertial sensors and an embedded multi-layer perceptron-based feed-forward neural network, to classify the different gaits or behaviors based on the collected information. In similar works, classification mechanisms are implemented in a server (or base station). The main novelty of this work is the full implementation of a reconfigurable neural network embedded into the animal's collar, which allows real-time behavior classification and enables its local storage in SD memory. Moreover, this approach reduces the amount of data transmitted to the base station (and its periodicity), significantly improving battery life. The system has been simulated and tested in a real scenario for three different horse gaits, using different heuristics and sensors to improve the accuracy of behavior recognition, achieving a maximum of 81%.
    Junta de Andalucía P12-TIC-130
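    As a simple illustration of the on-collar computation, the numpy sketch below runs the forward pass of a small feed-forward MLP over a vector of windowed inertial features; the layer sizes, random weights and three-gait output are placeholders, since on the real device the trained weights would be stored locally.

```python
# Minimal sketch: forward pass of a small feed-forward MLP over a window of
# inertial features, the kind of computation that fits on the collar's
# microcontroller. Layer sizes and random weights are placeholders.
import numpy as np

def mlp_forward(features, w1, b1, w2, b2):
    h = np.tanh(features @ w1 + b1)      # hidden layer
    logits = h @ w2 + b2
    return int(np.argmax(logits))        # index of the predicted gait/behavior

rng = np.random.default_rng(0)
n_features, n_hidden, n_gaits = 12, 8, 3  # e.g. walk, trot, gallop
w1, b1 = rng.normal(size=(n_features, n_hidden)), np.zeros(n_hidden)
w2, b2 = rng.normal(size=(n_hidden, n_gaits)), np.zeros(n_gaits)
print(mlp_forward(rng.normal(size=n_features), w1, b1, w2, b2))
```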

    Accuracy Improvement of Neural Networks Through Self-Organizing-Maps over Training Datasets

    Although it is not a novel topic, pattern recognition has become very popular and relevant in recent years. Different classification systems, such as neural networks, support vector machines or even complex statistical methods, have been used for this purpose. Several works have used these systems to classify animal behavior, mainly in an offline way. Their main problem is usually the data pre-processing step, because the better the input data are, the higher the accuracy of the classification system may be. In previous papers by the authors, an embedded implementation of a neural network was deployed on a portable device placed on animals. This approach allows the classification to be done online and in real time. This is one of the aims of the research project MINERVA, which is focused on monitoring wildlife in Doñana National Park using low-power devices. Many difficulties were faced when the quality of the pre-processing methods needed to be evaluated. In this work, a novel pre-processing evaluation system based on self-organizing maps (SOMs) to measure the quality of the neural network training dataset is presented. The paper focuses on a classification study of three different horse gaits. Preliminary results show that a better SOM output map corresponds to an improvement in the embedded ANN's classification hit rate.
    Junta de Andalucía P12-TIC-1300
    Ministerio de Economía y Competitividad TEC2016-77785-
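    The sketch below shows one way such an evaluation could look in Python: a small self-organizing map is trained on the dataset's feature vectors and its quantization error is used as a rough indicator of how well the pre-processed data clusters. The grid size, learning schedule and iteration count are assumptions, not the configuration used in the paper.

```python
# Minimal sketch: a small self-organizing map trained on a dataset's feature
# vectors, with the quantization error as a rough quality indicator of the
# pre-processing. Grid size, learning schedule and iteration count are
# illustrative assumptions.
import numpy as np

def train_som(data, grid=(8, 8), iters=2000, lr0=0.5, sigma0=2.0, seed=0):
    rng = np.random.default_rng(seed)
    gy, gx = grid
    w = rng.normal(size=(gy, gx, data.shape[1]))
    yy, xx = np.mgrid[0:gy, 0:gx]
    for i in range(iters):
        x = data[rng.integers(len(data))]
        d = np.linalg.norm(w - x, axis=2)
        by, bx = np.unravel_index(np.argmin(d), d.shape)   # best matching unit
        lr = lr0 * (1 - i / iters)
        sigma = sigma0 * (1 - i / iters) + 0.5
        h = np.exp(-((yy - by) ** 2 + (xx - bx) ** 2) / (2 * sigma ** 2))
        w += lr * h[..., None] * (x - w)                   # pull the neighbourhood
    return w

def quantization_error(w, data):
    d = np.linalg.norm(w[None] - data[:, None, None, :], axis=3)
    return d.reshape(len(data), -1).min(axis=1).mean()

# Usage: a lower quantization error suggests a tighter, better-clustered dataset.
feats = np.random.default_rng(1).normal(size=(300, 6))
som = train_som(feats)
print(quantization_error(som, feats))
```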

    Worker’s physical fatigue classification using neural networks

    Physical fatigue is not only an indication of the user's physical condition and/or need for sleep or rest, but can also be a significant symptom of various diseases. This fatigue affects the performance of workers in jobs that involve continuous physical activity, and it is the cause of a large proportion of accidents at work. Physical fatigue is commonly measured by the rating of perceived exertion (RPE). Many previous studies have attempted to continuously monitor workers in order to detect the level of fatigue and prevent these accidents, but most have used invasive sensors that are difficult to place and prevent the worker from performing their tasks correctly. Other works use activity measurement sensors such as accelerometers, but the large amount of information obtained is difficult to analyse in order to extract the characteristics of each fatigue state. In this work, we use a dataset that contains data from inertial sensors of several workers performing various activities during their working day, labelled every 10 min according to their level of fatigue using questionnaires and the Borg fatigue scale. Applying machine learning techniques, we design, develop and test a system based on a neural network capable of classifying the variation in fatigue caused by the physical activity collected every 10 min; for this purpose, feature extraction is performed after a time decomposition with the Discrete Wavelet Transform (DWT). The results show that the proposed system has an accuracy higher than 92% in all cases, making it viable for application in the proposed scenario.
    European Commission (EC). Fondo Europeo de Desarrollo Regional (FEDER)
    Consejería de Economía, Conocimiento, Empresas y Universidad (Junta de Andalucía) US-126371
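    The sketch below illustrates that kind of pipeline in Python: each signal window is decomposed with a discrete wavelet transform (PyWavelets), simple statistics of each sub-band are used as features, and a small dense network (scikit-learn's MLPClassifier) performs the classification. The 'db4' wavelet, three decomposition levels and the synthetic data and labels are assumptions, not the paper's exact configuration.

```python
# Minimal sketch: DWT-based feature extraction per window (PyWavelets) followed
# by a small dense classifier (scikit-learn). Wavelet, decomposition level and
# the synthetic data/labels are illustrative assumptions.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def dwt_features(window, wavelet="db4", level=3):
    feats = []
    for c in pywt.wavedec(window, wavelet, level=level):
        feats += [np.mean(np.abs(c)), np.std(c)]   # per-sub-band statistics
    return np.array(feats)

rng = np.random.default_rng(0)
X = np.array([dwt_features(rng.normal(size=256)) for _ in range(100)])
y = rng.integers(0, 2, size=100)                    # fatigue increased: yes/no
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(X, y)
print(clf.predict(X[:5]))
```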