30 research outputs found

    Survey of FPGA applications in the period 2000 – 2015 (Technical Report)

    Get PDF
    Romoth J, Porrmann M, Rückert U. Survey of FPGA applications in the period 2000 – 2015 (Technical Report); 2017. Since their introduction, FPGAs have appeared in more and more fields of application. Their key advantage is the combination of software-like flexibility with performance otherwise typical of dedicated hardware. Nevertheless, every application field imposes special requirements on the computational architecture used. This paper provides an overview of the topics FPGAs have been used for in the last 15 years of research, and of why they have been chosen over other processing units such as CPUs.

    Intelligent data mining using artificial neural networks and genetic algorithms : techniques and applications

    Get PDF
    Data Mining (DM) refers to the analysis of observational datasets to find relationships and to summarize the data in ways that are both understandable and useful. Many DM techniques exist. Compared with other DM techniques, Intelligent Systems (ISs) based approaches, which include Artificial Neural Networks (ANNs), fuzzy set theory, approximate reasoning, and derivative-free optimization methods such as Genetic Algorithms (GAs), are tolerant of imprecision, uncertainty, partial truth, and approximation. They provide flexible information processing capability for handling real-life situations. This thesis is concerned with the ideas behind the design, implementation, testing and application of a novel ISs based DM technique. The unique contribution of this thesis is the implementation of a hybrid IS DM technique (Genetic Neural Mathematical Method, GNMM) for solving novel practical problems, the detailed description of this technique, and the illustration of several applications solved by it.
    GNMM consists of three steps: (1) GA-based input variable selection, (2) Multi-Layer Perceptron (MLP) modelling, and (3) mathematical programming based rule extraction. In the first step, GAs are used to evolve an optimal set of MLP inputs. An adaptive method based on the average fitness of successive generations is used to adjust the mutation rate, and hence the exploration/exploitation balance. In addition, GNMM uses the elite group and appearance percentage to minimize the randomness associated with GAs. In the second step, MLP modelling serves as the core DM engine in performing classification/prediction tasks. An Independent Component Analysis (ICA) based weight initialization algorithm is used to determine optimal weights before the commencement of training. The Levenberg-Marquardt (LM) algorithm is used to achieve a second-order speedup compared to conventional Back-Propagation (BP) training. In the third step, mathematical programming based rule extraction is used not only to identify the premises of multivariate polynomial rules, but also to explore features from the extracted rules based on the data samples associated with each rule. The methodology can therefore provide regression rules and features not only in polyhedrons with data instances, but also in polyhedrons without data instances.
    A total of six datasets from environmental and medical disciplines were used as case study applications. These datasets involve the prediction of the longitudinal dispersion coefficient, classification of electrocorticography (ECoG)/electroencephalogram (EEG) data, eye-bacteria Multisensor Data Fusion (MDF), and diabetes classification (denoted by Data I through Data VI). GNMM was applied to all six datasets to explore its effectiveness, but with a different emphasis for each: Data I and II give a detailed illustration of how GNMM works; Data III and IV show how to deal with difficult classification problems; Data V illustrates the averaging effect of GNMM; and Data VI concerns GA parameter selection and benchmarking GNMM against other IS DM techniques such as the Adaptive Neuro-Fuzzy Inference System (ANFIS), Evolving Fuzzy Neural Network (EFuNN), Fuzzy ARTMAP, and Cartesian Genetic Programming (CGP). In addition, datasets obtained from published works (i.e. Data II and III) or public domains (i.e. Data VI), where previous results were available in the literature, were also used to benchmark GNMM's effectiveness.
    As a closely integrated system, GNMM has the merit that it needs little human interaction. With some predefined parameters, such as the GA's crossover probability and the shape of the ANNs' activation functions, GNMM is able to process raw data until human-interpretable rules are extracted. This is an important practical feature, as users of a DM system quite often have little or no need to fully understand its internal components. Through the case study applications, it has been shown that the GA-based variable selection stage is capable of filtering out irrelevant and noisy variables; improving the accuracy of the model; making the ANN structure less complex and easier to understand; and reducing the computational complexity and memory requirements. Furthermore, rule extraction ensures that the MLP training results are easily understandable and transferable.
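    The adaptive mutation mechanism from step (1) can be sketched in a few lines. The following is a minimal illustration only, not the thesis's implementation: the binary input-selection mask, the placeholder fitness function, and the adjustment factor and rate bounds are all assumptions.

```python
import random

def adapt_mutation_rate(rate, prev_avg_fitness, curr_avg_fitness,
                        factor=1.5, min_rate=0.001, max_rate=0.25):
    """Adjust the GA mutation rate from the average fitness of successive
    generations: if average fitness stagnates or drops, raise the rate to
    favour exploration; if it improves, lower it to favour exploitation.
    The multiplicative factor and the rate bounds are illustrative
    assumptions, not values from the thesis."""
    if curr_avg_fitness <= prev_avg_fitness:
        return min(rate * factor, max_rate)
    return max(rate / factor, min_rate)

def mutate(individual, rate):
    """Bit-flip mutation over a binary mask that selects MLP inputs."""
    return [bit ^ 1 if random.random() < rate else bit
            for bit in individual]

# Toy usage: evolve binary input-selection masks for 10 candidate variables.
population = [[random.randint(0, 1) for _ in range(10)] for _ in range(20)]
rate, prev_avg = 0.05, float("-inf")
for generation in range(50):
    fitnesses = [sum(ind) for ind in population]  # placeholder fitness
    curr_avg = sum(fitnesses) / len(fitnesses)
    rate = adapt_mutation_rate(rate, prev_avg, curr_avg)
    prev_avg = curr_avg
    population = [mutate(ind, rate) for ind in population]
```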

    Models and analysis of vocal emissions for biomedical applications

    Get PDF
    This book of Proceedings collects the papers presented at the 3rd International Workshop on Models and Analysis of Vocal Emissions for Biomedical Applications, MAVEBA 2003, held 10-12 December 2003 in Firenze, Italy. The workshop is organised every two years and aims to stimulate contacts between specialists active in research and industrial development in the area of voice analysis for biomedical applications. The scope of the workshop includes all aspects of voice modelling and analysis, ranging from fundamental research to all kinds of biomedical applications and related established and advanced technologies.

    Design of a secure architecture for the exchange of biomedical information in m-Health scenarios

    Get PDF
    The paradigm of m-Health (mobile health) advocates the massive integration of advanced mobile communication, network and sensor technologies in healthcare applications and systems, to foster the deployment of a new, user/patient-centered healthcare model. This model enables the empowerment of users in the management of their health (e.g. by increasing their health literacy, promoting healthy lifestyles and the prevention of diseases), better home-based healthcare delivery for elderly and chronic patients, and important savings for healthcare systems due to the reduction of hospitalizations in number and duration. However, many m-Health applications demand high availability of biomedical information from their users (for further accurate analysis, e.g. by fusion of various signals) to guarantee high quality of service, which in turn increases the potential attack surface. It is therefore not surprising that security (and privacy) is commonly cited among the most important barriers to the success of m-Health.
    As a non-functional requirement of m-Health applications, security has received less attention than other technical issues that were more pressing at earlier development stages, such as reliability, efficiency, interoperability or usability. Another factor that has contributed to delaying the enforcement of robust security policies is that guaranteeing a certain security level implies costs that can be very significant and that span several dimensions: budget (e.g. the need for extra hardware for user authentication), performance (e.g. lower efficiency and interoperability due to the addition of security elements) and usability (e.g. cumbersome configuration of devices and applications due to security options). Therefore, security solutions that aim to satisfy all the stakeholders in the m-Health context (users/patients, medical staff, technical staff, system and device manufacturers, regulators, etc.) must be robust and, at the same time, minimize their associated costs.
    This Thesis details a proposal, composed of four interrelated blocks, to integrate appropriate levels of security in m-Health architectures in a cost-efficient manner. The first block designs a global scheme that provides different security and interoperability levels according to how critical the m-Health applications to be implemented are. It consists of three layers tailored to the m-Health domains and their constraints, whose security countermeasures defend against the threats of their associated m-Health applications. The second block addresses the security extension of those standard protocols that enable the acquisition, exchange and/or management of biomedical information (and are thus used by many m-Health applications) but do not meet the security levels described in the former scheme. These extensions are materialized for the biomedical standards ISO/IEEE 11073 PHD and SCP-ECG. The third block proposes new ways of enhancing the security of biomedical tests, which are the centerpiece of many clinical m-Health applications, by means of novel codings. Finally, the fourth block, which runs in parallel to the others, selects generic security methods (for user authentication and cryptographic protection) whose integration in the other blocks is optimal, and also develops novel signal-based methods (embedding and keytagging) for strengthening the security of biomedical tests.
    The layer-based extensions of the standards ISO/IEEE 11073 PHD and SCP-ECG can be considered robust, cost-efficient and respectful of their original features and contents. The former adds no attributes to the data information model, adds four new frames to the service model (and extends four with new sub-frames), and adds only one new sub-state to the communication model. Furthermore, a lightweight architecture consisting of a personal health device mounting a 9 MHz processor and an aggregator mounting a 1 GHz processor is enough to transmit a 3-lead electrocardiogram in real time while implementing the top security layer. The extra requirements associated with this extension are an initial configuration of the health device and the aggregator, tokens for identification/authentication of users if these devices are to be shared, and the implementation of certain IHE profiles in the aggregator to enable the integration of measurements in healthcare systems. As regards the extension of SCP-ECG, it only adds a new section with selected security elements and syntax in order to protect the rest of the file contents and provide proper role-based access control. The overhead introduced in the protected SCP-ECG is typically 2-13% of the regular file size, and the extra delays to protect a newly generated SCP-ECG file and to access it for interpretation are respectively 2-10% and 5% of the regular delays.
    As regards the signal-based security techniques developed, the embedding method is the basis for the proposal of a generic coding for tests composed of biomedical signals, periodic measurements and contextual information. This coding has been adjusted and evaluated with electrocardiogram- and electroencephalogram-based tests, proving the objective clinical quality of the coded tests, the capacity of the coding-access system to operate in real time (overall delays of 2 s for electrocardiograms and 3.3 s for electroencephalograms) and its high usability. Despite the embedding of security elements and metadata to enable m-Health services, the compression ratios obtained by this coding range from ≈ 3 in real-time transmission to ≈ 5 in offline operation. Complementarily, keytagging permits associating information with images (and other signals) by means of keys in a secure and non-distorting fashion, which has been leveraged to implement security measures such as image authentication, integrity control and location of tampered areas, private captioning with role-based access control, traceability and copyright protection. The tests conducted indicate a remarkable robustness-capacity tradeoff that permits implementing all these measures simultaneously, and show the compatibility of keytagging with JPEG2000 compression, maintaining this tradeoff while keeping the overall keytagging delay at only ≈ 120 ms for any image size, which evidences the scalability of the technique.
    As a general conclusion, it has been demonstrated and illustrated with examples that there are various, complementary and structured ways to contribute to the implementation of suitable security levels for m-Health architectures with a moderate cost in budget, performance, interoperability and usability. The m-Health landscape is evolving permanently along all its dimensions, and this Thesis aims to evolve with its security. Furthermore, the lessons learned herein may offer further guidance for the elaboration of more comprehensive and updated security schemes, for the extension of other biomedical standards featuring low emphasis on security or privacy, and for the improvement of the state of the art regarding signal-based protection methods and applications.
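    The general idea of embedding metadata into a signal can be illustrated with a minimal least-significant-bit (LSB) sketch. This is a generic illustration of signal-based embedding, not the thesis's embedding or keytagging scheme (which is designed to preserve clinical quality and support the security services listed above); the function names and toy payload are assumptions.

```python
def embed_bits(samples, bits):
    """Hide a bit string in the least-significant bits of integer signal
    samples (generic LSB embedding; illustrative only)."""
    if len(bits) > len(samples):
        raise ValueError("payload larger than cover signal")
    out = list(samples)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # clear the LSB, then write the payload bit
    return out

def extract_bits(samples, n_bits):
    """Recover the first n_bits hidden by embed_bits."""
    return [s & 1 for s in samples[:n_bits]]

# Toy usage: embed a 4-bit metadata tag into ECG-like integer samples.
ecg = [1024, 1031, 1017, 998, 1002, 1040]
tag = [1, 0, 1, 1]
stego = embed_bits(ecg, tag)
assert extract_bits(stego, 4) == tag
```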

    Recent Advances in Signal Processing

    Get PDF
    Signal processing is a critical task in the majority of new technological inventions and challenges across a variety of applications in both science and engineering. Classical signal processing techniques have largely worked with mathematical models that are linear, local, stationary, and Gaussian, and have always favored closed-form tractability over real-world accuracy; these constraints were imposed by the lack of powerful computing tools. During the last few decades, signal processing theories, developments, and applications have matured rapidly and now include tools from many areas of mathematics, computer science, physics, and engineering. This book is targeted primarily toward students and researchers who want to be exposed to a wide variety of signal processing techniques and algorithms. It includes 27 chapters that can be categorized into five different areas depending on the application at hand; in order, these categories address image processing, speech processing, communication systems, time-series analysis, and educational packages. The book has the advantage of providing a collection of applications that are completely independent and self-contained, so the interested reader can choose any chapter and skip to another without losing continuity.

    Design of large polyphase filters in the Quadratic Residue Number System

    Full text link

    Temperature aware power optimization for multicore floating-point units

    Full text link

    Intelligent Circuits and Systems

    Get PDF
    ICICS-2020 is the third conference initiated by the School of Electronics and Electrical Engineering at Lovely Professional University; it explored recent innovations by researchers working on the development of smart and green technologies in the fields of Energy, Electronics, Communications, Computers, and Control. ICICS enables innovators to identify new opportunities for the social and economic benefit of society. The conference bridges the gap between academia, R&D institutions, social visionaries, and experts from all strata of society, allowing them to present their ongoing research activities and fostering research relations between them. It provides opportunities for the exchange of new ideas, applications, and experiences in the field of smart technologies, and for finding global partners for future collaboration. ICICS-2020 was conducted in two broad categories: Intelligent Circuits & Intelligent Systems, and Emerging Technologies in Electrical Engineering.