
    MODELFY: A Model-driven Solution for Decision Making based on Fuzzy Information

    There are areas, such as disease prevention or inclement weather protocols, in which the analysis of information based on strict protocols requires a high level of rigor and security. In these situations, it would be desirable to apply formal methodologies that provide these features. In this scope, a formalism, the fuzzy automaton, has recently been proposed that captures two aspects relevant to fuzzy information analysis: imprecision and uncertainty. However, the models should be designed by domain experts, who have the knowledge required to design the processes but lack the necessary technical background. To address this limitation, this paper proposes MODELFY, a novel model-driven solution for designing decision-making processes based on fuzzy automata that allows users to abstract away from technical complexities. With this goal in mind, we have developed a framework for fuzzy automaton model design based on a Domain-Specific Modeling Language (DSML) and a graphical editor. To improve the interoperability and functionality of this framework, it also includes a model-to-text transformation that translates the models designed with the graphical editor into a format that can be used by a data analysis tool. The practical value of this proposal is also evaluated through a non-trivial medical protocol for detecting potential heart problems. The results confirm that MODELFY is useful for defining such a protocol in a user-friendly and rigorous manner, bringing fuzzy automata closer to domain experts.
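    The model-to-text idea can be pictured with a small sketch. The paper does not publish its transformation or target format, so every name and the line-oriented output format below are hypothetical; the sketch only shows how a graphical fuzzy-automaton model might be flattened into text a data analysis tool could load.

        # Hypothetical sketch of a MODELFY-style model-to-text transformation.
        # The model classes and the output format are invented for illustration.
        from dataclasses import dataclass, field

        @dataclass
        class FuzzyTransition:
            source: str
            target: str
            symbol: str
            degree: float  # membership degree in [0, 1]

        @dataclass
        class FuzzyAutomatonModel:
            states: list
            initial: dict                       # state -> initial membership degree
            transitions: list = field(default_factory=list)

        def model_to_text(model: FuzzyAutomatonModel) -> str:
            """Flatten the graphical model into a line-oriented text format."""
            lines = ["STATES " + " ".join(model.states)]
            lines += [f"INIT {s} {d}" for s, d in model.initial.items()]
            lines += [f"TRANS {t.source} {t.symbol} {t.degree} {t.target}"
                      for t in model.transitions]
            return "\n".join(lines)

        model = FuzzyAutomatonModel(
            states=["normal", "at_risk"],
            initial={"normal": 1.0},
            transitions=[FuzzyTransition("normal", "at_risk", "tachycardia", 0.7)],
        )
        print(model_to_text(model))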

    Predicting complex system behavior using hybrid modeling and computational intelligence

    “Modeling and prediction of complex systems is a challenging problem due to sub-system interactions and dependencies. This research examines combining various computational intelligence algorithms and modeling techniques to provide insights into these complex processes and allow for better decision making. This hybrid methodology provides additional capabilities for analyzing and predicting overall system behavior where a single model cannot capture the complex problem. The systems analyzed here are flooding events and fetal health care. The impact of floods on road infrastructure is investigated using graph theory, agent-based traffic simulation, and Long Short-Term Memory deep learning to predict water level rise from river gauge height. Combined with existing infrastructure models, these techniques provide a 15-minute interval for making closure decisions rather than the current 6-hour interval. The second system explored is fetal monitoring, which is essential for diagnosing severe fetal conditions such as acidosis. Support Vector Machine and Random Forest classifiers were compared to identify the best model for classifying the fetal state. This model provided a more accurate classification than existing research on the CTG data. A deep learning forecasting model was developed to predict future values of fetal heart rate and uterine contractions. The forecasting and classification algorithms are then integrated to evaluate the future condition of the fetus. The final model can predict the fetal state 4 minutes ahead to help obstetricians plan the interventions necessary to prevent acidosis and asphyxiation. In both cases, time series predictions using hybrid modeling provided results superior to existing methods for predicting complex behaviors”--Abstract, page iv
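    The SVM-versus-Random-Forest comparison mentioned above can be sketched in a few lines. This is a minimal illustration, not the thesis' code: the features and labels below are synthetic stand-ins for the CTG-derived data, and the hyperparameters are arbitrary defaults.

        # Minimal sketch of comparing SVM and Random Forest by cross-validation.
        # X and y are synthetic stand-ins for CTG features and fetal-state labels.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 10))                   # stand-in feature matrix
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # stand-in class labels

        models = {
            "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
            "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
        }
        for name, model in models.items():
            scores = cross_val_score(model, X, y, cv=5)
            print(f"{name}: mean CV accuracy = {scores.mean():.3f}")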

    Applying fuzzy automata to analyze heart data

    There have been several attempts to introduce formal methods into the development of medical computer systems. Fuzzy automata provide a way to cope with the imprecision that appears in almost every biological or medical system. In this Thesis, we improve and extend a previous formalism based on fuzzy automata, and develop tools to facilitate the definition of models using our formalism and its practical use. We have applied our formalism and tools to analyze heart data in order to detect and prevent arrhythmias.
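    The thesis' extended formalism is not reproduced here, but the underlying mechanism can be illustrated with the standard max-min run of a fuzzy automaton over heart-derived symbols. The states, symbols and transition degrees below are invented for illustration.

        # Toy fuzzy-automaton run over heart-derived symbols (max-min composition).
        def step(state_memberships, transitions, symbol):
            """One step: mu'(q) = max over p of min(mu(p), delta(p, symbol, q))."""
            states = state_memberships.keys()
            return {
                q: max(min(state_memberships[p], transitions.get((p, symbol, q), 0.0))
                       for p in states)
                for q in states
            }

        transitions = {  # (source, symbol, target) -> transition degree
            ("normal", "irregular_rr", "suspect"): 0.6,
            ("suspect", "irregular_rr", "arrhythmia"): 0.8,
            ("normal", "regular_rr", "normal"): 1.0,
            ("suspect", "regular_rr", "normal"): 0.5,
        }
        mu = {"normal": 1.0, "suspect": 0.0, "arrhythmia": 0.0}
        for sym in ["irregular_rr", "irregular_rr"]:
            mu = step(mu, transitions, sym)
        print(mu)  # the 'arrhythmia' degree reflects accumulated fuzzy evidence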

    Security and privacy services based on biosignals for implantable and wearable devices

    International Mention in the doctoral degree. The proliferation of wearable and implantable medical devices has given rise to an interest in developing security schemes suitable for these devices and the environment in which they operate. One area that has received much attention lately is the use of (human) biological signals as the basis for biometric authentication, identification and the generation of cryptographic keys. More concretely, in this dissertation we use the Electrocardiogram (ECG) to extract fiducial points, which are later used in cryptographic protocols. Fiducial points are the points of interest that can be extracted from biological signals; examples for the ECG include the P-wave, the QRS complex, the T-wave, the R-peaks and the RR time interval. In particular, we focus on the time difference between two consecutive heartbeats (R-peaks). These time intervals are referred to as Inter-Pulse Intervals (IPIs) and have been proven to contain entropy after some signal processing algorithms are applied, a process known as the quantization algorithm. The entropy that the heart signal carries makes ECG values an ideal candidate for generating tokens to be used in security protocols.

    Most of the solutions proposed in the literature rely on questionable assumptions. For instance, it is commonly assumed that it is possible to generate the same cryptographic token on at least two different devices that are sensing the same signal, using the IPIs of each cardiac signal, without applying any synchronization algorithm; authors typically measure only the entropy of the LSBs to determine whether the generated cryptographic values are random or not; authors usually pick the four LSBs assuming they are the best ones to create the best cryptographic tokens; the datasets used in these works are rather small and therefore possibly not significant enough; and, in general, it is impossible to reproduce the experiments carried out by other researchers because the source code of such experiments is usually not available. In this Thesis, we overcome these weaknesses by systematically addressing most of the open research questions. That is why, in all the experiments carried out during this research, we used PhysioNet, a public repository available on the Internet that stores a huge heart database named PhysioBank. This repository is constantly being updated by medical researchers who share sensitive information about patients, and it also offers an open source software suite named PhysioToolkit which can be used to read and display these signals. All the datasets we used contain ECG records obtained from a variety of real subjects with different heart-related pathologies as well as healthy people.

    The first chapter of this dissertation (Chapter 1) is entirely dedicated to presenting the research questions and introducing the main concepts used throughout this document, as well as settling some medical and cryptographic definitions. Finally, the objectives this dissertation tackles are described together with the main motivations for this Thesis.

    In Chapter 2 we report the results of a large-scale statistical study to determine whether the heart signal is a good source of entropy. For this, we analyze 19 public datasets of heart signals from the PhysioNet repository, spanning electrocardiograms from multiple subjects sampled at different frequencies and lengths. We then apply both the ENT and the NIST STS standard batteries of randomness tests to the extracted IPIs. The results we obtain clearly show that a short burst of bits derived from an ECG record may seem random, but large files derived from long ECG records should not be used for security purposes.

    In Chapter 3, we carry out an analysis to check whether the assumption that two different sensors can generate the same cryptographic token is reasonable. We systematically check whether two sensors can agree on the same token without sharing any type of information. Similarly to other proposals, we include ECC algorithms such as BCH in the token generation. We conclude that a fuzzy extractor (or another error correction technique) is not enough to correct the synchronization errors between the IPI values derived from two ECG signals captured by two sensors placed in different positions, and we demonstrate that a pre-processing of the heart signal must be performed before the fuzzy extractor is applied. Going one step further, in order to generate the same token on different sensors, we propose a synchronization algorithm based on a runtime monitor. After applying our proposed solution, we run the experiments again with 19 public databases from the PhysioNet repository; the only constraint when picking those databases was that they needed at least two measurements of the heart signal (ECG1 and ECG2). We conclude that the same token can be derived on different sensors in most of the tested databases if and only if a pre-processing of the heart signal is performed before extracting the tokens.

    In Chapter 4, we analyze the entropy of the tokens extracted from a heart signal according to the NIST recommendation (i.e., SP 800-90B, Recommendation for the Entropy Sources Used for Random Bit Generation). We downloaded 19 databases from the PhysioNet public repository and analyzed, in terms of min-entropy, more than 160,000 files. Finally, we propose other combinations for extracting tokens by taking 2, 3, 4 and 5 bits other than the usual four LSBs. We demonstrate that the four LSBs are not the best bits to use in cryptographic applications, and we offer alternative combinations for two (e.g., 87), three (e.g., 638), four (e.g., 2638) and five (e.g., 23758) bits which are, in general, much better than taking the four LSBs from the entropy point of view.

    Finally, the last chapter of this dissertation (Chapter 5) summarizes the main conclusions arising from this PhD Thesis and introduces some open questions. Doctoral Programme in Computer Science and Technology, Universidad Carlos III de Madrid. President: Arturo Ribagorda Garnacho; Secretary: Jorge Blasco Alis; Committee member: Jesús García López de la Call
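    The token pipeline the dissertation studies can be sketched compactly: derive IPIs from R-peak timestamps, keep the four least significant bits of each IPI, and estimate the min-entropy of the resulting bit string. The R-peak times below are synthetic (real experiments use PhysioNet records), and the plug-in min-entropy estimate is a deliberate simplification of the refined SP 800-90B estimators.

        # Hedged sketch: IPI extraction, 4-LSB quantization, min-entropy estimate.
        import math
        from collections import Counter

        r_peaks_ms = [0, 812, 1640, 2435, 3251, 4060, 4889, 5701]  # synthetic
        ipis = [b - a for a, b in zip(r_peaks_ms, r_peaks_ms[1:])]

        def lsb_token(ipis, n_bits=4):
            """Concatenate the n least significant bits of each IPI (the common choice)."""
            bits = []
            for ipi in ipis:
                bits += [(ipi >> k) & 1 for k in range(n_bits - 1, -1, -1)]
            return bits

        def min_entropy_per_symbol(symbols):
            """Naive plug-in min-entropy: -log2(p_max). SP 800-90B uses finer estimators."""
            counts = Counter(symbols)
            p_max = max(counts.values()) / len(symbols)
            return -math.log2(p_max)

        token = lsb_token(ipis)
        print("token bits:", token)
        print("min-entropy per bit ~", round(min_entropy_per_symbol(token), 3))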

    Urban land cover change detection analysis and modeling spatio-temporal growth dynamics using remote sensing and GIS techniques: A case study of Dhaka, Bangladesh

    Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies. Dhaka, the capital of Bangladesh, has undergone radical changes in its physical form over the last decades, not only in its vast territorial expansion, but also through internal physical transformations. In the process of urbanization, the physical characteristics of Dhaka are gradually changing as open spaces have been transformed into built-up areas, and low land and water bodies into reclaimed built-up land. This new urban fabric should be analyzed to understand the changes that have led to its creation. The primary objective of this research is to predict and analyze the future urban growth of Dhaka City. Another objective is to quantify and investigate the characteristics of urban land cover changes (1989-2009) using the Landsat satellite images of 1989, 1999 and 2009. Dhaka City Corporation (DCC) and its surrounding impact areas have been selected as the study area. A Fisher supervised classification method has been applied to prepare the base maps with five land cover classes. To observe the change detection, different spatial metrics have been used for quantitative analysis, and some post-classification change detection techniques have also been implemented. It is found that the ‘builtup area’ land cover type has been increasing at a high rate over the years, with the major contributors to this change being the ‘fallow land’ and ‘water body’ land cover types. In the next stage, three different models have been implemented to simulate the 2009 land cover map of Dhaka City: the ‘Stochastic Markov (St_Markov)’ model, the ‘Cellular Automata Markov (CA_Markov)’ model and the ‘Multi Layer Perceptron Markov (MLP_Markov)’ model. The best-fitted model has then been selected based on various Kappa statistics and other model validation techniques; on this basis, the ‘Multi Layer Perceptron Markov (MLP_Markov)’ model has been qualified as the most suitable model for this research. Using the MLP_Markov model, the land cover map of 2019 has been predicted. The MLP_Markov model shows that 58% of the total study area will be converted into the builtup area cover type by 2019. The interpretation of the future scenario in quantitative terms, as demonstrated in this research, will be of great value to urban planners and decision makers for the future planning of modern Dhaka City.
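    The Markov core shared by the St_Markov, CA_Markov and MLP_Markov models can be sketched as follows: estimate a class-transition matrix from two dated land cover maps and project class proportions one period ahead. The tiny grids and class list below are invented stand-ins for the classified Landsat scenes; CA_Markov and MLP_Markov additionally condition the spatial allocation, which this sketch omits.

        # Minimal Markov land-cover projection from two synthetic classified maps.
        import numpy as np

        classes = ["builtup", "fallow", "water"]
        map_1999 = np.array([[0, 1, 1], [2, 1, 0], [2, 2, 1]])  # class indices
        map_2009 = np.array([[0, 0, 1], [2, 0, 0], [2, 0, 1]])

        k = len(classes)
        counts = np.zeros((k, k))
        for a, b in zip(map_1999.ravel(), map_2009.ravel()):
            counts[a, b] += 1
        P = counts / counts.sum(axis=1, keepdims=True)  # row-stochastic transitions

        p_2009 = np.bincount(map_2009.ravel(), minlength=k) / map_2009.size
        p_2019 = p_2009 @ P                             # projected class proportions
        for name, share in zip(classes, p_2019):
            print(f"{name}: {share:.2%} of area projected for 2019")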

    Structural recognition of curves using a neural-aided fuzzy-statistic method with applications to graphs of heart-rate ratios

    Pattern recognition is one of the principal problems in computer science. Many issues, such as control, decision making or prediction, are related to it, and it also has a central position in robotics. Therefore, this branch of computer science has been developed for a long time, in both theoretical and implementation aspects. In many cases pattern recognition is a difficult problem, and consequently no single method is commonly used to solve it. Presently, a wide range of methods based on various branches of mathematics, for instance the calculus of probability or approximation theory, is applied. However, a universal recognition method does not exist: a given method can be effective for a specific sort of task and fail for others. This is the reason why new methods are created and the existing ones developed; for example, syntactic methods are supported with probabilistic mechanisms, methods combining different basic approaches, such as neuro-fuzzy ones, are created, and hybrid expert systems are built. This paper concerns the recognition of curves in relation to their structural features. The considered problem belongs to a group of problems where the pattern representation is a sequence of primitives that are elements of a context language. For this group of languages, automata which analyze them do exist, but their complexity is non-polynomial and consequently their usefulness in practical applications is limited. Moreover, no algorithm of grammar inference exists, and consequently no method of automatically creating the tables controlling parsers (conversion functions in automata) exists, which in practical non-trivial applications disqualifies these languages. So, for structural patterns whose representations belong to context languages, no syntactic methods allowing their analysis exist. Therefore, the application of non-syntactic methods to the analysis of structural features seems valuable. The aim of this paper is to propose a new methodology for the recognition of curves in relation to their structural features, taking advantage of statistically aided fuzzy methods. The possibility of a neural implementation of a recognition system based on the proposed methodology is tested. In the second chapter of this paper, the methodology of constructing a decision function in an axiomatic recognition of patterns is presented. In the third chapter the proposed methodology is applied to the classification of curves describing relative changes in the cardiac rhythm between different people with and without a cognitive load, respectively. The curves were obtained in the Department of Psychophysiology of the Jagiellonian University; the experiment is described in detail in [14], [15], [27]. The fourth chapter contains the description of a neural network computing the value of the membership function for each class.
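    The paper's axiomatic decision-function construction is not reproduced here, but the general statistically-aided-fuzzy idea can be sketched: estimate a per-class membership function from training statistics (here a Gaussian built from the mean and standard deviation of one curve feature) and classify by the largest membership degree. The feature values and class names below are illustrative only.

        # Sketch of statistically aided fuzzy classification by max membership.
        import math
        import statistics

        train = {  # class -> observed values of one structural curve feature
            "with_load": [0.42, 0.47, 0.51, 0.44, 0.49],
            "without_load": [0.30, 0.28, 0.33, 0.35, 0.31],
        }

        def gaussian_membership(x, mean, std):
            """Membership degree in [0, 1], peaking at the class mean."""
            return math.exp(-((x - mean) ** 2) / (2 * std ** 2))

        params = {c: (statistics.mean(v), statistics.stdev(v)) for c, v in train.items()}

        def classify(x):
            degrees = {c: gaussian_membership(x, m, s) for c, (m, s) in params.items()}
            return max(degrees, key=degrees.get), degrees

        label, degrees = classify(0.45)
        print(label, {c: round(d, 3) for c, d in degrees.items()})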

    Fuzzy expert system and its utility in various fields

    Today, fuzzy logic plays an important role in every field of our life. A fuzzy expert system represents expert knowledge and has been effectively applied to problem solving, classification and modeling in such diverse areas as science, engineering, business and medicine. This paper provides an overview of this fuzzy tool and highlights the basic features of an expert system with fuzzy logic.

    Intelligent Embedded Software: New Perspectives and Challenges

    Intelligent embedded systems (IES) represent a novel and promising generation of embedded systems (ES). IES have the capacity to reason about their external environment and adapt their behavior accordingly. Such systems are situated at the intersection of two different branches, embedded computing and intelligent computing. At the same time, intelligent embedded software (IESo) is becoming a large part of the engineering cost of intelligent embedded systems. IESo can include artificial intelligence (AI)-based systems such as expert systems, neural networks and other sophisticated AI models to guarantee important characteristics such as self-learning, self-optimizing and self-repairing. Despite the wide spread of such systems, some challenging design issues are arising: designing software that is at once resource-constrained and intelligent is not a trivial task, especially in a real-time context. To deal with this dilemma, embedded systems researchers have profited from progress in semiconductor technology to develop specific hardware that supports AI models well and renders the integration of AI with the embedded world a reality.

    Cloud computing application model for online recommendation through fuzzy logic system

    Cloud computing can offer different remote services over the Internet. We propose an online application model for health care systems that works by means of cloud computing. It can provide a higher quality of service remotely and, along with that, decreases the costs of chronic patients. This model is composed of two sub-models, each of which uses a different service: one is Software as a Service (SaaS), which is user-related, and the other is Platform as a Service (PaaS), which is engineer-related. Doctors classify chronic diseases into different stages according to their symptoms. As the clinical data have non-numeric values, we use a fuzzy logic system in the PaaS model to design this online application model. Based on this classification, patients can receive the proper recommendation through smart devices (SaaS model). Facultad de Informática
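    The PaaS-side fuzzy classification described above can be sketched as follows: symptom grades are fuzzified and simple Mamdani-style rules map them to disease-stage memberships, with the winning stage driving the SaaS recommendation. The membership breakpoints, symptom names, stage names and rules below are all invented for illustration; they are not taken from the paper.

        # Toy fuzzy staging of a chronic condition from graded symptoms.
        def fuzzify(severity):
            """Map a 0-10 symptom severity to fuzzy grades low/medium/high."""
            low = max(0.0, min(1.0, (4 - severity) / 4))
            high = max(0.0, min(1.0, (severity - 6) / 4))
            medium = max(0.0, 1.0 - low - high)
            return {"low": low, "medium": medium, "high": high}

        def stage(symptoms):
            """Rules: stage membership = max/min combinations of fuzzified grades."""
            f = {name: fuzzify(v) for name, v in symptoms.items()}
            return {
                "stable": min(f["pain"]["low"], f["fatigue"]["low"]),
                "monitor": max(f["pain"]["medium"], f["fatigue"]["medium"]),
                "urgent": max(f["pain"]["high"], f["fatigue"]["high"]),
            }

        memberships = stage({"pain": 7.5, "fatigue": 3.0})
        recommendation = max(memberships, key=memberships.get)
        print(memberships, "->", recommendation)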