100 research outputs found

    On a fractional-order delay Mackey-Glass equation


    A Survey Study of the Current Challenges and Opportunities of Deploying the ECG Biometric Authentication Method in IoT and 5G Environments

    The Internet of Things (IoT) environment has opened the horizon for researchers to deploy useful new techniques and methods in different fields and areas, a process in which numerous IoT devices are utilized during implementation. With the wide use of IoT devices in our daily lives across many fields, personal identification is becoming increasingly important for our society. This survey aims to demonstrate various aspects related to the implementation of biometric authentication in healthcare monitoring systems based on acquiring vital ECG signals via designated wearable devices that are compatible with 5G technology. The nature of ECG signals and current ongoing research related to ECG authentication are investigated in this survey, along with the factors that may affect the signal acquisition process. In addition, the survey addresses the psycho-physiological factors that pose a challenge to the usage of ECG signals as a biometric trait in biometric authentication systems, along with other challenges that must be addressed and resolved in any future related research.

    Wavelet Theory

    The wavelet is a powerful mathematical tool that plays an important role in science and technology. This book looks at some of the most creative and popular applications of wavelets, including biomedical signal processing, image processing, communication signal processing, Internet of Things (IoT), acoustical signal processing, financial market data analysis, energy and power management, and COVID-19 pandemic measurements and calculations. The editor's personal interest lies in applying the wavelet transform to identify time-domain changes in signals and their corresponding frequency components, and in improving power amplifier behavior.
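As a minimal, hypothetical sketch of the idea the editor highlights (pure Python, not taken from the book): a one-level Haar wavelet transform splits a signal into coarse averages and detail coefficients, and an abrupt time-domain change shows up as a large detail coefficient near its location.

```python
def haar_step(signal):
    """One level of the Haar wavelet transform.

    Returns (approximation, detail) coefficient lists; the input
    length must be even.
    """
    assert len(signal) % 2 == 0, "length must be even"
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

# A flat signal with a jump inside the second pair of samples:
approx, detail = haar_step([1, 1, 1, 5, 5, 5, 5, 5])
print(approx)  # [1.0, 3.0, 5.0, 5.0]
print(detail)  # [0.0, -2.0, 0.0, 0.0]  <- the nonzero detail marks the change
```

Unlike a Fourier spectrum, the detail coefficients keep their position, so the transform localizes where in time the change occurred as well as at what scale.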

    Sensor Signal and Information Processing II

    In the current age of information explosion, newly invented technological sensors and software are now tightly integrated with our everyday lives. Many sensor processing algorithms have incorporated some forms of computational intelligence as part of their core framework in problem solving. These algorithms have the capacity to generalize and discover knowledge for themselves and learn new information whenever unseen data are captured. The primary aim of sensor processing is to develop techniques to interpret, understand, and act on information contained in the data. The interest of this book is in developing intelligent signal processing in order to pave the way for smart sensors. This involves mathematical advancement of nonlinear signal processing theory and its applications that extend far beyond traditional techniques. It bridges the boundary between theory and application, developing novel theoretically inspired methodologies targeting both longstanding and emergent signal processing applications. The topics range from phishing detection to integration of terrestrial laser scanning, and from fault diagnosis to bio-inspired filtering. The book will appeal to established practitioners, along with researchers and students in the emerging field of smart sensor processing.

    River bed sediment surface characterisation using wavelet transform-based methods.

    The primary purpose of this work was to study the morphological change of river-bed sediment surfaces over time using wavelet transform analysis techniques. The wavelet transform is a rapidly developing area of applied mathematics in both science and engineering. As it allows for interrogation of the spectral make-up of local signal features, it has superior performance compared to the traditionally used Fourier transform, which provides only signal-averaged spectral information. The main study of this thesis includes the analysis of both synthetically generated sediment surfaces and laboratory experimental sediment bed-surface data. This was undertaken using two-dimensional wavelet transform techniques based on both the discrete and the stationary wavelet transforms. A comprehensive database of surface scans from experimental river-bed sediment surface topographies was included in the study. A novel wavelet-based characterisation measure, the form size distribution (fsd), was developed to quantify the global characteristics of the sediment data. The fsd is based on the distribution of wavelet-based scale-dependent energies. It is argued that this measure will potentially be more useful than the traditionally used particle size distribution (psd), as it is the morphology of the surface, rather than the individual particle sizes, that affects the near-bed flow regime and hence bed friction characteristics. Amplitude- and scale-dependent thresholding techniques were then studied. It was found that these thresholding techniques could be used to: (1) extract the overall surface structure, and (2) enhance dominant grains and formations of dominant grains within the surfaces. It is shown that assessment of the surface data-sets post-thresholding may allow for the detection of structural changes over time.
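A one-dimensional, stdlib-only analogue of the two computations the abstract describes may clarify them (illustrative only; the thesis works with two-dimensional discrete and stationary transforms, and the function names here are hypothetical): scale-dependent wavelet energies of the kind underlying the fsd, and hard thresholding of detail coefficients.

```python
def multilevel_haar(signal, levels):
    """Repeated one-level Haar splits.

    Returns the final approximation and the detail coefficients at each
    scale, finest scale first. Signal length must divide evenly.
    """
    approx, details = list(signal), []
    for _ in range(levels):
        pairs = range(0, len(approx), 2)
        details.append([(approx[i] - approx[i + 1]) / 2 for i in pairs])
        approx = [(approx[i] + approx[i + 1]) / 2 for i in pairs]
    return approx, details

def scale_energies(details):
    """Total squared detail energy at each scale (the basis of an fsd-like
    distribution: which spatial scales carry the surface's structure)."""
    return [sum(c * c for c in d) for d in details]

def hard_threshold(coeffs, t):
    """Keep coefficients whose magnitude reaches t; zero the rest."""
    return [c if abs(c) >= t else 0.0 for c in coeffs]

sig = [2, 4, 6, 8, 10, 12, 14, 16]      # toy 1-D "surface profile"
_, details = multilevel_haar(sig, 3)
print(scale_energies(details))           # [4.0, 8.0, 16.0]: energy grows with scale
print(hard_threshold(details[0], 2.0))   # [0.0, 0.0, 0.0, 0.0]: fine detail suppressed
```

Thresholding the fine-scale details while keeping the coarse ones is, in this toy setting, the mechanism behind extracting overall surface structure versus enhancing dominant grains.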

    Contributions of machine learning techniques to cardiology. Prediction of restenosis after coronary stent implantation

    Background: Few current topics rival the possibility of today's technology developing the same capabilities as a human being, even in medicine. This capacity of machines or computer systems to simulate human intelligence processes is what we now know as artificial intelligence. One of the fields of artificial intelligence with the greatest application in medicine today is prediction, recommendation and diagnosis, where machine learning techniques are applied. There is likewise growing interest in precision medicine, where machine learning techniques can offer individualized medical care to each patient. Percutaneous coronary intervention (PCI) with stenting has become standard practice in the revascularization of coronary vessels with significant obstructive atherosclerotic disease. PCI is also the gold-standard treatment in patients with acute myocardial infarction, reducing rates of death and recurrent ischaemia compared with medical treatment. The long-term success of the procedure is limited by in-stent restenosis, a pathological process that causes recurrent arterial narrowing at the PCI site. Identifying which patients will develop restenosis is an important clinical challenge, since it may present as a new acute myocardial infarction or force a new revascularization of the affected vessel, and in cases of recurrent restenosis it represents a therapeutic challenge. Objectives: After reviewing artificial intelligence techniques applied to medicine and, in greater depth, machine learning techniques applied to cardiology, the main objective of this doctoral thesis was to develop a machine learning model to predict the occurrence of restenosis in patients with acute myocardial infarction undergoing PCI with stent implantation. 
Secondary objectives were to compare the machine learning model with the classic restenosis risk scores used to date, and to develop software that easily brings this contribution into daily clinical practice. To develop an easily applicable model, we made our predictions without any variables beyond those obtained in routine practice. Material: The dataset, obtained from the GRACIA-3 trial, consisted of 263 patients with demographic, clinical and angiographic characteristics; 23 of them presented restenosis 12 months after stent implantation. All development was carried out in Python using cloud computing, specifically AWS (Amazon Web Services). Methods: A methodology for working with small, imbalanced datasets was used; the nested cross-validation scheme was important, as was the use of precision-recall (PR) curves, in addition to ROC curves, to interpret the models. The algorithms most common in the literature were trained in order to choose the one with the best performance. Results: The best-performing model was built with an extremely randomized trees classifier, which significantly outperformed (0.77 area under the ROC curve) the three classic clinical scores: PRESTO-1 (0.58), PRESTO-2 (0.58) and TLR (0.62). The precision-recall curves offered a more accurate picture of the extremely randomized trees model's performance, showing an efficient algorithm (0.96) for non-restenosis, with high precision and high recall. At a threshold considered optimal, out of 1,000 patients undergoing stent implantation, our machine learning model would correctly predict 181 (18%) more cases than the best classic risk score (TLR). 
The most important variables, ranked by their contribution to the predictions, were diabetes, coronary disease in two or more vessels, post-PCI TIMI flow, abnormal platelets, post-PCI thrombus and abnormal cholesterol. Finally, a calculator was developed to bring the model into clinical practice. The calculator estimates each patient's individual risk and places the patient in a risk zone, helping the physician decide on the appropriate follow-up. Conclusions: Applied immediately after stent implantation, a machine learning model differentiates patients who will or will not develop restenosis better than the current classic discriminators.
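The thesis's models were trained with standard Python tooling on the GRACIA-3 data; as a hedged, stdlib-only illustration of the quantity behind the precision-recall curves it relies on (the function name and the toy data below are hypothetical, not from the thesis), precision and recall at a single decision threshold can be computed as:

```python
def precision_recall(scores, labels, threshold):
    """Precision and recall when predicting restenosis for scores >= threshold.

    On imbalanced data (23 events among 263 patients in GRACIA-3), these
    convey more than accuracy alone, which a trivial "never restenosis"
    model would nearly maximize.
    """
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    precision = tp / (tp + fp) if tp + fp else 1.0  # of predicted events, how many are real
    recall = tp / (tp + fn) if tp + fn else 0.0     # of real events, how many are caught
    return precision, recall

# Toy example: three true restenosis cases (label 1) among six patients.
scores = [0.9, 0.8, 0.7, 0.3, 0.2, 0.1]
labels = [1, 1, 0, 1, 0, 0]
p, r = precision_recall(scores, labels, 0.5)
print(round(p, 2), round(r, 2))  # 0.67 0.67
```

Sweeping the threshold from high to low and plotting each (recall, precision) pair traces the PR curve used to choose the operating point described in the abstract.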

    Self-Similar Vector Fields

    We propose statistically self-similar and rotation-invariant models for vector fields, study some of the more significant properties of these models, and suggest algorithms and methods for reconstructing vector fields from numerical observations, using the same notions of self-similarity and invariance that give rise to our stochastic models. We illustrate the efficacy of the proposed schemes by applying them to the problems of denoising synthetic flow phantoms and enhancing flow-sensitive magnetic resonance imaging (MRI) of blood flow in the aorta. In constructing our models and devising our applied schemes and algorithms, we rely on two fundamental notions. The first of these, referred to as "innovation modelling" in the thesis, is the principle, applicable both analytically and synthetically, of reducing complex phenomena to combinations of simple independent components or "innovations". The second fundamental idea is that of "invariance", which indicates that in the absence of any distinguishing factor, two equally valid models or solutions should be given equal consideration.