11 research outputs found

    Novel neural approaches to data topology analysis and telemedicine

    The abstract is in the attachment (Randazzo, Vincenz)

    Induction Machine Stator Fault Tracking using the Growing Curvilinear Component Analysis

    Detection of stator-based faults in induction machines (IMs) can be carried out in numerous ways. In particular, shorted turns in the stator windings of an IM are among the most common faults in industry. As a matter of fact, most IMs come with pre-installed current sensors for control and protection purposes. To this aim, using only the stator current for fault detection has become a recent trend, as it is much cheaper than installing additional sensors. In this study, the three-phase stator current signatures have been used to observe the effect of a stator inter-turn fault with respect to the healthy condition of the IM. The healthy and faulty current signatures have been pre-processed via the built-in DSP module of dSPACE, after which they are passed to the MATLAB® software for further analysis using AI techniques. The authors present a Growing Curvilinear Component Analysis (GCCA) neural network that is capable of detecting and following the evolution of the stator fault using the stator current signature, making online fault detection possible. For this purpose, a topological manifold analysis is carried out to study the fault evolution, which is a fundamental step for calibrating the GCCA neural network. The effectiveness of the proposed method has been verified experimentally.
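
    The kind of current-signature pre-processing described above can be illustrated with a short sketch that computes common statistical time-domain features per window; the window length and the particular features chosen are assumptions for illustration, not the authors' exact pipeline.

```python
import numpy as np

def time_domain_features(current, window=1000):
    """Split a stator-current signature into fixed-length windows and
    compute simple statistical time-domain features per window
    (illustrative choice of features, not the paper's exact set)."""
    n = len(current) // window
    feats = []
    for k in range(n):
        w = current[k * window:(k + 1) * window]
        rms = np.sqrt(np.mean(w ** 2))
        peak = np.max(np.abs(w))
        crest = peak / rms                                   # crest factor
        kurt = np.mean((w - w.mean()) ** 4) / np.var(w) ** 2  # kurtosis
        feats.append([rms, peak, crest, kurt])
    return np.array(feats)          # shape: (n_windows, 4)

# toy example: a 50 Hz sinusoid sampled at 10 kHz for 1 s
t = np.arange(0, 1, 1e-4)
ia = np.sin(2 * np.pi * 50 * t)
print(time_domain_features(ia).shape)   # (10, 4)
```

Each row of the returned matrix is one feature vector; sequences of such vectors are the kind of input a growing neural network can track over time.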

    Tracking Evolution of Stator-based Fault in Induction Machines using the Growing Curvilinear Component Analysis Neural Network

    Stator-based faults are among the most common faults in induction motors (IMs). The conventional approach to IM control and protection employs current sensors installed on the motor. Recently, most studies have focused on fault detection by means of the stator current. This paper presents an application of the Growing Curvilinear Component Analysis (GCCA) neural network, aided by the Extended Park Vector Approach (EPVA) for transforming the three-phase current signals. The GCCA is a growing neural-based technique specifically designed to detect and follow changes in the input distribution, e.g. stator faults. In particular, the GCCA has proven its capability of correctly identifying and tracking stator inter-turn faults in IMs. To this purpose, the three-phase stator currents have been acquired from IMs that start in a healthy operating state and evolve to different fault severities (up to 10%) under different loading conditions. Data have been transformed using the EPVA and pre-processed to extract statistical time-domain features. To calibrate the GCCA neural network, a topological manifold analysis has been carried out to study the input features. The efficacy of the proposed method has been verified experimentally using an IM with a 1.1 kW rating, and the approach has potential for IMs with different manufacturing conditions.
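
    The EPVA transformation mentioned above is based on the classical Park vector: the three phase currents are mapped to a two-dimensional (d, q) frame and the vector modulus is taken. A minimal sketch (toy signal values, not the paper's measurements):

```python
import numpy as np

def park_vector_modulus(ia, ib, ic):
    """Extended Park Vector Approach (EPVA): map three-phase currents
    to the Park (d, q) frame and return the Park vector modulus.
    For a healthy, balanced machine the modulus is nearly constant;
    a stator fault introduces a ripple at twice the supply frequency."""
    i_d = np.sqrt(2 / 3) * ia - ib / np.sqrt(6) - ic / np.sqrt(6)
    i_q = (ib - ic) / np.sqrt(2)
    return np.sqrt(i_d ** 2 + i_q ** 2)

# toy balanced three-phase currents at 50 Hz
t = np.arange(0, 0.1, 1e-4)
ia = np.cos(2 * np.pi * 50 * t)
ib = np.cos(2 * np.pi * 50 * t - 2 * np.pi / 3)
ic = np.cos(2 * np.pi * 50 * t + 2 * np.pi / 3)
mod = park_vector_modulus(ia, ib, ic)
print(mod.std())   # ~0 for a balanced (healthy) system
```

Deviations of this modulus from a constant are exactly the kind of change in the input distribution that the GCCA is designed to follow.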

    Shallow Neural Network for Biometrics from the ECG-WATCH

    Applications such as surveillance, banking, and healthcare deal with sensitive data whose confidentiality and integrity depend on accurate human recognition. In this sense, the crucial mechanism for performing effective access control is authentication, which unequivocally yields the user's identity. In 2018, in North America alone, around 445K identity thefts were reported. The most widely adopted strategy for automatic identity recognition uses a secret for encrypting and decrypting the authentication information. This approach works very well as long as the secret is kept safe. Electrocardiograms (ECGs) can be exploited for biometric purposes because both the physiological and geometrical differences in each human heart correspond to uniqueness in the ECG morphology. Compared with classical biometric techniques, e.g. fingerprints, ECG-based methods can be considered a more reliable and safer way for user authentication due to the ECG's inherent robustness to circumvention, obfuscation, and replay attacks. In this paper, the ECG WATCH, an inexpensive wristwatch for recording ECGs anytime, anywhere, in just 10 s, is proposed for user authentication. The ECG WATCH acquisitions have been used to train a shallow neural network, which has reached 99% classification accuracy and a 100% intruder recognition rate.
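
    A "shallow" network of the kind this entry describes can be illustrated with a minimal one-hidden-layer classifier. The data below are a synthetic stand-in for per-user ECG feature vectors; the real ECG WATCH features, network sizes, and training setup are not reproduced here, so every number is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic stand-in for per-user ECG feature vectors:
# 3 "users", 50 samples each, 2-D Gaussian clusters
X = np.vstack([rng.normal(m, 0.3, (50, 2)) for m in ([0, 0], [3, 0], [0, 3])])
y = np.repeat([0, 1, 2], 50)
onehot = np.eye(3)[y]

# one hidden layer + softmax output: a shallow network
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 3)); b2 = np.zeros(3)

for _ in range(1000):                      # plain batch gradient descent
    H = np.tanh(X @ W1 + b1)               # hidden activations
    Z = H @ W2 + b2
    P = np.exp(Z - Z.max(1, keepdims=True))
    P /= P.sum(1, keepdims=True)           # softmax probabilities
    G = (P - onehot) / len(X)              # softmax cross-entropy gradient
    dW2 = H.T @ G; db2 = G.sum(0)
    dH = (G @ W2.T) * (1 - H ** 2)         # backprop through tanh
    dW1 = X.T @ dH; db1 = dH.sum(0)
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= 1.0 * g                       # in-place parameter update

acc = (P.argmax(1) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

On such well-separated toy clusters the network fits the training set essentially perfectly; the paper's reported 99% accuracy refers to its own ECG data, not to this sketch.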

    Gaining deep knowledge of Android malware families through dimensionality reduction techniques

    This research proposes the analysis and subsequent characterisation of Android malware families by means of low-dimensional visualisations using dimensionality reduction techniques. The well-known Malgenome data set, coming from the Android Malware Genome Project, has been thoroughly analysed through the following six dimensionality reduction techniques: Principal Component Analysis, Maximum Likelihood Hebbian Learning, Cooperative Maximum Likelihood Hebbian Learning, Curvilinear Component Analysis, Isomap, and Self-Organizing Map. The results obtained enable a clear visual analysis of the structure of this high-dimensional data set, letting us gain deep knowledge about the nature of such Android malware families. Interesting conclusions are drawn from the real-life data set under analysis.
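
    Of the six techniques listed, Principal Component Analysis is the simplest to sketch: centre the data and project onto the top singular vectors. The matrix below is a synthetic stand-in for the Malgenome feature matrix, which is not reproduced here.

```python
import numpy as np

def pca_project(X, k=2):
    """Project data onto its first k principal components via SVD
    (the classical linear technique among the six listed above)."""
    Xc = X - X.mean(axis=0)                  # centre each feature
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                     # k-dimensional scores

# toy stand-in for a high-dimensional malware feature matrix:
# 100 samples of intrinsically rank-3 data embedded in 20 dimensions
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3)) @ rng.normal(size=(3, 20))
Z = pca_project(X, k=2)
print(Z.shape)                               # (100, 2)
```

The 2-D scores `Z` are what gets plotted for visual analysis; the non-linear methods in the list (Isomap, CCA, SOM) replace the linear projection step with manifold- or topology-preserving mappings.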


    Analysis and detection of computer attacks using intelligent dimensionality reduction systems

    Programa Oficial de Doutoramento en Enerxía e Propulsión Mariña. 5014P01. This research work addresses the study and development of a methodology for the detection of computer attacks using intelligent dimensionality reduction systems and techniques in the field of cybersecurity. The proposal divides the problem into two phases. The first consists of a dimensionality reduction of the original input space, projecting the data onto a lower-dimensional output space using linear or non-linear transformations that allow a better visualization of the internal structure of the dataset. In the second phase, a human expert contributes his or her knowledge by labelling the samples on the basis of the projections obtained and of experience with the problem. This novel proposal makes a simple tool available to the end user and provides intuitive, easily interpretable results, making it possible to face new threats to which the user has not yet been exposed; highly satisfactory results have been obtained in all the real cases in which it has been applied. The developed system has been validated on three different real case studies, each building on the knowledge gained in the previous one. In the first case, a well-known Android malware dataset is analysed and the various malware families are characterised using classical dimensionality reduction techniques. The second proposal works on the same data set, but applies more advanced, emerging dimensionality reduction and visualization techniques, significantly improving the results. The last work takes advantage of the knowledge of the two previous works and applies it to intrusion detection in computer systems on network data, in which attacks of various kinds occur during normal network operation.

    Beta hebbian learning: definition and analysis of a new family of learning rules for exploratory projection pursuit

    This thesis comprises an investigation into the derivation of learning rules for artificial neural networks from probabilistic criteria. • Beta Hebbian Learning (BHL). First, a new family of learning rules is derived, based on maximising the likelihood of the residual of a negative feedback network when that residual is deemed to come from the Beta distribution. The resulting algorithm, called Beta Hebbian Learning, outperforms current neural algorithms for Exploratory Projection Pursuit. • Beta-Scale Invariant Map (Beta-SIM). Secondly, Beta Hebbian Learning is applied to a well-known Topology Preserving Map algorithm, the Scale Invariant Map (SIM), to design a new version of it called the Beta-Scale Invariant Map (Beta-SIM). It is developed to facilitate the effective and efficient clustering and visualization of the internal structure of high-dimensional complex datasets, especially those characterized by an internal radial distribution. The behaviour of Beta-SIM is thoroughly analysed by comparing its results, in terms of performance quality measures, with other well-known topology preserving models. • Weighted Voting Superposition Beta-Scale Invariant Map (WeVoS-Beta-SIM). Finally, the use of ensembles such as the Weighted Voting Superposition (WeVoS) is tested on the novel Beta-SIM algorithm, in order to improve its stability and to generate accurate topology maps for complex datasets. The resulting WeVoS-Beta-Scale Invariant Map (WeVoS-Beta-SIM) is presented, analysed, and compared with other well-known topology preserving models. All algorithms have been successfully tested on different artificial datasets to corroborate their properties, as well as on highly complex real datasets.
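
    The negative feedback network on which BHL is built can be sketched as follows. Note that the update shown is the classical Hebbian rule for such networks, used here only to illustrate the architecture; the Beta-distribution-derived BHL rule itself is not reproduced, and all data and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_negative_feedback(X, n_out=2, eta=0.01, epochs=200):
    """Negative feedback network: y = W x (feedforward activation),
    e = x - W.T y (residual after feedback), dW = eta * y e^T (Hebbian
    learning on the residual). BHL replaces this update with one derived
    by maximising the likelihood of a Beta-distributed residual."""
    n_in = X.shape[1]
    W = rng.normal(0, 0.1, (n_out, n_in))
    for _ in range(epochs):
        for x in X:
            y = W @ x                   # feedforward activation
            e = x - W.T @ y             # residual (negative feedback)
            W += eta * np.outer(y, e)   # Hebbian update on the residual
    return W

# toy 5-dimensional data; the thesis works with real high-dimensional sets
X = rng.normal(size=(200, 5))
W = train_negative_feedback(X)
print(W.shape)   # (2, 5)
```

With the plain Hebbian rule the rows of `W` converge to an orthonormal basis of a principal subspace of the data; the Beta-based rule steers the projections toward interesting (non-Gaussian) directions instead.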

    Contributions to the multivariate Analysis of Marine Environmental Monitoring

    The thesis starts from the view that statistics begins with data, and opens by introducing the data sets studied: marine benthic species counts and chemical measurements made at a set of sites in the Norwegian Ekofisk oil field, with replicates and annually repeated. An introductory chapter details the sampling procedure and shows with reliability calculations that the (transformed) chemical variables have excellent reliability, whereas the biological variables have poor reliability, except for a small subset of abundant species. The transformed chemical variables are shown to be approximately normal. Bootstrap methods are used to assess whether the biological variables follow a Poisson distribution, and lead to the conclusion that the Poisson distribution must be rejected, except for rare species. A separate chapter details further work on the distribution of the species variables: truncated and zero-inflated Poisson distributions as well as Poisson mixtures are used to account for sparseness and overdispersion. Species are thought to respond to environmental variables, and regressions of the abundance of a few selected species onto chemical variables are reported. For rare species, logistic regression and Poisson regression are the tools considered, though there are problems of overdispersion. For abundant species, random coefficient models are needed to cope with intraclass correlation. The environmental variables, mainly heavy metals, are highly correlated, leading to multicollinearity problems. The next chapters use a multivariate approach, where all species data are treated simultaneously. The theory of correspondence analysis is reviewed, and some theoretical results on this method are reported (bounds for singular values, centring matrices).
An applied chapter discusses the correspondence analysis of the species data in detail, detects outliers, addresses stability issues, and considers different ways of stacking data matrices to obtain an integrated analysis of several years of data, and to decompose variation into a within-sites and between-sites component. More than 40 % of the total inertia is due to variation within stations. Principal components analysis is used to analyse the set of chemical variables. Attempts are made to integrate the analysis of the biological and chemical variables. A detailed theoretical development shows how continuous variables can be mapped in an optimal manner as supplementary vectors into a correspondence analysis biplot. Geometrical properties are worked out in detail, and measures for the quality of the display are given, whereas artificial data and data from the monitoring survey are used to illustrate the theory developed. The theory of display of supplementary variables in biplots is also worked out in detail for principal component analysis, with attention for the different types of scaling, and optimality of displayed correlations. A theoretical chapter follows that gives an in depth theoretical treatment of canonical correspondence analysis, (linearly constrained correspondence analysis, CCA for short) detailing many mathematical properties and aspects of this multivariate method, such as geometrical properties, biplots, use of generalized inverses, relationships with other methods, etc. Some applications of CCA to the survey data are dealt with in a separate chapter, with their interpretation and indication of the quality of the display of the different matrices involved in the analysis. Weighted principal component analysis of weighted averages is proposed as an alternative for CCA. 
    This leads to a better display of the weighted averages of the species and, in the cases studied so far, also to biplots with a higher amount of explained variance for the environmental data. The thesis closes with a bibliography and outlines some suggestions for further research, such as the generalization of canonical correlation analysis for working with singular covariance matrices, the use of partial least squares methods to account for the excess of predictors, and data fusion problems to estimate missing biological data.
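
    The correspondence analysis at the core of the thesis can be sketched as an SVD of the standardized residuals of a contingency table. The sites-by-species table below is illustrative toy data, not the Ekofisk monitoring data.

```python
import numpy as np

def correspondence_analysis(N, k=2):
    """Simple correspondence analysis of a contingency table N via SVD
    of the matrix of standardized residuals (an illustrative sketch)."""
    P = N / N.sum()                        # correspondence matrix
    r = P.sum(axis=1)                      # row masses
    c = P.sum(axis=0)                      # column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    rows = (U[:, :k] * s[:k]) / np.sqrt(r)[:, None]   # principal row coords
    cols = (Vt[:k].T * s[:k]) / np.sqrt(c)[:, None]   # principal col coords
    inertia = s ** 2                       # principal inertias
    return rows, cols, inertia

# toy sites-by-species abundance table (illustrative values only)
N = np.array([[30., 10.,  5.],
              [10., 40.,  8.],
              [ 2.,  6., 50.]])
rows, cols, inertia = correspondence_analysis(N)
print(inertia.sum())   # total inertia = Pearson chi-square statistic / n
```

The row and column coordinates are what gets plotted in the biplots discussed above; constrained variants such as CCA restrict the row scores to linear combinations of environmental variables.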

    Livestock Production

    Innumerable publications on livestock production are available on the world market. The book under discussion has not been produced to burden the market with yet another such publication; rather, it has been brought out in a novel format to meet the requirements of students and researchers who are working in different parts of the world, in different environments.