51 research outputs found

    Prosody in text-to-speech synthesis using fuzzy logic

    For over a thousand years, inventors, scientists and researchers have tried to reproduce human speech. Today, the quality of synthesized speech still falls short of real speech. Most research on speech synthesis focuses on improving the quality of the speech produced by Text-to-Speech (TTS) systems. The best TTS systems use unit-selection-based concatenation to synthesize speech. However, this method is time-consuming and requires a very large speech database. Diphone-concatenation synthesis requires less memory, but sounds robotic. This thesis explores the use of fuzzy logic to make diphone-concatenated speech sound more natural. A TTS system is built using both neural networks and fuzzy logic: text is converted into phonemes using neural networks, and fuzzy logic is used to control the fundamental frequency for three types of sentences. In conclusion, the fuzzy system produces f0 contours that make the diphone-concatenated speech sound more natural.
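    A fuzzy controller of this kind can be sketched with a few membership functions and rules. The sketch below is illustrative only (the rule values and function names are assumptions, not taken from the thesis): it fuzzifies the position of a syllable within a declarative sentence and defuzzifies to an f0 offset, producing the falling contour typical of declaratives.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def f0_contour(position, base_f0=120.0):
    """Fuzzy estimate of f0 (Hz) for a declarative sentence.
    position runs from 0.0 (sentence start) to 1.0 (sentence end).
    Rule offsets are illustrative, not the thesis's actual values."""
    # Fuzzify the sentence position into three overlapping sets
    early = tri(position, -0.5, 0.0, 0.5)
    mid = tri(position, 0.0, 0.5, 1.0)
    late = tri(position, 0.5, 1.0, 1.5)
    # Rules: declaratives start high and fall toward the end
    rules = [(early, +20.0), (mid, 0.0), (late, -25.0)]
    # Defuzzify by weighted average of rule outputs
    num = sum(w * off for w, off in rules)
    den = sum(w for w, _ in rules)
    return base_f0 + (num / den if den else 0.0)
```

    Evaluated across a sentence, this yields a smooth fall from about 140 Hz at the start to 95 Hz at the end, which is the kind of contour the thesis imposes on diphone-concatenated speech.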

    Cogitator : a parallel, fuzzy, database-driven expert system

    The quest to build anthropomorphic machines has led researchers to focus on knowledge and its manipulation. The expert system was proposed as a solution and works well in small, well-understood domains. However, these initial attempts highlighted the tedious process of building systems that display intelligence, most notably the Knowledge Acquisition Bottleneck. Attempts to circumvent this problem have led researchers to propose machine learning over databases as a source of knowledge, which in turn has led to the development of Database-Driven Expert Systems. Furthermore, it has been ascertained that powerful computation is a requisite for intelligent systems. In response to these problems and proposals, a new type of database-driven expert system, Cogitator, is proposed. It is shown to circumvent the Knowledge Acquisition Bottleneck and to possess many other advantages over both traditional expert systems and connectionist systems, while having no serious disadvantages.

    Novel Processing and Transmission Techniques Leveraging Edge Computing for Smart Health Systems

    The abstract is in the attachment.

    Cellular Automata

    Modelling and simulation are disciplines of major importance for science and engineering. There is no science without models, and simulation has become a very useful, sometimes indispensable, tool for the development of both science and engineering. The main attraction of cellular automata is that, in spite of their conceptual simplicity, which makes them easy to implement in computer simulations and, in principle, amenable to detailed and complete mathematical analysis, they are able to exhibit a wide variety of amazingly complex behaviour. This feature has attracted researchers from a wide variety of fields in the exact sciences and engineering, but also from the social sciences and sometimes beyond. The collective complex behaviour of numerous systems, which emerges from the interaction of a multitude of simple individuals, is being conveniently modelled and simulated with cellular automata for very different purposes. In this book, a number of innovative applications of cellular automata models in the fields of Quantum Computing, Materials Science, Cryptography and Coding, and Robotics and Image Processing are presented.
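    The conceptual simplicity the abstract refers to is easy to see in code. The sketch below implements one synchronous update of elementary cellular automaton Rule 110 on a ring of cells (a standard example, not one drawn from the book itself); despite the handful of lines, iterating this rule is known to produce Turing-complete behaviour.

```python
def step_rule110(cells):
    """One synchronous update of elementary CA Rule 110 on a ring of 0/1 cells.
    Rule 110 maps each 3-cell neighbourhood to the next state of its centre:
    binary 01101110 = decimal 110, read over neighbourhoods 111..000."""
    n = len(cells)
    rule = {(1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
            (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0}
    # Periodic boundary: neighbours wrap around the ends of the row
    return [rule[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
            for i in range(n)]
```

    Starting from a single live cell and applying `step_rule110` repeatedly produces the characteristic growing triangular patterns in which complex structures emerge.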

    Multi-agent and knowledge-based system for power transformer fault diagnosis

    Transformer reliability and stability are key concerns. To increase efficiency, automatic monitoring and fault diagnosis of power transformers are required. Dissolved Gas Analysis (DGA) is one of the most important tools for diagnosing the condition of oil-immersed transformers. Agent technology is a new, robust and helpful technique that has been successfully applied in various applications. Integrating a Multi-Agent System (MAS) with a knowledge base provides a robust platform for tasks such as fault diagnosis and automated action execution. To this end, the present study developed a MAS based on the Gaia methodology together with a knowledge base. The developed MAS, following the Gaia methodology, represents a generic framework capable of managing agent execution and message delivery. Real-time data is sampled from a power transformer and saved into a database, and it is also available to the user on request. Three types of knowledge-based systems, namely rule-based reasoning, ontology and fuzzy ontology, were applied within the MAS. The developed MAS is thus shown to be successfully applicable to condition monitoring of power transformers using real-time data. Rogers' method was used with all of the knowledge-based systems named above, and the accuracy of the results was compared and discussed. Of the knowledge-based systems studied, fuzzy ontology performed best in terms of accuracy, compared to rule-based reasoning and ontology: applying the developed fuzzy ontology improved the accuracy by over 22%. Unlike previous works in this field, which could not deal with uncertain situations, the present work based on fuzzy ontology has the clear advantage of handling problems that involve some degree of uncertainty. This is especially important, as most real-world situations involve some uncertainty.
Overall, the work contributes the use of a knowledge base and a multi-agent system for fault diagnosis of power transformers, including the novel application of fuzzy ontology for dealing with uncertain situations. The advantages of the proposed method are ease of upgrade, flexibility, efficient fault diagnosis and reliability. The proposed technique would benefit power system reliability by reducing the number of engineering experts required, lowering maintenance expenses and extending transformer lifetime.
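The crisp rule-based baseline that the fuzzy ontology improves on can be sketched as follows. Rogers' method classifies faults from ratios of dissolved gases; the thresholds and labels below are a simplified illustration of the ratio idea, not the thesis's actual rule set or the full IEEE/IEC tables.

```python
def rogers_diagnosis(h2, ch4, c2h6, c2h4, c2h2):
    """Simplified Rogers-style ratio diagnosis for DGA.
    Gas concentrations are in ppm; thresholds are illustrative only."""
    r1 = ch4 / h2 if h2 else float("inf")   # CH4 / H2
    r2 = c2h2 / c2h4 if c2h4 else 0.0       # C2H2 / C2H4
    r3 = c2h4 / c2h6 if c2h6 else 0.0       # C2H4 / C2H6
    if r2 < 0.1 and 0.1 <= r1 < 1.0 and r3 < 1.0:
        return "normal ageing"
    if r2 < 0.1 and r1 < 0.1:
        return "partial discharge"
    if r2 > 3.0:  # high acetylene relative to ethylene indicates arcing
        return "high-energy discharge (arcing)"
    if r2 < 0.1 and r3 >= 3.0:
        return "high-temperature thermal fault"
    return "indeterminate"
```

A crisp classifier like this returns "indeterminate" whenever a ratio falls just outside a threshold; replacing the hard cutoffs with fuzzy membership degrees, as the fuzzy ontology does, is what lets the system handle such borderline, uncertain cases.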

    Robust damage detection in smart structures

    This thesis presents some novel techniques in Structural Health Monitoring (SHM). SHM is a developing field that monitors structures to make sure they remain in their desired condition and to avoid catastrophe. SHM spans several levels, from damage detection to prognosis; this work is dedicated to the first level, which may be considered the most important one. The new techniques presented here are based on different statistical and signal processing methods, such as Principal Component Analysis (PCA) and its robust counterpart, the Wavelet Transform, fuzzy similarity, Andrews plots, etc. These techniques are applied to propagated waves that are activated and captured in the structure using appropriate transducers. Piezoceramic (PZT) devices are chosen in this work to capture the signals due to their special characteristics, such as high performance, low energy consumption and reasonable price. To guarantee the efficiency of the suggested techniques, they are tested on different laboratory and real-scale benchmarks, such as aluminium and composite plates, a fuselage, a wing skeleton, a tube, etc. Because of this variety of benchmarks, the thesis is titled damage detection in smart structures, and the variety demonstrates the applicability of the proposed methods to different fields, such as the aerospace and gas/oil industries. In addition to normal laboratory conditions, it is shown in this work that environmental changes can significantly affect damage detection and wave propagation, so there is a vital need to consider their effect. Temperature change is chosen here, as it is one of the main environmental fluctuation factors. To scrutinize its effect on damage detection, the effect of temperature on wave propagation is first considered, and then all the proposed methods are tested to check whether they are sensitive to temperature change.
Finally, a temperature compensation method is applied to ensure that the proposed methods remain stable and robust even when structures are subjected to varying environmental conditions.
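The PCA-based detection idea the abstract mentions can be sketched as a novelty detector: fit a PCA model on signals from the healthy structure, then flag new measurements whose reconstruction error (the Q statistic) is large. This is a generic illustration of the technique, not the thesis's actual pipeline; the function names are assumptions.

```python
import numpy as np

def pca_detector(baseline, n_components=2):
    """Fit a PCA model on healthy-state feature vectors (rows of `baseline`)
    and return a scorer giving the squared reconstruction error (Q statistic)
    of a new measurement. Large Q suggests the measurement does not fit the
    healthy-state model, i.e. possible damage."""
    mu = baseline.mean(axis=0)
    X = baseline - mu
    # Principal directions from the SVD of the centred baseline data
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    P = Vt[:n_components].T  # loading matrix (features x components)

    def q_stat(x):
        centred = x - mu
        residual = centred - P @ (P.T @ centred)  # part outside the PCA model
        return float(residual @ residual)

    return q_stat
```

In practice the baseline rows would be features extracted from PZT wave signals, and a damage threshold on Q would be set from the baseline distribution; temperature compensation then amounts to keeping this baseline model valid as conditions drift.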

    Emergent Design

    Explorations in Systems Phenomenology in Relation to Ontology, Hermeneutics and the Meta-dialectics of Design SYNOPSIS A Phenomenological Analysis of Emergent Design is performed based on the foundations of General Schemas Theory. The concept of Sign Engineering is explored in terms of Hermeneutics, Dialectics, and Ontology in order to define Emergent Systems and Metasystems Engineering based on the concept of Meta-dialectics. ABSTRACT Phenomenology, Ontology, Hermeneutics, and Dialectics will dominate our inquiry into the nature of the Emergent Design of the System and its inverse dual, the Meta-system. This is a speculative dissertation that attempts to produce a philosophical, mathematical, and theoretical view of the nature of Systems Engineering Design. Emergent System Design, i.e., the design of as-yet-unheard-of and/or hitherto non-existent Systems and Metasystems, is the focus. This study is a frontal assault on the hard problem of explaining how Engineering produces new things, rather than a repetition or reordering of concepts that already exist. In this work the philosophies of E. Husserl, A. Gurwitsch, M. Heidegger, J. Derrida, G. Deleuze, A. Badiou, G. Hegel, I. Kant and other Continental Philosophers are brought to bear on different aspects of how new technological systems come into existence through the midwifery of Systems Engineering. Sign Engineering is singled out as the most important aspect of Systems Engineering. We build on the work of Pieter Wisse and extend his theory of Sign Engineering to define Meta-dialectics in the form of Quadralectics and then Pentalectics. Along the way the various ontological levels of Being are explored, in conjunction with the discovery that the Quadralectic is related to the possibility of design primarily at the Third Meta-level of Being, called Hyper Being. The Design Process is dependent upon the emergent possibilities that appear in Hyper Being.
Hyper Being, termed by Heidegger Being (crossed out) and by Derrida Differance, also appears as the widest space within the Design Field at the third meta-level of Being, and therefore provides the leverage needed to produce emergent effects. Hyper Being is where possibilities appear within our worldview. Possibility is necessary for emergent events to occur. Hyper Being possibilities are extended by Wild Being propensities to allow the embodiment of new things. We discuss how this philosophical background relates to meta-methods such as the Gurevich Abstract State Machine and the Wisse Metapattern methods, as well as real-time architectural design methods as described in the Integral Software Engineering Methodology (ISEM). One aim of this research is to find the foundation for extending the ISEM methodology into a general-purpose Systems Design Methodology. Our purpose is also to bring these philosophical considerations into the practical realm by examining P. Bourdieu's ideas on the relationship between theoretical and practical reason and M. de Certeau's ideas on practice. The relationship between design and implementation is seen in terms of the Set/Mass conceptual opposition. General Schemas Theory is used as a way of critiquing the dependence on Set-based mathematics as a basis for Design. The dissertation delineates a new foundation for Systems Engineering as Emergent Engineering based on General Schemas Theory, and provides an advanced theory of Design based on the understanding of the meta-levels of Being, particularly focusing upon the relationship between Hyper Being and Wild Being in the context of Pure and Process Being.

    Automatic construction and updating of knowledge base from log data

    Large software systems can be very complex, and they become even more complex in the now-popular Microservice Architecture due to increasing interactions among more components in bigger systems. Plain model-based and data-driven diagnosis approaches can be used for fault detection, but they are usually opaque and demand massive computing power. Knowledge-based methods, on the other hand, have been shown to be not only effective but also explainable and human-friendly for tasks such as fault analysis, but they depend on having a knowledge base. The construction and maintenance of knowledge bases is not a trivial problem, and is referred to as the knowledge bottleneck. Software system logs are the primary and most available, sometimes the only available, data that record system runtime information, which is critical for software system Operation and Maintenance (O&M). I propose the TREAT framework, which automates the construction and updating of a knowledge base from a continual stream of logs, aiming to reflect, as faithfully as possible, the latest state of the assisted software system and to facilitate downstream tasks, typically fault localisation. To the best of our knowledge, this is the first effort to construct a fully automated, ever-updating knowledge base from logs that aims to reflect the internal changing states of a software system. To evaluate the TREAT framework, I devised a knowledge-based solution involving logic programming and inductive logic programming that applies a TREAT-powered knowledge base to fault localisation, and conducted empirical experiments with this solution on a real-life 5G network test bed.
Since evaluating the TREAT framework through fault localisation is indirect and involves many confounding factors, e.g., the specific fault localisation solution, I developed a novel method called LP-Measure that directly assesses the quality of a given knowledge base, in particular the robustness and redundancy of a knowledge graph. It was also observed that although the extracted knowledge is generally of high quality, there are errors in the knowledge extraction process. I surveyed ways to quantify the uncertainty of the knowledge extraction process and to assign a probability of correct extraction to every piece of knowledge, which led to a deep investigation into probability calibration and knowledge graph embeddings, specifically testing and confirming the phenomenon of uncalibrated probabilities in knowledge graph embeddings and how to choose calibration models from the existing toolbox.
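The kind of downstream reasoning such a log-derived knowledge base enables can be sketched with a toy example. Here the knowledge base is a set of (subject, relation, object) triples, and fault localisation follows "depends_on" edges from an alarmed component to candidate root causes; the relation name and traversal strategy are illustrative assumptions, not TREAT's actual design.

```python
def localise_fault(triples, alarm_component):
    """Toy fault localisation over a knowledge graph of
    (subject, relation, object) triples extracted from logs:
    walk 'depends_on' edges from the alarmed component and
    return the dependency leaves as root-cause candidates."""
    deps = {}
    for s, r, o in triples:
        if r == "depends_on":
            deps.setdefault(s, []).append(o)
    seen, frontier, candidates = set(), [alarm_component], []
    while frontier:
        component = frontier.pop()
        if component in seen:
            continue  # the dependency graph may contain shared subtrees
        seen.add(component)
        downstream = deps.get(component, [])
        if not downstream:
            candidates.append(component)  # leaf: nothing further to blame
        frontier.extend(downstream)
    return candidates
```

Because the knowledge base is continually rebuilt from the log stream, a query like this always reasons over the system's latest recorded structure; a logic-programming solution of the kind described in the thesis would express the same traversal as rules over the triples.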