    Speech Recognition

    Chapters in the first part of the book cover all the essential speech processing techniques for building robust automatic speech recognition systems: the representation of speech signals and methods for speech-feature extraction, acoustic and language modeling, efficient algorithms for searching the hypothesis space, and multimodal approaches to speech recognition. The last part of the book is devoted to other speech processing applications that can use the information from automatic speech recognition for speaker identification and tracking, for prosody modeling in emotion-detection systems, and in other speech processing applications able to operate in real-world environments, such as mobile communication services and smart homes.
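
    A minimal sketch of one front-end step from the book's first part: extracting MFCC speech features from a waveform. This is a generic illustration, not the book's code; it assumes the third-party librosa library, and the file name is a placeholder.

        import librosa

        # Load a placeholder utterance at a 16 kHz sampling rate.
        y, sr = librosa.load("utterance.wav", sr=16000)

        # 13 Mel-frequency cepstral coefficients per analysis frame, a
        # common feature representation for acoustic modeling in ASR.
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
        print(mfcc.shape)  # (13, n_frames)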

    Blind Estimation of Linear and Nonlinear Sparse Channels, Journal of Telecommunications and Information Technology, 2013, no. 1

    This paper presents a Clustering-Based Blind Channel Estimator for a special case of sparse channels, the zero-pad channels. The proposed algorithm uses an unsupervised clustering technique to estimate the data clusters. Cluster labelling is performed by a Hidden Markov Model of the observation sequence, appropriately modified to exploit the channel sparsity. The algorithm achieves a substantial complexity reduction compared to the fully evaluated technique. The proposed algorithm is used in conjunction with a Parallel Trellis Viterbi Algorithm for data detection, and simulation results show that the overall scheme retains the reduced-complexity benefits without degrading performance.
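
    A toy illustration (not the paper's algorithm) of why clustering helps in blind estimation: a sparse FIR channel driven by binary symbols produces noiseless outputs on a small set of discrete levels, so the received samples form clusters whose centres reveal the channel. The taps and noise level below are invented for the example.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)

        # Hypothetical zero-pad channel: two nonzero taps separated by zeros.
        h = np.array([1.0, 0.0, 0.0, 0.5])
        symbols = rng.choice([-1.0, 1.0], size=5000)  # BPSK input
        noiseless = np.convolve(symbols, h)[: len(symbols)]
        received = noiseless + 0.05 * rng.standard_normal(len(symbols))

        # Two nonzero taps and binary inputs yield 2**2 = 4 output levels.
        km = KMeans(n_clusters=4, n_init=10, random_state=0)
        km.fit(received.reshape(-1, 1))
        centres = np.sort(km.cluster_centers_.ravel())
        print("estimated cluster centres:", centres)  # ~ [-1.5, -0.5, 0.5, 1.5]

        # Labelling the clusters with channel states (done via an HMM in the
        # paper) would then recover the taps (1.0, 0.5) up to sign/ordering.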

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions about the proper functioning of the networks. Among these mathematical tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches to perform network-data analysis and to enable automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth in network complexity that optical networks have faced in the last few years. This increase in complexity is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes) enabled by the use of coherent transmission/reception technologies, advanced digital signal processing, and the compensation of nonlinear effects in optical fiber propagation. In this paper we provide an overview of the application of ML to optical communications and networking. We classify and survey the relevant literature on the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy; to stimulate further work in this area, we conclude the paper by proposing possible new research directions.
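
    As a hedged toy example of one task class this survey covers, consider quality-of-transmission (QoT) classification: predicting from lightpath features whether a candidate path will meet a performance target. Every feature, threshold and label rule below is a synthetic assumption for illustration, not a result from the paper.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(2)
        n = 2000
        length_km = rng.uniform(50, 2000, n)   # total path length
        n_spans = rng.integers(1, 25, n)       # number of amplifier spans
        launch_dbm = rng.uniform(-2, 4, n)     # launch power
        X = np.column_stack([length_km, n_spans, launch_dbm])

        # Synthetic ground truth: long, many-span paths tend to miss the target.
        ok = (length_km / 100 + n_spans - launch_dbm
              + rng.normal(0, 2, n)) < 20

        X_tr, X_te, y_tr, y_te = train_test_split(X, ok, random_state=0)
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X_tr, y_tr)
        print("held-out accuracy:", clf.score(X_te, y_te))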

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex and compelling applications in both military and civilian fields, in which users can enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, because of the complex heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in the compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to help readers understand the motivation and methodology of the various ML algorithms, so that they can be invoked for hitherto unexplored services and scenarios in future wireless networks.

    Blind Change Point Detection and Regime Segmentation Using Gaussian Process Regression

    Time-series analysis is used heavily in modeling and forecasting weather, economics and medical data, as well as in various other fields. Change point detection (CPD) means finding the points at which the statistical properties of a time series change abruptly. CPD has attracted a lot of attention in the artificial intelligence, machine learning and data mining communities. In this thesis, a novel CPD algorithm is introduced for segmenting multivariate time-series data. The proposed algorithm is a general pipeline for processing any high-dimensional multivariate time-series data using a nonlinear, non-parametric dynamic-system model. It consists of a manifold learning technique for dimensionality reduction, Gaussian process regression to model the nonlinear dynamics of the data and predict the next possible time step, and outlier detection based on the Mahalanobis distance to determine the change points. The performance of the new CPD algorithm is assessed on synthetic as well as real-world data for validation. The pipeline is used on Federal Reserve Economic Data (FRED) to detect recessions. Finally, functional magnetic resonance imaging (fMRI) data of larval zebrafish are used to segment regions of homogeneous brain activity.
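
    A minimal one-dimensional sketch of the detection step (assumed details; the thesis's manifold-learning stage is omitted): fit a Gaussian process on a trailing window, predict the next sample, and flag a change point when the Mahalanobis-style standardized prediction error exceeds a threshold.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(1)
        # Synthetic series with an abrupt regime change at t = 100.
        y = np.concatenate([np.sin(0.2 * np.arange(100)),
                            3.0 + 0.5 * rng.standard_normal(100)])

        window, threshold = 30, 4.0
        change_points = []
        for t in range(window, len(y)):
            X = np.arange(t - window, t).reshape(-1, 1)
            gp = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(0.1),
                                          normalize_y=True)
            gp.fit(X, y[t - window:t])
            mu, sigma = gp.predict(np.array([[t]]), return_std=True)
            # 1-D Mahalanobis distance of the observation from the prediction.
            if abs(y[t] - mu[0]) / max(sigma[0], 1e-9) > threshold:
                change_points.append(t)

        print("first detections near t = 100:", change_points[:5])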

    Machine learning for optical fiber communication systems: An introduction and overview

    Optical networks generate a vast amount of diagnostic, control and performance monitoring data. When information is extracted from these data, reconfigurable network elements and reconfigurable transceivers allow the network to adapt both to changes in the physical infrastructure and to changing traffic conditions. Machine learning is emerging as a disruptive technology for extracting useful information from these raw data to enable enhanced planning, monitoring and dynamic control. We provide a survey of the recent literature and highlight numerous promising avenues for machine learning applied to optical networks, including explainable machine learning, digital twins, and approaches that embed domain knowledge into the machine learning, such as physics-informed machine learning for the physical layer and graph-based machine learning for the networking layer.