7 research outputs found

    Radial Basis Function Neural Network with Localized Stochastic-Sensitive Autoencoder for Home-Based Activity Recognition

    No full text
    Over the past few years, the Internet of Things (IoT) has developed rapidly, with smart home devices gradually entering people’s lives. To maximize the impact of such deployments, home-based activity recognition is required to recognize behaviors within smart home environments and to use this information to provide better health and social care services. Activity recognition identifies people’s activities from information about their interaction with the environment, collected by sensors embedded within the home. In this paper, binary data collected by anonymous binary sensors, such as pressure sensors, contact sensors, and passive infrared sensors, are used to recognize activities. A radial basis function neural network (RBFNN) with a localized stochastic-sensitive autoencoder (LiSSA) is proposed for home-based activity recognition. An autoencoder (AE) is introduced to extract useful features from the binary sensor data by converting binary inputs into continuous representations, so that more hidden information can be extracted. The generalization capability of the proposed method is enhanced by minimizing both the training error and a stochastic sensitivity measure, improving the classifier’s tolerance to uncertainties in the sensor data. Four binary home-based activity recognition datasets, OrdonezA, OrdonezB, Ulster, and the activities-of-daily-living data from van Kasteren (vanKasterenADL), are used to evaluate the effectiveness of the proposed method. Compared with well-known benchmark approaches, including support vector machine (SVM), multilayer perceptron neural network (MLPNN), random forest, and an RBFNN-based method, the proposed method yielded the best performance, with 98.35%, 86.26%, 96.31%, and 92.31% accuracy on the four datasets, respectively.
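
    As an illustration of the training objective described above, the following is a minimal sketch (not the authors’ code; all class and parameter names are assumptions) of an autoencoder that maps binary sensor vectors to continuous features, feeding an RBF-style classifier trained with a loss that adds a perturbation-based stochastic-sensitivity term. The exact sensitivity formulation and localization used by LiSSA are given in the paper; here the term is approximated by resampling inputs within a small neighbourhood.

        import torch
        import torch.nn as nn

        class AE(nn.Module):
            """Maps binary sensor vectors to continuous hidden features."""
            def __init__(self, n_in, n_hid):
                super().__init__()
                self.enc = nn.Sequential(nn.Linear(n_in, n_hid), nn.Sigmoid())
                self.dec = nn.Sequential(nn.Linear(n_hid, n_in), nn.Sigmoid())
            def forward(self, x):
                z = self.enc(x)                      # continuous features from binary inputs
                return z, self.dec(z)

        class RBFClassifier(nn.Module):
            """Gaussian radial basis function layer followed by a linear output layer."""
            def __init__(self, n_feat, n_centres, n_classes):
                super().__init__()
                self.centres = nn.Parameter(torch.randn(n_centres, n_feat))
                self.log_gamma = nn.Parameter(torch.zeros(n_centres))
                self.out = nn.Linear(n_centres, n_classes)
            def forward(self, z):
                d2 = torch.cdist(z, self.centres).pow(2)      # squared distances to centres
                phi = torch.exp(-d2 * self.log_gamma.exp())   # Gaussian RBF activations
                return self.out(phi)

        def loss_with_sensitivity(ae, rbf, x, y, q=0.05, lam=0.1):
            # training error (classification + reconstruction) ...
            z, x_rec = ae(x)
            logits = rbf(z)
            train_err = nn.functional.cross_entropy(logits, y) \
                      + nn.functional.mse_loss(x_rec, x)
            # ... plus an approximate stochastic-sensitivity term: change in the
            # classifier output when inputs are perturbed uniformly within radius q
            z_pert, _ = ae(x + q * (2 * torch.rand_like(x) - 1))
            sens = (rbf(z_pert) - logits).pow(2).mean()
            return train_err + lam * sens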

    Selection of weight quantisation accuracy for radial basis function neural network using stochastic sensitivity measure

    No full text

    SeDepTTS: Enhancing the Naturalness via Semantic Dependency and Local Convolution for Text-to-Speech Synthesis

    No full text
    Self-attention-based networks have achieved impressive performance in parallel training and global context modeling. However, they are weak at capturing local dependencies, especially for data with strong local correlations such as utterances. Therefore, we mine linguistic information from the original text based on semantic dependency parsing, and the semantic relationships between nodes are treated as prior knowledge to revise the self-attention distribution. On the other hand, given the strong correlation between input characters, we introduce a one-dimensional (1-D) convolutional neural network (CNN) to produce the query (Q) and value (V) in the self-attention mechanism for a better fusion of local contextual information. We then migrate this variant of self-attention to speech synthesis and propose a non-autoregressive (NAR) neural text-to-speech (TTS) model, SeDepTTS. Experimental results show that our model performs well in speech synthesis; in particular, the proposed method yields significant improvements in the handling of pauses, stress, and intonation.
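
    To make the attention revision concrete, the following is a minimal sketch (an assumption-level illustration, not the SeDepTTS implementation) of a self-attention layer in which a semantic-dependency prior matrix is added to the attention logits, and Q and V are produced by 1-D convolutions so that neighbouring characters are fused before attention; the real model’s multi-head, masking, and scaling details are in the paper.

        import torch
        import torch.nn as nn

        class DepLocalAttention(nn.Module):
            def __init__(self, d_model, kernel_size=3):
                super().__init__()
                pad = kernel_size // 2
                self.q_conv = nn.Conv1d(d_model, d_model, kernel_size, padding=pad)
                self.v_conv = nn.Conv1d(d_model, d_model, kernel_size, padding=pad)
                self.k_proj = nn.Linear(d_model, d_model)
                self.scale = d_model ** -0.5

            def forward(self, x, dep_prior):
                # x: (batch, time, d_model); dep_prior: (batch, time, time),
                # a dependency-derived prior added to the attention logits
                q = self.q_conv(x.transpose(1, 2)).transpose(1, 2)  # local context fused into Q
                v = self.v_conv(x.transpose(1, 2)).transpose(1, 2)  # local context fused into V
                k = self.k_proj(x)
                logits = torch.bmm(q, k.transpose(1, 2)) * self.scale
                attn = torch.softmax(logits + dep_prior, dim=-1)    # revised attention distribution
                return torch.bmm(attn, v)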

    Unsupervised multi-hashing for image retrieval in non-stationary environments

    No full text
    Hashing methods enable fast retrieval from large-scale datasets, which is important for real-world image retrieval. New data is produced continually in the real world, which may cause concept drift and inaccurate retrieval results. To address this issue, hashing methods for non-stationary environments have been proposed; however, most of them are supervised, and in practice it is hard to obtain exact labels, especially in non-stationary data environments. We therefore propose the unsupervised multi-hashing (UMH) method for unsupervised image retrieval in non-stationary environments. In UMH, a new set of hash functions is trained and added to the maintained list of hash-function sets whenever a new data chunk arrives. Multiple sets of hash functions are then kept with different weights so that similarity information in both old and new data is preserved. Experiments on two real-world image datasets show that UMH yields better retrieval performance in non-stationary environments than comparative methods.
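
    A minimal sketch of this multi-hashing scheme is given below (hypothetical names and random-projection hashing are used for illustration; how UMH actually learns its hash functions and weights is described in the paper): each new data chunk trains a fresh set of hash functions, all sets are kept, and retrieval combines their Hamming distances with per-set weights so that both old and new similarity structure contribute.

        import numpy as np

        class MultiHashIndex:
            def __init__(self, n_bits=32):
                self.n_bits = n_bits
                self.tables = []                         # one (projection, weight) pair per data chunk

            def _hash(self, proj, x):
                return (x @ proj > 0).astype(np.uint8)   # sign of random projection as binary code

            def add_chunk(self, chunk, weight=1.0):
                # train (here: sample) a new set of hash functions for the incoming chunk
                proj = np.random.randn(chunk.shape[1], self.n_bits)
                self.tables.append((proj, weight))

            def distances(self, query, database):
                # weighted Hamming distance accumulated over all kept hash-function sets
                dist = np.zeros(len(database))
                for proj, w in self.tables:
                    q_code = self._hash(proj, query)
                    d_codes = self._hash(proj, database)
                    dist += w * (q_code != d_codes).sum(axis=1)
                return dist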

    Hashing-based undersampling for large scale histopathology image classification

    No full text
    The early diagnosis of cancer from histopathology images plays an important role in medical science. Existing techniques generally partition the original histopathology image into small pieces for further classification. However, because the number of benign (majority) samples is much larger than that of malignant (minority) samples, the classification task is significantly imbalanced, which adversely affects performance. Undersampling is commonly used to address the class-imbalance problem, but existing methods are typically time-consuming and therefore unsuitable for large-scale, high-dimensional data. In this paper, we propose a fast and scalable undersampling method, hashing-based undersampling (HBU), to address class imbalance in large-scale medical image classification. Benign images are hashed and placed into different buckets according to their locations in the input space, and undersampling is achieved by proportionally selecting benign images from the hash buckets. The HBU method is experimentally evaluated on two real histopathology image datasets, CAMELYON16 and ACDC@LUNGHP, in comparison with existing methods. Experimental results show that HBU outperforms six state-of-the-art methods while remaining scalable and fast.
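
    The bucket-and-sample idea can be illustrated with the following minimal sketch (an assumption-level illustration, not the authors’ HBU code): majority-class images are hashed with random projections, grouped by hash bucket, and drawn proportionally from each bucket, so the retained subset follows the input-space distribution of the benign class while the classes become balanced.

        import numpy as np
        from collections import defaultdict

        def hashing_based_undersample(benign, n_keep, n_bits=8, seed=None):
            rng = np.random.default_rng(seed)
            proj = rng.standard_normal((benign.shape[1], n_bits))
            codes = benign @ proj > 0                              # binary hash code per image
            buckets = defaultdict(list)
            for i, code in enumerate(codes):
                buckets[code.tobytes()].append(i)                  # group samples by bucket
            kept = []
            for idx in buckets.values():
                quota = max(1, round(n_keep * len(idx) / len(benign)))  # proportional quota per bucket
                kept.extend(rng.choice(idx, size=min(quota, len(idx)), replace=False))
            return benign[np.array(kept[:n_keep])]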