
    On Separating Environmental and Speaker Adaptation

    This paper presents a maximum likelihood (ML) approach to background model estimation in noisy, non-stationary acoustic environments. The external noise source is characterised by a time-constant convolutional component and a time-varying additive component. The HMM composition technique provides a mechanism for integrating parametric models of the acoustic background with the signal model, so that noise compensation is tightly coupled with background model estimation. However, existing continuous adaptation algorithms usually do not take advantage of this approach, being essentially based on the MLLR algorithm. Consequently, no model of the environmental mismatch is available and, even under constrained conditions, a significant number of model parameters have to be updated. From a theoretical point of view, only the noise model parameters need to be updated, since the clean speech parameters are unchanged by the environment. It can therefore be advantageous to have a model of the environmental mismatch. Additionally, separating the additive and convolutional components separates the environmental mismatch from the speaker mismatch when the channel does not change for long periods. This approach was followed in the development of the algorithm proposed in this paper. One drawback sometimes attributed to the continuous adaptation approach is that recognition failures produce poor background estimates. This paper also proposes a MAP-like method to deal with this situation.
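    The additive-plus-convolutional corruption described above is commonly written, in the linear spectral domain, as Y = H·S + N. The abstract does not give the formula, so the following sketch assumes the standard log-spectral mismatch function used in model-composition work; all names are illustrative, not the paper's:

```python
import numpy as np

def corrupt_log_spectrum(s, h, n):
    """Mismatch function relating clean and noisy log power spectra.

    In the linear domain the observed spectrum is Y = H * S + N, so in
    the log domain: y = s + h + log(1 + exp(n - s - h)).

    s, h, n -- clean speech, convolutional channel, and additive noise,
    as log-spectral scalars or numpy vectors.
    """
    return s + h + np.log1p(np.exp(n - s - h))
```

    With negligible additive noise (n -> -inf) this reduces to y = s + h, i.e. a purely convolutional (channel) mismatch, which is why the two components can be treated separately when the channel is stationary.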

    Spectral normalization MFCC derived features for robust speech recognition

    This paper presents a method for extracting MFCC parameters from a normalised power spectral density. The underlying spectral normalisation method is based on the fact that speech regions with less energy need more robustness, since in these regions the noise is more dominant and the speech more corrupted. Low-energy speech regions usually contain sounds of an unvoiced nature, which include nearly half of the consonants, and these are by nature the least reliable regions, since noise is effectively present even when the speech is acquired under controlled conditions. This spectral normalisation was tested under artificial additive white noise in an isolated speech recogniser and showed very promising results [1]. It is well known that, for speech representation, MFCC parameters are more effective than power-spectrum-based features. This paper shows how the cepstral speech representation can take advantage of the above-referred spectral normalisation, and presents results in the continuous speech recognition paradigm under clean and artificial noise conditions.
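    The paper's exact normalisation scheme is given in its reference [1] and is not reproduced in the abstract. The sketch below therefore assumes a simple unit-energy normalisation as a placeholder, followed by the conventional mel-filterbank / log / DCT-II steps that define MFCCs; the function name, the normalisation form, and the filterbank are all assumptions for illustration:

```python
import numpy as np

def mfcc_from_normalised_psd(psd, mel_fbank, n_ceps=13):
    """Illustrative MFCC pipeline over a normalised power spectrum.

    psd       -- one frame's power spectral density, shape (n_bins,)
    mel_fbank -- mel filterbank matrix, shape (n_filters, n_bins)
    Returns n_ceps cepstral coefficients.
    """
    # Placeholder normalisation (assumed form, not the paper's scheme):
    # scale the frame to unit total energy before the mel analysis.
    psd = psd / (np.sum(psd) + 1e-12)
    mel_energies = mel_fbank @ psd          # mel-band energies
    log_mel = np.log(mel_energies + 1e-12)  # compressive log stage
    # DCT-II decorrelates the log mel energies into cepstral coefficients.
    n = len(log_mel)
    k = np.arange(n_ceps)[:, None]
    m = np.arange(n)[None, :]
    dct2 = np.cos(np.pi * k * (2 * m + 1) / (2 * n))
    return dct2 @ log_mel
```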

    Blind source separation by independent component analysis applied to electroencephalographic signals

    Independent Component Analysis (ICA) is a statistically based method whose goal is to find a linear transformation of an observed multidimensional random vector such that its components become as statistically independent from each other as possible. The electroencephalographic (EEG) signal is usually hard to interpret and analyse, since it is corrupted by artifacts; this leads to the rejection of contaminated segments and perhaps to an unacceptable loss of data. ICA filters trained on data collected during EEG sessions can identify statistically independent source channels, which can then be further processed using event-related potential (ERP), event-related spectral perturbation (ERSP) or other signal processing techniques. This paper describes, as preliminary work, the application of ICA to EEG recordings of human brain activity, showing its applicability.
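    The abstract describes the ICA model (observations are an unknown linear mixture X = A·S of independent sources) but gives no algorithm. As an illustration only, here is a minimal numpy sketch of a FastICA-style estimator (centring, whitening, tanh nonlinearity, symmetric decorrelation); the paper itself may use a different ICA variant:

```python
import numpy as np

def fast_ica(X, n_iter=200, seed=0):
    """Minimal FastICA-style sketch for illustration.

    X -- observations (n_channels, n_samples), e.g. EEG channels.
    Returns (W, Z): an orthogonal unmixing matrix W for the whitened
    data Z, so that S_hat = W @ Z estimates the independent sources
    (up to permutation, sign and scale).
    """
    # Centre each channel, then whiten with the eigendecomposition
    # of the channel covariance.
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X

    n = X.shape[0]
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n, n))
    for _ in range(n_iter):
        # Fixed-point update: E[g(WZ) Z^T] - diag(E[g'(WZ)]) W
        g = np.tanh(W @ Z)
        g_prime = 1.0 - g ** 2
        W_new = (g @ Z.T) / Z.shape[1] - np.diag(g_prime.mean(axis=1)) @ W
        # Symmetric decorrelation: W <- (W W^T)^{-1/2} W, via SVD.
        u, _, vt = np.linalg.svd(W_new)
        W = u @ vt
    return W, Z
```

    In the EEG setting each row of S_hat is a candidate independent component (brain source or artifact such as eye blinks); artifact components can be zeroed and the remaining components projected back, instead of rejecting whole contaminated segments.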

    Characterization and strengthening of a "ghost" building

    In the mid-eighties a single-family dwelling was planned for a region in northern Portugal, but a much bigger building was constructed, without any design documentation. At the end of the nineties this construction was acquired, and another architectural and functional configuration was designed for the space. Since no records of the existing construction were available, several strategies were carried out for its geometrical, structural and material characterization. These elements provided the information indispensable for analysing the structural stability of the building, which revealed that it was necessary to strengthen the foundations, beams and columns. The procedures for characterizing the construction, the structural stability analysis and the strengthening strategies are described in this work.

    Molecular detection of EGFRvIII-positive cells in the peripheral blood of breast cancer patients

    The aim of this study is to evaluate epidermal growth factor receptor variant III (EGFRvIII), a cancer-specific mutant, as a possible marker for the diagnosis of occult systemic disease in breast cancer. EGFRvIII mRNA was identified by an RT-nested PCR with high sensitivity. Among the 102 women studied, the mutant was detected in the peripheral blood of 30% of 33 low-risk, early-stage patients, 56% of 18 patients selected for neoadjuvant chemotherapy, 63.6% of 11 patients with disseminated disease, and 0% of 40 control women. In low-risk, early-stage patients, the presence of one or more tumour characteristics predicting recurrence, such as the absence of oestrogen receptors, the presence of ERBB2, or histologic grade G2/G3, was significantly associated with EGFRvIII detection (p < 0.05). EGFRvIII mRNA has the characteristics of a useful marker for the diagnosis of occult systemic disease in breast cancer. Follow-up studies will evaluate its clinical value as a decision criterion for systemic therapy.
    http://www.sciencedirect.com/science/article/B6T68-4KV2RH2-1/1/8d7f06700e09e0cb34c8a3861e1b0ba

    OpenLab ESEV: new adventures in software development

    OpenLab ESEV is the Free Software project of the School of Education of the Polytechnic Institute of Viseu (ESEV). The project aims to establish a platform that aggregates activities fostering the use of Free/Libre and Open Source Software (F/LOSS), Free Culture and more flexible licences for creative and educational purposes in ESEV's domains of activity (education, arts, media). OpenLab has existed since 2009. It emerged in an environment characterised by a lack of knowledge of the existing libre alternatives and by work habits built exclusively around proprietary software. Today, OpenLab activities are implemented within four key areas of action: dissemination, training, support and production. This paper presents two of the most important ongoing projects: Ottographer and StudiozCollabPress. StudiozCollabPress is a customised version of a popular WordPress plugin for project management, developed to support the management of short movie projects. We present its main features and results from real-case scenarios of use, specifically finished and ongoing 3D animation student projects. Ottographer is a webcam time-lapse tool for operating systems based on Debian GNU/Linux. Besides its main features, we present some examples and suggestions for educational settings as well as for creative purposes. Both projects are distributed as F/LOSS, meaning that they can be used, studied and modified without restriction, as well as copied and redistributed in modified or unmodified form. These projects might help us launch a new trend in our school community that we highly anticipate: the development and sharing of our own tools.

    Using MACBETH with the Choquet integral fundamentals to model interdependencies between elementary concerns in the context of risk management

    Effective risk management typically requires the evaluation of multiple consequences of different sources of risk, and multicriteria value models have been used for that purpose. The value of mitigating a risk impact is often considered by risk managers to depend on the levels of other impacts, so procedures are needed to identify and model these interactions within a value measurement framework. The Choquet integral (CI) has been used for this purpose, and several studies in the performance measurement literature have combined the 2-additive CI operator with the MACBETH approach to model interdependencies in real contexts. In this paper, we propose an alternative procedure to model interdependencies and determine the CI parameters from a single MACBETH global matrix. The procedure is illustrated with the construction of a descriptor of impacts to evaluate risk impacts at ALSTOM Power. The paper further explains the questioning protocol for applying the proposed procedure, as well as how decision-makers can interpret the CI parameters.
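    The abstract does not reproduce the 2-additive Choquet integral itself. The sketch below implements its standard form from the literature in the Shapley-value / interaction-index representation (Grabisch's formulation), which is presumably the operator referred to; the parameter values and names are illustrative, not taken from the paper:

```python
import numpy as np

def choquet_2additive(x, shapley, interaction):
    """2-additive Choquet integral of criterion scores x.

    x           -- criterion scores x_i (e.g. partial values in [0, 1])
    shapley     -- Shapley importance weights phi_i, summing to 1
    interaction -- symmetric matrix of interaction indices I_ij, I_ii = 0

    Standard 2-additive form:
      C(x) = sum_i (phi_i - 0.5 * sum_j |I_ij|) * x_i
           + sum_{I_ij > 0} I_ij * min(x_i, x_j)      (complementarity)
           + sum_{I_ij < 0} |I_ij| * max(x_i, x_j)    (redundancy)
    Monotonicity requires phi_i - 0.5 * sum_j |I_ij| >= 0 for every i.
    """
    n = len(x)
    total = 0.0
    for i in range(n):
        total += (shapley[i] - 0.5 * np.sum(np.abs(interaction[i]))) * x[i]
        for j in range(i + 1, n):
            I = interaction[i][j]
            if I > 0:
                total += I * min(x[i], x[j])   # positive synergy between i, j
            elif I < 0:
                total += -I * max(x[i], x[j])  # redundancy between i, j
    return total
```

    With all interaction indices at zero this reduces to the ordinary weighted sum of the x_i, which is the independence case the paper's procedure is designed to go beyond.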