
    Evaluation of the sparse coding shrinkage noise reduction algorithm for the hearing impaired

    Although numerous single-channel noise reduction strategies exist to improve speech perception in noisy environments, most of them improve only speech quality, not speech intelligibility, for normal hearing (NH) or hearing impaired (HI) listeners. The current exceptions that do improve intelligibility require a priori statistics of the speech or the noise. Most noise reduction algorithms in hearing aids are adopted directly from algorithms designed for NH listeners, without taking into account the hearing loss factors of HI listeners, who suffer greater losses in speech intelligibility than NH listeners in the same noisy environment. Further study of monaural noise reduction algorithms for HI listeners is therefore required.

    The motivation was to adopt a model-based approach in contrast to the conventional Wiener filtering approach. A model-based algorithm called sparse coding shrinkage (SCS) was proposed to extract key speech information from noisy speech. The SCS algorithm was evaluated against a state-of-the-art Wiener filtering approach through speech intelligibility and quality tests with 9 NH and 9 HI listeners. SCS matched the performance of the Wiener filtering algorithm in both speech intelligibility and speech quality. Both algorithms showed some intelligibility improvement for HI listeners but none for NH listeners, and both improved speech quality for HI and NH listeners alike.

    Additionally, a physiologically inspired hearing loss simulation (HLS) model was developed to characterize hearing loss factors and simulate the consequences of hearing loss. A methodology was proposed for evaluating signal processing strategies intended for HI listeners by using the HLS model with NH subjects: NH subjects listened to unprocessed and enhanced speech passed through the HLS model. Some of the effects of the algorithms seen in HI listeners were reproduced, at least qualitatively, by using the HLS model with NH listeners.

    Conclusions: The model-based SCS algorithm is promising for improving performance in stationary noise, although no clear difference was seen between SCS and a competitive Wiener filtering algorithm. Fluctuating noise is more difficult to reduce than stationary noise. Noise reduction algorithms may perform better at higher input signal-to-noise ratios (SNRs), where HI listeners can benefit but NH listeners have already reached ceiling performance. The proposed HLS model can save time and cost when evaluating noise reduction algorithms for HI listeners.
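    The core of sparse coding shrinkage is a shrinkage nonlinearity applied to signal coefficients in a sparse domain. The sketch below is a minimal illustration of that idea, not the thesis implementation: an orthogonal DCT stands in for the learned speech basis, the Laplacian-prior soft threshold is an assumed rule, and `scs_denoise_frame` and all signal parameters are hypothetical.

```python
# Minimal sketch of the sparse coding shrinkage (SCS) idea, assuming a
# Laplacian prior on sparse coefficients (which yields soft-thresholding).
# An orthogonal DCT stands in for a basis learned from speech data.
import numpy as np
from scipy.fft import dct, idct

def scs_denoise_frame(noisy_frame, noise_std):
    """Shrink sparse-domain coefficients of one speech frame."""
    coeffs = dct(noisy_frame, norm='ortho')      # to the (stand-in) sparse domain
    threshold = np.sqrt(2) * noise_std           # illustrative soft threshold
    shrunk = np.sign(coeffs) * np.maximum(np.abs(coeffs) - threshold, 0.0)
    return idct(shrunk, norm='ortho')            # back to the time domain

# Usage on a synthetic frame: a tone buried in additive noise.
rng = np.random.default_rng(0)
clean = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 256))
noisy = clean + 0.3 * rng.standard_normal(256)
denoised = scs_denoise_frame(noisy, noise_std=0.3)
```

    The soft-thresholding step preserves large (speech-dominated) coefficients while suppressing small (noise-dominated) ones, which is what distinguishes the model-based shrinkage view from frequency-domain Wiener gain functions.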

    Conflicting Objectives in Decisions

    This book deals with quantitative approaches to decision making when conflicting objectives are present. This problem is central to many applications of decision analysis, policy analysis, and operational research across a wide range of fields, including business, economics, engineering, psychology, and planning. The book surveys different approaches to the same problem area, each discussed in considerable detail, so its coverage is both broad and deep. The problem of conflicting objectives is of paramount importance in both planned and market economies, and this book represents a cross-cultural mixture of approaches from many countries to the same class of problem.
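    One standard quantitative approach to conflicting objectives is weighted-sum scalarization, which collapses several criteria into a single score. The book surveys many such methods; the sketch below is only an illustration of this one, and the alternatives, criteria, and weights are hypothetical.

```python
# Minimal sketch of weighted-sum scalarization for conflicting objectives.
# All alternatives, criteria, and weights below are hypothetical.
def weighted_score(performance, weights):
    """Aggregate per-criterion scores (higher is better) into one value."""
    return sum(w * p for w, p in zip(weights, performance))

# Three hypothetical sites scored on cost, safety, and speed,
# each normalized to [0, 1] with higher meaning better.
alternatives = {
    "site_a": [0.9, 0.4, 0.7],
    "site_b": [0.5, 0.9, 0.6],
    "site_c": [0.7, 0.7, 0.5],
}
weights = [0.5, 0.3, 0.2]  # the decision maker's trade-off among objectives

best = max(alternatives, key=lambda k: weighted_score(alternatives[k], weights))
print(best)  # site_a here; a safety-heavy weighting such as [0.2, 0.6, 0.2] picks site_b
```

    The choice of weights is itself a statement about the conflict between objectives, which is why much of the literature the book covers concerns how to elicit and justify those trade-offs.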

    Risk analysis for tunneling projects

    Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, 2010. Cataloged from the PDF version of the thesis. Includes bibliographical references (p. 574-589).

    Tunnel construction is increasing worldwide. Although the majority of tunnel construction projects have been completed safely, there have been several incidents that resulted in delays, cost overruns, and sometimes more serious consequences such as injury and loss of life. To help eliminate these accidents, it is necessary to systematically assess and manage the risks associated with tunnel construction. To better understand the conditions under which accidents occur, a database of 204 tunnel construction accidents was assembled, the most comprehensive such database known to date. The database was analyzed to better understand the causes of accidents, and influence diagrams were constructed containing the main factors and the interactions between them. These served as the basis of the risk assessment methodology presented in this work. The methodology combines a geologic prediction model, which predicts geology ahead of the tunnel construction, with a decision support model, which selects, from the available construction strategies, the one that leads to minimum risk. The geologic prediction model is based on Bayesian networks because of their ability to combine domain knowledge with data, encode dependencies among variables, and learn causal relationships.

    The combined geologic prediction and decision support model was then applied to the Porto Metro in Portugal. The results of the geologic prediction model were in good agreement with the observed geology, and the results of the decision support model were in good agreement with the construction methods used. More significant, however, is the model's ability to predict changes in geology and, consequently, changes in construction strategy. This was shown in two zones of the tunnel where accidents occurred: the model predicted an abrupt change in geology there, and the construction method should have been changed but was not. Using the model could possibly have avoided the accidents. This risk assessment methodology provides a powerful tool with which planners and engineers can systematically assess and mitigate the inherent risks associated with tunnel construction.

    by Rita L. Sousa, Ph.D.
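    The two-part structure of the methodology, a probabilistic update of the geology ahead of the face followed by a risk-minimizing choice of construction strategy, can be sketched in miniature. The thesis uses full Bayesian networks learned from data; the single-variable Bayes update below, along with the geology classes, likelihoods, and risk table, is an assumed toy example.

```python
# Minimal sketch of "predict geology, then choose the minimum-risk strategy".
# Geology classes, sensor likelihoods, and the risk table are hypothetical;
# the thesis uses Bayesian networks rather than a single-variable update.
import numpy as np

geology = ["good_rock", "fractured", "soft_ground"]
prior = np.array([0.6, 0.3, 0.1])        # belief before new evidence

# Assumed P(observation = "high water inflow" | geology class)
likelihood = np.array([0.05, 0.4, 0.7])

posterior = prior * likelihood
posterior /= posterior.sum()             # Bayes' rule, normalized

# Hypothetical expected loss of each strategy under each geology class
strategies = ["open_face", "partial_support", "full_support"]
risk = np.array([
    [1.0, 8.0, 20.0],   # open_face: cheap in good rock, disastrous in soft ground
    [2.0, 3.0, 9.0],    # partial_support
    [4.0, 4.0, 5.0],    # full_support: costly but robust
])

expected_risk = risk @ posterior
best = strategies[int(np.argmin(expected_risk))]
print(posterior.round(3), best)
```

    The point the toy example preserves is the one the thesis demonstrated at the Porto Metro: when evidence shifts the posterior toward unfavorable geology, the minimum-risk strategy shifts too, signaling that the construction method should change.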

    Exploring subjective image quality through isopreference curves

    Perceptual image quality correlates poorly with traditional error measures. By identifying the factors behind perceived image quality, many image processing systems can be optimized, and a great deal of effort has therefore been devoted to this challenging issue. Considerably less research, however, has focused on the roles of spatial resolution and the number of gray levels in subjective image quality. In our study, the relation between these two fundamental image parameters is defined experimentally in terms of perceptual quality. The study was carried out through an experiment with 80 subjects, and using the collected data set the results are illustrated in the form of isopreference curves. Finally, our results are compared with the corresponding results presented previously. Although the comparison reveals a clear correlation between the results, novel information about the relation between spatial resolution and the number of gray levels is presented.
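    The two parameters the study varies, spatial resolution and number of gray levels, correspond to simple downsampling and quantization operations. The sketch below shows how one might generate such stimulus variants; the function name, file names, and parameter values are assumptions, and the actual experimental procedure is not reproduced. Requires NumPy and Pillow.

```python
# Minimal sketch of varying the two image parameters studied:
# spatial resolution (downsampling) and number of gray levels (quantization).
# File names and parameter sweeps are hypothetical.
import numpy as np
from PIL import Image

def render_variant(img, resolution, gray_levels):
    """Downsample to a square `resolution` and quantize to `gray_levels`."""
    small = img.convert("L").resize((resolution, resolution), Image.BILINEAR)
    arr = np.asarray(small, dtype=np.float64) / 255.0
    step = gray_levels - 1
    quantized = np.round(arr * step) / step      # uniform gray-level quantization
    return Image.fromarray((quantized * 255).astype(np.uint8))

# Usage: sweep both parameters to build a grid of test stimuli.
source = Image.open("test_image.png")
for n in (32, 64, 128, 256):                     # spatial resolutions
    for g in (2, 4, 8, 16, 32):                  # numbers of gray levels
        render_variant(source, n, g).save(f"variant_{n}px_{g}levels.png")
```

    An isopreference curve connects the (resolution, gray-level) pairs in such a grid that subjects judge to be of equal perceptual quality.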