
    Comparative Study of Performance of Particle Swarm Optimization and Fast Independent Component Analysis method in Cocktail Party Problem

    There are many methods for solving the blind source separation problem, among which Independent Component Analysis (ICA) has become the most commonly used. ICA methods rely on one of two properties: sample dependency or non-Gaussianity. In this study, the cocktail-party problem is addressed with ICA. We evaluate the performance of two techniques, standard FastICA and Particle Swarm Optimization (PSO), and compare their results using objective metrics (SNR and SDR) as well as subjective ones (plotting and playing back the separated signals). Both algorithms were applied to mixtures of two and to mixtures of three source signals. In the evaluation, PSO gave more accurate results than FastICA. Input speech signals sampled at 8 kHz, satisfying the i.i.d. and well-conditioning assumptions, were tested on different speech recordings of men and women, as well as music.
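
    As a rough illustration of the FastICA side of this comparison, the sketch below applies scikit-learn's FastICA to two synthetic stand-in sources (a tone and a sawtooth rather than real speech) mixed by a well-conditioned matrix, and scores the recovered signals with a simple SNR helper. The signal choices and the snr_db function are illustrative assumptions, not the paper's code or metrics.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Illustrative sketch (not the paper's code): separate two synthetic sources
# mixed by a well-conditioned matrix, then score recovery with a crude SNR.
rng = np.random.default_rng(0)
fs = 8000                                  # 8 kHz sampling rate, as in the abstract
t = np.arange(0, 2, 1 / fs)

# Hypothetical stand-in sources (a tone and a sawtooth) instead of real speech.
s1 = np.sin(2 * np.pi * 440 * t)
s2 = 2 * (3 * t - np.floor(3 * t + 0.5))
S = np.c_[s1, s2]

A = np.array([[1.0, 0.6],                  # mixing matrix (well-conditioned)
              [0.5, 1.0]])
X = S @ A.T                                # observed two-channel mixture

ica = FastICA(n_components=2, whiten="unit-variance", random_state=0)
S_hat = ica.fit_transform(X)               # recovered sources (up to scale/order)

def snr_db(ref, est):
    """SNR after least-squares scale (and sign) alignment of est to ref."""
    est = est * (ref @ est) / (est @ est)
    noise = ref - est
    return 10 * np.log10((ref @ ref) / (noise @ noise))

# Permutation ambiguity: match each true source with its best estimate.
for i in range(2):
    best = max(snr_db(S[:, i], S_hat[:, j]) for j in range(2))
    print(f"source {i}: SNR ≈ {best:.1f} dB")
```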

    Sequential Monte Carlo Methods for Estimating Dynamic Microeconomic Models

    This paper develops methods for estimating dynamic structural microeconomic models with serially correlated latent state variables. The proposed estimators are based on sequential Monte Carlo methods, or particle filters, and simultaneously estimate both the structural parameters and the trajectory of the unobserved state variables for each observational unit in the dataset. We focus on two important special cases: single-agent dynamic discrete choice models and dynamic games of incomplete information. The methods are applicable to both discrete and continuous state space models. We first develop a broad nonlinear state space framework which includes as special cases many dynamic structural models commonly used in applied microeconomics. Next, we discuss the nonlinear filtering problem that arises due to the presence of a latent state variable and show how it can be solved using sequential Monte Carlo methods. We then turn to estimation of the structural parameters and consider two approaches: an extension of the standard full-solution maximum likelihood procedure (Rust, 1987) and an extension of the two-step estimation method of Bajari, Benkard, and Levin (2007), in which the structural parameters are estimated using revealed preference conditions. Finally, we introduce an extension of the classic bus engine replacement model of Rust (1987) and use it both to carry out a series of Monte Carlo experiments and to provide empirical results using the original data.
    Keywords: dynamic discrete choice; latent state variables; serial correlation; sequential Monte Carlo methods; particle filtering
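
    The nonlinear filtering step described above can be illustrated with a generic bootstrap particle filter. The sketch below uses an assumed toy model (a scalar latent AR(1) state observed through a squared measurement), not the bus-engine replacement model: particles are propagated through the transition density, reweighted by the measurement density, and resampled, and the accumulated log-likelihood estimate is the quantity a full-solution maximum likelihood approach would maximize over the structural parameters.

```python
import numpy as np

# Toy model (assumed for illustration): x_t = phi*x_{t-1} + sigma_x*eps_t,
# y_t = 0.5*x_t^2 + sigma_y*eta_t, with independent Gaussian shocks.
rng = np.random.default_rng(1)

def simulate(T=200, phi=0.9, sigma_x=0.5, sigma_y=1.0):
    """Simulate the latent state and its nonlinear, noisy observation."""
    x = np.zeros(T)
    y = np.zeros(T)
    for t in range(1, T):
        x[t] = phi * x[t - 1] + sigma_x * rng.standard_normal()
        y[t] = 0.5 * x[t] ** 2 + sigma_y * rng.standard_normal()
    return x, y

def particle_filter(y, n_particles=500, phi=0.9, sigma_x=0.5, sigma_y=1.0):
    """Bootstrap particle filter: filtered state means and log-likelihood."""
    particles = rng.standard_normal(n_particles)
    means = np.zeros(len(y))
    loglik = 0.0
    for t in range(len(y)):
        # Propagate particles through the state transition density.
        particles = phi * particles + sigma_x * rng.standard_normal(n_particles)
        # Weight by the measurement density (log scale for numerical stability).
        resid = y[t] - 0.5 * particles ** 2
        logw = -0.5 * (resid / sigma_y) ** 2 - 0.5 * np.log(2 * np.pi * sigma_y ** 2)
        w = np.exp(logw - logw.max())
        loglik += logw.max() + np.log(w.mean())   # likelihood increment
        w /= w.sum()
        means[t] = np.sum(w * particles)
        # Multinomial resampling to avoid weight degeneracy.
        particles = rng.choice(particles, size=n_particles, p=w)
    return means, loglik

x, y = simulate()
x_hat, ll = particle_filter(y)
print(f"simulated log-likelihood estimate: {ll:.1f}")
```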

    Multimodal estimation of distribution algorithms

    Taking advantage of the ability of estimation of distribution algorithms (EDAs) to preserve high diversity, this paper proposes a multimodal EDA. Integrated with clustering strategies for crowding and speciation, two versions of this algorithm are developed, which operate at the niche level. These two algorithms are then equipped with three distinctive techniques: 1) a dynamic cluster sizing strategy; 2) an alternating use of Gaussian and Cauchy distributions to generate offspring; and 3) an adaptive local search. The dynamic cluster sizing affords a potential balance between exploration and exploitation and reduces the sensitivity to the cluster size in the niching methods. Taking advantage of Gaussian and Cauchy distributions, we generate offspring at the niche level by alternately sampling from these two distributions, which likewise offers a potential balance between exploration and exploitation. Further, solution accuracy is enhanced through a new local search scheme conducted probabilistically around the seeds of niches, with probabilities determined self-adaptively according to the fitness values of these seeds. Extensive experiments conducted on 20 benchmark multimodal problems confirm that both algorithms achieve competitive performance compared with several state-of-the-art multimodal algorithms, as supported by nonparametric tests. In particular, the proposed algorithms are very promising for complex problems with many local optima.
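
    A minimal sketch of the niche-level offspring generation referred to above, under assumed simplifications: each niche is summarized by its per-dimension mean and spread, and even generations sample Gaussian noise while odd generations sample heavier-tailed Cauchy noise. The clustering for crowding/speciation, the dynamic cluster sizing, and the adaptive local search are omitted, so this is only an illustration of the alternating-distribution idea, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_offspring(niche, n_offspring, generation):
    """Sample offspring around one niche using a simple per-dimension model.

    `niche` is an (n, d) array of solutions in one cluster. Even generations
    use Gaussian noise (exploitation); odd generations use heavier-tailed
    Cauchy noise (exploration), mirroring the alternation described above.
    """
    mean = niche.mean(axis=0)
    spread = niche.std(axis=0) + 1e-12        # avoid a degenerate model
    d = niche.shape[1]
    if generation % 2 == 0:
        noise = rng.standard_normal((n_offspring, d))
    else:
        noise = rng.standard_cauchy((n_offspring, d))
    return mean + spread * noise

# Toy usage: a niche of ten 2-D solutions near (1, -2).
niche = rng.normal(loc=[1.0, -2.0], scale=0.1, size=(10, 2))
children = sample_offspring(niche, n_offspring=5, generation=3)
print(children)
```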

    Methods for Shape-Constrained Kernel Density Estimation

    Nonparametric density estimators are used to estimate an unknown probability density while making minimal assumptions about its functional form. Although the low reliance of nonparametric estimators on modelling assumptions is a benefit, their performance is improved if auxiliary information about the density's shape is incorporated into the estimate. Auxiliary information can take the form of shape constraints, such as unimodality or symmetry, that the estimate must satisfy. Finding the constrained estimate is usually a difficult optimization problem, however, and a consistent framework for finding estimates across a variety of problems is lacking. It is proposed to find shape-constrained density estimates by starting with a pilot estimate obtained by standard methods and subsequently adjusting its shape until the constraints are satisfied. This strategy is part of a general approach, in which a constrained estimation problem is defined by an estimator, a method of shape adjustment, a constraint, and an objective function. Optimization methods are developed to suit this approach, with a focus on kernel density estimation under a variety of constraints. Two methods of shape adjustment are examined in detail. The first is data sharpening, for which two optimization algorithms are proposed: a greedy algorithm that runs quickly but handles a limited set of constraints, and a particle swarm algorithm that is suitable for a wider range of problems. The second is the method of adjustment curves, for which it is often possible to use quadratic programming to find optimal estimates. The methods presented here can be used for univariate or higher-dimensional kernel density estimation with shape constraints. They can also be extended to other estimators, in both the density estimation and regression settings. As such, they constitute a step toward a truly general optimizer that can be used on arbitrary combinations of estimator and constraint.
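
    The data-sharpening idea can be sketched with a toy greedy rule: start from a pilot kernel density estimate and repeatedly nudge the data point farthest from the sample mean inward until the estimate evaluated on a grid is unimodal. The step size, iteration cap, and mode-counting check below are ad hoc assumptions, and no attempt is made to minimize the total perturbation, so this is only a caricature of the greedy and particle swarm optimizers developed in the thesis.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)

def n_modes(density):
    """Count interior local maxima of a densely evaluated density curve."""
    d = np.diff(density)
    return int(np.sum((d[:-1] > 0) & (d[1:] <= 0)))

def sharpen_to_unimodal(x, step=0.1, max_iter=2000):
    """Greedy data sharpening toward a unimodal KDE (may hit the iteration cap)."""
    y = x.copy()
    grid = np.linspace(x.min() - 1.0, x.max() + 1.0, 512)
    for _ in range(max_iter):
        if n_modes(gaussian_kde(y)(grid)) <= 1:
            break
        # Nudge the point farthest from the current mean a small step inward.
        i = np.argmax(np.abs(y - y.mean()))
        y[i] -= step * np.sign(y[i] - y.mean())
    return y

# Pilot data: a clearly bimodal sample.
x = np.concatenate([rng.normal(-1.5, 0.5, 100), rng.normal(1.5, 0.5, 30)])
grid = np.linspace(-5, 5, 512)
x_sharp = sharpen_to_unimodal(x)
print("modes before:", n_modes(gaussian_kde(x)(grid)))
print("modes after: ", n_modes(gaussian_kde(x_sharp)(grid)))
```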

    Multi-objective methods for history matching, uncertainty prediction and optimisation in reservoir modelling
