Arcing High Impedance Fault Detection Using Real Coded Genetic Algorithm
Safety and reliability are two of the most important aspects of electric power supply systems, and the sensitivity and robustness with which faults are detected and isolated directly influence both. Overcurrent relays are generally used to protect high-voltage feeders in distribution systems. Downed conductors, tree branches touching conductors, and failing insulators often cause high-impedance faults in overhead distribution systems. The current levels of these faults are often much smaller than the detection thresholds of traditional ground-fault detection devices, so reliable detection of high-impedance faults is a real challenge. With modern signal processing techniques, special hardware and software can significantly improve the reliability of detecting certain types of faults. This paper presents a new method for detecting High Impedance Faults (HIF) in distribution systems that uses a real-coded genetic algorithm (RCGA) to analyse the harmonics and phase angles of the fault current signals. The method discriminates HIFs by identifying specific events that occur when a HIF arises.
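The harmonic-and-phase feature extraction this abstract mentions can be illustrated with a small, self-contained sketch. The waveform, harmonic selection, and naive DFT below are illustrative assumptions, not the paper's actual pipeline, and the RCGA classification stage is omitted entirely:

```python
import cmath
import math

def harmonic_features(samples, fundamental_bin, harmonics=(1, 2, 3, 5)):
    """Return {harmonic: (magnitude, phase)} for selected harmonics of a
    sampled current waveform, via a naive DFT. This is only the feature
    extraction step; the RCGA-based discrimination is not shown."""
    n = len(samples)
    feats = {}
    for h in harmonics:
        k = h * fundamental_bin
        # DFT coefficient at bin k (normalized by n)
        coeff = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)) / n
        # factor 2 recovers the amplitude of a real sinusoid
        feats[h] = (2 * abs(coeff), cmath.phase(coeff))
    return feats

# Example: a fundamental plus a small 3rd harmonic, one cycle in 64 samples
wave = [math.sin(2 * math.pi * t / 64) + 0.2 * math.sin(6 * math.pi * t / 64)
        for t in range(64)]
feats = harmonic_features(wave, fundamental_bin=1)
```

A detector would feed such magnitude/phase features into the evolved classifier rather than inspecting them by hand.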
Genetic Algorithm and its Variants: Theory and Applications
The genetic algorithm (GA) is a popular bio-inspired optimization technique based on the concepts of natural genetics and the theory of natural selection proposed by Charles Darwin. The algorithm is built on three basic genetic operators: selection, crossover, and mutation. Depending on the form of these operators, the GA has many variants, such as the real-coded GA, binary-coded GA, sawtooth GA, micro GA, improved GA, and differential evolution. This paper discusses a few of these forms and applies them to the problems of function optimization and system identification. It makes a comparative analysis of the advantages and disadvantages of the different types of GA, with computer simulations illustrating the results. It also compares the GA technique with the incremental LMS algorithm for system identification.
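A minimal real-coded GA showing the three operators the abstract names (selection, crossover, mutation) on a toy function-optimization problem. All parameter values and operator choices here are illustrative defaults, not taken from the paper:

```python
import random

def rcga_minimize(f, bounds, pop_size=30, gens=200, pc=0.9, pm=0.1, seed=1):
    """Minimal real-coded GA sketch: tournament selection, blend
    (arithmetic) crossover, Gaussian mutation, and elitism."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if f(a) < f(b) else b

    for _ in range(gens):
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < pc:                       # blend crossover
                w = rng.random()
                child = [w * x + (1 - w) * y for x, y in zip(p1, p2)]
            else:
                child = list(p1)
            for i, (lo, hi) in enumerate(bounds):       # Gaussian mutation
                if rng.random() < pm:
                    step = rng.gauss(0, 0.1 * (hi - lo))
                    child[i] = min(hi, max(lo, child[i] + step))
            new_pop.append(child)
        new_pop[0] = min(pop, key=f)                    # elitism
        pop = new_pop
    return min(pop, key=f)

# Minimize the 2-D sphere function over [-5, 5]^2
best = rcga_minimize(lambda x: sum(v * v for v in x), [(-5, 5), (-5, 5)])
```

Swapping the crossover or encoding in this skeleton is what distinguishes the variants the abstract lists.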
Evaluation of Various Model Orders in GA-Optimized Parameter Estimation of a Toothbrush Rig System
Parameter estimation is a vital part of constructing the best model of a dynamic system. This paper analyzes the performance of toothbrush-rig parameter estimation using different model orders. The parameter estimation process is performed through system identification: an approximate mathematical model that resembles the real system is obtained by measuring the output after applying the input signal. A real-coded genetic algorithm (RCGA) is proposed as the optimization method for estimating the parameters of the dynamic system, and the best model is obtained by minimizing a mean-squared-error objective function. The performance is analyzed using three different model orders, with 10 runs for each model, against several criteria: the optimized objective function value, the execution time, and the validation process. The real-coded genetic algorithm indicates that the model of order 3 is the best model of the dynamic system, as it has the highest performance compared with the others.
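The mean-squared-error objective that such an RCGA would minimize can be sketched as follows. The third-order ARX-style model structure and the input signal are assumptions made for illustration, not the paper's actual toothbrush-rig model:

```python
def simulate_order3(theta, u):
    """Third-order discrete-time model (an assumed structure):
        y[k] = a1*y[k-1] + a2*y[k-2] + a3*y[k-3] + b*u[k-1]
    """
    a1, a2, a3, b = theta
    y = [0.0, 0.0, 0.0]
    for k in range(3, len(u)):
        y.append(a1 * y[k-1] + a2 * y[k-2] + a3 * y[k-3] + b * u[k-1])
    return y

def mse_objective(theta, u, y_measured):
    """Mean squared error between simulated and measured output --
    the fitness value an RCGA would minimize."""
    y_sim = simulate_order3(theta, u)
    return sum((ys - ym) ** 2 for ys, ym in zip(y_sim, y_measured)) / len(y_measured)

# Synthetic check: data generated from known parameters gives zero error,
# while wrong parameters give a larger error
true_theta = (0.5, -0.2, 0.1, 1.0)
u = [1.0 if k % 7 == 0 else 0.0 for k in range(100)]
y_meas = simulate_order3(true_theta, u)
err_true = mse_objective(true_theta, u, y_meas)
err_wrong = mse_objective((0.1, 0.0, 0.0, 0.5), u, y_meas)
```

Comparing model orders then amounts to repeating this optimization with simulators of order 1, 2, and 3 and comparing the resulting objective values and validation fits.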
CIXL2: A Crossover Operator for Evolutionary Algorithms Based on Population Features
In this paper we propose a crossover operator for evolutionary algorithms
with real values that is based on the statistical theory of population
distributions. The operator is based on the theoretical distribution of the
values of the genes of the best individuals in the population. The proposed
operator takes into account the localization and dispersion features of the
best individuals of the population with the objective that these features would
be inherited by the offspring. Our aim is the optimization of the balance
between exploration and exploitation in the search process. In order to test
the efficiency and robustness of this crossover, we have used a set of
functions to be optimized with regard to different criteria, such as
multimodality, separability, regularity, and epistasis. With this set of
functions we can draw conclusions as a function of the problem at hand. We
analyze the results using ANOVA and multiple-comparison statistical tests. As
an example of how our crossover can be used to solve artificial intelligence
problems, we have applied the proposed model to the problem of obtaining the
weight of each network in an ensemble of neural networks. The results obtained
exceed the performance of standard methods.
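A sketch in the spirit of the proposed operator, under the assumption that "localization and dispersion of the best individuals" means a per-gene confidence interval centred on their mean. The z value, the sampling rule, and the toy population are illustrative, not CIXL2's exact definition:

```python
import random
from statistics import mean, stdev

def ci_crossover(parents, fitnesses, n_best=5, z=1.96, rng=random):
    """Build, per gene, a confidence interval from the n_best individuals
    (localization = mean, dispersion = standard error of the mean) and
    sample the offspring gene uniformly inside that interval."""
    ranked = sorted(zip(fitnesses, parents))       # lower fitness = better
    best = [p for _, p in ranked[:n_best]]
    child = []
    for g in range(len(best[0])):
        genes = [ind[g] for ind in best]
        m, s = mean(genes), stdev(genes)
        half = z * s / len(genes) ** 0.5           # CI half-width
        child.append(rng.uniform(m - half, m + half))
    return child

rng = random.Random(0)
pop = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(20)]
fit = [sum(v * v for v in ind) for ind in pop]     # sphere fitness
child = ci_crossover(pop, fit, rng=rng)
```

The intended effect is the balance the abstract describes: offspring are biased toward the region the best individuals occupy (exploitation) but still scatter in proportion to their spread (exploration).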
Green synthesis, characterization and in-vitro bioactivities of gold nanoparticles mediated by turmeric crude extract and curcumin
In the present study, green synthesis and characterization of gold nanoparticles
(GNPs) using the ethanolic crude extract of turmeric (Tur-CE) and curcumin (Cur)
have been described. Antioxidant, anticholinesterase (anti-ChE), and antiurolithiatic
activities of the extracts and the GNPs were tested in vitro and Pearson’s correlation
analysis was performed. The optimization of GNP synthesis was performed using
Tur-CE (2 % of 10 mg/mL) and curcumin (0.5 % of 10 mg/mL) at a reactant ratio
(Tur-CE or Cur: HAuCl4) of 1:4, pH 6 and 12 of Tur-CE and curcumin, respectively.
The other conditions include the concentration of HAuCl4 (0.25 mM) and different
reaction temperatures (25, 40, 55 and 70 °C). FESEM analysis of GNPs synthesized
at 25 °C by using Tur-CE (TP6.25) and curcumin (CP12.25) revealed the size range
of 11-40 and 31-100 nm, respectively. The highest antioxidant activity was recorded
for TP6.25, that is, 82.60, 79.60 and 12.22 % for ABTS, DPPH and FRAP,
respectively, and anti-ChE activity at 75.10 and 74.33 % for AChE and BChE,
respectively. However, the maximum % I for ABTS, DPPH, and FRAP, that is,
87.20, 86.00 % and 19.36 µg FSE, respectively, was shown by positive control
ascorbic acid, and for AChE and BChE inhibition, 86.69 and 89.30 %, respectively,
by galanthamine. TP6.25 also showed good antiurolithiatic activity as indicated by
absorbance value of 0.198 and 0.164 in nucleation and crystallization assays,
respectively. Pearson correlation revealed a positive correlation between antioxidant
and anti-ChE, as well as antioxidant and antiurolithiatic activities. The GNPs were
haemocompatible as TP6.25 and CP12.25 induced 1.172 and 0.763 % haemolysis,
respectively, at the highest concentration of 500 µL in haemolysis assay compared to
99.85 % haemolysis by the positive control. In conclusion, Tur-GNPs and Cur-GNPs
possess good antioxidant, anticholinesterase, and antiurolithiatic properties and,
being non-haemolytic as well, may be worth further exploration as therapeutic
antioxidant, anticholinesterase, and antiurolithiatic agents in neurodegeneration
and urolithiasis.
Channel Equalization using GA Family
High-speed data transmission over communication channels distorts the transmitted signals in both amplitude and phase due to the presence of Inter-Symbol Interference (ISI). Other impairments like thermal noise, impulse noise, and crosstalk cause further distortion of the received symbols. Adaptive equalization of the digital channel at the receiver removes or reduces the effects of such ISI and attempts to recover the transmitted symbols. Basically, an equalizer is an inverse filter placed at the front end of the receiver; its transfer function is the inverse of the transfer function of the associated channel. The Least-Mean-Square (LMS), Recursive-Least-Square (RLS), and Multilayer Perceptron (MLP) based adaptive equalizers aim to minimize the ISI present in the digital communication channel. These are gradient-based learning algorithms, and therefore there is a possibility that during training of the equalizers, their weights do not reach their optimum values due to …
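The LMS baseline the abstract refers to can be sketched as a transversal filter whose taps follow the classic gradient update w ← w + μ·e·x. The toy channel, step size, and tap count below are illustrative assumptions:

```python
import random

def lms_equalize(received, desired, n_taps=5, mu=0.05):
    """LMS adaptive equalizer sketch: a transversal filter whose taps are
    updated to shrink the error between its output and the known
    training symbols. Returns the final taps and the squared-error trace."""
    w = [0.0] * n_taps
    errors = []
    for k in range(n_taps - 1, len(received)):
        x = received[k - n_taps + 1:k + 1][::-1]        # tap-delay line
        y = sum(wi * xi for wi, xi in zip(w, x))        # equalizer output
        e = desired[k] - y                              # training error
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]  # LMS gradient step
        errors.append(e * e)
    return w, errors

# Toy channel h = [1.0, 0.4] introduces ISI; train on random +/-1 symbols
rng = random.Random(42)
sym = [rng.choice((-1.0, 1.0)) for _ in range(2000)]
rx = [sym[k] + 0.4 * sym[k - 1] for k in range(1, len(sym))]
w, errs = lms_equalize(rx, sym[1:])
```

A GA-family equalizer replaces this gradient step with population-based search over the taps, which is how it can escape the local minima the abstract alludes to.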
A Taxonomy for the Crossover Operator for Real-Coded Genetic Algorithms: An Experimental Study
The main real-coded genetic algorithm (RCGA) research effort has been spent on developing
efficient crossover operators. This study presents a taxonomy for this operator that groups its
instances in different categories according to the way they generate the genes of the offspring
from the genes of the parents. The empirical study of representative crossovers of all the
categories reveals concrete features that allow the crossover operator to have a positive influence
on RCGA performance. These features may be useful for designing more effective crossover models.
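As one concrete instance of the kind of operator such a taxonomy covers, BLX-alpha draws each offspring gene uniformly from the parents' interval extended by a fraction alpha on each side, so exploration grows with parent diversity. The alpha value and parents below are illustrative:

```python
import random

def blx_alpha(p1, p2, alpha=0.5, rng=random):
    """BLX-alpha crossover: per gene, sample uniformly from
    [min - alpha*span, max + alpha*span], where span = max - min
    of the two parent genes. alpha=0.5 is a commonly used default."""
    child = []
    for x, y in zip(p1, p2):
        lo, hi = min(x, y), max(x, y)
        span = hi - lo
        child.append(rng.uniform(lo - alpha * span, hi + alpha * span))
    return child

child = blx_alpha([0.0, 2.0], [1.0, 4.0], rng=random.Random(3))
```

In taxonomy terms, operators differ in exactly this rule: whether offspring genes come from inside the parents' interval, from an extended neighbourhood as here, or from population-level statistics.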
Implementation, integration, and optimization of a fuzzy foreground segmentation system
Foreground segmentation is often an important preliminary step for various video processing systems. By improving the accuracy of the foreground segmentation process, the overall performance of a video processing system can improve as well. This work introduces a Fuzzy Foreground Segmentation System (FFSS) that uses Mamdani-type Fuzzy Inference Systems (FIS) to control pixel-level accumulated statistics. The error of the FFSS is quantified by comparing its output with hand-segmented ground-truth images from a set of image sequences that specifically model canonical problems of foreground segmentation. Optimization of the FFSS parameters is achieved using a Real-Coded Genetic Algorithm (RCGA). Additionally, multiple central-composite designed experiments are used to analyze the performance of the RCGA under selected schemes and their respective parameters. The RCGA schemes and parameters are chosen so as to reduce variation and execution time for a set of known multi-dimensional test functions, which represent assorted function landscapes. To demonstrate the accuracy of the FFSS and underscore the importance of the foreground segmentation process, the system is applied to real-time human detection with a single-camera security system. The Human Detection System (HDS) is composed of an IP camera networked to multiple heterogeneous computers for distributed parallel processing. The implementation of the HDS adheres to a System of Systems (SoS) architecture, which standardizes data/communication, reduces overall complexity, and maintains a high level of interoperability.
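The pixel-level accumulated statistics such a system controls can be approximated, very roughly, by an exponential running average per pixel. The fixed learning rate and threshold below are crude stand-ins for the quantities the paper's Mamdani FIS adapts, and the 2x2 "frames" are purely illustrative:

```python
def update_background(bg, frame, alpha=0.05):
    """Pixel-level accumulated statistic: exponential running average of
    past frames. alpha is a fixed learning rate here, whereas the FFSS
    described in the abstract adapts the update via fuzzy inference."""
    return [[(1 - alpha) * b + alpha * f for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

def foreground_mask(bg, frame, thresh=30.0):
    """Mark pixels whose deviation from the background model exceeds
    a fixed threshold (again a stand-in for a fuzzy decision)."""
    return [[abs(f - b) > thresh for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

# Toy 2x2 scene of value 100; one pixel jumps to 200 (foreground)
bg = [[100.0, 100.0], [100.0, 100.0]]
frame = [[100.0, 200.0], [100.0, 100.0]]
mask = foreground_mask(bg, frame)
bg = update_background(bg, frame)
```

An RCGA would then tune parameters like alpha and thresh (or, in the paper's case, the fuzzy membership functions governing them) against hand-segmented ground truth.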