
    Particle swarm optimization for solving thesis defense timetabling problem

    The thesis defense timetabling problem is a fascinating and original NP-hard optimization problem. It involves assigning participants to defense sessions, composing the relevant committees, satisfying the constraints, and optimizing the objectives. This study defines a problem formulation that applies to Universitas Multimedia Nusantara (UMN) and uses the particle swarm optimization (PSO) algorithm to solve it. As a proof of concept and viability, the proposed method is implemented in a web-based platform using Python and Flask. The implementation is tested and evaluated on real-world instances. The results show that the fastest timetable generation takes 0.18 seconds and the slowest takes 21.88 minutes for 25 students and 18 department members, without any violation of the hard constraints. The overall score of the EUCS evaluation of the application is 4.3 out of 6.
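
    The abstract does not reproduce the UMN formulation, but the general approach can be illustrated. Below is a minimal sketch, assuming a random-key encoding in which each student's continuous position value is decoded into a defense session index; the instance sizes, PSO parameters and toy capacity-violation objective are hypothetical stand-ins, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

N_STUDENTS, N_SESSIONS = 25, 10      # hypothetical instance size
SWARM, ITERS = 30, 200
W, C1, C2 = 0.7, 1.5, 1.5            # inertia and acceleration weights

def decode(x):
    # Map each continuous gene in [0, 1) to a defense session index.
    return (x * N_SESSIONS).astype(int).clip(0, N_SESSIONS - 1)

def cost(x):
    # Placeholder objective: penalize overfull sessions (a stand-in
    # for the paper's hard and soft constraints).
    counts = np.bincount(decode(x), minlength=N_SESSIONS)
    capacity = 3
    return np.maximum(counts - capacity, 0).sum()

pos = rng.random((SWARM, N_STUDENTS))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(ITERS):
    # Canonical PSO velocity/position update.
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = W * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    pos = (pos + vel).clip(0.0, 1.0 - 1e-9)
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("best assignment:", decode(gbest), "violations:", pbest_cost.min())
```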

    Priority Scheduling Implementation for Exam Schedule

    Scheduling is a long-standing, common problem, and many algorithms have been created to solve it. Some offer flexibility in terms of constraints and complex operations, but that complexity means they demand substantial computational resources and execution time. A web application platform imposes restrictions on both, so a complex algorithm is not well suited to it. Priority scheduling is a scheduling algorithm based on a priority queue: every schedule slot produces a queue built from the constraints, each constraint carries a different weight, and a candidate's weight in the queue represents its priority. The result is a lightweight algorithm that requires little computation and execution time. The exam schedule is one of many scheduling problems in educational institutions, and many institutions use the web, accessible from anywhere, as their main system platform. Given the web platform's restrictions, priority scheduling is a suitable algorithm for it. In this study, the author implements a priority scheduling algorithm for a scheduling case on a website platform and shows that this solution can be an alternative for solving scheduling cases with low computational resources.
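
    As an illustration of the idea, here is a minimal sketch of priority scheduling with a weighted constraint queue using Python's heapq; the constraint names, weights and slot fields are hypothetical stand-ins for a real exam system, not the paper's implementation.

```python
import heapq

# Hypothetical constraint weights: lower total weight = higher priority.
WEIGHTS = {"room_too_small": 5, "late_slot": 1}

def violation_score(exam, slot):
    # Placeholder checks; a real system would inspect rosters and rooms.
    score = 0
    if slot["hour"] >= 16:
        score += WEIGHTS["late_slot"]
    if exam["students"] > slot["capacity"]:
        score += WEIGHTS["room_too_small"]
    return score

def schedule(exams, slots):
    timetable = {}
    for exam in exams:
        # Build a priority queue of still-free slots for this exam.
        queue = [(violation_score(exam, s), i) for i, s in enumerate(slots)
                 if i not in timetable.values()]
        heapq.heapify(queue)
        _, best = heapq.heappop(queue)   # lowest-weight slot wins
        timetable[exam["name"]] = best
    return timetable

exams = [{"name": "Math", "students": 40}, {"name": "Physics", "students": 25}]
slots = [{"hour": 9, "capacity": 30}, {"hour": 17, "capacity": 50}]
print(schedule(exams, slots))  # e.g. {'Math': 1, 'Physics': 0}
```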

    Global swarm optimization algorithms with hybrid search strategies

    University of Technology, Sydney. Faculty of Engineering and Information Technology.
    In recent decades, nature-inspired global optimization algorithms have become an important research topic. The aim is to find optimal solutions for given problems without knowing the characteristics of the solutions beforehand. In particular, Swarm Intelligence is a population-based meta-heuristic methodology belonging to Soft Computing. The collective behaviour of swarm members is often inspired by biological systems and behaviours in nature: Particle Swarm Optimization is inspired by bird flocking, while evolutionary algorithms such as the Genetic Algorithm and Differential Evolution are inspired by biological evolution. These algorithms iteratively improve the discovered solutions by employing specially designed formulae to synthesize new solution candidates. However, an algorithm sometimes performs poorly on certain problems, possibly because it is not specialized for that type of problem, because its control parameters are selected inappropriately, or because the evaluation is performed inappropriately. To address these issues, this research designs swarm optimization algorithms that operate in the black-box scenario, where objective functions are the only direct source of information. Different optimization methods are specialized for different types of problems but may not achieve good results in other problem classes. Hybridizing different algorithms and incorporating their knowledge may combine the strengths of different optimization approaches and cancel out their weaknesses. Two swarm optimization algorithms are therefore developed in this manner, and their optimization performance is verified on public benchmark mathematical functions. The methods proposed in the thesis are: 1) Simplified Swarm Optimization with Differential Evolution mutation strategies (SSODE) and 2) Macroscopic Indeterminacy Swarm Optimization (MISO). SSODE is an experimental method developed to verify the hybrid principle proposed in the thesis: it hybridizes the Simplified Swarm Optimization (SSO) algorithm structure with multiple mutation strategies from DE. The experimental results of SSODE indicate that hybridizing different algorithms and mutation strategies achieves general efficiency. Continuing the research on SSODE, MISO presents a well-structured memetic algorithm with a new evaluation schema. Substantial experiments have shown that the performance of MISO is significantly superior to many well-known algorithms on many objective functions.
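
    The SSODE internals are not given in the abstract; the sketch below only illustrates the DE ingredient it borrows, the classic DE/rand/1 mutation with binomial crossover inside a simple population loop, applied to a benchmark sphere function. Parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
DIM, POP, ITERS, F, CR = 10, 40, 500, 0.5, 0.9

def sphere(x):
    # Benchmark objective used for illustration only.
    return float(np.sum(x * x))

pop = rng.uniform(-5, 5, (POP, DIM))
fit = np.array([sphere(p) for p in pop])

for _ in range(ITERS):
    for i in range(POP):
        # DE/rand/1 mutation: combine three distinct random members.
        a, b, c = pop[rng.choice([j for j in range(POP) if j != i],
                                 3, replace=False)]
        mutant = a + F * (b - c)
        # Binomial crossover with one guaranteed mutant dimension.
        mask = rng.random(DIM) < CR
        mask[rng.integers(DIM)] = True
        trial = np.where(mask, mutant, pop[i])
        # Greedy selection: keep the trial only if it improves.
        if (f := sphere(trial)) < fit[i]:
            pop[i], fit[i] = trial, f

print("best fitness:", fit.min())
```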

    Orthogonal learning particle swarm optimization

    Particle swarm optimization (PSO) relies on its learning strategy to guide its search direction. Traditionally, each particle utilizes its historical best experience and its neighborhood's best experience through linear summation. Such a learning strategy is easy to use but inefficient when searching in complex problem spaces. Hence, designing learning strategies that can utilize previous search information (experience) more efficiently has become one of the most salient and active PSO research topics. In this paper, we propose an orthogonal learning (OL) strategy for PSO that discovers more useful information lying in the above two experiences via orthogonal experimental design. We name this PSO orthogonal learning particle swarm optimization (OLPSO). The OL strategy guides particles to fly in better directions by constructing a more promising and efficient exemplar, and it can be applied to PSO with any topological structure. In this paper, it is applied to both the global and local versions of PSO, yielding the OLPSO-G and OLPSO-L algorithms, respectively. The new learning strategy and algorithms are tested on a set of 16 benchmark functions and compared with other PSO algorithms and some state-of-the-art evolutionary algorithms. The experimental results illustrate the effectiveness and efficiency of the proposed learning strategy and algorithms: OLPSO significantly improves the performance of PSO, offering faster global convergence, higher solution quality, and stronger robustness.
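
    The core of the OL strategy is the exemplar construction: each dimension of the exemplar is taken from either the particle's own best or its neighborhood best, with a two-level orthogonal array deciding which combinations to evaluate and a factor analysis predicting one further combination. Below is a minimal sketch of that step, assuming a standard parity-based L_{2^k} orthogonal array and a toy sphere objective; it is an illustration of the principle, not the paper's exact procedure.

```python
import numpy as np

def orthogonal_array(n_factors):
    # Two-level orthogonal array built from binary parities:
    # entry (r, j) is the parity of popcount(r & (j + 1)).
    k = 1
    while 2 ** k - 1 < n_factors:
        k += 1
    rows = 2 ** k
    return np.array([[bin(r & (j + 1)).count("1") % 2
                      for j in range(n_factors)] for r in range(rows)])

def build_exemplar(pbest, gbest, f):
    # Evaluate OA-selected mixtures of pbest (level 0) and gbest
    # (level 1), then add the combination predicted by factor analysis.
    dim = len(pbest)
    oa = orthogonal_array(dim)
    trials = np.where(oa == 0, pbest, gbest)
    costs = np.array([f(t) for t in trials])
    # Factor analysis: per dimension, keep the level with lower mean cost.
    predicted = np.array([pbest[j] if costs[oa[:, j] == 0].mean()
                          <= costs[oa[:, j] == 1].mean() else gbest[j]
                          for j in range(dim)])
    candidates = np.vstack([trials, predicted])
    return candidates[np.argmin([f(c) for c in candidates])]

sphere = lambda x: float(np.sum(x * x))
pb = np.array([1.0, -2.0, 0.5, 3.0])
gb = np.array([0.2, 0.1, -1.0, 0.0])
print(build_exemplar(pb, gb, sphere))
```

    The exemplar returned here would then replace the separate pbest/gbest terms in the velocity update, which is what allows the strategy to be attached to any PSO topology.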

    Crypt Edge Detection Using PSO, Label Matrix and Bi-Cubic Interpolation for Better Iris Recognition (PSOLB)

    Iris identification is an automatic system for recognising an individual in biometric applications. The human iris is an internal organ that can be accessed from an external view of the body. Moreover, the structure of the iris is formed in a completely random manner and has unique features such as crypts, furrows, collarettes, the pupil, freckles, and blotches. In fact, no two iris patterns are the same. The iris structure is stable, which means the location of the iris features is permanent at a certain point. Nevertheless, the shape of iris features changes slowly due to several factors, including aging, surgery, growth, emotion and dietary habits. Recently, there has been renewed interest in iris feature detection. The Gabor filter, cross entropy, support vector, and Canny edge detection are methods that produce iris codes in a binary-code representation. However, problems have occurred in iris recognition because low-quality iris images result from blurriness, indoor or outdoor settings, and camera specifications. Failure was detected in 21% of intra-class comparison cases taken at intervals of three to six months. The mismatch or False Rejection Rate (FRR) in iris recognition is still alarmingly high, and a higher FRR also raises the Equal Error Rate (EER). The main reason for high FRR and EER values is that changes in the amount of light entering the iris change the size of the unique features in the iris. One solution to this problem is a technique or algorithm that automatically detects the unique features. Therefore, a new model called Crypt Edge Detection is introduced, which combines PSO, a label matrix, and bi-cubic interpolation for iris recognition (PSOLB) to solve the problem of detecting iris features. In this research, the unique feature known as crypts was chosen for its accessibility and sustainability. Feature detection is performed using particle swarm optimisation (PSO) to select the best iris texture among the unique iris features by finding the pixel values within the range of the selected features. Meanwhile, the label matrix detects the edge of the crypt, and the bi-cubic interpolation technique creates sharp and refined crypt images. To evaluate the proposed approach, FAR and FRR are measured using the Chinese Academy of Sciences' Institute of Automation (CASIA) database of high-quality images. For the CASIA version 3 image database, the crypt feature gives an FRR of 21.83% and an FAR of 78.17%. The experiment indicates that with PSOLB, the intersection between FAR and FRR produces an Equal Error Rate (EER) of 0.28%, lower than the previous value of 0.38%. Thus, PSOLB has the advantage of adapting to unique iris features and using the information in iris template features to identify the user. The outcome of this new approach is reduced EER rates, since lower EER rates produce more accurate detection of unique features. In conclusion, the contribution of PSOLB brings an innovation to the extraction process in biometric technology and is beneficial to the community.
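
    As a rough sketch of the pipeline stages named above: an intensity-range threshold stands in for the PSO-selected pixel range, scipy.ndimage.label plays the role of the label matrix, and a cubic-spline zoom (order=3) stands in for bi-cubic interpolation. The image and range values below are synthetic placeholders, not CASIA data.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
iris = rng.integers(0, 256, (64, 64)).astype(float)  # stand-in iris region

# Stand-in for the PSO step: assume the swarm converged on an intensity
# range that characterizes crypts (hypothetical values).
low, high = 40.0, 90.0
mask = (iris >= low) & (iris <= high)

# Label matrix: connected components of the thresholded crypt candidates.
labels, n_regions = ndimage.label(mask)
print("candidate crypt regions:", n_regions)

# Keep the largest region and upscale it with cubic interpolation to
# obtain a sharper crypt image for matching.
sizes = ndimage.sum(mask, labels, index=range(1, n_regions + 1))
largest = (labels == (np.argmax(sizes) + 1)).astype(float)
sharp = ndimage.zoom(largest, 4, order=3)
print("upscaled shape:", sharp.shape)
```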

    Electrocardiogram and hybrid support vector algorithms for detection of hypoglycaemia in patients with type 1 diabetes

    University of Technology, Sydney. Faculty of Engineering and Information Technology.
    Hypoglycaemia is the most acute and common complication of type 1 diabetes. Physiological changes occur when the blood glucose concentration falls to a certain level, and a number of studies have demonstrated that hypoglycaemia causes electrocardiographic (ECG) alterations. The serious harmful effects of hypoglycaemia on the body motivate research groups to find an optimal strategy to detect it. Hypoglycaemia can be detected by puncturing the skin to measure the blood glucose level. However, this method is unsuitable, as frequent puncturing may produce anxiety in patients, and periodic puncturing is difficult to conduct, not to mention inconvenient, while the patient is sleeping. Therefore, a continuous and non-invasive technique can be considered for hypoglycaemia detection. Several techniques have been reported, such as reverse iontophoresis and absorption spectroscopy. Another approach to hypoglycaemia detection is based on its physiological effects on various parts of the body such as the brain, heart and skin. The effects on the brain are studied by investigating electroencephalography (EEG) features. The effects on the heart include alteration of ECG parameters such as heart rate, QT intervals and T-wave amplitude. Several algorithms have been developed to process ECG parameters for hypoglycaemia detection, including neural-network and fuzzy-system based intelligent algorithms. Hybrid systems have also been developed, such as fuzzy neural networks and genetic-algorithm-based multiple regression with fuzzy inference systems. So far, hypoglycaemia detection systems based on physiological effects still require extensive validation before they can be adopted in worldwide clinical practice. The research in this thesis introduces several ECG parameters, especially ones related to the repolarization phase, which could contribute to hypoglycaemia detection. Furthermore, this research aims to introduce novel computational intelligence techniques for hypoglycaemia detection based on ECG parameters. A support vector machine (SVM) is the first algorithm introduced for hypoglycaemia detection in this research. The second algorithm is a hybrid of an SVM with particle swarm optimization (PSO), called the SSVM algorithm, and is intended to improve on the first. PSO is an evolutionary technique based on the movement of swarms; it is employed to optimize the SVM parameters so that the SVM performs well for hypoglycaemia detection. The third algorithm improves on the second by including a fuzzy inference system (FIS); it involves an SVM, an FIS and PSO, and is called SFSVM. The FIS processes some of the ECG parameters to achieve better hypoglycaemia detection. An FIS is an effective intelligent system that employs fuzzy logic and fuzzy set theory; its framework is based on the concepts of fuzzy set theory, fuzzy if-then rules, and fuzzy reasoning. In addition, the proposed algorithms are compared with other algorithms. All the algorithms are investigated with clinical electrocardiographic data collected from a hypoglycaemia study of type 1 diabetic patients. This study shows that the selected ECG parameters in hypoglycaemia differ significantly from those in non-hypoglycaemia (p < 0.01). This difference may reflect the fact that the ECG parameters belong to the repolarization phase, which is prolonged during hypoglycaemia, implying that they are important parameters that can contribute to hypoglycaemia detection. The ECG parameters are therefore used as inputs for hypoglycaemia detection in this study. The results also show that the hypoglycaemia detection strategy using the SSVM performs better than the one using the SVM (80.04% vs. 73.63% in terms of geometric mean). Furthermore, the SFSVM achieves higher sensitivity than the SSVM (87.22% vs. 80.45%) with comparable specificity (79.41% vs. 79.64%). In summary, the SFSVM performs better than the other two algorithms (SVM and SSVM), with acceptable sensitivity, specificity and geometric mean of 87.22%, 79.41% and 83.22%, respectively.
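
    A minimal sketch of the SSVM idea, PSO tuning an SVM's C and gamma, assuming scikit-learn's SVC and synthetic features in place of the private clinical ECG data; the swarm size, iteration count and search bounds are hypothetical.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(3)
# Synthetic stand-in for ECG-derived features.
X, y = make_classification(n_samples=300, n_features=6, random_state=0)

def fitness(params):
    # Particles search log10(C) and log10(gamma); minimize negative accuracy.
    c, gamma = 10.0 ** params[0], 10.0 ** params[1]
    return -cross_val_score(SVC(C=c, gamma=gamma), X, y, cv=3).mean()

SWARM, ITERS = 10, 15
pos = rng.uniform(-3, 3, (SWARM, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmin()].copy()

for _ in range(ITERS):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = (pos + vel).clip(-3, 3)
    fits = np.array([fitness(p) for p in pos])
    better = fits < pbest_fit
    pbest[better], pbest_fit[better] = pos[better], fits[better]
    gbest = pbest[pbest_fit.argmin()].copy()

print("best log10(C), log10(gamma):", gbest, "CV accuracy:", -pbest_fit.min())
```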

    Ensemble rapid centroid estimation: a semi-stochastic consensus particle swarm approach for large scale cluster optimization

    University of Technology Sydney. Faculty of Engineering and Information Technology.
    This thesis details rigorous theoretical and empirical analyses of related work in the clustering literature based on Particle Swarm Optimization (PSO) principles. In particular, we detail the discovery of disadvantages in Van der Merwe and Engelbrecht's PSO clustering, Cohen and de Castro's Particle Swarm Clustering (PSC), Szabo's modified PSC (mPSC) and Szabo's Fuzzy PSC (FPSC). We validate, both theoretically and empirically, that Van der Merwe and Engelbrecht's PSO clustering algorithm is not significantly better than conventional k-means, and we propose that under random initialization the performance of their algorithm diminishes exponentially as the number of classes or dimensions increases. We show that the PSC, mPSC, and FPSC algorithms suffer from significant complexity issues that do not translate into performance: their cognitive and social parameters have a negligible effect on convergence, and the algorithms generalize to k-means, retaining all of its characteristics including the most severe, the curse of initial position. Furthermore, we observe that the three algorithms, although proposed under varying names and in different time frames, behave similarly to the original PSC. This thesis analyzes, both theoretically and empirically, the strengths and limitations of our proposed semi-stochastic particle swarm clustering algorithm, Rapid Centroid Estimation (RCE), the self-evolutionary Ensemble RCE (ERCE), and Consensus Engrams, which are developed mainly to address the fundamental issues in PSO clustering and the PSC family. The algorithms extend the scalability, applicability, and reliability of earlier approaches to handle large-scale non-convex cluster optimization in quasilinear time and space complexity. This thesis establishes the fundamentals, much surpassing those outlined in our published manuscripts.
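
    The formulation under analysis can be shown briefly: in the PSO clustering scheme attributed to Van der Merwe and Engelbrecht, each particle encodes K flattened centroids and its fitness is the quantization error, while the velocity update is the canonical one shown in the earlier sketches. The data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)
# Three synthetic 2-D clusters standing in for real data.
X = np.vstack([rng.normal(m, 0.3, (50, 2)) for m in ((0, 0), (3, 3), (0, 3))])
K = 3

def quantization_error(flat_particle):
    # Mean distance from each point to its nearest centroid: the fitness
    # used in the PSO clustering formulation the thesis analyzes.
    centroids = flat_particle.reshape(K, 2)
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return d.min(axis=1).mean()

particle = rng.uniform(X.min(), X.max(), K * 2)  # K centroids, flattened
print("fitness of a random particle:", quantization_error(particle))
```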

    Production quantity estimation using an improved artificial neural network

    In today's competitive market, inventory management is one of the factors that affect business performance. This encourages most industries to manage inventory efficiently by making effective inventory replenishment decisions. For instance, industries mostly decide on the next inventory replenishment based on their most recent production history. However, such a decision cannot simply be carried over to the next production run because of uncertain and fluctuating conditions, and a poor production decision increases the business's costs. Hence, this research proposes a model based on Neural Network Back Propagation (NNBP) to estimate production quantity. The model is designed around the input variables that affect the production quantity decision: demand, setup costs, production, material costs, holding costs and transportation costs. The performance of the NNBP is analyzed using the Root Mean Square Error (RMSE) and Mean Absolute Error (MAE). To improve the performance of the NNBP, optimization techniques, namely the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO), are hybridized with the neural network model to form the Hybrid Neural Network Genetic Algorithm (HNNGA) and Hybrid Neural Network Particle Swarm Optimization (HNNPSO) models, respectively. These techniques are used to optimize the attribute weighting of the NNBP model. The proposed models were examined on a private dataset collected from an iron casting manufacturer in Klaten, Indonesia. All proposed models were validated through both cross-validation and statistical analysis. Cross-validation is a common technique used to prevent overfitting by dividing the data into training data and test data, while the statistical analysis considers a normality test on the estimation error and the significance of the differences among the proposed models. Experimental results show that the HNNGA and HNNPSO yield smaller measurement errors and thus improve the performance of the NNBP model. The proposed model not only updates the original instrument but is also applicable and beneficial for industry, particularly for making effective inventory replenishment decisions on production quantity.
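
    A minimal sketch of the attribute-weighting idea, assuming scikit-learn's MLPRegressor as the backprop network and synthetic data in place of the private foundry dataset; a GA or PSO would evolve the weight vector whose fitness is evaluated here, and the RMSE/MAE metrics match those named in the abstract.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
# Synthetic stand-in for the six inputs (demand, setup, production,
# material, holding and transportation costs).
X, y = make_regression(n_samples=200, n_features=6, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def fitness(attr_weights):
    # Scale each input attribute by its weight, train the backprop
    # network, and report RMSE and MAE on the held-out split.
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000,
                         random_state=0)
    model.fit(X_tr * attr_weights, y_tr)
    err = model.predict(X_te * attr_weights) - y_te
    return np.sqrt(np.mean(err ** 2)), np.mean(np.abs(err))

# A GA or PSO would evolve this weight vector; here we evaluate one.
weights = rng.uniform(0.0, 1.0, 6)
print("RMSE, MAE:", fitness(weights))
```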

    Optimization of milling parameters using ant colony optimization

    In process planning for conventional milling, selecting reasonable milling parameters is necessary to satisfy requirements involving machining economics, quality and safety. This study develops optimization procedures based on Ant Colony Optimization (ACO) and demonstrates them on the optimization of machining parameters for milling operations. The machining parameters in milling operations consist of cutting speed, feed rate and depth of cut, and they significantly impact the cost, productivity and quality of machined parts. The developed strategy is based on the criterion of maximizing the production rate. This study describes the development and use of an optimization system that determines optimum machining parameters for milling operations. An ACO simulation is developed in MATLAB to optimize the milling parameters so as to maximize the production rate. The references are taken from related articles, journals and books. An example applying the ant colony algorithm to the problem is presented at the end of the paper to give a clear picture of the application of the system and its efficiency. The results obtained from the simulation are compared with other methods, namely the Genetic Algorithm (GA) and the Linear Programming Technique (LPT), for validation. The ACO-based simulation was successfully developed, and the parameter values that maximize the production rate were obtained from the simulation.
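
    The paper's MATLAB simulation is not reproduced here; as a generic illustration of the approach, the sketch below runs ACO over discretized parameter levels, with pheromone trails biasing each ant's choice of cutting speed, feed rate and depth of cut. The levels, the toy production-rate model and the trail-update rule are hypothetical simplifications.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical discretized levels for the three milling parameters.
LEVELS = {
    "speed": np.linspace(60, 180, 10),    # cutting speed, m/min
    "feed":  np.linspace(0.05, 0.4, 10),  # feed rate, mm/tooth
    "depth": np.linspace(0.5, 4.0, 10),   # depth of cut, mm
}

def production_rate(speed, feed, depth):
    # Placeholder objective: material removal rate penalized past a
    # hypothetical machine limit (not the paper's exact model).
    mrr = speed * feed * depth
    return mrr if mrr < 100 else 100 - (mrr - 100)

ANTS, ITERS, RHO = 20, 100, 0.1
tau = {k: np.ones(len(v)) for k, v in LEVELS.items()}  # pheromone trails

best_val, best_idx = -np.inf, None
for _ in range(ITERS):
    for _ant in range(ANTS):
        # Each ant picks one level per parameter, biased by pheromone.
        idx = {k: rng.choice(len(v), p=tau[k] / tau[k].sum())
               for k, v in LEVELS.items()}
        val = production_rate(**{k: LEVELS[k][i] for k, i in idx.items()})
        if val > best_val:
            best_val, best_idx = val, idx
    # Evaporate, then reinforce the best-so-far trail.
    for k in tau:
        tau[k] *= 1 - RHO
        tau[k][best_idx[k]] += 1.0

print("best parameters:", {k: LEVELS[k][i] for k, i in best_idx.items()})
```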

    Implementing modified particle swarm optimization method to solve economic load dispatch problem

    Economic Load Dispatch (ELD) is one of the important optimization tasks that provide an economic operating condition for power systems. In this work, a Modified Particle Swarm Optimization (PSO), an efficient and reliable evolutionary approach, is proposed to solve the constrained economic load dispatch problem. The proposed method determines the output power of all generation units so that the total constrained cost function is minimized. In this report, a piecewise quadratic function is used to represent the fuel cost of each generation unit, and the B-coefficient method is used to model transmission losses. The feasibility of the proposed Modified PSO is demonstrated on four power system test cases consisting of 3, 6, 15 and 40 generation units. The Modified PSO results are compared with Standard PSO (SPSO), Genetic Algorithm (GA) and Quadratic Programming (QP) based approaches. These results reveal that the proposed method obtains higher-quality solutions while offering mathematical simplicity, fast convergence, and robustness in solving hard economic load dispatch problems.
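
    A minimal sketch of PSO on a small ELD instance, assuming hypothetical quadratic fuel-cost coefficients, a toy B-coefficient loss matrix, and a penalty term for the power-balance constraint; only the canonical PSO update is shown, not the report's modified variant.

```python
import numpy as np

rng = np.random.default_rng(7)

# Three-unit test case with hypothetical coefficients: the fuel cost of
# unit i is a_i*P_i^2 + b_i*P_i + c_i ($/h), one quadratic segment of the
# report's piecewise model.
A = np.array([0.008, 0.009, 0.007])
B = np.array([7.0, 6.3, 6.8])
C = np.array([200.0, 180.0, 140.0])
PMIN = np.array([10.0, 10.0, 10.0])
PMAX = np.array([85.0, 80.0, 70.0])
DEMAND = 150.0
BLOSS = np.full((3, 3), 1e-5) + np.diag([2e-5, 2e-5, 2e-5])  # toy B-matrix

def cost(P):
    fuel = np.sum(A * P**2 + B * P + C)
    loss = P @ BLOSS @ P                 # transmission losses via B-matrix
    balance = abs(P.sum() - DEMAND - loss)
    return fuel + 1e4 * balance          # penalty for power-balance violation

SWARM, ITERS = 30, 300
pos = rng.uniform(PMIN, PMAX, (SWARM, 3))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(ITERS):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, PMIN, PMAX)  # respect generator limits
    costs = np.array([cost(p) for p in pos])
    better = costs < pbest_cost
    pbest[better], pbest_cost[better] = pos[better], costs[better]
    gbest = pbest[pbest_cost.argmin()].copy()

print("dispatch (MW):", gbest.round(2), "cost:", round(pbest_cost.min(), 2))
```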