Enhanced artificial bee colony-least squares support vector machines algorithm for time series prediction
Over the past decades, Least Squares Support Vector Machines (LSSVM) have been widely utilized in prediction tasks across various application domains. Nevertheless, the existing literature shows that the capability of LSSVM is highly dependent on the values of its hyper-parameters, namely the regularization parameter and the kernel parameter, which greatly affect the generalization of LSSVM in prediction tasks. This study proposes a hybrid algorithm, based on the Artificial Bee Colony (ABC) and LSSVM, that consists of three variants: ABC-LSSVM, lvABC-LSSVM and cmABC-LSSVM. The lvABC algorithm is introduced to overcome the local optima problem by enriching the searching behaviour using Levy mutation. On the other hand, the cmABC algorithm, which incorporates conventional mutation, addresses the over-fitting or under-fitting problem. The combination of the lvABC and cmABC algorithms, later introduced as the Enhanced Artificial Bee Colony–Least Squares Support Vector Machine (eABC-LSSVM), is realized in the prediction of non-renewable natural resource commodity prices. Upon completion of data collection and pre-processing, the eABC-LSSVM algorithm was designed and developed. The predictability of eABC-LSSVM is measured using five statistical metrics: Mean Absolute Percentage Error (MAPE), prediction accuracy, symmetric MAPE (sMAPE), Root Mean Square Percentage Error (RMSPE) and Theil's U. Results showed that eABC-LSSVM possesses a lower prediction error rate compared to eight hybridization models of LSSVM and Evolutionary Computation (EC) algorithms. In addition, the proposed algorithm was compared to single prediction techniques, namely Support Vector Machines (SVM) and Back Propagation Neural Network (BPNN). In general, eABC-LSSVM produced more than 90% prediction accuracy, indicating that the proposed algorithm is capable of solving the optimization problem arising in the prediction task. The eABC-LSSVM is hoped to be useful to investors and commodity traders in planning their investments and projecting their profits.
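The five evaluation metrics named above have standard textbook definitions; the sketch below computes them for illustration (the function name and the exact "prediction accuracy" convention used by the authors are assumptions, not taken from the paper):

```python
import math

def prediction_metrics(y_true, y_pred):
    """Standard definitions of MAPE, sMAPE, RMSPE and Theil's U."""
    n = len(y_true)
    pct = [(t - p) / t for t, p in zip(y_true, y_pred)]      # percentage errors
    abs_err = [abs(t - p) for t, p in zip(y_true, y_pred)]
    mape = 100.0 * sum(abs(e) for e in pct) / n
    smape = 100.0 * sum(2.0 * a / (abs(t) + abs(p))
                        for a, t, p in zip(abs_err, y_true, y_pred)) / n
    rmspe = 100.0 * math.sqrt(sum(e * e for e in pct) / n)
    rmse = math.sqrt(sum(a * a for a in abs_err) / n)
    theils_u = rmse / (math.sqrt(sum(t * t for t in y_true) / n)
                       + math.sqrt(sum(p * p for p in y_pred) / n))
    return {"MAPE": mape, "sMAPE": smape, "RMSPE": rmspe, "Theil's U": theils_u}
```

Lower values of all four metrics indicate a better fit; Theil's U lies in [0, 1], with 0 meaning a perfect forecast.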
Artificial Neural Network Inference (ANNI): A Study on Gene-Gene Interaction for Biomarkers in Childhood Sarcomas
Objective: To model the potential interactions between previously identified biomarkers in childhood sarcomas using artificial neural network inference (ANNI).
Method: To concisely demonstrate the biological interactions between correlated genes in an interaction network map, only two types of sarcoma in the childhood small round blue cell tumors (SRBCT) dataset are discussed in this paper. A backpropagation neural network was used to model the potential interactions between genes. The prediction weights and signal directions were used to model the strengths of the interaction signals and the directions of the interaction links between genes. The ANN model was validated using Monte Carlo cross-validation to minimize the risk of over-fitting and to optimize the generalization ability of the model.
Results: Strong connection links on certain genes (TNNT1 and FNDC5 in rhabdomyosarcoma (RMS); FCGRT and OLFM1 in Ewing’s sarcoma (EWS)) suggested their potency as central hubs in the interconnection of genes with different functionalities. The results showed that the RMS patients in this dataset are likely to be congenital and at low risk of cardiomyopathy development. The EWS patients are likely to be complicated by EWS-FLI fusion and deficiency in various signaling pathways, including Wnt, Fas/Rho and intracellular oxygen.
Conclusions: The ANN network inference approach, together with the examination of the identified genes in the published literature within the context of the disease, highlights the substantial influence of certain genes in sarcomas.
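Monte Carlo cross-validation, as used in the Method above, means repeating a random train/test split many times and averaging the test score. A minimal sketch (the function names, split fraction and repeat count are illustrative assumptions, not the paper's settings):

```python
import random

def monte_carlo_cv(data, labels, train_fn, eval_fn,
                   n_splits=50, train_frac=0.7, seed=0):
    """Repeated random train/test splits; the mean held-out score
    estimates generalization ability and helps flag over-fitting."""
    rng = random.Random(seed)
    idx = list(range(len(data)))
    scores = []
    for _ in range(n_splits):
        rng.shuffle(idx)
        cut = int(train_frac * len(idx))
        train, test = idx[:cut], idx[cut:]
        model = train_fn([data[i] for i in train], [labels[i] for i in train])
        scores.append(eval_fn(model,
                              [data[i] for i in test],
                              [labels[i] for i in test]))
    return sum(scores) / len(scores)
```

Unlike k-fold cross-validation, the splits are independent, so the number of repetitions can be chosen freely regardless of dataset size.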
Applying the Upper Integral to the Biometric Score Fusion Problem in the Identification Model
This paper presents a new biometric score fusion approach for an identification system using the upper integral with respect to Sugeno's fuzzy measure. First, the proposed method considers each individual matcher as a fuzzy set in order to handle uncertainty and imperfection in matching scores. Then, the corresponding fuzzy entropy estimates the reliability of the information provided by each biometric matcher. Next, the fuzzy densities are generated based on rank information and training accuracy. Finally, the results are aggregated using the upper fuzzy integral. Experimental results compared with other fusion methods demonstrate the good performance of the proposed approach.
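To give a feel for fuzzy-integral score fusion, here is a simplified sketch of the classical Sugeno integral with an additive measure built from the per-matcher fuzzy densities. This is deliberately a simplification: the paper uses the upper integral with respect to Sugeno's lambda-measure, which is more involved, and the function name is an assumption.

```python
def sugeno_integral(scores, densities):
    """Sugeno fuzzy integral of matcher scores in [0, 1] with respect to
    an additive measure from normalized fuzzy densities (a simplified
    stand-in for the paper's upper integral w.r.t. Sugeno's measure)."""
    total = sum(densities)
    dens = [d / total for d in densities]
    # sort matchers by score, highest first
    pairs = sorted(zip(scores, dens), key=lambda p: p[0], reverse=True)
    g, best = 0.0, 0.0
    for h, d in pairs:
        g += d                      # measure of the top-i matcher set
        best = max(best, min(h, g))
    return best
```

The integral balances how high the scores are against how much "weight" (measure) the agreeing matchers carry, which is what makes it robust to a single unreliable matcher.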
Water filtration by using apple and banana peels as activated carbon
A water filter is an important device for reducing the contaminants in raw water. Activated carbon made from charcoal is used to adsorb the contaminants, and fruit peels are a suitable alternative carbon source to substitute for charcoal. The main goal of this study is to determine the role of fruit peels, namely apple and banana peel powder, as activated carbon in a water filter. The peels were dried and blended into powder so that they could adsorb the contaminants, and the readings for raw water before and after filtering were compared. After filtering the raw water, the pH reading was 6.8, which is within the normal range, and the turbidity recorded was 658 NTU. As for the colour, the water became clearer compared to the raw water. This study found that fruit peels such as banana and apple are an effective natural adsorbent substitute for charcoal.
A novel approach to data mining using simplified swarm optimization
Data mining has become an increasingly important approach to deal with the rapid
growth of data collected and stored in databases. In data mining, data classification
and feature selection are considered the two main drivers of decision making. However, existing traditional data classification and feature
selection techniques used in data management are no longer sufficient for such massive
technique based on stochastic population-based optimization that could discover
useful information from data.
In this thesis, a novel Simplified Swarm Optimization (SSO) algorithm is proposed as
a rule-based classifier and for feature selection. SSO is a simplification of Particle Swarm Optimization (PSO) with a self-organising ability suited to highly distributed control problem spaces; it is flexible, robust and cost-effective in complex computing environments. The proposed SSO classifier has been implemented to
classify audio data. To the author’s knowledge, this is the first time that SSO and PSO
have been applied for audio classification.
Furthermore, two local search strategies, named Exchange Local Search (ELS) and
Weighted Local Search (WLS), have been proposed to improve SSO performance.
SSO-ELS has been implemented to classify the 13 benchmark datasets obtained from
the UCI repository database. Meanwhile, SSO-WLS has been implemented in
Anomaly-based Network Intrusion Detection System (A-NIDS). In A-NIDS, a novel
hybrid SSO-based Rough Set (SSORS) for feature selection has also been proposed.
The empirical analysis showed promising results with high classification accuracy
rate achieved by all proposed techniques over audio data, UCI data and KDDCup 99
datasets. Therefore, the proposed SSO rule-based classifier with local search strategies offers a paradigm shift in solving complex problems in data mining that other benchmark classifiers may not be able to solve.
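The core SSO update replaces PSO's velocity formula with a simple per-variable choice: keep the current value, copy the personal best, copy the global best, or re-randomize, according to three thresholds. The sketch below follows that Yeh-style scheme on a continuous test function; the function name, parameter values and use of a global (rather than rule-based) formulation are illustrative assumptions, not the thesis's implementation.

```python
import random

def sso_minimize(f, dim, bounds, n_particles=20, iters=200,
                 cw=0.1, cp=0.4, cg=0.9, seed=1):
    """Simplified Swarm Optimization: each variable is kept, copied from
    the particle's best, copied from the global best, or re-randomized,
    according to thresholds cw < cp < cg."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    P = [x[:] for x in X]                      # personal bests
    pf = [f(x) for x in X]
    gi = min(range(n_particles), key=lambda i: pf[i])
    G, gf = P[gi][:], pf[gi]                   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r = rng.random()
                if r < cw:
                    pass                       # keep current value
                elif r < cp:
                    X[i][d] = P[i][d]          # copy personal best
                elif r < cg:
                    X[i][d] = G[d]             # copy global best
                else:
                    X[i][d] = rng.uniform(lo, hi)   # re-randomize
            fx = f(X[i])
            if fx < pf[i]:
                P[i], pf[i] = X[i][:], fx
                if fx < gf:
                    G, gf = X[i][:], fx
    return G, gf
```

The absence of velocities and inertia terms is what makes SSO cheaper per iteration than PSO, at the cost of relying on the re-randomization step for exploration.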
How Fast Can We Play Tetris Greedily With Rectangular Pieces?
Consider a variant of Tetris played on a board of width and infinite
height, where the pieces are axis-aligned rectangles of arbitrary integer
dimensions, the pieces can only be moved before letting them drop, and a row
does not disappear once it is full. Suppose we want to follow a greedy
strategy: let each rectangle fall where it will end up the lowest given the
current state of the board. To do so, we want a data structure which can always
suggest a greedy move. In other words, we want a data structure which maintains
a set of rectangles, supports queries which return where to drop the
rectangle, and updates which insert a rectangle dropped at a certain position
and return the height of the highest point in the updated set of rectangles. We
show via a reduction to the Multiphase problem [Pătraşcu, 2010] that on
a board of width , if the OMv conjecture [Henzinger et al., 2015]
is true, then both operations cannot be supported in time
simultaneously. The reduction also implies polynomial bounds from the 3-SUM
conjecture and the APSP conjecture. On the other hand, we show that there is a
data structure supporting both operations in time on
boards of width , matching the lower bound up to a factor.
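The two operations the abstract's data structure must support can be sketched with a naive baseline: keep one height per column, scan all windows to answer a query, and flatten the covered columns on an update. Every name here is an assumption for illustration; the paper's point is precisely that this O(w)-per-operation approach can be beaten, and that sub-polynomial time for both operations is conditionally impossible.

```python
class TetrisBoard:
    """Naive baseline: heights[j] is the current height of column j."""

    def __init__(self, width):
        self.heights = [0] * width

    def query(self, a):
        """Leftmost x where an a-wide rectangle would rest lowest."""
        best_x, best_h = 0, None
        for x in range(len(self.heights) - a + 1):
            h = max(self.heights[x:x + a])   # resting height at this window
            if best_h is None or h < best_h:
                best_x, best_h = x, h
        return best_x

    def update(self, x, a, b):
        """Drop an a x b rectangle at column x; return the new max height."""
        top = max(self.heights[x:x + a]) + b
        for j in range(x, x + a):
            self.heights[j] = top
        return max(self.heights)
```

A greedy move is then simply `board.update(board.query(a), a, b)` for each arriving rectangle.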
A novel statistical cerebrovascular segmentation algorithm with particle swarm optimization
We present an automatic statistical intensity-based approach to extract the 3D cerebrovascular structure from time-of-flight (TOF) magnetic resonance angiography (MRA) data. We use a finite mixture model (FMM) to fit the intensity histogram of the brain image sequence, where the cerebral vascular structure is modeled by a Gaussian distribution function and the other, low-intensity tissues are modeled by Gaussian and Rayleigh distribution functions. To estimate the parameters of the FMM, we propose an improved particle swarm optimization (PSO) algorithm, which adds a disturbance term to the speed-updating formula of PSO to ensure its convergence. We also use a ring-shaped topology for the particles' neighborhood to improve the performance of the algorithm. Computational results on 34 test datasets show that the proposed method provides accurate segmentation, especially for blood vessels of small size.
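The two PSO modifications mentioned above, a ring neighborhood and a disturbance term in the velocity update, can be sketched as follows. The disturbance implemented here is a small uniform random term and is only an illustrative stand-in for the paper's specific formula; function and parameter names are assumptions.

```python
import random

def ring_pso(f, dim, bounds, n=15, iters=300, w=0.7, c1=1.5, c2=1.5,
             disturb=0.01, seed=2):
    """lbest PSO: each particle learns from the best of itself and its
    two ring neighbors, with a small random disturbance in the velocity."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                  # personal bests
    pf = [f(x) for x in X]
    for _ in range(iters):
        for i in range(n):
            # best among the particle and its ring neighbors
            nbrs = [(i - 1) % n, i, (i + 1) % n]
            b = min(nbrs, key=lambda k: pf[k])
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (P[b][d] - X[i][d])
                           + disturb * rng.uniform(-1.0, 1.0))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            fx = f(X[i])
            if fx < pf[i]:
                P[i], pf[i] = X[i][:], fx
    j = min(range(n), key=lambda k: pf[k])
    return P[j], pf[j]
```

In FMM fitting, `f` would measure the discrepancy between the mixture density and the observed intensity histogram, and the particle would encode the Gaussian/Rayleigh parameters.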
Mal-Netminer: Malware Classification Approach based on Social Network Analysis of System Call Graph
As the security landscape evolves over time, where thousands of species of
malicious codes are seen every day, antivirus vendors strive to detect and
classify malware families for efficient and effective responses against malware
campaigns. To enrich this effort, and by capitalizing on ideas from the social
network analysis domain, we build a tool that can help classify malware
families using features derived from the graph structure of their system calls.
To achieve that, we first construct a system call graph that consists of system
calls found in the execution of the individual malware families. To explore
distinguishing features of various malware species, we study social network
properties as applied to the call graph, including the degree distribution,
degree centrality, average distance, clustering coefficient, network density,
and component ratio. We utilize features derived from those properties to build a classifier for malware families. Our experimental results show that influence-based graph metrics such as the degree centrality are effective for classifying malware, whereas the general structural metrics of malware are less effective. Our experiments demonstrate that the proposed system performs well in detecting and classifying malware families within each malware class, with accuracy greater than 96%.
Comment: Mathematical Problems in Engineering, Vol 201
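Several of the graph properties listed above have simple closed forms on an undirected graph. The sketch below computes three of them directly from an adjacency map; the function name and representation are assumptions for illustration, not the tool's code.

```python
def graph_metrics(adj):
    """Density, degree centrality and average clustering coefficient for
    an undirected graph given as {node: set_of_neighbors}."""
    n = len(adj)
    e = sum(len(v) for v in adj.values()) // 2        # each edge counted twice
    density = 2 * e / (n * (n - 1)) if n > 1 else 0.0
    degree_centrality = {u: len(adj[u]) / (n - 1) for u in adj}

    def clustering(u):
        # fraction of the node's neighbor pairs that are themselves linked
        nb = list(adj[u])
        k = len(nb)
        if k < 2:
            return 0.0
        links = sum(1 for i in range(k) for j in range(i + 1, k)
                    if nb[j] in adj[nb[i]])
        return 2 * links / (k * (k - 1))

    avg_clustering = sum(clustering(u) for u in adj) / n
    return {"density": density,
            "degree_centrality": degree_centrality,
            "avg_clustering": avg_clustering}
```

On a system call graph, these per-graph numbers become one feature vector per malware sample, which is what the classifier consumes.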
Intelligent System for Vehicles Number Plate Detection and Recognition Using Convolutional Neural Networks
Vehicles on the road are rising in extensive numbers, particularly in proportion to the industrial revolution and the growing economy. The heavy use of vehicles has increased the probability of traffic rule violations, causing unexpected accidents and triggering traffic crimes. To overcome these problems, an intelligent traffic monitoring system is required. Such a system can play a vital role in traffic control through number plate detection of vehicles. In this research work, a system is developed for detecting and recognizing vehicle number plates using a convolutional neural network (CNN), a deep learning technique. The system comprises two parts: number plate detection and number plate recognition. In the detection part, a vehicle's image is captured through a digital camera, and the system segments the number plate region from the image frame. After extracting the number plate region, a super-resolution method is applied to convert the low-resolution image into a high-resolution image. The super-resolution technique is used with the convolutional layer of the CNN to reconstruct the pixel quality of the input image. Each character of the number plate is then segmented using a bounding box method. In the recognition part, features are extracted and classified using the CNN technique. The novelty of this research is the development of an intelligent system employing a CNN to recognize number plates that have low resolution and are written in the Bengali language.
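A common simple way to obtain per-character bounding ranges from a binarized plate image, offered here as a hedged illustration rather than the paper's exact bounding-box method, is vertical projection: group contiguous columns that contain foreground pixels.

```python
def segment_characters(binary):
    """Return (start_col, end_col) ranges of contiguous inked column runs
    in a binary image given as a list of rows of 0/1 pixels."""
    width = len(binary[0])
    has_ink = [any(row[x] for row in binary) for x in range(width)]
    boxes, start = [], None
    for x, ink in enumerate(has_ink):
        if ink and start is None:
            start = x                       # a character run begins
        elif not ink and start is not None:
            boxes.append((start, x - 1))    # the run just ended
            start = None
    if start is not None:
        boxes.append((start, width - 1))    # run reaches the right edge
    return boxes
```

Each returned column range, combined with the row extent of its ink, yields one character crop to feed to the CNN classifier.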
Working and Limitations of Cable Stiffening in Flexible Link Manipulators
Rigid link manipulators (RLMs) are used in industry to move and manipulate objects in their workspaces. Flexible link manipulators (FLMs), which are much lighter and hence highly flexible compared to RLMs, have been proposed in the past as a means to reduce energy consumption and increase the speed of operation. Unlike an RLM, an FLM has infinite degrees of freedom actuated by a finite number of actuators. Because the high flexibility affects the precision of operation, special control algorithms are required to make FLMs usable. Recently, a method to stiffen FLMs using cables, without adding significant inertia or adversely affecting the advantages of FLMs, was proposed as a possible solution in preliminary work by the authors. An FLM stiffened using cables can use existing control algorithms designed for RLMs. In this paper we discuss in detail the working principle and limitations of cable stiffening for flexible link manipulators through simulations and experiments. A systematic way of deciding the locations of the cable attachments on the FLM is also presented. The main result of this paper is to show the advantage of adding a second pair of cables in reducing overall link deflections.
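As background for why stiffening reduces deflection, the classical Euler-Bernoulli result for a cantilever under an end load is delta = F L^3 / (3 E I). The paper's cable-stiffened links are not plain cantilevers, so this is only an illustrative baseline, but it shows how raising the effective flexural rigidity E*I cuts deflection proportionally.

```python
def cantilever_tip_deflection(force, length, e_modulus, second_moment):
    """Euler-Bernoulli tip deflection of a cantilever under an end load:
    delta = F * L**3 / (3 * E * I)."""
    return force * length ** 3 / (3.0 * e_modulus * second_moment)
```

Doubling the effective rigidity (E*I), which is roughly what adding stiffening elements aims at, halves the tip deflection for the same load.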