40 research outputs found

    Evolutionary Algorithms for Query Optimization in Distributed Database Systems: A Review

    Evolutionary Algorithms are bio-inspired optimization approaches that exploit principles of biological evolution, such as natural selection and genetic inheritance. This review surveys the application of evolutionary and swarm-intelligence-based query optimization strategies in Distributed Database Systems. Query optimization in a distributed environment is a challenging, computationally hard problem, and although several techniques exist and are in use for it, evolutionary approaches are promising. The intention of this research is to examine how bio-inspired computational algorithms are applied to query optimization in a distributed database environment. The paper describes the working of these algorithms in distributed database query optimization, including genetic algorithms, the ant colony algorithm, particle swarm optimization, and memetic algorithms.
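    To make the surveyed idea concrete, the sketch below shows one plausible genetic-algorithm encoding of the distributed join-ordering problem: candidate plans are permutations of relations, evolved with order crossover and swap mutation. The cost model, relation sizes, and parameters are illustrative assumptions, not taken from the review.

```python
# Hypothetical sketch: a genetic algorithm for join-order selection.
# The cost model and relation sizes below are stand-ins, not from the review.
import random

RELATION_SIZES = [1000, 50, 400, 8000, 120]  # assumed relation cardinalities

def plan_cost(order):
    """Toy cost model: cumulative size of intermediate join results."""
    size, cost = RELATION_SIZES[order[0]], 0
    for r in order[1:]:
        size = max(1, size * RELATION_SIZES[r] // 1000)  # crude join estimate
        cost += size
    return cost

def order_crossover(a, b):
    """OX crossover: keep a slice of parent a, fill the rest in b's order."""
    i, j = sorted(random.sample(range(len(a)), 2))
    child = [None] * len(a)
    child[i:j] = a[i:j]
    fill = [g for g in b if g not in child]
    for k in range(len(a)):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def evolve(pop_size=30, generations=100, mutation_rate=0.2):
    n = len(RELATION_SIZES)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=plan_cost)
        survivors = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            child = order_crossover(a, b)
            if random.random() < mutation_rate:     # swap mutation
                i, j = random.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=plan_cost)

best = evolve()
print("best join order:", best, "cost:", plan_cost(best))
```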

    2014 Summer Research Symposium Abstract Book

    The 2014 summer volume of abstracts for science research projects conducted by students at Trinity College.

    A clustering based transfer function for volume rendering using gray-gradient mode histogram

    Volume rendering is an emerging technique widely used in the medical field to visualize human organs from tomography image slices. In volume rendering, sliced medical images are mapped to attributes, such as color and opacity, through a transfer function. Thus, the design of the transfer function directly affects the result of medical image visualization: a well-designed transfer function can improve both image quality and visualization speed. In one of our previous papers, we designed a multi-dimensional transfer function based on region growth to determine the transparency of a voxel, where both a gray threshold and a gray-change threshold are used to calculate the transparency. In this paper, a new approach to the transfer function is proposed based on clustering analysis of the gray-gradient mode histogram, where volume data is represented in a two-dimensional histogram. Clustering analysis is carried out on the spatial information of the volume data in the histogram, and the transfer function is generated automatically from the clustering results. A human thoracic dataset is used in our experiment to evaluate the performance of volume rendering with the proposed transfer function. By comparison with the original transfer functions implemented in two popular volume rendering systems, the Visualization Toolkit (VTK) and RadiAnt DICOM Viewer, the effectiveness and performance of the proposed transfer function are demonstrated in terms of rendering efficiency and image quality, where more accurate and clearer features are presented rather than a blurred area. Furthermore, the complex operations on the two-dimensional histogram are avoided in our approach, and more detailed information can be seen in the final visualized image.
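    A minimal sketch of the general approach, assuming k-means over the occupied bins of the gray-gradient histogram (the paper's own clustering method and opacity assignment are not specified here); the opacity rule and the random stand-in volume are illustrative only.

```python
# Illustrative sketch (not the paper's exact method): cluster a 2D
# gray-gradient histogram and derive a per-cluster opacity mapping.
import numpy as np
from sklearn.cluster import KMeans

def gray_gradient_histogram(volume, bins=64):
    """Build a 2D histogram over (gray value, gradient magnitude)."""
    gx, gy, gz = np.gradient(volume.astype(float))
    grad = np.sqrt(gx**2 + gy**2 + gz**2)
    hist, g_edges, m_edges = np.histogram2d(
        volume.ravel(), grad.ravel(), bins=bins)
    return hist, g_edges, m_edges

def transfer_function_from_clusters(hist, n_clusters=4):
    """Cluster occupied histogram bins on their spatial coordinates and
    assign each cluster a distinct opacity (placeholder scheme)."""
    coords = np.argwhere(hist > 0)                # occupied (gray, grad) bins
    weights = hist[hist > 0]
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(
        coords, sample_weight=weights)
    opacity = np.zeros_like(hist)
    # Hypothetical rule: sparser clusters (fine structures) get higher opacity.
    mass = np.bincount(km.labels_, weights=weights, minlength=n_clusters)
    rank = np.argsort(mass)[::-1]                 # largest mass first
    alpha = {int(c): (i + 1) / n_clusters for i, c in enumerate(rank)}
    for (gi, mi), lab in zip(coords, km.labels_):
        opacity[gi, mi] = alpha[int(lab)]
    return opacity

volume = np.random.rand(32, 32, 32)               # stand-in for CT slices
hist, _, _ = gray_gradient_histogram(volume)
tf = transfer_function_from_clusters(hist)
print("opacity table shape:", tf.shape)
```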

    Can Deep Learning Improve Technical Analysis of Forex Data to Predict Future Price Movements?

    The foreign exchange market (Forex) is the world's largest market for trading foreign currency, with a trading volume of over 5.1 trillion dollars per day. It is known to be very complicated and volatile. Technical analysis is the observation of past market movements with the aim of predicting future prices and dealing with the effects of market movements. A trading system is based on technical indicators derived from technical analysis. In our work, a complete trading system built from a combination of trading rules on Forex time series data is developed and made available to the scientific community. The system is implemented in two phases: in the first phase, each trading rule, both the AI-based rule and the rules derived from technical indicators, is tested for selection on training data; in the second phase, profitable rules are selected from among the qualified rules and combined. The proposed trading system was extensively trained and tested on historical data from 2010 to 2021. To determine the effectiveness of the proposed method, we also conducted experiments with the datasets and methodologies used in recent work by Hernandez-Aguila et al., 2021 and by Munkhdalai et al., 2019. Our method outperforms all other methodologies for almost all Forex markets, with an average percentage gain of 20.2%. A particular focus was on training our AI-based rule with two different architectures: the first is a convolutional network widely used for image classification, ResNet50; the second is an attention-based network, the Vision Transformer (ViT). The results provide a clear answer to the main question that guided our research and that is the title of this paper.
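    The two-phase scheme described above can be sketched as follows, with invented indicator rules and a toy backtest standing in for the paper's rule set: phase 1 screens each rule for profitability on training data, and phase 2 combines the survivors by majority vote.

```python
# Hedged sketch of a two-phase rule pipeline: phase 1 screens each rule
# on training data; phase 2 combines the qualifying rules by majority
# vote. The rules, data, and thresholds are invented for illustration.
import random

def sma_rule(prices, window=10):
    """+1 if price is above its moving average, else -1 (toy rule)."""
    if len(prices) < window:
        return 0
    sma = sum(prices[-window:]) / window
    return 1 if prices[-1] > sma else -1

def momentum_rule(prices, lag=5):
    if len(prices) <= lag:
        return 0
    return 1 if prices[-1] > prices[-1 - lag] else -1

def backtest(rule, prices):
    """Phase 1: cumulative return of following the rule's signals."""
    pnl = 0.0
    for t in range(1, len(prices) - 1):
        signal = rule(prices[: t + 1])
        pnl += signal * (prices[t + 1] - prices[t])
    return pnl

def combined_signal(rules, prices):
    """Phase 2: majority vote over the selected profitable rules."""
    votes = sum(rule(prices) for rule in rules)
    return 1 if votes > 0 else -1 if votes < 0 else 0

random.seed(0)
train = [1.30 + random.gauss(0, 0.01) for _ in range(300)]  # toy EUR/USD
candidates = [sma_rule, momentum_rule]
selected = [r for r in candidates if backtest(r, train) > 0]  # screening
print("selected rules:", [r.__name__ for r in selected])
if selected:
    print("today's vote:", combined_signal(selected, train))
```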

    Mutable composite firefly algorithm for gene selection in microarray based cancer classification

    Cancer classification is critical due to the strenuous effort required in cancer treatment and the rising cancer mortality rate. Recent trends in high-throughput technologies have led to discoveries of biomarkers that successfully contributed to cancer-related issues. Computational approaches to gene selection based on microarray data analysis have been applied in many cancer classification problems. However, existing hybrid approaches that use metaheuristic optimization algorithms for feature selection (specifically gene selection) are not generalized enough to classify most cancer microarray data efficiently while maintaining a small set of genes, which leads to problems of classification accuracy and gene subset size. Hence, this study proposes modifying the Firefly Algorithm (FA), combined with the Correlation-based Feature Selection (CFS) filter, for the gene selection task. An improved FA is proposed to overcome FA's slow convergence by generating mutable-size solutions for the firefly population, and a composite position update strategy is designed for these mutable-size solutions. The proposed strategy balances FA exploration and exploitation in order to address the local optima problem. The proposed hybrid algorithm, known as the CFS-Mutable Composite Firefly Algorithm (CFS-MCFA), was evaluated on cancer microarray data for biomarker selection, with a Support Vector Machine (SVM) deployed as the classifier. Evaluation was performed on two metrics: classification accuracy and size of the feature set. The results show that the CFS-MCFA-SVM algorithm outperforms benchmark methods in terms of classification accuracy and gene subset size. In particular, 100 percent accuracy was achieved on all four datasets with only a few biomarkers (between one and four). This result indicates that the proposed algorithm is a competitive alternative in feature selection, which later contributes to the analysis of microarray data.
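    For orientation, here is a sketch of the standard firefly update applied to a thresholded binary feature mask; it does not reproduce the paper's mutable-size solutions or composite update strategy, and its toy fitness stands in for the CFS score and SVM accuracy.

```python
# Sketch of the standard firefly update on a binary feature mask, not
# the paper's mutable-size composite variant. Fitness is a toy proxy;
# a real run would use CFS scores and an SVM wrapper.
import numpy as np

rng = np.random.default_rng(42)
N_FEATURES, N_FIREFLIES, ITERS = 20, 12, 50
BETA0, GAMMA, ALPHA = 1.0, 1.0, 0.2

def fitness(mask):
    """Toy objective: reward 'informative' low-index genes, penalize size."""
    informative = mask[:5].sum()             # pretend the first 5 genes matter
    return informative - 0.1 * mask.sum()

pos = rng.random((N_FIREFLIES, N_FEATURES))  # continuous firefly positions
for _ in range(ITERS):
    masks = pos > 0.5                        # threshold to a feature subset
    light = np.array([fitness(m) for m in masks])
    for i in range(N_FIREFLIES):
        for j in range(N_FIREFLIES):
            if light[j] > light[i]:          # move firefly i toward brighter j
                r2 = np.sum((pos[i] - pos[j]) ** 2)
                beta = BETA0 * np.exp(-GAMMA * r2)
                pos[i] += beta * (pos[j] - pos[i]) \
                          + ALPHA * (rng.random(N_FEATURES) - 0.5)
    pos = np.clip(pos, 0.0, 1.0)

best = max(pos > 0.5, key=fitness)
print("selected genes:", np.flatnonzero(best))
```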

    Location-aware deep learning-based framework for optimizing cloud consumer quality of service-based service composition

    The expanding propensity of organization users to utilize cloud services urges providers to deliver services in a service pool with a variety of functional and non-functional attributes. Brokers of cloud services face intense rivalry, competing with one another to provide quality of service (QoS) enhancements. Such rivalry makes it troublesome to provide composite services on the cloud with a simple service selection and composition approach. Cloud composition is therefore considered a non-deterministic polynomial-time hard (NP-hard) and economically motivated problem, so developing a reliable economic model for composition is of tremendous interest and importance for the cloud consumer. This paper proposes a location-aware deep learning framework for improving QoS-based service composition for cloud consumers. The proposed framework first reduces the dimensions of the data. Second, it applies a combination of a deep learning long short-term memory (LSTM) network and the particle swarm optimization (PSO) algorithm, additionally considering the location parameter, to correctly forecast the provisioned QoS values. Finally, it composes the ideal services needed to reduce the customer cost function. The framework's performance has been demonstrated using a real dataset, proving that it is superior to current models in terms of prediction and composition accuracy.
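    As a hedged illustration of the optimization side of such a framework, the sketch below uses particle swarm optimization to tune two hypothetical LSTM hyperparameters; evaluate() is a stand-in for training the network on QoS history and returning validation error, and the real framework's architecture and location handling are not reproduced.

```python
# Hypothetical sketch: PSO tuning two LSTM hyperparameters (hidden
# units, learning-rate exponent). evaluate() stands in for training the
# LSTM on QoS history and returning validation error.
import numpy as np

rng = np.random.default_rng(7)

def evaluate(params):
    """Stand-in objective: pretend validation error is minimized at
    64 hidden units and lr = 1e-3 (exponent -3)."""
    units, lr_exp = params
    return (units - 64) ** 2 / 1000 + (lr_exp + 3) ** 2

N, ITERS, W, C1, C2 = 15, 60, 0.7, 1.5, 1.5
lo, hi = np.array([8, -5.0]), np.array([256, -1.0])   # search bounds
x = lo + rng.random((N, 2)) * (hi - lo)               # particle positions
v = np.zeros_like(x)                                  # particle velocities
pbest, pbest_val = x.copy(), np.array([evaluate(p) for p in x])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(ITERS):
    r1, r2 = rng.random((N, 2)), rng.random((N, 2))
    v = W * v + C1 * r1 * (pbest - x) + C2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    vals = np.array([evaluate(p) for p in x])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best hyperparameters (units, log10 lr):", gbest)
```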

    A Multi-Perspective Evaluation of MA and GA for Collaborative Filtering Recommender System


    Blockchain for secured IoT and D2D applications over 5G cellular networks : a thesis by publications presented in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Computer and Electronics Engineering, Massey University, Albany, New Zealand

    Author's Declaration: "In accordance with Sensors, SpringerOpen, and IEEE's copyright policy, this thesis contains the accepted and published version of each manuscript as the final version. Consequently, the content is identical to the published versions."

    The Internet of Things (IoT) is in continuous development with ever-growing popularity. It brings significant benefits by enabling humans and the physical world to interact using various technologies, from small sensors to cloud computing. IoT devices and networks are appealing targets of various cyber attacks and can be hampered by malicious intervening attackers if not appropriately protected. IoT security and privacy remain a major challenge due to characteristics of the IoT such as heterogeneity, scalability, the nature of the data, and operation in open environments. Moreover, many existing cloud-based solutions for IoT security rely on central remote servers over vulnerable Internet connections. The decentralized and distributed nature of blockchain technology has attracted significant attention as a suitable solution to tackle the security and privacy concerns of the IoT and device-to-device (D2D) communication. This thesis explores the possible adoption of blockchain technology to address the security and privacy challenges of the IoT under the 5G cellular system, and makes four novel contributions.

    First, a Multi-layer Blockchain Security (MBS) model is proposed to protect IoT networks while simplifying the implementation of blockchain technology. The concept of clustering is utilized to facilitate multi-layer architecture deployment and increase scalability. K-unknown clusters are formed within the IoT network by applying a hybrid evolutionary computation algorithm combining Simulated Annealing (SA) and Genetic Algorithms (GA) to structure the overlay nodes (a rough sketch of this hybrid appears after this abstract). The open-source Hyperledger Fabric (HLF) blockchain platform is deployed for the development of the proposed model, and base stations adopt a global blockchain approach to communicate with each other securely. Quantitative results demonstrate that the proposed clustering algorithm performs well compared to earlier reported methods, and that the proposed lightweight blockchain model is better suited to balancing network latency and throughput than a traditional global blockchain.

    Next, a model is proposed to integrate IoT systems and blockchain by implementing the permissioned Hyperledger Fabric blockchain. The security of the edge computing devices is provided by a local authentication process, and a lightweight mutual authentication and authorization solution is proposed to secure tiny IoT devices within the ecosystem. In addition, the proposed model provides traceability for the data generated by the IoT devices. The model is validated through practical implementation, measuring performance metrics such as transaction throughput and latency, resource consumption, and network usage. The results indicate that the proposed platform with the HLF implementation is promising for securing resource-constrained IoT devices and is scalable for deployment in various IoT scenarios.

    Despite the increasing development of blockchain platforms, there is still no comprehensive method for adopting blockchain technology in IoT systems, owing to the blockchain's limited capability to process substantial transaction requests from a massive number of IoT devices. Fabric comprises various components such as smart contracts, peers, endorsers, validators, committers, and orderers. A comprehensive empirical model is therefore proposed that measures HLF's performance and identifies potential performance bottlenecks, to better meet the requirements of blockchain-based IoT applications. The implementation of HLF on distributed large-scale IoT systems is proposed, and its performance is evaluated in terms of throughput, latency, network size, scalability, and the number of peers serviceable by the platform. The experimental results demonstrate that the proposed framework can provide a detailed and real-time performance evaluation of blockchain systems for large-scale IoT applications.

    The diversity and sheer increase in the number of connected IoT devices have brought significant concerns about storing and protecting the large volume of IoT data. Dependence on centralized server solutions imposes significant trust issues and makes them vulnerable to security risks. A layer-based distributed data storage design and implementation of a blockchain-enabled large-scale IoT system is proposed to mitigate these challenges, using the HLF platform for the distributed ledger solution. The need for a centralized server and a third-party auditor is eliminated by leveraging HLF peers, which perform transaction verification and record audits in a big data system with the help of blockchain technology. The HLF blockchain stores lightweight verification tags on the blockchain ledger, while the actual metadata is stored in the off-chain big data system to reduce communication overheads and enhance data integrity. Finally, experiments are conducted to evaluate the performance of the proposed scheme in terms of throughput, latency, and communication and computation costs. The results indicate the feasibility of the proposed solution for retrieving and storing the provenance of large-scale IoT data within the big data ecosystem using the HLF blockchain.
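    As a rough illustration of the hybrid SA/GA clustering named in the first contribution, the sketch below evolves cluster assignments for overlay nodes and accepts offspring with a simulated-annealing criterion; the node coordinates, number of clusters, and cost function are invented for the example.

```python
# Hedged sketch of the hybrid idea: a GA over cluster assignments whose
# offspring are accepted with a simulated-annealing criterion. Node
# coordinates, k, and the cost function are invented placeholders.
import math, random

random.seed(1)
NODES = [(random.random(), random.random()) for _ in range(40)]
K = 4                                             # assumed cluster count

def cost(assign):
    """Sum of squared distances to cluster centroids (lower is better)."""
    total = 0.0
    for c in range(K):
        members = [NODES[i] for i, a in enumerate(assign) if a == c]
        if not members:
            continue
        cx = sum(p[0] for p in members) / len(members)
        cy = sum(p[1] for p in members) / len(members)
        total += sum((p[0] - cx) ** 2 + (p[1] - cy) ** 2 for p in members)
    return total

def crossover(a, b):
    cut = random.randrange(len(a))
    return a[:cut] + b[cut:]

pop = [[random.randrange(K) for _ in NODES] for _ in range(20)]
temp = 1.0
for gen in range(200):
    pop.sort(key=cost)
    parents = pop[:10]
    children = []
    for _ in range(10):
        child = crossover(*random.sample(parents, 2))
        i = random.randrange(len(child))          # point mutation
        child[i] = random.randrange(K)
        # SA-style acceptance: worse children survive with prob e^(-delta/T)
        delta = cost(child) - cost(parents[0])
        if delta < 0 or random.random() < math.exp(-delta / max(temp, 1e-9)):
            children.append(child)
        else:
            children.append(parents[0][:])
    pop = parents + children
    temp *= 0.98                                  # cooling schedule

print("best clustering cost:", cost(min(pop, key=cost)))
```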

    Multiple Relevant Feature Ensemble Selection Based on Multilayer Co-Evolutionary Consensus MapReduce

    Although feature selection for large data has been intensively investigated in data mining, machine learning, and pattern recognition, the challenge is not just to invent new algorithms that handle noisy and uncertain large data in applications, but to link multiple relevant feature sources, structured or unstructured, into an effective feature reduction method. In this paper, we propose a multiple relevant feature ensemble selection (MRFES) algorithm based on multilayer co-evolutionary consensus MapReduce (MCCM). We construct an effective MCCM model to handle feature ensemble selection on large-scale datasets with multiple relevant feature sources, and explore the unified consistency aggregation between the local solutions and global dominance solutions achieved by the co-evolutionary memeplexes that participate in the cooperative feature ensemble selection process. The model attempts to reach a mutual decision agreement among the co-evolutionary memeplexes, which calls for mechanisms to detect noncooperative co-evolutionary behaviors and achieve better Nash equilibrium resolutions. Extensive comparative experiments on well-known benchmark datasets substantiate the effectiveness of MRFES for large-scale dataset problems with complex noise and multiple relevant feature sources. The algorithm greatly facilitates the selection of relevant feature subsets from the original feature space with better accuracy, efficiency, and interpretability. Moreover, we apply MRFES to classification prediction based on the human cerebral cortex. Such applications are expected to significantly scale up classification prediction for large-scale and complex brain data in terms of efficiency and feasibility.
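    The consensus idea, taken in isolation, can be sketched as a map step in which several local selectors ("memeplexes") each rank features on their own data partition, followed by a reduce step that keeps features a majority agree on; the correlation-based scoring below is a simple stand-in, not the paper's co-evolutionary MapReduce model.

```python
# Hedged sketch of consensus feature selection: local selectors rank
# features per partition (map), then features chosen by a majority of
# partitions are kept (reduce). Scoring is a simple correlation proxy.
import numpy as np

rng = np.random.default_rng(3)
n, d = 600, 30
X = rng.standard_normal((n, d))
y = (X[:, 0] + 0.8 * X[:, 3] - 0.6 * X[:, 7] > 0).astype(float)

def local_select(Xp, yp, top_k=5):
    """Map step: rank features on one partition by |corr(feature, y)|."""
    scores = np.abs([np.corrcoef(Xp[:, j], yp)[0, 1] for j in range(d)])
    return set(np.argsort(scores)[-top_k:])

partitions = np.array_split(np.arange(n), 3)      # three 'memeplexes'
local_sets = [local_select(X[idx], y[idx]) for idx in partitions]

# Reduce step: consensus = features chosen by a majority of partitions.
votes = np.zeros(d)
for s in local_sets:
    for j in s:
        votes[j] += 1
consensus = np.flatnonzero(votes >= 2)
print("consensus features:", consensus)           # expect 0, 3, 7 to appear
```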