7 research outputs found

    Solving the Uncapacitated Single Allocation p-Hub Median Problem on GPU

    A parallel genetic algorithm (GA) implemented on GPU clusters is proposed to solve the Uncapacitated Single Allocation p-Hub Median problem. The GA uses binary and integer encodings and genetic operators adapted to this problem, and it is further improved by generating initial solutions with hubs located at middle nodes. The experimental results are compared with the best known solutions on all benchmark instances with up to 1000 nodes. Furthermore, we solve our own randomly generated instances with up to 6000 nodes. Our approach outperforms most well-known heuristics in terms of solution quality and execution time, and it allows hitherto unsolved problems to be solved.
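
    As a rough illustration of the middle-node initialization and the single-allocation cost structure mentioned above, the Python sketch below places the p hubs at the most central nodes of a distance matrix, allocates every node to its nearest hub, and evaluates the classical USApHMP objective. The names (dist, flow, alpha) and the discount value are illustrative assumptions; this is a minimal serial sketch, not the paper's GPU implementation.

        import numpy as np

        def middle_node_initial_solution(dist, p):
            """Pick the p most 'central' nodes as hubs and allocate every node
            to its nearest hub (single allocation). A simplified construction
            heuristic in the spirit of the middle-node initialization."""
            centrality = dist.sum(axis=1)                 # total distance to all other nodes
            hubs = np.argsort(centrality)[:p]             # p nodes closest to the "middle"
            allocation = hubs[np.argmin(dist[:, hubs], axis=1)]  # nearest-hub assignment
            return hubs, allocation

        def usaphmp_cost(dist, flow, allocation, alpha=0.75):
            """Collection + discounted inter-hub transfer + distribution cost,
            weighted by origin-destination flow."""
            n = dist.shape[0]
            cost = 0.0
            for i in range(n):
                for j in range(n):
                    k, m = allocation[i], allocation[j]
                    cost += flow[i, j] * (dist[i, k] + alpha * dist[k, m] + dist[m, j])
            return cost

        # toy example with random symmetric data
        rng = np.random.default_rng(0)
        pts = rng.random((20, 2))
        dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
        flow = rng.random((20, 20))
        hubs, alloc = middle_node_initial_solution(dist, 4)
        print(hubs, usaphmp_cost(dist, flow, alloc))

    In the paper's GA, such a constructed solution would seed the initial population before the binary/integer-encoded genetic operators take over.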

    A δ-constraint multi-objective optimization framework for operation planning of smart grids

    With the increasing penetration of renewable energies into the smart grid, satisfying load and avoiding energy shortage have become challenging issues for energy providers, owing to the intermittent nature of renewable energies. The need to reduce energy shortage and pollutant gas emissions has led to a large-scale development of energy management programs such as demand side management (DSM) to control the peak energy demand on a smart grid. The implementation of DSM programs makes the operation planning of smart grids more difficult by adding the objectives of the customer stakeholder. The operation plan should satisfy all the conflicting objectives of the stakeholders (i.e., minimization of the total cost, minimization of the GHG emissions, and maximization of customer satisfaction). In this paper, we present a novel multi-objective optimization framework for energy management in the smart grid that significantly reduces the peak load demand and reshapes the load profile. The proposed framework comprises four main components: (1) a forecasting model that predicts the 24-h-ahead energy load, (2) a load-shifting DSM program that reduces the energy load during peak demand, (3) a piecewise linear approximation method that linearizes the non-linear objective functions and constraints, and (4) a δ-constraint multi-objective optimization method that efficiently finds the Pareto frontier solutions. The capabilities of the proposed framework are demonstrated on a synthetic smart grid case study with 50 buildings. The results reveal that the proposed framework successfully meets the desired load curve while obtaining a significantly larger Pareto frontier set (with more non-dominated solutions) in less computational time.
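
    A minimal sketch of the δ-constraint idea behind component (4): one objective is minimized while the other is bounded above by a swept δ value, and each feasible solve contributes one candidate Pareto point. The two-variable linear model, its coefficients, and the use of scipy.optimize.linprog are illustrative assumptions, not the paper's smart-grid formulation.

        import numpy as np
        from scipy.optimize import linprog

        # toy two-objective linear model: f1 = cost, f2 = emissions, decision vars x >= 0
        cost = np.array([3.0, 1.0])          # f1 coefficients
        emis = np.array([1.0, 4.0])          # f2 coefficients
        A_ub = np.array([[-1.0, -1.0]])      # demand coverage: x1 + x2 >= 10
        b_ub = np.array([-10.0])

        def delta_constraint_frontier(deltas):
            """Minimize f1 subject to f2 <= delta for a sweep of delta values;
            each feasible solve yields one candidate Pareto point."""
            frontier = []
            for d in deltas:
                A = np.vstack([A_ub, emis])            # append the f2 <= delta constraint
                b = np.append(b_ub, d)
                res = linprog(cost, A_ub=A, b_ub=b, bounds=[(0, None)] * 2)
                if res.success:
                    frontier.append((res.fun, emis @ res.x))
            return frontier

        for f1, f2 in delta_constraint_frontier(np.linspace(10, 40, 7)):
            print(f"cost={f1:.2f}  emissions={f2:.2f}")

    The actual framework would apply this scheme to the piecewise-linearized cost, GHG-emission, and customer-satisfaction objectives rather than to this toy LP.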

    Microgrid Operational Planning Using Deviation Clustering Within a DDDAS Framework

    As climate change progresses and the global population continues to increase, meeting energy demand is an issue that has been brought to the forefront of the conversation. Microgrids (MGs) are groundbreaking tools that have risen in popularity to combat this crisis by capitalizing on renewable, distributed energy resources to efficiently satisfy the energy demand, using environmental-sensor data received via telemetry. In this work, we present a deviation clustering (DC) algorithm within a dynamic data-driven application systems (DDDAS) framework to reduce the length of the MG dispatch model's planning horizon while retaining the temporal characteristics of the initial load profile. The DDDAS framework allows the current dispatch decisions to be adjusted in near real time. We develop two modules embedded within this framework: the first is a proposed rule-based policy (RBP) that modifies the sensing strategy, and the second is the DC algorithm, which reduces the execution time of the MG simulation. Numerical analysis was conducted on the IEEE 18-bus test network to assess the performance of the proposed framework and determine an appropriate clustering threshold. The limitations of the presented framework were also determined by examining the tradeoff between the solver's solution time and the accuracy of the resulting solution. The results indicate a decrease in solution time within the desired accuracy limits when using the proposed approach as opposed to traditional load dispatch.
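
    A minimal sketch of one plausible reading of the deviation clustering step: consecutive time steps are merged while the load stays within a threshold of the running cluster mean, which shortens the planning horizon while preserving the temporal ordering of the profile. The function name, threshold value, and toy load profile are assumptions, not the paper's implementation.

        import numpy as np

        def deviation_cluster(load, threshold):
            """Group consecutive time steps whose load stays within `threshold`
            of the current cluster's running mean. Returns (representative load,
            duration in time steps) per cluster, preserving temporal order."""
            clusters = []
            start, total = 0, load[0]
            for t in range(1, len(load)):
                mean = total / (t - start)
                if abs(load[t] - mean) > threshold:      # deviation too large: close cluster
                    clusters.append((mean, t - start))
                    start, total = t, load[t]
                else:
                    total += load[t]
            clusters.append((total / (len(load) - start), len(load) - start))
            return clusters

        # hourly load profile for one day (illustrative numbers, kW)
        load = np.array([40, 41, 39, 55, 80, 82, 79, 60, 45, 44, 43, 42,
                         50, 75, 78, 80, 77, 55, 48, 46, 45, 44, 43, 42], dtype=float)
        for rep, dur in deviation_cluster(load, threshold=5.0):
            print(f"{dur:2d} h at ~{rep:.1f} kW")

    Feeding the reduced set of representative periods (with their durations as weights) to the dispatch model is what shortens the solver's planning horizon.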

    Dynamic Data Driven Application Systems for Identification of Biomarkers in DNA Methylation

    The term ‘epigenetic’ refers to all heritable alterations in gene function that occur without any change in the deoxyribonucleic acid (DNA) sequence. Epigenetic modifications play a crucial role in the development and differentiation of various diseases, including cancer. The specific epigenetic alteration that has garnered a great deal of attention is DNA methylation, i.e., the addition of a methyl group to cytosine. Recent studies have shown that different tumor types have distinct methylation profiles. Identifying idiosyncratic DNA methylation profiles of different tumor types and subtypes can provide invaluable insights for accurate diagnosis, early detection, and tailoring of the related cancer treatment. In this study, our goal is to identify the informative genes (biomarkers) whose methylation-level changes correlate with a specific cancer type or subtype. To achieve this goal, we propose a novel high-dimensional learning framework, inspired by the dynamic data driven application systems paradigm, to identify the biomarkers, determine the outlier(s), and improve the quality of the resulting disease detection. The proposed framework starts with a principal component analysis (PCA), followed by hierarchical clustering (HCL) of observations and determination of informative genes based on the HCL predictions. The capabilities and performance of the proposed framework are demonstrated using a lung cancer DNA methylation dataset stored in the Gene Expression Omnibus (GEO) DataSets. The preliminary results demonstrate that our framework outperforms conventional clustering algorithms with embedded dimension-reduction methods in identifying informative genes and outliers and removing their contaminating effects, at the expense of a reasonable computational cost.
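
    A minimal sketch of the PCA-followed-by-HCL stage on a synthetic stand-in for a methylation matrix, using scikit-learn and SciPy. The component count, Ward linkage, the mean-difference scoring of probes, and all data are illustrative assumptions rather than the study's actual pipeline.

        import numpy as np
        from sklearn.decomposition import PCA
        from scipy.cluster.hierarchy import linkage, fcluster

        def pca_hcl_pipeline(beta, n_components=10, n_groups=2):
            """PCA on the methylation (beta-value) matrix, then hierarchical
            clustering of samples in the reduced space; returns cluster labels."""
            scores = PCA(n_components=n_components).fit_transform(beta)
            Z = linkage(scores, method="ward")
            return fcluster(Z, t=n_groups, criterion="maxclust")

        def rank_informative_probes(beta, labels, top_k=20):
            """Score each probe/gene by the absolute difference of its mean
            methylation between the two predicted groups (illustrative criterion)."""
            g1, g2 = beta[labels == 1], beta[labels == 2]
            score = np.abs(g1.mean(axis=0) - g2.mean(axis=0))
            return np.argsort(score)[::-1][:top_k]

        # synthetic stand-in for a GEO methylation matrix: samples x probes
        rng = np.random.default_rng(1)
        beta = rng.random((60, 500))
        beta[:30, :25] += 0.4                    # inject a tumor-like signal in 25 probes
        labels = pca_hcl_pipeline(beta)
        print(rank_informative_probes(beta, labels))

    On the synthetic data, the top-ranked indices concentrate on the injected probes, which mirrors how the HCL predictions are used to flag informative genes.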