Solving the Uncapacitated Single Allocation p-Hub Median Problem on GPU
A parallel genetic algorithm (GA) implemented on GPU clusters is proposed to
solve the Uncapacitated Single Allocation p-Hub Median problem. The GA uses
binary and integer encodings and genetic operators adapted to this problem. Our
GA is further improved by generating an initial solution with hubs located at the most central (middle) nodes.
The experimental results are compared with the best-known solutions on all benchmark instances with up to 1000 nodes. Furthermore, we solve our own randomly generated instances with up to 6000 nodes. Our approach outperforms most well-known heuristics in terms of solution quality and execution time, and it allows hitherto unsolved problems to be solved.
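As a rough illustration of the construction heuristic mentioned above (hubs placed at the most central "middle" nodes, each node then allocated to a single nearest hub), here is a minimal sequential sketch; the distance matrix and the value of p are invented for illustration, and the GPU-parallel GA itself is not reproduced.

```python
# Sketch of the initial-solution heuristic: place the p hubs at the most
# "central" nodes, then give every node a single allocation to its nearest
# hub. Distances and p below are illustrative, not from the paper.

def initial_solution(dist, p):
    """dist: square distance matrix; returns (hubs, allocation)."""
    n = len(dist)
    # Centrality of a node = total distance to all others (lower = more central).
    centrality = [sum(dist[i]) for i in range(n)]
    hubs = sorted(range(n), key=lambda i: centrality[i])[:p]
    # Single allocation: each node is assigned to exactly one (nearest) hub.
    allocation = [min(hubs, key=lambda h: dist[i][h]) for i in range(n)]
    return hubs, allocation

dist = [
    [0, 2, 4, 7],
    [2, 0, 3, 6],
    [4, 3, 0, 2],
    [7, 6, 2, 0],
]
hubs, alloc = initial_solution(dist, p=2)
```

In the full GA this solution would seed the population, with crossover and mutation operating on the integer hub list and the allocation vector.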
A δ-constraint multi-objective optimization framework for operation planning of smart grids
With the increasing penetration of renewable energies into the smart grid, satisfying load and avoiding energy shortages have become a challenging issue for energy providers, owing to the intermittent nature of renewable energies. The need to reduce energy shortages and pollutant gas emissions has led to the large-scale development of energy management programs such as demand side management (DSM) to control the peak energy demand on a smart grid. The implementation of DSM programs makes the operation planning of smart grids more difficult by adding an objective for the customer stakeholder. The operation plan should satisfy all the conflicting objectives of the stakeholders (i.e., minimization of the total cost, minimization of GHG emissions, and maximization of customer satisfaction). In this paper, we present a novel multi-objective optimization framework for energy management in the smart grid that significantly reduces the peak load demand and reshapes the load profile. The proposed framework comprises four main components: (1) a forecasting model that predicts the 24-h-ahead energy load, (2) a load-shifting DSM program that reduces the energy load during peak demand, (3) a piecewise linear approximation method that linearizes the non-linear objective functions and constraints, and (4) a δ-constraint multi-objective optimization method that efficiently finds the Pareto frontier solutions. The capabilities of the proposed framework are demonstrated on a synthetic smart grid case study with 50 buildings. The results reveal that the proposed framework successfully meets the desired load curve while obtaining a significantly larger Pareto frontier solution set (with more non-dominated solutions) in less computational time.
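The constraint-based multi-objective idea behind the framework can be illustrated on a toy problem: minimize a primary objective (cost) while bounding a secondary one (emissions), then sweep the bound to trace out non-dominated solutions. The candidate plans and objective values below are made up; the paper's δ-constraint method operates on the linearized smart-grid model rather than an enumerated set.

```python
# Toy sketch of a constraint-sweep multi-objective search: for each bound on
# emissions, pick the cheapest feasible plan; the sweep collects the
# non-dominated (Pareto) solutions. All numbers are illustrative.

candidates = [  # (cost, emissions) of hypothetical feasible operating plans
    (10, 9), (12, 7), (15, 4), (20, 2), (25, 2), (14, 8),
]

def constrained_min(bound):
    """Cheapest plan whose emissions do not exceed the bound."""
    feasible = [c for c in candidates if c[1] <= bound]
    return min(feasible) if feasible else None

frontier = []
for bound in range(2, 10):  # sweep the emissions bound
    best = constrained_min(bound)
    if best is not None and best not in frontier:
        frontier.append(best)
```

Each entry of `frontier` trades lower cost for higher emissions, which is the shape of the Pareto frontier the abstract refers to.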
Machine learning models for estimating contamination across different curbside collection strategies
Contaminated recyclables, which are frequently discarded as waste, pose a significant challenge to the implementation of a circular economy. These contaminated recyclables impede the circulation of resources, resulting in higher processing costs at material recovery facilities (MRFs). Over the past few decades, machine learning (ML) models such as linear regression (LR), support vector machine (SVM), and random forest (RF) have evolved to provide new methods for predicting inbound contamination rates in addition to traditional statistical models. In this study, we applied ML models to predict inbound contamination rates using demographic features from 15 counties in the U.S. with different curbside collection strategies. In general, we found that ML models outperformed linear mixed models. Specifically, SVM models had the highest performance (R² = 0.75; mean absolute error (MAE) = 0.06), which may be due to their ability to model nonlinear relationships between the features and inbound contamination rates. The key predictor was population size; the poverty rate was positively correlated and the median age negatively correlated with inbound contamination rates. To improve the management of contamination and enhance the implementation of a circular economy, better models are needed to understand and estimate inbound contamination rates, as well as to identify critical factors in the present and future.
• We examine the factors influencing inbound paper contamination at MRFs.
• The importance of the factors is determined by three machine learning models.
• Contamination levels in single stream were found to be higher than those in commingled collection.
• The SVM regressor obtained the best performance.
• Population size is determined to be the most significant factor by all three ML methods.
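The abstract reports model quality as R² and mean absolute error (MAE). As a minimal sketch of how those two metrics are computed from predicted versus observed contamination rates (the values below are illustrative, not the study's data):

```python
# Coefficient of determination (R²) and mean absolute error (MAE) computed
# from scratch for a set of predicted vs. observed contamination rates.

def r_squared(y_true, y_pred):
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))  # residual sum of squares
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)             # total sum of squares
    return 1 - ss_res / ss_tot

def mae(y_true, y_pred):
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical observed and predicted inbound contamination rates.
y_true = [0.10, 0.25, 0.30, 0.15]
y_pred = [0.12, 0.22, 0.28, 0.18]
```

An R² near 1 and an MAE near 0 indicate a close fit, which is the sense in which the SVM model's R² = 0.75 and MAE = 0.06 mark the best performance among the compared models.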
Recovering value from single stream material recovery facilities – An outbound contamination analysis in Florida
• We investigate the contamination issue in SSR in Florida by analyzing audit data.
• A high reject rate in inbound streams is the main driver of the contamination issue.
• Grit/fines/sweepings, mixed plastics, and scrap metals are the main contaminants.
• Contamination rates in the glass and colored HDPE streams can be up to 45%.
• None of the 266 ONP samples and only 31.4% of OCC samples meet the mill standards.
The single stream recycling (SSR) program is a process in which all recyclable materials are deposited into a single collection bin. SSR has gained popularity in the U.S. owing to the efficiency of its collection process; in Florida specifically, more than twenty counties have recently switched their recycling program from dual stream recycling (DSR) to SSR. Despite the more efficient collection process, mixing all recyclable materials into a single bin can lead to cross-contamination even before the materials reach material recovery facilities (MRFs). This study aims to provide a better understanding of the sorting process and equipment in MRFs, and of the impact of the SSR program on contamination rates in outbound materials processed through Florida's recycling systems. First, we investigate audit data obtained from a currently operating MRF in Florida using mass flow analysis to identify the most problematic recyclable streams and the processes with low efficiency and high false separation rates. According to our results, the sorting rates of mixed paper, glass, and plastics are below industry standards. Moreover, we investigate the outbound contamination rates of 35 old corrugated cardboard (OCC) and 266 old newsprint (ONP) samples obtained from four currently operating MRFs in Florida. Based on the results, only 31.4% of OCC samples and none of the ONP samples were within the mills' accepted standards for contamination rates. This study provides valuable insights for lowering contamination and raising end-product quality by identifying the problematic contaminants and processes in sorting and separation at MRFs.
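To give a sense of the mass-flow bookkeeping behind an audit like this, here is a minimal sketch of two per-sorter quantities the abstract mentions, sorting (capture) rate and false separation rate, computed from hypothetical masses; the function and all numbers are illustrative, not the study's data.

```python
# For one sorting step, compute the capture rate (share of target material
# recovered) and the false separation rate (share of the output stream that
# is non-target material). All masses are hypothetical, in kg.

def sorting_metrics(target_in, target_captured, nontarget_captured):
    """target_in: mass of target material entering the sorter;
    target_captured / nontarget_captured: masses ending up in the
    target output stream."""
    capture_rate = target_captured / target_in
    output_total = target_captured + nontarget_captured
    false_separation = nontarget_captured / output_total
    return capture_rate, false_separation

# e.g. 1000 kg of mixed paper enters; 850 kg is captured, while 150 kg of
# other material also ends up in the paper stream.
cap, false_sep = sorting_metrics(1000, 850, 150)
```

A stream whose capture rate falls below an industry benchmark, or whose false separation rate is high, is flagged as one of the problematic processes the study identifies.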
Microgrid Operational Planning Using Deviation Clustering Within a DDDAS Framework
As climate change progresses and the global population continues to increase, meeting the energy demand is an issue that has been brought to the forefront of the conversation. Microgrids (MGs) are groundbreaking tools that have risen in popularity to combat this crisis by capitalizing on renewable, distributed energy resources to efficiently satisfy the energy demand, informed by environmental sensor data received via telemetry. In this work, we present a deviation clustering (DC) algorithm within a dynamic data-driven application systems (DDDAS) framework to reduce the length of the MG dispatch model's planning horizon while retaining the temporal characteristics of the initial load profile. The DDDAS framework allows the current dispatch decisions to be adjusted in near real time. We develop two modules embedded within this framework: the first is a proposed rule-based policy (RBP) that modifies the sensing strategy, and the second is the DC algorithm, which reduces the execution time of the MG simulation. Numerical analysis was conducted on the IEEE-18 bus test network to assess the performance of the proposed framework and determine an appropriate threshold for clustering. The limitations of the presented framework were also determined by examining the tradeoff between the solver's solution time and the accuracy of the resulting solution. The results indicate a decrease in solution time within the desired accuracy limits when using the proposed approach as opposed to traditional load dispatch.
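A simplified sketch of the deviation-clustering idea: merge consecutive time steps of a load profile whose load stays within a deviation threshold, so the dispatch model sees fewer, longer periods while the profile's shape is preserved. The grouping rule, threshold, and profile below are illustrative assumptions, not the paper's exact algorithm.

```python
# Group consecutive load values; start a new cluster whenever the load
# deviates from the current cluster's running mean by more than a threshold.
# The reduced profile keeps one (mean load, duration) pair per cluster.

def deviation_cluster(load, threshold):
    clusters = [[load[0]]]
    for x in load[1:]:
        current = clusters[-1]
        mean = sum(current) / len(current)
        if abs(x - mean) <= threshold:
            current.append(x)       # within tolerance: extend the cluster
        else:
            clusters.append([x])    # deviation too large: new cluster
    return [(sum(c) / len(c), len(c)) for c in clusters]

# Illustrative 8-step load profile (e.g. kW) reduced to 3 periods.
profile = [10, 11, 10, 30, 31, 29, 12, 11]
reduced = deviation_cluster(profile, threshold=3)
```

Here an 8-step horizon collapses to 3 representative periods, which is the horizon-length reduction that shortens the dispatch model's solution time.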
Dynamic Data Driven Application Systems for Identification of Biomarkers in DNA Methylation
The term 'epigenetic' refers to all heritable alterations that occur in a given gene's function without any change to the deoxyribonucleic acid (DNA) sequence. Epigenetic modifications play a crucial role in the development and differentiation of various diseases, including cancer. The specific epigenetic alteration that has garnered a great deal of attention is DNA methylation, i.e., the addition of a methyl group to cytosine. Recent studies have shown that different tumor types have distinct methylation profiles. Identifying the idiosyncratic DNA methylation profiles of different tumor types and subtypes can provide invaluable insights for accurate diagnosis, early detection, and tailoring of the related cancer treatment. In this study, our goal is to identify the informative genes (biomarkers) whose methylation-level changes correlate with a specific cancer type or subtype. To achieve this goal, we propose a novel high-dimensional learning framework inspired by the dynamic data driven application systems paradigm to identify the biomarkers, determine the outlier(s), and improve the quality of the resultant disease detection. The proposed framework starts with a principal component analysis (PCA) followed by hierarchical clustering (HCL) of the observations and the determination of informative genes based on the HCL predictions. The capabilities and performance of the proposed framework are demonstrated using a DNA methylation dataset on lung cancer stored in the Gene Expression Omnibus (GEO) DataSets. The preliminary results demonstrate that our framework outperforms conventional clustering algorithms with embedded dimension reduction methods in its efficiency at identifying informative genes and outliers and removing their contaminating effects, at the expense of a reasonable computational cost.
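As a small, self-contained stand-in for the hierarchical-clustering step of the framework, here is a single-linkage agglomerative clustering sketch on scalar values; the paper applies HCL after PCA to high-dimensional methylation profiles, so the 1-D values and the choice of single linkage here are illustrative assumptions only.

```python
# Single-linkage agglomerative clustering: repeatedly merge the two clusters
# whose closest members are nearest, until the desired number of clusters
# remains. Values stand in for (already PCA-reduced) methylation levels.

def single_linkage(points, n_clusters):
    clusters = [[p] for p in points]
    while len(clusters) > n_clusters:
        best = None  # (distance, i, j) of the closest pair of clusters
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(abs(a - b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)  # j > i, so index i stays valid
    return [sorted(c) for c in clusters]

# Two well-separated groups of hypothetical methylation levels.
values = [0.1, 0.15, 0.12, 0.8, 0.85, 0.82]
groups = single_linkage(values, n_clusters=2)
```

In the framework, samples that end up far from every cluster would be flagged as outliers, and genes that separate the clusters well would be candidate biomarkers.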