23 research outputs found

    ACCPndn: Adaptive Congestion Control Protocol in Named Data Networking by learning capacities using optimized Time-Lagged Feedforward Neural Network

    Named Data Networking (NDN) is a promising network architecture being considered as a possible replacement for the current IP-based Internet infrastructure. However, NDN is subject to congestion when the number of data packets arriving at one or more routers in a given period is so high that their queues overflow. Many congestion control protocols have been proposed in the literature to address this problem; however, they are highly sensitive to their control parameters and unable to predict congestion traffic sufficiently far in advance. This paper develops an Adaptive Congestion Control Protocol in NDN (ACCPndn) that learns in two phases to control congestion traffic before it starts to impact network performance. In the first phase – adaptive training – we propose a Time-Lagged Feedforward Network (TLFN), optimized by a hybrid of particle swarm optimization and a genetic algorithm, to predict both the source and the amount of congestion. In the second phase – fuzzy avoidance – we employ a non-linear fuzzy logic-based control system that uses the outcomes of the first phase to make a proactive decision in each router, per interface, to control and/or prevent packet drops well in advance. Extensive simulations show that ACCPndn satisfies the applied performance metrics and outperforms two previous proposals, NACK and HoBHIS, in terms of minimal packet drop and high utilization (retrying alternative paths) of bottleneck links to mitigate congestion traffic
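The two-phase idea can be sketched as follows. This is a minimal illustration, not the paper's method: the real first phase is a TLFN tuned by a PSO/GA hybrid, and the real second phase is a fuzzy controller; here a fixed-weight tapped delay line and hard thresholds stand in for both, and all names, weights, and thresholds are assumptions.

```python
from collections import deque

def predict_congestion(history, weights):
    """Stand-in for ACCPndn's first phase: forecast the next queue
    occupancy from the last k samples via a tapped delay line
    (the paper uses a Time-Lagged Feedforward Network instead)."""
    return sum(w * x for w, x in zip(weights, history))

def avoidance_action(predicted_occupancy, capacity):
    """Stand-in for the fuzzy-avoidance phase: map the forecast to a
    proactive per-interface decision before the queue overflows.
    Thresholds and action names are illustrative, not the paper's."""
    load = predicted_occupancy / capacity
    if load < 0.5:
        return "forward"      # low congestion risk: business as usual
    if load < 0.8:
        return "slow_down"    # moderate risk: throttle incoming Interests
    return "reroute"          # high risk: retry an alternative path

history = deque([40, 55, 70], maxlen=3)   # recent queue-occupancy samples
forecast = predict_congestion(history, [0.2, 0.3, 0.5])
print(avoidance_action(forecast, capacity=100))  # "slow_down"
```

The point of acting on a forecast rather than the current occupancy is that the router can throttle or reroute before the drop happens, which is the "well in advance" property the abstract claims.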

    The Integration of Maintenance Decisions and Flow Shop Scheduling

    In conventional production and service scheduling problems, it is assumed that the machines can process jobs continuously and that the available information is complete and certain. In practice, however, machines must stop for preventive or corrective maintenance, and the information available to planners can be both incomplete and uncertain. In this dissertation, the integration of maintenance decisions and production scheduling is studied in a permutation flow shop setting. Several variations of the problem are modeled as (stochastic) mixed-integer programs. These models incorporate technical nuances that increase their practicality: multiple types of maintenance, the combination of maintenance activities, and the impact of maintenance on the processing times of production jobs. The solution methodologies involve studying the solution space of the problems, genetic algorithms, stochastic optimization, multi-objective optimization, and extensive computational experiments. The applicability of the models and their managerial implications are demonstrated through a case study of earthmoving operations in construction projects
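The core object the dissertation schedules can be sketched in a few lines. This is a simplification under stated assumptions: the real models are (stochastic) mixed-integer programs with several maintenance types, while here a single fixed-duration preventive maintenance slot is inserted into a standard permutation flow shop makespan recursion; the function name and arguments are illustrative.

```python
def makespan(perm, proc, maint_after=None, maint_dur=0):
    """Makespan of a permutation flow shop schedule.  proc[j][m] is the
    processing time of job j on machine m.  As a simplification of the
    dissertation's integrated models, an optional fixed-duration
    preventive maintenance can be inserted on every machine after the
    job in position `maint_after`."""
    n_machines = len(proc[0])
    finish = [0] * n_machines   # completion time of the last job per machine
    for pos, job in enumerate(perm):
        for m in range(n_machines):
            # a job starts when both the machine and the job are free
            start = max(finish[m], finish[m - 1] if m else 0)
            finish[m] = start + proc[job][m]
        if maint_after is not None and pos == maint_after:
            finish = [t + maint_dur for t in finish]  # machines stop for PM
    return finish[-1]

proc = [[3, 2], [2, 4]]                 # two jobs, two machines
print(makespan([0, 1], proc))           # 9
print(makespan([1, 0], proc))           # 8: sequencing already matters
print(makespan([0, 1], proc, maint_after=0, maint_dur=2))  # 11
```

Even this toy instance shows why sequencing and maintenance placement must be decided jointly: the maintenance slot shifts every downstream completion time, so the best permutation can change once maintenance is included.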

    Enhancing the bees algorithm using the traplining metaphor

    This work aims to improve the performance of the Bees Algorithm (BA), particularly in terms of simplicity, accuracy, and convergence. Three improvements inspired by bees’ traplining behaviour were made in this study. The first was a parameter reduction of the Bees Algorithm. This strategy recruits and assigns worker bees to exploit and explore all patches, with both search processes assigned using a Triangular Distribution Random Number Generator: the most promising patches receive more workers and are subject to more exploitation than the less productive ones. This technique reduced the original set of parameters to two. The results show that the resulting Bi-BA is just as efficient as the basic BA despite having fewer parameters. The second improvement increases the diversification performance of the Combinatorial Bees Algorithm (CBA). The technique employs a novel constructive heuristic that considers both the distance and the turning angle of the bees’ flight. When foraging, bees generally avoid making sharp turns; including the turning angle as a second consideration controls the diversity of the CBA’s initial solutions. Third, the CBA is strengthened with an intensification strategy that avoids falling into local optima. The approach is based on the behaviour of bees confronted with threats: they avoid re-visiting flowers on the next bout because of predators, rivals, or depleted nectar. The strategy temporarily removes threatened flowers from the whole tour, eliminating sharp turns, and reintroduces them at the nearest edge of the habitual tour. This technique effectively balances the exploration and exploitation mechanisms, and the results show it is very competitive with other population-based nature-inspired algorithms. Finally, the enhanced Bees Algorithms are demonstrated on two real-world engineering problems: Printed Circuit Board insertion sequencing and the vehicle routing problem
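The first improvement, triangular-distribution recruitment, can be sketched directly. This is an illustrative reading of the abstract, not the thesis's code: the patch representation, field names, and bee count are assumptions; only the mechanism (rank patches, draw each bee's patch from a triangular distribution with its mode at the best patch) comes from the text.

```python
import random

def recruit(patches, n_bees):
    """Sketch of Bi-BA's parameter-reduced recruitment: patches are
    ranked by fitness and each bee's patch index is drawn from a
    triangular distribution with its mode at the best patch, so the
    most promising patches automatically receive the most foragers."""
    ranked = sorted(patches, key=lambda p: p["fitness"], reverse=True)
    counts = [0] * len(ranked)
    for _ in range(n_bees):
        idx = int(random.triangular(0, len(ranked), 0))  # mode 0 = best patch
        counts[min(idx, len(ranked) - 1)] += 1
    return [(p["name"], c) for p, c in zip(ranked, counts)]

random.seed(0)
allocation = recruit(
    [{"name": "A", "fitness": 0.9},
     {"name": "B", "fitness": 0.5},
     {"name": "C", "fitness": 0.1}],
    n_bees=30,
)
print(allocation)
```

Because the split between exploitation (good patches) and exploration (poor patches) falls out of one distribution, the separate site/recruitment parameters of the basic BA collapse into the two parameters the abstract mentions.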

    Novel Tornado-Like Vortex Generator with Intelligent Controller

    Cooking fumes may cause multiple adverse health effects, and range hoods play a central role in controlling the indoor air pollution they cause. However, the traditional range hood design has low efficiency due to its working principle, and this efficiency decreases rapidly as the mounting height of the exhaust hood increases. This thesis aims to design and build a novel tornado-like vortex generator (TLVG) with an intelligent controller to enhance the efficiency of a traditional range hood. Both experimental results and numerical simulation indicate that most of the cooking fumes spread to the surrounding area when the traditional range hood works alone, whereas the fumes are drawn into the tornado-like vortex and exhausted through the range hood when the novel TLVG is on. The effects of various factors on the fume-capture efficiency are analyzed by orthogonal experiment design. The results show that the key factor affecting the performance of the TLVG is the horizontal jet angle. A higher jet velocity results in a lower negative pressure, which helps concentrate and exhaust the fumes. The results also reveal that the exhaust flow velocity only marginally affects the pressure around the fume source, but that the tornado-like vortex cannot form when the exhaust flow velocity is too high. In addition, the velocity field, pressure field, and particle-tracking field are plotted and analyzed. An intelligent controller for the TLVG is then designed and simulated to adapt it to various types of range hoods. The controller uses an Adaptive-Network-based Fuzzy Inference System (ANFIS), which combines the merits of Fuzzy Inference Systems and Neural Networks; the numerical simulation results for the TLVG are used to train and test it, and Particle Swarm Optimization (PSO) is used for effective training of the ANFIS networks. Digital simulation results demonstrate that the designed ANFIS-Swarm controller predicts the checking data better than a basic ANFIS controller. This study provides information for improving the kitchen environment and can also be applied to different types of range hoods, exhaust ventilation systems, and air pollution control
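The PSO training step mentioned above can be sketched in isolation. This is a minimal particle swarm optimizer of the textbook form; the thesis applies it to ANFIS premise/consequent parameters, whereas here the loss is a simple quadratic stand-in for the controller's training error, and all coefficients (inertia 0.7, acceleration 1.5) are conventional illustrative values, not the thesis's settings.

```python
import random

def pso_minimize(loss, dim, n_particles=20, iters=100):
    """Minimal particle swarm optimizer of the kind used to train the
    ANFIS controller; `loss` stands in for the training error on the
    TLVG simulation data."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # personal best positions
    gbest = min(pbest, key=loss)[:]          # global best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]                        # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
                pos[i][d] += vel[i][d]
            if loss(pos[i]) < loss(pbest[i]):
                pbest[i] = pos[i][:]
                if loss(pbest[i]) < loss(gbest):
                    gbest = pbest[i][:]
    return gbest

random.seed(0)
# stand-in training error: a simple quadratic with its minimum at the origin
best = pso_minimize(lambda x: sum(v * v for v in x), dim=2)
```

PSO needs only loss evaluations, no gradients, which is why it pairs well with ANFIS, whose error surface over membership-function parameters is awkward to differentiate by hand.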

    Improving the performance of deep learning techniques using nature inspired algorithms and applying them in porosity prediction

    Within the field of Artificial Intelligence (AI), Deep Learning (DL) based on Convolutional Neural Networks (CNNs) can be used for analysing images. However, the performance of DL models depends on the design of the CNN topology. This work therefore first proposes a novel nature-inspired hybrid algorithm called BA-CNN, in which the swarm-based Bees Algorithm (BA) is used to optimize the CNN parameters. In addition, an algorithm called BA-BO-CNN is proposed that combines the BA with Bayesian Optimization (BO) to improve on the performance of the CNN, BA-CNN, and BO-CNN. This study shows that applying the hybrid BA-CNN to the ‘Cifar10DataDir’ benchmark images did not improve validation and testing accuracy compared to the existing CNN and BO-CNN. However, the hybrid BA-BO-CNN achieved a better validation accuracy of 82.22% compared to 80.34% and 80.72% for the CNN and BO-CNN, and a better testing accuracy of 80.74% compared to 80.54% and 80.69% respectively; it also required 2 minutes and 11 seconds less computation time than the BO-CNN. Although applying both algorithms to the ‘digits’ dataset produced almost identical accuracies, with a difference of 0.01% between BA-CNN and BO-CNN, the BA-CNN reduced the computation time by 4 minutes and 14 seconds compared to the BO-CNN, making it the most cost-effective algorithm. Applying BA-CNN and BA-BO-CNN to identify ‘concrete cracks’ images produced results almost identical to some existing algorithms, with a difference of 0.02% between BA-CNN and the original CNN. Finally, applying them to the ‘ECG’ images improved the testing accuracy from 90% for the BO-CNN to 92.50% for the BA-CNN and 95% for the BA-BO-CNN, with a similar trend for validation accuracy and computation time.
    Secondly, a CNN adapted for regression, called RCNN, was applied in a manufacturing context, specifically to predict the porosity percentage of finished Selective Laser Melting (SLM) parts. Because testing the RCNN requires a large amount of experimental data that is generally difficult to obtain, this study proposes an artificial porosity image creation method in which 3000 artificial porosity images were created mimicking real CT scan slices of an SLM part, with a similarity index of 0.9976. Applying the RCNN to the 3000 artificial porosity image slices improved the porosity prediction accuracy from 68.60% for the image binarization method to 75.50%, while the proposed novel hybrid BA-BO-RCNN and BA-RCNN yielded better prediction accuracies of 83% and 85.33% respectively. Thirdly, to improve performance even further, this study adds Long Short-Term Memory (LSTM) to BA-CNN, exploiting its ability to handle sequential data, to produce another novel hybrid algorithm called BA-CNN-LSTM; the results show an increase in prediction accuracy to 95.50%
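The BA-driven topology search can be sketched without training any actual network. This is a hypothetical illustration: the search space, scout/elite/recruit counts, and the surrogate `accuracy` function are all assumptions; the thesis scores each candidate by actually training and validating a CNN, which is replaced here by a cheap stand-in so the loop itself is visible.

```python
import random

# hypothetical CNN search space; the thesis tunes the real topology
SPACE = {"filters": [8, 16, 32, 64], "kernel": [3, 5, 7], "layers": [1, 2, 3]}

def sample():
    return {k: random.choice(v) for k, v in SPACE.items()}

def neighbour(cfg):
    """Local search around an elite site: re-draw one hyperparameter."""
    cfg = dict(cfg)
    key = random.choice(list(SPACE))
    cfg[key] = random.choice(SPACE[key])
    return cfg

def bees_search(score, n_scouts=10, n_elite=3, n_recruits=5, iters=20):
    """Bees-Algorithm-style topology search in the spirit of BA-CNN:
    scouts sample topologies at random, the elite sites receive
    recruited local search, and the rest of the population is replaced
    by fresh scouts.  `score` stands in for validation accuracy after
    a short training run of each candidate CNN."""
    sites = [sample() for _ in range(n_scouts)]
    for _ in range(iters):
        sites.sort(key=score, reverse=True)
        elites = [max([s] + [neighbour(s) for _ in range(n_recruits)], key=score)
                  for s in sites[:n_elite]]
        sites = elites + [sample() for _ in range(n_scouts - n_elite)]
    return max(sites, key=score)

random.seed(2)
TARGET = {"filters": 32, "kernel": 5, "layers": 2}   # pretend-optimal topology
accuracy = lambda cfg: sum(cfg[k] == TARGET[k] for k in TARGET)
best = bees_search(accuracy)
```

The expensive part in practice is each `score` call (a training run), which is why the abstract reports computation-time differences between the search strategies as prominently as accuracy.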

    Expanding Dimensionality in Cinema Color: Impacting Observer Metamerism through Multiprimary Display

    Television and cinema displays are both trending toward greater ranges and saturation of reproduced colors, made possible by near-monochromatic RGB illumination technologies. Through current broadcast and digital cinema standards work, system designs employing laser light sources, narrow-band LEDs, quantum dots, and others are being actively endorsed in promotion of Wide Color Gamut (WCG). Despite the artistic benefits brought to creative content producers, spectrally selective excitation of naturally different human color response functions exacerbates variability in observer experience. Such exaggerated variation in color sensing runs explicitly counter to the exhaustive controls and calibrations employed in modern motion picture pipelines. Further, singular standard-observer summaries of human color vision, such as the CIE’s 1931 and 1964 color matching functions used extensively in motion picture color management, fail to recognize expected variability in human vision. Many researchers have confirmed the magnitude of observer metamerism in color matching, in both uniform colors and imagery, but few have demonstrated explicit color management aimed at minimizing variability in observer perception. This research shows not only that observer metamerism influences can be quantitatively predicted and confirmed psychophysically, but that intentionally engineered multiprimary displays employing more than three primaries can offer an increased color gamut with drastically improved consistency of experience. To this end, a seven-channel prototype display has been constructed based on observer metamerism models and color difference indices derived from the latest color vision demographic research. In forced-choice paired comparison tests across a large population of color-normal observers, this display delivered superior color matches to reference stimuli compared with both contemporary standard RGB cinema projection and the recently ratified standard laser projection
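The core phenomenon, observer metamerism, can be demonstrated numerically. This is a toy construction, not the research's model: real color matching functions sample the visible spectrum at a few nanometres' resolution, whereas here 4-band "spectra" and two hand-built 3×4 observer matrices show how two stimuli can match for one observer yet differ for another.

```python
def tristimulus(cmf, spectrum):
    """Integrate a spectral stimulus against one observer's colour
    matching functions (toy 4-band spectra; real CMFs sample the
    visible range every few nanometres)."""
    return [sum(c * s for c, s in zip(row, spectrum)) for row in cmf]

# Two hypothetical observers whose CMFs differ slightly in one band,
# standing in for the demographic variability the research models.
obs_reference = [[1.0, 0.0,  0.0, 1.0],
                 [0.0, 1.0,  0.0, 1.0],
                 [0.0, 0.0,  1.0, 1.0]]
obs_deviant   = [[1.0, 0.25, 0.0, 1.0],   # extra sensitivity in band 2
                 [0.0, 1.0,  0.0, 1.0],
                 [0.0, 0.0,  1.0, 1.0]]

smooth = [0.5, 0.5, 0.5, 0.5]        # broadband stimulus
spiky  = [0.75, 0.75, 0.75, 0.25]    # spectrally selective stimulus

# A metameric match for the reference observer...
match_ref = tristimulus(obs_reference, smooth) == tristimulus(obs_reference, spiky)
# ...breaks down for the deviant observer: observer metamerism.
match_dev = tristimulus(obs_deviant, smooth) == tristimulus(obs_deviant, spiky)
print(match_ref, match_dev)  # True False
```

The spikier the display primaries, the easier it is to construct such divergent pairs, which is why narrow-band WCG sources worsen the effect and why added primaries give the optimizer room to smooth it out.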

    Evolutionary Algorithms in Engineering Design Optimization

    Evolutionary algorithms (EAs) are population-based global optimizers whose characteristics have allowed many real-world optimization problems to be solved in a straightforward way over the last three decades, particularly in engineering fields. Their main advantages are the following: they impose no requirements on the objective/fitness function (continuity, differentiability, convexity, etc.), and they are not limited by the presence of discrete and/or mixed variables or by the need for uncertainty quantification in the search. Moreover, they can handle more than one objective function simultaneously through evolutionary multi-objective optimization algorithms. This set of advantages, together with the continuously increasing computing capability of modern computers, has expanded their application in research and industry. From the application point of view, all engineering fields are welcome in this Special Issue, such as aerospace and aeronautical, biomedical, civil, chemical and materials science, electronic and telecommunications, energy and electrical, manufacturing, logistics and transportation, mechanical, naval architecture, reliability, robotics, and structural engineering. Within the EA field, contributions that integrate innovative and improved aspects into the algorithms for solving real-world engineering design problems in the abovementioned application fields are welcomed and encouraged, including parallel EAs, surrogate modelling, hybridization with other optimization techniques, and multi-objective and many-objective optimization
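The black-box property described above can be made concrete with a minimal EA. This is a generic (mu + lambda)-style sketch, not any specific algorithm from the Special Issue; population size, mutation scale, and the non-smooth test objective are all illustrative choices.

```python
import random

def evolve(fitness, dim, pop_size=30, generations=100, sigma=0.3):
    """Minimal (mu + lambda)-style evolutionary algorithm: it needs
    nothing from `fitness` beyond the ability to evaluate it -- no
    continuity, differentiability, or convexity -- which is exactly
    the advantage highlighted above."""
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # variation: Gaussian mutation of randomly chosen parents
        children = [[g + random.gauss(0, sigma) for g in random.choice(pop)]
                    for _ in range(pop_size)]
        # elitist truncation selection over parents and children combined
        pop = sorted(pop + children, key=fitness)[:pop_size]
    return pop[0]

random.seed(1)
# a non-smooth objective is no obstacle: minimise a sum of absolute values
best = evolve(lambda x: sum(abs(v) for v in x), dim=3)
```

Gradient-based methods would need special handling at the kink of |v| at zero; the EA simply keeps whichever candidates evaluate best, which is why the abstract stresses that no properties of the objective are required.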