
    Predictive modeling of die filling of the pharmaceutical granules using the flexible neural tree

    In this work, a computational intelligence (CI) technique named the flexible neural tree (FNT) was developed to predict the die filling performance of pharmaceutical granules and to identify significant die filling process variables. The FNT resembles a feedforward neural network but builds a tree-like structure using genetic programming. To improve accuracy, the FNT parameters were optimized using the differential evolution algorithm. The performance of the FNT-based CI model was evaluated and compared with other CI techniques: the multilayer perceptron, Gaussian process regression, and the reduced error pruning tree. The accuracy of the CI model was evaluated experimentally using die filling as a case study. The die filling experiments were performed using a model shoe system and three grades of microcrystalline cellulose (MCC) powder (MCC PH 101, MCC PH 102, and MCC DG). The feed powders were roll-compacted and milled into granules, which were then sieved into samples of various size classes. The mass of granules deposited into the die at different shoe speeds was measured. From these experiments, a dataset was generated with true density, mean diameter (d50), granule size, and shoe speed as the inputs and the deposited mass as the output. Cross-validation (CV) methods, namely 10FCV and 5x2FCV, were applied to develop and validate the predictive models. The FNT-based CI model performed much better than the other CI models under both CV methods. Additionally, process variables such as granule size and shoe speed had a greater impact on predictability than powder properties such as d50. Furthermore, validation of the model predictions against experimental data showed that the die filling behavior of coarse granules could be predicted better than that of fine granules.
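    The differential evolution step described above can be sketched as a minimal DE/rand/1/bin loop. The quadratic objective below is only a stand-in for the FNT training error, and the population size, F, CR, bounds, and iteration count are illustrative assumptions, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(p):
    # Stand-in for the FNT training error on the die-filling dataset.
    return float(np.sum(p ** 2))

def differential_evolution(obj, dim, pop_size=20, F=0.8, CR=0.9, iters=200):
    pop = rng.uniform(-5, 5, size=(pop_size, dim))
    scores = np.array([obj(x) for x in pop])
    for _ in range(iters):
        for i in range(pop_size):
            # DE/rand/1: pick three distinct partners, none equal to i
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = a + F * (b - c)
            cross = rng.random(dim) < CR          # binomial crossover mask
            cross[rng.integers(dim)] = True       # guarantee one mutated gene
            trial = np.where(cross, mutant, pop[i])
            s = obj(trial)
            if s < scores[i]:                     # greedy one-to-one selection
                pop[i], scores[i] = trial, s
    best = int(np.argmin(scores))
    return pop[best], scores[best]

params, err = differential_evolution(objective, dim=4)
```

In the paper's setting, `objective` would train/evaluate a candidate FNT on the die-filling dataset instead of the quadratic used here.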

    Improving Soil Stability with Alum Sludge: An AI-Enabled Approach for Accurate Prediction of California Bearing Ratio

    Alum sludge is a byproduct of water treatment plants, and its use as a soil stabilizer has gained increasing attention due to its economic and environmental benefits. Its application has been shown to improve the strength and stability of soil, making it suitable for various engineering applications. Going beyond simply measuring the effects of alum sludge as a soil stabilizer, this study investigates the potential of artificial intelligence (AI) methods for predicting the California bearing ratio (CBR) of soils stabilized with alum sludge. Three AI methods, namely two black-box methods (artificial neural networks and support vector machines) and one grey-box method (genetic programming), were used to predict CBR based on a database with nine input parameters. The results demonstrate the effectiveness of AI methods in predicting CBR with good accuracy (R2 values ranging from 0.94 to 0.99 and MAE values ranging from 0.30 to 0.51). Moreover, a novel genetic programming approach produced an equation that accurately estimated CBR from seven inputs. The analysis of parameter sensitivity and importance revealed that the number of hammer blows for compaction was the most important parameter, while the maximum dry density of the soil and of the mixture were the least important. This study highlights the potential of AI methods as a useful tool for predicting the performance of alum sludge as a soil stabilizer. © 2023 by the authors.
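    The accuracy metrics quoted above (R2 and MAE) can be computed directly from measured and predicted values; the CBR numbers below are hypothetical and only illustrate the computation:

```python
import numpy as np

def r2_score(y_true, y_pred):
    # Coefficient of determination: 1 - SS_res / SS_tot
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1.0 - ss_res / ss_tot)

def mae(y_true, y_pred):
    # Mean absolute error
    return float(np.mean(np.abs(y_true - y_pred)))

# Hypothetical measured vs. predicted CBR values, for illustration only
y_true = np.array([4.2, 7.8, 11.5, 15.1, 19.6])
y_pred = np.array([4.5, 7.4, 11.9, 14.8, 19.2])

r2 = r2_score(y_true, y_pred)
err = mae(y_true, y_pred)
```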

    Investigating the Impact of Independent Rule Fitnesses in a Learning Classifier System

    Achieving even a basic level of explainability requires complex analyses for many machine learning systems, such as common black-box models. We recently proposed a new rule-based learning system, SupRB, which constructs compact, interpretable and transparent models by utilizing separate optimizers for the model selection tasks of rule discovery and rule set composition. This allows users to tailor their model structure to fulfil use-case-specific explainability requirements. From an optimization perspective, it allows us to define clearer goals, and we find that, in contrast to many state-of-the-art systems, it allows us to keep rule fitnesses independent. In this paper we investigate this system's performance thoroughly on a set of regression problems and compare it against XCSF, a prominent rule-based learning system. We find the overall results of SupRB's evaluation comparable to XCSF's, while allowing easier control of model structure and showing substantially smaller sensitivity to random seeds and data splits. This increased control can aid in subsequently providing explanations for both the training and the final structure of the model. Comment: arXiv admin note: substantial text overlap with arXiv:2202.0167

    Soft Computing Approach to Automatic Test Pattern Generation for Sequential VLSI Circuits

    Due to the constant development of integrated circuits, the automatic test pattern generation problem has become increasingly vital for sequential VLSI circuits, and testing of integrated circuits and systems has become a difficult problem. In this paper we discuss the problem of automatic test sequence generation using particle swarm optimization (PSO), and a technique for structure optimization of a deterministic test pattern generator using a genetic algorithm (GA).
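    The PSO component can be sketched as follows. In the paper, the fitness would score a candidate test sequence by the faults it detects in the circuit; here a simple quadratic stands in so the swarm update (inertia w, cognitive c1, social c2) is visible, and all constants are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def fitness(x):
    # Stand-in objective; a real ATPG fitness would reward fault coverage.
    return float(np.sum((x - 3.0) ** 2))

def pso(obj, dim, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = rng.uniform(-10, 10, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()                                  # personal bests
    pbest_val = np.array([obj(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()          # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([obj(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(pbest_val.min())

best, val = pso(fitness, dim=2)
```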

    Method for optimal vertical alignment of highways

    This paper presents a methodology to account for vague soil parameters required for earthwork optimisation, and to develop a genetic algorithm-based constrained curve-fitting technique for the highway vertical alignment process. The weighted ground line method is an earthwork optimisation methodology based on a hypothetical reference line that takes into account three soil properties to calculate realistic cut-fill volumes, namely swelling potential, compactibility percentage, and material appropriateness percentage. In this study, a fuzzy rule-based inference methodology, which draws on previous experience expressed in linguistic terms, is employed to characterise swelling/shrinkage behaviour. In addition, the material appropriateness concept is incorporated into the developed optimisation methodology through a parametric algorithm using technical specifications and geotechnical data. Finally, the genetic algorithm approach is employed to determine the final grade line considering weighted ground elevations. The method, combining an algorithm that considers the soil parameters with an evolutionary constrained curve-fitting technique, produces an outstanding geometric alignment.

    USING THE VEHICLE ROUTING PROBLEM (VRP) TO PROVIDE LOGISTICS SOLUTIONS IN AGRICULTURE

    Agricultural producers consider utilizing multiple machines to reduce field completion times and thereby improve effective field capacity. Using a number of smaller machines rather than a single big machine also has benefits such as sustainability through a lower compaction risk, redundancy in the event of an equipment failure, and more flexibility in machinery management. However, machinery management is complicated by logistics issues. In this work, the allocation and ordering of field paths among a number of available machines were transformed into a solvable Vehicle Routing Problem (VRP). A basic heuristic algorithm (a modified form of the Clarke-Wright algorithm) and a meta-heuristic algorithm, Tabu Search, were employed to solve the VRP. The solution optimized field completion time as well as improving field efficiency. Both techniques were evaluated through computer simulations with 2, 3, 5, or 10 vehicles working simultaneously to complete the same operation. Furthermore, the parameters of the VRP were changed into a dynamic, multi-depot representation to enable re-routing of vehicles while the operation is ongoing. The results proved that both the Clarke-Wright and Tabu Search algorithms always generated feasible solutions, with the Tabu Search solutions outperforming those of the Clarke-Wright algorithm. As the number of vehicles increased, or the field shape became more complex, Tabu Search produced better results in terms of reducing field completion times. With 10 vehicles working together in a real-world field, Tabu Search reduced completion time by 32% relative to the modified Clarke-Wright solution. In addition, changing the parameters of the VRP produced a Dynamic, Multi-Depot VRP (DMDVRP) that could reset the routes allocated to each vehicle even while the operation was in progress.
In all the scenarios tested, the DMDVRP was able to produce new optimized routes, although the impact of these routes varied by scenario. The ability of this optimization procedure to reduce field work times was verified through real-world experiments using three tractors during a rotary mowing operation: the time to complete the field work was reduced by 17.3% and the total operating time for all tractors by 11.5%. The task of a single large machine was also simulated as a task for 2 or 3 smaller machines through computer simulations, revealing up to an 11% reduction in completion time when using three smaller machines. This time reduction improved the effective field capacity.
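    The modified Clarke-Wright algorithm mentioned above builds on the classic savings heuristic. A minimal single-depot version is sketched below; the coordinates, the capacity of two stops per route, and the end-to-end-only merge rule are illustrative simplifications, not the paper's field model:

```python
from itertools import combinations

# Toy data: a depot and four field-path endpoints (hypothetical coordinates)
depot = (0.0, 0.0)
stops = {1: (2.0, 1.0), 2: (2.5, 1.2), 3: (-1.0, 3.0), 4: (-1.2, 3.5)}

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def clarke_wright(depot, stops, capacity=2):
    # Start with one out-and-back route per stop.
    routes = {i: [i] for i in stops}
    owner = {i: i for i in stops}
    # Savings s(i,j) = d(0,i) + d(0,j) - d(i,j), processed largest first.
    savings = sorted(
        ((dist(depot, stops[i]) + dist(depot, stops[j]) - dist(stops[i], stops[j]), i, j)
         for i, j in combinations(stops, 2)),
        reverse=True)
    for s, i, j in savings:
        ri, rj = owner[i], owner[j]
        if ri == rj:
            continue  # already on the same route
        # Merge only end-to-end and only within vehicle capacity.
        if (routes[ri][-1] == i and routes[rj][0] == j
                and len(routes[ri]) + len(routes[rj]) <= capacity):
            routes[ri] += routes[rj]
            for k in routes[rj]:
                owner[k] = ri
            del routes[rj]
    return list(routes.values())

routes = clarke_wright(depot, stops)
```

On this toy layout the two nearby pairs of stops merge into two routes, which is the behaviour the savings criterion is designed to produce.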

    Surrogate models to predict maximum dry unit weight, optimum moisture content and California bearing ratio from the grain size distribution curve

    This study evaluates the applicability of a robust, novel, data-driven method in proposing surrogate models to predict the maximum dry unit weight, optimum moisture content, and California bearing ratio of coarse-grained soils using only the results of grain size distribution analysis. The data-driven analysis was conducted using evolutionary polynomial regression (MOGA-EPR) on a comprehensive database. The database included the particle diameters corresponding to 10%, 30%, 50%, and 60% passing, the coefficient of uniformity, the coefficient of curvature, dry unit weight, optimum moisture content, and California bearing ratio. The statistical assessment showed that MOGA-EPR provides robust models for predicting the maximum dry unit weight, optimum moisture content, and California bearing ratio. The new models' performance was also compared with empirical models proposed by other researchers. The comparisons showed that the new models provide enhanced accuracy: they scored lower mean absolute error and root mean square error, mean values closer to one, and higher a20-index and coefficient of correlation. The new models can therefore be used to ensure more optimised and robust design calculations.
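    EPR ultimately yields a polynomial expression in the gradation parameters. As a rough stand-in for that search, the sketch below fits a linear-in-features surrogate by plain least squares; the D10/D30/D60/Cu features and the synthetic target are invented purely for illustration and do not come from the study's database:

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented gradation features: D10, D30, D60 (mm); Cu = D60 / D10
grading = rng.uniform([0.1, 0.3, 0.8], [0.5, 1.5, 4.0], size=(40, 3))
cu = grading[:, 2] / grading[:, 0]
features = np.column_stack([grading, cu])

# Synthetic target standing in for CBR, for illustration only
y = 5.0 + 2.0 * cu + 1.5 * grading[:, 1] + rng.normal(0.0, 0.1, 40)

# EPR searches over polynomial structures; here a least-squares fit of a
# fixed linear-in-features model plays that role
A = np.column_stack([np.ones(len(features)), features])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
rmse = float(np.sqrt(np.mean((y - A @ coef) ** 2)))
```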

    Application of Machine Learning in Well Performance Prediction, Design Optimization and History Matching

    Finite-difference-based reservoir simulation is commonly used to predict well rates in unconventional reservoirs. Such detailed simulation requires accurate knowledge of the reservoir geology and can be very costly in terms of computational time. Recently, some studies have used machine learning to predict mean or maximum production rates for new wells by utilizing available well production and completion data in a given field; however, these studies cannot predict well rates as a function of time. This dissertation fills that gap by applying various machine learning algorithms to predict well decline rates as a function of time, utilizing available multi-well data (production, completion and location data) to build models that make rate decline predictions for new wells. It is concluded that well completion and location variables can be correlated to decline curve model parameters and Estimated Ultimate Recovery (EUR) with reasonable accuracy. Among the machine learning models studied, the Support Vector Machine (SVM) algorithm in conjunction with the Stretched Exponential Decline Model (SEDM) was the best predictor of well rate decline. This method is very fast compared to reservoir simulation, does not require detailed reservoir information, and can quickly predict rate declines for multiple wells at once. The dissertation also investigates hydraulic fracture design optimization in unconventional reservoirs. Previous studies concentrated mainly on optimizing hydraulic fractures in a given permeability field, which may not be accurately known, and did not consider the trade-off between the revenue generated by a given fracture design and the cost of implementing it.
This dissertation fills these gaps with a Genetic Algorithm (GA) based workflow that finds the most suitable fracturing design (fracture locations, half-lengths and widths) for a given unconventional reservoir by maximizing the Net Present Value (NPV). It is concluded that this method can optimize hydraulic fracture placement in the presence of natural fracture/permeability uncertainty, and that it yields a much higher NPV than equally spaced hydraulic fractures with uniform dimensions. A further problem investigated is field-scale history matching in unconventional shale oil reservoirs. Stochastic optimization methods are commonly used in history matching problems, which require a large number of forward simulations because of the many uncertain variables with unrefined ranges. Previous studies commonly used single-stage history matching; this study presents a method utilizing multiple stages of GA. The most significant variables are separated from the rest in the first GA stage; the best models, with refined variable ranges, are then combined with the previously eliminated variables in the next GA stage. This method results in faster convergence.
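    The SEDM referenced above models the rate as q(t) = qi * exp(-(t / tau)^n). A minimal sketch with hypothetical parameters, including a numerical EUR estimate via trapezoidal integration, might look like this (none of the values are from the dissertation):

```python
import numpy as np

def sedm_rate(t, qi, tau, n):
    # Stretched exponential decline: q(t) = qi * exp(-(t / tau) ** n)
    return qi * np.exp(-(t / tau) ** n)

# Hypothetical parameters for illustration only
qi, tau, n = 500.0, 12.0, 0.6        # rate in bbl/day, tau in months
t = np.linspace(0.0, 240.0, 2401)    # 20 years of production time (months)

rates = sedm_rate(t, qi, tau, n)

# EUR approximated as cumulative production: trapezoidal integration of the
# rate curve, with 30.44 days/month converting bbl/day * months to barrels
eur = float(np.sum(0.5 * (rates[1:] + rates[:-1]) * np.diff(t)) * 30.44)
```

Fitting qi, tau and n to observed rates (and then regressing them on completion and location variables, as the dissertation describes) is the step the SVM models would replace for new wells.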

    Anchoring linkage groups of the Rosa genetic map to physical chromosomes with tyramide-FISH and EST-SNP markers

    A combination of Tyramide-FISH technology and a modern molecular marker system based on High Resolution Melting (HRM) is an efficient approach for anchoring Rosa linkage groups to physical chromosomes. Although Tyramide-FISH is a very promising technique for the visualization of short DNA probes, it is challenging for plant species with small chromosomes, such as Rosa. In this study, we successfully applied the Tyramide-FISH technique to Rosa and compared different detection systems. An indirect detection system exploiting biotinylated tyramides was shown to be the most suitable for reliable signal detection. Three gene fragments of 1100 bp-1700 bp (Phenylalanine Ammonia Lyase, Pyrroline-5-Carboxylate Synthase and Orcinol O-Methyl Transferase) were physically mapped on chromosomes 7, 4 and 1, respectively, of Rosa wichurana, with signal frequencies between 25% and 40%. HRM markers of these three gene fragments were used to place the fragments on the existing genetic linkage map of Rosa wichurana. As a result, three linkage groups could be anchored to their physical chromosomes. This information was used to check for synteny between the Rosa and Fragaria chromosomes.