3 research outputs found

    Parameter Estimation of The Blumberg Model Using Simulated Annealing Algorithm: Case Study of Broiler Body Weight

    The Blumberg model is one of the logistic models. The advantage of the Blumberg model is the flexibility of its inflection point, and it is believed to be suitable for modeling the growth of living organs. In this article, we estimate the parameters of the Blumberg model using the simulated annealing algorithm, a heuristic optimization method based on the metal annealing process. The data used are daily body weight data of broilers, and the resulting model fits these data well. Our results show that the closer the cooling-schedule factor is to 1, the smaller the error. In addition, the initial temperature must be selected carefully: an unsuitable initial temperature causes the error to increase.
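    As a rough illustration of the estimation procedure described in this abstract, the sketch below fits a Blumberg-type growth curve to daily weight data with a basic simulated annealing loop. The assumed growth law dW/dt = r*W^a*(1 - W/K)^b, the Euler integration, the Gaussian perturbation, and the geometric cooling schedule are illustrative choices, not the paper's exact formulation or data.

```python
import math
import random

def blumberg_weights(days, w0, r, big_k, a, b, steps_per_day=20):
    """Predicted weight at day 0..days, assuming dW/dt = r*W**a*(1 - W/K)**b
    and integrating with plain Euler steps (an illustrative choice)."""
    w, out, dt = w0, [], 1.0 / steps_per_day
    for _ in range(days + 1):
        out.append(w)
        for _ in range(steps_per_day):
            frac = max(0.0, 1.0 - w / big_k)
            w += dt * r * (w ** a) * (frac ** b)
    return out

def sse(params, data):
    """Sum of squared errors between the model curve and observed weights."""
    r, big_k, a, b = params
    pred = blumberg_weights(len(data) - 1, data[0], r, big_k, a, b)
    return sum((p - d) ** 2 for p, d in zip(pred, data))

def simulated_annealing(data, init, t0=100.0, cooling=0.95, iters=2000):
    """Geometric cooling T <- cooling * T; values of `cooling` closer to 1
    cool more slowly, which is what the abstract links to smaller error."""
    current, temp = list(init), t0
    f_cur = f_best = sse(current, data)
    best = list(current)
    for _ in range(iters):
        # Perturb each parameter with Gaussian noise, keeping it positive.
        cand = [max(1e-6, p + random.gauss(0.0, 0.05 * abs(p) + 1e-3)) for p in current]
        f_cand = sse(cand, data)
        # Accept improvements always, worse moves with Boltzmann probability.
        if f_cand < f_cur or random.random() < math.exp(-(f_cand - f_cur) / temp):
            current, f_cur = cand, f_cand
            if f_cur < f_best:
                best, f_best = list(current), f_cur
        temp *= cooling
    return best, f_best

if __name__ == "__main__":
    # Placeholder daily weights (grams), for illustration only -- not the paper's data.
    observed = [42.0, 55.0, 70.0, 89.0, 112.0, 140.0, 172.0, 209.0]
    params, err = simulated_annealing(observed, init=[0.5, 3000.0, 1.0, 1.0])
    print("fitted (r, K, a, b):", params, "SSE:", err)
```

    Raising `cooling` toward 1 and increasing `iters` gives a slower schedule, which is consistent with the abstract's observation that a cooling factor closer to 1 yields smaller error.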

    Improved feature selection using a hybrid side-blotched lizard algorithm and genetic algorithm approach

    Feature selection entails choosing, from a wide collection of original features, the significant ones that are essential for predicting test data with a classifier. It is commonly used in applications such as bioinformatics, data mining, and the analysis of written texts, where a dataset may contain tens or hundreds of thousands of features, making such a large feature set difficult to analyze. Removing irrelevant features makes the predictor more accurate and cost-effective. In this research, a novel hybrid technique for feature selection is presented that aims to enhance classification accuracy: a hybrid binary version of the side-blotched lizard algorithm (SBLA) with the genetic algorithm (GA), namely SBLAGA, which combines the strengths of both algorithms. We use a sigmoid function to map the continuous variable values to binary ones, and we evaluate the proposed algorithm on twenty-three standard benchmark datasets. The evaluation criteria were average classification accuracy, average number of selected features, and average fitness value. According to the experimental results, SBLAGA demonstrated superior performance compared to SBLA and GA with regard to these criteria. We further compare SBLAGA with four wrapper feature selection methods that are widely used in the literature and find it to be more efficient.
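    The implementation detail the abstract highlights is the sigmoid transfer function that turns continuous search-agent positions into binary feature masks. The sketch below shows that step together with a typical wrapper objective balancing classification error against the number of selected features; the threshold, the weighting factor `alpha`, and the function names are illustrative assumptions, not quoted from the paper.

```python
import math
import random

def binarize(position, threshold=0.5):
    """Map a continuous position vector to a binary feature mask with a sigmoid
    transfer function: feature j is kept when sigmoid(x_j) exceeds the threshold.
    The fixed threshold of 0.5 is an illustrative choice."""
    return [1 if 1.0 / (1.0 + math.exp(-x)) > threshold else 0 for x in position]

def wrapper_fitness(mask, error_rate, alpha=0.99):
    """A common wrapper objective (assumed here, not taken from the paper):
    minimize a weighted sum of classification error and the fraction of
    selected features."""
    n_selected = sum(mask)
    if n_selected == 0:
        return float("inf")  # an empty subset cannot feed a classifier
    return alpha * error_rate + (1.0 - alpha) * n_selected / len(mask)

# Tiny usage example with a random continuous position of length 10.
pos = [random.uniform(-4.0, 4.0) for _ in range(10)]
mask = binarize(pos)
print(mask, wrapper_fitness(mask, error_rate=0.12))
```

    In a full wrapper method, `error_rate` would come from cross-validating a classifier on the selected feature subset, and the binary masks would be evolved by the hybrid metaheuristic rather than drawn at random.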

    Comparative Study of Derivative Free Optimization Algorithms

    No full text