
    A HYBRID BINARY WOA ALGORITHM BASED ON TAPER-SHAPED TRANSFER FUNCTIONS FOR SOFTWARE DEFECT PREDICTION

    Reliability is one of the key factors used to gauge software quality, and software defect prediction (SDP) is one of the most important factors affecting the measurement of software reliability. Additionally, the high dimensionality of the features has a direct effect on the accuracy of SDP models. The objective of this paper is to propose a hybrid binary whale optimization algorithm (BWOA) based on taper-shaped transfer functions for feature selection and dimensionality reduction, combined with a KNN classifier, as a new software defect prediction method. The real-valued vector that represents an individual's encoding is converted to a binary vector using four types of taper-shaped transfer functions, enhancing the ability of BWOA to reduce the dimensionality of the search space. The performance of the suggested method (T-BWOA-KNN) was evaluated on eleven standard software defect prediction datasets from the PROMISE and NASA repositories using the K-Nearest Neighbor (KNN) classifier, with seven evaluation metrics used to assess its effectiveness. The experimental results show that T-BWOA-KNN produced promising results compared to other approaches, including ten methods from the literature and the four taper-based T-BWOA variants with the KNN classifier. In addition, the obtained results are compared with other methods from the literature in terms of the average number of selected features (SF) and accuracy rate (ACC) using the Kendall W test. Overall, the proposed hybrid software defect prediction method, T-BWOA-KNN, addresses the feature selection problem and achieved promising performance compared with the other methods on most datasets.
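    As a rough illustration of the binarization step described above (not the authors' exact implementation), the sketch below shows how a taper-shaped transfer function could map a whale's continuous position to a binary feature mask that is then scored with a KNN classifier; the specific taper form, the fitness weighting and the stand-in data are assumptions for illustration only.

```python
# Hedged sketch of taper-shaped binarization plus KNN-based fitness.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def taper_transfer(position, x_max=1.0):
    # One illustrative taper-shaped form: T(x) = sqrt(|x| / x_max).
    return np.sqrt(np.abs(position) / x_max)

def binarize(position, rng, x_max=1.0):
    # Select a feature when the transfer value exceeds a random threshold.
    return taper_transfer(position, x_max) > rng.random(position.shape)

def fitness(mask, X, y, alpha=0.99):
    # Trade off KNN accuracy against the fraction of features retained.
    if not mask.any():
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5), X[:, mask], y, cv=5).mean()
    return alpha * acc + (1 - alpha) * (1 - mask.sum() / mask.size)

rng = np.random.default_rng(0)
X, y = rng.random((200, 20)), rng.integers(0, 2, 200)   # stand-in data, not PROMISE/NASA
position = rng.uniform(-1, 1, X.shape[1])               # one whale's continuous position
mask = binarize(position, rng)
print(int(mask.sum()), round(fitness(mask, X, y), 3))
```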

    GP-NAS-ensemble: a model for NAS Performance Prediction

    In Neural Architecture Search (NAS), it is of great significance to estimate the performance of a given model architecture without training it, as evaluating each architecture can take a lot of time. In this paper, a novel NAS framework called GP-NAS-ensemble is proposed to predict the performance of a neural network architecture from a small training dataset. We make several improvements to the GP-NAS model so that it shares the advantages of ensemble learning methods. Our method ranked second in the performance prediction track of the CVPR2022 second lightweight NAS challenge.
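    The abstract does not detail the model, so the sketch below only illustrates the general idea of predicting architecture performance from a small set of (encoding, accuracy) pairs with an ensemble of regressors; the encodings, data sizes and regressor choices are placeholders rather than the GP-NAS-ensemble design.

```python
# Hedged sketch: ensemble of regressors as an architecture-performance predictor.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)
encodings = rng.integers(0, 4, (50, 12)).astype(float)  # stand-in architecture encodings
accuracies = rng.uniform(0.6, 0.9, 50)                   # stand-in measured accuracies

models = [GaussianProcessRegressor(), GradientBoostingRegressor()]
for model in models:
    model.fit(encodings, accuracies)

candidate = rng.integers(0, 4, (1, 12)).astype(float)
# Average the member predictions, in the spirit of ensemble learning.
prediction = float(np.mean([model.predict(candidate)[0] for model in models]))
print(prediction)
```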

    Influence of initialization on the performance of metaheuristic optimizers

    All metaheuristic optimization algorithms require some initialization, and this initialization is usually carried out randomly. However, initialization can have a significant influence on the performance of such algorithms. This paper presents a systematic comparison of 22 different initialization methods on the convergence and accuracy of five optimizers: differential evolution (DE), particle swarm optimization (PSO), cuckoo search (CS), the artificial bee colony (ABC) algorithm and the genetic algorithm (GA). We have used 19 different test functions with different properties and modalities to compare the possible effects of initialization, population sizes and the number of iterations. Rigorous statistical ranking tests indicate that 43.37% of the functions using the DE algorithm show significant differences between initialization methods, while 73.68% of the functions using both the PSO and CS algorithms are significantly affected by the initialization method. The simulations show that DE is less sensitive to initialization, while both PSO and CS are more sensitive to it. In addition, under the condition of the same maximum number of function evaluations (FEs), the population size can also have a strong effect: particle swarm optimization usually requires a larger population, while cuckoo search needs only a small population size. Differential evolution depends more heavily on the number of iterations; a relatively small population with more iterations can lead to better results. Furthermore, ABC is more sensitive to initialization, while initialization has little effect on GA. Some probability distributions, such as the beta, exponential and Rayleigh distributions, can usually lead to better performance. The implications of this study and further research topics are also discussed in detail.
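    To make the comparison concrete, here is a minimal sketch of how a population might be initialized from a few of the distributions mentioned above (uniform, beta, exponential, Rayleigh); the distribution parameters and the clipping to the unit interval are assumptions, not the paper's exact settings.

```python
# Hedged sketch of alternative population-initialization schemes.
import numpy as np

def init_population(n, dim, lower, upper, method="uniform", rng=None):
    rng = rng or np.random.default_rng()
    if method == "uniform":
        u = rng.random((n, dim))
    elif method == "beta":
        u = rng.beta(2.0, 5.0, (n, dim))
    elif method == "exponential":
        u = np.clip(rng.exponential(0.3, (n, dim)), 0.0, 1.0)
    elif method == "rayleigh":
        u = np.clip(rng.rayleigh(0.3, (n, dim)), 0.0, 1.0)
    else:
        raise ValueError(f"unknown method: {method}")
    # Map unit-interval samples onto the search box [lower, upper].
    return lower + u * (upper - lower)

pop = init_population(30, 10, -5.0, 5.0, method="rayleigh", rng=np.random.default_rng(2))
print(pop.shape, float(pop.min()), float(pop.max()))
```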

    Models and algorithms for promoting diverse and fair query results

    Fairness and diversity in search results are two key concerns in compelling search and recommendation applications. This work explicitly studies these two aspects given multiple users' preferences as inputs, in an effort to create a single ranking or top-k result set that satisfies different fairness and diversity criteria. From a group fairness standpoint, it adapts demographic-parity-like group fairness criteria and proposes new models that are suitable for ranking or producing a top-k set of results. This dissertation also studies equitable exposure of individual search results in long-tail data, a concept related to individual fairness. First, the dissertation focuses on aggregating ranks while achieving proportionate fairness (which ensures proportionate representation of every group) for multiple protected groups. Then, the dissertation explores how to minimally modify the original users' preferences under plurality voting, aiming to produce a top-k result set that satisfies complex fairness constraints. A concept referred to as manipulation by modifications is introduced, which involves making minimal changes to the original user preferences to ensure query satisfaction; this problem is formalized as the margin finding problem. A follow-up work studies this problem using a popular ranked-choice voting mechanism, namely Instant Run-off Voting (IRV), as the preference aggregation method. From the standpoint of individual fairness, this dissertation studies an exposure concern that top-k set-based algorithms exhibit when the underlying data has long-tail properties, and designs techniques to make those results equitable. For result diversification, the work studies efficiency opportunities in existing diversification algorithms and designs a generic access primitive called DivGetBatch() to enable them. The contributions of this dissertation lie in (a) formalizing the principal problems and studying them analytically, (b) designing scalable algorithms with theoretical guarantees, and (c) an extensive experimental study that evaluates the efficacy and scalability of the designed solutions against state-of-the-art solutions on large-scale datasets.
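    As a toy illustration of proportionate fairness in a top-k result set (much simpler than the dissertation's models), the sketch below gives each group a quota proportional to its share of the candidate pool and fills the quotas by score; the data and the rounding rule are illustrative assumptions.

```python
# Hedged sketch: top-k with per-group quotas proportional to group sizes.
from collections import defaultdict

def proportional_top_k(items, k):
    """items: list of (id, score, group); returns up to k ids, with each group's
    quota proportional to its share of the candidate pool (rounded, at least 1)."""
    by_group = defaultdict(list)
    for item in items:
        by_group[item[2]].append(item)
    quotas = {g: max(1, round(k * len(v) / len(items))) for g, v in by_group.items()}
    picked = []
    for g, members in by_group.items():
        members.sort(key=lambda it: it[1], reverse=True)
        picked.extend(members[:quotas[g]])
    picked.sort(key=lambda it: it[1], reverse=True)
    return [it[0] for it in picked[:k]]

items = [("a", 0.9, "g1"), ("b", 0.8, "g1"), ("c", 0.7, "g2"),
         ("d", 0.6, "g2"), ("e", 0.5, "g2"), ("f", 0.4, "g1")]
print(proportional_top_k(items, 4))  # e.g. ['a', 'b', 'c', 'd']
```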

    Evolving generalist controllers to handle a wide range of morphological variations

    Neuro-evolutionary methods have proven effective in addressing a wide range of tasks. However, the study of the robustness and generalisability of evolved artificial neural networks (ANNs) has remained limited. This has immense implications for fields like robotics, where such controllers are used in control tasks: unexpected morphological or environmental changes during operation can risk failure if the ANN controllers are unable to handle them. This paper proposes an algorithm that aims to enhance the robustness and generalisability of the controllers. This is achieved by introducing morphological variations during the evolutionary process. As a result, it is possible to discover generalist controllers that can handle a wide range of morphological variations without needing information about the morphologies or adaptation of their parameters. We perform an extensive experimental analysis in simulation that demonstrates the trade-off between specialist and generalist controllers. The results show that generalists are able to control a range of morphological variations at the cost of underperforming on a specific morphology relative to a specialist. This research contributes to the field by addressing the limited understanding of robustness and generalisability in neuro-evolutionary methods and proposes a method for improving these properties.
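    The sketch below is an illustrative, heavily simplified evolutionary loop (not the paper's algorithm) in which each candidate controller is scored across several sampled morphological variations and the aggregate fitness drives selection; the toy evaluation function stands in for a physics simulation.

```python
# Hedged sketch: selection on fitness averaged over morphological variations.
import numpy as np

rng = np.random.default_rng(3)

def evaluate(weights, morphology_scale):
    # Toy stand-in for running the controller on one morphology in simulation.
    return -float(np.sum((weights - morphology_scale) ** 2))

def aggregate_fitness(weights, morphologies):
    # Averaging over variations rewards controllers that generalise.
    return float(np.mean([evaluate(weights, m) for m in morphologies]))

pop = rng.normal(0.0, 1.0, (20, 8))           # candidate controller parameter vectors
for generation in range(50):
    morphologies = rng.uniform(0.5, 1.5, 5)   # resample morphological variations
    scores = np.array([aggregate_fitness(w, morphologies) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]   # keep the 10 best generalists
    children = parents + rng.normal(0.0, 0.1, parents.shape)
    pop = np.vstack([parents, children])
print(max(aggregate_fitness(w, morphologies) for w in pop))
```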

    Multi-platform arabinoxylan scaffolds as potential wound dressing materials

    Biopolymers are becoming more attractive as advanced wound dressings because of their naturally derived origin, abundance, low cost and high compatibility with the wound environment. Arabinoxylan (AX) is a class of polysaccharide polymers derived from cereal grains that is primarily used in food products and cosmetic additives; its application as a wound dressing material has yet to be realized. In this two-pronged project, arabinoxylan ferulate (AXF) was fabricated into electrospun fibers and gel foams to be evaluated as platforms for wound dressing materials. In the first study, AXF was electrospun with varying amounts of gelatin. In the second study, AXF was dissolved in water, enzymatically crosslinked and lyophilized to form gel foams. The morphology, mechanical properties, porosity, drug release kinetics, fibroblast cell response and anti-microbial properties were examined for both platforms. A carbohydrate assay was conducted to validate the presence of arabinoxylan ferulate in the electrospun GEL-AXF fibers. Swelling and endotoxin quantification studies were done to evaluate the absorptive capacity and the efficacy of the sterilization agent, respectively, in AXF foams. The results indicated successful fabrication of both platforms and validated their porosity, absorptive capacity, biocompatibility and drug release properties. The results also showed that silver-impregnated AXF scaffolds inhibited the growth of Pseudomonas aeruginosa, Staphylococcus aureus and Enterococcus faecalis, demonstrating the anti-microbial properties necessary to function as advanced wound dressing materials. Future work will improve the stability of both platforms and evaluate their applications in vivo.

    Scene representation and matching for visual localization in hybrid camera scenarios

    Scene representation and matching are crucial steps in a variety of tasks ranging from 3D reconstruction to virtual/augmented/mixed reality applications and robotics. While approaches exist that tackle these tasks, they mostly overlook the efficiency of the scene representation, which is fundamental for resource-constrained systems and for increasing computing speed. They also normally assume the use of projective cameras, while performance on systems based on other camera geometries remains suboptimal. This dissertation contributes a new, efficient scene representation method that dramatically reduces the number of 3D points. The approach sets up an optimization problem for the automated selection of the most relevant points to retain. This leads to a constrained quadratic program, which is solved optimally with a newly introduced variant of the sequential minimal optimization method. In addition, a new initialization approach is introduced for fast convergence of the method. Extensive experimentation on public benchmark datasets demonstrates that the approach produces a compressed scene representation quickly while delivering accurate pose estimates. The dissertation also contributes new methods for scene matching that go beyond the use of projective cameras. Alternative camera geometries, like fisheye cameras, produce images with very high distortion, making current image feature point detectors and descriptors less effective, since they are designed for projective cameras. New methods based on deep learning are introduced to address this problem, with feature detectors and descriptors that overcome distortion effects and more effectively perform feature matching between pairs of fisheye images, as well as between hybrid pairs of fisheye and perspective images. Due to the limited availability of fisheye-perspective image datasets, three datasets were collected for training and testing the methods. The results demonstrate an increase in detection and matching rates that outperforms the current state-of-the-art methods.
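    As a greatly simplified stand-in for the constrained quadratic program and SMO-style solver described above, the sketch below compresses a map by keeping only the points with the highest relevance score (here, an assumed observation count); it is meant to convey the selection idea, not the dissertation's method.

```python
# Hedged sketch: keep only the highest-relevance points of a 3D map.
import numpy as np

rng = np.random.default_rng(4)
points = rng.random((1000, 3))             # stand-in 3D map points
observations = rng.integers(1, 50, 1000)   # assumed relevance: observation counts

def compress_map(points, relevance, keep_ratio=0.1):
    k = max(1, int(keep_ratio * len(points)))
    keep = np.argsort(relevance)[-k:]      # indices of the k most relevant points
    return points[keep]

compressed = compress_map(points, observations, keep_ratio=0.1)
print(len(points), "->", len(compressed))
```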

    Electrohydrodynamic Processing for Preparation of Advanced Drug Delivery Systems

    This research explores the feasibility of electrohydrodynamic processing, in single and co-axial set-ups, as a single-step processing tool for the preparation of advanced drug delivery systems. A number of synthetic biodegradable and non-biodegradable polymers were used to prepare formulations incorporating drugs with different physicochemical characteristics. Based on the focus and the desired applications, the polymeric carrier and solvent system as well as the model drug of interest were selected to develop the drug delivery systems. Firstly, core-shell microparticles were prepared and optimized using co-axial electrohydrodynamic processing with precise control over the average particle size and size distribution. This was followed by the integration of model drugs with different water solubilities. In this study, the release characteristics of the developed particles were investigated with single and simultaneous encapsulation of the drugs, and successful preparation of a fixed-dose combination formulation with high processing yield and encapsulation efficiency was reported. Secondly, single and co-axial electrohydrodynamic processing was utilized to prepare a smart drug delivery system for targeted release of prednisolone. Colon-targeted drug delivery systems were developed using a pH-responsive polymer, and the polymer-to-drug ratio was varied to further enhance the release profiles and obtain an efficient delivery system whereby local delivery of prednisolone is made possible. Finally, microspheres were developed for co-encapsulation of anti-diabetic drugs with different water solubilities. The successfully developed sustained-release formulations have the potential to overcome the existing limitations of conventional formulations by enhancing patient compliance and the efficacy of treatment for chronic conditions.