3 research outputs found

    Feature extraction and classification of movie reviews


    Optimizing the Simplicial-Map Neural Network Architecture

    Simplicial-map neural networks are a recent neural network architecture induced by simplicial maps defined between simplicial complexes. It has been proved that simplicial-map neural networks are universal approximators and that they can be refined to be robust to adversarial attacks. In this paper, the refinement toward robustness is optimized by reducing the number of simplices (i.e., nodes) needed. We have shown experimentally that such a refined neural network is equivalent to the original network as a classification tool but requires much less storage.

    Funding: Agencia Estatal de Investigación PID2019-107339GB-10

    Feature Selection Using Genetic Algorithms and Genetic Programming

    Rodrigues, N. M., Batista, J. E., La Cava, W., Vanneschi, L., & Silva, S. (2024). Exploring SLUG: Feature Selection Using Genetic Algorithms and Genetic Programming. SN Computer Science, 5(1), 1-17. [91]. https://doi.org/10.1007/s42979-023-02106-3

    Funding: Open access funding provided by FCT|FCCN (b-on). This work was partially supported by the FCT, Portugal, through funding of the LASIGE Research Unit (UIDB/00408/2020 and UIDP/00408/2020); the MAR2020 program via project MarCODE (MAR-01.03.01-FEAMP-0047); and project AICE (DSAIPA/DS/0113/2019). Nuno Rodrigues and João Batista were supported by PhD Grants 2021/05322/BD and SFRH/BD/143972/2019, respectively; William La Cava was supported by the National Library of Medicine of the National Institutes of Health under Award Number R00LM012926.

    We present SLUG, a recent method that uses genetic algorithms as a wrapper for genetic programming, performing feature selection while inducing models. SLUG has been shown to be successful on different types of classification tasks, achieving state-of-the-art results on the synthetic datasets produced by GAMETES, a tool for embedding epistatic gene–gene interactions into noisy datasets. SLUG has also been studied and modified to demonstrate that its two elements, wrapper and learner, are the right combination that grants it success. We report these results and test SLUG on an additional six GAMETES datasets of increased difficulty, for a total of four regular and 16 epistatic datasets. Despite its slowness, SLUG achieves the best results and solves all but the most difficult classification tasks. We perform further explorations of its inner dynamics and discover how to improve feature selection by enriching the communication between wrapper and learner, thus taking the first step toward a new and more powerful SLUG.
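    The wrapper idea behind SLUG, where a genetic algorithm evolves binary feature masks and each mask is scored by training a learner on the selected features, can be illustrated with a minimal sketch. This is not the authors' implementation: the learner here is a plain leave-one-out 1-nearest-neighbour classifier rather than genetic programming, the dataset is a synthetic stand-in for GAMETES, and all function names and parameters are illustrative.

```python
import random

random.seed(0)

def make_data(n=40, n_feats=8, informative=(0, 3)):
    """Toy dataset: only the `informative` features track the class label."""
    X, y = [], []
    for _ in range(n):
        label = random.randint(0, 1)
        row = [random.random() for _ in range(n_feats)]
        for f in informative:
            row[f] = label + random.uniform(-0.1, 0.1)
        X.append(row)
        y.append(label)
    return X, y

def accuracy(mask, X, y):
    """Fitness of a feature mask: leave-one-out 1-NN accuracy on the
    selected features (this stands in for the GP learner in SLUG)."""
    feats = [i for i, m in enumerate(mask) if m]
    if not feats:
        return 0.0
    correct = 0
    for i in range(len(X)):
        best_j, best_d = None, float("inf")
        for j in range(len(X)):
            if i == j:
                continue
            d = sum((X[i][f] - X[j][f]) ** 2 for f in feats)
            if d < best_d:
                best_j, best_d = j, d
        correct += (y[best_j] == y[i])
    return correct / len(X)

def ga_select(X, y, n_feats=8, pop_size=20, gens=15, p_mut=0.1):
    """GA wrapper: evolve binary masks by elitism, one-point crossover,
    and bit-flip mutation, scoring each mask with the learner."""
    pop = [[random.randint(0, 1) for _ in range(n_feats)]
           for _ in range(pop_size)]
    for _ in range(gens):
        ranked = sorted(pop, key=lambda m: accuracy(m, X, y), reverse=True)
        elite = ranked[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n_feats)
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda m: accuracy(m, X, y))

X, y = make_data()
best = ga_select(X, y)
print("best mask:", best, "accuracy:", accuracy(best, X, y))
```

    In SLUG the inner learner is itself an evolutionary search (genetic programming), so each fitness evaluation is expensive; this is the "slowness" the abstract refers to, traded for the wrapper's ability to discard noisy features before the learner sees them.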