
    The role of IGF-1 in exercise to improve obesity-related cognitive dysfunction

    Obesity is an important threat to human health. Many chronic diseases are related to obesity, and cognitive decline often accompanies its onset. As obesity becomes more prevalent, obesity-related cognitive dysfunction (ORCD) is expected to affect a wider population, so intervening to suppress ORCD is crucial. In this regard, exercise has been shown to be an effective non-drug treatment for preventing obesity and improving cognitive function. There is substantial evidence that exercise regulates a growth factor closely related to cognitive function, insulin-like growth factor 1 (IGF-1), and IGF-1 may be an important mediator through which exercise improves ORCD. This article reviews the effects of obesity and IGF-1 on cognitive function and the regulation of IGF-1 by exercise, and analyzes the mechanism by which exercise can improve ORCD by regulating IGF-1. Overall, this review draws on relevant animal and human studies to show that exercise plays a role in improving ORCD. It emphasizes the importance of IGF-1, which helps to explain the health effects of exercise and promotes research on the treatment of ORCD.

    FOLD-TR: A Scalable and Efficient Inductive Learning Algorithm for Learning To Rank

    FOLD-R++ is a new inductive learning algorithm for binary classification tasks. It generates an (explainable) normal logic program from mixed-type (numerical and categorical) data. We present FOLD-TR, a customization of FOLD-R++ for the learning-to-rank setting, which aims to rank new items following the ranking pattern in the training data. Like FOLD-R++, the FOLD-TR algorithm handles mixed-type data directly and provides a native justification to explain the comparison between a pair of items. (arXiv admin note: substantial text overlap with arXiv:2202.06913; text overlap with arXiv:2110.0784)
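
    The abstract does not show FOLD-TR's interface, so the sketch below only illustrates the general idea it builds on: reducing ranking to pairwise binary classification. A scikit-learn decision tree stands in for the explainable rule learner, and the toy data, feature-difference encoding, and threshold-free "wins" scoring are assumptions for illustration, not the paper's method.

    ```python
    # Minimal sketch of ranking via pairwise comparison, the general idea behind
    # FOLD-TR as described in the abstract. A decision tree stands in for the
    # explainable rule learner; the real FOLD-TR implementation is not shown here.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def make_pairs(X, scores):
        """Turn item features and relevance scores into pairwise training data:
        feature difference -> label 1 if the first item should rank higher."""
        Xp, yp = [], []
        for i in range(len(X)):
            for j in range(len(X)):
                if scores[i] != scores[j]:
                    Xp.append(X[i] - X[j])
                    yp.append(1 if scores[i] > scores[j] else 0)
        return np.array(Xp), np.array(yp)

    def rank(model, X):
        """Score each item by how many other items it is predicted to beat."""
        wins = np.zeros(len(X))
        for i in range(len(X)):
            for j in range(len(X)):
                if i != j and model.predict((X[i] - X[j]).reshape(1, -1))[0] == 1:
                    wins[i] += 1
        return np.argsort(-wins)  # item indices, best first

    # Toy data: 5 items with 3 numerical features and known relevance scores.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 3))
    scores = np.array([3, 1, 2, 5, 4])

    Xp, yp = make_pairs(X, scores)
    clf = DecisionTreeClassifier(max_depth=3).fit(Xp, yp)
    print(rank(clf, X))  # predicted ranking order
    ```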

    FOLD-RM: A Scalable, Efficient, and Explainable Inductive Learning Algorithm for Multi-Category Classification of Mixed Data

    FOLD-RM is an automated inductive learning algorithm that learns default rules from mixed (numerical and categorical) data. It generates an (explainable) answer set programming (ASP) rule set for multi-category classification tasks while maintaining efficiency and scalability. FOLD-RM is competitive in performance with widely used, state-of-the-art algorithms such as XGBoost and multi-layer perceptrons (MLPs); however, unlike those algorithms, it produces an explainable model. FOLD-RM outperforms XGBoost on some datasets, particularly large ones, and provides human-friendly explanations for its predictions. (Comment: Paper presented at the 38th International Conference on Logic Programming (ICLP 2022), 16 pages)
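
    To make "default rules with exceptions" concrete, here is a toy illustration of how such a multi-category rule set classifies a record. The rules, attribute names, and thresholds below are made up for illustration only; they are not FOLD-RM output, and the ASP rules appear only as comments.

    ```python
    # Toy illustration of a multi-category classifier expressed as default rules
    # with exceptions, the kind of ASP rule set the FOLD-RM abstract describes.
    # The rules, attributes, and thresholds are hypothetical.
    def ab1(r):
        # ab1(X) :- petal_wid(X, W), W > 1.7.   (exception to the versicolor rule)
        return r["petal_wid"] > 1.7

    def classify(r):
        # label(X, 'setosa')     :- petal_len(X, L), L =< 2.0.
        if r["petal_len"] <= 2.0:
            return "setosa"
        # label(X, 'versicolor') :- petal_len(X, L), L =< 5.0, not ab1(X).
        if r["petal_len"] <= 5.0 and not ab1(r):
            return "versicolor"
        # label(X, 'virginica')  :- otherwise (default for the remaining records).
        return "virginica"

    print(classify({"petal_len": 1.4, "petal_wid": 0.2}))  # setosa
    print(classify({"petal_len": 4.5, "petal_wid": 1.9}))  # virginica (exception fires)
    ```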

    NeSyFOLD: Extracting Logic Programs from Convolutional Neural Networks

    We present a novel neurosymbolic framework called NeSyFOLD that extracts logic rules from a CNN and creates a NeSyFOLD model to classify images. NeSyFOLD's learning pipeline is as follows: (i) we first pre-train a CNN on the input image dataset and extract the activations of the last-layer kernels as binary values; (ii) we then use the FOLD-SE-M rule-based machine learning algorithm to generate a logic program that can classify an image, represented as a vector of binary activations corresponding to each kernel, while producing a logical explanation. The rules generated by FOLD-SE-M use kernel numbers as predicates. We have devised a novel algorithm for automatically mapping the CNN kernels to semantic concepts in the images; this mapping is used to replace the predicate names (kernel numbers) in the rule set with the corresponding semantic concept labels. The resulting rule set is interpretable and can be intuitively understood by humans. We compare the NeSyFOLD framework with the ERIC system, which uses a decision-tree-like algorithm to obtain its rules. Our framework has the following advantages over ERIC: (i) in most cases, NeSyFOLD generates smaller rule sets without compromising accuracy or fidelity; (ii) NeSyFOLD generates the mapping of kernel numbers to semantic labels automatically.
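
    A minimal sketch of pipeline step (i), binarizing last-layer kernel activations, is shown below. It uses a torchvision VGG16 backbone as a stand-in (assuming a recent torchvision); the global-average pooling, normalization, and 0.5 threshold are assumptions for illustration, not the paper's settings, and the FOLD-SE-M interface is not shown.

    ```python
    # Minimal sketch of NeSyFOLD's step (i) as described in the abstract: take the
    # feature maps of a CNN's last convolutional layer and binarize each kernel's
    # activation. VGG16 is a stand-in backbone; in the actual pipeline the CNN
    # would be pre-trained on the target image dataset.
    import torch
    from torchvision import models

    cnn = models.vgg16(weights=None)   # stand-in; load pre-trained weights in practice
    backbone = cnn.features            # convolutional part; last layer has 512 kernels
    backbone.eval()

    def binary_kernel_vector(image_batch, threshold=0.5):
        """Return one binary vector per image: 1 if a kernel's pooled activation
        (normalized per image) exceeds the threshold, else 0."""
        with torch.no_grad():
            fmaps = backbone(image_batch)             # (N, 512, H, W)
            pooled = fmaps.mean(dim=(2, 3))           # (N, 512) per-kernel activation
            norm = pooled / (pooled.max(dim=1, keepdim=True).values + 1e-8)
            return (norm > threshold).int()           # one bit per kernel

    # These binary vectors (with image labels) would then be passed to the
    # FOLD-SE-M rule learner to produce the classifying logic program.
    x = torch.randn(2, 3, 224, 224)                   # dummy images
    print(binary_kernel_vector(x).shape)              # torch.Size([2, 512])
    ```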

    Logic-Based Explainable and Incremental Machine Learning

    Mainstream machine learning methods lack interpretability, explainability, incrementality, and data economy. We propose using logic programming (LP) to rectify these problems. We discuss the FOLD family of rule-based machine learning algorithms, which learn models from relational datasets as sets of default rules. These models are competitive with state-of-the-art machine learning systems in terms of accuracy and execution efficiency. We also motivate how logic programming can be useful for theory revision and explanation-based learning.