
    Incremental Learning Using a Grow-and-Prune Paradigm with Efficient Neural Networks

    Deep neural networks (DNNs) have become a widely deployed model for numerous machine learning applications. However, their fixed architecture, substantial training cost, and significant model redundancy make it difficult to efficiently update them to accommodate previously unseen data. To solve these problems, we propose an incremental learning framework based on a grow-and-prune neural network synthesis paradigm. When new data arrive, the neural network first grows new connections based on the gradients to increase the network capacity to accommodate new data. Then, the framework iteratively prunes away connections based on the magnitude of weights to enhance network compactness, and hence recover efficiency. Finally, the model rests at a lightweight DNN that is both ready for inference and suitable for future grow-and-prune updates. The proposed framework improves accuracy, shrinks network size, and significantly reduces the additional training cost for incoming data compared to conventional approaches, such as training from scratch and network fine-tuning. For the LeNet-300-100 and LeNet-5 neural network architectures derived for the MNIST dataset, the framework reduces training cost by up to 64% (63%) and 67% (63%) compared to training from scratch (network fine-tuning), respectively. For the ResNet-18 architecture derived for the ImageNet dataset and DeepSpeech2 for the AN4 dataset, the corresponding training cost reductions against training from scratch (network fine-tuning) are 64% (60%) and 67% (62%), respectively. Our derived models contain fewer network parameters but achieve higher accuracy relative to conventional baselines.
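    Below is a minimal, illustrative sketch (not the authors' code) of the two mask-update steps the abstract describes: growing connections where gradients are large, then pruning connections whose weights have small magnitude. The mask-based sparsity scheme, the ratios, and every name here are assumptions made for illustration only.

    import torch

    def grow_connections(weight, mask, grad, grow_ratio=0.05):
        # Grow step: activate currently inactive connections whose gradient
        # magnitude is largest, adding capacity for newly arrived data.
        inactive = (mask == 0)
        n_grow = min(int(grow_ratio * weight.numel()), int(inactive.sum()))
        if n_grow == 0:
            return mask
        scores = grad.abs() * inactive            # ignore already-active connections
        top = torch.topk(scores.flatten(), n_grow)
        new_mask = mask.clone().flatten()
        new_mask[top.indices] = 1.0
        return new_mask.view_as(mask)

    def prune_connections(weight, mask, prune_ratio=0.05):
        # Prune step: deactivate active connections with the smallest weight
        # magnitude to recover compactness and inference efficiency.
        active = (mask == 1)
        n_prune = int(prune_ratio * int(active.sum()))
        if n_prune == 0:
            return mask
        magnitude = weight.abs().flatten().clone()
        magnitude[(mask == 0).flatten()] = float("inf")   # never re-prune inactive weights
        small = torch.topk(magnitude, n_prune, largest=False)
        new_mask = mask.clone().flatten()
        new_mask[small.indices] = 0.0
        return new_mask.view_as(mask)

    In a full training loop of this kind, the grow step would be applied once when new data arrive, the unmasked weights trained on the combined data, and the prune step then applied iteratively until a target sparsity is reached, matching the grow-then-prune ordering the abstract describes.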

    Photochemical transformation of perfluoroalkyl acid precursors in water using engineered nanomaterials

    The production of perfluoroalkyl acids (PFAAs) has been phased out over recent decades; however, no significant decline in their environmental concentrations has been observed. This is partly due to the photochemical decomposition of PFAA precursors (PrePFAAs), which remain in extensive use. The decomposition of PrePFAAs may be accelerated by light-activated engineered nanomaterials (ENMs) in water. In light of this hypothesis, we investigated the photochemical transformation of three PrePFAAs, namely 8:2 fluorotelomer sulfonic acid (8:2 FTSA), 8:2 fluorotelomer alcohol (8:2 FTOH), and 2-(N-ethylperfluorooctane-1-sulfonamido)ethyl phosphate (SAmPAP), in the presence of six ENMs under simulated sunlight irradiation. The transformation rates of 8:2 FTSA and 8:2 FTOH increased 2–6 fold in the presence of the ENMs. However, most of the ENMs appeared to inhibit the decomposition of SAmPAP. The transformation rates of the PrePFAAs were found to depend on the yield of reactive oxygen species generated by the ENMs, but the rates were also related to compound photo-stability, adsorption to surfaces, and photo-shielding effects. The PrePFAAs are transformed to perfluorooctanoic acid (PFOA) and/or perfluorooctane sulfonate (PFOS), which have higher toxicity and longer half-lives, as well as a few PFAAs with shorter carbon chain lengths. Higher concentrations of the PFAA photodegradation products were observed in the presence of most of the ENMs.

    Veech's Theorem of $G$ acting freely on $G^{\textrm{LUC}}$ and Structure Theorem of a.a. flows

    Veech's Theorem claims that if $G$ is a locally compact (LC) Hausdorff topological group, then it may act freely on $G^{\textrm{LUC}}$. We prove Veech's Theorem for $G$ that is only locally quasi-totally bounded, not necessarily LC. We also show that the universal a.a. flow is the maximal almost 1-1 extension of the universal minimal a.p. flow and is unique up to almost 1-1 extensions. In particular, every endomorphism of Veech's hull flow induced by an a.a. function is almost 1-1; for $G=\mathbb{Z}$ or $\mathbb{R}$, $G$ acts freely on its canonical universal a.a. space. Finally, we characterize Bochner a.a. functions on an LC group $G$ in terms of Bohr a.a. functions on $G$ (due to Veech 1965 for the special case that $G$ is abelian, LC, $\sigma$-compact, and first countable).
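    As a point of reference, here is a minimal LaTeX sketch of the standard locally compact form of Veech's Theorem that the abstract says the paper generalizes; the description of the compactification $G^{\mathrm{LUC}}$ is a standard one and is not taken verbatim from the paper.

    % A standard statement of Veech's Theorem in the locally compact case,
    % written out from the abstract's wording; the paper weakens the
    % "locally compact" hypothesis to "locally quasi-totally bounded".
    \textbf{Veech's Theorem.}\quad
    Let $G$ be a locally compact Hausdorff topological group, and let
    $G^{\mathrm{LUC}}$ denote the compactification of $G$ induced by the algebra
    of bounded left uniformly continuous functions on $G$. Then the natural
    action of $G$ on $G^{\mathrm{LUC}}$ is free: if $g \cdot x = x$ for some
    $x \in G^{\mathrm{LUC}}$, then $g = e$.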