6,380 research outputs found

    Memristors for the Curious Outsiders

    Full text link
    We present both an overview and a perspective of recent experimental advances and proposed new approaches to performing computation using memristors. A memristor is a 2-terminal passive component with a dynamic resistance depending on an internal parameter. We provide a brief historical introduction, as well as an overview of the physical mechanisms that lead to memristive behavior. This review is meant to guide non-practitioners in the field of memristive circuits and their connection to machine learning and neural computation. Comment: Perspective paper for MDPI Technologies; 43 pages
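
    As a concrete illustration of the "dynamic resistance depending on an internal parameter" mentioned above, the sketch below simulates the linear ion-drift model, one common textbook memristor model; the model choice and parameter values are illustrative assumptions of this note, not taken from the paper.

```python
import numpy as np

# Minimal sketch of the linear ion-drift memristor model. The internal state w
# is the width of the doped region; all parameter values are only illustrative.
R_ON, R_OFF = 100.0, 16e3        # on/off resistances (ohm)
D, MU_V = 10e-9, 1e-14           # device thickness (m), ion mobility (m^2 s^-1 V^-1)

def simulate(v, dt, w0=0.5):
    """Forward-Euler integration of the state equation dw/dt = MU_V * R_ON / D * i(t)."""
    w = w0 * D
    current = np.zeros_like(v)
    for k, vk in enumerate(v):
        m = R_ON * (w / D) + R_OFF * (1.0 - w / D)    # memristance M(w)
        i = vk / m
        w = np.clip(w + MU_V * R_ON / D * i * dt, 0.0, D)
        current[k] = i
    return current

t = np.linspace(0.0, 2.0, 20000)                      # two periods of a 1 Hz drive
v = np.sin(2 * np.pi * 1.0 * t)                       # sinusoidal voltage, 1 V amplitude
i = simulate(v, t[1] - t[0])
# Plotting i against v traces the pinched hysteresis loop characteristic of memristors.
```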

    Reinforcing POD-based model reduction techniques in reaction-diffusion complex networks using stochastic filtering and pattern recognition

    Full text link
    Complex networks are used to model many real-world systems. However, the dimensionality of these systems can make them challenging to analyze. Dimensionality reduction techniques such as proper orthogonal decomposition (POD) can be used in such cases; however, the resulting models are susceptible to perturbations in the input data. We propose an algorithmic framework that combines techniques from pattern recognition (PR) and stochastic filtering theory to enhance the output of such models. The results of our study show that our method can improve the accuracy of the surrogate model under perturbed inputs. Deep Neural Networks (DNNs) are susceptible to adversarial attacks, but recent research has revealed that Neural Ordinary Differential Equations (neural ODEs) exhibit robustness in specific applications. We benchmark our algorithmic framework against this neural ODE-based approach as a reference. Comment: 19 pages, 6 figures
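
    The abstract above uses POD as the baseline reduction technique. As a rough sketch of how a POD basis is typically obtained, the code below applies the SVD to a synthetic snapshot matrix; the data, truncation tolerance, and variable names are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Sketch of proper orthogonal decomposition (POD) via SVD of a snapshot matrix.
# The synthetic low-rank data stand in for trajectories of a high-dimensional system.
rng = np.random.default_rng(0)
n_nodes, n_snapshots = 500, 200
snapshots = rng.standard_normal((n_nodes, 5)) @ rng.standard_normal((5, n_snapshots))
snapshots += 0.01 * rng.standard_normal((n_nodes, n_snapshots))   # small perturbation

mean = snapshots.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)

# Keep enough modes to capture 99% of the snapshot "energy" (squared singular values).
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99)) + 1
basis = U[:, :r]                                   # POD modes
reduced = basis.T @ (snapshots - mean)             # reduced coordinates
reconstruction = mean + basis @ reduced            # rank-r surrogate of the data
print(r, np.linalg.norm(snapshots - reconstruction) / np.linalg.norm(snapshots))
```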

    Research reports: 1990 NASA/ASEE Summer Faculty Fellowship Program

    Get PDF
    Reports on the research projects performed under the NASA/ASEE Summer Faculty Fellowship Program are presented. The program was conducted by The University of Alabama and MSFC during the period from June 4, 1990 through August 10, 1990. Some of the topics covered include: (1) Space Shuttles; (2) Space Station Freedom; (3) information systems; (4) materials and processes; (5) Space Shuttle main engine; (6) aerospace sciences; (7) mathematical models; (8) mission operations; (9) systems analysis and integration; (10) systems control; (11) structures and dynamics; (12) aerospace safety; and (13) remote sensing

    MR image reconstruction using deep density priors

    Full text link
    Algorithms for Magnetic Resonance (MR) image reconstruction from undersampled measurements exploit prior information to compensate for missing k-space data. Deep learning (DL) provides a powerful framework for extracting such information from existing image datasets, through learning, and then using it for reconstruction. Leveraging this, recent methods employed DL to learn mappings from undersampled to fully sampled images using paired datasets of undersampled and corresponding fully sampled images, integrating prior knowledge implicitly. In this article, we propose an alternative approach that learns the probability distribution of fully sampled MR images using unsupervised DL, specifically Variational Autoencoders (VAEs), and uses this as an explicit prior term in reconstruction, completely decoupling the encoding operation from the prior. The resulting reconstruction algorithm enjoys a powerful image prior to compensate for missing k-space data without requiring paired datasets for training or being prone to the associated sensitivities, such as deviations between the undersampling patterns or coil settings used at training and test time. We evaluated the proposed method with T1-weighted images from a publicly available dataset, multi-coil complex images acquired from healthy volunteers (N=8), and images with white matter lesions. The proposed algorithm, using the VAE prior, produced visually high-quality reconstructions and achieved low RMSE values, outperforming most of the alternative methods on the same dataset. On multi-coil complex data, the algorithm yielded accurate magnitude and phase reconstruction results. In the experiments on images with white matter lesions, the method faithfully reconstructed the lesions. Keywords: Reconstruction, MRI, prior probability, machine learning, deep learning, unsupervised learning, density estimation. Comment: Published in IEEE TMI. Main text and supplementary material, 19 pages total
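
    The sketch below illustrates the general idea of reconstructing from undersampled k-space with an explicit prior term, i.e. minimizing a data-consistency cost minus a weighted learned log-density. It is only a generic illustration under stated assumptions: a hand-written smoothness term stands in for the gradient of the learned VAE prior, and plain gradient descent stands in for the paper's actual optimization scheme.

```python
import numpy as np

def forward(x, mask):
    """Undersampled Fourier operator A: FFT followed by a k-space sampling mask."""
    return mask * np.fft.fft2(x, norm="ortho")

def adjoint(k, mask):
    """Adjoint A^H: mask the unsampled locations and inverse FFT."""
    return np.fft.ifft2(mask * k, norm="ortho")

def grad_log_prior(x):
    # Placeholder for the gradient of a learned log-density (e.g. from a trained VAE);
    # here a simple smoothness prior stands in so the sketch runs without a model.
    return -(4 * x
             - np.roll(x, 1, 0) - np.roll(x, -1, 0)
             - np.roll(x, 1, 1) - np.roll(x, -1, 1))

def reconstruct(y, mask, lam=0.05, step=0.5, n_iter=200):
    """Gradient descent on ||A x - y||^2 - lam * log p(x)."""
    x = adjoint(y, mask)                                  # zero-filled initial estimate
    for _ in range(n_iter):
        grad_dc = adjoint(forward(x, mask) - y, mask)     # data-consistency gradient
        x = x - step * (grad_dc - lam * grad_log_prior(x))
    return x

# Toy usage with a synthetic "image" and 30% random k-space sampling.
rng = np.random.default_rng(0)
img = np.zeros((64, 64)); img[20:44, 20:44] = 1.0
mask = rng.random((64, 64)) < 0.3
y = forward(img, mask)
rec = reconstruct(y, mask)
print(np.sqrt(np.mean(np.abs(rec - img) ** 2)))           # RMSE against the ground truth
```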

    Organic electrochemical networks for biocompatible and implantable machine learning: Organic bioelectronic beyond sensing

    Get PDF
    How can the brain be such a good computer? Part of the answer lies in the astonishing number of neurons and synapses that process electrical impulses in parallel. Part of it must be found in the ability of the nervous system to evolve in response to external stimuli and to grow, sharpen, and depress synaptic connections. However, we are far from understanding even the basic mechanisms that allow us to think, be aware, recognize patterns, and imagine. The brain can do all this while consuming only around 20 watts, outcompeting any human-made processor in terms of energy efficiency. This question is of particular interest in a historical era and technological stage where phrases like machine learning and artificial intelligence are increasingly widespread, thanks to recent advances in computer science. However, brain-inspired computation today still relies on algorithms that run on traditional silicon-based digital processors. In contrast, making brain-like hardware, where the substrate itself performs the computation and can dynamically update its electrical pathways, remains challenging. In this work, I employed organic semiconductors that work in electrolytic solutions, called organic mixed ionic-electronic conductors (OMIECs), to build hardware capable of computation. Moreover, by exploiting an electropolymerization technique, I could form conducting connections in response to electrical spikes, in analogy to how synapses evolve when a neuron fires. After demonstrating artificial synapses as a potential building block for neuromorphic chips, I shifted my attention to the implementation of such synapses in fully operational networks. In doing so, I borrowed the mathematical framework of a machine learning approach known as reservoir computing, which allows computation with random (neural) networks. I focused my work on demonstrating the possibility of using such networks in vivo for the recognition and classification of dangerous and healthy heartbeats. This is the first demonstration of machine learning carried out in a biological environment with a biocompatible substrate. The implications of this technology are straightforward: constant monitoring of biological signals and fluids, accompanied by active recognition of malignant patterns, may lead to a timely, targeted, and early diagnosis of potentially fatal conditions. Finally, in attempting to simulate the random neural networks, I faced difficulties modeling the devices with state-of-the-art approaches. Therefore, I explored a new way to describe OMIECs and OMIEC-based devices, starting from thermodynamic axioms. The results of this model shed light on the mechanism behind the operation of organic electrochemical transistors, revealing the importance of the entropy of mixing and suggesting new pathways for device optimization for targeted applications.
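
    Since the thesis leans on reservoir computing with random networks, the minimal echo state network sketch below shows the general framework: a fixed random recurrent reservoir is driven by the input and only a linear readout is trained, here by ridge regression. The task, sizes, and hyperparameters are illustrative assumptions; the sketch does not model the organic electrochemical devices themselves.

```python
import numpy as np

# Minimal echo state network: random fixed reservoir + trained linear readout.
rng = np.random.default_rng(0)
n_in, n_res = 1, 200

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # scale spectral radius below 1

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: reproduce a 5-step delayed copy of a random input signal.
T = 2000
u = rng.uniform(-1, 1, (T, n_in))
target = np.roll(u[:, 0], 5)
X = run_reservoir(u)[100:]                          # drop the initial transient
y = target[100:]

ridge = 1e-6                                        # ridge-regression readout training
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
print(np.sqrt(np.mean((X @ W_out - y) ** 2)))       # training RMSE of the readout
```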