Transport of video over partial order connections
A Partial Order and partially reliable Connection (POC) is an end-to-end transport connection authorized to deliver objects in an order that can differ from the transmitted one; such a connection is also authorized to lose some objects. The POC concept is motivated by the fact that heterogeneous best-effort networks such as the Internet are plagued by unordered packet delivery and losses, which degrade the performance of current applications and protocols. Several research works have shown that out-of-order delivery can reduce (with respect to a connection-oriented, CO, service) the use of end systems' communication resources. In this paper, the efficiency of out-of-sequence delivery for the processing of MPEG video streams is studied. First, the transport constraints (in terms of order and reliability) that MPEG video decoders can relax to improve video transport are detailed. Then, we analyze the performance gain induced by this approach in terms of blocking times and recovered errors. We demonstrate that POC connections not only fill the conceptual gap between TCP and UDP but also provide real performance improvements for the transport of multimedia streams such as MPEG video.
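The order and reliability constraints a decoder can relax follow from MPEG frame dependencies: I frames are self-contained, P frames need the previous decodable reference (I or P), and B frames need both surrounding references. The sketch below (an illustration, not the paper's algorithm) computes which frames of a GOP remain decodable when a partial-order connection delivers only a subset:

```python
# Hypothetical sketch: decodable frames of an MPEG GOP under partial delivery.
# I frames stand alone; P frames need an unbroken chain of prior references;
# B frames need both the previous and the following decodable reference.

def decodable(gop, received):
    """gop: frame types in display order, e.g. ['I','B','B','P', ...].
    received: set of delivered frame indices. Returns decodable indices."""
    ok = set()
    last_ref = None                      # most recent decodable I/P frame
    refs = [i for i, t in enumerate(gop) if t in ('I', 'P')]
    for i, t in enumerate(gop):
        if t == 'B':
            continue
        if i in received and (t == 'I' or last_ref is not None):
            ok.add(i)
            last_ref = i
        else:
            last_ref = None              # a lost reference breaks the chain
    for i, t in enumerate(gop):
        if t == 'B' and i in received:
            prev_ref = max((r for r in refs if r < i and r in ok), default=None)
            next_ref = min((r for r in refs if r > i and r in ok), default=None)
            if prev_ref is not None and next_ref is not None:
                ok.add(i)
    return ok
```

A POC-style transport can thus deliver any received frame whose dependencies are met, in whatever order it arrives, rather than blocking the whole stream behind one lost packet as a fully ordered, fully reliable connection would.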
Empirical study on the efficiency of Spiking Neural Networks with axonal delays, and algorithm-hardware benchmarking
The role of axonal synaptic delays in the efficacy and performance of artificial neural networks has been largely unexplored. In step-based analog-valued neural network models (ANNs), the concept is almost absent, and in their spiking, neuroscience-inspired counterparts there is hardly a systematic account of their effects on model performance in terms of accuracy and number of synaptic operations. This paper proposes a methodology for accounting for axonal delays in the training loop of deep Spiking Neural Networks (SNNs), intended to efficiently solve machine learning tasks on data with rich temporal dependencies. We then conduct an empirical study of the effects of axonal delays on model performance during inference for the Adding task, a benchmark for sequential regression, and for the Spiking Heidelberg Digits dataset (SHD), commonly used for evaluating event-driven models. Quantitative results on the SHD show that SNNs incorporating axonal delays instead of explicit recurrent synapses achieve state-of-the-art performance, over 90% test accuracy, while needing fewer than half the trainable synapses. Additionally, we estimate the memory (in terms of total parameters) and energy consumption required to accommodate such delay-trained models on a modern neuromorphic accelerator. These estimates are based on the number of synaptic operations and the reference GF-22nm FDX CMOS technology. As a result, we demonstrate that a reduced parameterization incorporating axonal delays leads to approximately 90% energy and memory reduction in digital hardware implementations for similar performance in the aforementioned task.
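Per-synapse axonal delays can be implemented with a ring buffer of past presynaptic spikes, so a spike emitted at time t arrives at time t + d. The following minimal sketch (class name, leak model, and parameter shapes are illustrative assumptions, not the paper's implementation) shows one way to wire such delays into a leaky integrate-and-fire layer:

```python
import numpy as np

# Illustrative sketch: a feed-forward spiking layer in which each synapse
# carries an integer axonal delay, implemented as a ring buffer of inputs.

class DelayedSpikingLayer:
    def __init__(self, n_in, n_out, max_delay, rng):
        self.w = rng.normal(0.0, 0.5, (n_out, n_in))          # synaptic weights
        self.d = rng.integers(0, max_delay, (n_out, n_in))    # per-synapse delays
        self.buf = np.zeros((max_delay, n_in))                # past input spikes
        self.v = np.zeros(n_out)                              # membrane potentials
        self.t = 0

    def step(self, spikes_in, tau=0.9, threshold=1.0):
        D = self.buf.shape[0]
        self.buf[self.t % D] = spikes_in
        # a spike emitted at time t - d arrives at this synapse now
        arrived = self.buf[(self.t - self.d) % D, np.arange(self.w.shape[1])]
        self.v = tau * self.v + (self.w * arrived).sum(axis=1)
        out = (self.v >= threshold).astype(float)
        self.v[out == 1] = 0.0                                # reset after spike
        self.t += 1
        return out
```

Because the delays are just integer indices into the buffer, making them trainable (as the paper's methodology requires) amounts to relaxing them to continuous values during learning and rounding at inference; the buffer costs memory proportional to max_delay, which is what the hardware estimate above accounts for.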
Using a virtual cortical module implementing a neural field model to modulate brain rhythms in Parkinson’s disease
We propose a new method for selective modulation of cortical rhythms based on neural field theory, in which the activity of a cortical area is extensively monitored using a two-dimensional microelectrode array. The example of Parkinson's disease illustrates the proposed method, in which a neural field model is assumed to accurately describe experimentally recorded activity. In addition, we propose a new closed-loop stimulation signal that is both space- and time-dependent. This method is especially designed to modulate a specific targeted brain rhythm without interfering with other rhythms. A new class of neuroprosthetic devices is also proposed, in which the multielectrode array is seen as an artificial neural network interacting with biological tissue. Such a bio-inspired approach may provide a solution to optimize interactions between the stimulation device and the cortex, aiming to attenuate or augment specific cortical rhythms. The next step will be to validate this new approach experimentally in patients with Parkinson's disease.
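The key property of such a closed-loop signal, band selectivity, can be sketched with a toy feedback loop (all function names and parameters below are illustrative assumptions, not the authors' model): each electrode tracks only the targeted rhythm with a narrow-band complex resonator and stimulates in antiphase to that component, leaving other rhythms largely untouched.

```python
import numpy as np

# Toy sketch of a space- and time-dependent, band-selective feedback signal.
# Each electrode's recording is passed through a complex one-pole resonator
# tuned to the target frequency; the stimulus opposes only that component.

def make_resonator(f_target, fs, r=0.99):
    rot = r * np.exp(1j * 2 * np.pi * f_target / fs)
    def step(z, x):
        return rot * z + (1 - r) * x      # z tracks the f_target component of x
    return step

def stimulation(recordings, f_target, fs, gain=1.0):
    """recordings: array (T, n_electrodes). Returns antiphase stimulus (T, n)."""
    step = make_resonator(f_target, fs)
    z = np.zeros(recordings.shape[1], dtype=complex)
    stim = np.empty_like(recordings)
    for t, x in enumerate(recordings):
        z = step(z, x)
        stim[t] = -gain * z.real          # oppose only the targeted rhythm
    return stim
```

Driving this loop with a 20 Hz (beta-band) oscillation yields a strong antiphase stimulus, while an 8 Hz oscillation passes through nearly unopposed, which is the selectivity the proposed method aims for; the actual method derives the spatial profile of the stimulus from the fitted neural field model rather than per-electrode filters.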
Biologically inspired evolutionary temporal neural circuits
Biological neural networks have always motivated the creation of new artificial neural networks; in this case, they inspire a new autonomous temporal neural network system. Among the more challenging problems of temporal neural networks are the design and incorporation of short- and long-term memories, as well as the choice of network topology and training mechanism. In general, delayed copies of network signals can form short-term memory (STM), providing a limited temporal history of events similar to FIR filters, whereas the synaptic connection strengths as well as delayed feedback loops (ER circuits) can constitute longer-term memory (LTM). This dissertation introduces a new general evolutionary temporal neural network framework (GETnet) through automatic design of arbitrary neural networks with STM and LTM. GETnet is a step towards the realization of general intelligent systems that need minimal or no human intervention and can be applied to a broad range of problems. GETnet utilizes nonlinear moving-average/autoregressive nodes and sub-circuits that are trained by enhanced gradient descent and evolutionary search over architecture, synaptic delay, and synaptic weight spaces. The mixture of Lamarckian and Darwinian evolutionary mechanisms facilitates the Baldwin effect and speeds up the hybrid training. The ability to evolve arbitrary adaptive time-delay connections enables GETnet to find novel answers to many classification and system identification tasks expressed in the general form of desired multidimensional input and output signals. Simulations using the Mackey-Glass chaotic time series and fingerprint perspiration-induced temporal variations are given to demonstrate the above-stated capabilities of GETnet.
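The STM/LTM split described above maps naturally onto a nonlinear moving-average/autoregressive node: FIR taps over the input provide the short-term history, and delayed feedback of the node's own output provides the longer-term memory. The sketch below is a hypothetical minimal form of such a node (names and the tanh nonlinearity are assumptions, not GETnet's exact formulation):

```python
import numpy as np

# Hypothetical sketch of a nonlinear moving-average/autoregressive node:
# y[t] = tanh( sum_k b[k] * x[t-k]  +  sum_k a[k] * y[t-1-k] )
# b taps over inputs = STM (FIR-like); a taps over past outputs = LTM feedback.

class NmaArNode:
    def __init__(self, b, a):
        self.b = np.asarray(b, dtype=float)   # moving-average (STM) coefficients
        self.a = np.asarray(a, dtype=float)   # autoregressive (LTM) coefficients
        self.x_hist = np.zeros(len(self.b))   # delay line of past inputs
        self.y_hist = np.zeros(len(self.a))   # delay line of past outputs

    def step(self, x):
        self.x_hist = np.roll(self.x_hist, 1)
        self.x_hist[0] = x
        y = np.tanh(self.b @ self.x_hist + self.a @ self.y_hist)
        self.y_hist = np.roll(self.y_hist, 1)
        self.y_hist[0] = y
        return y
```

In a GETnet-style search, both the coefficient values (gradient descent) and the lengths of the two delay lines (evolutionary search over delay spaces) would be adapted.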
Neural network for processing both spatial and temporal data with time based back-propagation
Neural networks are computing systems modeled after the paradigm of the biological brain. For years, researchers using various forms of neural networks have attempted to model the brain's information processing and decision-making capabilities. Neural network algorithms have impressively demonstrated the capability of modeling spatial information. On the other hand, the application of parallel distributed models to the processing of temporal data has been severely restricted. The invention introduces a novel technique which adds the dimension of time to the well-known back-propagation neural network algorithm. In the space-time neural network disclosed herein, the synaptic weights between two artificial neurons (processing elements) are replaced with an adaptable, adjustable filter. Instead of a single synaptic weight, the invention provides a plurality of weights representing not only association but also temporal dependencies. In this case, the synaptic weights are the coefficients of the adaptable digital filters. Novelty is believed to lie in the disclosure of a processing element, and a network of such processing elements, capable of processing temporal as well as spatial data.
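The core idea, replacing each scalar synaptic weight with a small adaptable digital filter, can be sketched as follows (class and method names are illustrative, not from the patent):

```python
import numpy as np

# Illustrative sketch: a synapse that holds a small FIR filter instead of a
# single scalar weight, so the connection responds to a window of past inputs
# (temporal dependencies) rather than only the current sample.

class FirSynapse:
    def __init__(self, taps):
        self.h = np.asarray(taps, dtype=float)  # filter coefficients = "weights"
        self.buf = np.zeros(len(self.h))        # delay line of past inputs

    def forward(self, x):
        self.buf = np.roll(self.buf, 1)
        self.buf[0] = x
        return self.h @ self.buf                # weighted sum over time
```

Time-based back-propagation would then adapt all the coefficients in h, not just one weight, by propagating errors back through the filter's delay line.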