Event-Driven Imaging in Turbid Media: A Confluence of Optoelectronics and Neuromorphic Computation
In this paper, a new optical-computational method is introduced to unveil
images of targets whose visibility is severely obscured by light scattering in
dense, turbid media. The targets of interest are dynamic in that their optical
properties are time-varying, whether the targets are stationary in space or
moving. The scheme, to our knowledge the first of its kind, is inspired by
human vision: diffuse photons collected from the turbid medium are first
transformed into spike trains by a dynamic vision sensor, as in the retina, and
image reconstruction is then performed by a neuromorphic computing approach
mimicking the brain. We combine benchtop experimental data in both reflection
(backscattering) and transmission geometries with support from physics-based
simulations to develop a neuromorphic computational model and then apply this
for image reconstruction of different MNIST characters and image sets by a
dedicated deep spiking neural network algorithm. Image reconstruction is
achieved under conditions of turbidity where the original image is
unintelligible to the human eye or a digital video camera, yet clearly and
quantifiably identifiable when using the new neuromorphic computational
approach.
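The retina-inspired front end described above, where per-pixel intensity changes are turned into ON/OFF spike events, can be sketched in a few lines. This is an illustrative model of a dynamic vision sensor's change detection under assumed log-intensity input, not the authors' implementation, and the threshold value is an assumption.

```python
import numpy as np

def frames_to_events(frames, threshold=0.2):
    """Convert a stack of log-intensity frames into DVS-style events.

    Each pixel emits an ON (+1) or OFF (-1) event whenever its
    log-intensity changes by more than `threshold` since its last event,
    mimicking the change detection of a dynamic vision sensor.
    Returns a list of (t, y, x, polarity) tuples.
    """
    events = []
    ref = frames[0].astype(float)          # per-pixel reference level
    for t, frame in enumerate(frames[1:], start=1):
        diff = frame.astype(float) - ref
        on = diff > threshold
        off = diff < -threshold
        for y, x in zip(*np.nonzero(on)):
            events.append((t, int(y), int(x), 1))
        for y, x in zip(*np.nonzero(off)):
            events.append((t, int(y), int(x), -1))
        # the reference updates only where an event fired, as in a real DVS
        ref[on] = frame[on]
        ref[off] = frame[off]
    return events
```

The resulting event stream (rather than full frames) is what a spiking network downstream would consume.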
Dynamic Event-based Optical Identification and Communication
Optical identification is often done with spatial or temporal visual pattern
recognition and localization. Temporal pattern recognition, depending on the
technology, involves a trade-off between communication frequency, range and
accurate tracking. We propose a solution with light-emitting beacons that
improves this trade-off by exploiting fast event-based cameras and, for
tracking, sparse neuromorphic optical flow computed with spiking neurons. The
system is embedded in a simulated drone and evaluated in an asset monitoring
use case. It is robust to relative movements and enables simultaneous
communication with, and tracking of, multiple moving beacons. Finally, in a
hardware lab prototype, we demonstrate for the first time beacon tracking
performed simultaneously with state-of-the-art frequency communication in the
kHz range.
Comment: 10 pages, 7 figures and 1 table
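A beacon blinking at a fixed frequency can be identified from the timestamps of its ON events alone; the sketch below estimates that frequency from the median inter-event interval. This is a minimal illustration of the idea, not the paper's method, and it assumes one ON event per rising edge of the LED.

```python
import numpy as np

def beacon_frequency(on_event_times):
    """Estimate a blinking beacon's modulation frequency in Hz from the
    timestamps (seconds) of its ON events.

    Assumes one ON event per rising edge of the LED. The frequency is
    the reciprocal of the median interval between consecutive events,
    which tolerates a few spurious or missing events.
    """
    t = np.sort(np.asarray(on_event_times, dtype=float))
    intervals = np.diff(t)
    return 1.0 / np.median(intervals)
```

Because event timestamps carry microsecond-scale resolution, this kind of estimate remains usable at the kHz modulation rates mentioned above.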
Image Sensors in Security and Medical Applications
This paper briefly reviews CMOS image sensor technology and its utilization in security and medical
applications. The role and future trends of image sensors in each of these applications are discussed. To give
the reader a deeper understanding of the technology, the paper concentrates on selected applications,
namely surveillance, biometrics, capsule endoscopy, and the artificial retina. These applications were chosen
because of their importance in daily life and because they represent leading-edge applications for imaging
systems research and development. In addition, reviewing how image sensors are implemented in these
applications allows the reader to examine the technology from technical as well as other perspectives.
Harnessing the Potential of Optical Communications for the Metaverse
The Metaverse is a digital world that offers an immersive virtual experience.
However, Metaverse applications are bandwidth-hungry and delay-sensitive,
requiring ultra-high data rates, ultra-low latency, and highly intensive
computation. To cater to these requirements, optical communication arises as a
key pillar in bringing this paradigm into reality. We highlight in this paper
the potential of optical communications in the Metaverse. First, we set forth
Metaverse requirements in terms of capacity and latency and introduce the
ultra-high data rate requirements of various Metaverse experiences. We then
put forward the potential of optical communications to meet these data rate
requirements in the backbone, backhaul, fronthaul, and access segments. Both
optical fiber and optical wireless communication (OWC) technologies, as well as
their current and future expected data rates, are detailed. In addition, we
propose a comprehensive set of configurations, connectivity, and equipment
necessary for an immersive Metaverse experience. Finally, we identify a set of
key enablers and research directions such as analog neuromorphic optical
computing, optical intelligent reflective surfaces (IRS), hollow core fiber
(HCF), and terahertz (THz) communications.
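The scale of the data-rate requirements above can be illustrated with a back-of-envelope calculation. All numbers below (resolution, frame rate, bit depth, compression ratio) are illustrative assumptions, not figures taken from the paper.

```python
def raw_video_rate_gbps(width, height, fps, bits_per_pixel, compression_ratio):
    """Pixel rate of an uncompressed video stream divided by an assumed
    compression ratio, returned in Gbit/s."""
    raw_bps = width * height * fps * bits_per_pixel
    return raw_bps / compression_ratio / 1e9

# Illustrative VR stream: 4K per eye at 90 fps, 24 bits/pixel,
# with an assumed 100:1 compression ratio; two eyes.
rate = 2 * raw_video_rate_gbps(3840, 2160, 90, 24, 100)
```

Even under these generous compression assumptions a single immersive stream lands in the hundreds of Mbit/s, which is why the access segment is treated as a bottleneck.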
Event-based sensor fusion in human-machine teaming
Realizing intelligent production systems where machines and human workers can team up seamlessly demands a yet unreached level of situational awareness. The machines' leverage to reach such awareness is to amalgamate a wide variety of sensor modalities through multisensor data fusion. A particularly promising direction to establishing human-like collaborations can be seen in the use of neuro-inspired sensing and computing technologies due to their resemblance with human cognitive processing. This note discusses the concept of integrating neuromorphic sensing modalities into classical sensor fusion frameworks by exploiting event-based fusion and filtering methods that combine time-periodic process models with event-triggered sensor data. Event-based sensor fusion hence adopts the operating principles of event-based sensors and even exhibits the ability to extract information from absent data. Thereby, it can be an enabler to harness the full information potential of the intrinsic spiking nature of event-driven sensors
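The claim above that event-based fusion "extracts information from absent data" can be made concrete with a scalar send-on-delta filter: the sensor transmits only when its reading changes enough, and on silent steps the filter still updates, because silence implies the value is close to the last transmission. This is a simplified sketch of implicit-measurement filtering under assumed noise parameters, not a specific published algorithm.

```python
def send_on_delta_fusion(measurements, delta=0.5, q=0.01, r=0.1):
    """Scalar Kalman-style fusion with an event-triggered (send-on-delta)
    sensor.

    The sensor transmits only when a reading differs from the last
    transmitted value by more than `delta`. On steps with no event, the
    absence itself is informative: the true value must lie within
    +/- delta of the last transmission, folded in here as a
    pseudo-measurement with inflated variance.
    """
    x, p = measurements[0], 1.0         # state estimate and variance
    last_sent = measurements[0]
    estimates = []
    for z in measurements:
        p += q                          # predict (constant-value model)
        if abs(z - last_sent) > delta:  # event: a real measurement arrives
            last_sent = z
            z_eff, r_eff = z, r
        else:                           # no event: implicit measurement
            z_eff, r_eff = last_sent, r + delta**2 / 3.0
        k = p / (p + r_eff)             # Kalman gain
        x += k * (z_eff - x)
        p *= (1 - k)
        estimates.append(x)
    return estimates
```

The inflated variance `delta**2 / 3` corresponds to treating the silent-interval uncertainty as uniform over the +/- delta band, one common simplification.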
Smart Visual Beacons with Asynchronous Optical Communications using Event Cameras
Event cameras are bio-inspired dynamic vision sensors that respond to changes
in image intensity with a high temporal resolution, high dynamic range and low
latency. These sensor characteristics are ideally suited to enable visual
target tracking in concert with a broadcast visual communication channel for
smart visual beacons with applications in distributed robotics. Visual beacons
can be constructed by high-frequency modulation of Light Emitting Diodes (LEDs)
such as vehicle headlights, Internet of Things (IoT) LEDs, smart building
lights, etc., that are already present in many real-world scenarios. The high
temporal resolution characteristic of the event cameras allows them to capture
visual signals at far higher data rates compared to classical frame-based
cameras. In this paper, we propose a novel smart visual beacon architecture
with both LED modulation and event camera demodulation algorithms. We
quantitatively evaluate the relationship between LED transmission rate,
communication distance and the message transmission accuracy for the smart
visual beacon communication system that we prototyped. The proposed method
achieves up to 4 kbps in an indoor environment and lossless transmission over a
distance of 100 meters, at a transmission rate of 500 bps, in full sunlight,
demonstrating the potential of the technology in an outdoor environment.Comment: 7 pages, 8 figures, accepted by IEEE International Conference on
Intelligent Robots and Systems (IROS) 202
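The demodulation side of such a beacon link can be sketched for simple on-off keying: the event camera reports a polarity event at each LED transition, the receiver reconstructs the LED state, and samples it once per bit slot. This is an illustrative simplification that assumes exactly one event per transition, not the paper's demodulation algorithm.

```python
def decode_ook(events, n_slots, slot_us, initial_state=0):
    """Decode an on-off-keyed bit stream from event-camera events.

    `events` is a list of (timestamp_us, polarity) pairs for one beacon
    pixel: polarity +1 means the LED turned on, -1 means it turned off.
    The LED state is reconstructed from these transitions and sampled at
    the midpoint of each bit slot.
    """
    bits, state, i = [], initial_state, 0
    events = sorted(events)
    for slot in range(n_slots):
        mid = (slot + 0.5) * slot_us
        while i < len(events) and events[i][0] <= mid:
            state = 1 if events[i][1] > 0 else 0
            i += 1
        bits.append(state)
    return bits
```

Sampling at slot midpoints gives the decoder half a slot of timing slack in each direction, which is what makes kHz-rate signaling practical with microsecond event timestamps.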
Spiking Neural Networks -- Part III: Neuromorphic Communications
Synergies between wireless communications and artificial intelligence are
increasingly motivating research at the intersection of the two fields. On the
one hand, the presence of more and more wirelessly connected devices, each with
its own data, is driving efforts to export advances in machine learning (ML)
from high performance computing facilities, where information is stored and
processed in a single location, to distributed, privacy-minded, processing at
the end user. On the other hand, ML can address algorithm and model deficits in
the optimization of communication protocols. However, implementing ML models
for learning and inference on battery-powered devices that are connected via
bandwidth-constrained channels remains challenging. This paper explores two
ways in which Spiking Neural Networks (SNNs) can help address these open
problems. First, we discuss federated learning for the distributed training of
SNNs, and then describe the integration of neuromorphic sensing, SNNs, and
impulse radio technologies for low-power remote inference.
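The building block of the SNNs discussed above is the leaky integrate-and-fire (LIF) neuron, which can be simulated in a few lines. The time constant, threshold, and input values below are illustrative assumptions.

```python
def lif_spikes(input_current, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Simulate a leaky integrate-and-fire neuron.

    The membrane potential v leaks toward rest with time constant `tau`
    (in ms) while integrating the input current; whenever v crosses
    `v_thresh`, a spike is emitted and v is reset. Returns the binary
    spike train, one entry per time step of length `dt`.
    """
    v, spikes = 0.0, []
    for i_t in input_current:
        v += dt * (-v / tau + i_t)   # leaky integration (Euler step)
        if v >= v_thresh:
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return spikes
```

Because the neuron only emits a spike when its potential crosses threshold, computation and communication are event-driven, which is the property that makes SNNs attractive for the battery-powered, bandwidth-constrained devices described above.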
Neuromorphic hardware for somatosensory neuroprostheses
In individuals with sensory-motor impairments, missing limb functions can be restored using neuroprosthetic devices that interface directly with the nervous system. However, restoring the natural tactile experience through electrical neural stimulation requires complex encoding strategies, which are presently limited by bandwidth constraints in how effectively they convey or restore tactile sensations. Neuromorphic technology, which mimics the natural behavior of neurons and synapses, holds promise for replicating the encoding of natural touch and could inform neurostimulation design. In this perspective, we propose that incorporating neuromorphic technologies into neuroprostheses could be an effective approach for developing more natural human-machine interfaces, potentially leading to advancements in device performance, acceptability, and embeddability. We also highlight ongoing challenges and the actions required to facilitate the future integration of these advanced technologies.