The Feeling of Color: A Haptic Feedback Device for the Visually Disabled
Tapson J, Gurari N, Diaz J, et al. The Feeling of Color: A Haptic Feedback Device for the Visually Disabled. Presented at the Biomedical Circuits and Systems Conference (BIOCAS), Baltimore, MD. We describe a sensory augmentation system designed to provide the visually disabled with a sense of color. Our system consists of a glove with short-range optical color sensors mounted on its fingertips, and a torso-worn belt on which tactors (haptic feedback actuators) are mounted. Each fingertip sensor detects the observed object's color. This information is encoded to the tactors through vibrations at respective locations with varying modulations. Early results suggest that detection of primary colors is possible with near-100% accuracy and moderate latency, with a minimal amount of training.
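The color-to-tactor encoding described above can be sketched as follows. This is a minimal illustration, not the paper's actual design: the classification threshold, tactor indices, and vibration frequencies are all assumptions.

```python
# Hypothetical sketch of a fingertip-color-to-tactor encoding.
# Thresholds, tactor layout, and frequencies are assumptions,
# not values taken from the paper.

def classify_primary(r, g, b, threshold=128):
    """Map an 8-bit RGB reading to the dominant primary color."""
    channels = {"red": r, "green": g, "blue": b}
    name, value = max(channels.items(), key=lambda kv: kv[1])
    return name if value >= threshold else "none"

# Assumed mapping: each primary color drives a distinct belt tactor
# at a distinct vibration frequency (Hz).
TACTOR_MAP = {
    "red":   {"tactor": 0, "freq_hz": 150},
    "green": {"tactor": 1, "freq_hz": 200},
    "blue":  {"tactor": 2, "freq_hz": 250},
}

def encode(r, g, b):
    """Return the tactor command for one fingertip sensor reading."""
    color = classify_primary(r, g, b)
    return TACTOR_MAP.get(color)  # None when no primary dominates
```

For example, `encode(220, 30, 10)` selects tactor 0 at 150 Hz, while a dark reading below the threshold produces no vibration.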
Communication channel analysis and real time compressed sensing for high density neural recording devices
Next-generation neural recording and Brain-Machine Interface (BMI) devices call for high-density or distributed systems with more than 1,000 recording sites. As the recording-site density grows, the device generates data on the scale of several hundred megabits per second (Mbps). Transmitting such large amounts of data induces significant power consumption and heat dissipation in the implanted electronics. Facing these constraints, efficient on-chip compression techniques become essential to reducing the implanted system's power consumption. This paper analyzes the communication channel constraints for high-density neural recording devices, quantifies the improvement in the communication channel achievable with efficient on-chip compression methods, and finally describes a Compressed Sensing (CS) based system that can reduce the data rate by more than 10× while using power on the order of a few hundred nW per recording channel.
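The on-chip encoding side of a compressed-sensing system like the one described can be sketched with a random projection: a window of N samples is multiplied by an M×N measurement matrix with M ≪ N, and only the M measurements are transmitted. The dimensions and the ±1 Bernoulli matrix below are illustrative assumptions, not the paper's design.

```python
import numpy as np

# Sketch of a compressed-sensing encoder: project N neural samples onto
# M << N random vectors and transmit only the M measurements.
# N, M, and the Bernoulli matrix are illustrative assumptions.

N = 256   # samples per window per channel (assumed)
M = 24    # measurements kept (~10x reduction, matching the abstract's claim)

rng = np.random.default_rng(0)
Phi = rng.choice([-1.0, 1.0], size=(M, N))  # +/-1 entries are cheap in hardware

def compress(x):
    """Encode one window of samples as M random projections."""
    return Phi @ x

x = rng.standard_normal(N)   # stand-in for one window of neural signal
y = compress(x)              # only y (24 values) leaves the implant
```

Reconstruction (e.g., sparse recovery in a suitable basis) would run off-implant, where power is not the binding constraint.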
Arbitrated address event representation digital image sensor
An 80×60 (1/8 VGA) address-event imager in 0.6 µm CMOS converts light intensity into a one-bit code (a spike). The read-out of each spike is initiated by the pixel. The dynamic range is 200 dB for a pixel and 120 dB for the array. It uses 3.4 mW at a spike rate of 200 kHz. It is capable of 8.3 k effective frames/s.
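The pixel-initiated readout described above can be modeled in a few lines: each pixel integrates light and emits its (row, col) address when the accumulated value crosses a threshold, so brighter pixels emit events more often. The array size, units, and threshold below are assumptions for the sketch, not the chip's parameters.

```python
import numpy as np

# Toy model of address-event readout: a pixel fires (emits its address)
# when its integrated intensity crosses a threshold, then resets.
# Array size, intensity units, and threshold are assumptions.

def address_events(intensity, steps=100, thresh=10):
    """Simulate event-driven readout; returns a list of (t, row, col)."""
    acc = np.zeros_like(intensity)
    events = []
    for t in range(steps):
        acc += intensity                      # integrate photocurrent
        for r, c in zip(*np.nonzero(acc >= thresh)):
            events.append((t, int(r), int(c)))
        acc[acc >= thresh] = 0                # reset fired pixels
    return events

img = np.zeros((4, 4), dtype=int)
img[0, 0] = 5      # bright pixel: crosses the threshold every 2 steps
img[3, 3] = 1      # dim pixel: crosses the threshold every 10 steps
evts = address_events(img)
```

Note that the event rate, not a frame clock, carries the intensity: the bright pixel above produces five times as many events as the dim one.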
Hardware Implementation of a Visual-Motion Pixel Using Oriented Spatiotemporal Neural Filters
A pixel for measuring two-dimensional (2-D) visual motion with two one-dimensional (1-D) detectors has been implemented in very large scale integration. Based on the spatiotemporal feature extraction model of Adelson and Bergen, the pixel is realized using a general-purpose analog neural computer and a silicon retina. Because the neural computer only offers sum-and-threshold neurons, the Adelson and Bergen's model is modified. The quadratic nonlinearity is replaced with a full-wave rectification, while the contrast normalization is replaced with edge detection and thresholding. Motion is extracted in two dimensions by using two 1-D detectors with spatial smoothing orthogonal to the direction of motion. Analysis shows that our pixel, although it has some limitations, has much lower hardware complexity compared to the full 2-D model. It also produces more accurate results and has a reduced aperture problem compared to the two 1-D model with no smoothing. Real-time velocity is represented as a distribution of activity of the 18 X and 18 Y velocity-tuned neural filters.
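The modified motion-energy stage described above, with oriented space-time filters followed by full-wave rectification in place of the quadratic nonlinearity, can be sketched as below. The toy diagonal kernels and the drifting-bar stimulus are assumptions, not the chip's actual filters.

```python
import numpy as np

# Sketch of a rectified motion-energy stage: correlate a (time x space)
# stimulus with direction-tuned space-time kernels and full-wave rectify
# (take |.| of) each local response instead of squaring it.
# Kernels and stimulus are toy assumptions.

def oriented_kernel(direction, size=5):
    """Diagonal space-time kernel tuned to one direction of motion."""
    k = np.eye(size)
    return k if direction == "right" else k[:, ::-1]

def motion_energy(stim, kernel):
    """Sum of full-wave rectified correlations over all positions."""
    T, X = stim.shape
    kt, kx = kernel.shape
    energy = 0.0
    for t in range(T - kt + 1):
        for x in range(X - kx + 1):
            resp = np.sum(stim[t:t+kt, x:x+kx] * kernel)
            energy += abs(resp)               # full-wave rectification
    return energy

stim = np.eye(8)                              # a bar drifting rightward
e_right = motion_energy(stim, oriented_kernel("right"))
e_left  = motion_energy(stim, oriented_kernel("left"))
```

The rightward-tuned filter responds far more strongly to the rightward-drifting bar than the leftward-tuned one, which is the directional selectivity the pixel exploits.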
Pix2HDR -- A pixel-wise acquisition and deep learning-based synthesis approach for high-speed HDR videos
Accurately capturing dynamic scenes with wide-ranging motion and light
intensity is crucial for many vision applications. However, acquiring
high-speed high dynamic range (HDR) video is challenging because the camera's
frame rate restricts its dynamic range. Existing methods sacrifice speed to
acquire multi-exposure frames. Yet, misaligned motion in these frames can still
pose complications for HDR fusion algorithms, resulting in artifacts. Instead
of frame-based exposures, we sample the videos using individual pixels at
varying exposures and phase offsets. Implemented on a pixel-wise programmable
image sensor, our sampling pattern simultaneously captures fast motion at a
high dynamic range. We then transform pixel-wise outputs into an HDR video
using end-to-end learned weights from deep neural networks, achieving high
spatiotemporal resolution with minimized motion blurring. We demonstrate
aliasing-free HDR video acquisition at 1000 FPS, resolving fast motion under
low-light conditions and against bright backgrounds - both challenging
conditions for conventional cameras. By combining the versatility of pixel-wise
sampling patterns with the strength of deep neural networks at decoding complex
scenes, our method greatly enhances the vision system's adaptability and
performance in dynamic conditions.
CPU-less robotics: distributed control of biomorphs
Traditional robotics revolves around the microprocessor. All well-known demonstrations of sensory-guided motor control, such as jugglers and mobile robots, require at least one CPU. Recently, the availability of fast CPUs has made real-time sensory-motor control possible; however, problems with high power consumption and lack of autonomy still remain. In fact, the best examples of real-time robotics are usually tethered or require large batteries. We present a new paradigm for robotics control that uses no explicit CPU. We use computational sensors that are directly interfaced with adaptive actuation units. The units perform motor control and have learning capabilities. This architecture distributes computation over the entire body of the robot, in every sensor and actuator. Clearly, this is similar to biological sensory-motor systems. Some researchers have tried to model the latter in software, again using CPUs. We demonstrate this idea with an adaptive locomotion controller chip. The locomotory controller for walking, running, swimming and flying animals is based on a Central Pattern Generator (CPG). CPGs are modeled as systems of coupled non-linear oscillators that control muscles responsible for movement. Here we describe an adaptive CPG model, implemented in a custom VLSI chip, which is used to control an under-actuated and asymmetric robotic leg.
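The coupled non-linear oscillators mentioned above can be illustrated with a toy two-oscillator CPG: each phase oscillator is pulled toward anti-phase with the other, so the pair settles into the alternating rhythm characteristic of locomotion. The frequency and coupling gain are assumptions; the chip's adaptive rules are not modeled.

```python
import math

# Toy CPG: two phase oscillators with anti-phase coupling, a common
# abstraction of coupled non-linear oscillators for locomotion.
# Gains, frequency, and time step are assumptions.

def simulate(freq_hz=1.0, k=2.0, dt=0.001, steps=5000):
    """Euler-integrate two coupled oscillators; return the final phase lag."""
    th1, th2 = 0.0, 0.3          # arbitrary initial phases (rad)
    w = 2 * math.pi * freq_hz
    for _ in range(steps):
        # each oscillator is pulled toward anti-phase with the other
        th1 += dt * (w + k * math.sin(th2 - th1 - math.pi))
        th2 += dt * (w + k * math.sin(th1 - th2 - math.pi))
        # (a motor command for a joint would be e.g. sin(th1))
    return (th2 - th1) % (2 * math.pi)

lag = simulate()
```

Regardless of the initial phases, the phase difference converges to π (half a cycle), i.e., the two "legs" alternate, without any central program sequencing the steps.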