
    Embedded Artificial Intelligence for Tactile Sensing

    Electronic tactile sensing has become an active research field, whether for prosthetic applications, robotics, virtual reality, or post-stroke patient rehabilitation. To achieve such sensing, an array of sensors is used to retrieve human-skin-like information, forming what is called electronic skin (E-skin). Through their skin, humans collect different types of information, e.g. pressure, temperature, and texture, which are passed to the nervous system and finally to the brain in order to extract high-level information from these sensory data. To make E-skin capable of such a task, the data acquired from it should be filtered, processed, and then conveyed to the user (or robot). Processing this sensory information should occur in real time, taking into consideration the power limitations of such applications, especially prosthetics. Power consumption itself depends on several factors: one is the complexity of the algorithm, e.g. the number of FLOPs, and another is the memory consumption. In this thesis, I focus on the processing of real tactile information by 1) exploring different algorithms and methods for tactile data classification, 2) organizing and preprocessing such tactile data, and 3) hardware implementation. More precisely, the focus is on deep learning algorithms for tactile data processing, mainly CNNs and RNNs, with energy-efficient embedded implementations. The proposed solution has shown lower memory use, fewer FLOPs, and lower latency compared to the state of the art (including a tensorial SVM) applied to real tactile sensor data. Keywords: E-skin, tactile data processing, deep learning, CNN, RNN, LSTM, GRU, embedded, energy-efficient algorithms, edge computing, artificial intelligence.
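
    As a rough illustration of the kind of model this thesis targets, the sketch below defines a deliberately small CNN for classifying windows of E-skin pressure frames in PyTorch. The 4x4 taxel grid, window length, three output classes, and layer sizes are illustrative assumptions, not the architecture reported in the thesis.

```python
# Minimal sketch (not the thesis architecture): a small CNN for classifying
# short sequences of e-skin pressure frames. The taxel grid size, window
# length, and class count below are illustrative assumptions.
import torch
import torch.nn as nn

class TinyTactileCNN(nn.Module):
    def __init__(self, n_classes=3, frames=10):
        super().__init__()
        # Treat the temporal window of 4x4 pressure frames as input channels.
        self.features = nn.Sequential(
            nn.Conv2d(frames, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # keeps FLOPs and memory small
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):                      # x: (batch, frames, 4, 4)
        return self.classifier(self.features(x).flatten(1))

model = TinyTactileCNN()
dummy = torch.randn(8, 10, 4, 4)               # batch of 8 tactile windows
print(model(dummy).shape)                      # -> torch.Size([8, 3])
```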

    Empiricism without Magic: Transformational Abstraction in Deep Convolutional Neural Networks

    In artificial intelligence, recent research has demonstrated the remarkable potential of Deep Convolutional Neural Networks (DCNNs), which seem to exceed state-of-the-art performance in new domains weekly, especially on the sorts of very difficult perceptual discrimination tasks that skeptics thought would remain beyond the reach of artificial intelligence. However, it has proven difficult to explain why DCNNs perform so well. In philosophy of mind, empiricists have long suggested that complex cognition is based on information derived from sensory experience, often appealing to a faculty of abstraction. Rationalists have frequently complained, however, that empiricists never adequately explained how this faculty of abstraction actually works. In this paper, I tie these two questions together, to the mutual benefit of both disciplines. I argue that the architectural features that distinguish DCNNs from earlier neural networks allow them to implement a form of hierarchical processing that I call “transformational abstraction”. Transformational abstraction iteratively converts sensory-based representations of category exemplars into new formats that are increasingly tolerant to “nuisance variation” in input. Reflecting upon the way that DCNNs leverage a combination of linear and non-linear processing to efficiently accomplish this feat allows us to understand how the brain is capable of bi-directional travel between exemplars and abstractions, addressing longstanding problems in empiricist philosophy of mind. I end by considering the prospects for future research on DCNNs, arguing that rather than simply implementing 80s connectionism with more brute-force computation, transformational abstraction counts as a qualitatively distinct form of processing ripe with philosophical and psychological significance, because it is significantly better suited to depict the generic mechanism responsible for this important kind of psychological processing in the brain
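
    The notion of "transformational abstraction" rests on stacking a linear filter, a non-linearity, and pooling. The toy NumPy demo below (my own illustration, not from the paper) shows how that combination yields a pooled code that is unchanged by a small "nuisance" shift of the input.

```python
# Toy illustration (not from the paper) of how a linear filter, a
# non-linearity, and pooling produce a representation tolerant to a small
# "nuisance" shift of the input.
import numpy as np

def conv_relu_maxpool(signal, kernel, pool=4):
    response = np.maximum(np.convolve(signal, kernel, mode="same"), 0.0)  # linear + ReLU
    trimmed = response[: len(response) // pool * pool]
    return trimmed.reshape(-1, pool).max(axis=1)                          # max-pooling

edge = np.array([1.0, -1.0])                   # simple edge-detecting kernel
x = np.zeros(16); x[5:9] = 1.0                 # a "category exemplar": a bump
x_shifted = np.roll(x, 1)                      # same exemplar, nuisance-shifted

print(conv_relu_maxpool(x, edge))
print(conv_relu_maxpool(x_shifted, edge))      # pooled codes come out identical here
```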

    Classification of Colorectal Cancer Polyps via Transfer Learning and Vision-Based Tactile Sensing

    In this study, to address the current high early-detection miss rate of colorectal cancer (CRC) polyps, we explore the potential of utilizing transfer learning and machine learning (ML) classifiers to precisely and sensitively classify the type of CRC polyps. Instead of using common colonoscopic images, we applied three different ML algorithms to the 3D textural image outputs of a unique vision-based surface tactile sensor (VS-TS). To collect realistic textural images of CRC polyps for training the utilized ML classifiers and evaluating their performance, we first designed and additively manufactured 48 types of realistic polyp phantoms with different hardness, types, and textures. Next, the performance of the three ML algorithms in classifying the type of the fabricated polyps was quantitatively evaluated using various statistical metrics. (Accepted to the IEEE Sensors 2022 Conference.)
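
    The general transfer-learning recipe described here, a pretrained CNN reused as a feature extractor with a classical ML classifier trained on top, can be sketched as follows. The ResNet-18 backbone, the SVM classifier, and the toy labels are assumptions made for illustration, since the abstract does not name the models used.

```python
# Sketch of the general transfer-learning recipe: a pretrained CNN is reused
# as a frozen feature extractor for textural images, and a classical ML
# classifier is trained on top. Backbone and classifier choices here are
# assumptions, not the paper's.
import torch
import torchvision.models as models
from sklearn.svm import SVC

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()          # drop the ImageNet classification head
backbone.eval()

@torch.no_grad()
def extract_features(images):              # images: (N, 3, 224, 224) tensor
    return backbone(images).numpy()        # (N, 512) feature vectors

# Hypothetical training data: textural images of polyp phantoms with labels.
train_images = torch.randn(48, 3, 224, 224)
train_labels = [i % 3 for i in range(48)]  # e.g. 3 polyp types (toy labels)

clf = SVC(kernel="rbf").fit(extract_features(train_images), train_labels)
```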

    Embedded Machine Learning: Emphasis on Hardware Accelerators and Approximate Computing for Tactile Data Processing

    Machine Learning (ML), a subset of Artificial Intelligence (AI), is driving the industrial and technological revolution of the present and future. We envision a world with smart devices that are able to mimic human behavior (sense, process, and act) and perform tasks that we once thought could only be carried out by humans. The vision is to achieve such a level of intelligence with affordable, power-efficient, and fast hardware platforms. However, embedding machine learning algorithms in many application domains, such as the Internet of Things (IoT), prostheses, robotics, and wearable devices, is an ongoing challenge, one controlled by the computational complexity of ML algorithms, the performance and availability of hardware platforms, and the application's budget (power constraints, real-time operation, etc.). In this dissertation, we focus on the design and implementation of efficient ML algorithms to handle the aforementioned challenges. First, we apply Approximate Computing Techniques (ACTs) to reduce the computational complexity of ML algorithms. Then, we design custom hardware accelerators to improve the performance of the implementation within a specified budget. Finally, a tactile data processing application is adopted for the validation of the proposed exact and approximate embedded machine learning accelerators. The dissertation starts with an introduction of the various ML algorithms used for tactile data processing. These algorithms are assessed in terms of their computational complexity and the available hardware platforms that could be used for implementation. Afterward, a survey of existing approximate computing techniques and hardware accelerator design methodologies is presented. Based on the findings of the survey, an approach for applying algorithmic-level ACTs to machine learning algorithms is provided. Then three novel hardware accelerators are proposed: (1) a k-Nearest Neighbor (kNN) accelerator based on a selection-based sorter, (2) a Tensorial Support Vector Machine (TSVM) accelerator based on shallow neural networks, and (3) a hybrid-precision Binary Convolutional Neural Network (BCNN). The three accelerators offer real-time classification with substantial reductions in hardware resources and power consumption compared to existing implementations targeting the same tactile data processing application on FPGA. Moreover, the approximate accelerators maintain high classification accuracy, with a loss of at most 5%.
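
    One of the ideas above, the kNN accelerator built around a selection-based sorter, boils down to selecting only the k smallest distances rather than fully sorting them. The sketch below illustrates that principle in software; it is not the dissertation's hardware design, np.argpartition merely plays the role of the partial selection, and the dataset and touch classes are made up.

```python
# Illustrative software analogue of a "selection-based sorter" for kNN:
# select the k smallest distances without a full sort.
import numpy as np

def knn_predict(train_x, train_y, query, k=3):
    dists = np.sum((train_x - query) ** 2, axis=1)       # squared distances
    nearest = np.argpartition(dists, k)[:k]               # partial selection, O(N)
    votes = train_y[nearest]
    return np.bincount(votes).argmax()                    # majority vote

rng = np.random.default_rng(0)
train_x = rng.normal(size=(100, 8))                       # 100 samples, 8 features (toy data)
train_y = rng.integers(0, 3, size=100)                    # 3 touch classes (toy labels)
print(knn_predict(train_x, train_y, rng.normal(size=8)))
```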

    Human-Machine Interfaces using Distributed Sensing and Stimulation Systems

    As technology moves towards more natural human-machine interfaces (e.g. bionic limbs, teleoperation, virtual reality), it is necessary to develop sensory feedback systems in order to foster embodiment and achieve better immersion in the control system. Contemporary feedback interfaces presented in research use few sensors and stimulation units and feed back at most two discrete variables (e.g. grasping force and aperture), whereas the human sense of touch relies on a distributed network of mechanoreceptors providing a wide bandwidth of information. To provide this type of feedback, it is necessary to develop a distributed sensing system that can extract a wide range of information during the interaction between the robot and the environment. In addition, a distributed feedback interface is needed to deliver such information to the user. This thesis proposes the development of a distributed sensing system (e-skin) to acquire tactile sensation, a first integration of the distributed sensing system on a robotic hand, the development of a sensory feedback system that comprises the distributed sensing system and a distributed stimulation system, and finally the implementation of deep learning methods for the classification of tactile data. Its core focus is the development and testing of a sensory feedback system based on the latest distributed sensing and stimulation techniques. To this end, the thesis comprises two introductory chapters that describe the state of the art in the field, the objectives, the methodology used, and the contributions, as well as six studies that tackled the development of human-machine interfaces.

    Learning to grasp in unstructured environments with deep convolutional neural networks using a Baxter Research Robot

    Recent advancements in Deep Learning have accelerated the capabilities of robotic systems in terms of visual perception, object manipulation, automated navigation, and human-robot collaboration. The capability of a robotic system to manipulate objects in unstructured environments is becoming an increasingly necessary skill. Due to the dynamic nature of these environments, traditional methods that require expert human knowledge fail to adapt automatically. After reviewing the relevant literature, a method was proposed to utilise deep transfer learning techniques to detect object grasps from coloured depth images. A grasp describes how a robotic end-effector can be arranged to securely grasp an object and successfully lift it without slippage. In this study, a ResNet-50 convolutional neural network (CNN) model is trained on the Cornell grasp dataset. The training was completed within 30 hours using a workstation PC with accelerated GPU support via an NVIDIA Titan X. The trained grasp detection model was further evaluated with a Baxter research robot and a Microsoft Kinect-v2, and a grasp detection accuracy of 93.91% was achieved on a diverse set of novel objects. Physical grasping trials were conducted on a set of 8 different objects. The overall system achieves an average grasp success rate of 65.0% while performing grasp detection in under 25 milliseconds. The results analysis concluded that objects with reasonably straight edges and moderately pronounced heights above the table are easily detected and grasped by the system.
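
    A minimal sketch of the transfer-learning setup described, assuming a ResNet-50 backbone with a small regression head producing a five-parameter grasp rectangle; the head, loss, and input encoding are assumptions for illustration rather than the exact configuration used in the study.

```python
# Rough sketch: a pretrained ResNet-50 with a small head regressing a grasp
# rectangle (x, y, angle, width, height) from an RGB-D-derived image. The
# head, loss, and input encoding are assumptions, not the thesis's setup.
import torch
import torch.nn as nn
import torchvision.models as models

class GraspNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 5)  # grasp params

    def forward(self, x):      # x: (batch, 3, 224, 224), e.g. colourised depth
        return self.backbone(x)

net = GraspNet()
optimiser = torch.optim.Adam(net.parameters(), lr=1e-4)
loss_fn = nn.SmoothL1Loss()

images = torch.randn(4, 3, 224, 224)          # placeholder Cornell-style batch
targets = torch.randn(4, 5)                   # placeholder grasp rectangles
loss = loss_fn(net(images), targets)
loss.backward()
optimiser.step()
```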

    Science of Facial Attractiveness


    Varieties of Attractiveness and their Brain Responses


    Embedded Electronic Systems for Electronic Skin Applications

    The advances in sensor devices are potentially providing new solutions to many applications, including prosthetics and robotics. Endowing an upper-limb prosthesis with tactile sensors (electronic/sensitive skin) makes it possible to provide tactile sensory feedback to amputees. In this regard, the prosthetic device is meant to be equipped with a tactile sensing system allowing the user's limb to receive tactile feedback about objects and contact surfaces. Thus, an embedded tactile sensing system is required, with wearable sensors covering wide areas of the prosthesis. However, embedding a sensing system involves a set of challenges in terms of power consumption, data processing, real-time response, and design scalability (an e-skin may include a large number of tactile sensors). The tactile sensing system is constituted of: (i) a tactile sensor array, (ii) an interface electronics circuit, (iii) an embedded processing unit, and (iv) a communication interface to transmit tactile data. The objective of the thesis is to develop an efficient embedded tactile sensing system targeting e-skin applications (e.g. prosthetics) by: 1) developing a low-power and miniaturized interface electronics circuit operating in real time; 2) proposing an efficient algorithm for embedded tactile data processing, which affects the system's time latency and power consumption; 3) implementing an efficient communication channel/interface suitable for the large amount of data generated by a large number of sensors. Most of the interface electronics for tactile sensing systems proposed in the literature are composed of signal conditioning and commercial data acquisition devices (i.e. DAQs). However, these devices are bulky (PC-based) and thus not suitable for portable prosthetics from the size, power consumption, and scalability points of view. Regarding tactile data processing, some works have exploited machine learning methods for extracting meaningful information from tactile data. However, embedding these algorithms poses challenges because of 1) the high amount of data to be processed, which significantly affects real-time functionality, and 2) the complex processing tasks, which impose a burden in terms of power consumption. On the other hand, the literature shows a lack of studies addressing data transfer in tactile sensing systems; dealing with a large number of sensors will pose challenges in terms of communication bandwidth and reliability. Therefore, this thesis exploits three approaches: 1) Developing low-power and miniaturized Interface Electronics (IE), capable of interfacing with and acquiring signals from a large number of tactile sensors in real time. We developed a portable IE system based on a low-power ARM microcontroller and a DDC232 A/D converter that handles an array of 32 tactile sensors. Upon touch applied to the sensors, the IE acquires and pre-processes the sensor signals at low power consumption, achieving a battery lifetime of about 22 hours. We then assessed the functionality of the IE by carrying out electrical and electromechanical characterization experiments to monitor the response of the interface electronics with PVDF-based piezoelectric sensors. The results of the electrical and electromechanical tests validate the correct functionality of the proposed system. In addition, we implemented filtering methods on the IE that reduced the effect of noise in the system. Furthermore, we evaluated the proposed IE by integrating it into a tactile sensory feedback system, showing effective delivery of tactile data to the user. The proposed system overcomes similar state-of-the-art solutions, dealing with a higher number of input channels while maintaining real-time functionality. 2) Optimizing and implementing a tensorial machine learning algorithm for touch-modality classification on an embedded Zynq System-on-Chip (SoC). The algorithm is based on a Support Vector Machine classifier that discriminates between three input touch modalities: "brushing", "rolling", and "sliding". We introduced an efficient algorithm minimizing the hardware implementation complexity in terms of number of operations and memory storage, which directly affect time latency and power consumption. With respect to the original algorithm, the proposed approach, implemented on the Zynq SoC, reduced the number of operations per inference from 545 M-ops to 18 M-ops and the memory storage from 52.2 KB to 1.7 KB. Moreover, the proposed method speeds up the inference time by a factor of 43.7 at a cost of only a 2% loss in accuracy, enabling the algorithm to run on the embedded processing unit and to extract tactile information in real time. 3) Implementing a robust and efficient data transfer channel to transfer the aggregated data at a high transmission data rate and low power consumption. In this approach, we proposed and demonstrated a tactile sensory feedback system based on an optical communication link for prosthetic applications. The optical link features low power and a wide transmission bandwidth, which makes the feedback system suitable for a large number of tactile sensors. The low-power transmission is due to the employed UWB-based optical modulation. We implemented a system prototype consisting of digital transmitter and receiver boards and acquisition circuits to interface 32 piezoelectric sensors. We then evaluated the system performance by measuring, processing, and transmitting data from the 32 piezoelectric sensors at a 100 Mbps data rate through the optical link, at a communication energy consumption of 50 pJ/bit. Experimental results validated the functionality and demonstrated the real-time operation of the proposed sensory feedback system.
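
    A quick back-of-the-envelope script, using only figures quoted in this abstract plus one clearly marked assumption (16-bit samples per taxel), makes the reported gains concrete: roughly a 30x reduction in both operations and memory, and an optical link costing about 5 mW at its full data rate.

```python
# Back-of-the-envelope check using only figures quoted in the abstract,
# plus one assumption (16-bit samples per taxel) marked below.
M = 1e6

ops_before, ops_after = 545 * M, 18 * M
mem_before, mem_after = 52.2, 1.7                   # KB
print(f"ops reduction:    {ops_before / ops_after:.1f}x")       # ~30.3x
print(f"memory reduction: {mem_before / mem_after:.1f}x")       # ~30.7x

link_rate = 100 * M                                  # 100 Mbps optical link
energy_per_bit = 50e-12                              # 50 pJ/bit
print(f"link power at full rate: {link_rate * energy_per_bit * 1e3:.1f} mW")   # 5.0 mW

bits_per_frame = 32 * 16                             # 32 sensors x 16-bit samples (assumption)
print(f"time to stream one frame: {bits_per_frame / link_rate * 1e6:.2f} us")  # ~5.12 us
```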

    Signal and Information Processing Methods for Embedded Robotic Tactile Sensing Systems

    The human skin has several sensors with different properties and responses that are able to detect stimuli resulting from mechanical stimulation. Pressure sensors are the most important type of receptor for the exploration and manipulation of objects. In the last decades, smart tactile sensing based on different sensing techniques has been developed, as its application in robotics and prosthetics is considered of huge interest, mainly driven by the prospect of autonomous and intelligent robots that can interact with the environment. However, regarding the estimation of object properties on robots, hardness detection is still a major limitation due to the lack of techniques to estimate it. Furthermore, finding processing methods that can interpret the measured information from multiple sensors and extract relevant information is a challenging task. Moreover, embedding processing methods and machine learning algorithms in robotic applications to extract meaningful information, such as object properties, from tactile data is an ongoing challenge, which is controlled by the device constraints (power constraints, memory constraints, etc.), the computational complexity of the processing and machine learning algorithms, and the application requirements (real-time operation, high prediction performance). In this dissertation, we focus on the design and implementation of pre-processing methods and machine learning algorithms to handle the aforementioned challenges for a tactile sensing system in robotic applications. First, we propose a tactile sensing system for robotic applications. Then we present efficient pre-processing and feature extraction methods for our tactile sensors. Next, we propose a learning strategy to reduce the computational cost of our processing unit in object classification using a sensorized Baxter robot. Finally, we present a real-time robotic tactile sensing system for hardness classification on resource-constrained devices. The first study represents a further assessment of the sensing system, which is based on the PVDF sensors and the interface electronics developed in our lab. In particular, it first presents the development of a skin patch (a multilayer structure) that allows us to use the sensors in several applications, such as robotic hands/grippers. Second, it shows the characterization of the developed skin patch. Third, it validates the sensing system. Moreover, we designed a filter to remove noise and detect touch. The experimental assessment demonstrated that the developed skin patch and the interface electronics can indeed detect different touch patterns and stimulus waveforms. Moreover, the results of the experiments defined the frequency range of interest and the response of the system to realistic interactions, such as grasp and release events. In the next study, we presented an easy integration of our tactile sensing system into the Baxter gripper. Computationally efficient pre-processing techniques were designed to filter the signal and extract relevant information from multiple sensor signals, in addition to feature extraction methods. These processing methods aim, in turn, to also reduce the computational complexity of the machine learning algorithms utilized for object classification. The proposed system and processing strategy were evaluated on an object classification application by integrating our system into the gripper and collecting data while grasping multiple objects. We further proposed a learning strategy to accomplish a trade-off between generalization accuracy and the computational cost of the whole processing unit. The proposed pre-processing and feature extraction techniques, together with the learning strategy, have led to models with extremely low complexity and very high generalization accuracy. Moreover, the support vector machine achieved the best trade-off between accuracy and computational cost on tactile data from our sensors. Finally, we presented the development and implementation on the edge of a real-time tactile sensing system for hardness classification on the Baxter robot based on machine and deep learning algorithms. We developed and implemented in plain C a set of functions that provide the fundamental layer functionalities of the machine learning and deep learning (ML and DL) models, along with the pre-processing methods to extract the features and normalize the data. The models can be deployed to any device that supports C code, since they do not rely on any existing libraries. Shallow ML/DL algorithms suitable for deployment on resource-constrained devices were designed. To evaluate our work, we collected data by grasping objects of different hardness and shape. Two classification problems were addressed: 5 levels of hardness classified on objects of the same shape, and 5 levels of hardness classified on objects of two different shapes. Furthermore, optimization techniques were employed. The models and pre-processing were implemented on a resource-constrained device, where we assessed the performance of the system in terms of accuracy, memory footprint, time latency, and energy consumption. For both classification problems we achieved real-time inference (< 0.08 ms), low energy consumption (3.35 μJ), extremely small models (1576 bytes), and high accuracy (above 98%).
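
    The pre-processing and shallow-classifier pipeline outlined above can be sketched roughly as follows; the specific window statistics and the linear SVM are assumptions for illustration, and the dissertation's actual deployment is in plain C on the embedded target rather than Python.

```python
# Sketch of the kind of pipeline described: simple per-window statistical
# features from multi-channel tactile signals feeding a shallow classifier.
# The feature set and the linear SVM are illustrative assumptions.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def extract_features(window):
    """window: (n_samples, n_channels) slice of tactile signals."""
    return np.concatenate([
        window.mean(axis=0),          # static pressure level per channel
        window.std(axis=0),           # signal spread
        np.abs(window).max(axis=0),   # peak response during the grasp
    ])

rng = np.random.default_rng(1)
windows = rng.normal(size=(200, 100, 4))         # 200 grasps, 100 samples, 4 taxels (toy data)
labels = rng.integers(0, 5, size=200)            # 5 hardness levels (toy labels)

X = np.stack([extract_features(w) for w in windows])
clf = make_pipeline(StandardScaler(), LinearSVC()).fit(X, labels)
print(clf.predict(X[:3]))
```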