13,816 research outputs found

    Quantification Platform for Touch Response of Zebrafish Larvae using Machine Learning

    The touch-evoked response of zebrafish larvae provides information about the mechanisms of gene functional expression. Recently, an automated system has been developed for precise, repeatable touch-response experimentation with minimal human intervention. The data collected by this system are analyzed by an automated quantification pipeline to reach scientific conclusions, using five quantification criteria: latency time, maximum C-bend curvature, C-bend peak time, response time, and moving distance. To quantify these criteria, we propose a larva-tracking-based automatic quantification pipeline that uses a U-Net to initialize tracking, a particle filter as the tracking strategy, and region growing to segment the larvae. Experimental data from different treatments are analyzed with the proposed platform for demonstration, and the results show that the platform can generate comparable touch-response behavior quantification readouts in an efficient and automatic way. The platform thus provides an alternative for automatically screening drugs and discovering knowledge from the touch-response behavior patterns of zebrafish larvae mutated by chemicals.
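
    A minimal sketch of how the five readouts could be computed from tracker output, assuming the pipeline yields a per-frame body-curvature value and head position; the function name, frame rate, and curvature threshold are illustrative placeholders, not the platform's actual interface.

```python
import numpy as np

def quantify_touch_response(curvature, head_xy, stimulus_frame, fps=1000.0,
                            curvature_threshold=0.1):
    """Illustrative computation of the five touch-response readouts from a
    tracked larva: latency time, maximum C-bend curvature, C-bend peak time,
    response time, and moving distance. Inputs are assumed outputs of the
    tracking/segmentation stage (one value or (x, y) point per frame)."""
    curvature = np.asarray(curvature, dtype=float)
    head_xy = np.asarray(head_xy, dtype=float)

    post = curvature[stimulus_frame:]                  # frames after the touch
    moving = np.nonzero(np.abs(post) > curvature_threshold)[0]
    if moving.size == 0:
        return None                                    # larva did not respond

    onset = moving[0]                                  # first frame of the bend
    peak = onset + int(np.argmax(np.abs(post[onset:])))
    offset = moving[-1]                                # last frame above threshold

    displacement = np.diff(head_xy[stimulus_frame:], axis=0)
    distance = float(np.sum(np.linalg.norm(displacement, axis=1)))

    return {
        "latency_time_s": onset / fps,                 # touch -> movement onset
        "cbend_curvature_max": float(np.abs(post[peak])),
        "cbend_peak_time_s": peak / fps,               # touch -> maximal C-bend
        "response_time_s": (offset - onset) / fps,     # duration of the response
        "moving_distance": distance,                   # total path length
    }
```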

    Realtime State Estimation with Tactile and Visual Sensing. Application to Planar Manipulation

    Accurate and robust object state estimation enables successful object manipulation. Visual sensing is widely used to estimate object poses. However, in a cluttered scene or a tight workspace, the robot's end-effector often occludes the object from the visual sensor; the robot then loses visual feedback and must fall back on open-loop execution. In this paper, we integrate tactile and visual input using a framework for solving the SLAM problem, incremental smoothing and mapping (iSAM), to provide a fast and flexible solution. Visual sensing provides global pose information but is generally noisy, whereas contact sensing is local but yields measurements that are more accurate relative to the end-effector. By combining them, we aim to exploit their advantages and overcome their limitations. We explore the technique in the context of a pusher-slider system, adapt iSAM's measurement and motion costs to the pushing scenario, and use an instrumented setup to evaluate estimation quality with different object shapes, on different surface materials, and under different contact modes.
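
    The fusion idea can be sketched with the GTSAM library, whose iSAM2 solver implements incremental smoothing and mapping: noisy global poses from vision enter the graph as prior factors, while contact-derived relative motions enter as between factors. The noise values, symbols, and pushing "delta" below are assumptions for illustration, not the paper's actual factor design.

```python
import numpy as np
import gtsam

# Illustrative noise models: vision gives noisy global SE(2) poses,
# contact gives tighter relative-motion constraints between steps.
VISION_NOISE = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.02, 0.02, 0.10]))
CONTACT_NOISE = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.002, 0.002, 0.01]))

isam = gtsam.ISAM2()

def add_step(k, vision_pose, pushed_delta):
    """Add one time step: a vision measurement of the slider's global pose and
    a contact-derived estimate of its relative motion since step k-1."""
    graph = gtsam.NonlinearFactorGraph()
    values = gtsam.Values()
    xk = gtsam.symbol('x', k)

    # Global (noisy) pose from the visual sensor.
    graph.add(gtsam.PriorFactorPose2(xk, vision_pose, VISION_NOISE))
    if k > 0:
        # Local (more accurate) relative motion inferred from pushing contact.
        graph.add(gtsam.BetweenFactorPose2(gtsam.symbol('x', k - 1), xk,
                                           pushed_delta, CONTACT_NOISE))
    values.insert(xk, vision_pose)      # initial guess for the new pose
    isam.update(graph, values)          # incremental smoothing step
    return isam.calculateEstimate().atPose2(xk)

# Hypothetical example: a slider pushed roughly 1 cm forward per step.
est0 = add_step(0, gtsam.Pose2(0.00, 0.0, 0.0), None)
est1 = add_step(1, gtsam.Pose2(0.013, 0.002, 0.01), gtsam.Pose2(0.01, 0.0, 0.0))
print(est1)
```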

    Vision-based hand gesture interaction using particle filter, principal component analysis and transition network

    Vision-based human-computer interaction is becoming increasingly important. It offers natural interaction with computers and frees users from mechanical input devices, which is especially desirable for wearable computers. This paper presents a human-computer interaction system based on a conventional webcam and hand gesture recognition. The system works in real time and enables users to control the computer cursor with hand motions and gestures instead of a mouse. Five hand gestures are designed to represent five mouse operations: move, left click, left double-click, right click, and no action. A particle-filter-based algorithm tracks the hand position, PCA-based feature selection recognizes the hand gestures, and a transition network improves the accuracy and reliability of the interaction. The system shows good performance in recognition and interaction tests.
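
    The transition network can be pictured as a small debouncing state machine that only switches the active mouse operation after the classifier has reported the same gesture for several consecutive frames. The gesture labels, class name, and hold threshold below are illustrative assumptions, not the paper's exact network.

```python
NO_ACTION = "no-action"
GESTURES = {"move", "left-click", "left-double-click", "right-click", NO_ACTION}

class GestureTransitionNetwork:
    """Debounce per-frame classifier outputs: a new gesture becomes the active
    state only after it has been observed for `hold` consecutive frames, which
    suppresses spurious single-frame misclassifications."""

    def __init__(self, hold=5):
        self.hold = hold
        self.state = NO_ACTION        # currently active mouse operation
        self.candidate = NO_ACTION    # gesture currently being confirmed
        self.count = 0                # consecutive frames of the candidate

    def update(self, gesture):
        if gesture not in GESTURES:
            gesture = NO_ACTION
        if gesture == self.candidate:
            self.count += 1
        else:
            self.candidate, self.count = gesture, 1
        if self.count >= self.hold and self.candidate != self.state:
            self.state = self.candidate   # confirmed transition
        return self.state

# Example: a single mislabelled frame does not trigger a click.
net = GestureTransitionNetwork(hold=3)
for g in ["move", "move", "move", "left-click", "move", "move"]:
    print(net.update(g))
```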

    Ultra-wideband position tracking on an assembly line

    This work considers the problem of tracking objects on an assembly line using an ultra-wideband (UWB) positioning system. Assembly line tracking can be accomplished with touch sensors that physically detect when an object reaches a given location, but this requires sensors placed throughout the entire line and only provides readings at the sensor locations. Similar tracking can be achieved with radio frequency identification (RFID) sensing, which likewise only provides readings when parts are near RFID readers. In contrast, UWB position tracking uses a set of sensors surrounding the whole area, providing sensor readings continuously throughout the entire tracking region with less infrastructure. However, UWB position estimates are noisy, typically with an accuracy of 30-100 cm in a room- to building-sized area. This accuracy is sufficient for monitoring which part of the assembly line a part is currently traversing, but not for precise tooling or positioning. In this work, we use a map of the assembly line to constrain the motion tracking, much as a road map can constrain position tracking for a GPS sensor: the raw sensor measurements are constrained by the a priori known map of motion along the assembly line. We use these constraints to design a particle filter that improves position tracking accuracy.
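
    A sketch of a map-constrained particle filter along these lines, assuming the assembly line is modelled as a polyline and each particle carries an arc-length position along it; the class name, conveyor speed, and noise figures are illustrative placeholders, not the paper's parameters.

```python
import numpy as np

class LineConstrainedParticleFilter:
    """SIR particle filter whose state is the 1-D arc length travelled along a
    known assembly-line path (a polyline), so noisy 2-D UWB fixes are fused
    with the prior knowledge that parts can only move along the line."""

    def __init__(self, path_xy, n_particles=500, speed=0.05, speed_std=0.02,
                 uwb_std=0.5):
        self.path = np.asarray(path_xy, dtype=float)
        seg = np.diff(self.path, axis=0)
        self.seg_len = np.linalg.norm(seg, axis=1)
        self.cum_len = np.concatenate(([0.0], np.cumsum(self.seg_len)))
        self.s = np.zeros(n_particles)                 # arc length per particle
        self.w = np.full(n_particles, 1.0 / n_particles)
        self.speed, self.speed_std, self.uwb_std = speed, speed_std, uwb_std

    def _to_xy(self, s):
        """Map arc lengths back to 2-D points on the polyline."""
        s = np.clip(s, 0.0, self.cum_len[-1])
        i = np.clip(np.searchsorted(self.cum_len, s) - 1, 0, len(self.seg_len) - 1)
        t = (s - self.cum_len[i]) / self.seg_len[i]
        return self.path[i] + t[:, None] * (self.path[i + 1] - self.path[i])

    def step(self, uwb_xy, dt=1.0):
        n = len(self.s)
        # Predict: nominal conveyor speed plus noise, constrained to the path.
        self.s += dt * (self.speed + self.speed_std * np.random.randn(n))
        # Update: weight by the likelihood of the noisy UWB position fix.
        xy = self._to_xy(self.s)
        d = np.linalg.norm(xy - np.asarray(uwb_xy), axis=1)
        self.w = np.exp(-0.5 * (d / self.uwb_std) ** 2) + 1e-300
        self.w /= self.w.sum()
        estimate = self.w @ xy                          # weighted mean position
        # Systematic resampling to avoid particle degeneracy.
        u = (np.arange(n) + np.random.rand()) / n
        idx = np.minimum(np.searchsorted(np.cumsum(self.w), u), n - 1)
        self.s = self.s[idx]
        self.w[:] = 1.0 / n
        return estimate
```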

    PowerSpy: Location Tracking using Mobile Device Power Analysis

    Modern mobile platforms like Android enable applications to read aggregate power usage on the phone. This information is considered harmless, and reading it requires no user permission or notification. We show that by simply reading the phone's aggregate power consumption over a period of a few minutes, an application can learn information about the user's location. Aggregate phone power consumption data is extremely noisy due to the multitude of components and applications that simultaneously consume power. Nevertheless, by using machine learning algorithms we are able to successfully infer the phone's location. We discuss several ways in which this privacy leak can be remedied. Comment: USENIX Security 2015.
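
    The core inference can be illustrated as classifying a measured power trace against reference traces recorded on known routes. The resample-and-nearest-neighbour scheme below is a deliberately simplified stand-in for the paper's machine-learning pipeline (which also handles variable-length traces and unknown routes), and all names and data are hypothetical.

```python
import numpy as np

def resample_trace(power, n=256):
    """Normalize a power trace to a fixed length and zero mean / unit variance,
    so traces recorded over slightly different durations become comparable."""
    power = np.asarray(power, dtype=float)
    x_old = np.linspace(0.0, 1.0, len(power))
    x_new = np.linspace(0.0, 1.0, n)
    t = np.interp(x_new, x_old, power)
    return (t - t.mean()) / (t.std() + 1e-9)

def classify_route(query_power, reference_traces):
    """Return the label of the known route whose reference power trace is
    closest (Euclidean distance) to the query trace. `reference_traces` is a
    dict {route_label: power_samples} built from training drives."""
    q = resample_trace(query_power)
    dists = {label: np.linalg.norm(q - resample_trace(trace))
             for label, trace in reference_traces.items()}
    return min(dists, key=dists.get)

# Hypothetical example with synthetic traces for two known routes.
rng = np.random.default_rng(0)
refs = {"route_A": np.sin(np.linspace(0, 6, 400)) + 0.1 * rng.standard_normal(400),
        "route_B": np.cos(np.linspace(0, 9, 400)) + 0.1 * rng.standard_normal(400)}
query = np.sin(np.linspace(0, 6, 350)) + 0.2 * rng.standard_normal(350)
print(classify_route(query, refs))   # expected: "route_A"
```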

    A compact holographic optical tweezers instrument

    Holographic optical tweezers have found many applications, including the construction of complex micron-scale 3D structures and the control of tools and probes for position, force, and viscosity measurement. We have developed a compact, stable holographic optical tweezers instrument which can be easily transported and is compatible with a wide range of microscopy techniques, making it a valuable tool for collaborative research. The instrument measures approximately 30 × 30 × 35 cm and is designed around a custom inverted microscope incorporating a fibre laser operating at 1070 nm. We designed the control software to be easily accessible to non-specialists and have further improved its ease of use with a multi-touch iPad interface. A high-speed camera allows multiple trapped objects to be tracked simultaneously. We demonstrate that the compact instrument is stable to 0.5 nm for a 10 s measurement time by plotting the Allan variance of the measured position of a trapped 2 μm silica bead. We also present a range of objects that have been successfully manipulated.
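
    The reported stability figure comes from an Allan-variance analysis of the tracked bead position as a function of averaging time; a sketch of that computation is given below, with the sampling rate, array names, and averaging times chosen only for illustration.

```python
import numpy as np

def allan_variance(position, fs, taus):
    """Allan variance of a position time series `position` sampled at `fs` Hz,
    evaluated at averaging times `taus` (seconds):
    sigma^2(tau) = 0.5 * <(mean_{i+1} - mean_i)^2>, where the means are taken
    over consecutive non-overlapping blocks of length tau."""
    position = np.asarray(position, dtype=float)
    out = []
    for tau in taus:
        m = int(round(tau * fs))             # samples per averaging block
        if m < 1 or 2 * m > len(position):
            out.append(np.nan)               # tau too short or too long
            continue
        n_blocks = len(position) // m
        means = position[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        out.append(0.5 * np.mean(np.diff(means) ** 2))
    return np.array(out)

# Example: white positional noise recorded at 1 kHz for 10 s.
fs = 1000.0
x = 1e-9 * np.random.randn(int(10 * fs))     # position in metres
taus = np.logspace(-2, 0, 20)                # 10 ms .. 1 s averaging times
print(allan_variance(x, fs, taus))
```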

    A fast and robust hand-driven 3D mouse

    The development of new interaction paradigms requires natural interaction: people should be able to interact with technology using the same models they use in everyday life, that is, through gestures, expressions, and voice. Following this idea, this paper proposes a non-intrusive, vision-based tracking system able to capture hand motion and simple hand gestures. The proposed device allows the hand to be used as a "natural" 3D mouse, where the forefinger tip or the palm centre identifies a 3D marker and hand gestures simulate the mouse buttons. The approach is based on a monoscopic tracking algorithm that is computationally fast and robust against noise and cluttered backgrounds. Two image streams are processed in parallel, exploiting multi-core architectures, and their results are combined into a constrained stereoscopic problem. The system has been implemented and thoroughly tested in an experimental environment where the 3D hand mouse is used to interact with objects in a virtual reality application. We also report results on the tracker's performance, which demonstrate the precision and robustness of the proposed system.
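
    The stereoscopic step can be illustrated with standard two-view (DLT) triangulation: each image stream contributes a 2D fingertip detection, and the two camera projection matrices convert the pair into a 3D cursor position. The calibration matrices and detections below are illustrative, not the paper's setup.

```python
import numpy as np

def triangulate_point(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation: given 3x4 projection matrices P1, P2 of the
    two cameras and a matched fingertip detection (u, v) in each image, return
    the 3-D point minimizing the algebraic reprojection error."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.array([u1 * P1[2] - P1[0],
                  v1 * P1[2] - P1[1],
                  u2 * P2[2] - P2[0],
                  v2 * P2[2] - P2[1]])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]                      # de-homogenize

# Illustrative calibration: two identical cameras 10 cm apart along x.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

# Fingertip detections from the two (independently processed) image streams.
print(triangulate_point(P1, P2, (350.0, 250.0), (300.0, 250.0)))
```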

    Computationally efficient solutions for tracking people with a mobile robot: an experimental evaluation of Bayesian filters

    Service robots will soon become an essential part of modern society. Because they have to move and act in human environments, they must be equipped with a fast and reliable tracking system that localizes people in their neighbourhood, and it is therefore important to select the most appropriate filter for estimating the positions of these persons. This paper presents three efficient implementations of multisensor human tracking based on different Bayesian estimators: the Extended Kalman Filter (EKF), the Unscented Kalman Filter (UKF), and the Sampling Importance Resampling (SIR) particle filter. The system implemented on a mobile robot is explained, introducing the methods used to detect and estimate the positions of multiple people. The solutions based on the three filters are then discussed in detail. Several real experiments are conducted to evaluate their performance, which is compared in terms of accuracy, robustness, and execution time of the estimation. The results show that a UKF-based solution can perform as well as particle filters and is often a better choice when computational efficiency is a key issue.
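
    As a concrete reference point for the SIR variant, a particle filter for a constant-velocity person model with noisy 2D position detections can be written in a few lines; the particle count and noise parameters are placeholders, and the EKF/UKF alternatives would replace the sampling and weighting steps with linearized or sigma-point updates.

```python
import numpy as np

class SIRPersonTracker:
    """Sampling Importance Resampling particle filter tracking one person with
    a constant-velocity model; the state per particle is (x, y, vx, vy) and
    the measurement is a noisy 2-D position from the robot's people detector."""

    def __init__(self, init_xy, n=1000, accel_std=0.5, meas_std=0.3):
        self.p = np.zeros((n, 4))
        self.p[:, :2] = init_xy + 0.2 * np.random.randn(n, 2)
        self.w = np.full(n, 1.0 / n)
        self.accel_std, self.meas_std = accel_std, meas_std

    def step(self, z_xy, dt):
        n = len(self.p)
        # Predict with a constant-velocity model plus random acceleration.
        a = self.accel_std * np.random.randn(n, 2)
        self.p[:, :2] += self.p[:, 2:] * dt + 0.5 * a * dt ** 2
        self.p[:, 2:] += a * dt
        # Weight by the Gaussian likelihood of the position detection.
        d2 = np.sum((self.p[:, :2] - z_xy) ** 2, axis=1)
        self.w = np.exp(-0.5 * d2 / self.meas_std ** 2) + 1e-300
        self.w /= self.w.sum()
        estimate = self.w @ self.p            # weighted mean state
        # Systematic resampling to avoid particle degeneracy.
        u = (np.arange(n) + np.random.rand()) / n
        idx = np.minimum(np.searchsorted(np.cumsum(self.w), u), n - 1)
        self.p = self.p[idx]
        self.w[:] = 1.0 / n
        return estimate

# Example: track a person walking diagonally, observed with noisy detections.
tracker = SIRPersonTracker(init_xy=np.array([0.0, 0.0]))
for t in range(1, 6):
    z = np.array([t * 0.1, t * 0.1]) + 0.3 * np.random.randn(2)
    print(tracker.step(z, dt=0.1))
```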
    • ā€¦
    corecore