10,073 research outputs found

    Manufacturing process applications team (MATeam)

    Activities of the manufacturing applications team (MATeam) in effecting widespread transfer of NASA technology to aid in the solution of manufacturing problems in the industrial sector are described. During the program's first year of operation, 450 companies, industry associations, and government agencies were contacted, 150 manufacturing problems were documented, and 20 potential technology transfers were identified. Although none of the technology transfers has been commercialized and put into use, several are in the applications engineering phase, and others are in the early stages of implementation. The technology transfer process is described, and guidelines used for the preparation of problem statements are included.

    Concepts for microgravity experiments utilizing gloveboxes

    The need for glovebox facilities on spacecraft in which microgravity materials-processing experiments are performed is discussed. Such facilities are currently being designed, and some of their capabilities are briefly described. A list of experiment concepts that would require or benefit from such facilities is presented.

    Way of ignorance

    Preprint of an article by Jules Winterton, Associate Director and Librarian at the Institute of Advanced Legal Studies.

    A Grasping-centered Analysis for Cloth Manipulation

    Compliant and soft hands have gained a lot of attention in the past decade because of their ability to adapt to the shape of objects, increasing their effectiveness for grasping. However, when it comes to grasping highly flexible objects such as textiles, we face the dual problem: it is the object that adapts to the shape of the hand or gripper. In this context, classic grasp analysis and grasping taxonomies are not suitable for describing grasps of textile objects. This work proposes a novel definition of textile object grasps that abstracts from the robotic embodiment or hand shape and recovers concepts from the early neuroscience literature on hand prehension skills. This framework enables us to identify which grasps have been used in the literature so far to perform robotic cloth manipulation, and allows for a precise definition of all the tasks that have been tackled in terms of manipulation primitives based on regrasps. In addition, we also review which grippers have been used. Our analysis shows that the vast majority of cloth manipulations have relied on only one type of grasp, and at the same time we identify several tasks that need a greater variety of grasp types to be executed successfully. Our framework is generic, provides a classification of cloth manipulation primitives, and can inspire gripper design and benchmark construction for cloth manipulation.
    Comment: 13 pages, 4 figures, 4 tables. Accepted for publication at IEEE Transactions on Robotics
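The embodiment-independent grasp abstraction described above can be sketched as a small data model. The grasp categories and the example primitive below are hypothetical illustrations; the paper's actual taxonomy and naming may differ.

```python
from dataclasses import dataclass
from enum import Enum, auto

# Hypothetical grasp categories loosely inspired by the prehension
# literature; not the paper's actual taxonomy.
class GraspType(Enum):
    PINCH = auto()   # fingertip opposition on a cloth edge or corner
    PALMAR = auto()  # cloth pressed between fingers and palm
    SCOOP = auto()   # non-prehensile support on a flat surface

@dataclass(frozen=True)
class Regrasp:
    """A transition that releases one grasp and acquires another."""
    release: GraspType
    acquire: GraspType

@dataclass
class ManipulationPrimitive:
    """A cloth task described as an initial grasp plus a regrasp sequence."""
    name: str
    initial: GraspType
    regrasps: tuple

    def grasp_types_used(self):
        used = {self.initial}
        for r in self.regrasps:
            used.add(r.acquire)
        return used

# Example: a made-up corner-based folding primitive.
fold = ManipulationPrimitive(
    name="edge fold",
    initial=GraspType.PINCH,
    regrasps=(Regrasp(GraspType.PINCH, GraspType.PALMAR),),
)
```

Describing tasks this way makes the paper's observation checkable in data: tallying `grasp_types_used()` across a task catalogue reveals how heavily existing work leans on a single grasp type.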

    Manipulator system man-machine interface evaluation program

    Applications and requirements for remote manipulator systems for future space missions were investigated. A manipulator evaluation program was established to study the effects of various system parameters on operator performance of tasks necessary for remotely manned missions. The program and laboratory facilities are described. Evaluation criteria and philosophy are discussed.

    Comparing Piezoresistive Substrates for Tactile Sensing in Dexterous Hands

    While tactile skins have been shown to be useful for detecting collisions between a robotic arm and its environment, they have not been extensively used for improving robotic grasping and in-hand manipulation. We propose a novel sensor design for covering existing multi-fingered robot hands. We analyze the performance of four different piezoresistive materials using both fabric and anti-static foam substrates in benchtop experiments. We find that although the piezoresistive foam was designed as packing material and not as a sensing substrate, it performs comparably to fabrics specifically designed for this purpose. While these results demonstrate the potential of piezoresistive foams for tactile sensing applications, they do not fully characterize the efficacy of these sensors for robot manipulation. As such, we use a high-density foam substrate to develop a scalable tactile skin that can be attached to the palm of a robotic hand. We demonstrate several robotic manipulation tasks using this sensor to show its ability to reliably detect and localize contact, as well as to analyze contact patterns during grasping and transport tasks.
    Comment: 10 figures, 8 pages, submitted to ICRA 202
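As a rough illustration of how a piezoresistive sensing element is commonly read out (this is not the paper's circuit; the supply voltage, resistor value, ADC resolution, and threshold below are all assumptions), each taxel can be treated as the variable leg of a voltage divider sampled by an ADC:

```python
# Illustrative constants, not from the paper.
V_SUPPLY = 3.3        # volts
R_FIXED = 10_000.0    # ohms, fixed divider resistor on the low side
ADC_MAX = 4095        # 12-bit ADC full scale

def taxel_resistance(adc_count: int) -> float:
    """Convert an ADC reading of the divider midpoint to taxel resistance.

    Divider: v_out = V_SUPPLY * R_FIXED / (R_taxel + R_FIXED),
    so R_taxel = R_FIXED * (V_SUPPLY - v_out) / v_out.
    """
    v_out = V_SUPPLY * adc_count / ADC_MAX
    # Clamp away from the rails to avoid division by zero.
    v_out = min(max(v_out, 1e-6), V_SUPPLY - 1e-6)
    return R_FIXED * (V_SUPPLY - v_out) / v_out

def contact_detected(adc_count: int, threshold_ohms: float = 5_000.0) -> bool:
    """Piezoresistive materials drop in resistance under pressure,
    so contact shows up as resistance falling below a threshold."""
    return taxel_resistance(adc_count) < threshold_ohms
```

Comparing substrates then reduces to comparing resistance-versus-load curves measured through the same readout, which is presumably how foams and fabrics can be benchmarked against each other.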

    Design and Fabrication of Fabric-Reinforced Textile Actuators for Soft Robotic Graspers

    Wearable assistive devices have been greatly improved thanks to advancements in soft robotics, even enabling the creation of soft extra arms for paralyzed patients. Grasping remains an active area of research for soft extra limbs. Soft robotics allows the creation of grippers whose inherent compliance makes them lightweight, safer for human interaction, more robust in unknown environments, and simpler to control than their rigid counterparts. A current problem in soft robotics is the lack of seamless integration of soft grippers into wearable devices, which is in part due to the elastomeric materials used for the creation of most of these grippers. This work introduces fabric-reinforced textile actuators (FRTA). The selection of materials, the design logic of the fabric reinforcement layer, and the fabrication method are discussed. The relationship between the fabric reinforcement characteristics and the actuator deformation is studied and experimentally verified. The FRTA are made of a combination of a hyper-elastic fabric material with a stiffer fabric reinforcement on top. In this thesis, the design, fabrication, and evaluation of FRTAs are explored. It is shown that by varying the geometry of the reinforcement layer, a variety of motions can be achieved, such as axial extension, radial expansion, bending, and twisting along the central axis. Multi-segmented actuators can be created by tailoring different sections of fabric reinforcement together in order to generate a combination of motions that perform specific tasks. The applicability of these actuators to soft grippers is demonstrated by designing, and providing a preliminary evaluation of, an anthropomorphic soft robotic hand capable of grasping daily-living objects of various sizes and shapes.
    Dissertation/Thesis: Masters Thesis, Biomedical Engineering, 201
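Bending soft actuators of this kind are often analyzed with a constant-curvature model, where the segment bends into a circular arc. A minimal sketch follows; the linear pressure-to-angle gain and segment length are made-up placeholders, not values from the thesis.

```python
import math

# Illustrative parameters, not from the thesis.
ACTUATOR_LENGTH = 0.10   # meters, arc length of one bending segment
GAIN_RAD_PER_KPA = 0.02  # hypothetical linear pressure-to-bend-angle gain

def tip_position(pressure_kpa: float):
    """Tip (x, y) of a constant-curvature segment.

    The segment of fixed arc length L bends through angle theta, so it
    lies on a circle of radius L/theta; the tip is then
    (r*sin(theta), r*(1 - cos(theta))).
    """
    theta = GAIN_RAD_PER_KPA * pressure_kpa
    if abs(theta) < 1e-9:                 # unpressurized: straight segment
        return (ACTUATOR_LENGTH, 0.0)
    radius = ACTUATOR_LENGTH / theta
    return (radius * math.sin(theta), radius * (1.0 - math.cos(theta)))
```

Multi-segment actuators like those described above can be modeled by chaining such segments, composing each segment's tip pose into the next segment's base frame.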

    ์ธ๊ฐ„ ๊ธฐ๊ณ„ ์ƒํ˜ธ์ž‘์šฉ์„ ์œ„ํ•œ ๊ฐ•๊ฑดํ•˜๊ณ  ์ •ํ™•ํ•œ ์†๋™์ž‘ ์ถ”์  ๊ธฐ์ˆ  ์—ฐ๊ตฌ

    Doctoral dissertation -- Seoul National University Graduate School: College of Engineering, Department of Mechanical and Aerospace Engineering, 2021.8. Advisor: Dongjun Lee.
    The hand-based interface is promising for realizing intuitive, natural, and accurate human-machine interaction (HMI), as the human hand is the main source of dexterity in our daily activities. The thesis therefore begins with a human perception study on the detection threshold of visuo-proprioceptive conflict (i.e., the allowable tracking error) with and without cutaneous haptic feedback, and suggests a tracking-error specification for realistic and fluid hand-based HMI. The thesis then proposes a novel wearable hand tracking module which, to be compatible with cutaneous haptic devices that emit magnetic noise, opportunistically employs heterogeneous sensors (an IMU/compass module and a soft sensor) reflecting the anatomical properties of the human hand, making it suitable for a specific application (finger-based interaction with fingertip haptic devices). This hand tracking module, however, loses tracking when interacting with, or even near, electrical machines or ferromagnetic materials. To address this, the thesis presents its main contribution, a novel visual-inertial skeleton tracking (VIST) framework, which provides accurate and robust hand (and finger) motion tracking even in many challenging real-world scenarios and environments for which state-of-the-art technologies are known to fail due to their respective fundamental limitations (e.g., severe occlusion for tracking purely with vision sensors; electromagnetic interference for tracking purely with IMUs (inertial measurement units) and compasses; and mechanical contact for tracking purely with soft sensors).
    The proposed VIST framework comprises a sensor glove with multiple IMUs and passive visual markers, a head-mounted stereo camera, and a tightly-coupled filtering-based visual-inertial fusion algorithm that estimates the hand/finger motion and auto-calibrates hand/glove-related kinematic parameters simultaneously while taking the hand's anatomical constraints into account. The VIST framework exhibits good tracking accuracy and robustness, affordable material cost, lightweight hardware and software, and ruggedness/durability sufficient even to permit washing. Quantitative and qualitative experiments validate the advantages and properties of the VIST framework, clearly demonstrating its potential for real-world applications.
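The predict/correct structure of a tightly-coupled filtering-based visual-inertial fusion can be sketched in a drastically reduced form: one translational axis with a position/velocity state, IMU acceleration driving the prediction and a visual marker position driving the correction. The real VIST filter additionally estimates full skeletal kinematics and calibration parameters; everything below, including the noise values, is illustrative.

```python
import numpy as np

def ekf_predict(x, P, accel, dt, accel_var=1e-2):
    """Propagate the [position, velocity] state with an IMU acceleration."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input mapping
    x = F @ x + B * accel
    Q = np.outer(B, B) * accel_var          # process noise from IMU noise
    P = F @ P @ F.T + Q
    return x, P

def ekf_correct(x, P, marker_pos, meas_var=1e-4):
    """Correct with a visual marker position measurement."""
    H = np.array([[1.0, 0.0]])              # the camera observes position only
    y = marker_pos - H @ x                  # innovation
    S = H @ P @ H.T + meas_var              # innovation covariance
    K = P @ H.T / S                         # Kalman gain
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Toy run: a stationary hand starting with a 0.5 m position error; the
# visual corrections pull the estimate back toward the true position 0.
x, P = np.array([0.5, 0.0]), np.eye(2)
for _ in range(20):
    x, P = ekf_predict(x, P, accel=0.0, dt=0.01)
    x, P = ekf_correct(x, P, marker_pos=0.0)
```

In the tightly-coupled setting described in the abstract, the same innovation machinery is what resolves marker correspondences: the filter's predicted marker positions are compared against detections in the raw images rather than against a separately solved pose.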
    • โ€ฆ
    corecore