10,073 research outputs found
Manufacturing process applications team (MATeam)
Activities of the manufacturing applications team (MATeam) in effecting widespread transfer of NASA technology to aid in the solution of manufacturing problems in the industrial sector are described. During the program's first year of operation, 450 companies, industry associations, and government agencies were contacted, 150 manufacturing problems were documented, and 20 potential technology transfers were identified. Although none of the technology transfers has been commercialized and put into use, several are in the applications engineering phase, and others are in the early stages of implementation. The technology transfer process is described and guidelines used for the preparation of problem statements are included
Concepts for microgravity experiments utilizing gloveboxes
The need for glovebox facilities on spacecraft in which microgravity materials processing experiments are performed is discussed. At present such facilities are being designed, and some of their capabilities are briefly described. A list of experiment concepts which would require or benefit from such facilities is presented
Way of ignorance
Preprint of an article by Jules Winterton, Associate Director and Librarian at the Institute of Advanced Legal Studies
A Grasping-centered Analysis for Cloth Manipulation
Compliant and soft hands have gained a lot of attention in the past decade because of their ability to adapt to the shape of objects, increasing their effectiveness for grasping. However, when it comes to grasping highly flexible objects such as textiles, we face the dual problem: it is the object that adapts to the shape of the hand or gripper. In this context, classic grasp analysis and grasping taxonomies are not suitable for describing grasps of textile objects. This work proposes a novel definition of textile object grasps that abstracts from the robotic embodiment or hand shape and recovers concepts from the early neuroscience literature on hand prehension skills. This framework enables us to identify which grasps have been used in the literature so far to perform robotic cloth manipulation, and allows for a precise definition of all the tasks that have been tackled in terms of manipulation primitives based on regrasps. In addition, we also review which grippers have been used. Our analysis shows that the vast majority of cloth manipulations have relied on only one type of grasp, and at the same time we identify several tasks that need a greater variety of grasp types to be executed successfully. Our framework is generic, provides a classification of cloth manipulation primitives, and can inspire gripper design and benchmark construction for cloth manipulation.
Comment: 13 pages, 4 figures, 4 tables. Accepted for publication at IEEE Transactions on Robotics
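The framework's core idea, describing a cloth manipulation task as a sequence of primitives separated by regrasps, independent of any particular gripper, can be sketched as a small data model. The grasp categories and the example task below are hypothetical illustrations, not the paper's actual taxonomy:

```python
from dataclasses import dataclass
from enum import Enum, auto

class GraspType(Enum):
    """Hypothetical embodiment-free grasp categories for textiles
    (illustrative labels, not the paper's taxonomy)."""
    PINCH = auto()             # fingertip-only hold on a patch of fabric
    PINCH_ON_SURFACE = auto()  # pressing the cloth against a support surface
    EDGE_GRASP = auto()        # holding a hem or corner between finger pads

@dataclass(frozen=True)
class Primitive:
    """One manipulation step: which grasp is active and what motion follows."""
    grasp: GraspType
    motion: str  # free-form motion label, e.g. "lift", "trace-edge"

def regrasp_count(task: list) -> int:
    """A regrasp occurs wherever the grasp type changes between steps."""
    return sum(1 for a, b in zip(task, task[1:]) if a.grasp != b.grasp)

# Example: a hypothetical unfolding task described without any gripper details.
unfold = [
    Primitive(GraspType.PINCH, "lift"),
    Primitive(GraspType.EDGE_GRASP, "trace-edge"),
    Primitive(GraspType.PINCH_ON_SURFACE, "flatten"),
]
print(regrasp_count(unfold))  # → 2
```

Counting grasp-type changes like this is one way such a representation makes the paper's observation testable, namely that most existing work uses a single grasp type throughout a task.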
Manipulator system man-machine interface evaluation program
Application and requirements for remote manipulator systems for future space missions were investigated. A manipulator evaluation program was established to study the effects of various systems parameters on operator performance of tasks necessary for remotely manned missions. The program and laboratory facilities are described. Evaluation criteria and philosophy are discussed
Comparing Piezoresistive Substrates for Tactile Sensing in Dexterous Hands
While tactile skins have been shown to be useful for detecting collisions between a robotic arm and its environment, they have not been extensively used for improving robotic grasping and in-hand manipulation. We propose a novel sensor design for use in covering existing multi-fingered robot hands. We analyze the performance of four different piezoresistive materials using both fabric and anti-static foam substrates in benchtop experiments. We find that although the piezoresistive foam was designed as packing material and not for use as a sensing substrate, it performs comparably with fabrics specifically designed for this purpose. While these results demonstrate the potential of piezoresistive foams for tactile sensing applications, they do not fully characterize the efficacy of these sensors for use in robot manipulation. As such, we use a high-density foam substrate to develop a scalable tactile skin that can be attached to the palm of a robotic hand. We demonstrate several robotic manipulation tasks using this sensor to show its ability to reliably detect and localize contact, as well as analyze contact patterns during grasping and transport tasks.
Comment: 10 figures, 8 pages, submitted to ICRA 202
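Piezoresistive substrates like the fabrics and foams compared here are typically read out with a voltage divider: applied pressure lowers the material's resistance, which shifts the divider's tap voltage seen by an ADC. The sketch below illustrates that readout principle only; the divider topology, supply voltage, reference resistor, and threshold are assumptions, not the paper's circuit:

```python
def sensor_resistance(adc_counts: int, adc_max: int = 1023,
                      vcc: float = 5.0, r_ref: float = 10_000.0) -> float:
    """Recover the piezoresistive element's resistance from a divider reading.
    Assumed topology: Vcc -- sensor -- (ADC tap) -- r_ref -- GND, so
    V_tap = Vcc * r_ref / (r_ref + R_sensor)."""
    v_tap = vcc * adc_counts / adc_max
    return r_ref * (vcc / v_tap - 1.0)

def contact_detected(adc_counts: int, baseline_r: float,
                     drop_ratio: float = 0.5) -> bool:
    """Flag contact when resistance falls below a fraction of its no-load
    value (pressure on a piezoresistive substrate lowers its resistance)."""
    return sensor_resistance(adc_counts) < drop_ratio * baseline_r
```

With this topology, pressing the substrate raises the ADC count (lower sensor resistance leaves more of the supply across the reference resistor), which is how contact and its location can be localized across a grid of such taps.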
Design and Fabrication of Fabric Reinforced Textile Actuators for Soft Robotic Graspers
abstract: Wearable assistive devices have been greatly improved thanks to advancements made in soft robotics, even enabling the creation of soft extra arms for paralyzed patients. Grasping remains an active area of research for soft extra limbs. Soft robotics allows the creation of grippers whose inherent compliance makes them lightweight, safer for human interaction, more robust in unknown environments, and simpler to control than their rigid counterparts. A current problem in soft robotics is the lack of seamless integration of soft grippers into wearable devices, which is in part due to the elastomeric materials used for the creation of most of these grippers. This work introduces fabric-reinforced textile actuators (FRTA). The selection of materials, the design logic of the fabric reinforcement layer, and the fabrication method are discussed. The relationship between the fabric reinforcement characteristics and the actuator deformation is studied and experimentally verified. The FRTA are made of a combination of a hyper-elastic fabric material with a stiffer fabric reinforcement on top. In this thesis, the design, fabrication, and evaluation of FRTAs are explored. It is shown that by varying the geometry of the reinforcement layer, a variety of motions can be achieved, such as axial extension, radial expansion, bending, and twisting along the central axis. Multi-segmented actuators can be created by joining different sections of fabric reinforcement in order to generate a combination of motions to perform specific tasks. The applicability of these actuators for soft grippers is demonstrated by designing and providing a preliminary evaluation of an anthropomorphic soft robotic hand capable of grasping daily living objects of various sizes and shapes.
Dissertation/Thesis: Masters Thesis, Biomedical Engineering, 201
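The abstract's central claim, that the reinforcement-layer geometry selects the actuator's motion and that multi-segment actuators compose these motions, can be captured as a simple lookup. The motion types are those listed in the abstract; the reinforcement-pattern names are hypothetical labels chosen for illustration:

```python
# Pattern names are hypothetical; the motion types are those the thesis
# reports for the FRTA design space.
REINFORCEMENT_TO_MOTION = {
    "circumferential_rings": "axial extension",  # rings restrict radial growth
    "axial_strips": "radial expansion",          # strips restrict lengthening
    "one_sided_layer": "bending",                # asymmetric stiffness
    "helical_wrap": "twisting",                  # biased shear on inflation
}

def segment_motions(segments: list) -> list:
    """Predict the motion sequence of a multi-segment actuator, one motion
    per fabric-reinforcement pattern along its length."""
    return [REINFORCEMENT_TO_MOTION[s] for s in segments]

# A two-segment, finger-like actuator: extend, then curl.
print(segment_motions(["circumferential_rings", "one_sided_layer"]))
# → ['axial extension', 'bending']
```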
Robust and Accurate Hand Motion Tracking for Human-Machine Interaction
Thesis (Ph.D.) -- Seoul National University Graduate School: Department of Mechanical and Aerospace Engineering, August 2021. Advisor: Dongjun Lee.
The hand-based interface is promising for realizing intuitive, natural, and accurate human-machine interaction (HMI), as the human hand is the main source of dexterity in our daily activities.
To this end, the thesis begins with a human perception study on the detection threshold of visuo-proprioceptive conflict (i.e., the allowable tracking error) with and without cutaneous haptic feedback, and suggests a tracking-error specification for realistic and fluid hand-based HMI. The thesis then proposes a novel wearable hand tracking module which, to be compatible with cutaneous haptic devices that emit magnetic noise, opportunistically employs heterogeneous sensors (an IMU/compass module and a soft sensor) reflecting the anatomical properties of the human hand, making it suitable for a specific application (i.e., finger-based interaction with fingertip haptic devices).
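A common baseline for fusing such heterogeneous sensors is a complementary filter: the gyro rate is responsive but drifts, while a soft sensor gives a noisy but drift-free absolute joint angle. The sketch below is a generic one-joint baseline under those assumptions, not the thesis's actual tracking algorithm, and all parameter values are illustrative:

```python
def track_joint_angle(gyro_rates, soft_angles, dt=0.01, alpha=0.98):
    """Complementary filter: integrate the gyro rate (responsive but drifting)
    and pull the estimate toward the soft sensor's absolute angle (noisy but
    drift-free) at every step. alpha trades responsiveness against drift."""
    angle = soft_angles[0]                # initialize from the absolute sensor
    for rate, soft in zip(gyro_rates, soft_angles):
        predicted = angle + rate * dt                    # IMU propagation
        angle = alpha * predicted + (1 - alpha) * soft   # soft-sensor pull
    return angle

# With a biased gyro (0.1 rad/s offset) on a static joint held at 0.5 rad,
# pure integration would drift by 0.1 rad over 100 steps; the blended
# estimate stays close to the soft sensor's value.
est = track_joint_angle([0.1] * 100, [0.5] * 100)
```

The same trade-off motivates using a compass (absolute heading) alongside the IMU in the module above; the complication the thesis addresses is that cutaneous haptic actuators corrupt exactly that magnetic reference.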
This hand tracking module, however, loses tracking when interacting with, or operating near, electrical machines or ferromagnetic materials. To address this, the thesis presents its main contribution, a novel visual-inertial skeleton tracking (VIST) framework that provides accurate and robust hand (and finger) motion tracking even in many challenging real-world scenarios and environments,
for which the state-of-the-art technologies are known to fail due to their respective fundamental limitations (e.g., severe occlusions for tracking purely with vision sensors; electromagnetic interference for tracking purely with IMUs (inertial measurement units) and compasses; and mechanical contacts for tracking purely with soft sensors).
The proposed VIST framework comprises a sensor glove with multiple IMUs and passive visual markers as well as a head-mounted stereo camera; and a tightly-coupled filtering-based visual-inertial fusion algorithm to estimate the hand/finger motion and auto-calibrate hand/glove-related kinematic parameters simultaneously while taking into account the hand anatomical constraints.
The VIST framework exhibits good tracking accuracy and robustness, affordable material cost, lightweight hardware and software, and enough ruggedness and durability even to permit washing.
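The tightly-coupled filtering described above can be illustrated, in heavily simplified form, by a two-state EKF for a single joint angle that predicts with IMU rates and corrects with a visual marker measurement while auto-calibrating the gyro bias. The state, noise parameters, and scalar measurement model below are illustrative assumptions; the actual VIST filter estimates full hand/glove kinematics under anatomical constraints:

```python
import numpy as np

def ekf_step(x, P, gyro_rate, visual_angle, dt=0.01,
             q_angle=1e-4, q_bias=1e-4, r_vis=1e-2):
    """One filter cycle for x = [joint_angle, gyro_bias] (illustrative model).
    Prediction with IMU: the angle advances by the bias-corrected gyro rate
    and the bias follows a random walk. Correction with vision: a marker
    measurement observes the absolute angle directly."""
    F = np.array([[1.0, -dt],
                  [0.0, 1.0]])                 # state transition Jacobian
    x = np.array([x[0] + (gyro_rate - x[1]) * dt, x[1]])
    P = F @ P @ F.T + np.diag([q_angle, q_bias])
    H = np.array([[1.0, 0.0]])                 # vision observes the angle
    S = H @ P @ H.T + r_vis                    # innovation covariance
    K = (P @ H.T) / S                          # Kalman gain, shape (2, 1)
    x = x + (K * (visual_angle - x[0])).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

Feeding consistent gyro and vision data through this loop drives the bias state toward the true gyro offset; the same mechanism, at much larger state dimension, is what lets a tightly-coupled filter auto-calibrate sensor-attachment and hand-geometry parameters online.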
Quantitative and qualitative experiments are also performed to validate the advantages and properties of the VIST framework, clearly demonstrating its potential for real-world applications.
1 Introduction
1.1. Motivation
1.2. Related Work
1.3. Contribution
2 Detection Threshold of Hand Tracking Error
2.1. Motivation
2.2. Experimental Environment
2.2.1. Hardware Setup
2.2.2. Virtual Environment Rendering
2.2.3. HMD Calibration
2.3. Identifying the Detection Threshold of Tracking Error
2.3.1. Experimental Setup
2.3.2. Procedure
2.3.3. Experimental Result
2.4. Enlarging the Detection Threshold of Tracking Error by Haptic Feedback
2.4.1. Experimental Setup
2.4.2. Procedure
2.4.3. Experimental Result
2.5. Discussion
3 Wearable Finger Tracking Module for Haptic Interaction
3.1. Motivation
3.2. Development of Finger Tracking Module
3.2.1. Hardware Setup
3.2.2. Tracking Algorithm
3.2.3. Calibration Method
3.3. Evaluation for VR Haptic Interaction Task
3.3.1. Quantitative Evaluation of FTM
3.3.2. Implementation of Wearable Cutaneous Haptic Interface
3.3.3. Usability Evaluation for VR Peg-in-hole Task
3.4. Discussion
4 Visual-Inertial Skeleton Tracking for Human Hand
4.1. Motivation
4.2. Hardware Setup and Hand Models
4.2.1. Human Hand Model
4.2.2. Wearable Sensor Glove
4.2.3. Stereo Camera
4.3. Visual Information Extraction
4.3.1. Marker Detection in Raw Images
4.3.2. Cost Function for Point Matching
4.3.3. Left-Right Stereo Matching
4.4. IMU-Aided Correspondence Search
4.5. Filtering-based Visual-Inertial Sensor Fusion
4.5.1. EKF States for Hand Tracking and Auto-Calibration
4.5.2. Prediction with IMU Information
4.5.3. Correction with Visual Information
4.5.4. Correction with Anatomical Constraints
4.6. Quantitative Evaluation for Free Hand Motion
4.6.1. Experimental Setup
4.6.2. Procedure
4.6.3. Experimental Result
4.7. Quantitative and Comparative Evaluation for Challenging Hand Motion
4.7.1. Experimental Setup
4.7.2. Procedure
4.7.3. Experimental Result
4.7.4. Performance Comparison with Existing Methods for Challenging Hand Motion
4.8. Qualitative Evaluation for Real-World Scenarios
4.8.1. Visually Complex Background
4.8.2. Object Interaction
4.8.3. Wearing Fingertip Cutaneous Haptic Devices
4.8.4. Outdoor Environment
4.9. Discussion
5 Conclusion
References
Abstract (in Korean)
Acknowledgment
- โฆ