
    Exploring human-object interaction through force vector measurement

    Thesis: S.M., Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2019. Cataloged from PDF version of thesis. Includes bibliographical references (pages 101-107). I introduce SCALE, a project aiming to further understand Human-Object Interaction through the real-time analysis of force vector signals, which I define as "Force-based Interaction" in this thesis. Force conveys fundamental information in Force-based Interaction, including force intensity, direction, and object weight - information otherwise difficult to access or infer from other sensing modalities. To explore the design space of force-based interaction, I have developed the SCALE toolkit, which is composed of modularized 3-axis force sensors and application APIs. In collaboration with large industry partners, this system has been applied to a variety of application domains and settings, including a retail store, a smart home and a farmers market. In this thesis, I propose the base system SCALE and two additional advanced projects, KI/OSK and DepthTouch, which build upon it. By Takatoshi Yoshida. S.M., Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences.
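
    The abstract highlights three quantities carried by a force vector signal: intensity, direction, and object weight. Below is a minimal sketch, assuming a single 3-axis force reading in newtons, of how those quantities can be derived; the function and variable names are illustrative and are not part of the SCALE toolkit's API.

        # Hypothetical example: deriving force intensity, direction, and an
        # estimated object weight from one 3-axis force sample (newtons).
        # Not the SCALE API; names are illustrative only.
        import math

        GRAVITY = 9.81  # m/s^2

        def analyze_force(fx, fy, fz):
            """Return intensity (N), unit direction vector, and estimated mass (kg)."""
            intensity = math.sqrt(fx**2 + fy**2 + fz**2)
            if intensity == 0:
                direction = (0.0, 0.0, 0.0)
            else:
                direction = (fx / intensity, fy / intensity, fz / intensity)
            # For a static object resting on the sensor, the vertical component
            # approximates the object's weight.
            estimated_mass = abs(fz) / GRAVITY
            return intensity, direction, estimated_mass

        print(analyze_force(0.5, -0.2, 19.6))  # ~2 kg object with a slight lateral push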

    Remote Sensing and Geosciences for Archaeology

    This book collects more than 20 papers, written by renowned experts and scientists from across the globe, that showcase the state of the art and forefront research in archaeological remote sensing and the use of geoscientific techniques to investigate archaeological records and cultural heritage. Very high resolution satellite images from optical and radar space-borne sensors, airborne multi-spectral images, ground penetrating radar, terrestrial laser scanning, 3D modelling and Geographic Information Systems (GIS) are among the techniques used in the archaeological studies published in this book. The reader can learn how to use these instruments and sensors, also in combination, to investigate cultural landscapes, discover new sites, reconstruct paleo-landscapes, augment the knowledge of monuments, and assess the condition of heritage at risk. Case studies scattered across Europe, Asia and America are presented: from the UNESCO World Heritage Site of the Lines and Geoglyphs of Nasca and Palpa to heritage under threat in the Middle East and North Africa, from coastal heritage in the intertidal flats of the German North Sea to Early Neolithic settlements in Thessaly. Beginners will learn robust research methodologies and take inspiration; mature scholars will certainly find inputs for new research and applications.

    NASA Tech Briefs, June 2001

    Topics covered include: Sensors; Electronic Components and Systems; Software Engineering; Materials; Manufacturing/Fabrication; Physical Sciences; Information Sciences

    New Global Perspectives on Archaeological Prospection

    This volume is a product of the 13th International Conference on Archaeological Prospection 2019, which was hosted by the Department of Environmental Science in the Faculty of Science at the Institute of Technology Sligo. The conference is held every two years under the banner of the International Society for Archaeological Prospection, and this was the first time that the conference was held in Ireland. New Global Perspectives on Archaeological Prospection draws together over 90 papers addressing archaeological prospection techniques, methodologies and case studies from 33 countries across Africa, Asia, Australasia, Europe and North America, reflecting current and global trends in archaeological prospection. At this particular ICAP meeting, specific consideration was given to the development and use of archaeological prospection in Ireland, archaeological feedback for the prospector, applications of prospection technology in the urban environment and the use of legacy data. Papers include novel research areas such as magnetometry near the equator, drone-mounted radar, microgravity assessment of tombs, marine electrical resistivity tomography, convolutional neural networks, data processing, automated interpretive workflows and modelling, as well as recent improvements in remote sensing, multispectral imaging and visualisation.

    Interactive form creation: exploring the creation and manipulation of free form through the use of interactive multiple input interface

    Most current CAD systems support only the two most common input devices: a mouse and a keyboard, which imposes a limit on the degree of interaction that a user can have with the system. However, it is not uncommon for users to work together on the same computer during a collaborative task. Besides that, people tend to use both hands to manipulate 3D objects; one hand is used to orient the object while the other hand is used to perform some operation on it. The same approach could be applied to computer modelling in the conceptual phase of the design process: a designer can rotate and position an object with one hand, and manipulate the shape (deform it) with the other. Accordingly, the 3D object can be easily and intuitively changed through interactive manipulation with both hands. The research investigates the manipulation and creation of free-form geometries through the use of interactive interfaces with multiple input devices. First, the creation of the 3D model is discussed and several different types of models are illustrated. Furthermore, different tools that allow the user to control the 3D model interactively are presented. Three experiments were conducted using different interactive interfaces; two bi-manual techniques were compared with the conventional one-handed approach. Finally, it is demonstrated that the use of new and multiple input devices can offer many opportunities for form creation. The problem is that few, if any, systems make it easy for the user or the programmer to use new input devices.
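
    A minimal, hypothetical sketch of the bi-manual mapping described above: one input stream orients the object while the other deforms it. The mesh, the device events and the function names are illustrative only and do not reproduce the thesis's actual system.

        # Illustrative bi-manual mapping: one hand rotates, the other deforms.
        import math

        def rotate_z(vertices, angle_rad):
            """Rotate all vertices about the Z axis (the 'orienting' hand)."""
            c, s = math.cos(angle_rad), math.sin(angle_rad)
            return [(c * x - s * y, s * x + c * y, z) for x, y, z in vertices]

        def push_vertex(vertices, index, offset):
            """Displace one vertex (the 'deforming' hand)."""
            x, y, z = vertices[index]
            dx, dy, dz = offset
            out = list(vertices)
            out[index] = (x + dx, y + dy, z + dz)
            return out

        # A unit square in the XY plane standing in for a free-form surface.
        mesh = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]

        # Left-hand event: rotate 30 degrees; right-hand event: pull one corner up.
        mesh = rotate_z(mesh, math.radians(30))
        mesh = push_vertex(mesh, 2, (0.0, 0.0, 0.4))
        print(mesh)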

    Validation of a modular and wearable system for tracking fingers movements

    Supervising manual operations performed by workers in industrial environments is crucial in a smart factory. Indeed, producing goods of superior quality at higher throughput rates and reduced costs with the support of Industry 4.0-enabling technologies relies on strict control of all resources inside the factory, including workers. This paper presents a protocol for validating a new wearable system for tracking finger movements. The wearable system consists of two measuring modules worn on the thumb and index finger that measure flexion and extension of the proximal interphalangeal (PIP) joint with a stretch sensor and rotation of the proximal phalanx (PP) with an inertial measurement unit. A marker-based opto-electronic system is used to validate the proposed device by capturing specific finger movements. Four movements that simulate typical tasks and gestures, such as grasp and pinch, were performed. The maximum root-mean-square error is 3.7 deg for the roll angle of the PP. The resistance change of the stretch sensor with respect to flexion and extension of the PIP joint is 0.47 Ω/deg. These results are useful for data interpretation when the system is adopted to monitor finger movements and gestures.
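
    A minimal sketch, using the sensitivity reported above (0.47 Ω/deg for PIP flexion/extension), of how a measured resistance change could be mapped back to a joint angle. A linear sensor response is assumed here purely for illustration, and the function name is hypothetical rather than part of the described system.

        # Convert a stretch-sensor resistance change into an estimated PIP angle,
        # assuming the linear sensitivity reported in the abstract.
        SENSITIVITY_OHM_PER_DEG = 0.47  # reported resistance change per degree

        def pip_angle_from_resistance(delta_resistance_ohm):
            """Estimate PIP joint flexion (deg) from a resistance change (ohm)."""
            return delta_resistance_ohm / SENSITIVITY_OHM_PER_DEG

        print(pip_angle_from_resistance(28.2))  # -> about 60 degrees of flexion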

    Wi-fi and radar fusion for head movement sensing through walls leveraging deep learning

    The detection of head movement plays a crucial role in human–computer interaction systems. These systems depend on control signals to operate a range of assistive and augmented technologies, including wheelchairs for quadriplegics, virtual/augmented reality, and assistive driving. Driver drowsiness detection and alert systems aided by head movement detection can prevent major accidents and save lives. Wearable devices such as MagTrack, which consists of magnetic tags and magnetic eyeglass clips, are intrusive, while vision-based systems suffer from ambient lighting, line-of-sight, and privacy issues. Contactless sensing has become an essential part of next-generation sensing and detection technologies. Wi-Fi and radar provide contactless sensing; however, in assistive driving they need to sit inside enclosures or dashboards, which for all practical purposes this article treats as sensing through walls. In this study, we propose a contactless system to detect human head movement with and without walls, using ultra-wideband (UWB) radar and Wi-Fi signals and leveraging machine and deep learning (DL) techniques. Our study analyzes six common head gestures, including right, left, up, and down movements. Time-frequency multiresolution analysis based on wavelet scalograms is used to obtain features from channel state information (CSI) values, along with spectrograms from radar signals, for head movement detection. Feature fusion of both radar and Wi-Fi signals is performed with state-of-the-art DL models. Overall classification accuracies of 83.33% and 91.8% are achieved with the fusion of VGG16 and InceptionV3 model features trained on radar and Wi-Fi time–frequency maps with and without the walls, respectively.
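
    A minimal PyTorch sketch of the feature-fusion idea described above: pooled features from a VGG16 branch (e.g., Wi-Fi scalograms) and an InceptionV3 branch (e.g., radar spectrograms) are concatenated and classified into six gestures. The branch wiring, feature dimensions and classifier head are assumptions for illustration, not the authors' exact architecture.

        # Feature-level fusion of two CNN backbones (sketch, not the paper's model).
        import torch
        import torch.nn as nn
        from torchvision import models

        class WifiRadarFusion(nn.Module):
            """Concatenate VGG16 features (Wi-Fi scalograms) with InceptionV3
            features (radar spectrograms) and classify head gestures."""
            def __init__(self, num_classes=6):
                super().__init__()
                vgg = models.vgg16(weights=None)  # load pretrained weights in practice
                self.wifi_branch = nn.Sequential(vgg.features,
                                                 nn.AdaptiveAvgPool2d(1),
                                                 nn.Flatten())            # -> 512-d
                self.radar_branch = models.inception_v3(weights=None,
                                                        init_weights=False)
                self.radar_branch.fc = nn.Identity()                      # -> 2048-d
                self.classifier = nn.Linear(512 + 2048, num_classes)

            def forward(self, wifi_img, radar_img):
                wifi_feat = self.wifi_branch(wifi_img)
                radar_feat = self.radar_branch(radar_img)  # plain tensor in eval mode
                return self.classifier(torch.cat([wifi_feat, radar_feat], dim=1))

        model = WifiRadarFusion().eval()
        with torch.no_grad():
            logits = model(torch.randn(1, 3, 224, 224),   # Wi-Fi scalogram
                           torch.randn(1, 3, 299, 299))   # radar spectrogram
        print(logits.shape)  # torch.Size([1, 6])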

    NASA Tech Briefs, August 2000

    Topics include: Simulation/Virtual Reality; Test and Measurement; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Medical Design