
    Interactive remote robotic arm control with hand motions

    Geographically separated people are now connected by smart devices and networks that enable remote human interaction. However, current online interactions are still confined to a virtual space. Extending purely virtual interactions into the physical world requires multidisciplinary research efforts spanning sensing, robot control, networking, and kinematics mapping. This paper introduces a remote motion-controlled robotic arm framework that integrates these techniques, allowing a user to control a far-end robotic arm simply with hand motions. Meanwhile, the robotic arm follows the user's hand to perform tasks and sends its live state back to the user, forming a closed control loop. Furthermore, we explore using inexpensive robotic arms and off-the-shelf motion-capture devices to facilitate widespread use of the platform in people's daily lives. We implement a testbed that connects two US states for the remote-control study. We investigate the latency components that affect the user's remote-control experience, conduct a comparative study between remote control and local control, and evaluate the platform with both free-form in-air hand gestures and hand movements following reference curves. We also investigate the possibility of using VR (Virtual Reality) headsets to enhance first-person visual presence and control, allowing for smoother robot teleoperation. Finally, a user study is conducted to assess user satisfaction with different setups while completing a set of tasks, aiming for an intuitive and easy-to-use platform.
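
    The abstract does not include code; the following is a rough Python sketch of the control loop it describes (hand capture, kinematics mapping, network transport, and state feedback with latency measurement). The names capture_hand_pose and map_hand_to_arm and the JSON-over-TCP message format are illustrative assumptions, not the paper's actual interfaces.

        import json
        import socket
        import time

        def map_hand_to_arm(hand_pos, workspace_scale=0.5):
            # Kinematics mapping (simplified): scale the tracked hand
            # position (metres, user frame) into the arm's workspace.
            return [c * workspace_scale for c in hand_pos]

        def control_loop(host, port, capture_hand_pose, rate_hz=30):
            # Stream hand motions to the far-end arm and read back its
            # live state, measuring round-trip latency, one of the
            # components the paper analyses.
            with socket.create_connection((host, port)) as link:
                stream = link.makefile("rw")
                period = 1.0 / rate_hz
                while True:
                    t0 = time.monotonic()
                    target = map_hand_to_arm(capture_hand_pose())
                    stream.write(json.dumps({"target": target}) + "\n")
                    stream.flush()
                    state = json.loads(stream.readline())   # arm's live state
                    rtt_ms = (time.monotonic() - t0) * 1e3  # loop latency
                    print(f"arm joints {state['joints']}  rtt {rtt_ms:.1f} ms")
                    time.sleep(max(0.0, period - (time.monotonic() - t0)))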

    State estimation of a cheetah spine and tail using an inertial sensor network

    The cheetah (Acinonyx jubatus) is by far the most manoeuvrable and agile terrestrial animal, yet little is known, in biomechanical terms, about how it achieves these incredible feats of manoeuvrability. The cheetah's transient motions all involve rapid flicking of its tail and flexing of its spine. The aim of this research was to develop tools (hardware and software) for capturing the motion of the cheetah's tail and spine and thereby gaining a better understanding of them; this insight may inspire and aid the design of bio-inspired robotic platforms. A long-standing assumption was that the tail is heavy and acts as a counterbalance or rudder, yet this was never tested. Contrary to that assumption, necropsy results determined that the tail is in fact light, with a relatively low inertia value. Fur from the tail was used in wind-tunnel experiments to determine the drag coefficient of a cheetah tail. No researchers have previously sought to track the motion of a cheetah's spine and tail during rapid manoeuvres by placing multiple sensors on the animal; doing so requires a 3D dynamic model of the spine and tail to study the motion accurately. A wireless sensor network was built, and three different filters and state-estimation algorithms were designed and validated using a camera system and a mechanical rig that simulated the tail and spine motion. The sensor network consists of three sensors on the tail (base, middle, and tip) as well as a hypothetical collar sensor (GPS and WiFi were not implemented).
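
    The abstract names three filters and state estimators but not their formulations. As a minimal Python sketch of one classic baseline for per-segment orientation from such an inertial node, here is a complementary filter fusing gyro integration with the accelerometer's gravity reference; the class and node names are illustrative assumptions, not the thesis's actual design.

        import math

        class ComplementaryFilter:
            # Classic IMU orientation estimator: the gyro is accurate
            # over short horizons, the accelerometer's gravity vector
            # over long ones; blending the two suppresses gyro drift.
            def __init__(self, alpha=0.98):
                self.alpha = alpha      # weight on the integrated gyro term
                self.pitch = 0.0        # estimated segment pitch (rad)

            def update(self, gyro_y, accel_x, accel_z, dt):
                # gyro_y: pitch rate (rad/s); accel_x/z: specific force (m/s^2)
                gyro_pitch = self.pitch + gyro_y * dt        # integrate rate
                accel_pitch = math.atan2(-accel_x, accel_z)  # gravity reference
                self.pitch = (self.alpha * gyro_pitch
                              + (1 - self.alpha) * accel_pitch)
                return self.pitch

        # One estimator per tail node, matching the base/middle/tip layout:
        filters = {node: ComplementaryFilter() for node in ("base", "middle", "tip")}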

    Cyber-Human Systems, Space Technologies, and Threats

    CYBER-HUMAN SYSTEMS, SPACE TECHNOLOGIES, AND THREATS is our eighth textbook in a series covering the world of UASs / CUAS / UUVs / SPACE. Other textbooks in our series are Space Systems Emerging Technologies and Operations; Drone Delivery of CBNRECy – DEW Weapons: Emerging Threats of Mini-Weapons of Mass Destruction and Disruption (WMDD); Disruptive Technologies with Applications in Airline, Marine, Defense Industries; Unmanned Vehicle Systems & Operations On Air, Sea, Land; Counter Unmanned Aircraft Systems Technologies and Operations; Unmanned Aircraft Systems in the Cyber Domain: Protecting USA's Advanced Air Assets, 2nd edition; and Unmanned Aircraft Systems (UAS) in the Cyber Domain: Protecting USA's Advanced Air Assets, 1st edition. Our previous seven titles have received considerable global recognition in the field. (Nichols & Carter, 2022) (Nichols, et al., 2021) (Nichols R. K., et al., 2020) (Nichols R., et al., 2020) (Nichols R., et al., 2019) (Nichols R. K., 2018) (Nichols R. K., et al., 2022)

    Indoor Positioning and Navigation

    In recent years, rapid development in robotics, mobile, and communication technologies has encouraged many studies of localization and navigation in indoor environments. An accurate localization system that can operate indoors has considerable practical value, because it can be built into autonomous mobile systems or into a personal navigation system on a smartphone for guiding people through airports, shopping malls, museums, and other public institutions; such a system would be particularly useful for blind people. Modern smartphones are equipped with numerous sensors (such as inertial sensors, cameras, and barometers) and communication modules (such as WiFi, Bluetooth, NFC, LTE/5G, and UWB), which enable the implementation of various localization approaches, namely visual localization, inertial navigation, and radio localization. For mapping indoor environments and localizing autonomous mobile systems, LIDAR sensors are also frequently used in addition to smartphone sensors. Visual localization and inertial navigation systems are sensitive to external disturbances; therefore, sensor fusion approaches can be used to implement robust localization algorithms. These have to be optimized to be computationally efficient, which is essential for real-time processing and low energy consumption on a smartphone or robot.
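
    As a minimal Python sketch of the sensor-fusion idea, assuming motion along a single axis: inertial dead reckoning drives a scalar Kalman filter's prediction, and an occasional radio fix (e.g. WiFi or UWB ranging) corrects the accumulated drift. The noise values and step lengths below are illustrative, not taken from any particular system.

        class ScalarKalman:
            # Scalar Kalman filter fusing inertial dead reckoning
            # (prediction) with sporadic radio fixes (correction).
            def __init__(self, x0=0.0, p0=1.0, q=0.05, r=4.0):
                self.x, self.p = x0, p0   # position estimate (m) and variance
                self.q, self.r = q, r     # process noise (drift), fix noise

            def predict(self, step):
                # Inertial update: advance by the dead-reckoned step length.
                self.x += step
                self.p += self.q          # uncertainty grows between fixes

            def correct(self, radio_fix):
                # Radio update: blend in a noisy WiFi/UWB position fix.
                k = self.p / (self.p + self.r)      # Kalman gain
                self.x += k * (radio_fix - self.x)
                self.p *= 1.0 - k

        kf = ScalarKalman()
        for step, fix in [(0.7, None), (0.7, None), (0.8, 2.4)]:
            kf.predict(step)              # every IMU-detected step
            if fix is not None:
                kf.correct(fix)           # occasional radio fix
        print(f"fused position {kf.x:.2f} m (variance {kf.p:.2f})")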