
    A phantom interface for the teleoperation of a mobile platform over the internet

    The ability to teleoperate a mobile vehicle over the internet is a difficult task. Many sensory signals must be processed by the user in order to make an informed and safe vehicle-guiding decision. So much so, that the interface design is often the downfall of an otherwise capable mobile robot. A good interface should be able to relieve the user of some of the visual sensor strain and still result in a safely controlled mobile platform. The research detailed in this thesis first describes the construction of a reliable and sensor-rich platform for remote vehicle control. Then an interface is developed that adds haptic sensing to divert some of the strain from the operator, resulting in an easy-to-drive remote vehicle application.

    Visual-Vestibular Feedback for Enhanced Situational Awareness in Teleoperation of UAVs

    This paper presents a novel concept for improving the situational awareness of a ground operator in remote control of an Unmanned Aerial Vehicle (UAV). To this end, we propose to integrate vestibular feedback with the usual visual feedback obtained from a UAV onboard camera. We use our motion platform, the CyberMotion simulator, to reproduce online the desired motion cues. We test this architecture by flying a small-scale quadcopter and run a detailed performance evaluation on 12 test subjects. We then discuss the results in terms of possible benefits for facilitating the remote control task.

    Development of systems and techniques for landing an aircraft using onboard television

    A flight program was conducted to develop a landing technique with which a pilot could consistently and safely land a remotely piloted research vehicle (RPRV) without outside visual reference except through television. Otherwise, instrumentation was standard. Such factors as the selection of video parameters, the pilot's understanding of the television presentation, the pilot's ground cockpit environment, and the operational procedures for landing were considered. About 30 landings were necessary for a pilot to become sufficiently familiar and competent with the test aircraft to make powered approaches and landings with outside visual reference only through television. When steep approaches and landings were made by remote control, the pilot's workload was extremely high. The test aircraft was used as a simulator for the F-15 RPRV, and as such was considered to be essential to the success of landing the F-15 RPRV.

    MEDSAT: A Small Satellite for Malaria Early Warning and Control

    This paper presents the design for a low-cost, light satellite used to aid in the control of vector-borne diseases like malaria. The 340 kg satellite contains both a synthetic aperture radar and a visual/infrared multispectral scanner for remotely sensing the region of interest. Most of the design incorporates well-established technology, but innovative features include the Pegasus launch vehicle, low mass and volume SAR and VIS/IR sensors, integrated design, low-power SAR operation, microprocessor power system control, and advanced data compression and storage. This paper describes the main design considerations of the project, which include the remote sensing task, implementation for malaria control, launch vehicle, orbit, satellite bus, and satellite subsystems.

    Remote Inspection, Measurement and Handling for LHC

    Personnel access to the LHC tunnel will be restricted to varying extents during the life of the machine due to radiation, cryogenic and pressure hazards. The ability to carry out visual inspection, measurement and handling activities remotely during periods when the LHC tunnel is potentially hazardous offers advantages in terms of safety, accelerator down time, and costs. The first applications identified were remote measurement of radiation levels at the start of shut-down, remote geometrical survey measurements in the collimation regions, and remote visual inspection during pressure testing and initial machine cool-down. In addition, for remote handling operations, it will be necessary to be able to transmit several real-time video images from the tunnel to the control room. The paper describes the design, development and use of a remotely controlled vehicle to demonstrate the feasibility of meeting the above requirements in the LHC tunnel. Design choices are explained along with operating experience to date and future development plans.

    Twelfth Annual Conference on Manual Control

    Main topics discussed cover multi-task decision making, attention allocation and workload measurement, displays and controls, nonvisual displays, tracking and other psychomotor tasks, automobile driving, handling qualities and pilot ratings, remote manipulation, system identification, control models, and motion and visual cues. Sixty-five papers are included, with presentations on results of analytical studies to develop and evaluate human operator models for a range of control tasks, vehicle dynamics, and display situations; results of tests of physiological control systems and applications to medical problems; and results of simulator and flight tests to determine display, control and dynamics effects on operator performance and workload for aircraft, automobile, and remote control systems.

    A System Architecture for Phased Development of Remote sUAS Operation

    Current airspace regulations require the remote pilot-in-command of an unmanned aircraft system (UAS) to maintain visual line of sight with the vehicle for situational awareness. The future of UAS will not have these constraints as technology improves and regulations are changed. An operational model for the future of UAS is proposed in which a remote operator will monitor remote vehicles with the capability to intervene if needed. One challenge facing this future operational concept is the ability of a flight data system to effectively communicate flight status to the remote operator. A system architecture has been developed to facilitate the implementation of such a flight data system. Utilizing the system architecture framework, a Phase I prototype was designed and built for two vehicles in the Autonomous Flight Laboratory (AFL) at Cal Poly. The project will continue to build on the success of Phase I, culminating in a fully functional command and control system for remote UAS operational testing.

    Large-scale environment mapping and immersive human-robot interaction for agricultural mobile robot teleoperation

    Remote operation is a crucial solution to problems encountered in agricultural machinery operations. However, traditional video streaming control methods fall short in overcoming the challenges of single-perspective views and the inability to obtain 3D information. In light of these issues, our research proposes a large-scale digital map reconstruction and immersive human-machine remote control framework for agricultural scenarios. In our methodology, a DJI unmanned aerial vehicle (UAV) was utilized for data collection, and a novel video segmentation approach based on feature points was introduced. To tackle texture richness variability, an enhanced Structure from Motion (SfM) using superpixel segmentation was implemented. This method integrates the open Multiple View Geometry (openMVG) framework along with Local Features from Transformers (LoFTR). The enhanced SfM results in a point cloud map, which is further processed through Multi-View Stereo (MVS) to generate a complete map model. For control, a closed-loop system utilizing TCP for VR control and positioning of agricultural machinery was introduced. Our system offers a fully visual-based immersive control method: upon connection to the local area network, operators can utilize VR for immersive remote control. The proposed method enhances both the robustness and convenience of the reconstruction process, thereby significantly facilitating operators in acquiring more comprehensive on-site information and engaging in immersive remote control operations. The code is available at: https://github.com/LiuTao1126/Enhance-SF
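    The pipeline described in this abstract (UAV video → feature-point-based segmentation → SfM point cloud → MVS densification) can be sketched at a high level as follows. This is a minimal illustrative skeleton, not the authors' implementation: every function here is a hypothetical placeholder, and the real system relies on openMVG, LoFTR, and an MVS library rather than the toy logic shown. The `min_shared_features` threshold and the frame representation are assumptions made for the sketch.

    ```python
    # Hypothetical sketch of a feature-point-based video segmentation and
    # reconstruction pipeline. All names are illustrative placeholders; the
    # actual framework uses openMVG for SfM, LoFTR for matching, and MVS
    # for densification.
    from dataclasses import dataclass, field

    @dataclass
    class PointCloud:
        points: list = field(default_factory=list)

    def segment_video_by_feature_points(frames, min_shared_features=50):
        """Split a UAV video into segments: start a new segment whenever
        consecutive frames share too few tracked feature points."""
        segments, current = [], [frames[0]]
        for prev, cur in zip(frames, frames[1:]):
            shared = len(set(prev["features"]) & set(cur["features"]))
            if shared >= min_shared_features:
                current.append(cur)
            else:
                segments.append(current)
                current = [cur]
        segments.append(current)
        return segments

    def enhanced_sfm(segment):
        """Stand-in for superpixel-assisted SfM (openMVG + LoFTR):
        here we simply pool every frame's feature points."""
        cloud = PointCloud()
        for frame in segment:
            cloud.points.extend(frame["features"])
        return cloud

    def mvs_densify(cloud):
        """Stand-in for Multi-View Stereo densification: deduplicate
        and order the sparse points into a 'complete' model."""
        return PointCloud(points=sorted(set(cloud.points)))

    # Toy frames: each frame carries the IDs of feature points it observes.
    frames = [
        {"features": list(range(0, 100))},
        {"features": list(range(10, 110))},
        {"features": list(range(90, 190))},  # large jump starts a new segment
    ]
    segments = segment_video_by_feature_points(frames)
    model = mvs_densify(enhanced_sfm(segments[0]))
    print(len(segments), len(model.points))  # → 2 110
    ```

    In the real system each segment would be reconstructed independently and the resulting point clouds merged into one large-scale map, which the VR control loop then uses for operator positioning.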