
    FlightGoggles: A Modular Framework for Photorealistic Camera, Exteroceptive Sensor, and Dynamics Simulation

    FlightGoggles is a photorealistic sensor simulator for perception-driven robotic vehicles. The key contributions of FlightGoggles are twofold. First, FlightGoggles provides photorealistic exteroceptive sensor simulation using graphics assets generated with photogrammetry. Second, it provides the ability to combine (i) synthetic exteroceptive measurements generated in silico in real time and (ii) vehicle dynamics and proprioceptive measurements generated in motio by vehicle(s) in a motion-capture facility. FlightGoggles is capable of simulating a virtual-reality environment around autonomous vehicle(s). While a vehicle is in flight in the FlightGoggles virtual-reality environment, exteroceptive sensors are rendered synthetically in real time, while all complex extrinsic dynamics are generated organically through the natural interactions of the vehicle. The FlightGoggles framework allows researchers to accelerate development by circumventing the need to estimate complex and hard-to-model interactions such as aerodynamics, motor mechanics, battery electrochemistry, and the behavior of other agents. The ability to perform vehicle-in-the-loop experiments with photorealistic exteroceptive sensor simulation facilitates novel research directions involving, e.g., fast and agile autonomous flight in obstacle-rich environments, safe human interaction, and flexible sensor selection. FlightGoggles has been utilized as the main test environment for selecting the nine teams that advanced in the AlphaPilot autonomous drone racing challenge. We survey approaches and results from the top AlphaPilot teams, which may be of independent interest.

    Comment: Initial version appeared at IROS 2019. Supplementary material can be found at https://flightgoggles.mit.edu. The revision includes descriptions of new FlightGoggles features, such as a photogrammetric model of the MIT Stata Center, new rendering settings, and a Python API.
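    The vehicle-in-the-loop idea described above — real dynamics measured from a vehicle in a motion-capture facility, combined with synthetically rendered exteroceptive sensing — can be sketched as a simple loop. This is an illustrative sketch only; the function and type names (`get_mocap_pose`, `render_camera_image`, `Pose`) are hypothetical stand-ins, not the FlightGoggles API:

```python
# Hypothetical sketch of one vehicle-in-the-loop simulation step.
# Vehicle dynamics come from motion capture ("in motio"); exteroceptive
# sensing is rendered synthetically ("in silico") at the measured pose.

from dataclasses import dataclass


@dataclass
class Pose:
    x: float
    y: float
    z: float       # position (m)
    roll: float
    pitch: float
    yaw: float     # attitude (rad)


def get_mocap_pose(t: float) -> Pose:
    """Stand-in for a motion-capture measurement of the real vehicle."""
    return Pose(x=0.1 * t, y=0.0, z=1.5, roll=0.0, pitch=0.0, yaw=0.0)


def render_camera_image(pose: Pose) -> list:
    """Stand-in for photorealistic rendering at the vehicle's true pose."""
    # A real renderer would return camera pixels; this returns a tiny
    # placeholder "image" that depends on the pose.
    return [[int(pose.x * 10) % 256] * 4 for _ in range(3)]


def vehicle_in_the_loop_step(t: float):
    pose = get_mocap_pose(t)            # proprioception: measured, not modeled
    image = render_camera_image(pose)   # exteroception: synthetic, real time
    return pose, image


pose, image = vehicle_in_the_loop_step(t=2.0)
```

    The design point this illustrates is that aerodynamics, motor mechanics, and battery electrochemistry never need to be modeled: they are supplied "for free" by the physical vehicle, while only the rendering is simulated.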

    A simple upper limb rehabilitation trainer

    Stroke is a leading cause of disability and can impair the shoulder and elbow movements necessary for reaching activities in numerous daily routines. To maximize functional recovery of these movements, stroke survivors undergo rehabilitation sessions under the supervision of physiotherapists in healthcare settings. Unfortunately, these sessions may be limited due to staff constraints and are often labor-intensive. Numerous robotic devices have been developed to overcome this problem. However, the high cost of these robots is a major concern, as it limits their cost-benefit profiles and thus impedes large-scale implementation. This paper presents a simple, low-cost interactive training module for upper limb rehabilitation. The module, which uses a conventional mouse integrated with a small DC motor to generate vibration instead of a robotic actuator, is combined with a game-like virtual reality system intended for training shoulder and elbow movements. Three games were developed for the module as training platforms, namely the Triangle, Square, and Circle games. Results from five healthy study subjects showed that their performance improved with practice and that the Triangle game was the fastest of the three to complete.
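    One plausible way such a trainer could decide when to fire the vibration motor is to measure how far the mouse cursor has strayed from the target shape. The sketch below illustrates that idea for the Triangle game; the vertex coordinates, tolerance value, and all function names are hypothetical illustrations, not taken from the paper:

```python
# Hypothetical sketch of the trainer's feedback logic: the patient traces a
# target shape with the mouse, and the DC motor vibrates when the cursor
# strays beyond a tolerance from the shape's edges. Values are illustrative.

import math


def triangle_waypoints():
    """Vertices of an assumed 'Triangle' game target path (arbitrary units)."""
    return [(0.0, 0.0), (1.0, 0.0), (0.5, 0.8)]


def distance_to_segment(p, a, b):
    """Shortest distance from point p to the segment from a to b."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:
        t = 0.0
    else:
        # Clamp the projection of p onto the segment to [0, 1].
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)


def should_vibrate(cursor, tolerance=0.05):
    """True when the cursor is farther than `tolerance` from every edge."""
    verts = triangle_waypoints()
    edges = list(zip(verts, verts[1:] + verts[:1]))
    return min(distance_to_segment(cursor, a, b) for a, b in edges) > tolerance
```

    A point near the bottom edge, such as (0.5, 0.01), would produce no vibration, while a point deep inside the triangle, such as (0.5, 0.4), would trigger it.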

    SAFER: Search and Find Emergency Rover

    When disaster strikes and causes a structure to collapse, it poses a unique challenge to search and rescue teams as they assess the situation and search for survivors. Currently, there are very few tools these teams can use to gather important information about the situation while remaining at a safe distance. SAFER, the Search and Find Emergency Rover, is an unmanned, remotely operated vehicle that can provide early reconnaissance to search and rescue teams so they have more information to prepare for the dangers that lie inside the wreckage. Over the past year, this team has restored a bare, non-operational chassis inherited from Roverwerx 2012 into a rugged and operational rover with increased functionality and reliability. SAFER uses a 360-degree camera to deliver real-time visual reconnaissance to the operator, who can remain safely stationed on the outskirts of the disaster. With strong drive motors providing enough torque to traverse steep obstacles and enough power to travel at up to 3 ft/s, SAFER can cover ground quickly and effectively over its 1-3 hour battery life, maximizing reconnaissance for the team. Additionally, SAFER carries 3 flashing beacons that the operator can drop when a victim is found, so that team members entering the scene can easily locate victims. In the future, other teams may wish to improve upon this iteration by adding thermal imaging, air quality sensors, and potentially a robotic arm with a camera that can see into spaces too small for the entire rover to enter.
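    The speed and battery figures above imply an upper bound on how much ground the rover could cover. A real rover will not drive continuously at top speed over rubble, so the following back-of-envelope arithmetic is an optimistic ceiling, not a measured result:

```python
# Back-of-envelope coverage ceiling from the stated figures:
# up to 3 ft/s drive speed over a 1-3 hour battery life.

FEET_PER_MILE = 5280.0


def max_travel_feet(speed_ft_s: float, hours: float) -> float:
    """Distance covered at constant speed for the given duration."""
    return speed_ft_s * hours * 3600.0


for hours in (1.0, 3.0):
    feet = max_travel_feet(3.0, hours)
    print(f"{hours:.0f} h at 3 ft/s -> {feet:,.0f} ft ({feet / FEET_PER_MILE:.1f} mi)")
# 1 h at 3 ft/s -> 10,800 ft (2.0 mi)
# 3 h at 3 ft/s -> 32,400 ft (6.1 mi)
```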