8 research outputs found

    See What the Robot Can't See: Learning Cooperative Perception for Visual Navigation

    Full text link
    We consider the problem of navigating a mobile robot towards a target in an unknown environment that is endowed with visual sensors, where neither the robot nor the sensors have access to global positioning information and only use first-person-view images. To overcome the need for positioning, we train the sensors to encode and communicate relevant viewpoint information to the mobile robot, whose objective is to use this information to navigate to the target as efficiently as possible. We overcome the challenge of enabling all the sensors (even those that cannot directly see the target) to predict the direction along the shortest path to the target by implementing a neighborhood-based feature aggregation module using a Graph Neural Network (GNN) architecture. In our experiments, we first demonstrate generalizability to previously unseen environments with various sensor layouts. Our results show that by using communication between the sensors and the robot, we achieve up to a 2.0x improvement in SPL (Success weighted by Path Length) compared to a communication-free baseline. This is done without requiring a global map, positioning data, or pre-calibration of the sensor network. Second, we perform a zero-shot transfer of our model from simulation to the real world. Laboratory experiments demonstrate the feasibility of our approach in various cluttered environments. Finally, we showcase examples of successful navigation to the target while the sensor network layout is dynamically reconfigured.
    Comment: Reformatting for IROS with updated results.
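    For reference, SPL weights each successful episode by the ratio of the shortest-path length to the length of the path actually travelled. The sketch below illustrates the metric only; the function name and episode data are illustrative assumptions and not taken from the paper's code.

        # Minimal sketch of SPL (Success weighted by Path Length).
        def spl(successes, shortest_lengths, actual_lengths):
            """successes: 0/1 flag per episode; shortest_lengths: geodesic distance
            to the target; actual_lengths: length of the path the robot took."""
            total = 0.0
            for s, l, p in zip(successes, shortest_lengths, actual_lengths):
                total += s * l / max(p, l)  # detours reduce the score; capped at 1
            return total / len(successes)

        # Example: one optimal success, one success with a detour, one failure.
        print(spl([1, 1, 0], [5.0, 8.0, 6.0], [5.0, 12.0, 9.0]))  # ~0.556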

    A deep learning approach to crater detection

    No full text
    Detecting craters can be useful for a broad variety of tasks, such as monitoring the development of celestial bodies over time or for navigational purposes. Deep learning as a tool for solving complex tasks in image processing has become increasingly popular in recent years. While there are a few works targeting crater detection with deep learning, most focus on monitoring celestial bodies, for which the requirements differ from those of navigation, where less computing power is available and additional crater parameters must be recovered from the image. In this work, a deep learning approach to crater detection is developed, implemented and evaluated. Due to the absence of suitable training data, a way to generate training data with the Planet and Asteroid Natural Scene Generation Utility (PANGU) simulator is implemented. With this simulator, a data set consisting of 20,000 images and crater labels was generated. A segmentation approach is then used to create probability distributions for crater rims and centers, which are then post-processed with conventional methods using domain knowledge. The researched approaches are evaluated in terms of precision, recall and overall accuracy. The different crater parameters are evaluated by categorizing the detections into true positives, false positives and false negatives, which are then analyzed separately. Furthermore, the execution time of both approaches is compared. Lastly, the approach is evaluated by applying it to images taken on recent missions to the Moon and other celestial bodies.
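    As an illustration of the kind of conventional post-processing described above, a predicted crater-center probability map could be reduced to discrete detections by thresholding and connected-component analysis. The threshold and names below are assumptions for illustration, not the thesis's actual pipeline.

        # Hedged sketch: crater-center candidates from a probability map.
        import numpy as np
        from scipy import ndimage

        def extract_crater_centers(center_prob, threshold=0.5):
            """center_prob: 2-D array of per-pixel crater-center probabilities."""
            mask = center_prob > threshold             # keep confident pixels only
            labels, n = ndimage.label(mask)            # group them into blobs
            # one candidate per blob: its probability-weighted centroid
            return ndimage.center_of_mass(center_prob, labels, range(1, n + 1))

        # Real input would come from the segmentation network; random data shown here.
        centers = extract_crater_centers(np.random.rand(128, 128), threshold=0.9)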

    FORAREX - Designing a Life-Support System for Microbiological Research aboard a Sounding Rocket

    Full text link
    The goal of the FORAREX (FORAminifera RocketEXperiment) project is a proof of concept for a rocket- and space-suitable life-support system for the foraminifera Amphistegina lobifera with integrated scientific sensors. Amphistegina lobifera is a unicellular marine protist with an external calcareous shell and diatoms as endosymbiotic algae. This first experiment demonstrates that the technical setup is feasible and operates fully automatically.

    Studying Cell Physiology and Motility under Microgravitational Influence - Results of the FORAREX Mission on REXUS 25

    Full text link
    The student project “FORAminifera RocketEXperiment” (FORAREX) was developed and conducted within the student educational programme REXUS/BEXUS. The focus of this investigation was the cellular response of foraminifera to microgravity and the exceptional physical stress during rocket launch, e.g., vibration and acceleration. Furthermore, the impact of the launch on the shell-building capacity of foraminifera was examined. For cultivation purposes, a life-support system with a flow cell as an observation chamber was built. Monitoring of vitality and calcification was conducted with sensors measuring the pH, oxygen and temperature of the sea water. FORAREX launched in March 2019 on the sounding rocket REXUS 25 on a nominal flight.
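    A minimal sketch of the kind of periodic sensor logging such a life-support system requires is shown below; the read_* functions and file layout are hypothetical placeholders, not the actual FORAREX flight software.

        # Hedged sketch: periodic logging of water-quality sensors during flight.
        import csv
        import time

        def log_sensors(read_ph, read_oxygen, read_temperature,
                        path="forarex_log.csv", interval_s=1.0, samples=10):
            """read_* are caller-supplied functions returning one measurement each."""
            with open(path, "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow(["time_s", "pH", "oxygen", "temperature_C"])
                start = time.time()
                for _ in range(samples):
                    writer.writerow([round(time.time() - start, 2),
                                     read_ph(), read_oxygen(), read_temperature()])
                    time.sleep(interval_s)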