
    Task Directed Imaging in Unstructured Environments by Cooperating Robots

    In field environments, model and sensor uncertainties usually make it impossible to provide robotic systems with optimal sensing strategies for their tasks in advance. The robot or robot team must instead use available models and sensory data to find task-based optimal sensing poses. Here, an algorithm based on iterative sensor planning and sensor redundancy is proposed to enable the robots to efficiently position their cameras with respect to the task/target. Simulations show the effectiveness of this algorithm.
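    The abstract gives no implementation details, so as a minimal illustration of the idea of iteratively placing redundant sensors to cover a task, the following sketch greedily assigns candidate camera poses so that each added camera maximizes the number of not-yet-observed target points. The 2-D visibility model and all function names here are assumptions for illustration, not the paper's method.

    ```python
    import math

    def point_visible(pose, point, fov_half_angle=math.pi / 4):
        """Return True if `point` lies inside the camera's field of view.

        `pose` is (x, y, heading); `point` is (x, y).  This is a
        hypothetical planar pinhole-style visibility test.
        """
        x, y, heading = pose
        px, py = point
        angle = math.atan2(py - y, px - x)
        # Smallest signed angular difference between bearing and heading.
        diff = abs((angle - heading + math.pi) % (2 * math.pi) - math.pi)
        return diff <= fov_half_angle

    def plan_poses(target_points, candidate_poses, n_sensors=2):
        """Greedy iterative planner: each sensor in turn takes the candidate
        pose that covers the most target points not yet seen by any
        previously placed sensor."""
        covered = set()
        chosen = []
        for _ in range(n_sensors):
            best_pose, best_new = None, set()
            for pose in candidate_poses:
                newly = {i for i, p in enumerate(target_points)
                         if i not in covered and point_visible(pose, p)}
                if len(newly) > len(best_new):
                    best_pose, best_new = pose, newly
            if best_pose is None:
                break  # no candidate adds coverage
            covered |= best_new
            chosen.append(best_pose)
        return chosen, covered

    # Example: four target points on the axes, cameras at the origin
    # looking along each axis with a 45-degree half field of view, so
    # each heading sees exactly one point.
    points = [(1, 0), (0, 1), (-1, 0), (0, -1)]
    headings = [(0.0, 0.0, h) for h in (0, math.pi / 2, math.pi, -math.pi / 2)]
    poses, seen = plan_poses(points, headings, n_sensors=2)
    ```

    The greedy marginal-coverage rule is a common stand-in for more principled iterative sensor planning; the paper's actual algorithm may weight poses by model uncertainty rather than simple point counts.
    
    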