4,184 research outputs found

    NASA space station automation: AI-based technology review

    Get PDF
    Research and development projects in automation for the Space Station are discussed. Artificial Intelligence (AI) based automation technologies are planned to enhance crew safety through a reduced need for EVA, increase crew productivity by reducing routine operations, increase Space Station autonomy, and augment Space Station capability through the use of teleoperation and robotics. AI technology will also be developed for the servicing of satellites at the Space Station, system monitoring and diagnosis, space manufacturing, and the assembly of large space structures.

    Positional estimation techniques for an autonomous mobile robot

    Get PDF
    Techniques for positional estimation of a mobile robot navigating in an outdoor environment are described. A comprehensive review of the positional estimation techniques studied in the literature is first presented. The techniques are divided into four types, and each is discussed briefly. Two kinds of environments are considered for positional estimation: mountainous natural terrain and an urban, man-made environment with polyhedral buildings. In both cases, the robot is assumed to be equipped with a single visual camera that can be panned and tilted, and a 3-D description (world model) of the environment is assumed to be given. Such a description could be obtained from a stereo pair of aerial images or from the architectural plans of the buildings. Techniques for positional estimation using the camera input and the world model are presented.
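    As a minimal illustration of the camera-plus-world-model localisation idea surveyed above (a sketch, not the paper's own algorithm), the snippet below estimates a robot's position from known 3-D landmarks in a world model and their 2-D detections in a single camera image via a perspective-n-point solve. The landmark coordinates, camera intrinsics, and ground-truth pose are hypothetical values chosen only to make the example self-consistent.

    # Minimal sketch: single-camera positional estimation against a known 3-D world model.
    # All landmarks, intrinsics, and the ground-truth pose below are illustrative only.
    import numpy as np
    import cv2

    # 3-D landmark positions taken from the world model (e.g. building corners), in metres.
    world_points = np.array([
        [0.0, 0.0, 0.0],
        [4.0, 0.0, 0.0],
        [4.0, 3.0, 0.0],
        [0.0, 3.0, 0.0],
        [2.0, 1.5, 1.0],
        [1.0, 2.0, 2.0],
    ], dtype=np.float64)

    # Assumed pinhole intrinsics for the calibrated pan-tilt camera, and zero distortion.
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    dist = np.zeros(5)

    # Synthesise consistent 2-D image measurements from a hypothetical ground-truth pose.
    rvec_true = np.array([0.10, -0.20, 0.05], dtype=np.float64)
    tvec_true = np.array([0.50, -0.30, 6.00], dtype=np.float64)
    image_points, _ = cv2.projectPoints(world_points, rvec_true, tvec_true, K, dist)

    # Recover the camera pose from the 2-D/3-D correspondences (perspective-n-point).
    ok, rvec, tvec = cv2.solvePnP(world_points, image_points, K, dist)
    assert ok

    # Express the camera (robot) position in world coordinates.
    R, _ = cv2.Rodrigues(rvec)
    robot_position = (-R.T @ tvec).ravel()
    print("Estimated robot position (world frame):", robot_position)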

    NASA space station automation: AI-based technology review. Executive summary

    Get PDF
    Research and development projects in automation technology for the Space Station are described. Artificial Intelligence (AI) based technologies are planned to enhance crew safety through a reduced need for EVA, increase crew productivity by reducing routine operations, increase Space Station autonomy, and augment Space Station capability through the use of teleoperation and robotics.

    A role-based conceptual framework for teaching robotic construction technologies to architects

    Get PDF
    In the last 30 years, there has been increasing interest in the adoption of robotics in the construction industry and, more recently, in architecture. Cutting-edge technologies are often pioneered in industries such as automotive, aeronautical and shipbuilding, and take decades to filter into the hands of architects. If this is to change, architects need to be better educated in the field of robotic construction technology. This research catalogues robotic construction technology currently being used by architects and discusses the motivations that drive architects to use it. The catalogue includes an interview with architect Dr Simon Weir and investigates his motivation for using robotic construction technologies on a project for an Aboriginal community in central Australia. Existing frameworks for classifying robotic construction technologies are reviewed and assessed for their suitability for teaching architecture students about these technologies. This leads to the development of a new conceptual framework for teaching architecture students about robotic construction technology. The framework classifies each technology according to the role it plays in the construction process, which makes the information more accessible to architects. The framework is implemented by teaching a class of students from the Master of Architecture course at the University of Sydney. Results from this class inform further development of the framework's classroom implementation. A revised course structure is presented, along with an appropriate hybrid robotic system for teaching architecture students about robotic construction technology.

    Perceptual Context in Cognitive Hierarchies

    Full text link
    Cognition depends not only on bottom-up sensor feature abstraction but also on contextual information being passed top-down. Context is higher-level information that helps to predict belief states at lower levels. The main contribution of this paper is to provide a formalisation of perceptual context and its integration into a new process model for cognitive hierarchies. Several simple instantiations of a cognitive hierarchy are used to illustrate the role of context. Notably, we demonstrate the use of context in a novel approach to visually tracking the pose of rigid objects with just a 2D camera.
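    As a toy illustration of the top-down context idea described above (a sketch under assumed values, not the paper's formal process model), the snippet below shows a two-level hierarchy in which a higher level passes a context prior over object classes down to a lower level, which fuses it with its bottom-up sensor likelihood through a normalised Bayesian product.

    # Toy two-level cognitive hierarchy: top-down context biases a lower-level belief.
    # The class names and probabilities are hypothetical values for illustration.
    import numpy as np

    CLASSES = ["mug", "bottle", "box"]

    def fuse(top_down_prior, bottom_up_likelihood):
        # Lower-level belief: normalised product of the context prior and the sensor likelihood.
        posterior = top_down_prior * bottom_up_likelihood
        return posterior / posterior.sum()

    # Higher level: scene context (a kitchen table) makes mugs more likely a priori.
    context_prior = np.array([0.6, 0.3, 0.1])

    # Lower level: ambiguous detector scores from the 2-D camera alone.
    sensor_likelihood = np.array([0.35, 0.40, 0.25])

    belief_without_context = sensor_likelihood / sensor_likelihood.sum()
    belief_with_context = fuse(context_prior, sensor_likelihood)

    for name, p0, p1 in zip(CLASSES, belief_without_context, belief_with_context):
        print(f"{name:7s} bottom-up only: {p0:.2f}   with top-down context: {p1:.2f}")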