Service-Oriented Architecture for Space Exploration Robotic Rover Systems
Currently, industrial sectors are transforming their business processes into
e-services and component-based architectures to build flexible, robust, and
scalable systems and to reduce integration-related maintenance and development
costs. Robotics is another promising, fast-growing industry concerned with
creating machines that operate autonomously and serve various applications,
including space exploration, weaponry, laboratory research, and manufacturing.
In space exploration, the most common type of robot is the planetary rover,
which moves across the surface of a planet and conducts a thorough geological
study of the celestial surface. This type of rover system is still ad hoc in
that it embeds its software in its core hardware, making the whole system
monolithic, tightly coupled, more prone to failure, less flexible, hard to
scale and maintain, and impossible to adapt to other purposes. This paper
proposes a
service-oriented architecture for space exploration robotic rover systems made
out of loosely-coupled and distributed web services. The proposed architecture
consists of three elementary tiers: the client tier that corresponds to the
actual rover; the server tier that corresponds to the web services; and the
middleware tier that corresponds to an Enterprise Service Bus which promotes
interoperability between the interconnected entities. The distinguishing
feature of this architecture is that the rover's software components are
decoupled from the rover's body and can even be deployed at a distant
location. A
service-oriented architecture promotes integrate-ability, scalability,
reusability, maintainability, and interoperability for client-to-server
communication.

Comment: LACSC - Lebanese Association for Computational Sciences,
http://www.lacsc.org/; International Journal of Science & Emerging
Technologies (IJSET), Vol. 3, No. 2, February 201
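The three-tier decomposition described in this abstract can be illustrated with a small sketch. The class names, the service registry inside the ESB, and the `plan_route` operation below are all hypothetical, assuming a simple request-routing middleware rather than anything specified in the paper:

```python
# Illustrative sketch of the three tiers: client (rover), middleware (ESB),
# and server (web services). All names here are invented for illustration.

class EnterpriseServiceBus:
    """Middleware tier: routes client requests to registered web services."""
    def __init__(self):
        self._services = {}

    def register(self, name, service):
        self._services[name] = service

    def invoke(self, name, operation, **params):
        # The ESB decouples the rover (client) from where services are deployed.
        service = self._services[name]
        return getattr(service, operation)(**params)

class NavigationService:
    """Server tier: rover software running away from the rover's body."""
    def plan_route(self, x, y):
        # Toy waypoint list standing in for a real path planner.
        return [(0, 0), (x // 2, y // 2), (x, y)]

class Rover:
    """Client tier: the physical rover, holding no planning logic itself."""
    def __init__(self, bus):
        self.bus = bus

    def go_to(self, x, y):
        return self.bus.invoke("navigation", "plan_route", x=x, y=y)

bus = EnterpriseServiceBus()
bus.register("navigation", NavigationService())
rover = Rover(bus)
print(rover.go_to(10, 6))  # [(0, 0), (5, 3), (10, 6)]
```

Because the rover only names a service and an operation, a service can be redeployed elsewhere by re-registering it on the bus, which is the loose coupling the abstract argues for.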
Towards an Autonomous Walking Robot for Planetary Surfaces
In this paper, recent progress in the development of the DLR Crawler, a
six-legged, actively compliant walking robot prototype, is presented. The
robot implements a walking layer with a simple tripod gait and a more complex
biologically inspired gait. Using a variety of proprioceptive sensors,
different reflexes for reactively crossing obstacles within the walking height
are realised. On top of the walking layer, a navigation layer provides the
ability to autonomously navigate to a predefined goal point in unknown rough
terrain using a stereo camera. A model of the environment is created, the
terrain traversability is estimated, and an optimal path is planned. The
difficulty of the path can be influenced by behavioral parameters. Motion
commands are sent to the walking layer, and the gait pattern is switched
according to the estimated terrain difficulty. The interaction between the
walking layer and the navigation layer was tested in different experimental
setups.
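The coupling the abstract describes, a navigation layer estimating terrain difficulty and the walking layer switching gaits accordingly, can be sketched as below. The threshold value, the cost scale, and the reduction of path costs to a single score are assumptions for illustration, not details from the paper:

```python
# Sketch of gait switching driven by estimated terrain difficulty.
# The 0.5 threshold and the max-cost reduction are illustrative assumptions.

def estimate_difficulty(traversability_costs):
    """Navigation layer: reduce per-cell traversability costs along the
    planned path (0 = easy, 1 = impassable) to a single difficulty score."""
    return max(traversability_costs)

def select_gait(difficulty, threshold=0.5):
    """Walking layer: simple tripod gait on easy ground, the biologically
    inspired gait when the terrain gets rough."""
    return "tripod" if difficulty < threshold else "biologically_inspired"

path_costs = [0.1, 0.2, 0.7, 0.3]  # costs along the planned path
gait = select_gait(estimate_difficulty(path_costs))
print(gait)  # biologically_inspired
```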
An internet of laboratory things
By creating an "Internet of Laboratory Things" we have built a blend of real and virtual laboratory spaces that enables students to gain the practical skills necessary for their professional science and engineering careers. All our students are distance learners. This provides them by default with the proving ground needed to develop their skills in remotely operating equipment and in collaborating with peers despite not being co-located. Our laboratories accommodate state-of-the-art research-grade equipment, as well as large class sets of off-the-shelf workstations and bespoke teaching apparatus. Distance to the student is no object, and the facilities are open all hours. This approach is essential for STEM qualifications requiring the development of practical skills, with higher efficiency and greater accessibility than achievable in a solely residential programme.
Profiling Web Archive Coverage for Top-Level Domain and Content Language
The Memento aggregator currently polls every known public web archive when
serving a request for an archived web page, even though some web archives focus
on only specific domains and ignore the others. Similar to query routing in
distributed search, we investigate the impact on aggregated Memento TimeMaps
(lists of when and where a web page was archived) by only sending queries to
archives likely to hold the archived page. We profile twelve public web
archives using data from a variety of sources (the web, archives' access logs,
and full-text queries to archives) and discover that only sending queries to
the top three web archives (i.e., a 75% reduction in the number of queries) for
any request produces the full TimeMaps in 84% of the cases.

Comment: Appeared in TPDL 201
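The routing idea, querying only the archives most likely to hold a page instead of polling all twelve, can be sketched as follows. The archive names and profile scores are invented; per the abstract, a real profile would be built from the web, archives' access logs, and full-text queries:

```python
# Sketch of profile-based query routing for a Memento aggregator.
# Archive names and likelihood scores below are invented for illustration.

ARCHIVE_PROFILE = {
    # archive -> estimated likelihood of holding pages for the requested URI
    "archive-a": 0.9,
    "archive-b": 0.7,
    "archive-c": 0.6,
    "archive-d": 0.2,
    # ... remaining archives of the twelve-archive profile
}

def route_query(profile, top_k=3):
    """Return only the top-k archives, cutting the number of queries sent
    (top 3 of 12 is the 75% reduction discussed in the abstract)."""
    ranked = sorted(profile, key=profile.get, reverse=True)
    return ranked[:top_k]

print(route_query(ARCHIVE_PROFILE))  # ['archive-a', 'archive-b', 'archive-c']
```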
Formal verification of an autonomous personal robotic assistant
Human-robot teams are likely to be used in a variety of situations wherever humans require the assistance of robotic systems. Obvious examples include healthcare and manufacturing, in which people need the assistance of machines to perform key tasks. It is essential for robots working in close proximity to people to be both safe and trustworthy. In this paper we examine formal verification of a high-level planner/scheduler for autonomous personal robotic assistants such as Care-O-bot. We describe how a model of Care-O-bot and its environment was developed using Brahms, a multi-agent workflow language. Formal verification was then carried out by translating this model into the input language of an existing model checker. Finally, we present some formal verification results and describe how these could be complemented by simulation-based testing and real-world end-user validation in order to increase the practical and perceived safety and trustworthiness of robotic assistants.