Partially Disjoint k Shortest Paths
A solution of the shortest paths problem may output paths that are
identical up to a single edge. On the other hand, a solution of the
independent shortest paths problem consists of paths that share neither an edge
nor an intermediate node. We investigate the case in which the number of edges
that are not shared between any two paths in the output set is a parameter.
We study two main directions: exploring \emph{near-shortest} paths and
exploring \emph{exactly shortest paths}. We assume that the weighted graph
has no parallel edges and that the edge lengths (weights) are
positive. Our results also generalize to the case of shortest paths with
several weights per edge, where the results must take the multi-criteria
prioritized weight into account.
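The pairwise constraint can be illustrated with a short sketch. The helper names (`shared_edges`, `is_partially_disjoint`) and the reading of the parameter as a bound on the edges any two output paths may share are illustrative assumptions, not the paper's definitions; paths are given as node sequences in an undirected graph.

```python
from itertools import combinations

def shared_edges(p, q):
    # Edges of a path given as a node sequence, as unordered pairs.
    edges = lambda path: {frozenset(e) for e in zip(path, path[1:])}
    return len(edges(p) & edges(q))

def is_partially_disjoint(paths, t):
    # True if every pair of paths in the set shares at most t edges
    # (t = 0 recovers ordinary edge-disjointness).
    return all(shared_edges(p, q) <= t for p, q in combinations(paths, 2))
```

For example, the paths 1-2-3-4 and 1-2-5-4 share only the edge (1, 2), so they satisfy the constraint for t = 1 but not for t = 0.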
OptSample: A Resilient Buffer Management Policy for Robotic Systems based on Optimal Message Sampling
Modern robotic systems have become an alternative to humans to perform risky
or exhausting tasks. In such application scenarios, communications between
robots and the control center have become one of the major problems. Buffering
is a commonly used solution to relieve temporary network disruption. But the
assumption that newer messages are more valuable than older ones is not true
for many application scenarios such as exploration, rescue operations, and
surveillance. In this paper, we propose a novel resilient buffer management
policy named OptSample. It uniformly samples messages and dynamically
adjusts the sample rate based on the run-time network situation. We define an
evaluation function to estimate the profit of a message sequence. Based on this
function, our analysis and simulations show that the OptSample policy can
effectively prevent the loss of long segments of continuous messages and improve
the overall profit of the received messages. We implement the proposed policy in
ROS. The implementation is transparent to the user, and no user code needs to be
changed. Experimental results on several application scenarios show that the
OptSample policy can help robotic systems be more resilient against network
disruption.
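The core idea of keeping a uniformly spaced sample within a fixed buffer size can be sketched as follows. This is a minimal illustration of uniform sampling with a dynamically doubled sampling stride, not the paper's actual OptSample algorithm; the class name and capacity handling are assumptions.

```python
class UniformThinningBuffer:
    """Bounded buffer that keeps a uniform sample of the message stream.

    When the buffer overflows, it keeps every other stored message and
    doubles the sampling stride, so the surviving messages stay roughly
    evenly spread over the whole sequence instead of losing one long
    contiguous segment.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.stride = 1   # keep 1 of every `stride` incoming messages
        self.count = 0    # messages seen since the last kept one
        self.buf = []

    def push(self, msg):
        self.count += 1
        if self.count < self.stride:
            return        # thinned out at the current sample rate
        self.count = 0
        self.buf.append(msg)
        if len(self.buf) > self.capacity:
            self.buf = self.buf[::2]   # halve the buffer uniformly
            self.stride *= 2           # and sample half as often

    def drain(self):
        msgs, self.buf = self.buf, []
        return msgs
```

Pushing messages 1..9 into a buffer of capacity 4 leaves [1, 5, 9]: evenly spaced samples across the whole stream rather than only the newest messages.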
Versatile Multi-Contact Planning and Control for Legged Loco-Manipulation
Loco-manipulation planning skills are pivotal for expanding the utility of
robots in everyday environments. These skills can be assessed based on a
system's ability to coordinate complex holistic movements and multiple contact
interactions when solving different tasks. However, existing approaches have
only been able to shape such behaviors with hand-crafted state machines,
densely engineered rewards, or pre-recorded expert demonstrations. Here, we
propose a minimally-guided framework that automatically discovers whole-body
trajectories jointly with contact schedules for solving general
loco-manipulation tasks in pre-modeled environments. The key insight is that
multi-modal problems of this nature can be formulated and treated within the
context of integrated Task and Motion Planning (TAMP). An effective bilevel
search strategy is achieved by incorporating domain-specific rules and
adequately combining the strengths of different planning techniques: trajectory
optimization and informed graph search coupled with sampling-based planning. We
showcase emergent behaviors for a quadrupedal mobile manipulator exploiting
both prehensile and non-prehensile interactions to perform real-world tasks
such as opening/closing heavy dishwashers and traversing spring-loaded doors.
These behaviors are also deployed on the real system using a two-layer
whole-body tracking controller.
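The bilevel structure, a discrete outer search over contact schedules with a continuous inner optimization per candidate, can be caricatured in a few lines. Everything here (the cost function, the schedule representation, the function names) is a toy assumption chosen to show the control flow, not the paper's formulation.

```python
from itertools import permutations

def optimize_trajectory(schedule):
    # Stand-in for the inner trajectory optimization: score a candidate
    # contact schedule with a toy cost (penalize out-of-order contacts).
    return sum(abs(step - i) for i, step in enumerate(schedule))

def bilevel_search(contacts):
    # Outer discrete search over contact orderings; each candidate is
    # evaluated by the (here trivial) inner continuous optimization.
    best = min(permutations(contacts), key=optimize_trajectory)
    return list(best), optimize_trajectory(best)
```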
3D reconfiguration using graph grammars for modular robotics
The objective of this thesis is to develop a method for the reconfiguration of three-dimensional modular robots. A modular robot is composed of simple individual building blocks or modules. Each of these modules needs to be controlled and actuated individually in order to make the robot perform useful tasks. The presented
method allows us to reconfigure arbitrary initial configurations of modules into any pre-specified target configuration using graph grammar rules that rely on local information only: each module needs only information from neighboring modules to decide its next reconfiguration step. The advantage of this approach is that the modules do not need global knowledge about the whole configuration. We propose a two-stage reconfiguration process composed of a centralized planning stage and a decentralized, rule-based reconfiguration stage. In the first stage, paths are planned for each module and then rewritten into a ruleset, also called a graph grammar; global knowledge about the configuration is available to the planner. In the second stage, these rules are applied in a decentralized fashion by each module individually, with local knowledge only. Each module can check the ruleset for applicable rules in parallel. This approach has been implemented in Matlab, and we are currently able to generate rulesets for arbitrary homogeneous input configurations.
MS. Committee Chair: Magnus Egerstedt; Committee Member: Jeff Shamma; Committee Member: Patricio Antonio Vel
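The decentralized second stage can be sketched as follows: each module looks only at one neighbor's label and applies a matching rewrite rule. The rule format and labels below are hypothetical, chosen only to illustrate the locality of rule application; they are not the thesis's actual ruleset.

```python
# Hypothetical rules: (own label, neighbor label) -> (new own, new neighbor).
rules = {
    ("waiting", "anchor"): ("moving", "anchor"),
    ("moving",  "target"): ("done",   "target"),
}

def apply_rules(labels, edges, rules):
    # One decentralized pass: for each edge, a module inspects only its
    # neighbor's label and applies the first applicable rule.
    labels = dict(labels)
    for u, v in edges:
        for a, b in ((u, v), (v, u)):
            new = rules.get((labels[a], labels[b]))
            if new:
                labels[a], labels[b] = new
                break
    return labels
```

No module ever consults the global configuration; each rewrite depends only on the labels at the two endpoints of a single edge.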
Extracting common sense knowledge via triple ranking using supervised and unsupervised distributional models
Jebbara S, Basile V, Cabrio E, Cimiano P. Extracting common sense knowledge via triple ranking using supervised and unsupervised distributional models. Semantic Web. 2019;10(1):139-158.
In this paper we are concerned with developing information extraction models that support the extraction of common sense knowledge from a combination of unstructured and semi-structured datasets. Our motivation is to extract manipulation-relevant knowledge that can support robots' action planning. We frame the task as a relation extraction task and, as a proof-of-concept, validate our method on the task of extracting two types of relations: locative and instrumental relations. The locative relation relates objects to the prototypical places where the given object is found or stored. The instrumental relation relates objects to their prototypical purpose of use. While we extract these relations from text, our goal is not to extract specific textual mentions but rather, given an object as input, to produce lists of locations and uses ranked by `prototypicality'. We use distributional methods in embedding space, relying on the well-known skip-gram model to embed words into a low-dimensional distributional space and using cosine similarity to rank the various candidates. In addition, we present experiments that rely on the vector space model NASARI, which computes embeddings for disambiguated concepts and is thus semantically aware. While this distributional approach has been published before, we extend our framework with additional methods relying on neural networks that learn a score to judge whether a given candidate pair actually expresses the desired relation. The network thus learns a scoring function using a supervised approach. While we use a ranking-based evaluation, the supervised model is trained using a binary classification task.
The resulting score from the neural network and the cosine similarity in the case of the distributional approach are both used to compute a ranking.
We compare the different approaches and parameterizations thereof on the task of extracting the above-mentioned relations. We show that the distributional similarity approach performs very well on the task. The best performing parameterization achieves an NDCG of 0.913, a Precision@1 of 0.400 and a Precision@3 of 0.423. The performance of the supervised learning approach, despite having been trained on positive and negative examples of the relation in question, is not as good as expected: it achieves an NDCG of 0.908, a Precision@1 of 0.454 and a Precision@3 of 0.387.
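The unsupervised ranking step reduces to cosine similarity in embedding space. The sketch below uses tiny hand-made vectors in place of real skip-gram or NASARI embeddings; the function name and candidate dictionary are illustrative assumptions.

```python
import numpy as np

def rank_candidates(target_vec, candidates):
    # Rank candidate fillers (e.g. locations for an object) by cosine
    # similarity between their embedding and the target object's vector.
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    scored = [(word, cos(target_vec, vec)) for word, vec in candidates.items()]
    return sorted(scored, key=lambda s: s[1], reverse=True)
```

With a toy target vector [1, 0], a candidate embedded at [0.9, 0.1] ranks above an orthogonal one at [0, 1].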